A directory with 200k files, 140k of them duplicates.
I ran an MD5 duplicate search with Delete mode on, and it completed OK.
But no files were marked for deletion, and the duplicates panel was barely usable: very slow scrolling, glitches when drawing items, and some empty space at the very end of the panel.
What was the CPU usage like? It may take a while to deal with that many items. 140k is a huge number of files, especially in MD5 mode, which is inherently slower.
If possible, splitting things up to remove the duplicates in batches, then doing a final pass on the whole folder at the end, might be advisable.
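For reference, that kind of batched pass can also be scripted outside DO. Below is a minimal Python sketch (not DO's implementation; the path and function names are hypothetical) that buckets the files in one directory by size first, MD5-hashes only the same-size buckets, and lists everything but the first copy in each group as a deletion candidate:

```python
import hashlib
import os
from collections import defaultdict

def md5_of(path, chunk=1 << 20):
    """Hash a file in 1 MiB chunks so big files never load whole into memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def duplicates_in(directory):
    """Return groups of same-content files in one directory (non-recursive).
    Files are bucketed by size first; only same-size buckets get hashed."""
    by_size = defaultdict(list)
    for entry in os.scandir(directory):
        if entry.is_file(follow_symlinks=False):
            by_size[entry.stat().st_size].append(entry.path)
    groups = defaultdict(list)
    for size, paths in by_size.items():
        if len(paths) < 2:
            continue  # a unique size cannot have a duplicate
        for p in paths:
            groups[(size, md5_of(p))].append(p)
    return [g for g in groups.values() if len(g) > 1]

if __name__ == "__main__":
    for group in duplicates_in("/path/to/one/subdir"):  # hypothetical path
        keep, *extras = sorted(group)
        print(f"keep {keep}; deletion candidates: {extras}")
```

Bucketing by size first avoids hashing files that cannot have a duplicate at all, which is usually where most of the time goes in a hash-based mode.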
Only once, across all my attempts, did DO actually select files to delete - very strange.
After the "Searching duplicates" message box disappears, there is a <15 s period of DO taking a full CPU core; during this period items are visibly still being added to the duplicates list. Once that's done, scrolling the list is smooth and no artifacts appear.
And after trying to process duplicates directory by directory, things got really weird (the same with the MD5 cache enabled and disabled). This is an attempt in the top directory after removing duplicates from all the contained directories one by one (I didn't notice similar issues during those runs, but on the other hand I wasn't paying attention).
Note:
- incorrect group counts
- one-element groups
- sometimes duplicates are selected, and sometimes not
I'm not sure what's happening there, unless the results are taking a while to collate, but I would definitely split up the task either way. 140,000 items in a grouped collection is going to be painfully slow.
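To illustrate splitting the task up, here is a short driver loop in the same vein (reusing the hypothetical duplicates_in() from the sketch above), processing one subdirectory per pass so that no single result set comes anywhere near 140k items:

```python
import os

root = "/path/to/top/dir"  # hypothetical top-level directory

# One subdirectory per pass keeps each result set small.
for entry in os.scandir(root):
    if entry.is_dir(follow_symlinks=False):
        for group in duplicates_in(entry.path):  # from the sketch above
            keep, *extras = sorted(group)
            print(f"{entry.path}: keeping {keep}, {len(extras)} deletable")

# A final pass over the whole tree (a recursive variant of
# duplicates_in, e.g. built on os.walk) would then catch the
# remaining cross-directory duplicates.
```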