I don't think it is looping through the destination side (FTP site); it's all about the source folder.
Presumably the source folder is a mirror of, or very similar to, the FTP site? So the 14,709 items is just the count of files and folders below the source folder. Filtering will cause most of the files to be skipped, but they are still counted in the total number of items to be processed, as are the folders. (The filtering hasn't happened yet at the time the progress dialog appears; it happens during the operation.)
What's slowing things down here is a combination of four things:
- Lots of folders in the source. (You can't change this, but it's worth noting that simpler filtering methods will be OK if there aren't a lot of folders in the source.)

- Creating/deleting folders on FTP is slow. (Again, you can't change that, but it's worth noting that this isn't something to worry about with a normal file copy.)

- The type of filter being used means Opus has to go into every folder and consider every file; it can't skip whole folders. (More about this below.)

- While copying, if Opus enters a folder it will check whether it exists in the destination and create it if it doesn't, ready to copy things into it. Remember that the child items haven't been filtered yet. If Opus then goes through the child items and finds there is nothing to copy, due to the filter, and the filter does not match the folder itself, it will delete the folder again.
The last point is something we could improve, although it's a complicated business as the file copying and filtering code does a lot of different things that all interact.
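To make that last point concrete, here's a much-simplified sketch (in Python, purely for illustration; it is not Opus's actual copy code, it ignores the case where the filter matches the folder itself, and `file_matches` / `copy_file` / `make_dir` / `remove_dir` are hypothetical stand-ins for the real operations):

```python
import os

def filtered_copy(src, dst, file_matches, copy_file, make_dir, remove_dir):
    """Recursively copy src into dst, copying only files the filter accepts.
    Returns True if anything was actually copied under dst."""
    make_dir(dst)                    # created before the children have been filtered
    copied_anything = False
    for name in os.listdir(src):
        s = os.path.join(src, name)
        d = dst + "/" + name
        if os.path.isdir(s):
            if filtered_copy(s, d, file_matches, copy_file, make_dir, remove_dir):
                copied_anything = True
        elif file_matches(s):
            copy_file(s, d)
            copied_anything = True
    if not copied_anything:
        remove_dir(dst)              # nothing got through the filter: undo the create
    return copied_anything
```

On a local disk those extra create/delete pairs are negligible; on FTP each one is a round-trip to the server, which is where a lot of the time goes.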
But the second-to-last point may be something you can fix by changing the filter slightly, so that Opus can skip entire folders.
A full path clause does not allow Opus to skip entire folders: it never knows whether something deeper down in the folder might cause a match, so it has to look at everything.
But a sub-folder clause can allow whole folders to be skipped. The only problem is you have to define the filter so it matches what you want to copy and all of its parent folders. (This is because as soon as any folder fails to match the sub-folder clause, it and everything below it will be skipped. That's good as it speeds things up immensely, but it also means the filter has to match the parents, not just the things within them.)
There's a bit more about sub-folder filters here: How to filter items by location of sub-folder.
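If it helps to picture the difference, here's a rough Python sketch (again purely illustrative; it isn't how Opus implements its filters, and `folder_matches` / `file_matches` are hypothetical stand-ins for the filter clauses):

```python
import os
from fnmatch import fnmatch

def full_path_clause(root, pattern):
    """Full path clause: every folder must be entered and every file tested,
    because a match could be hiding anywhere further down."""
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if fnmatch(path, pattern):
                yield path

def sub_folder_clause(root, folder_matches, file_matches):
    """Sub-folder clause: the moment a folder fails the test, it and everything
    below it is skipped.  That's why the filter has to match the parent folders
    of the files you want, not just the files themselves."""
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune non-matching folders so the walk never descends into them.
        dirnames[:] = [d for d in dirnames
                       if folder_matches(os.path.join(dirpath, d))]
        for name in filenames:
            path = os.path.join(dirpath, name)
            if file_matches(path):
                yield path
```

The second version is the one that gets faster, because whole branches drop out of the walk as soon as a folder fails the test.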
Alternatively, you could speed things up by doing the filtered copy (using the original filter, unmodified) into a local temporary folder, so you have just the files you want to upload there, and then copying/moving that folder to the FTP site with filtering turned off.
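A rough sketch of that two-stage idea, in standard-library Python rather than Opus itself (the folder names and pattern are placeholders):

```python
import fnmatch
import os
import shutil

SOURCE = r"C:\Source"             # placeholder source folder
STAGING = r"C:\Temp\ToUpload"     # placeholder local staging folder
PATTERN = "*.jpg"                 # placeholder filter: files to keep

def skip_non_matching_files(folder, names):
    """shutil.copytree 'ignore' callback: drop files that fail the filter,
    but keep sub-folders so the copy can still descend into them."""
    return {n for n in names
            if not os.path.isdir(os.path.join(folder, n))
            and not fnmatch.fnmatch(n, PATTERN)}

# Stage 1: filtered copy to a fast local folder (creating and removing
# folders here costs almost nothing).
shutil.copytree(SOURCE, STAGING, ignore=skip_non_matching_files)

# Remove any folders left empty by the filtering, bottom-up, so they don't
# get created on the FTP site during the upload.
for dirpath, _, _ in os.walk(STAGING, topdown=False):
    if not os.listdir(dirpath):
        os.rmdir(dirpath)

# Stage 2: copy/move STAGING to the FTP site with filtering turned off,
# so only things that will really be sent touch the slow connection.
```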
Your idea of a script that uses Find Results should be possible too; alternatively, you could write a script which does the filtering itself and only copies the desired files. That could work in a few different ways, so it's a bit open-ended, but we can help with the scripting if none of the ideas above does what you need.
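If a script ends up being the way to go, the general shape would be: filter first, then only touch the FTP site for folders that actually have something to send. Sketched here in plain Python with ftplib rather than the Opus scripting interface, with placeholder host, credentials, paths and pattern:

```python
import os
import posixpath
from fnmatch import fnmatch
from ftplib import FTP

SOURCE = r"C:\Source"            # placeholder local source folder
REMOTE_ROOT = "/upload"          # placeholder destination on the FTP site
PATTERN = "*.jpg"                # placeholder filter: files to upload

def upload_filtered(ftp):
    made = set()                              # remote folders already created
    for dirpath, dirnames, filenames in os.walk(SOURCE):
        wanted = [f for f in filenames if fnmatch(f, PATTERN)]
        if not wanted:
            continue                          # nothing to copy: never create the folder
        rel = os.path.relpath(dirpath, SOURCE)
        remote_dir = REMOTE_ROOT if rel == "." else posixpath.join(
            REMOTE_ROOT, *rel.split(os.sep))
        ensure_remote_dir(ftp, remote_dir, made)
        for name in wanted:
            with open(os.path.join(dirpath, name), "rb") as f:
                ftp.storbinary("STOR " + posixpath.join(remote_dir, name), f)

def ensure_remote_dir(ftp, remote_dir, made):
    """Create remote_dir (and its parents) once, ignoring 'already exists' errors."""
    path = ""
    for part in remote_dir.strip("/").split("/"):
        path = path + "/" + part
        if path in made:
            continue
        try:
            ftp.mkd(path)
        except Exception:
            pass                              # most servers: folder already exists
        made.add(path)

if __name__ == "__main__":
    with FTP("ftp.example.com") as ftp:       # placeholder host
        ftp.login("user", "password")         # placeholder credentials
        upload_filtered(ftp)
```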