Using copy filter to upload to FTP server

For some reason running a copy filter from a local folder to an FTP server seems to cause the filter to also run against the files on the web server. I don't understand why, since surely the filter only needs to identify the correct local files and upload them? This is the screen I get:

It seems to try to loop through all 14 thousand file nodes on the server, which as you can guess takes a stupidly long time. I'm not sure why it has to do this. Copying the same files through the same filter between local folders takes a couple of seconds, while I expect this will take 5 minutes.

If there isn't a direct solution to this, then I wonder if it is possible to write a script instead that would copy files from a Find Results tab (after the filter has been run on the local files) to the appropriate locations on the server? In essence I would need to do something like (pseudo code!):

foreach file in Find Results (
var oldpath = file.location
var newpath = oldpath - [first x segments]
copy file from oldpath to newpath relative to FTP root directory
)

Is this possible?

What is the exact thing you are doing, including the filter definition?

I am trying to copy files from a local folder to an FTP server, with the following copy filter applied:

EDIT: The first date clause, within 1 week, is a mistake (I just noticed); I've since removed it. It shouldn't matter anyway.

I don't think it is looping through the destination side (FTP site); it's all about the source folder.

Presumably the source folder is a mirror of, or very similar to, the FTP site? So the 14,709 items is just the count of things below the source side. Filtering will cause most of the files to be skipped, but they are still counted in the total number of source files to be processed, as are the folders. (The filtering hasn't happened yet at the time the progress dialog appears; it happens during the operation.)

What's slowing things down here is a combination of four things:

  • Lots of folders in the source. (You can't change this, but it's worth noting that simpler filtering methods will be OK if there aren't a lot of folders in the source.)

  • Creating/deleting folders on FTP is slow. (Again, can't change that, but it's worth noting that this isn't something to worry about with a normal file copy.)

  • The type of filter being used means Opus has to go into every folder, and consider every file. It can't skip whole folders. (More about this below.)

  • While copying, if Opus enters a folder it will check whether it exists in the destination and create it if it doesn't, ready to copy things into it. Remember that it hasn't filtered the child items yet. If it then goes through the child items and finds there is nothing to copy, due to the filter, and that the filter does not match the folder itself, it will delete the folder again.

The last point is something we could improve, although it's a complicated business as the file copying and filtering code does a lot of different things that all interact.

But the second last point may be something you can fix by changing the filter slightly, so that it can skip entire folders.

A full path clause will not allow Opus to skip entire folders, as it never knows if something deeper down in the folder might cause a match, and has to look at everything.

But a sub-folder clause can allow whole folders to be skipped. The only problem is you have to define the filter so it matches what you want to copy and all of its parent folders. (This is because as soon as any folder fails to match the sub-folder clause, it and everything below it will be skipped. That's good as it speeds things up immensely, but it also means the filter has to match the parents, not just the things within them. For example, if the files you want are all under, say, app\Http, the clause must match the folders app and Http as well as the files inside them.)

There's a bit more about sub-folder filters here: How to filter items by location of sub-folder.


Alternatively, you could speed things up by doing the filtered copy (using the original filter, unmodified) into a local temporary folder, so you have just the files you want to upload there, and then copy/move that folder to the FTP site with filtering turned off.
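For example, the first step could be a button along these lines (the filter name and temp path here are just placeholders):

Copy FILTER=MyFilter TO="C:\Temp\Upload"

Then select everything under C:\Temp\Upload and run a plain Copy TO="ftp://yoursite/" (again, a placeholder site) with no FILTER argument, so nothing is filtered on the way up.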


Your idea of a script that uses Find Results should also be possible, although you could also write a script which does the filtering itself and only copies the desired files. It could work in a few different ways so it's a bit open-ended, but we can help with scripting if none of the above ideas do what you need.

Hi Leo, many thanks for an excellent detailed reply as usual, and sorry I never got back to this before now. This had to take a back seat while I dealt with work commitments (projects, deadlines...)

I'd like to explore what you said in your last two paragraphs if I may, since that sounds like what I'm aiming for.

Recap

To recap, this is directly related to another thread which I started. [ Start synchronisation comparison (with specified criteria) with single button ] But I'll summarise here:

My situation is this: I am a web developer who develops on a local server, often using the same framework (Laravel), which obviously has a specific directory structure. At regular intervals, I want to sync all my changes with a live, online server via FTP. Since I will have made changes in a variety of different directories throughout the framework, it's a pain to have to click between the directories, locate each file and upload it. So:

  • I want to run a filter that scans all the directories in which I typically make changes and locates all files modified in the last day.
  • Then I want to display these files temporarily (in a file collection tab or whatever) so I have a chance to quickly review what I'm about to upload.
  • Then, with a final click, upload them all at once!

Current situation

I have got as far as the first two bullet points on that list. Specifically, I have a button which runs a filter and displays the results in a file collection:

Go {sourcepath} NEWTAB
Find NAME "Laravel FTP" IN {sourcepath} FILTER

(At this stage let's not worry about what the filter is, since that's all working.)

The problem

So let's suppose I modify a couple of files and run the filter. Here's what I might get:

(screenshot: the Find Results collection listing the modified files, with their full paths in the Location column)

Now what I need to do is take these files and upload them to the corresponding locations on the FTP server. The root directory of the FTP server will always correspond to the local directory on which I ran the filter (in this case, that's A:\Local\tmr). So for each file I need to somehow parse the location path (shown in the Location column in the screenshot) by knocking the first 3 segments off the full path, and then copy it to the same relative path on the FTP server.

Is this possible? Or perhaps there's a better way of achieving the same result?

You could do that easily enough using a script. Either by search & replace on the path strings (if the parent folders will always be known in advance from a small set) or by removing the first 3 levels from the path and adding something else.
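For example, in JScript, with the A:\Local\tmr prefix from your post (assuming full already holds one file's full path as a string):

var rel = full.replace("A:\\Local\\tmr\\", "");   // search & replace, if the prefix is known in advance
// or, if only the depth is known:
var rel = full.replace(/^(?:[^\\]+\\){3}/, "");   // strip the first 3 path segments, whatever they are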

If you wanted to, the script could also look for the files to copy and do everything in one click, without having to do the Find command first. But you may not want that if you want an overview of what is going to be copied first.

Thanks, Leo. Yes, I would prefer to keep the intermediary step where I can just glance at everything that is about to be uploaded.

You could do that easily enough using a script. Either by search & replace on the path strings (if the parent folders will always be known in advance from a small set) or by removing the first 3 levels from the path and adding something else.

Yeah, this approach would be best since I won't know the names of the parent folders, but they will all be in the same directory in my system, i.e. 3 levels up.

OK, I'm gonna need some help... :roll_eyes: I guess it's going to involve Select ALL and then parsing {filepath}. But I don't know how to do the parsing/splitting to remove the first 3 segments. Would I have to use regex?

Scripts have access to objects that give them the selected files, paths, etc. No need to use things like {filepath} in a script.

Script Functions gives a quick overview of using scripts in buttons.

Scripting Reference has all the objects and methods that Opus provides. (You can also use various things built in to Windows which are accessible from VBScript/JScript.)

When you switch the button editor to Script mode, you'll get a default example script that shows how to do a lot of the basics. Have a look at that to get started and see how to loop through the selected files.

A regex probably makes sense for modifying the path strings, and if you're using JScript you can use the language's built-in regex support for that. (If using VBScript, Windows provides a regex object you can use, with a bit more verbosity.)
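For instance, inside the OnClick function, the loop plus the path edit might look something like this rough, untested sketch (the 3 in the regex is an assumption based on your A:\Local\tmr example):

for (var e = new Enumerator(clickData.func.sourcetab.selected_files); !e.atEnd(); e.moveNext())
{
	var full = String(e.item().realpath);            // the item's full local path
	var rel  = full.replace(/^(?:[^\\]+\\){3}/, ""); // drop the first 3 segments
	DOpus.Output(rel);                               // the path to re-root under the FTP site
}

From there it's mainly a matter of prepending the FTP folder to each relative path and running a Copy command per item.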

This is all assuming you're already familiar with some form of scripting, which I'm guessing you are from the context of the thread. If you aren't then it may be a lot to learn just for this, and we can help write the script (but may need some more detail).

Thanks, Leo. Not new to coding in general, though I'm new to VBScript and the conventions for writing such scripts in DOpus. I'm also not new to JavaScript, and JScript I understand to be essentially (but not exactly) JavaScript.

I guess I would prefer then to use JScript, though I'm finding it harder to get past the first hurdles. My first attempt used all.forEach(...), which it turns out doesn't work on Opus collections in JScript; switching to an Enumerator, I've got this far:

function OnClick(clickData)
{
	var all = clickData.func.sourcetab.all;
	// Opus collections are enumerated with JScript's Enumerator object
	for (var e = new Enumerator(all); !e.atEnd(); e.moveNext())
	{
		DOpus.Output(e.item().realpath); // realpath = the item's full path
	}
}

I had actually got a little further using VBScript (I could get each file path) simply because I was able to follow more useful examples, but as I say, I'd probably prefer to use JScript (I assume it's mostly a preference thing and there's no real advantage to using one over the other).

we can help write the script (but may need some more detail).

I'll love you forever if you do!

Basically, all I want to do is copy each file to the corresponding directory on the server (relative to the root folder on the local system, i.e. the path you get after knocking the first 3 segments off the full local path). Does that make sense?