Error message when passing a list of files?

Hi,

Can you please help: is it possible for rclone to return an error if I pass it a list of files and one of the files in that list is missing?

Is it possible to pass a list of files as an actual list of files instead of as a filter?

--files-from and --files-from-raw don't return an error if a file from the list is missing/doesn't exist?

You are correct and this is, as you've observed, because this is a filter.

Sometimes (depending on exactly what rclone is doing) it will just use that as a list of names to filter directory listings against, and sometimes rclone will actually look up each file. This second usage could potentially generate error messages, but that would only happen for some uses of --files-from, which would make the behaviour inconsistent.

What are you trying to achieve? Maybe there is a better way?

I need to do a daily backup of 2KK+ (2 million+) files.

I actually don't use rclone to do the query and verification, I do my own optimized code to do all that.

My goal is just to pass a list of files to rclone, to just copy those files.

It's not really crucial, but I'd like some sort of logging in case a file from the list I pass is not copied because it was moved or deleted...
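One way to get that kind of logging without changing rclone is to check the list yourself around the copy. A minimal sketch (all paths and file names here are hypothetical, and it checks a local tree; for s3 you would check against a bucket listing instead):

```shell
#!/bin/sh
# Sketch: log any entry in the files-from list that no longer exists
# in the source tree. Names below are demo values only.
SRC=demo-src
LIST=demo-files-from.txt
LOG=demo-missing.log

# --- demo setup: a source tree and a list that names one missing file ---
mkdir -p "$SRC"
printf 'kept.txt\n'  > "$LIST"
printf 'gone.txt\n' >> "$LIST"
: > "$SRC/kept.txt"

# --- the actual check: one ERROR line per listed file that is absent ---
: > "$LOG"
while IFS= read -r f; do
    [ -e "$SRC/$f" ] || printf 'ERROR: missing from source: %s\n' "$f" >> "$LOG"
done < "$LIST"

cat "$LOG"
```

Run before the rclone copy to catch files that vanished since the list was built, or after it to record what couldn't have been copied.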

Am I correct to assume that the "filter" still errors if there's an issue copying to the destination?

I see.

What flags do you use on your rclone copy command? We might be able to put an ERROR message in.

You could run rclone rcd then send individual copy file commands. This will probably be less efficient than using --files-from depending on the backend you are using.

Yes it does. It just won't notice if the source file is missing.

What flags do you use on your rclone copy command? We might be able to put an ERROR message in.

Here's an example of what I currently run; let me know if there's something I should adjust:
rclone --files-from-raw $someFile --fast-list --checksum --log-file=$logFilePath --log-level INFO copy $backup['source'] $backup['destination']

You could run rclone rcd then send individual copy file commands. This will probably be less efficient than using --files-from depending on the backend you are using.

I was thinking about that and I agree with you. Isn't it going to establish a connection for each file it iterates over? Sorry, I'm not sure how it works internally.

In order to give specific advice I need to know:

  • what backend are you using? (e.g. s3, google drive)
  • how many files do you have in your --files-from list normally?

  • I use s3 as the back-end
  • the --files-from list I doubt will be bigger than 1-2k files, I'd say 5k tops

When I ran rclone over the 2KK files, I did sometimes get an error message that the file I was trying to copy was no longer there. So I'm pretty much trying to achieve the same result.

Again, no sweat if it's not doable; it's just for my peace of mind.

With s3 using rclone rcd to copy individual files will be quick (possibly 1 extra transaction depending) so if you wanted a guaranteed OK/ERROR for each file you could do that with very little overhead.
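For the rcd route, a minimal sketch could look like the following. The rc API does expose operations/copyfile for single-file copies; the bucket and file names here are hypothetical, and the command is only echoed into a file so the loop can be shown without a live daemon:

```shell
#!/bin/sh
# Sketch of the rcd approach (remote and file names are hypothetical).
# With "rclone rcd" running, each operations/copyfile call returns OK
# or an error for exactly one file. Drop the "echo" (and run the
# commands directly) to use this for real.
SRCFS='s3:my-bucket'
DSTFS='s3:backup-bucket'
LIST=rc-files-from.txt

printf 'a.txt\nb.txt\n' > "$LIST"   # demo list of files to copy

while IFS= read -r f; do
    echo rclone rc operations/copyfile \
        srcFs="$SRCFS" srcRemote="$f" \
        dstFs="$DSTFS" dstRemote="$f"
done < "$LIST" > rc-commands.txt

cat rc-commands.txt
```

Because each call copies exactly one file, a non-zero exit from any rc call pinpoints exactly which file failed, which gives the per-file OK/ERROR the thread is asking about.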

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.