Rclone: copy files to one team drive, then move the same files to another team drive

What is the problem you are having with rclone?

I have 2 team drives, and one download server. What I want to do is once a file gets downloaded, that first this file is copied to gdrive1, and once the files get copied, the same files get moved to gdrive2.

For this I am using this command:


rclone copy /downloads/tv gdrive1:/tv --exclude="**partial~" --exclude="**_HIDDEN" --exclude=".unionfs/**" --exclude=".unionfs-fuse/**" --min-age 2m -P -v --transfers=20 --checkers=20 && rclone move /downloads/tv gdrive2:/tv --exclude="**partial~" --exclude="**_HIDDEN" --exclude=".unionfs/**" --exclude=".unionfs-fuse/**" --min-age 45m -P -v --transfers=20 --checkers=20 --delete-empty-src-dirs

This command works in most cases, but a problem arises when a new file appears in /downloads/tv while the copy command is already running. Once the move command starts, it also moves this new file, so gdrive1 never receives it. (I hope I explained that well enough.) I run this command in a script named rclone-tv.sh which runs constantly via crontab. I am using the --min-age flag, but the problem is that these new files often have a much older modification date (for example, when Sonarr downloads an older TV show, the modified date isn't today's date but rather the date the file was originally uploaded online, or something like that).

I know I could just move everything to gdrive1 and then use rclone sync for gdrive2, but the problem is that I have a large number of files, and if I run the sync a few times per day I get an API ban.

That is why I prefer this command, but I would like to know whether there is anything else I can do so that rclone only moves the files that were already copied to gdrive1, not the new files that appeared during the rclone copy.

Thank you for your help

What is your rclone version (output from rclone version)

1.51

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Ubuntu 18.04

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy /downloads/tv gdrive1:/tv --exclude="**partial~" --exclude="**_HIDDEN" --exclude=".unionfs/**" --exclude=".unionfs-fuse/**" --min-age 2m -P -v --transfers=20 --checkers=20 && rclone move /downloads/tv gdrive2:/tv --exclude="**partial~" --exclude="**_HIDDEN" --exclude=".unionfs/**" --exclude=".unionfs-fuse/**" --min-age 45m -P -v --transfers=20 --checkers=20 --delete-empty-src-dirs

I doubt you are getting an API ban. Have you created your own client ID? The best approach would be to sync drive to drive.

What you can also do is feed each command a list of 'includes' built by parsing the log file from the previous command. Then you can be sure that only what was copied gets moved afterwards.
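A simpler variant of the same idea, sketched below under the assumption that your paths and filters are the ones from your command: snapshot the list of eligible files once, then pass that same list to both rclone commands via --files-from. A file that appears mid-run is simply not in the list, so it waits for the next pass.

```shell
# snapshot_list DIR OUT: write the filtered, relative file list for DIR
# into OUT, mirroring the --exclude patterns from the original command.
# (Add an -mmin test if you still want a --min-age style cutoff.)
snapshot_list() {
    (cd "$1" && find . -type f \
        ! -name '*partial~' ! -name '*_HIDDEN' \
        ! -path './.unionfs/*' ! -path './.unionfs-fuse/*' \
        | sed 's|^\./||' | sort) > "$2"
}

# Usage sketch (remotes and paths are the poster's own):
#   snapshot_list /downloads/tv /tmp/tv-batch.txt
#   rclone copy /downloads/tv gdrive1:/tv --files-from /tmp/tv-batch.txt -P -v \
#     && rclone move /downloads/tv gdrive2:/tv --files-from /tmp/tv-batch.txt \
#          -P -v --delete-empty-src-dirs
```

Because both commands read the identical list, the move can never touch a file the copy did not see, regardless of modification dates.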

But if the root of the problem is the "API ban", I'd suggest you revisit that. You can make 1,000,000,000 API calls a day; that is over 11,000 calls a second.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.