Expectation for --exclude


#1

I have a set of deeply nested directories that I am backing up to Google Team Drives. If I want to exclude a given directory, I can try --exclude, but when I do, it doesn't do what I expected:

rclone copy /data5/analyses/Kavanagh_Terry Kavanagh_Terry: -vv --exclude "Kavanagh_Terry/18.11.01_pid1855/**" &> kav.log &

grep 18.11 kav.log | more
2018/12/03 14:09:40 DEBUG : rclone: Version "v1.44" starting with parameters ["rclone" "copy" "/data5/analyses/Kavanagh_Terry" "Kavanagh_Terry:" "-vv" "--exclude" "Kavanagh_Terry/18.11.01_pid1855/**"]
2018/12/03 14:09:46 DEBUG : 18.11.01_pid1855/.ignore: Received error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded - low level retry 1/10
2018/12/03 14:09:48 DEBUG : 18.11.01_pid1855/.ignore: Received error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded - low level retry 2/10
2018/12/03 14:10:01 DEBUG : 18.11.01_pid1855/data/CC_iPSC_iHEP_samplelist_Lu_DBedits.xlsx: Received error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded - low level retry 1/10
2018/12/03 14:10:03 DEBUG : 18.11.01_pid1855/data/samps.R: Received error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded - low level retry 1/10
2018/12/03 14:10:03 DEBUG : 18.11.01_pid1855/.ignore: Received error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded - low level retry 3/10
2018/12/03 14:10:03 DEBUG : 18.11.01_pid1855/data/TAMU_Rusyn_All_rawfiles_received.xlsx: Received error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded - low level retry 1/10

No files seem to get copied, which I suppose is reasonable, but after a while rclone just keeps sleeping for longer and longer intervals because it thinks the user rate limit is exceeded. It isn't: I have my own client ID, and during this time I am well under 100 queries per 100 seconds.

Is this the expected behavior?


#2

Google Drive can be a bit fussy about listings. You can try --fast-list, and you can also try something like --tpslimit 5. (I think you might have to upgrade to v1.45 for --fast-list.)
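For example, a sketch of your command with both suggestions applied (assuming the same source and remote as in your post; the exact limit value is just a starting point to experiment with):

```shell
# Sketch only: throttle API transactions to 5/s and use fewer listing
# requests. --fast-list needs a recent rclone (v1.45+ as noted above).
rclone copy /data5/analyses/Kavanagh_Terry Kavanagh_Terry: -vv \
    --tpslimit 5 \
    --fast-list
```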

I'm not sure your exclude is right, though:

rclone copy /data5/analyses/Kavanagh_Terry Kavanagh_Terry: -vv --exclude "Kavanagh_Terry/18.11.01_pid1855/**" 

Assuming you are trying to exclude the directory with the absolute path /data5/analyses/Kavanagh_Terry/18.11.01_pid1855/, your --exclude needs to be rooted at the root of the transfer, so:

--exclude "/18.11.01_pid1855/**"
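Putting it together, a sketch of the full command with the rooted exclude (same source and remote as above; I've added --dry-run here so you can check what rclone would copy before any data actually moves):

```shell
# Sketch only: the pattern is rooted with a leading "/", i.e. relative to
# the root of the transfer (/data5/analyses/Kavanagh_Terry), not to cwd.
# --dry-run lets you verify the filter does what you expect first.
rclone copy /data5/analyses/Kavanagh_Terry Kavanagh_Terry: -vv \
    --dry-run \
    --exclude "/18.11.01_pid1855/**"
```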

#3

I still get the same results, even with your suggestions:

grep 18.11 kavanagh.log 
2018/12/04 06:40:39 DEBUG : rclone: Version "v1.44" starting with parameters ["rclone" "copy" "Kavanagh_Terry" "Kavanagh_Terry:" "-vv" "--tpslimit" "5" "--exclude" "18.11.01_pid_1855/**"]
2018/12/04 06:40:51 DEBUG : 18.11.01_pid1855/data/TAMU_Rusyn_101018/.Rhistory: Received error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded - low level retry 1/10
2018/12/04 06:40:54 DEBUG : 18.11.01_pid1855/data/TAMU_Rusyn_101018/TAMU_Rusyn_101018_rawfiles.xlsx: Received error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded - low level retry 1/10
2018/12/04 06:40:56 DEBUG : 18.11.01_pid1855/data/TAMU_Rusyn_101018/.Rhistory: Received error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded - low level retry 2/10

And on the Google Team Drive, all the directories are being created, but no files are being transferred into them.


#4

Thanks for your help. I think I have it figured out. I am calling rclone from a bash script, and I was not doing a good job of passing the quoted filter string through the shell.
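For anyone hitting the same thing, here is a minimal sketch of one way to keep the pattern intact in a script (the paths are the ones from this thread; `set --` is just one portable way to build an argument list, a bash array works too):

```shell
pattern='/18.11.01_pid1855/**'

# Build the argument list so each element stays a single word; the quotes
# around "$pattern" stop the shell from expanding or word-splitting the
# pattern before rclone ever sees it.
set -- rclone copy /data5/analyses/Kavanagh_Terry "Kavanagh_Terry:" -vv \
    --exclude "$pattern"

# Show exactly what would be executed, one argument per line:
printf '%s\n' "$@"
# To actually run it: "$@"
```

Printing the argument list before running it is a quick way to spot this kind of quoting bug: if the pattern shows up split across lines or with the `**` expanded, the quoting is wrong.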


#5

Ah ha!

You might want to try --fast-list also.