I have a set of deeply nested directories that I am backing up to Google Team Drives. If I want to exclude a given directory, I can try --exclude, but when I do, it doesn't do what I expected:
I don't seem to get any files copied, which is reasonable I suppose, but after a while Rclone just keeps sleeping for longer and longer intervals because the user rate limit is supposedly exceeded. But it isn't: I have my own client ID, and during this time I am well under 100 queries per 100 seconds.
Google Drive can be a bit fussy about listings. You can try --fast-list, and you can also try --tpslimit 5 or something similar. (I think you may have to upgrade to v1.45 for --fast-list.)
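For example, a sync command combining both flags might look like this (the remote name "teamdrive:" and the source path are placeholders, not taken from your setup):

```shell
# --fast-list batches directory listings into fewer API calls (rclone v1.45+);
# --tpslimit 5 caps the transaction rate to stay well under the Drive quota.
rclone sync /data5 teamdrive:backup --fast-list --tpslimit 5
```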
Assuming that you are trying to exclude the directory with the absolute path /data5/analyses/Kavanagh_Terry Kavanagh_Terry/18.11.01_pid1855/, your --exclude needs to be rooted to the root of the transfer, not to the filesystem root.
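So, assuming your transfer source is /data5 (an assumption, since the actual command wasn't shown), the pattern would be relative to that directory, and it needs quoting because the path contains a space:

```shell
# The leading "/" anchors the pattern at the root of the transfer (/data5),
# and "/**" excludes everything under that directory.
rclone sync /data5 teamdrive:backup \
  --exclude "/analyses/Kavanagh_Terry Kavanagh_Terry/18.11.01_pid1855/**"
```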
Thanks for your help. I think I have it figured out. I am trying to call Rclone from a bash script, and I am not doing a good job of sending a quoted string to the shell.
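For anyone hitting the same thing, the pitfall can be demonstrated without rclone at all: an unquoted variable expansion in bash undergoes word splitting, so a pattern containing a space arrives at the program as two arguments instead of one. A minimal illustration (count_args is a stand-in for any command):

```shell
#!/bin/bash
# count_args just reports how many arguments it received.
count_args() { echo "$#"; }

# A filter pattern containing a space, like the directory name above.
pattern='/analyses/Kavanagh_Terry Kavanagh_Terry/18.11.01_pid1855/**'

# Unquoted: word splitting breaks the pattern in two, so the command
# sees 3 arguments (--exclude plus two path fragments).
unquoted=$(count_args --exclude $pattern)

# Quoted: the pattern stays intact, so the command sees 2 arguments.
quoted=$(count_args --exclude "$pattern")

echo "unquoted: $unquoted args, quoted: $quoted args"
```

The fix is simply to double-quote the variable wherever it is expanded inside the script.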