Hello! First post here on the rclone forum (though I’ve lurked here for help many times in past months – thanks, all!) as I’m a bit stumped with this issue. I know others have hit the infamous 403 rate limit with rclone/Google before, but I didn’t expect it to happen just from running rclone size.
Since August, I’ve synced about 90TB of data across innumerable subdirectories from a mounted fileserver directory into a Google My Drive (G Suite) directory. This took many divide-and-conquer syncs from 7 different accounts configured as remotes, but I finally got the last of it synced.
Now, I’d like to compare the file/object count in Drive against the count on the fileserver. I tried using rclone size remote:dest --tpslimit 8 --fast-list. It works on a small scale, but when I try it on any higher-level folder containing a little over a million files, it eventually dies:
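For the fileserver side of the comparison, I was planning to get the local count with something like the below (the path is a placeholder for my actual mount point, not the real one):

```shell
# Count regular files under the mounted fileserver directory,
# to compare against the object count that `rclone size remote:dest` reports.
# /mnt/fileserver/data is a placeholder path -- substitute your real mount.
SRC="${SRC:-/mnt/fileserver/data}"
find "$SRC" -type f | wc -l
```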
error listing: couldn't list directory: googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded
2018/10/22 19:11:07 Failed to size: couldn't list directory: googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded
I’m sure there’s a way to check the file count without hitting the rate limit – the data made it in there in the first place, after all – but what am I missing? I used --tpslimit 8 and --bwlimit 8650k when transferring, for what that’s worth. Should I lower the --tpslimit?
Any guidance is appreciated, happy to give more info and context as needed, too! Cheers.