I’m doing a massive (15TB) rclone copy from one Google Drive to another and, of course, I’m hitting the 750GB daily upload limit. Rclone seems to keep retrying and then skipping some files, and the error rate is slowly growing. I’m worried that, because of the limit, it’s missing files it should be copying.
Would this really be the case? I thought rclone would function by retrying until the limit was lifted.
Hope someone can help.
No, it retries 10 times and then continues on. You should set your bwlimit to something low so it just does ~700GB a day or so.
To stay under the 24hr rate limit, use the option:
--bwlimit 8650k or thereabouts, depending on your own network settings. Although it’s interesting that Drive rate-limits folks moving data from drive A to drive B.
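For reference, a quick back-of-the-envelope check of that number (a sketch, assuming rclone’s bare “k” suffix means KiB/s, which is how rclone documents its bandwidth units). Whether 8650k actually skates under depends on whether Google counts the 750GB cap in decimal GB or binary GiB:

```python
# Sanity-check the --bwlimit arithmetic.
# Assumption: rclone's "k" suffix means KiB/s (binary units, per rclone's docs).

SECONDS_PER_DAY = 24 * 60 * 60  # 86400

def daily_bytes(kib_per_s: float) -> float:
    """Bytes uploaded in one day at a steady KiB/s rate."""
    return kib_per_s * 1024 * SECONDS_PER_DAY

def kib_limit_for(daily_byte_cap: float) -> float:
    """KiB/s rate that exactly fills a given daily byte budget."""
    return daily_byte_cap / SECONDS_PER_DAY / 1024

# A steady 8650k over a full day:
print(daily_bytes(8650) / 1e9)   # ≈ 765 decimal GB per day

# Rate that lands exactly on 750 decimal GB per day:
print(kib_limit_for(750e9))      # ≈ 8477 KiB/s
```

So 8650k comes in just under the cap if Google counts GiB (750 GiB ≈ 805 decimal GB), but slightly over if it counts decimal GB; something around 8400k leaves a safety margin either way.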
If you want to check for any data loss, run an
rclone check from the source to the dest after the copy finishes.
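A minimal sketch of that, assuming your remotes are named src: and dst: (swap in your own remote names; the report filename is just an example):

```shell
# Compare the source and destination without copying anything.
# --one-way only reports files missing or differing on the destination;
# --missing-on-dst writes those paths to a report file for a follow-up copy.
rclone check src: dst: --one-way --missing-on-dst missing-files.txt
```

Anything listed in missing-files.txt can then be re-copied on a later day, once the daily quota has reset.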
To stay under the 24hr rate limit, use the option: --bwlimit 8650k
Doesn’t that just limit his bandwidth? He’s still subject to the 750GB daily limit.
Sorry, good call, I worded that poorly!
Using that bwlimit option has, in my experience, helped me skate just under the 750GB daily upload limit in cases where one account is syncing directories that vastly exceed 750GB over however many days it takes.
For a 15TB volume, a steady upload of 750GB every 24hrs per account is about as good an assurance as one can ask for. Other variables come into play, of course, like the granularity of the data: 15TB of small files fits that steady daily pace better than 15 x 1TB files, since a single huge file can’t be split across the daily window.
Hope this makes sense!
Thanks for your help. Also, do you happen to know how the ban works? Is it 24 hours of blocking after you hit the limit, or is there a set time when it resets?
I haven’t personally timed exactly when the cooldown ends, but I’d say 24hrs means 24hrs, given how many folks across different timezones etc. have reported it as being that duration. Google engineers, if you’re reading this, give us some insight!