I’m doing a massive (15 TB) rclone copy from one Google Drive to another and, of course, I’m hitting the 750 GB daily upload limit. rclone seems to keep retrying and then skipping some files, and the error rate is slowly growing. I’m worried that because of the limit it’s missing files entirely.
Would this really happen? I thought rclone would just keep retrying until the limit reset.
To stay under the 24-hour upload limit, throttle the transfer with --bwlimit 8650k, or thereabouts depending on your own network setup. It is interesting, though, that Drive rate-limits people just moving data from Drive A to Drive B.
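As a rough sanity check on that figure, here is the back-of-envelope arithmetic. This is a sketch that assumes the quota is 750 decimal gigabytes per rolling 24 hours; Google doesn’t document the exact accounting, and rclone treats the `k` suffix as KiB/s.

```shell
# Derive a --bwlimit value that keeps a 24-hour transfer under ~750 GB.
# Assumption: "750 GB" means 750 * 10^9 bytes; adjust if Google counts GiB.
bytes_per_day=$((750 * 1000 * 1000 * 1000))
seconds_per_day=86400
bytes_per_sec=$((bytes_per_day / seconds_per_day))   # ~8,680,555 bytes/s
kib_per_sec=$((bytes_per_sec / 1024))                # ~8477 KiB/s
echo "stay at or below roughly: --bwlimit ${kib_per_sec}k"
```

So a strictly safe ceiling lands around 8477k; the 8650k suggested above is in the same ballpark and has evidently worked in practice, presumably because the quota has some slack or counts decimal gigabytes generously.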
If you want to verify there was no data loss, run an `rclone check` from the source to the destination after the copy finishes.
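A sketch of that verification step, plus one way to re-copy anything that got skipped. The remote names `src:` and `dst:` are placeholders for your own configured remotes; `--missing-on-dst` and `--files-from` are standard rclone flags.

```shell
# Compare source against destination and record any files that
# exist on the source but never made it to the destination.
rclone check src:path dst:path --missing-on-dst missing.txt

# If missing.txt is non-empty, re-copy only those files rather
# than re-walking the whole 15 TB tree.
rclone copy src:path dst:path --files-from missing.txt
```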
In my experience, that --bwlimit setting lets a single account skate just under the 750 GB daily upload limit when syncing directories that vastly exceed 750 GB over however many days the transfer takes.
For a 15 TB volume, a steady upload of 750 GB every 24 hours per account is about as good an assurance as one can ask for. Other variables come into play, of course, like the granularity of the data: 15 TB of small files tracks the daily cap more closely than 15 × 1 TB files, since a single huge file can’t be split across quota windows.
I have not personally tested exactly when the cooldown ends, but I’d say 24 hours means 24 hours, given how many people across different time zones have reported that duration. Google engineers, if you’re reading this, give us some insight!