Best practice for a multi-day Drive-to-Drive backup

Is there a best practice for a multi-day backup of many TB of files? I'm not trying to get around the 750 GB daily upload cap; rather, I want to play as nicely with it as possible.

I have --drive-stop-on-upload-limit set, and --drive-server-side-across-configs appears to be working properly, since my files are being copied at about 24 GB/s. I'm also using my own client IDs. I think the stop-on-upload-limit flag is working, since I'm receiving these errors:

    Received upload limit error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded
    Fatal error received - not attempting retries
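
For reference, the copy command I'm running is roughly along these lines (the remote names and path are placeholders for my own):

    # Server-side copy between two Google Drive remotes; stop when the daily upload limit is hit
    rclone copy gdrive-src:backup gdrive-dst:backup \
        --drive-server-side-across-configs \
        --drive-stop-on-upload-limit \
        --progress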

My plan is to create a systemd service to run the copy command once a day, and then after a few weeks of daily 750 GB batches, everything should be finished. Is this a reasonable method that plays nice with the API?
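
For the scheduling piece, what I have in mind is roughly this service/timer pair (unit names, user, remote names, and paths are placeholders I've made up):

    # /etc/systemd/system/rclone-drive-backup.service
    [Unit]
    Description=Daily rclone Drive-to-Drive copy

    [Service]
    Type=oneshot
    # Run as the user whose rclone config holds the two Drive remotes
    User=youruser
    ExecStart=/usr/bin/rclone copy gdrive-src:backup gdrive-dst:backup \
        --drive-server-side-across-configs \
        --drive-stop-on-upload-limit

    # /etc/systemd/system/rclone-drive-backup.timer
    [Unit]
    Description=Run the Drive-to-Drive copy once a day

    [Timer]
    OnCalendar=daily
    Persistent=true

    [Install]
    WantedBy=timers.target

Then enable it with systemctl enable --now rclone-drive-backup.timer.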

That's the best way to do it, as the flag stops trying to upload once you hit the limit.

The limit detection isn't perfect either, since you can sometimes still upload smaller files after it triggers, but it's good enough.

I use:

 --drive-stop-on-upload-limit

Same as you.
