Is there a best practice for a multi-day backup of many TBs of files? I'm not trying to get around the 750 GB daily upload cap; rather, I want to play as nicely with it as possible.
I have --drive-stop-on-upload-limit set, and --drive-server-side-across-configs appears to be working properly, since my files are being copied at about 24 GB/s. I'm also using my own client IDs. I think the stop-on-upload-limit flag is working, since I'm receiving these errors:
Received upload limit error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded
Fatal error received - not attempting retries
My plan is to create a systemd service that runs the copy command once a day; after a few weeks of daily 750 GB batches, everything should be finished. Is this a reasonable method that plays nicely with the API?
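For reference, here's roughly what I have in mind, as a sketch: the unit file names, remote names, and paths below are placeholders, and the timer schedule is just one option.

```ini
# /etc/systemd/system/gdrive-backup.service  (hypothetical name)
[Unit]
Description=Daily rclone server-side copy batch
Wants=network-online.target
After=network-online.target

[Service]
Type=oneshot
# "src:" and "dst:" are placeholders for the configured remotes.
ExecStart=/usr/bin/rclone copy src:backup dst:backup \
    --drive-server-side-across-configs \
    --drive-stop-on-upload-limit

# /etc/systemd/system/gdrive-backup.timer  (hypothetical name)
[Unit]
Description=Run the daily backup batch

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
```

The idea is to enable it with `systemctl enable --now gdrive-backup.timer`; since --drive-stop-on-upload-limit makes rclone exit as soon as it hits the quota error, each daily run should stop cleanly rather than retrying against the API.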