I'm currently expanding a bit and preparing jobs which could lead to having more than 1TB a day to upload.
Now I have read somewhere that it should be possible to avoid the 750GB daily upload limit by using multiple accounts.
Of course I don't want to add more Gsuite accounts, since they cost about 10 bucks each.
I tried regular free Google accounts with 15GB of storage: I shared the folders from the Gsuite account and started an rclone upload job, but it quickly stopped because it had used up the free account's 15GB quota.
Is there a workaround or trick regarding what I tried?
Or could I just generate more OAuth client IDs? Though I think the limit is enforced per Google account, not per OAuth ID.
We avoid public discussion of how to willfully break the Google TOS here, and that is likely why you haven't gotten a reply yet.
Not only would that place your account at risk (and as such would be generally inadvisable), but it would also put the Rclone project in a bad light, which could be bad for everyone involved in the long term.
This is not a moral judgement. What you do with your account is between you and Google, but out of respect for rclone's author that discussion cannot happen here.
If it is of any assistance, there are some flags that might help you manage your data transfers to get the most out of your 750GB/day. That is a lot of data, after all.
--bwlimit 8M (the value here is just an example; 750GB spread over 24 hours works out to roughly 8.7MB/s, so 8M leaves a little headroom)
Will allow for 24/7 uploads without ever hitting the limit (assuming no external uploads via the same account).

--max-transfer 740G
Will forcefully stop an rclone operation once 740GB has been transferred (in that one rclone command), which may be useful for scripts.

--drive-stop-on-upload-limit
(for Gdrive only) Will stop an rclone command if it receives an error that indicates the quota has been exceeded for that day (thus preventing endless futile retries).
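As a sketch, the last two flags could be combined in a single move job like this (the remote name "gdrive:" and the paths are hypothetical placeholders, not something from your setup):

```shell
# Hypothetical remote "gdrive:" and local staging folder.
# Stops after 740GB has been transferred in this run, or earlier
# if Google reports the daily upload quota error.
rclone move /local/staging gdrive:media \
  --max-transfer 740G \
  --drive-stop-on-upload-limit \
  --progress
```

That way a single run can never push you over the daily allowance, and if something else already ate into the quota, the run just ends early instead of retrying.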
Let me know if there is anything else I can help with.
Also, thank you for your tips. It's just that I started autodl-irssi again and it -could- turn out to download more per day than I can upload, especially if I reach out to new trackers as well now. My upload jobs are currently already capped at 700GB and I usually calculate manually so I don't hit the limit. But I was planning to make it more automatic, without any worries.
--bwlimit won't suit me, since I often run the upload jobs manually so I can rescan with Plex quickly afterwards.
I also wonder: if I make use of --drive-stop-on-upload-limit, isn't it already too late by then? I thought I had read that as soon as you hit the limit you're banned for 24h, which would still cost you a full day's 750GB of uploads?
Although, I need to ask: isn't the Team Drive setting all about using a drive as a "Team", thus multiple accounts?
The only thing that happens when you reach your daily 750GB upload allowance is that any further requests to upload data are denied until the quota resets (sometime around midnight-ish, depending on a few factors). Everything else will still work normally (viewing and downloading files). For reference, there is a daily download cap as well, but that is a rather massive 10TB/day, so it is rarely a limiting factor.
So in short: if you want to maximize your uploads over time, you can very easily run a scheduled script, let's say a few times a day, that simply aborts when it hits its head against the limit (but resumes on the remainder the next day without any further action required). Using --drive-stop-on-upload-limit, that is. That theoretically allows you to upload a whopping 273TB of data over a year.
Even without that flag it would work, but your script would be left futilely hammering the Gdrive API trying to keep uploading for no good reason, which wouldn't be very optimal for you or for Google's servers.
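A minimal sketch of such a script, assuming a remote named "gdrive:" (the paths, remote name, and log location are all placeholders you'd swap for your own):

```shell
#!/bin/sh
# Scheduled upload job: run it a few times a day from cron.
# --drive-stop-on-upload-limit makes rclone exit as soon as Google
# returns the daily-quota error instead of retrying forever; the
# next scheduled run simply resumes with whatever is left.

SRC="/local/staging"   # hypothetical local staging folder
DST="gdrive:media"     # hypothetical remote:path

rclone move "$SRC" "$DST" \
  --max-transfer 740G \
  --drive-stop-on-upload-limit \
  --log-file "$HOME/rclone-upload.log"
```

With a crontab entry like "0 */6 * * * /path/to/upload.sh" it runs four times a day; any run that starts after the daily limit has already been hit just exits almost immediately, so no manual bookkeeping is needed.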
If you need a simple example of a script that would do this I can very likely assist you in that. Just let me know.