Interesting: after 10h and 8TB of data my gdrive account was locked (403 Forbidden), and I am not accessing that Google Drive with any other app, just running /usr/bin/rclone sync acduk:/crypt gdrive:/crypt -v --bwlimit=50M --transfers=20 --checkers=20 --stats $STATS --log-file=$LOGFILE (the 50M limit is there because I am accessing ACD from multiple servers and they are also touchy about bandwidth)
and /usr/bin/rclone copy gdrive:/crypt acdde:/crypt -v --no-traverse --transfers=40 --checkers=40 --log-file=$LOGFILE
After this, the 403 errors started.
So it seems gdrive is not banning you just for API requests but for overall bandwidth as well… has anyone else had that problem?
ERROR : Movies/xxxx.mkv: File.Open failed: open for read: bad response: 403: 403 Forbidden
Does that mean I got banned? I can’t play files, and Plex complains that they are inaccessible and that I should “check the permissions of the file”, so I’m going crazy trying to work out where the issue is.
And if I got banned, it was because of API requests; I barely moved ~20GB today… I added two libraries just fine, but when I went to add another library that was only 28GB, much smaller than the others, the issues started…
They unlocked my account after 16h and during those I just copied data from my ACD UK account to ACD DE one.
Now I’ve made a script that, on a gdrive account lock, kills the rclone copy and restarts copying from ACD UK instead of Gdrive.
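A rough sketch of what such a failover watchdog could look like — the log path, remote names, and 403 check here are my assumptions for illustration, not necessarily the poster’s exact setup:

```shell
#!/bin/bash
# Hypothetical failover sketch: if the gdrive copy log starts showing
# 403 responses, kill the running rclone and resume the same copy
# from the ACD UK remote instead.
LOGFILE=/var/log/rclone-copy.log   # assumed path

is_locked() {
  # succeed if the given log file contains a 403 response
  grep -q "403" "$1"
}

if is_locked "$LOGFILE"; then
  # stop the gdrive-sourced copy (ignore "no process found")
  pkill -f "rclone copy gdrive:" 2>/dev/null
  # resume the same destination copy, sourcing from ACD UK
  /usr/bin/rclone copy acduk:/crypt acdde:/crypt -v \
    --transfers=40 --checkers=40 --log-file="$LOGFILE" &
fi
```

Run it from cron every few minutes; the grep-based check is deliberately crude — matching the exact rclone error string would be more robust.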
I am doing slightly fewer transfers now, but still going strong (until the next lock).
When you use --no-traverse, the destination gets queried once for each file: it doesn’t scan the entire remote, but it does one lookup per file you are copying, which in this case is a large number. This is inefficient and may be causing your bans. --no-traverse would be great when you are taking files from an upload directory and copying them to the main library, since those will be a small number of deltas.
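To make the trade-off concrete, a hedged pair of examples (paths and remote names are illustrative, not from the thread):

```shell
# Good fit for --no-traverse: a handful of new files going into a huge
# destination -- one per-file lookup beats listing the whole remote.
rclone copy /mnt/uploads gdrive:/crypt --no-traverse -v

# Poor fit: bulk-copying thousands of files -- let rclone list the
# destination once up front instead of doing one lookup per file.
rclone copy gdrive:/crypt acdde:/crypt -v --transfers=20 --checkers=20
```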
At least that is my understanding. If you use your own client id you could benchmark the API calls in the Google developer dashboard.
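For reference, using your own client ID means putting your own credentials into the remote’s section of rclone.conf — the values below are placeholders:

```ini
[gdrive]
type = drive
client_id = 1234567890.apps.googleusercontent.com
client_secret = your-client-secret
```

With that in place, the requests show up under your own project in the Google developer dashboard, so you can see exactly which calls (and how many) rclone is making.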
That’s odd. I just ran 15TB to Gdrive in 15 hours and did not receive a lock. I did this with 23 Digital Ocean servers, each running its own rclone sync command to Google Drive. The files being transferred were all around 150MB. At the time I was also copying to ACD from my Google Drive.
I wonder if Google Apps for Education has different Drive quotas than business? I doubt it but I don’t understand how some people have issues and others do not.
It will be a mystery till one brave soul decides to ask Google… lol
As for my experience with this, I got the ban (because of Plex), but since it was only for downloading, I was still able to transfer my entire 17TB to my Gdrive during the ban. It has now been over a month that I have had ocamlfuse + rclone mount, and I have had no bans and everything runs just as fast as when I was only using rclone mount.
No, after a bit over 10TB they took all my credits and wanted me to enable billing, so I opened another account with another credit card (you need to enter it but they won’t charge it) and transferred around 15TB before they got that one as well.
Now I am out of credit cards and seriously considering opening another one, since the speed is just amazing.
Atm my acdde is at 34TB out of the 60TB that I have on acdUK/gdrive, and I’m transferring it with online.net (but speed maxes out at 70MB/s).
So I have attempted to use gdrive this week, but I have been getting banned every day. Right now the only thing I’m using Gdrive for is serving Plex to users playing content. All my scanning and backend processes go through a union-fuse mount with ACD, which means Sonarr and Radarr don’t touch Gdrive when performing scans. I’ve checked my API thresholds and I’m nowhere near approaching them.
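For context, a union-fuse layout like that is usually a single mount command — the mountpoints below are assumptions, and the binary name varies by distro (unionfs or unionfs-fuse):

```shell
# Merge a writable local staging dir over the read-only ACD rclone
# mount, so Sonarr/Radarr scan the union and never touch Gdrive.
unionfs-fuse -o cow /mnt/local=RW:/mnt/acd=RO /mnt/union
```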
Any ideas, guys? I’m thinking there has to be a download limit per day. When I get the forbidden message, I have no issues with uploading, only downloading. Here are my mount settings for Google.