GDRIVE LOCK (rclone sync and copy only - no mount for Plex etc.)

Interesting: after 10h and 8 TB of data my gdrive account was locked (403 Forbidden), and I am not accessing that Google Drive with any app, just running

/usr/bin/rclone sync acduk:/crypt gdrive:/crypt -v --bwlimit=50M --transfers=20 --checkers=20 --stats $STATS --log-file=$LOGFILE

(the 50M limit is there because I am accessing ACD from multiple servers and they are also touchy about bandwidth)
and
/usr/bin/rclone copy gdrive:/crypt acdde:/crypt -v --no-traverse --transfers=40 --checkers=40 --log-file=$LOGFILE

After this, the 403 errors started.

So it seems gdrive is not banning you just for API requests but for overall bandwidth as well… has anyone else had that problem?

Damn, but I told you :stuck_out_tongue:

What about your credit?

Why would you use --no-traverse here? That will cause more API overhead if you're syncing a whole drive.

Haha, you told me my Google Compute VM would be banned after 4 TB, but that's not the case. Atm I've just switched to copying from ACD UK to ACD DE on the same VM while waiting for the unlock.

What do you mean? The second command is using copy, not sync; when I transfer all the data for the first time I usually do a copy first, then once everything is done I run sync and check once.

With copy and --no-traverse, the destination won't be scanned and matched against all the files on the source, or did I misunderstand?

I’m also getting this :slight_smile:

ERROR : Movies/xxxx.mkv: File.Open failed: open for read: bad response: 403: 403 Forbidden

Does that mean I got banned? I can't play files, but Plex complains the files are inaccessible and that I should "check the permissions of the file", so I'm going crazy trying to figure out where the issue is.

And if I got banned, it was because of API requests; I barely moved ~20 GB today… I added two libraries just fine, but when I went to add another library that was only 28 GB and much smaller than the others, the issues started…

@Lol340 yup, that's the account lock.

They unlocked my account after 16h, and during that time I just copied data from my ACD UK account to the ACD DE one.
Now I've made a script that, on a gdrive account lock, kills the rclone copy and restarts copying from ACD UK instead of gdrive (rough sketch below).

I'm doing slightly fewer transfers now, but still going strong (until the next lock).
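
For anyone curious, a rough sketch of what such a fallback wrapper could look like, assuming the remote names (gdrive:, acduk:, acdde:), flags, and a log path like the ones from earlier posts; this is just the idea of watching the log for 403s and switching source, not the poster's actual script:

#!/bin/bash
# Rough sketch: copy from gdrive, fall back to the ACD UK source once the log shows 403s
LOGFILE=/home/plex/log/gdrive2acdde.log

/usr/bin/rclone copy gdrive:/crypt acdde:/crypt -v --transfers=40 --checkers=40 --log-file=$LOGFILE &
PID=$!

while kill -0 "$PID" 2>/dev/null; do
    if grep -q "403" "$LOGFILE"; then
        kill "$PID"   # gdrive looks locked, stop this copy
        # resume from the ACD UK source instead of gdrive
        /usr/bin/rclone copy acduk:/crypt acdde:/crypt -v --transfers=40 --checkers=40 --log-file=$LOGFILE
        break
    fi
    sleep 60
done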

I can't afford to pay more for this project :confused:

I'm poor and live in a third-world country; I already pay for Plex Pass, a VPS to stream, and a VPS to upload. That's too many dollars for my currency lol.

I’ll try to learn the other solutions tomorrow and see what I can get

When you use --no-traverse, the destination gets queried once for each file. It doesn't list the entire remote, but it does one lookup per file you are copying, which in this case is a large number. This is inefficient and may be causing your bans. --no-traverse is great when you are taking files from an upload directory and copying them to the main library, since those will be a small number of deltas.
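
To make that concrete, here is a rough sketch of the two cases, reusing remote names and paths from earlier in the thread as placeholders:

# full-drive transfer: leave --no-traverse off so rclone lists the destination normally
/usr/bin/rclone copy acduk:/crypt gdrive:/crypt -v --transfers=20 --checkers=20

# small delta from an upload directory: --no-traverse skips listing the whole destination
/usr/bin/rclone copy /home/plex/upload/ gdrive-crypt: -v --no-traverse --min-age 15m --transfers=20 --checkers=20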

At least, that is my understanding. If you use your own client ID, you could benchmark the API calls in the Google developer dashboard.
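
If it helps, this is roughly what the remote section in .rclone.conf looks like once you add your own client ID from the Google API Console (the values below are placeholders); with your own project, the developer dashboard then shows your actual Drive API call counts:

[gdrive]
type = drive
client_id = 1234567890-example.apps.googleusercontent.com
client_secret = your-client-secret
token = {"access_token":"...","expiry":"..."}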

@calisro yeah, makes sense.

That's odd. I just ran 15 TB to gdrive in 15 hours and did not receive a lock. I did this with 23 Digital Ocean servers, each running its own rclone sync command to Google Drive (rough sketch of that kind of split below). The files being transferred were all around 150 MB. At the time I was also copying to ACD from my Google Drive.

I wonder if Google Apps for Education has different Drive quotas than business? I doubt it, but I don't understand how some people have issues and others do not.
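
Rough sketch of how that kind of split across servers can be done with rclone's filter rules; the directory names are made up and the remote names are reused from earlier posts as placeholders, with each server taking one slice of the source:

# server 1
rclone sync acduk:/crypt gdrive:/crypt --include "/Movies/**" -v --transfers=20 --checkers=20
# server 2
rclone sync acduk:/crypt gdrive:/crypt --include "/TV/**" -v --transfers=20 --checkers=20
# ...and so on, one slice per server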

So, finally, what's the best command line for cloning an ENTIRE ACD account to a new gdrive?

It will be a mystery till one brave soul decides to ask Google… lol

As for my experience with this, I got the ban (because of Plex), but since it was only for downloading, I was still able to transfer my entire 17 TB to my gdrive during the ban. It has now been over a month that I've had ocamlfuse + rclone mount, I have had no bans, and everything runs just as fast as when I was only using the rclone mount.

Ahh, so you got the default ban from Google, sorry :stuck_out_tongue: Is your trial account still up?

No, after a bit over 10 TB they took all my credits and wanted me to enable billing, so I opened another account with another credit card (you need to enter it but they won't charge it) and transferred around 15 TB before they got that one as well.

Now I'm out of credit cards and seriously considering opening another one, since the speed is just amazing.
Atm my acdde is at 34 TB out of the 60 TB that I have on acdUK/gdrive, and I'm transferring it with online.net (but the speed maxes out at 70 MB/s).

Yeah, that's true. I have found something else: https://www.reddit.com/r/seedboxes/comments/64ukak/beta_deploy_quickbox_on_kvm_vps_instantly/

You have to ask for an invite. You can set up 2 machines for a month for free.

Transferred 30 TB so far (100 MB/s).

Where do you ask for the free invite?

p.s. I made the account

On your dashboard -> Support, open a ticket and say that you're from Reddit.

So I have attempted to use gdrive this week, but I have been getting banned every day. Right now the only thing I'm using gdrive for is serving Plex and users playing content. All my scanning and backend processes go through a unionfs-fuse mount with ACD, which means Sonarr and Radarr don't touch gdrive when scanning. I've checked my API thresholds and I'm nowhere near them.

Any ideas, guys? I'm thinking there has to be a download limit per day. When I get the forbidden message, I have no issues with uploading, only downloading. Here are my mount settings for Google.

Rclone Mounts

#!/bin/bash
# Mount the encrypted gdrive remote for Plex to read from
rclone mount \
  --config /home/plex/.rclone.conf \
  --allow-non-empty \
  --allow-other \
  --acd-templink-threshold 0 \
  --buffer-size 500M \
  --stats 1s \
  -v \
  --log-file=/home/plex/log/rclonemount-gdrive.log \
  gdrive-crypt:/ /home/plex/gdrive &
exit

UnionFuse Settings

unionfs-fuse -o cow,allow_other,direct_io,auto_cache,sync_read /home/plex/upload=RW:/home/plex/gdrive=RO: /home/plex/union-acd-upload/

New content uploads - first copy to gdrive, then move to ACD. Scripts in the background tell Plex to update for new content (rough sketch of that refresh call below).

/usr/bin/rclone copy /home/plex/upload/ gdrive-crypt: -v --transfers=20 --checkers=20 --min-age 15m --stats 30s --log-file=$LOGFILE

/usr/bin/rclone move /home/plex/upload/ encrypt: -v --transfers=20 --checkers=20 --delete-after --min-age 15m --stats 30s --log-file=/home/plex/log/local2acdcrypt.cron.log
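
For the "tell Plex to update" part, a minimal sketch of the kind of call such a background script can make, assuming Plex runs on the same box; the section ID and token are placeholders, not the poster's actual values:

#!/bin/bash
# Rough sketch: ask Plex to rescan one library section after new content lands
SECTION_ID=1                # placeholder: the library section to refresh
PLEX_TOKEN=xxxxxxxxxxxx     # placeholder: your X-Plex-Token
curl -s "http://localhost:32400/library/sections/$SECTION_ID/refresh?X-Plex-Token=$PLEX_TOKEN"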

Do you have a screenshot of your API use and any errors? If you kick off any kind of refresh/scan on gdrive, the rclone mount is going to blow up the API calls.

Over the last 7 days I'm at this many calls with no bans or locks, but I'm not using rclone mount; I use google-drive-ocamlfuse for my mount (rough sketch below).

(Imgur screenshot of the API call counts)
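
For reference, a bare-bones google-drive-ocamlfuse setup looks roughly like this (not the poster's exact configuration; the label and mount point are placeholders):

# one-time authorisation for this label (opens a browser for OAuth)
google-drive-ocamlfuse -label plex
# then mount that account at the given path
google-drive-ocamlfuse -label plex /home/plex/gdrive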