Google lock bypass?

Has anyone gotten this to work?

I tried, but I think I messed up: now there are a bunch of subfolders that are shared but not the files inside them, and some that are the other way around.

Does this idea work, and would it be a good way of cloning Google Drives to make a few backups? Also, does anyone know a way to do it without having to handle each file separately? I'd rather copy whole folders than individual files, but I couldn't figure out how to do so…

What are you trying to accomplish? This only gets around a per-file download limit by making copies of files.

Well, more so trying not to trigger a lockdown any time I want to clone my Google Drive, which is 53 TB in size, so yeah… it is a bit big. It would just be nice to make a copy without having to run rclone copy remote1: remote2:, which triggers a lockout every time: it gets about 3 TB in, then locks up.
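For reference, the kind of command being discussed looks roughly like this. This is a sketch, not a guaranteed fix for the lockouts described in this thread: the remote names are placeholders, and the server-side flag only applies when both remotes are Google Drive backends.

```shell
# Sketch: copy everything from one Google Drive remote to another.
# "remote1:" and "remote2:" are placeholder remote names from this thread.
# --drive-server-side-across-configs asks Google to copy between the two
# accounts server-side instead of downloading and re-uploading through
# the local machine; -P shows transfer progress.
rclone copy remote1: remote2: --drive-server-side-across-configs -P
```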


So as of a few days ago, Google changed something and you can't do much server-side copying (the command you mentioned) any more; you will keep getting errors.

But this is not a lockdown on your whole account that prevents you from doing anything else, so don't confuse the two.

A lockdown or ban is when your account won't allow any downloads for around 24 hours: you can sign into your account, but you can't download anything. The link you mentioned describes a way to sort of bypass that, but it only works per file, which isn't practical when you have a large library.

But if you want to clone your account, which we used to do with rclone copy, you can't use that for now, unless Google changes it back or something else happens. You can use another site instead and sync your two drives together; it's not as fast as server-side copy, but it does the job.

Well, that explains why it locks up after a short while. They might be getting mad at us XD taking advantage of the "unlimited".
Thanks @Danial_Hanafian
I was more just wondering if anyone had an idea for fast cloning, because as we all know, redundancy is a good thing when dealing with clouds and unreliable things like ACD.

I wonder if ACD resets the bandwidth meter for users every month, or how they keep track and decide limits. In other words: how often can I copy my whole cloud without risk of being banned?

I don't know about ACD, but with Google you don't get banned for copying your data; they just rate-limit it. The only thing that gets your Google account banned for 24 hours is rclone mount, so don't use it with Google (especially the combination of Plex and rclone mount).

I use it with mount and it seems fine except when scanning. That's kind of why I had the idea of using one Google Drive for scanning and the other for watching, or even doing load balancing with the mounts: scan from both at once so it doesn't trigger the 24-hour lock. And I'm thinking of just moving away from ACD, as I have a bad feeling they are trying to do something funky, and that's what's going on with the breakage.
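The split-mounts idea above could be sketched like this. The remote names and mount points are hypothetical; it assumes you've set up two separate Google Drive remotes in rclone that both see the same library.

```shell
# Sketch: mount two Google Drive remotes at separate mount points,
# one dedicated to Plex library scans and one to playback, so scan
# traffic and watch traffic hit different accounts.
# "gdrive_scan:" and "gdrive_watch:" are hypothetical remote names.
rclone mount gdrive_scan:media  /mnt/scan  --read-only &
rclone mount gdrive_watch:media /mnt/watch --read-only &
```

In Plex you would then point the library at /mnt/scan and play from /mnt/watch (or vice versa), so a heavy scan only ever hammers one account.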

Well, I moved away from ACD a few months ago.

As for Google: I used to use node-gdrive-fuse, which has good performance but doesn't pick up file changes, so each time you have to delete the cache and remount, which takes a few minutes (depending on your library size). Then I moved to ocamlfuse, which has been working well so far (though its performance isn't as good), but it's been acting weird recently: it doesn't show some of my files. They are there, and I can play them in Plex, but I can't see them; the folder appears empty, so Plex won't refresh it. Now I'm trying something else, and I hope I find something stable.

Just rclone mount to Google gives amazing speeds for things.
The only drawback is that during a scan you get locked out; that's the only downside I've found.

If I had the money, I'd just buy a big storage server and install my Plex on that. But they are not cheap, and bandwidth even less so, so even if I could afford a big server I would lack the bandwidth.

rclone doesn't keep a good listing of your files, and because of that, when Plex searches for something, each file gets downloaded over and over; that's the reason you get the 24-hour ban.

It has more to do with opening the video to check quality, duration, codecs, and other info like that. If it were just the name, it would be fine; that's why Emby does fine when scanning, since it doesn't open one file five times to add it.
I don't think rclone mount is to blame for that. Plex should only need the filename, with deep file processing as an option, but that's too much to ask of Plex XD

It's not Plex's fault, because all the other options I mentioned don't cause any ban from Google, and if you read the forum you'll find there's a bounty asking for this problem to be fixed: rclone doesn't keep a list of the files in the storage, so each time Plex scans, rclone downloads the files.

The other options keep a list of the files, so Plex uses that; the listing is read only once and nothing bad happens.
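For what it's worth, rclone later grew directory-caching options for mounts that address exactly this listing problem. A hedged sketch, assuming a newer rclone release and a placeholder remote name:

```shell
# Sketch: mount with a long directory-cache lifetime so Plex scans read
# cached listings instead of re-listing (and re-opening) files on the
# remote each time. "gdrive:" and the mount point are placeholders.
rclone mount gdrive:media /mnt/media \
  --read-only \
  --dir-cache-time 72h \
  --vfs-cache-mode minimal &
```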

You can share your entire library with the other account and then, from the other account, select "Add to Drive". Then you can do a server-side clone in rclone like this:

rclone sync remote1:/source_folder remote1:/new_folder

All within the same remote, after you share. I've cloned to 3 Google Drives with this method; I cloned 8 TB without issue.


Wow, that's an awesome tip! How long does it take to clone 8 TB? Also, do you add any other options to the command (e.g. --transfers=, --checkers=, etc.)?

It still counts against the file reads for the origin account.
But when did you do it: before they made it harder, or early on?

I used about 50 checkers/transfers. I don't remember exactly how long, but I think it was around 6 hours or so.
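Putting the shared-folder command and those flag values together, the full invocation would look something like this. The folder names are placeholders; the flag values are the ones mentioned above, not tuned recommendations.

```shell
# Sketch: server-side sync within one remote after the "Add to Drive"
# share trick, with the concurrency mentioned in this thread.
# --transfers: number of files copied in parallel.
# --checkers:  number of files compared in parallel.
rclone sync remote1:/source_folder remote1:/new_folder \
  --transfers 50 \
  --checkers 50 \
  --progress
```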

I still keep them in sync today using this, but I started doing it a month back.

Well, I tried the same way and it locked up after about an hour, after moving around 3 TB at around 5 Gbit/s.