How to know if 24-hour banned from Google Drive?

Simple question: how do you know if you’ve been 24-hour banned from Google Drive?

I’m suddenly unable to play any videos from my encrypted GDrive folder mounted on my server. I’m trying to troubleshoot this but suspect I may have been 24-hour banned. How do I know if this is the case?

I can access the web interface for the Google Drive and everything else on it seems to be working as normal. Just can’t play videos from it. In Plex I get “Check permissions for file.” These are files that played consistently until recently and permissions are 644.

Try downloading one file through the mount; in the log file you would get a 403 Forbidden error.

p.s. When your account is locked you won’t be able to play any of the files, but you can still access and list them.
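A quick way to check this from the server shell (the mount point, file name, and log path below are placeholders for your own setup):

```shell
# Listing usually still works during a ban...
ls /mnt/gdrive/Movies | head

# ...but reading file contents fails:
cp /mnt/gdrive/Movies/Filmname.mkv /tmp/   # "Input/output error" when banned

# If you mounted with --log-file, look for 403 errors in the rclone log:
grep -i "403" /var/log/rclone.log | tail
```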

When I try to copy from the mounted folder to another folder on my server, I get:

cp: cannot open ‘Filmname.mkv’ for reading: Input/output error

When I try to download a file from within the Plex web interface, I get

500 Internal Server Error

Is there someplace else I should be looking? Or is this consistent with a ban?

I suspect it’s a 24 hour ban because everything was working fine and then suddenly nothing was (except listing), even after an unmount/remount. But there’s no warnings of a ban when I go to Google API dashboard.

In this case I think you have to try to unmount and remount your resource.

It still happens after unmount and remount.

To see if I’ve been banned, I go to my Google Drive and try to download a big file. If I can download it, I’m not banned; if it says something like I’ve exceeded the daily limit, I’ve been banned.
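The same check can be done with rclone directly, bypassing the mount (the remote name `gdrive:` and the file path here are just examples):

```shell
# Try to pull one large file straight from the remote with verbose logging.
rclone copy -vv "gdrive:Movies/SomeLargeFile.mkv" /tmp/bantest/

# While banned, the log shows an HTTP 403 error (the Drive API reports a
# reason such as downloadQuotaExceeded); otherwise the copy proceeds normally.
```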

It’s an account lock for sure.

Going to drive.google.com, I was able to download a small (non-rclone-related, non-encrypted) file. It downloaded fine but a message appeared in the lower left of the browser window that said “You are offline. Some functionality may be unavailable.”

Then I tried to download a 600MB file encrypted by rclone.

Got this message:

I guess that’s the 24-hour lock. Anyone know what the actual quota is?

Yes, with this message you have been banned.

It should be much more than 600 MB. Today I got my first ban, but I’d been transferring all my media with a VPS all day long (about 6 TB, 1 TB of it encrypted) and updating my whole Plex library (700 movies and 100+ TV shows).
I’ve seen on a website that the limit is about 1,000,000,000 requests, but I can’t confirm that.

I don’t think it’s simply a matter of transfer quantity. When I was banned (it definitely was a ban), it was just from repeatedly seeking in various files using Plex or VLC. The quantity of data transferred was minimal. A billion requests seems fanciful, however. I got banned for what couldn’t possibly be anything close to that.

They ban when max API requests are reached, regardless of data transfer.
With Google Drive I would strongly suggest not using sync but copy/move with the --no-traverse flag.

Without the --no-traverse flag, when you copy files to your remote, rclone will perform a full scan of all files on GDrive, and once you have tons of them that will automatically lead to an account lock.
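For example (the `gdrive:` remote name and local paths are placeholders):

```shell
# Copy only the new items without listing the whole remote first:
rclone copy --no-traverse /local/staging/Movies gdrive:Movies

# Or move, which likewise avoids the full remote scan:
rclone move --no-traverse /local/staging/TV gdrive:TV
```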

Before I figured it out, I had tons of problems when I was making a copy of my Amazon Drive to Google Drive.
My ACD stats:

When you’re using the --no-traverse option, are you copying from the root directory or specifying down to a more specific directory? And if you’re copying the root directory, does it just copy everything again?

Just wondering, as I’ve been playing around with some scripts to keep my ACD and GDrive in sync. My ACD is encrypted, so it’s mounted on my VPS and then decrypted in another directory. Usually, I sync that directory to GDrive down to the specific movie/TV series folder (~/acdDecrypted/Movies/TitleA syncs to ~/GDrive/Movies/TitleA) and (~/acdDecrypted/TV/SeriesA/Season1 syncs to ~/GDrive/TV/SeriesA/Season1). Using the copy command will leave duplicates in the GDrive directories, so that’s why I use sync. I used to sync straight from the root directories (~/acdDecrypted syncs to ~/Gdrive) but obviously that’s overkill when I can pass only the changed directories to the sync script.
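That per-directory approach looks roughly like this (local paths and the `gdrive:` remote name are illustrative):

```shell
# Sync only the subtree that changed rather than the whole root:
rclone sync "$HOME/acdDecrypted/Movies/TitleA" "gdrive:Movies/TitleA"
rclone sync "$HOME/acdDecrypted/TV/SeriesA/Season1" "gdrive:TV/SeriesA/Season1"
```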

I’m still getting the 24-hour lock mostly every day, but I also have been downloading/syncing a lot, and each time a sync runs, my Plex server scans the respective library. So perhaps that’s the cause, but the limit still seems arbitrarily low.

The only other alternative I’ve tried is having a script monitor my PMS logs with tail, and anytime it throws a “file cannot be opened” error, it kicks off a script that swaps my media directory from the GDrive mount to the ACD mount, but this isn’t all that graceful because it only triggers when somebody “clicks” a movie/show and Plex says “unavailable” so they have to try again after the script is triggered. I’m wondering if there’s a more graceful way to accomplish this, like having a small text file stored on GDrive that the script can attempt to cat at an interval of something like 10 minutes, and if it’s unsuccessful it will swap mounts. I’m just worried that accessing that text file would just push me over the request limit for GDrive.
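A minimal sketch of that probe idea, assuming the mount points and probe filename below (all placeholders for your own setup):

```shell
#!/usr/bin/env bash
# Pick the media root by probing a tiny text file on the primary mount.
# One small read every ~10 minutes is a negligible number of API requests.
PRIMARY=/mnt/gdrive
FALLBACK=/mnt/acd
PROBE=healthcheck.txt

pick_media_root() {
  # $1 = preferred mount, $2 = fallback mount
  if cat "$1/$PROBE" > /dev/null 2>&1; then
    echo "$1"
  else
    echo "$2"
  fi
}

MEDIA_ROOT=$(pick_media_root "$PRIMARY" "$FALLBACK")
# ...then repoint Plex's library path (e.g. via a symlink) at "$MEDIA_ROOT".
```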

EDIT

Well, I tried the text file, and I can cat it right now from the VPS even though my GDrive is still banned. In the web console of GDrive, attempting to download a media file still results in “daily quota exceeded,” so it must only enforce the “ban” on larger files. Does rclone log access errors on a consistent basis, or only when somebody attempts to play something?

A full library scan each time you add something?

My method to avoid the ban, which has been working very well since I started using rclone, is to have a script that does the rclone copy or move and then generates a txt file that I use to launch the Plex scanner with the directory flag, so it only updates the shows/movies added. I haven’t had a ban since, and I can sometimes add 10 to 20 episodes within the same hour without getting banned.

I did, however, get banned when I did the same process while moving content from one library to another. First I moved everything on GDrive, then I generated a list that I used with the Plex scanner to target only the moved files. The script launched the scanner, then slept for 5 minutes before doing the next file. The result: after running the scan on 200 items, I got banned. This was yesterday at 1:50 pm and it was lifted this morning, but I don’t know at what time. The good news is that even while banned you can still scan media into the library.
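A sketch of that targeted-scan loop, assuming the stock Linux install path of the Plex Media Scanner and an example section ID (adjust both for your setup):

```shell
#!/usr/bin/env bash
# Feed each folder from a .list file to the Plex Media Scanner so that
# only the changed directories are scanned, not the whole library.
SCANNER="${SCANNER:-/usr/lib/plexmediaserver/Plex Media Scanner}"
SECTION="${SECTION:-1}"        # library section ID (example value)
SCAN_DELAY="${SCAN_DELAY:-5}"  # seconds to wait between items

scan_folders() {
  local list="$1"
  while IFS= read -r dir; do
    [ -n "$dir" ] || continue  # skip blank lines
    "$SCANNER" --scan --refresh --section "$SECTION" --directory "$dir"
    sleep "$SCAN_DELAY"
  done < "$list"
}

# Usage: scan_folders /path/to/folders.list
```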

Wow I wasn’t aware you could have Plex scan deeper in like that. Can you provide an example of the script? Are you using the API function with curl or something? I’d appreciate seeing how you’re doing this.

This is my script on the Plex server. I still need to upload the script that generates the folder list; the one below won’t work if you don’t provide .list files that contain the desired folders, one per line.

https://github.com/ajkis/scripts/blob/master/plex/plexrefreshfolderlist

Here is mine in python. It also needs a txt file to work

This is the first one…
Note: check and recheck it, as I just cannibalized it from my script.
https://github.com/ajkis/scripts/blob/master/plex/plexgeneratefolderlist

Thanks so much for the examples, guys; those helped me get my syntax correct on my existing scripts. This appears to be a much better approach. How does it work, though, with new media, i.e. new series which don’t exist in Plex yet, or new movies?

This is very interesting but what about when the files are not in their own subfolders? How does that work? If those files get “rclone moved” you can’t have Plex scan against them if they are in the root Movies folder for instance.

I am doing the following to put bare files in their own individual folders after the move on a rw rclone mount:

 for i in /home/redacted/gacd/Movies/*.mkv
 do
 d="${i%.mkv}"              # full path minus the .mkv extension
 mkdir -p "$d" && mv "$i" "$d"
 done

Can this need somehow be incorporated into your workflow, @Ajki? I feel we are so close to a perfect solution here. By the way, I don’t use any unionfs or overlayfs; I only want to serve from the cloud.

Thanks for any feedback you can supply.