What is the maximum download quota for an Unlimited Google Drive account per day?

After syncing ~7TB from one Google Drive account to another Drive account, I am unable to download anything from the Google Drive account I was syncing from. I did a bit of research, and the only information I could find was that there is a limit of 400GB that can be downloaded per day. That figure is obviously not accurate in my case, so I was wondering if anyone has experience with exceeding the download quota for Google Drive.

Is there a concrete maximum that I can download from Google Drive in one day, and does the quota reset after 24 hours?

Maybe this applies: https://support.google.com/a/answer/1071518?hl=en&ref_topic=6013516


I was syncing 35GB per minute, so I am pretty positive that is not accurate.

I see… guess those limits aren’t for Google Drive then :slight_smile:

I don’t know whether there is a maximum or not, but I’ve heard from other rclone users that Drive can stop working if you thrash it too much; it starts working again after a while. Sorry to be a bit vague!

I’m facing the same issue with the same rate-limited downloads of large files (small text files didn’t have this issue) after hitting an unknown threshold. The situations where I saw this happen were:

  • using rclone copy on one host to upload an entire directory (around 1TB total, which ran through the night). I used rclone copy without any options.

  • using rclone mount with a crypt remote and the following options: allow-other, dir-cache-time=2m.
    With the mount, I had Sonarr go in, mass-rename the files, and move them into the correct season folders (see the command sketch below).
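
A rough sketch of those two invocations, with placeholder paths and remote names (treat them as assumptions, not my exact setup):

    rclone copy /local/media gdrive-crypt:media

    rclone mount gdrive-crypt: /mnt/media \
        --allow-other \
        --dir-cache-time 2m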

I was using rclone 1.33 beta 85

Any recommendations on options I can use to fine-tune things and prevent this?

It’s interesting you mention running into the same issue after using Sonarr to mass-rename files. The first time this happened to me, I had synced 7TB of content to another Google Drive; at the same time, I was scanning some sections of my Plex library with an rclone mount, with no directory cache time set. This was my first time using rclone mount, but admittedly, it was also my first time syncing more than 2TB in a single day.

When the quota error lifted, I began scanning new sections of my library using rclone’s mount feature, and after using ~500GB of bandwidth, I again reached a quota. So, is there a download limit? Or perhaps an API limit which, when exceeded, locks downloading from Google Drive?

Is it possible that the way rclone handles fetching directories contributes to the problem?

The only limit rclone has at the moment is --bwlimit, though there is an issue about a files-per-second limit: https://github.com/ncw/rclone/issues/485 - perhaps an API-calls-per-second limit might be better.
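
For example, a minimal throttled sync, assuming a remote called gdrive: (the cap is illustrative):

    rclone sync /local/media gdrive:media --bwlimit 8M    # limit to 8 MByte/s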

Is there an API or URL from Google where you can see how much bandwidth is still available, or how long you’re banned for?

Not AFAIK; Google does not seem to share info on the limits.

Very sad. Thanks for the quick answer!

I have read around the web that the limits are 750GB upload per day and 10TB download per day.

Do you have a direct link to where you read that?

I do not, but even if I did it would just be someone else’s word. The 750GB-per-day upload limit is pretty well known and proven at this point. I have seen heavy users quote 10TB per day for downloads.

I can confirm 10TB download per day.
There is also a quota on the number of times you can access the same file, but I don’t have data on that one.
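
If you want rclone itself to stop before exhausting the upload cap, newer releases have a --max-transfer flag; a sketch with a placeholder path and a margin under 750GB:

    rclone move /local/media gdrive:media --max-transfer 740G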

Can’t wait for the caching to work and eliminate plexdrive altogether. @jpat0000 btw, we have a script that deploys on Ubuntu and configures a startup script with bandwidth controls and proper upload settings so it doesn’t trigger API bans. rclone has been great :smiley:
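
A hypothetical sketch of that kind of upload script (paths, remote name, and limits here are placeholders, not the actual plexguide script):

    #!/bin/sh
    # Move local media to the crypt remote, throttled so one run
    # cannot burn through the daily upload quota in a burst.
    rclone move /local/media gdrive-crypt:media \
        --bwlimit 8M \
        --transfers 4 \
        --log-file /var/log/rclone-upload.log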

Hey AJ1252, been well? Still running that rclone setup we configured at https://plexguide.com?

But there are still API limits, right? If I download, say, 10TB in large files, it doesn’t cause any problems, but if I download 10TB in many small files, that means far more requests to the API.
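
One mitigation I’ve seen suggested for the many-small-files case is to lower concurrency so fewer API requests are in flight at once; a sketch with illustrative values (--tpslimit only exists in newer rclone releases):

    rclone copy /local/small-files gdrive:archive \
        --transfers 2 \
        --checkers 4 \
        --tpslimit 5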

I just hit the 750GB daily upload limit, despite uploading from a Google Cloud Compute persistent disk to Google Drive. I understand the bandwidth limit, but it’s a bit odd to apply it to files moving within what is presumably the same building.

Also, as a test, I uploaded a file via the web interface and it worked just fine. Weird. Does rclone have any way to access Google Cloud Compute persistent disks? Maybe if I ran rclone on my own computer instead of Google’s, it would let me transfer between the cloud disk and the Drive disk that way?

This doesn’t appear to allow access to Google Cloud Compute persistent disks, though:
https://rclone.org/googlecloudstorage/
Or at least the lsd command returns nothing. Is rclone unable to access Google Cloud Compute persistent disks?
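
For reference, this is roughly what I tried (the remote name gcs: is just what I called the Google Cloud Storage remote in rclone config):

    rclone lsd gcs:    # should list buckets, but returned nothing for me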

rclone can access Google Cloud Storage - that is different from persistent disks, though. I think you’ll need to mount the persistent disk to get the data on or off it.
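
Something like this sketch, assuming the persistent disk shows up as /dev/sdb on the VM (device name, mount point, and remote are assumptions):

    sudo mkdir -p /mnt/pd
    sudo mount /dev/sdb /mnt/pd
    # then copy from the mounted disk to Drive as usual:
    rclone copy /mnt/pd/media gdrive:media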

Sorry to resurrect an old thread, but can you tell me where you got the info regarding the “quota on how many times you can access a file”? I think I ran into this problem myself.