After syncing ~7TB from one Google Drive account to another, I am unable to download anything from the account I was syncing from. I did a bit of research, and the only information I could find was that there is a limit of 400GB that can be downloaded per day. That doesn't match what I'm seeing, so I was wondering if anyone has experience with exceeding the download quota for Google Drive.
Is there a concrete maximum that I can download from Google Drive in one day, and should the quota reset after 24 hours?
I don’t know whether there is a maximum, but I’ve heard from other rclone users that Drive can stop working if you thrash it too much, and that it starts working again after a while. Sorry to be a bit vague!
I’m facing the same issue with the same rate-limited downloads of large files (small text files weren’t affected) after hitting an unknown threshold. The situations where I saw this happen were:
using rclone copy on one host to upload an entire directory (around 1TB total, run overnight). I used rclone copy without any options.
using rclone mount with a crypt remote and the following options: allow-other, dir-cache-time=2m
With the mount, I had Sonarr mass-rename the files and move them into the correct season folders.
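For reference, the mount setup described above would look roughly like this (the remote name and mount point are hypothetical; only the two listed options are from the post):

```shell
# Mount a crypt remote with the options mentioned above.
# "gcrypt:" and /mnt/gdrive are placeholder names.
rclone mount gcrypt: /mnt/gdrive \
  --allow-other \
  --dir-cache-time 2m
```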
I was using rclone 1.33 beta 85
Any recommendations on options I can use to fine-tune things and prevent this?
It’s interesting you mention running into the same issue after using Sonarr to mass-rename files. The first time this happened to me, I had synced 7TB of content to another Google Drive; at the same time, I was scanning some sections of my Plex library through an rclone mount, with no directory cache time set. This was my first time using rclone mount, but admittedly it was also my first time syncing more than 2TB in a single day.
When the quota error lifted, I began scanning new sections of my library using rclone’s mount feature, and after using ~500GB of bandwidth I hit a quota again. So, is there a download limit? Or perhaps an API limit which, when exceeded, locks downloading from Google Drive?
Is it possible that the way rclone handles fetching directories contributes to the problem?
The only limit rclone has at the moment is --bwlimit, though there is an issue about a files-per-second limit: https://github.com/ncw/rclone/issues/485 - perhaps an API-calls-per-second limit might be a better fit.
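As a sketch of what throttling looks like in practice (remote and paths are hypothetical): --bwlimit caps bandwidth, and later rclone releases added --tpslimit to cap API transactions per second, which is closer to the limit discussed in that issue.

```shell
# Cap transfer bandwidth at 8 MBytes/s (placeholder remote "gdrive:").
rclone copy /local/media gdrive:media --bwlimit 8M

# In later rclone versions, limit API calls to ~10 transactions/second.
rclone copy /local/media gdrive:media --tpslimit 10
```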
Can’t wait for the caching to work and eliminate plexdrive altogether. @jpat0000 by the way, we have a script that deploys on Ubuntu and sets up a startup script with bandwidth controls and proper upload settings to prevent API bans. rclone has been great.
I just hit the 750GB daily upload limit, despite uploading from a Google Cloud Compute persistent disk to Google Drive. I understand the bandwidth limit, but it’s a bit odd to apply it to files moving within what is presumably the same building?
Also, as a test, I uploaded a file via the web interface and it worked just fine. Weird. Does rclone have any way to access Google Cloud Compute persistent disks? Maybe if I ran rclone on my own computer instead of Google’s, it would let me transfer between the cloud disk and the Drive disk that way?
This doesn’t appear to allow access to Google Cloud Compute persistent disks though: https://rclone.org/googlecloudstorage/ - or at least the lsd command returns nothing. Is rclone unable to access Google Cloud Compute persistent disks?
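One point of confusion here, sketched below with hypothetical remote names and paths: the googlecloudstorage backend talks to Cloud Storage buckets, not persistent disks. A persistent disk is block storage, so once it is mounted as a filesystem on the VM, rclone can simply read it as a local path.

```shell
# A GCS remote lists *buckets* - persistent disks will never show up here.
rclone lsd gcs:

# A persistent disk mounted on the VM (e.g. at /mnt/disks/data) is just a
# local filesystem to rclone running on that VM:
rclone copy /mnt/disks/data gdrive:backup
```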