Gdrive Error 403: download quota for this file has been exceeded

What is the problem you are having with rclone?

I am unable to sync/copy/download files from one team drive to another team drive. I am getting the errors 'file has been downloaded too many times' and 'user rate limit exceeded'; however, this has been happening for over a week, so I don't think it is a simple bandwidth issue.

I started getting this issue when I upgraded to a different server; however, some friends are also having a similar issue, and I have found a few posts on Reddit describing something similar. I should note that I have been using this setup for several months and it has suddenly broken.

The two remotes are two different team drives on different accounts. I have made sure that both accounts have access to each team drive. I have also set up different API clients for each remote, so there should not be any issues there.

I realised that the backup account is not able to download directly from the main drive (download quota exceeded); however, I am able to download the same files using the main drive account. As a quick fix I have set up a remote pointing to the backup drive using the main account, but this uses the bandwidth of the main account, which I use for other purposes, so this is not ideal.
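For reference, the workaround remote would look roughly like this (the remote name, token, and drive ID are placeholders, not the real config):

```shell
# Hypothetical rclone.conf entry (names/IDs are placeholders):
# the backup team drive, but authorized with the MAIN account's
# token, so downloads count against the main account's quota.
#
# [gdrivebackup-main]
# type = drive
# scope = drive
# token = <main account token>
# team_drive = <backup team drive ID>

# Sync main drive to backup entirely through the main account:
rclone sync gdrive: gdrivebackup-main: -P
```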

What is your rclone version (output from rclone version)

rclone v1.53.1
- os/arch: windows/amd64
- go version: go1.15

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Windows 10, 64 bit, and Ubuntu Desktop 20.04

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone sync gdrive: gdrivebackup1: -P -vv --exclude "907uc5jm8tvv25liof7onp7qlc/**"

I no longer have the mount script I had been using, as I lost it when I migrated servers; however, I was using the old cache remote with crypt (gdrive -> cache -> crypt) and did not have any issues with it for the few months it was in use. In the past couple of days I have been trying out the new rclone VFS cache mode full, but this does not seem to have fixed anything.
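Since the original mount script is lost, a typical mount command using the new full VFS cache would look something like this (the crypt remote name, drive letter, and cache limits are assumptions, not taken from the config below):

```shell
# Hypothetical reconstruction of the lost mount script: mount the
# crypt remote with the full VFS cache. "gcrypt:" and "X:" are
# assumed names; cache limits mirror the old cache remote's 50G.
rclone mount gcrypt: X: --vfs-cache-mode full --vfs-cache-max-size 50G --vfs-cache-max-age 48h
```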

The rclone config contents with secrets removed.

type = drive
scope = drive
token = 
team_drive = 0AMflHQzZwgvQUk9PVA
client_id = 
client_secret = 

type = drive
client_id = 
client_secret = 
scope = drive
token = 
team_drive = 0AI9FcPMMePaFUk9PVA
root_folder_id = 

type = cache
remote = gdrive:
chunk_size = 5M
info_age = 2d
chunk_total_size = 50G

type = crypt
remote = gcache:
filename_encryption = standard
directory_name_encryption = true
password = 
password2 = 

A log from the command with the -vv flag

You can only copy 750GB per day.

Thanks for the response.

My issue is that I have not been able to copy files from the main drive to the backup drive for over a week, so I'm not sure this is because of the 750GB limit. I was planning on making this thread a few days ago, but I wanted to make sure that this wasn't because of the limit.

There are share download limits as well.

You'd have to wait for that to reset.

To enable server-side copying, you might want to add this:
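The flag being suggested is presumably rclone's `--drive-server-side-across-configs` (an assumption, since the original snippet did not survive the thread), which asks Google to copy between two differently-configured Drive remotes server-side:

```shell
# Ask Google Drive to perform the copy server-side even though the
# source and destination remotes use different configs/accounts.
rclone sync gdrive: gdrivebackup1: -P -vv --drive-server-side-across-configs --exclude "907uc5jm8tvv25liof7onp7qlc/**"
```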

Is this something new? Because I have never had this problem, and this has been set up for almost a year.

Nope, the download/upload limits have been pretty much the same for years. People were talking about some service account changes, but I don't use them myself, so I don't know.

Yeah, I don't use service accounts. Roughly how long until this bandwidth limit is lifted? I'll stop all transfers through that remote and see if it gets fixed.

I get the same error with server-side copying. From what I remember, Google has different bandwidth limits for server-side copying and uploading. I'll give it a few days of not doing any transfers and see if my issue gets fixed.

Sorry, I was not clear.

Yes, that flag will not work around the quota limit.

I just want you to know that you can do a server-side copy and not use local bandwidth to download/upload the files.

Yeah, I normally use server-side copy, but since I've been getting this error I've been messing around to find a fix. I omitted it from the above command for simplicity's sake.

Hey dude, I have been having this issue for about a month now. Previously I was able to copy 750 GB/day fine without any problem; now I can't even copy 1 GB of data. Any update on your end on this issue?

After I made my last post it seemed to be working fine, until last night. Yesterday I was adding a lot of media to my Plex server, and I guess with the intro detection it was downloading a lot and caused the bandwidth issue. I believe that something similar happened last week when I first had the issue - I had added a lot of media and used Plex to read the metadata from the files instead of getting it elsewhere. This must have downloaded the files, causing a bandwidth issue.

What has confused me is that I'm getting a bandwidth issue on my backups as well as my main drive. A few months ago, if I got a bandwidth error on my main drive, I would just mount the backup and everything would be fine; now, however, the backups seem to be linked to the main drive. Whether this is because of the way Google deduplicates content on Drive, I don't know. I use crypt, so technically I should be the only one with that file stored in all of Drive.

It seems like I'm getting throttled per directory rather than per file. For example, last night I could not download anything from the 'TV Shows' folder, yet the 'Youtube' folder in my drive was fine.

In terms of fixing your problem, I can only suggest waiting it out and hoping that it gets fixed; I didn't do anything special or particular the first time. If you're also using Plex (or an alternative), I'd recommend turning off extras like intro detection, or at least limiting them to scheduled maintenance, as this seems to have caused my issues.

It seems like I'm getting throttled per directory rather than per file. For example, last night I could not download anything from the 'TV Shows' folder, yet the 'Youtube' folder in my drive was fine.

My experience has been that it is either per file or per account, but some very small files can still be downloaded. Maybe the ones in the Youtube folder were below the size threshold?

Perhaps that was the case; as it stands, I'm getting a download quota error for all of the files in this drive now. I'm not sure what I did last week to fix the issue. I've not made any transfers to or from this remote for a few days now and I'm still getting bandwidth issues. I'm 90% sure Google has changed something behind the scenes, because I've not done anything different in the past 9 months.

I am having the same issue. Even on new files I upload just to test, I get 'user rate limit exceeded' when trying to server-side copy from my team drive to my backup team drive. I can't watch certain files on Plex either, some of which I was previously able to view. I've copied TBs of data before with zero issues, and now, even though I haven't copied any data in over a month, I'm suddenly getting this weird issue. At first I thought it was an rclone issue, but I think this is something Google has changed behind the scenes.

Do you have your drive encrypted using the crypt remote? I'm sure that this isn't a fault of rclone but something Google has changed in Drive behind the scenes; we are not the only ones having this issue, and I can't find a way to fix it other than waiting and hoping for the best.

No, but I did figure something out. Not sure if it'll help you at all.

So the team drive I'm using is "owned" by email "A", and I usually upload things with that email. However, I've noticed that all of the files which aren't working were uploaded with email "B", since I sometimes upload files to that team drive with whatever email address I happen to be logged in with at the time.

If I delete the files which aren't working and reupload them with email "A", they suddenly work. If I delete that same file and upload it with email "B", it no longer works on Plex, which is weird since both accounts have the same managing rights. And like I said, files which used to work are now not working. So I don't know what changed, but anything uploaded to my team drive with my other email isn't being read properly. Rclone has access to both accounts, so I don't know what's happening.

For now my only workaround is to upload everything with my "A" email address.
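That delete-and-reupload workaround could be scripted roughly like this (the remote name "teamdriveA", path, and filename are hypothetical; "teamdriveA" stands for a config authorized with email "A"'s token):

```shell
# Hypothetical sketch: replace a broken file (originally uploaded
# by email B) with a copy uploaded through the remote authorized
# as email A, so the new file is owned by email A.
rclone deletefile "teamdriveA:TV Shows/broken-episode.mkv"
rclone copy "/local/media/broken-episode.mkv" "teamdriveA:TV Shows/"
```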

Are you using your own business or enterprise G Suite account, or is it an edu-account team drive?

Yeah, I've noticed something similar, where I can only copy to a different drive using a certain account, even though the bandwidth-limited accounts haven't actually hit their bandwidth limits.