GDrive 403 downloadQuotaExceeded with Sync to TeamDrive(Shared)

What is the problem you are having with rclone?

I have an old, otherwise inaccessible edu Gsuite shared folder in my personal GDrive (MainDrive:/) that I am syncing using multiple destination GDrive accounts linked to a Team Folder in the new Gsuite destination (help1:/). It runs great: I get about 235 MB/s, exhaust the daily 750GB upload limit in about 40 minutes, then move on to help2, help3, etc. (roughly the loop sketched below).
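For reference, the rotation is essentially one run per helper remote, each capped at the upload quota. This is a rough sketch of the workflow rather than exactly what I type; --max-transfer 750G is just one way to make each run stop at the cap, and the remote names come from my config:

for i in $(seq 1 10); do
  rclone sync --progress --drive-chunk-size 64M --fast-list --ignore-existing --size-only \
    --transfers 10 --max-transfer 750G \
    "MainDrive:/Chris' Movies/Chris' Movies MP4" "help$i:/MP4"
done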
But after around 2TB of transfer it begins to fail with 403 downloadQuotaExceeded, and I cannot download any files from the edu Gsuite shared folder (MainDrive) with rclone or the GDrive GUI. I am using my own API Client ID and it is not showing many API hits. Does my command need some more flags to work without this error? I have 10 "helper accounts" trying to transfer 57TB, and I want to max out the 10TB daily download limit to get this done ASAP in case the edu GSuite account is deleted out from under me.

What is your rclone version (output from rclone version)

v1.52.1

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Ubuntu 18.04.4 LTS on a Google Compute Engine

Which cloud storage system are you using? (eg Google Drive)

edu Gsuite shared folder added to my personal GDrive, transferring to a new Gsuite (Team Folder)

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone sync --progress --drive-chunk-size 64M --fast-list --ignore-existing --size-only --verbose --transfers 10 "MainDrive:/Chris' Movies/Chris' Movies MP4" "help1:/MP4"

The rclone config contents with secrets removed.

Paste config here

A log from the command with the -vv flag

Paste log here

Those errors mean you hit your quota for the day. Using things to get around it is not recommended.

You'd have to wait for your quota to reset and continue.

There are API limits, and those are insanely high. The documented quotas are 10TB down and 750GB up per day. There are many other 'undocumented' limits per file, per share, etc., and all of those reset at the same time.

The annoying part is that all of those errors show up as 403s.

There are literally no other services or programs accessing these files on Google Drive, since the share is private, so the 403 doesn't seem to come from too many downloads of a single file, which is a common issue. It looks like one of the 'undocumented' limits you mentioned. Is there any evidence that lowering the number of transfers (currently 10) or throttling the speed (currently about 235 MB/s) would help, or do you think it is simply a hard quota, which judging from my problem is around 2TB per day from a "shared folder", no matter how fast or slow you transfer? I could of course throttle the transfer to stay under 2TB in a 24-hour period, but just doing 2TB in 90 minutes each day works too, and since my Google Compute instance bills by "on" hours, less time on would be cheaper. Any input is appreciated. Thanks.
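(If I do end up throttling: my back-of-the-envelope math says 2 TB spread over 24 hours is roughly 24 MiB/s, so something like the sketch below. The 24M figure is my own guess at the ceiling, not a documented number.)

rclone sync --progress --transfers 10 --bwlimit 24M \
  "MainDrive:/Chris' Movies/Chris' Movies MP4" "help1:/MP4"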

I tend to stick within the 10TB download / 750 GB upload limits so not sure what else I could add.

If there are other limits on shared folders, that might be it too.

Sticking to 750GB up per day would fix the issue but adds time, so you'd have to figure out the balance or ask someone who goes around the process.
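If you want rclone to stop itself cleanly at the cap instead of retrying 403s, something along these lines might do it (a sketch; double check your version has both flags):

rclone sync --progress --fast-list --ignore-existing --size-only --transfers 10 \
  --max-transfer 750G --drive-stop-on-upload-limit \
  "MainDrive:/Chris' Movies/Chris' Movies MP4" "help1:/MP4"

--max-transfer caps how much a single run will move, and --drive-stop-on-upload-limit makes rclone exit when Google returns the upload quota error, so you can just rerun it after the quota resets.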

So, in an attempt to maximize the transfer without violating the Google Drive quotas, I tried "removing" files in the GDrive source folder, which is shared with me as read-only. I am able to remove files from the given folders in the GDrive WebGUI, but rclone ignores this. Is there a way to have rclone respect "removed" files? I tried read-only in the Drive scope and also looked through the advanced config, and nothing seems to work. I really don't want to have to build a huge exclude list for each of my rclone commands if I can avoid it.

I'm not sure what you are trying to do or what you mean by violating quotas.

Can you share the command you are running and what you want to happen and we can help out.

You have a folder shared with you from another Google Drive account. You add this shared folder and all its files to your Google Drive "My Drive". With the Google Drive WebGUI you can go to that folder and remove any files (clearly you can't delete them, since you don't own them or have write privileges); it removes the files from that folder and lets you know that others can still view them. If you have this shared folder in your "My Drive" and access it through rclone, rclone ignores the fact that these files have been "removed". I was just hoping there was a setting or flag to have rclone skip these "removed" files. I was trying to use this as a much easier way to filter down to only the files I want to keep. I am trying to copy 58TB from a read-only Google Drive share, and you said I shouldn't try to violate the Google daily quotas, so I am not.

What is a file that was removed? Deleted from the source?

If you are syncing, that keeps the source like the destination, perhaps you want to copy instead.

Interesting: starting today, July 1st, Google seems to have changed how Drive works. I think it is related to the switch to the newer "Shortcuts" instead of "adding" shared folders to your own Google Drive. Nevertheless, this makes my question moot. Plus, I just figured out a way to do what I need with the --filter-from flag. It isn't super elegant, but since this is a temporary transfer project it's not a big deal. Sorry for any trouble, and thanks for responding.
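In case it helps anyone else, the shape of what I ended up with is roughly this; the patterns are made up just to show the format, and anything not matched by an exclusion rule gets copied:

# keep.filter - rules are applied top to bottom, lines starting with # are comments
- Some Unwanted Folder/**
- *.sample.mp4

rclone copy --progress --filter-from keep.filter \
  "MainDrive:/Chris' Movies/Chris' Movies MP4" "help1:/MP4"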

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.