Google Drive Quota Help

I currently have a library of games, and a lot of people download them from Google Drive.
Recently there have been a lot of quota errors from Google, and nobody can download from the drive for 24 hours. I just started looking into rclone, as I read it could avoid quota errors on downloads.

Are there specific flags I should use when uploading to my drive that would avoid the download quota?

I am simply using rclone copy to my drive.

EDIT:

rclone v1.50.1

  • os/arch: windows/amd64
  • go version: go1.13.4

using rclone copy file remote:

Set up my own API key.
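For reference, the drive remote section in my rclone.conf looks roughly like this (IDs and token redacted, so treat the values as placeholders):

[remote]
type = drive
client_id = XXXXX.apps.googleusercontent.com
client_secret = XXXXX
scope = drive
token = XXXXX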

The error shows up in the application I am hosting; under heavy traffic it reports "error quota exceeded".

You missed most of the question template, like your rclone version, the command you are using, and the actual errors from the debug log.

Can you please include all the information from the question template?

Sorry, I just updated it.

Can you share:

What is your rclone version (output from rclone version)

felix@gemini:~$ rclone version
rclone v1.50.2
- os/arch: linux/amd64
- go version: go1.13.4

and

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone ls GD:hosts -vv

A log from the command with the -vv flag (eg output from rclone -vv copy /tmp remote:tmp)

felix@gemini:~$ rclone ls GD:hosts -vv
2019/11/29 14:38:12 DEBUG : rclone: Version "v1.50.2" starting with parameters ["rclone" "ls" "GD:hosts" "-vv"]
2019/11/29 14:38:12 DEBUG : Using config file from "/opt/rclone/rclone.conf"
2019/11/29 14:38:13 DEBUG : Wednesday-2019-11-06: Excluded
2019/11/29 14:38:13 DEBUG : rclone.log: Excluded
2019/11/29 14:38:13 DEBUG : crypt: Excluded
      227 hosts
2019/11/29 14:38:13 DEBUG : 5 go routines active
2019/11/29 14:38:13 DEBUG : rclone: Version "v1.50.2" finishing with parameters ["rclone" "ls" "GD:hosts" "-vv"]
felix@gemini:~$

Sorry, I'm a little new to rclone.

rclone v1.50.1

  • os/arch: windows/amd64
  • go version: go1.13.4

rclone ls stash123:testing123.txt -vv
2019/11/29 14:55:45 DEBUG : rclone: Version "v1.50.1" starting with parameters ["rclone" "ls" "alex:testing123.txt" "-vv"]
2019/11/29 14:55:45 DEBUG : Using config file from "C:\Users\Alex\.config\rclone\rclone.conf"
2019/11/29 14:55:46 Failed to ls: directory not found

Did you run rclone config, and as the same user you are copying with? Otherwise rclone will not find your rclone.conf. The "directory not found" error means rclone cannot find the path you gave it.

Yes, it worked; I just had to modify the parameters. Does this fix the quota issue?

D:\DesktopHDD\swHacks\rclone-v1.50.1-windows-amd64>rclone ls stash123: -vv
2019/11/29 15:20:04 DEBUG : rclone: Version "v1.50.1" starting with parameters ["rclone" "ls" "stash123:" "-vv"]
2019/11/29 15:20:04 DEBUG : Using config file from "C:\Users\Alex\.config\rclone\rclone.conf"
14689396059 NSZ/base/SDガンダム ジージェネレーション クロスレイズ プレミアムGサウンドエディション [010022900D3EC000][JP][v0].nsz
109298499 NSZ/base/SDガンダム ジージェネレーション モノアイガンダムズ[0100D7D00E5A8000][v0].nsz
2019/11/29 15:20:15 DEBUG : 9 go routines active
2019/11/29 15:20:15 DEBUG : rclone: Version "v1.50.1" finishing with parameters ["rclone" "ls" "stash123:" "-vv"]

Hi, is your bucket named testing123.txt? That looks like a typical Windows filename, not a folder.

No, that is a test file I am sending to the bucket. The bucket is called stash123:

OK, glad that you got it all working.

Oh no, but does this do anything to fix the quota issue I am facing with the bucket?

I was wondering if there is a solution for when the quota issue arises. Can I keep the same download links and swap to a different bucket account? Or are there any parameters I can add that will limit download speed and API calls to make sure the quota isn't reached (something like the flags in the sketch below, maybe)?

Anything that would let me keep the same GDrive download links and avoid the error.
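To show what I mean, here is a rough guess based on skimming the flag list; I am not sure these are the right ones for my case:

rclone copy file remote: --bwlimit 8M --tpslimit 10

where --bwlimit would cap the transfer bandwidth and --tpslimit would cap the API transactions per second.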

No. Not likely.

You say that when you get the error, downloads are locked for 24 hrs. Are all files affected when this happens, or just this file?

There are really only two possibilities here, I think. Either you have somehow managed to use up your download quota (10 TB/day), which is quite a feat if you did...

Or else you have hit the limit on how many times a specific file can be downloaded in a short time. I do not think this lock lasts 24 hrs, but it is at least a few hours. I do not know the exact details, except that it may occur if a file is shared hundreds of times in a short period. This is a pretty hard restriction to trip unless you use Google Drive as some sort of mass-distribution platform (which is against the TOS anyway). I have seen it happen by mistake, however, when certain types of software on a mounted drive keep accessing the same file again and again rapidly, so it is not impossible that it is a software issue, if you know the other causes do not apply to you.

The actual API quota is probably not something you have to think about. There is technically a 24 hr limit on API calls too, but you can't even reach it unless you have something like a dozen users sending requests 24/7 on the same API key. I don't think I have ever seen anyone exhaust this quota. Aside from that there is only a short-term burst quota of 1,000 requests per 100 seconds, so you cannot get locked out of the API for very long at a time.
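If you ever did want a single rclone instance to stay safely under that burst quota, you could cap it yourself, for example (the file and remote names here are just placeholders for yours):

rclone copy file remote: --tpslimit 10

10 transactions per second over 100 seconds is exactly the 1,000-request burst window, so anything at or below that rate can never trip it.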

Does it sound like any of these might apply to your situation?

This page will give you a lot of statistics about how the drive has been accessed, e.g. how many download requests there have been, etc. Usually you can see here if a program is misbehaving, because the numbers will be way too high compared to what you expect. (Feel free to share screenshots if it is hard to interpret.)
https://console.developers.google.com

To track down the problem, it would help a lot if you could supply us with a debug log. Add these two flags:

--log-level=DEBUG
--log-file=MyRcloneLog.txt

Try to catch the problem in the act (i.e. while downloads are failing), then show us the log file so we can analyze it (warning: the file may show the names of some files and folders being accessed). The debug log can grow very large very quickly, so you will probably have to share it via pastebin, for example.
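Combined with the copy command you mentioned earlier, that would look something like this (adjust the file and remote names to yours):

rclone copy file remote: --log-level=DEBUG --log-file=MyRcloneLog.txt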
