[felix@gemini ~]$ rclone size -vv GD: --fast-list
2019/05/23 09:46:28 DEBUG : rclone: Version "v1.47.0" starting with parameters ["rclone" "size" "-vv" "GD:" "--fast-list"]
2019/05/23 09:46:28 DEBUG : Using config file from "/opt/rclone/rclone.conf"
2019/05/23 09:46:29 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=286927808882, userRateLimitExceeded)
2019/05/23 09:46:29 DEBUG : pacer: Rate limited, increasing sleep to 1.222269795s
2019/05/23 09:46:29 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=286927808882, userRateLimitExceeded)
2019/05/23 09:46:29 DEBUG : pacer: Rate limited, increasing sleep to 2.14151141s
2019/05/23 09:46:29 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=286927808882, userRateLimitExceeded)
2019/05/23 09:46:29 DEBUG : pacer: Rate limited, increasing sleep to 4.440615196s
2019/05/23 09:46:29 DEBUG : pacer: Reducing sleep to 0s
Total objects: 266487
Total size: 63.565 TBytes (69890811208237 Bytes)
2019/05/23 09:58:08 DEBUG : 4 go routines active
2019/05/23 09:58:08 DEBUG : rclone: Version "v1.47.0" finishing with parameters ["rclone" "size" "-vv" "GD:" "--fast-list"]
403 rate limiting happens when you make too many requests per second.
Run your command with -vv and share the log.
The issue can appear as your account accumulates more objects over time; with fewer files, you wouldn't hit rate limiting.
It hasn't produced any output after several minutes.
If I'm making too many requests per second, I'm not sure how to reduce that other than with --tpslimit, which also doesn't produce any output. What do I need to do?
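For reference, a hedged sketch of throttling rclone's request rate (flag values here are illustrative, not from the thread; check `rclone help flags` for your version):

```shell
# Limit rclone to ~5 HTTP transactions per second with a small burst,
# so the per-user Drive quota is less likely to trip 403 pacer retries.
rclone size GD: --fast-list --tpslimit 5 --tpslimit-burst 10 -vv

# Alternatively, slow the Drive pacer directly (drive backend flags):
rclone size GD: --fast-list --drive-pacer-min-sleep 200ms --drive-pacer-burst 50 -vv
```

Either way the command will still run (just slower); with this many objects, expect it to take a while before the totals print.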
Btw, mine took almost 20 minutes to come back, and I have higher per-second limits than most people (I have more than double the number of objects as @Animosity022). So if you have a LOT of little files, expect to wait longer.
rclone size xxx: --fast-list -vv
2019/05/23 10:14:20 DEBUG : rclone: Version "v1.47.0-019-g3d475dc0-beta" starting with parameters ["rclone" "size" "xxx:" "--fast-list" "-vv"]
2019/05/23 10:14:20 DEBUG : Using config file from "/home/xxxx/.rclone.conf"
2019/05/23 10:14:20 DEBUG : xxx: Loaded invalid token from config file - ignoring
2019/05/23 10:14:21 DEBUG : xxx: Saved new token in config file
Total objects: 682719
Total size: 6.442 TBytes (7082535027913 Bytes)
2019/05/23 10:33:59 DEBUG : 19 go routines active
2019/05/23 10:33:59 DEBUG : rclone: Version "v1.47.0-019-g3d475dc0-beta" finishing with parameters ["rclone" "size" "xxx:" "--fast-list" "-vv"]
You had rate limiting due to the number of objects and the size command.
Those rate-limit errors are benign; rclone retries them automatically.
~250k objects take about 10 minutes.
1.2m objects take about 40 minutes.
If you want to keep using the size command, the cache backend would be a great fit: the first run would still take a long time, but depending on how often your objects change, it keeps a local database of the listing so you don't repeat all those API calls. The caveat is that only one process can use a given cache backend at a time.
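As a sketch, a cache remote wrapping a Drive remote might look like this in rclone.conf (the remote names and values here are illustrative, not from this thread; see the cache backend docs for the full option list):

```
# Hypothetical rclone.conf fragment: "GDcache" wraps an existing
# Google Drive remote named "GD".
[GDcache]
type = cache
remote = GD:
info_age = 1d             # how long cached directory listings are trusted
chunk_size = 10M
chunk_total_size = 10G
```

After the first (slow) run, `rclone size GDcache: -vv` would answer from the local cache database rather than hammering the Drive API again.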
That's a post from 2017 in which I used the wrong term.
That's from @remus.bunduc, not @ncw, so it's also incorrect, and I can submit a pull request to fix it. He actually means he gets a "download quota exceeded" error, which is a different error and also not a ban.
403s come in a few ways.
Rate limits, which we covered.
Quota items
The daily upload limits for regular uploads and server side copies
The daily download limit for files
Banned means you lose access to your entire account, e.g. for hosting pirated content.