That message says "the download quota has been exceeded" - that is Google's way of telling you that you've downloaded that file too many times, and you'll have to wait about 24 hours before you can download it again, I think.
So if I limit my requests per 100 seconds, should that solve it?
Or is there no workaround for it?
And is it possible to see the request limit for a file?
The limits are:
- 10 TB download
- 750 GB upload
- 10,000 queries per 100 seconds
- 1,000,000,000 queries per day

But what about a single file?
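On the rate-limiting side, rclone has a built-in client-side throttle that can keep you under the per-100-seconds query quota. A minimal sketch, assuming a Drive remote named gdrive: (a placeholder name):

```
# Cap rclone at 10 API transactions per second, i.e. 1,000 per 100 seconds,
# well under the documented 10,000-per-100-seconds quota.
rclone copy gdrive:some/dir /local/dir --tpslimit 10 --tpslimit-burst 10
```

Note that this only throttles the query rate; it won't help with the undocumented per-file download quota discussed below.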
The only thing Google documents is a 10 TB download and a 750 GB upload quota per 24 hours. There is also no way to see where you are on these quotas or when exactly they reset.
Everything else is undocumented, so you have no way to tell if you are hitting something along those lines.
Since it's not documented, it's hard to say. It's really trial and error as to how many times you can share something or download the same file before it's too much.
Correct me if I'm wrong, but I guess a way to fix that would be changing --vfs-cache-mode to full and also raising --vfs-cache-max-age above the default 1h? Of course, this will bring two more issues for @RobertusIT (see the sketch after this list):
1. The files will only become available once downloaded in full, which is not a big deal for small files.
2. Enough local storage to hold the files being written to disk.
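A minimal sketch of such a mount, assuming a remote named gdrive: and a mount point /mnt/gdrive (both placeholders); --vfs-cache-max-size is one way to bound issue 2:

```
# Cache whole files on disk and keep them around for a day,
# but cap the cache so it can't fill the disk.
rclone mount gdrive: /mnt/gdrive \
  --vfs-cache-mode full \
  --vfs-cache-max-age 24h \
  --vfs-cache-max-size 50G
```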
Is there any other way for rclone to write to disk besides full mode?
Thanks!
Yes, sorry, I'm also using it and wasn't thinking about it this way. Question on this: can we use the cache backend's parameters (e.g. --cache-chunk-total-size) when mounting an encrypted remote? (Gdrive > Cache > Crypt > Mount)
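For reference, a sketch of what that chain could look like in rclone.conf; the remote names (gdrive, gcache, gcrypt) and the 10G value are placeholders, not a confirmed setup:

```
[gdrive]
type = drive
# client_id / token set via rclone config

[gcache]
type = cache
remote = gdrive:
chunk_total_size = 10G    # config-file form of --cache-chunk-total-size

[gcrypt]
type = crypt
remote = gcache:secret
# passwords set via rclone config
```

You would then mount the outermost remote, e.g. rclone mount gcrypt: /mnt/secure, and the cache settings should apply to the cache layer in the middle of the chain.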