How Rclone API requests relate to download size

Hi All,

I've found the details of the absolute Google API transfer limits in terms of size, and also the daily budget for API requests. What I'd like to know is whether there is an estimate of how many API requests Rclone makes for uploading or downloading a given file size - say 1GB?

And does that rate of API use mean that Rclone would hit the API limit before it hits the absolute upload and download limits?

Thanks.

Well, you'll need to list the relevant folder (1 call), or if you use --fast-list then many listings are bundled together, so it's effectively much less than 1 call per file.

Then you start uploading (1 more call).
If chunking is used (which is typical) it will be one call per chunk.
Let's say we use a high-performing 64M chunk size (adjustable in rclone for most remotes).
For 1GB that works out to 16 chunks in total.

So about 18 calls for the whole thing (1 listing + 1 to start the upload + 16 chunks) - roughly. It really depends on a lot of things as you see... but the takeaway here is that it's not a lot - and it's not heavily dependent on the size of the file.
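If it helps, here's that back-of-the-envelope math as a quick sketch (the call counts are the rough estimates from above, not exact figures):

```
# Rough estimate of API calls for a 1GB upload with 64M chunks.
# Approximation: 1 listing + 1 call to start the upload + 1 call per chunk.
FILE_MB=1024
CHUNK_MB=64
CHUNKS=$(( (FILE_MB + CHUNK_MB - 1) / CHUNK_MB ))   # round up -> 16
echo $(( 1 + 1 + CHUNKS ))                          # -> 18 calls, give or take
```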

PS: For Gdrive, modify the chunk size by setting chunk_size = 64M in your config file under the Gdrive remote, or by passing --drive-chunk-size 64M as a flag on your command. The default is 8M and I recommend using a higher value (at the cost of a bit of memory). Not because it will save you API calls - it will, but that's not a problem. The real reason to use a larger chunk size is that it significantly boosts your upload speed by utilizing your bandwidth more effectively - as much as 20-40% on larger files (a side-effect of how TCP ramping works, which is a topic for another day).
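For example, assuming a remote named gdrive (the remote name and paths here are just placeholders), the config entry would look something like this:

```
[gdrive]
type = drive
chunk_size = 64M
```

or as a flag on a one-off command:

```
rclone copy /local/path gdrive:backup --drive-chunk-size 64M
```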

Normal API limit is by default 1000 calls per 100 seconds.
Technically you also have something like 10 million total calls in a day, but a single user can only reach about 864,000 per day due to the previous limit (quick math below), so the total quota is irrelevant unless you have many heavy users on the same client ID (as is the case with the default rclone client ID you use if you do not make your own).
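That per-user ceiling is just the rate limit multiplied out over a day:

```
# 1000 calls per 100 seconds, and 86400 seconds in a day:
echo $(( 1000 * 86400 / 100 ))   # -> 864000 calls/day per user, at best
```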

So in short, this is a LOT of API calls. You will not be limited by this unless you have scripts that go totally nuts and malfunction - or maybe if you run Plex with very badly configured auto-scanning features or something like that that hammers the API constantly.

You can absolutely have short bursts of activity that hit the 100-second burst limit (and that is normal), but that's very temporary, and rclone also has a pacer for most remotes (including Google Drive, if that is what you use) that keeps this in check and gracefully throttles to avoid stalls or other problems. It tracks that limit locally and tries not to exceed it (or at least not by much), as that is the most efficient approach. After all, rejected API calls have to be re-sent, wasting our API quota and our time, so we want to minimize this (even if it is hard to avoid 100%).
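And if you ever do want to rein in the API rate yourself (normally unnecessary, since the built-in pacer handles it), rclone has generic flags for that - the numbers below are just illustrative:

```
# --tpslimit caps HTTP transactions per second; --tpslimit-burst allows short bursts.
rclone copy /local/path gdrive:backup --tpslimit 10 --tpslimit-burst 100
```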

TLDR: Don't worry about the API for your uploads. It will be irrelevant.
Assuming you are on Gdrive, your real limitation will be the 750GB/day upload limit.
You also have a download limit - but that is 10TB/day, so good luck trying to go over that...
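A commonly suggested trick (optional, and the exact number is just a safe approximation): cap your bandwidth so a continuous upload can't exceed 750GB/day. 750GB spread over 86400 seconds is roughly 8.9MB/s, so something like this leaves a little headroom:

```
rclone copy /local/path gdrive:backup --bwlimit 8.5M
```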

Questions? :slight_smile:

That's great - thanks for taking the time to explain it clearly. I had imagined file access being done with far smaller blocks, and so much higher API use. Sounds like the API limit shouldn't be an issue. :slight_smile:
