Google Drive QPS

Hello,

I have been in contact with Google because of the rate limit I am hitting. I have around 100,000 files in my Google Drive, and every time I need to copy something rclone takes about 45 minutes to match all the files because of this limit.
Google sent me this feedback from their side:

There are more specific things we can recommend, like slow down on per-user operations, but compensate by doing more users in parallel to maximize throughput

I wanted to ask if this can be done, since the "Queries per 100 seconds" quota is higher than the "Queries per 100 seconds per user" quota.
I also wanted to know if the API call drive.files.list lists just one file, or whether it can be used to list more than one with the same query.
Maybe, as I am asking for a bit of optimization, this should be handled as a feature request rather than a question.

Please let me know,

Pedro

What’s the command you are running?

What’s the composition of those 100k files? Are they in a lot of folders?

I only have 25k files, and copying a single file runs instantly for me.

If you use the latest beta it does a much better job of matching Google's rate limits by default.
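
In the meantime, if you want to try slowing down the per-user requests yourself, something along these lines should work (gdrive: is just a placeholder for your remote, and the numbers are only a starting point to adjust):

    rclone copy gdrive:source /path/to/dest --tpslimit 10 --checkers 4

--tpslimit caps the HTTP transactions per second, and --checkers controls how many file checks run in parallel.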

It lists one directory at a time by default. However, if you use --fast-list (which will use more memory) then it will list multiple directories in a single query and be more efficient.
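
For example (assuming your remote is called gdrive:, adjust the names to your setup):

    rclone copy gdrive:myfolder /path/to/local --fast-list

The trade-off is that rclone builds the whole listing in memory before it starts matching files, which is where the extra memory use comes from.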