I have been in contact with Google about the rate limit I keep hitting. I have around 100,000 files in my Google Drive, and every time I need to copy something, rclone takes about 45 minutes to match all the files because of this limit.
Google sent me this feedback from their side:
"There are more specific things we can recommend, like slow down on per-user operations, but compensate by doing more users in parallel to maximize throughput."
I wanted to ask if this can be done in rclone, since the "Queries per 100 seconds" quota is higher than the "Queries per 100 seconds per user" quota.
I also wanted to know whether the drive.files.list API call returns only one file per request, or whether it can list more than one file with the same query.
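For what it's worth, my understanding (an assumption on my part, from reading the Drive v3 API docs) is that files.list is paginated: each request can return up to pageSize files (not just one), plus a nextPageToken for the next request. A minimal sketch of that paging loop, with the HTTP call replaced by a fake so it runs standalone:

```python
# Sketch of drive.files.list pagination as I understand it (assumption:
# in the real API this is service.files().list(...) via
# google-api-python-client; here the call is faked so the loop can run).

def fake_files_list(all_files, page_size, page_token=None):
    """Stand-in for one files.list request: returns a single page."""
    start = int(page_token or 0)
    result = {"files": all_files[start:start + page_size]}
    if start + page_size < len(all_files):
        result["nextPageToken"] = str(start + page_size)
    return result

def list_all_files(all_files, page_size=1000):
    """Page through every file: one request per page, not per file."""
    collected, token = [], None
    while True:
        resp = fake_files_list(all_files, page_size, token)
        collected.extend(resp["files"])
        token = resp.get("nextPageToken")
        if token is None:
            return collected

files = [{"id": str(i)} for i in range(2500)]
print(len(list_all_files(files, page_size=1000)))  # 2500 files in 3 requests
```

If that is right, then 100,000 files should only need on the order of 100 list requests, which is why I suspect the slowdown is elsewhere.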
Since I am asking for an optimization, maybe this should be handled as a feature request rather than a question.
Please let me know,