What is the problem you are having with rclone?
I am limited by the Google Drive API's "Queries per 100 seconds per user" quota: only 1,000 requests can be made every 100 seconds.
I wrote to Google asking for an increase, and they replied:
We have received your quota request for PROJECT_NAME.
Unfortunately, we are unable to grant you additional quota at this time. If
this is a new project please wait 48h until you resubmit the request or
until your Billing account has additional history.
Your Sales Rep is a good Escalation Path for these requests, and we highly
recommend you to reach out to them. In case you don't have a dedicated
Sales Rep, you can contact our Sales Team.
If you have any further questions, please reply to this thread or feel free
to reach out to us at email@example.com.
My English is not very good, but I understand this to mean that I need to make purchases (build up billing history) before the limit can be increased.
But I see this in the Google API documentation:
Note: Per-user quotas are always enforced by the Drive API, and the user's identity is determined from the access token passed in the request. The quotaUser and userIp parameters can only be used for anonymous requests against public files.
I want to know how this could be made to work with rclone.
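In the meantime, rclone's existing rate-limiting flags may help stay under the quota rather than hitting it and retrying. A sketch, assuming the quota of 1000 requests per 100 seconds (i.e. 10 requests/second); the flag values are my own guesses, not tested against this workload:

```shell
# 1000 requests / 100 seconds = 10 requests per second.
# --tpslimit caps overall HTTP transactions per second;
# --drive-pacer-min-sleep sets the minimum wait between Drive API calls.
rclone copy "gd:/lots-of-fragmentary-pictures" "gd2:/target" \
  --fast-list \
  --tpslimit 10 \
  --drive-pacer-min-sleep 100ms
```

This does not raise the quota, it only avoids tripping it and triggering 403 rateLimitExceeded retries.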
What is your rclone version (output from rclone version)
rclone v1.53.1 - os/arch: linux/amd64 - go version: go1.15
Which OS you are using and how many bits (eg Windows 7, 64 bit)
GCP, ubuntu 16.04 LTS
Which cloud storage system are you using? (eg Google Drive)
Google Drive, not Google Cloud Storage.
The command you were trying to run (eg
rclone copy /tmp remote:tmp)
rclone copy "gd:/lots-of-fragmentary-pictures" "gd2:/target" --fast-list
Right now I work around this by running rclone with client IDs from multiple projects, but in most cases I cannot split a single copy job into multiple ranges.
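For cases where the source does split cleanly, the multi-project workaround described above can be sketched like this. The remote names gd-a: and gd-b: are hypothetical remotes configured with client IDs from two different projects, so each run gets its own per-user quota:

```shell
# Two copies in parallel, each authenticated through a different
# Google Cloud project's client ID, each with its own per-user quota.
rclone copy "gd-a:/lots-of-fragmentary-pictures/part1" "gd2:/target/part1" --fast-list &
rclone copy "gd-b:/lots-of-fragmentary-pictures/part2" "gd2:/target/part2" --fast-list &
wait
```

When there are no natural subdirectories, rclone's filter flags (e.g. --include with a name pattern) might be one way to split a flat directory into ranges, though I have not verified how well that interacts with --fast-list.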
Thanks in advance.