Getting a 403 error when copying files from GDrive to the same GDrive. Why?

It's a simple question: why am I getting a "403 User rate limit exceeded" error when copying 9 TB of folders/files from one folder to another within the same GDrive account? I tried with rclone copy and rclone sync.
The only workaround I found is to move the data to another folder (that worked), but I need a copy, not a move.

Until now I thought the API was only hit when the data transfer was from remote to remote, or from local to remote and vice versa. I'm probably wrong, and the API limits also apply when copying within the same remote.

Every file is between 4 and 7 GB, so maybe the error has to do with the 750 GB/day limit?

Can I limit the copy speed of the rclone copy command so I don't hit the API limits?

Thank you very much in advance,

I believe the upload quotas also apply to files that are copied between folders server-side.

Remember that the Google backend is actually copying bits around when you do this and using up more disk space, so it takes time and uses resources.

There is no way of limiting the speed of a server-side copy at the moment.

You'll have to stay under the API limits with

  --max-transfer SizeSuffix   Maximum size of data to transfer (default off)
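
For example (a minimal sketch; the remote name gdrive: and the folder paths are placeholders for your own setup), you could run the server-side copy in daily chunks so each run stops before the 750 GB/day quota:

  # Copy at most 700G per run, leaving headroom under the 750 GB/day quota,
  # then exit; re-run the next day to continue.
  rclone copy "gdrive:source-folder" "gdrive:dest-folder" --max-transfer 700G --progress

Since rclone copy skips files that already exist at the destination, re-running the same command each day resumes the job rather than starting over.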

Thank you very much @ncw!!! I really appreciate all the hard work and passion you put into this project. What an amazing piece of software you have brought us!

