(error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)

Has anyone got a fix for this? I am copying from a Shared Folder to one of my own folders on Google Drive.

I’d like to know too.

How are you copying it?

I am running:

rclone copy gsuiteencrypt: gdriveencrypt: --min-size 5M --no-traverse --checkers 6 --transfers 4 -v --min-age 5m --log-file=/home/me/logs/gsuite2gdrive.log

I just added the --min-size flag to cut down the volume of files, but obviously that won't suit everyone.
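If you want to preview exactly which files that filter selects before anything moves, a dry run with the same remotes will log what would be copied without transferring anything:

rclone copy gsuiteencrypt: gdriveencrypt: --min-size 5M --min-age 5m --dry-run -v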

rclone copyto Google:"Temp Folder"/"Hawaii Five O" Google:"temp3"

One time it copied 15-16 files from each season (it has 23), and then the next time I tried it I got the User Rate Limit error.

When you cut down the volume of the files, it's just copying them in smaller chunks, right?

I got this far in:

Transferred: 166.315 GBytes (108.993 MBytes/s)

And got the 403 again.

I just tried it in Ubuntu and still get the same error…

I wish I had more to add. I’ve done copies from ACD to GDrive using the same rclone copy or rclone sync commands and never got any 403s on them.

I wonder if it's because you are doing GD to GD and there is that hard API limit per user per 100 seconds. Are you using your own API key so you can see how it is doing?
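If it is the per-user queries-per-100-seconds quota, two things worth trying (guesses on my part, not confirmed fixes): set your own client_id and client_secret on the remote via rclone config, so you get your own quota and can watch the usage graphs in the Google API console, and throttle rclone's API calls with --tpslimit, something like:

rclone copy gsuiteencrypt: gdriveencrypt: --tpslimit 8 -v

--tpslimit caps transactions per second to the backend, so 8/s stays comfortably under the default 1,000 queries per 100 seconds.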

I would hit a 403 when doing a full scan/analyze against my library, but never with a copy/sync.

Looks like you are running into the issue of "server-side" copy not working any more.
See: https://github.com/ncw/rclone/issues/1339
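If server-side copy is what's broken, one workaround you could try (a sketch, I haven't verified it cures the 403s) is forcing rclone to download and re-upload instead, by disabling the backend's Copy feature:

rclone copy Google:"Temp Folder/Hawaii Five O" Google:temp3 --disable copy -v

--disable takes a comma-separated list of backend features to turn off; with Copy disabled the data goes through your machine, which is slower but avoids the server-side copy path entirely.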

It seems to be limiting the transfer to 100 GB.

I tried it not even server-side, just copying from one remote to another, and it's still doing it.
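If there's a daily transfer cap behind this (just a guess, Google doesn't document the exact number), one common mitigation is capping bandwidth so a long run never slams into it:

rclone copy gsuiteencrypt: gdriveencrypt: --bwlimit 8M -v

--bwlimit throttles to 8 MBytes/s, which works out to roughly 675 GBytes per 24 hours.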