Rclone move hit the Gdrive API Limit

I have a huge folder structure on Gdrive and want to move files from a server to Gdrive.
For example, it looks like:

  • Folder 1
    • Folder 11
      • Folder 111 --> empty
    • Folder 12
      • Folder 121 --> empty
    • Folder 13
      • Folder 131 --> empty
      • Folder 132 --> file.txt
  • Folder 2
    • Folder 21
      • ...
    • ...

I do that with a script that looks like:

if find "$FROM" -type f -mmin +15 | read -r; then
  echo "$(date "+%d.%m.%Y %T") RCLONE UPLOAD STARTED" | tee -a "$LOGFILE"
  rclone move "$FROM" "$TO" -c --bwlimit 250M --transfers=300 --checkers=300 --delete-after --min-age 15m --no-traverse --exclude ".sync/**" --log-file="$LOGFILE"
  echo "$(date "+%d.%m.%Y %T") RCLONE UPLOAD ENDED" | tee -a "$LOGFILE"
else
  # echo "$(date "+%d.%m.%Y %T") RCLONE NOT STARTED" | tee -a "$LOGFILE"
  exit 1
fi

I search for the files with find and move them to Gdrive.
But when there are files, I get an API ban and the log says:

error reading destination directory: couldn't list directory: googleapi: Error 403: User Rate Limit Exceeded.

That's interesting because the folder which rclone tried to list contained no new files; the folder and its subfolders were empty. I got a bunch of those messages, so the new files can't be moved because the API limit is already hit beforehand.

Did I miss a flag to avoid hitting the 1k query API limit? And why does rclone need to list empty folders that normally don't get touched?

You aren't hitting an API ban.

Google is telling you that you are making too many requests and to slow down.

Those are normal messages, not a ban.

You can normally create 2-3 files per second and are limited to 10 transactions via the API per second. Your command is flooding it as you have way too many transfers and checkers.

You should remove the transfers and checkers and just use the default values.
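For reference, rclone's documented defaults are --transfers 4 and --checkers 8, and there is also a --tpslimit flag to cap API transactions per second. A possible version of the command keeping the other flags from the script above (the 10/s value simply mirrors the limit mentioned here; tune it to your own quota):

```shell
# Same move, but with rclone's default concurrency (--transfers 4,
# --checkers 8, used implicitly) and an explicit cap of ~10 API
# transactions per second via --tpslimit.
rclone move "$FROM" "$TO" -c \
  --bwlimit 250M \
  --tpslimit 10 \
  --delete-after --min-age 15m --no-traverse \
  --exclude ".sync/**" \
  --log-file="$LOGFILE"
```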


Thanks, I will remove that and try it again.

On the other hand, 403s are very ambiguous. I got the same response when I exceeded my 750GB 24hr limit.

In my latest tests it works now, thanks.
It's a bit confusing because the script had been running for 3 years with those same flags.
Anyway, it works now. I will come back if it happens again :smiley:

Not exactly the same response.

There is a new change to handle that in the beta, and it should be released in the next stable.

What do you mean not exactly the same response? Should I receive something else other than 403s if I exceed the limit?

That's all documented in the issue. If you give it a read, it explains how the flag is implemented.

I'm talking about the present, i.e. stable, not the beta.

Yes, it's documented in the issue as you can see the different responses...

Ok, so for this error, how is it characterized?

2020/01/30 16:32:57 ERROR : Movies/Pretty Woman (1990)/Pretty.Woman.1990.1080p.BluRay.x264-Japhson.mkv.partial~: WriteFileHandle.Flush error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded

How hard is it for Google to truly separate them?!

It's documented in the issue:

The reason is the same for both errors which is why it's confusing.

Now I got the same error, but I definitely did not upload 750 GB. Maybe 100 or 150 GB in the past 24 hours.
That error is different from my first issue. There is no

error reading destination directory: couldn't list directory

and rclone retried 4 times, all of which failed.

Failed to copy: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded

the queries over the last 100 seconds are at 20-60 queries/100sec

In the console I see that 100% of drive.files.create requests failed.

Almost the same here. 4th day in a row I've been rate limited with varying periods of restrictions.

That means you hit your quota. Nothing to do but wait.


That means you hit your quota. Nothing to do but wait.

Yes, it's the same 403 error, as you can also see in the logs.
The question is why we got banned.
How much did you upload in the past 24 hours?

You don't get 'banned'.

Google has 24-hour quotas on uploads and downloads. If you hit a quota, you have to wait for it to reset. There is no way for you to check your current download or upload usage as it isn't exposed anywhere. They document that you can download 10TB and upload 750GB per day, but that's about it.
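If blowing through the 750 GB/day upload quota mid-run is the concern, rclone's --max-transfer flag can stop a run after a given amount has been transferred. A sketch, keeping the flags from the script earlier in the thread (the 700G headroom value is just an illustrative choice, not an official number):

```shell
# Abort this invocation once ~700 GB has been transferred, leaving
# headroom under Google's documented 750 GB/24h upload quota.
rclone move "$FROM" "$TO" -c \
  --bwlimit 250M \
  --max-transfer 700G \
  --delete-after --min-age 15m --no-traverse \
  --exclude ".sync/**" \
  --log-file="$LOGFILE"
```

Note this only counts data moved by that single rclone invocation, so repeated runs within the same 24 hours can still add up past the quota.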

I know these limits. I never got near them: 100-150 GB upload and maybe 300 GB download.

But maybe there is also a quota for other things I don't know about, and I ran into it because of the first mistake I made.