I have a huge folder structure on Gdrive and want to move files from a server to Gdrive.
Imagine it looks like this:
Folder 1
    Folder 11
        Folder 111 --> empty
    Folder 12
        Folder 121 --> empty
    Folder 13
        Folder 131 --> empty
        Folder 132 --> file.txt
Folder 2
    Folder 21
    ...
...
I do this with a script that looks like this:
if find "$FROM" -type f -mmin +15 | read -r
then
    echo "$(date "+%d.%m.%Y %T") RCLONE UPLOAD STARTED" | tee -a "$LOGFILE"
    # move files older than 15 minutes
    rclone move "$FROM" "$TO" -c --bwlimit 250M --transfers=300 --checkers=300 --delete-after --min-age 15m --no-traverse --exclude ".sync/**" --log-file="$LOGFILE"
    echo "$(date "+%d.%m.%Y %T") RCLONE UPLOAD ENDED" | tee -a "$LOGFILE"
else
    # echo "$(date "+%d.%m.%Y %T") RCLONE NOT STARTED" | tee -a "$LOGFILE"
    exit 1
fi
exit 0
I search for the files with find and move them to Gdrive.
But when there are files, I get an API ban and the log says:
error reading destination directory: couldn't list directory: googleapi: Error 403: User Rate Limit Exceeded.
That's interesting, because the folder rclone tried to list contained no new file; the folder and its subfolders were empty. I got a bunch of those messages, so the new files can't be moved because the API limit is hit first.
Did I miss a flag to avoid hitting the 1k query API limit? And why does rclone need to list empty folders that normally aren't touched?
Google is telling you that you are making too many requests and to slow down.
Those are normal items.
You can normally create 2-3 files per second and are limited to 10 transactions per second via the API. Your command is flooding it, as you have way too many transfers and checkers.
You should remove the transfers and checkers and just use the default values.
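For reference, a trimmed-down version of the move command might look like the sketch below: the --transfers/--checkers overrides are dropped so rclone's defaults apply, and --tpslimit (rclone's flag to cap API transactions per second) is added. The value 10 is my own choice based on the per-second limit mentioned above, not something from the original script.

```shell
# Same move, but with rclone's default transfers/checkers and an
# explicit cap of ~10 API transactions per second via --tpslimit.
rclone move "$FROM" "$TO" -c --bwlimit 250M --delete-after --min-age 15m \
    --no-traverse --exclude ".sync/**" --tpslimit 10 --log-file="$LOGFILE"
```

With the defaults, rclone runs far fewer parallel listings and transfers, which is usually enough on its own to stay under Drive's rate limit.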
In my latest tests, it works now. Thanks.
It's a bit confusing, because the script had been running for 3 years with those same flags.
Anyway, it works now. I will come back if it happens again.
Now I got the same error again. But I definitely did not upload 750 GB, maybe 100 or 150 GB in the past 24 hours.
This error is different from my first issue. There is no
error reading destination directory: couldn't list directory
and rclone retries 4 times, which all fail:
Failed to copy: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded
The queries (last 100 seconds) are at 20-60 queries/100 sec.
In the console I see that 100% of drive.files.create calls failed.
Google has 24-hour quotas on uploads and downloads. If you hit a quota, you have to wait for it to reset. There is no way for you to check your download or upload usage, as it isn't documented. They document that you can download 10 TB and upload 750 GB per day, but that's about it.
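One common workaround for a continuously running script is to cap the bandwidth so that 24 hours of sustained uploading stays below the 750 GB quota. The 8 MiB/s figure below is my own illustrative choice, not from this thread; the arithmetic sketch just shows how much data such a limit moves per day:

```shell
#!/bin/sh
# How much data a sustained bwlimit uploads in one day.
bwlimit_mib=8                        # hypothetical --bwlimit 8M setting
seconds_per_day=$((24 * 60 * 60))    # 86400 seconds
gib_per_day=$((bwlimit_mib * seconds_per_day / 1024))
echo "$gib_per_day"                  # prints 675 (GiB/day, under the quota)
```

rclone also has a --max-transfer flag (e.g. --max-transfer 700G) that stops the run once that much data has been moved, which is another way to avoid tripping the daily upload quota.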