I always disable all scheduled scans and run them manually.
When there is new media in the mount, I run rclone rc vfs/refresh, which updates the directory cache.
Then I run a manual scan in Jellyfin, which finishes quickly.
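For reference, that refresh step can be scripted. This is only a sketch: it assumes the mount was started with --rc enabled, and the Jellyfin host, port, and API key below are placeholders, not values from this thread.

```shell
# Refresh rclone's directory cache for the whole mount
# (works only if the mount was started with --rc)
rclone rc vfs/refresh recursive=true

# Then trigger a Jellyfin library scan via its HTTP API
# (host, port, and API key are hypothetical placeholders)
curl -X POST "http://localhost:8096/Library/Refresh?api_key=YOUR_API_KEY"
```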
What is your rclone version (output from rclone version)
rclone version
rclone v1.55.0-beta.5152.b2b5b7598
os/arch: linux/amd64
go version: go1.16rc1
Which OS you are using and how many bits (eg Windows 7, 64 bit)
Ubuntu Desktop 20.04 LTS
Which cloud storage system are you using? (eg Google Drive)
Google Drive (Workspace)
The command you were trying to run (eg rclone copy /tmp remote:tmp)
I guess any command once the problem starts, even a reverse copy, e.g.:
./rclone copy "GCrypt:Movies/The Legend of Hei (2019)/The Legend of Hei 2019 Unknown.mp4" C:\Users\MYACCOUNT\Desktop\Archivio -P
2021-04-11 01:20:38 ERROR : The Legend of Hei 2019 Unknown.mp4: Failed to copy: multipart copy: failed to open source: open file failed: googleapi: Error 403: The download quota for this file has been exceeded., downloadQuotaExceeded
As I said, even a manual download from the browser won't work.
Yes, I know. Still, I am trying every possible config/limit, even enough cache to download the whole movie locally, but I can only think that each movie fragment (even more so if it is a multi-threaded download) counts as a whole download.
Maybe you need to ask the rclone developers to give you some history/summary of the API call distribution.
The issue here appears to be rate limits that Google, and many cloud vendors, impose on certain behavior. Maybe you need to ask Google for some information about what is behind the message to help out.
I have had paid Google accounts and have hit these limits a few times with mail. Eventually someone changed something (Microsoft or Google) and I haven’t seen mine for a long time. Google rate quotas reset every day if I remember right. Sometimes all you need to do is wait until tomorrow :-)
So yes, maybe there is an rclone change that makes more API calls. But without knowing which call, and hence the request Google is objecting to, how would development figure it out? In fact, once you know the issue you can even look at the source code for ideas about where a change might need to be made.
I had a similar problem to OP. I was just copying a folder (7.5 TB) from Google Drive to Unraid using “rclone copy gdrive:/path mnt/user/downloads/drive -P”, which worked for about 700 GB but then got the same Error 403: The download quota for this file has been exceeded, downloadQuotaExceeded.
If this isn’t an API limit issue, should I be using a different flag with the copy command? 7.5TB should be under the daily download limit to my understanding.
Oh, is the limit 750 GB per day? If so I’ll use the --bwlimit flag as suggested.
Edit: also, to my understanding, I should be able to run the copy command again (with the flag) and not have to worry about duplicates, right? It’s too bad, I was enjoying these transfer speeds lol
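For what it’s worth, assuming a 750 GB/day quota, the average rate that would exhaust it works out as below; the --bwlimit value and the paths in the comment are only suggestions, not anything confirmed in this thread.

```shell
# Average rate that exactly exhausts 750 GB (750 * 10^9 bytes) in one day
bytes_per_sec=$(( 750 * 1000 * 1000 * 1000 / 86400 ))
echo "$bytes_per_sec bytes/s"   # 8680555 bytes/s, roughly 8.3 MiB/s

# rclone's --bwlimit takes MiB/s by default, so 8M stays just under:
# 8 MiB/s * 86400 s = 724775731200 bytes, about 725 GB/day
# rclone copy gdrive:/path /mnt/user/downloads/drive --bwlimit 8M -P
```

And yes, re-running the same rclone copy skips files that already match by size and modification time, so it will not create duplicates.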
Currently I encrypt all the contents of my drive using: rclone copy --create-empty-src-dirs --bwlimit 4M /mnt/encryption/Series Gcrypt:/Series
Here, encryption is my Workspace remote and Gcrypt is a Shared Drive that I created for myself from my Workspace, so everything is done on the fly without my having to worry about exceeding the daily limit.
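A quick sanity check that --bwlimit 4M stays under a 750 GB/day quota (rclone interprets 4M as 4 MiB/s):

```shell
# Bytes moved per day at a sustained 4 MiB/s
per_day=$(( 4 * 1048576 * 86400 ))
echo "$per_day"   # 362387865600 bytes, about 362 GB/day -- well under 750 GB
```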