I use rclone v1.34-38 to mount my Google Drive to a dedicated server.
Today I ran into an issue where, for some files, I receive the following:
cannot open ‘xxxx.mkv’ for reading: Input/output error.
When checking the Google Drive web interface for the specific file and trying to download it manually, I get:
download quota exceeded for this file, so you can’t download it at this time
So I guess rclone can’t handle this specific type of error right now?
Also: the --dump-headers option of rclone only revealed "403 Forbidden" without any further information. Would it be possible to get information like the user / daily limit integrated into rclone?
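For what it's worth, rclone also has a --dump-bodies flag alongside --dump-headers; a sketch of re-running the failing read with full HTTP debugging (the remote name `gdrive:` and the path are assumptions):

```shell
# Re-run the failing read with HTTP debugging enabled.
# --dump-headers prints request/response headers; --dump-bodies also prints
# the response body, which is where Google usually includes the specific
# quota reason behind a bare 403.
rclone cat gdrive:path/to/xxxx.mkv \
    --dump-headers \
    --dump-bodies \
    -v > /dev/null
```

The body of the 403 response, if dumped, is the most likely place to find the exact quota message rather than just the status line.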
I am not sure Google enforces a daily bandwidth limit you could be exceeding: I have transferred over 10 terabytes in under 24 hours before. Since moving to a mount that does not use queries, I have not hit any kind of quota issue, except when I used rclone to sync my GDrive twice in a day, which transferred little data but made many checks.
I believe a query is a check of what is available in a given directory: listing the contents of one folder may count as one query. My mount doesn't use queries because it caches the entire folder structure instead of constantly asking Google what is in folder X to retrieve a list of files. One thing rclone does that can use a lot of queries when uploading is checking for duplicates.
If you're hitting a quota limit simply by uploading throughout the day, and you don't heavily access the mount, rclone's duplicate checking on upload is the likely cause. If you upload a single file (or anything) into a folder with many subdirectories, I believe rclone checks each folder to make sure you're not uploading any duplicate files. If I am right that queries are behind Google's download limits, you should benefit from adding --no-traverse when uploading files: it stops rclone from listing every subdirectory to check whether the file already exists.
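A minimal sketch of such an upload (the remote name `gdrive:` and the paths are assumptions, not from the original posts):

```shell
# Copy a single file without traversing the destination directory tree.
# Without --no-traverse, rclone lists the destination to look for existing
# copies, costing one listing (query) per directory it walks.
rclone copy /local/media/movie.mkv gdrive:media/ \
    --no-traverse \
    -v
```

For a one-off upload of a single file into a large tree, this trades the duplicate check for far fewer listing requests.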
Last week I uploaded an audio file of a popular YouTube video. I put it on Google Drive and shared the link on YouTube. Many people downloaded the file, and then I got an error: "download quota exceeded for this file, so you can't download it at this time".
I found that some third-party file hosting services can work around this issue. One such service is Kiuna. You can see how to bypass this Google Drive error in this article.
A high cache info time works well to make the drive feel snappier when navigating, and also saves a lot of time when syncing, since rclone doesn't have to make a lot of list requests to traverse everything.
That said, be very aware of what the potential downfalls are.
Most importantly, very long info-expiry timers should only be considered safe if ALL changes to the remote happen through this cache point. Otherwise, file info can become out of date and be displayed incorrectly. In the best case you simply don't see file changes made outside your cache. In the worst case, files can get corrupted because rclone assumes a wrong size when working with them.
I use long cache expiry timers myself, but only after thoroughly understanding them. Even other rclone instances on the same machine must be considered here...
Common pitfalls to be aware of:
Confusing VFS cache and backend cache parameters. Read up on which flags apply to which, and know that the two caches do not coordinate with each other. Also consider their order if you use the cache backend.
Setting the VFS --dir-cache-time higher than --cache-info-age (this must be avoided). If you have a high --cache-info-age (assuming you use the cache backend, that is), you can leave the VFS timer at its default.
Forgetting that other rclone instances, such as daily syncs, also have to go through the cache to avoid your cache info becoming incorrect.
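To illustrate the second pitfall only (the remote name `gdrivecache:`, mount point, and values are assumptions, not recommendations), a mount that keeps --dir-cache-time at or below --cache-info-age might look like:

```shell
# Mount through the cache backend with a long info expiry.
# --cache-info-age : how long the cache backend trusts its directory info
# --dir-cache-time : how long the VFS layer caches directory listings;
#                    keep it at or below --cache-info-age (default is fine)
rclone mount gdrivecache: /mnt/gdrive \
    --cache-info-age 48h \
    --dir-cache-time 5m \
    --allow-other \
    -v
```

Here `gdrivecache:` is assumed to be a cache-backend remote wrapping the Google Drive remote; the VFS timer is left short while the backend cache holds info for two days.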