Rclone Cache storage

Currently I have rented 20 GB of VPS storage. I have a Google Suite account where I have 9 TB stored.

Is 20 GB enough for the Rclone cache?

When activating the cache and analyzing my Google Suite library, is there a risk of a ban from Google?

Should be plenty. The only reason I can see to have a big cache is if you want to store things locally for a longer period of time, or if people are constantly playing the same thing over and over.

I’ve been thinking about moving to 12 GB or something smaller, since the normal use case is everyone watching a different show anyway, and with plexdrive I had 0 GB of cache as it was all in memory.
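For reference, you can also cap the chunk store at mount time rather than in the config. A minimal sketch, assuming a cache remote named gcache: and a mount point of /mnt/media (both placeholder names):

# Override the configured cap so cached chunks never exceed 12 GB on disk
$ rclone mount gcache: /mnt/media --cache-chunk-total-size 12G --allow-other --daemon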

Thanks for answering,

I still have a doubt: when activating the cache and analyzing my Google Suite with Rclone, is there a risk of a ban from Google, given that there are many terabytes stored?

When you say analyze, I’m not sure exactly what you mean.

I have 40 TB and use the cache and haven’t hit any issues. My entire Plex library was already analyzed before I made the switch from plexdrive to rclone cache. I did some testing and I analyze anything new that comes in. I doubt that, set up as described, you’d have any chance of a ban.


Yes, the ban still occurs. What are you deploying in your case?

If you mean analyse in the sense of what Plex/Sonarr/Radarr does, then only a small part of the file is actually downloaded to make this happen. This shouldn’t affect your 750 GB daily upload limit. As far as I’m aware, the cap on the download side isn’t as strict as that. You shouldn’t be downloading anywhere close to it to analyse your files.

@gforce - There is no ban that occurs anymore. That’s the whole point of the cache.

If you think about it, if Rclone had to download the entire file for Radarr/Sonarr/Plex to scan the files, anyone with a library over 750 GB would face daily bans from Google. Typically, the scanners just need to look at a few MB of each file to determine its nature.

The 20 GB limitation is small, but it just means that you won’t be able to cache as much. It shouldn’t affect the ban status one way or another.

It normally just runs an ffprobe or mediainfo type command, so it only grabs a very small portion of the file.
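You can verify this yourself; a quick sketch, assuming ffmpeg’s ffprobe is installed and the mounted media path below is a placeholder:

# Prints container and stream metadata; ffprobe only reads the file
# headers (typically a few MB), never the whole file
$ ffprobe -v error -show_format -show_streams /mnt/media/Movies/Example.mkv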


I installed everything on my dedicated server and everything went just fine. But the partition /dev/mapper/vg-lv root is completely full. Is that normal?

Without knowing what your config is, I have no idea what’s taking up the 20 GB.

I use rclone cache …

Chunk size: Enter to select the default 5 MB.
Cache time: Enter to select the default 6 hours.
Maximum total size of stored chunks: Enter to select the default 10 GB.

So the rclone cache chunk store would use at most 10 GB of space.
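For reference, those wizard answers end up as plain keys in the rclone config file. A sketch of what the resulting cache remote section might look like, where the remote name gcache and the backing remote gdrive:media are assumptions:

$ cat ~/.config/rclone/rclone.conf
[gcache]
type = cache
remote = gdrive:media
chunk_size = 5M
info_age = 6h
chunk_total_size = 10G

chunk_total_size is the setting that caps the local disk used for cached chunks.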

Mine is configured for 32 GB and it uses exactly that:

[felix@gemini cache-backend]$ du -ms
32311

So there shouldn’t be a problem with my configuration? Or should I use a VPS with more storage?

What VPS do you use?

You gotta do some work and figure out what else is taking up the space.

Are you using cache_tmp_upload as well? What else is running that is storing data on your / filesystem?

Are you downloading stuff to / ?
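A quick way to hunt that down is with standard tools; a sketch (run with sudo so no directories get skipped):

# Confirm which filesystem is actually full
$ df -h /
# Largest top-level directories; -x keeps du on the / filesystem only
$ sudo du -xhd1 / | sort -h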


It all seems normal.

I will monitor it for a few days to see if it stays that way.

Thank you!!

Since I installed rclone with the cache backend, it has consumed 70% of my server’s bandwidth in just two weeks. I only have 2 TB of transfer. Is this normal?

Rclone is just a tool and doesn’t do anything but provide a mount.

Are you using Plex? Are folks watching a lot of movies? What other settings do you have in Plex? Are you running torrents as well?
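Before guessing at the cause, it helps to see when the traffic actually happened. A sketch, assuming vnstat is installed and has been collecting counters on the server:

# Daily and monthly transfer totals for the monitored interface
$ vnstat -d
$ vnstat -m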

Only I am using it.
I only use it for Plex.
The only thing I’ve done has been to update the library every time I upload a movie.
I don’t get it.

What are your library settings?

What are your Server settings for the libraries?