Caching data from Gdrive

Does rclone have a config option for the mount command so that the first time a file is accessed it is read from the remote and cached locally, and the second time that file is read it is served straight from the cache?

What is your rclone version (output from rclone version)

v1.51.0

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Ubuntu 18.04

Which cloud storage system are you using? (eg Google Drive)

Gdrive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone mount \
	--log-level INFO \
	--umask 022 \
	--vfs-read-chunk-size=128M \
	--vfs-read-chunk-size-limit=0 \
	--vfs-cache-mode writes \
	--cache-dir /var/RCache \
	--vfs-cache-max-size 10G

I tried this command, but it does not cache anything in /var/RCache (it only caches when --vfs-cache-mode full is set), and --vfs-cache-max-size 10G does not work either; the cache grows past 10 GB.
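For read caching via the mount itself, a minimal sketch with --vfs-cache-mode full might look like the following (gdrive: and /mnt/gdrive are placeholder names, not from the original post). Note that --vfs-cache-max-size is only checked every --vfs-cache-poll-interval and files that are still open cannot be evicted, so the cache can temporarily exceed the limit:

# gdrive: and /mnt/gdrive are placeholders; substitute your remote and mount point
rclone mount gdrive: /mnt/gdrive \
	--log-level INFO \
	--umask 022 \
	--vfs-cache-mode full \
	--cache-dir /var/RCache \
	--vfs-cache-max-size 10G \
	--vfs-cache-max-age 24h \
	--vfs-cache-poll-interval 1m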

You can use the cache backend for this:

https://rclone.org/cache/

But I'd personally just wait for the updates to the default backend to come out, as that's going to have more of what you are looking for.
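As a rough sketch of the cache backend route (the remote names gdrive and gcache and all values here are placeholders, not recommendations), you wrap the existing Drive remote in a cache remote in rclone.conf and then mount the wrapper:

# rclone.conf: "gcache" wraps the existing "gdrive" remote (both names are placeholders)
[gcache]
type = cache
remote = gdrive:
chunk_size = 10M
info_age = 1d
chunk_total_size = 10G

# mount the cache-wrapped remote; --cache-chunk-path is optional
rclone mount gcache: /mnt/gdrive \
	--cache-chunk-path /var/RCache

By default the cache backend keeps its chunks under ~/.cache/rclone/cache-backend; --cache-chunk-path just moves that store.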

That only caches writes and uploads them when complete; it keeps them there based on --vfs-cache-max-age.

Full mode has to download the whole file before it serves a read, so it's not useful for streaming or large files imo.

In https://rclone.org/cache/ I found the --cache-workers parameter. If I set it to 4 and then work with 4 files at the same time, does that mean we have 16 workers, or 1 worker per file? And if it is 1 per file, what happens if I work with 5 files and only 4 workers?

It's not really a number-of-files thing. It's about how it's going to fetch chunks and how many 'threads', if you will, are going to fetch them.

Setting it too high would make it slow, and too low would be slow as well. It's more about finding a sweet spot based on your use case.

https://rclone.org/cache/#cache-workers
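For example, reusing the placeholder gcache: remote from the sketch above, --cache-workers is simply passed on the mount and controls how many parallel chunk downloads run:

# placeholder remote and mount point; 4 is also the default worker count
rclone mount gcache: /mnt/gdrive \
	--cache-workers 4

As the answer above suggests, tune this number up or down against your own workload rather than sizing it to the number of files.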
