Hard Drive Full - VFS Full mode?

What is the problem you are having with rclone?

Filling up hard drive

What is your rclone version (output from rclone version)

v1.50.2

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Linux Ubuntu 20.04 LTS 64 bit

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

[Unit]
Description=RClone Service
Wants=network-online.target
After=network-online.target

[Service]
Type=notify
Environment=RCLONE_CONFIG=/home/user/.config/rclone/rclone.conf
ExecStart=/usr/bin/rclone mount --buffer-size 16M --cache-dir=/home/user/cache --vfs-cache-mode full --vfs-cache-max-size 40G --vfs-cache-max-age 24h --timeout 1h --rc --dir-cache-time 72h --allow-other --fast-list --cache-db-purge gdrive:/media /home/user/gdrive
ExecStop=/bin/fusermount -uz /home/user/gdrive
Restart=on-failure
RestartSec=20s
User=user
Group=user

[Install]
WantedBy=multi-user.target

The rclone config contents with secrets removed.


[gdrive]
type = drive
client_id = xxx
client_secret = xxx
scope = drive
chunk_size = 16M
token = xxx
root_folder_id = xxx

A log from the command with the -vv flag

Sorry, I wasn't sure how to pull this.
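
(For reference, a -vv log can usually be captured by adding -vv and --log-file to the mount command, or by pulling the service journal; the unit name below is just a guess at what this service is called.)

# add to ExecStart temporarily, then restart the service
-vv --log-file=/home/user/rclone.log

# or grab what systemd already logged, assuming the unit is named rclone.service
journalctl -u rclone.service --since "1 hour ago" > rclone-journal.log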

I hope I formatted my question OK. Apologies in advance. So here is the basic issue.

My hard drive is filling up on this machine, specifically in the cache folder located at

/home/user/cache

I suspect it has something to do with the vfs mode. Interestingly, I have a separate box that works just fine with the exact same setup as this one. The only difference I could find, though, is that it was caching files in

/home/user/.cache/rclone... and it respected the correct limit of about 40G.
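
For reference, this is a quick way to compare what each cache directory is actually holding (these are just my two paths):

du -sh /home/user/cache           # this box, the one filling up
du -sh /home/user/.cache/rclone   # the other box, which stays around the 40G limit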

Can't figure it out, as it's the exact same setup as my previous box - except this one is on a newer kernel, 5.11?

Ty!

hello and welcome to the forum,

that version of rclone is very old, you can get the new version here
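
if it helps, the install script from rclone.org is one way to get the current version:

curl https://rclone.org/install.sh | sudo bash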

there have been some major changes to --vfs-cache-mode full. the new versions of rclone use sparse files on compatible file systems. you can read about it here
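
you can see the sparse files by comparing apparent size with actual disk usage on something in the cache - the file path here is just a made-up example:

du -h --apparent-size /home/user/cache/vfs/gdrive/media/somefile.mkv   # full file size
du -h /home/user/cache/vfs/gdrive/media/somefile.mkv                   # blocks actually written to disk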

--fast-list does nothing on a mount

based on your config file, --cache-db-purge does nothing
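
so the mount line could be trimmed down to something like this - same flags as yours, just without those two:

ExecStart=/usr/bin/rclone mount --buffer-size 16M --cache-dir=/home/user/cache --vfs-cache-mode full --vfs-cache-max-size 40G --vfs-cache-max-age 24h --timeout 1h --rc --dir-cache-time 72h --allow-other gdrive:/media /home/user/gdrive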

Thank you for the help. I guess I will try the latest version of rclone. I did double-check my working system and it is the same version though. Oh well.

Should it be caching in /home/user/cache or /home/user/.cache?

I will remove --fast-list from the mount command, and no, I'm not using the cache remote anymore. Ty!

what are you using the vfs for, plex or what?

the location really does not matter that much, as rclone will cache where you tell it to.

for the folder that rclone uses for the cache, what is the filesystem, ext4 or what?
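
you can check with something like:

df -T /home/user/cache        # the Type column shows the filesystem
findmnt -T /home/user/cache   # same info via findmnt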

Yes, for streaming - via Jellyfin though. Hadn't verified, but it's a default Ubuntu install, so ext4 I assume.

Trying new setup now. Will report back.

the vfs is optional.

i also use jellyfin but without the vfs

Interesting. I assume the VFS cache makes it more performant though, right? I'm on a gigabit connection up/down with 32GB of memory.

well, you are having issues with lack of hard drive space, so why not try without the vfs cache and see what happens.

I'm going to try on new rclone version and see what happens. I'll report back.

Playback is nice and smooth now (without a Jellyfin library scan happening). Now for the real test: scan the rest of the library and see if it fills the hard drive. Hopefully it can play back at the same time.

Can stream smoothly while the scan is happening. So far so good. Seems like I just needed a newer rclone! I guess I will update my older box just in case, even though it seemed fine.

good, are you using the vfs cache or not?

yeah, nothing like an update.

I'm still using the vfs cache. I may have spoken too soon. I'm currently scanning in the TV section, and it's not starting a stream at the same time. The cache is now full. Hoping it has something to do with the initial scan of the library. Once that's done I'll report back. As of right now - can't stream while scanning the library.

But I think the issue is now Jellyfin for some reason. Seeing a lot of ffprobe failures in its logs, and my hard drive is not filling up from rclone like before. Strange.

do the initial scan without the vfs cache.

if you feel the need to use the vfs cache for playback, then add the vfs flags back to the mount.
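
for the initial scan, the mount could look something like this - same command, just with the vfs cache flags left out:

ExecStart=/usr/bin/rclone mount --buffer-size 16M --timeout 1h --rc --dir-cache-time 72h --allow-other gdrive:/media /home/user/gdrive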

Hmm, OK. I'll probably give that a try at some point. Damn, it was starting playback quickly for a while.

So I think I'm getting closer to solving this mystery. I checked the Google console side, as someone from Jellyfin asked if it's an API ban - which I doubt. I'm not sure exactly how to confirm, but the Google console does report

drive.files.get 47% error rate

all other metrics, like drive.files.list, are at 0% error.

The quota section is not even close to reaching its limits.

well,
jellyfin is expecting a local file system and will read into each and every file, to create thumbnails, previews and all sorts of stuff.
so now jellyfin has to download all that data from gdrive.

Just checked my other working box: 0% error rate everywhere. I think it has something to do with this.

It did though. It completed my movie library and about half of my TV library. I use mergerfs, so Jellyfin thinks it's local.

Again, not sure if relevant - but the Jellyfin transcode log posted an input/output error, which led them to think it was an API ban.

mergerfs has nothing to do with it.

jellyfin, to create a library, needs to download a lot of data from gdrive.
gdrive has lots of limits and quotas.
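
if the errors turn out to be rate related, rclone does have a global --tpslimit flag to throttle api calls per second - the value here is only a guess, added to a trimmed-down version of the mount command:

/usr/bin/rclone mount --tpslimit 10 --cache-dir=/home/user/cache --vfs-cache-mode full gdrive:/media /home/user/gdrive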