VFS cache size limit not working?

Hi,

What is the problem you are having with rclone?

--vfs-cache-max-size 200G doesn't seem to be working: the logs show the cache at about 300GB. Any advice on what I'm doing wrong?

What is your rclone version (output from rclone version)

rclone v1.54.0-beta.4762.640d7d3b4

  • os/arch: linux/amd64
  • go version: go1.15

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Ubuntu 19

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

[Unit]
Description=RClone Service
Wants=network-online.target
After=network-online.target

[Service]
Type=notify
Environment=RCLONE_CONFIG=/root/.config/rclone/rclone.conf
KillMode=none
RestartSec=5
ExecStart=/usr/bin/rclone mount gcrypt: /home/aerya/mnt/gdrive --allow-other --allow-non-empty --dir-cache-time 1000h --log-level INFO --rc --rc-addr :5572 --rc-no-auth --log-file /home/aerya/logs/rclone.log --poll-interval 15s --umask 002 --user-agent xxxx --cache-dir=/cache --vfs-cache-mode full --vfs-cache-max-size 200G --vfs-cache-max-age 336h --bwlimit-file 16M
ExecStop=/bin/fusermount -uz /home/aerya/mnt/gdrive
ExecStartPost=/usr/bin/rclone rc vfs/refresh recursive=true --rc-addr 0.0.0.0:5572 _async=true
Restart=on-failure

[Install]
WantedBy=multi-user.target
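
For reference, these are the cache-related flags pulled out of the ExecStart line above, with what each one does as far as I understand it (just the flags, not a complete command; paths and values are the ones from the unit):

# where the VFS cache lives on disk
--cache-dir=/cache
# cache both reads and writes as files on disk
--vfs-cache-mode full
# size the cache cleaner tries to stay under (a target, not a hard cap)
--vfs-cache-max-size 200G
# evict cached objects not accessed for 14 days
--vfs-cache-max-age 336h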

A log from the command with the -vv flag

2020/09/22 10:53:42 INFO  : vfs cache: cleaned: objects 74 (was 81) in use 74, to upload 0, uploading 3, total size 310.970G (was 310.895G)
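
To cross-check the total reported in the log against what is actually on disk (assuming /cache is still the --cache-dir from the mount command above, and, as far as I know, the cached file data sits under its vfs subfolder):

# overall size of the cache directory
du -sh /cache
# just the cached file data
du -sh /cache/vfs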

AFAIK, files that are open (indicated by the "in use" counter) are not removed from the cache. Can you see why those files are open, and by what?
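
To find out, something like this should show which processes still hold those files open (a sketch using the paths from your mount command; /cache is the cache dir and /home/aerya/mnt/gdrive the mount point):

# processes with open files anywhere under the cache directory
lsof +D /cache
# processes using the mounted filesystem itself
fuser -vm /home/aerya/mnt/gdrive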

Erk, ok... Those files are in use by either Plex or torrent clients (long-time seeding from GDrive).

I am not sure what can be done then.

@ncw?
