VFS Cache purge - Prioritise file size instead of file age

The VFS cache is definitely being hit. I've played around with this a lot to try and reduce loading times. I had the most luck with an extra layer of cache using nginx, since I can control it a bit more, but unfortunately there's a weird bug where about a third of my video files completely break it. I might have to take ncw's advice for the meantime and write up a script to find the jpgs in the cache folder and then read the corresponding file on the remote every now and then. And then maybe skip a run once a week to clear out bloat.

I feel like refusing to delete most of my library, or to add several hundred gigs of local storage (many $ per month extra), isn't too unreasonable.

I don't expect this to be coded any time soon. But I feel like giving the user a little more control over how the cache behaves could benefit others too. Maybe that [--order-by] parameter could also be a way to do it.

Edit: I wrote that script to read all the jpgs every 30 minutes to keep them fresh, and it seems to be working well for now. Hopefully it will self-regulate: when the cache starts to fill up and someone watches a movie, jpgs get cleared out at random. Here it is below for anyone else to use or modify.

Step 1 > Change the VFS max age back to 1 hour.
Step 2 > Make the below script run every 30 minutes via crontab - 1,31 * * * *
Script > Replace USER (or the whole path) and PATH to match your setup.
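For reference, Steps 1 and 2 together might look something like this. The remote name, flag choices, and script location are examples only, not my exact setup:

```shell
# Step 1: mount with the VFS cache max age back at 1 hour
# (remote name and extra flags are illustrative)
rclone mount remote: /PATH --vfs-cache-mode full --vfs-cache-max-age 1h &

# Step 2: crontab entry running the refresh script at :01 and :31
# (script path is an example)
# 1,31 * * * * /home/USER/bin/refresh-jpgs.sh
```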

#!/bin/sh
# Rclone VFS cache folder:
folder=/home/USER/.cache/rclone/vfs

# Rclone mount path:
mount=/PATH

# Read the tail of every cached jpg through the mount so rclone
# counts it as recently used and keeps it in the cache.
cd "$folder" || exit 1
find . -type f -name '*.jpg' | cut -c 3- | while IFS= read -r filename
do
    tail "${mount}/${filename}" > /dev/null 2>&1
done
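If you want to sanity-check the find/cut pipeline before pointing it at your real cache, here is a quick sketch against a throwaway directory (the directory layout and file names are made up for illustration):

```shell
# Build a fake cache folder with one jpg and one video file
tmp=$(mktemp -d)
mkdir -p "$tmp/Movies/Example"
touch "$tmp/Movies/Example/poster.jpg" "$tmp/Movies/Example/video.mkv"
cd "$tmp"

# Same pipeline as the script: find emits paths like ./Movies/...,
# and 'cut -c 3-' strips the leading './' so they can be appended
# to the mount path. Only .jpg files are matched.
jpgs=$(find . -type f -name '*.jpg' | cut -c 3-)
echo "$jpgs"
```

This should print only `Movies/Example/poster.jpg`; the .mkv is ignored, which is the point of keeping just the thumbnails warm.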