How to copy file from remote to local while Plex streams it?

I am using rclone with two different cloud remotes, mounted via rclone union in combination with my local storage. I'm in the process of moving all my cloud-based data to local storage, but I was wondering if there's a way to have specific files saved locally as they're streamed via Plex. That would avoid downloading the same file twice or more.

This isn't officially supported, but it could work:

  1. use --vfs-cache-mode=full
  2. stream the media.
  3. mv /path/to/mount/file.ext /home/username/file.ext
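
For reference, the mount would look something like this; the remote name, mount point, and cache limits are placeholders you'd adjust for your own setup:

    # Mount the remote with the full VFS cache so streamed files are also
    # written to local disk as they are read.
    rclone mount union: /mnt/rclone \
      --vfs-cache-mode full \
      --vfs-cache-max-size 100G \
      --vfs-cache-max-age 48h \
      --daemon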

Thanks for the idea! I have this pretty much working, and it's definitely using the VFS cache when the move command is run against the rclone mountpoint: it takes around 30 seconds to copy files of around 5 GB, and my internet certainly isn't that fast.

For anybody else who happens to stumble upon this post, what I did was basically this:

  • Mount an rclone remote with --vfs-cache-mode=full
  • Give your cache enough allowed space so entire files can be kept there in full
  • If you're using Plex: I created a notification agent in Tautulli that triggers a script when a stream is stopped, passing the file path and file name as arguments.
    • My particular script strips the mount prefix from the passed argument and stores the remainder in a variable as the "relative" path, which makes it easier to swap the root path around.
    • It then prepends the path to the VFS cache to that relative path. If the file exists in the cache, it checks whether the file also exists in any of my three cloud mounts and, if so, moves it from its respective rclone mount to the local unraid array. A rough sketch is below.
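
Here's a rough sketch of that logic. I'm simplifying by treating the first argument as the full file path, and all of the paths, remote names, and mount points are placeholders for my actual setup:

    #!/bin/bash
    # Tautulli "stream stopped" script: move the just-streamed file from its
    # cloud mount to local storage, but only if the VFS cache already holds it.
    FULL_PATH="$1"                              # e.g. /mnt/union/Movies/film.mkv
    MOUNT_ROOT="/mnt/union"                     # root of the rclone union mount
    CACHE_ROOT="$HOME/.cache/rclone/vfs/union"  # rclone VFS cache directory
    LOCAL_ROOT="/mnt/user/media"                # local unraid array
    CLOUD_MOUNTS=("/mnt/remote1" "/mnt/remote2" "/mnt/remote3")

    # Strip the mount prefix so we're left with the path relative to the root.
    REL_PATH="${FULL_PATH#"$MOUNT_ROOT"/}"

    # Only act if the file has actually been cached by the VFS layer.
    if [ -f "$CACHE_ROOT/$REL_PATH" ]; then
        # Find which cloud mount holds the file and move it to the local array.
        for mount in "${CLOUD_MOUNTS[@]}"; do
            if [ -f "$mount/$REL_PATH" ]; then
                mkdir -p "$LOCAL_ROOT/$(dirname "$REL_PATH")"
                mv "$mount/$REL_PATH" "$LOCAL_ROOT/$REL_PATH"
                break
            fi
        done
    fi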

This ensures there's no wasted bandwidth: I have a long-running rclone move process (controlled via rclone rc) constantly moving the entire root of my cloud storage to my local unraid array in the background, plus another script that tails the Plex logs and adjusts the bandwidth limit on that move so there's always enough download bandwidth left over for whatever is being streamed in Plex. Since a streamed file has already been downloaded once for Plex, I can now move it to local storage straight from the cache instead of re-downloading it later.
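
The bandwidth adjustment itself just talks to rclone's remote control API; something along these lines, with example rates rather than the values I actually use:

    # Throttle the background rclone move while something is streaming in Plex,
    # then lift the limit again once the Plex logs show playback has ended.
    # The move process has to be started with the remote control (--rc) enabled.
    rclone rc core/bwlimit rate=10M   # leave headroom for Plex to stream
    rclone rc core/bwlimit rate=off   # remove the limit afterwards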

The only thing missing from my script is a check of the file's block allocation before initiating a move, to make sure most of the file is actually cached. You can't just compare plain file sizes, since the cached file always reports the size of the original, but you can see how many blocks the cached file actually has allocated and compare that against the file in the rclone mount. Unfortunately, since I'm running Tautulli in a Docker container, my script only has access to the binaries in that container, and I haven't found a way to do the comparison with "bc" because it can't access the libraries it needs. Even so, it's better than nothing.
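
If anyone wants to attempt that check themselves, something like this might work using plain shell integer arithmetic instead of bc. It assumes GNU stat is available in the container and uses placeholder paths, so treat it as a sketch rather than something I have running:

    # Compare how much of the cached file is actually allocated on disk against
    # the size reported by the rclone mount. The cache file is sparse, so %s
    # shows the full size while %b only counts blocks that are really allocated.
    CACHE_FILE="$HOME/.cache/rclone/vfs/union/Movies/film.mkv"
    MOUNT_FILE="/mnt/union/Movies/film.mkv"

    ALLOCATED=$(( $(stat -c %b "$CACHE_FILE") * $(stat -c %B "$CACHE_FILE") ))
    EXPECTED=$(stat -c %s "$MOUNT_FILE")

    # Move the file only if at least ~95% of it is already in the cache.
    if [ "$EXPECTED" -gt 0 ] && [ $(( ALLOCATED * 100 / EXPECTED )) -ge 95 ]; then
        echo "file is mostly cached; safe to move"
    fi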