Set specific times for rclone to sync newly downloaded data

Rclone syncing is making streaming impossible. I want to know if there is a way to specify a time of day for the syncing part of rclone to take place while keeping the drive mounted for use. Or is there something I should add to or remove from my command to block syncing, and then reverse each night before I go to bed so syncing can resume?

rclone v1.56.2

  • os/version: ubuntu 20.04 (64 bit)
  • os/kernel: 5.4.0-92-generic (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.16.8
  • go/linking: static
  • go/tags: none

Google Drive

screen rclone mount --cache-dir=~/cache --vfs-cache-mode=full -v --dir-cache-time 1000h --poll-interval 0s --allow-other --vfs-cache-max-size 200G --vfs-cache-max-age 1h secret: ~/drive

The rclone config contents with secrets removed.

[GMedia]
type = drive
client_id = awholebunchofstuff.apps.googleusercontent.com
client_secret = 
scope = drive
token = 

[secret]
type = crypt
remote = GMedia:crypt
filename_encryption = standard
directory_name_encryption = true
password = 
password2 = 

A log from the command with the -vv flag

Couldn't figure out how to do this on Linux...

I am not sure what you mean by rclone syncing as you have a mount.

A mount only reads based on what is requested by the application. If your application is requesting things, they get downloaded.

hi,

you can schedule rclone's bandwidth usage with a --bwlimit timetable:
https://rclone.org/docs/#bwlimit-bandwidth-spec
and you can also use cron to start an rclone sync at a certain time.
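
for example, the timetable is a list of HH:MM,BANDWIDTH pairs, so the mount itself can be throttled during the day and opened up overnight. a minimal sketch against your existing mount (the times and the 1M limit are illustrative, tune them to your line):

    # throttle to 1 MiB/s from 08:00, lift the limit at 22:00
    rclone mount secret: ~/drive --bwlimit "08:00,1M 22:00,off" <your other mount flags>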

To clarify: when I download new items, let's say 200 individual episodes, rclone encrypts those and puts them on my Google Drive. While it does that, my Plex server is basically unusable. So I want to know if there is a way to schedule the encrypting and uploading of new files to gdrive. For example, if I download new files in the middle of the day, it would only encrypt and add those to gdrive between the hours of 10pm and 8am.

You mean you are copying to the mount and uploading to your Google Drive?

Gonna level with you, I am not entirely clear on how the process works. I think it goes: download file to temp directory > sonarr/radarr moves the file to the correct directory > rclone sees the new file and encrypts it > rclone uploads the newly encrypted file to gdrive.

If someone is streaming from the server, and the server is using gdrive and transcoding, would the stream be an upload or a download? Basically, I am not sure what to limit here.

hi,

since the files are in gdrive, the server would have to stream/download the file.

let's say:

  • the resolution of the file in gdrive is 4k
  • the resolution that the client needs is 720p
  1. the server would stream/download the file from gdrive in 4k.
  2. the server would transcode it to 720p and stream to the client.

So for the server it would all be download then?

that is correct

So then would this command work for buffering issues?

screen rclone mount --cache-dir=~/cache --vfs-cache-mode=full -v --dir-cache-time 1000h --poll-interval 0s --bwlimit 10M:off --allow-other --vfs-cache-max-size 200G --vfs-cache-max-age 1h secret: ~/drive

I added the --bwlimit flag.

it depends on your internet connection speeds, but give it a try

and if you are using gdrive, i would not use --poll-interval 0s, as setting it to 0 disables polling for changes on the remote.

What poll interval would you use?

just do not use the flag, and use the default value (1m), so the mount still picks up changes made on the remote.
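
for reference, here is your same mount with the flag dropped and everything else unchanged:

    # --poll-interval now falls back to its default of 1m
    screen rclone mount --cache-dir=~/cache --vfs-cache-mode=full -v --dir-cache-time 1000h --bwlimit 10M:off --allow-other --vfs-cache-max-size 200G --vfs-cache-max-age 1h secret: ~/drive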

Okay, is there anything else I can do to help with buffering?

not sure of your exact use case.
so whatever command you choose, try it and see what happens.

and/or schedule your uploads

How do I schedule my uploads?

try using cron.
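
a minimal crontab sketch, assuming your new files land in a local staging directory first (the /home/user/staging path is hypothetical, adjust it to your setup):

    # edit with `crontab -e`; this uploads staged files at 22:00 every night
    0 22 * * * /usr/bin/rclone move /home/user/staging secret: --log-file /home/user/rclone-move.log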

So a couple of things.

The way --bwlimit works, it is a global limit for the whole remote, and a single value like 10M applies to both upload and download (the 10M:off form you used limits just the upload side).

You can use a per file limit to make sure that no one file saturates your bandwidth instead:

      --bwlimit-file BwTimetable             Bandwidth limit per file in KiB/s, or use suffix B|K|M|G|T|P or a full timetable
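
On your mount, that would look something like this (the 8M value is just an illustration, tune it to your line speed):

    # cap each individual transfer at 8 MiB/s so one stream can't saturate the link
    rclone mount secret: ~/drive --bwlimit-file 8M <your other mount flags>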

Personally, on Linux I use mergerfs, which adds another layer on top: all my writes happen to a local mounted area, and I upload that on a schedule, since you can run rclone move/copy from the local mount point to your remote on a timer.
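
Roughly, the layering looks like this (paths and option values here are illustrative, not my exact setup):

    # mergerfs overlays a local disk and the rclone mount into one tree;
    # category.create=ff ("first found") sends new writes to the local branch
    mergerfs -o allow_other,category.create=ff,cache.files=partial,dropcacheonclose=true \
      /local/media:/home/user/drive /home/user/merged

    # Plex and sonarr/radarr point at /home/user/merged; a nightly cron entry
    # then moves whatever landed on the local disk up to the remote
    0 22 * * * /usr/bin/rclone move /local/media secret: --log-file /home/user/rclone-move.log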

This requires a bit more knowledge and some time to set up, but I do have my process documented here:

animosity22/homescripts: My Scripts for Plex / Emby with Dropbox and rclone (https://github.com/animosity22/homescripts)

It doesn't really matter which backend, as the process is the same. That might be a bit much if you are not that well versed on the Linux side of things, so sticking with the per-file limit would be an easier route, but with less customization.
