Rclone webui job stats and proper backup strategy

Hi,

I'm new to rclone and in the process of backing up my local NAS's contents to Google Drive.

I have installed a Docker container of the latest rclone, and the first copy is running now. However, I noticed in the WebUI that the job stats are not being updated. I triggered the current copy via the command line (docker exec -it rclone copy.sh); does this make a difference? Is there a way to fix this, or is there another recommended web-based solution for monitoring progress?

Secondly, I was thinking of scheduling a daily copy and then a sync every 30 days. The reason is that I don't want deletions synced immediately, just in case something gets removed by accident. Is this a sensible way to address this, or are there other recommendations for how to handle it (maybe retaining a rolling 30-day copy of deleted files)?

Any help would be very appreciated!

If you are looking for a backup script, see: https://github.com/wolfv6/rclone_jobber

Can't help you with the other, but I think you need to start the remote control ("rc") server and then issue commands to it to see what's happening in the web GUI.
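
Something along these lines might work; it's a rough, untested sketch. The remote name (gdrive:backup), source path (/data/nas), credentials and port are placeholders, not from your setup:

# start the remote control server with the web GUI (run inside the container)
rclone rcd --rc-web-gui --rc-addr :5572 --rc-user admin --rc-pass secret

# kick off the copy through the rc API so it shows up as a job in the GUI
rclone rc sync/copy srcFs=/data/nas dstFs=gdrive:backup _async=true --user admin --pass secret

# check progress from the command line as well
rclone rc core/stats --user admin --pass secret
rclone rc job/list --user admin --pass secret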

@nle thanks for the reply.

I did look at rclone_jobber, but it seems to be a little different; I am not looking to keep multiple backups. A simple copy is fine, except that it would be nice to retain deleted files for an extra period of time in case they were removed accidentally. I think this is possible by doing a daily copy and then a sync at a lower frequency to remove deletions from the cloud copy. It's not perfect, as it would be better to maintain a rolling last-X-days of deletions, but it's decent enough.

Thanks for the tip on calling rclone remotely via rclone rc. It seems to be working. I'm wondering how I can add monitoring to it to make sure it works as expected, because the WebUI only shows the current stats.
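
One idea I'm considering is a minimal polling loop that logs the rc stats over time, something like the sketch below. It assumes the rc server is on the default localhost:5572 with the placeholder credentials from above, that jq is installed, and a log path of my own choosing:

while true; do
    # core/stats returns JSON; keep a timestamped one-line summary per poll
    stats=$(rclone rc core/stats --user admin --pass secret)
    echo "$(date -Is) $(echo "$stats" | jq -c '{bytes, transfers, errors, speed}')" >> /var/log/rclone-stats.log
    sleep 30
done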

hello and welcome to the forum,

i do a sync every day.
i have rolling deletions set up: a subfolder for each day holding changed/deleted files.
https://rclone.org/docs/#backup-dir-dir


# timestamped archive folder per run; remove --dry-run once the output looks right
dt=$(date +%Y%m%d.%I%M%S)
source=/path/to/local/folder
dest=remote:/data/current
backupdir=remote:/data/archive/$dt
rclone sync "$source" "$dest" --backup-dir="$backupdir" -vv --dry-run

Thanks @asdffdsa. This is almost exactly what is needed. Just to clarify:

  1. This does not manage the rolling deletions. If I want to retain only the last 28 days, do I need to go in manually and remove the directories older than that?
  2. This moves both deletions and updates. Is there a way to keep updated files in place and only archive deletions?

hi,

it should be easy to write a script that prunes old folders based on the $backupdir names
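
for example, something like this (untested sketch; it assumes the archive folder names from the snippet above, GNU date, and a 28-day cutoff, so adjust to taste):

archive=remote:/data/archive
cutoff=$(date -d "28 days ago" +%Y%m%d)

# each archive subfolder starts with YYYYMMDD, so compare that part to the cutoff
rclone lsf --dirs-only "$archive" | while read -r dir; do
    day=${dir:0:8}
    if [ "$day" -lt "$cutoff" ]; then
        rclone purge "$archive/${dir%/}"
    fi
done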

not sure exactly what you mean?

that script does the following:
if a source file is changed and there is an existing copy in dest
then

  1. move the existing copy from dest to backupdir
  2. copy the source file to dest.

a file deleted from source is handled the same way: its copy in dest is moved to backupdir rather than simply being deleted.