Questions about optimisation and prevention of data loss

I'd like to gather some information about optimising my rclone calls and arguments, and also to ask some general questions.

I am using rclone to sync to Google Drive.

After struggling for some time to get the cron job call right, I ended up with the following line in my crontab:

0 * * * * /bin/sh /home/user/scripts/bash/rclone_cron.sh >/dev/null 2>&1

This seems to work really well for now.
The rclone_cron.sh script contains the following:

#!/bin/bash

# Exit if another copy of this script is already running.
# (Note: the name checked here must match the actual script name,
# rclone_cron.sh, and must use straight quotes.)
if pidof -o %PPID -x "rclone_cron.sh"; then
    exit 1
fi

current_date_time=$(date "+%F-%T")
/usr/bin/rclone copy --update --verbose --transfers 30 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --log-file "/home/user/scripts/bash/rclone_$current_date_time.log" --stats 1s "/home/user/projects" "googledrive:backup/user/projects"

exit

This also works well so far.
What I would like to know is whether I really need all of those settings. I copied the command from somewhere, and I would expect the defaults to do the job quite well,
so maybe there is no need to specify all of this. I had some trouble finding out what the default settings actually are.

One thing that bugs me is what happens to a running rclone transfer when I shut down my PC with "shutdown" while the transfer is still in progress. Will this damage anything? Will rclone restart everything the next time?

And my last question: how would I best put those settings into a variable for my call? The line gets quite long and I want to make several calls, so how can I keep the parts I use in every call in a variable? It should work like the date_time variable did, right?

No idea why these settings are used. Start with:

/usr/bin/rclone copy --verbose --log-file /home/user/scripts/bash/rclone_$current_date_time.log /home/user/projects googledrive:backup/user/projects
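If you want to check what the defaults are, rclone lists every flag together with its built-in default in its own help output. If I remember right, --checkers 8, --retries 3, --low-level-retries 10, --contimeout 60s and --timeout 300s are the defaults anyway, so in your original command only --transfers 30 (default is 4) and --update actually change behaviour. A quick way to check on your own machine (guarded so the snippet is a no-op where rclone is not installed):

```shell
# Print a few flags together with their built-in defaults.
# Does nothing on machines where rclone is not on the PATH.
if command -v rclone >/dev/null 2>&1; then
    rclone help flags | grep -E -- '--transfers|--checkers|--retries'
fi
```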

All will be OK. Nothing will be damaged. Next time, only what has not been transferred yet will be processed.

I think this is a bash question; you will find plenty of resources on the Internet. Start simple with one variable and add more step by step, testing every change.
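As a starting point, here is a minimal sketch (using the paths from your post) that keeps the shared flags in a bash array. Unlike a plain string, an array expanded with "${array[@]}" keeps every flag as a separate argument, even ones containing spaces:

```shell
#!/bin/bash

# Flags shared by every rclone call, kept in a bash array.
# (A plain string variable would break any argument containing spaces.)
common_flags=(--verbose --contimeout 60s --timeout 300s --retries 3)
logfile="/home/user/scripts/bash/log/rclone.log"

# A real call would then look like:
#   /usr/bin/rclone copy "${common_flags[@]}" --log-file "$logfile" \
#       "/home/user/projects" "googledrive:backup/user/projects"

# Print the composed command to check that the array expands as expected:
printf '%s ' rclone copy "${common_flags[@]}" --log-file "$logfile"
printf '\n'
```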

PS. As you are using Google Drive, make sure you configure your own client_id.


Your answer helped me a lot. I changed my bash script to this:

#!/bin/bash

# Exit if another copy of this script is already running.
# (The name checked here must match the actual script name, rclone_cron.sh.)
if pidof -o %PPID -x "rclone_cron.sh"; then
    exit 1
fi

logfilepath="/home/user/scripts/bash/log/rclone.log"

/usr/bin/rclone copy --verbose --log-file "${logfilepath}" "/home/user/projects" "googledrive:backup/user/projects"

exit

This seems to work well. I also managed to put my log file path into a variable and only reference that variable in my rclone command, which is much more practical and shorter.


If you are interested, you can take a look at this script I created the other day. If you want to use it, just put it in the /bin folder and call it from your cron job like this:

0 22 * * 5 Rclone-AIO -c '/home/user/projects' 'googledrive:backup/user/projects'
