Struggling to schedule an RClone job


#1

Hey guys, I’m completely new to this but learning fast so please bear with me :slight_smile:

I have installed RClone on a shared seedbox and am sending the files to my Google Drive. I have created a script that seems to work, but I am completely lost when it comes to setting up a schedule for it.

I’ve read through the forums and tried to piece some bits together but I’m not getting anywhere fast with it at the moment.

This is the script I want to use:

rclone move ~/private/rtorrent/data/Complete/ “gdrive:/The Skull/Complete” -v --no-traverse --min-age 1m --log-file= /private/rtorrent/rclonelogs/rclone-upload.log

Could someone explain to me in layman's terms how I should create a schedule that runs the script every 15 minutes? I'm using a Windows box with PuTTY.

Thanks for any help you can provide


#2

I think I’m almost there but it just doesn’t seem to work.

Can someone point out any stupid mistakes I might be making?

This is the content of my cronscript.sh:

#!/bin/bash
if pidof -o %PPID -x “rclone-cron.sh”; then
exit 1
fi
rclone move ~/private/rtorrent/data/Complete/ “gdrive:/The Skull/Complete” -v --no-traverse --min-age 1m --log-file= /private/rtorrent/rclonelogs/rclone-upload.log
exit

and this is what I am entering under crontab -e:

*/2 * * * * /media/dma/craftyclown/rclone-cron.sh >/dev/null 2>&1

any help would be greatly appreciated as I’m scratching my head here.


#3

I got there in the end!

I stripped the script down to:

rclone move ~/private/rtorrent/data/Complete/ “gdrive:/The Skull/Complete” -v --min-age 1m

It appears --no-traverse and the log file were the issue.

I do have another question whilst I'm here though: is it possible to have multiple rclone moves to different destinations in the same script? Or do I need to make multiple scripts?

Cheers

EDIT:

Urghh, I spoke too soon. My script works fine if I run it manually, but the cron scheduling part just isn’t happening :frowning:
Anyone have any suggestions?


#4

So I contacted the admin at the seedbox and they confirmed the cron job is firing every 60 seconds as requested.

Jun 11 05:52:01 pallas CRON[16576]: (craftyclown) CMD (/media/dma/craftyclown/rclone-cron.sh >/dev/null 2>&1)
Jun 11 05:53:01 pallas CRON[23663]: (craftyclown) CMD (/media/dma/craftyclown/rclone-cron.sh >/dev/null 2>&1)
Jun 11 05:54:01 pallas CRON[30505]: (craftyclown) CMD (/media/dma/craftyclown/rclone-cron.sh >/dev/null 2>&1)
Jun 11 05:55:01 pallas CRON[5300]: (craftyclown) CMD (/media/dma/craftyclown/rclone-cron.sh >/dev/null 2>&1)
Jun 11 05:56:01 pallas CRON[28162]: (craftyclown) CMD (/media/dma/craftyclown/rclone-cron.sh >/dev/null 2>&1)
Jun 11 05:57:01 pallas CRON[2738]: (craftyclown) CMD (/media/dma/craftyclown/rclone-cron.sh >/dev/null 2>&1)
Jun 11 05:58:01 pallas CRON[9824]: (craftyclown) CMD (/media/dma/craftyclown/rclone-cron.sh >/dev/null 2>&1)
Jun 11 05:59:01 pallas CRON[17207]: (craftyclown) CMD (/media/dma/craftyclown/rclone-cron.sh >/dev/null 2>&1)
Jun 11 06:00:01 pallas CRON[24309]: (craftyclown) CMD (/media/dma/craftyclown/rclone-cron.sh >/dev/null 2>&1)
Jun 11 06:01:01 pallas CRON[15197]: (craftyclown) CMD (/media/dma/craftyclown/rclone-cron.sh >/dev/null 2>&1)

So I really am baffled now. The job is running every 60 seconds but nothing appears to happen; however, if I run the job manually it works perfectly.

Am I being stupid and missing something obvious here?

Could someone tell me how to correctly add some kind of logging, so I can at least troubleshoot this further?


#5

I tried adding logging back into my script:

#!/bin/bash
if pidof -o %PPID -x “rclone-cron.sh”; then
exit 1
fi
rclone moveto ~/private/rtorrent/data/Complete/ “gdrive:/The Skull/Complete” -v --min-age 1m --log-file=/private/rclonelogs/rclone-upload.log

but I get this error:

[pallas ~] /media/dma/craftyclown/rclone-cron.sh
2018/06/11 09:37:44 Failed to open log file: open /private/rclonelogs/rclone-upload.log: no such file or directory

I tried manually adding a file called rclone-upload.log using FileZilla and gave it 777 permissions, but it made no difference.

If I can’t even get the log to work then I have no way of knowing what is going wrong with the rest of the job! :frowning:


#6

I use this for a simple move:

felix@gemini:~/scripts$ cat upload_cloud
#!/bin/bash

# Move older local files to the cloud
/usr/bin/rclone move /data/local/Movies/ gcrypt:Movies --checkers 4 --fast-list --syslog -v
cd /data/local/Movies
rmdir /data/local/Movies/*
/usr/bin/rclone move /data/local/TV gcrypt:TV --checkers 4 --fast-list --syslog -v
cd /data/local/TV
rmdir /data/local/TV/*

and you can use something like this in cron:

# Nightly Cloud Upload
12 1 * * * /usr/bin/flock -n /tmp/upload_cloud.lock /home/felix/scripts/upload_cloud

#7

Thanks @Animosity022. One thing that stands out to me straight away is that you have the full path to rclone in your script. Is this necessary when using cron?

Apologies once again if these are stupid questions, but I am completely new to Linux and learning as I go.


#8

Yes!!! That was it!

I needed the full path for RClone for it to run.

Is there any reason some people are able to run it without adding the full path?


#9

It depends on the PATH of the cron job and where exactly you installed rclone.

People typically put a line like this near the start of the crontab to set the PATH to a known state:

PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
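Putting those two together, a minimal crontab for the every-15-minutes schedule asked about earlier might look like this (the script path is the one used in this thread; adjust to suit your own setup):

```shell
# Let cron find rclone and other tools in the usual places
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin

# Run the upload script every 15 minutes, discarding cron's own output
*/15 * * * * /media/dma/craftyclown/rclone-cron.sh >/dev/null 2>&1
```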


#10

Right, I see. Is there a downside to adding the full path as I did in the script I used?


#11

Usually for any scripts, it’s best to use full paths to ensure you are running what you expect.
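If you're not sure what the full path is, `command -v` will tell you. A small sketch (the rclone location in the comment is only an example of what a seedbox might report, not a guarantee):

```shell
#!/bin/bash
# Print where a command lives, or a note if it is not on the PATH.
# Handy for finding the path to hard-code into a cron script.
path_to() {
    command -v "$1" || echo "$1: not found in PATH"
}

path_to rclone   # e.g. /media/dma/craftyclown/bin/rclone on a seedbox
path_to bash
```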


#12

Guys I think there’s still something wrong with my script.

Large files are getting copied many times over and I have ended up with 9 copies of a 35 GB file! If I keep going like this, I'm quickly going to hit the Google daily limit :frowning:

Is there anything I can add to the script to prevent these multiple copies? I presume it's because the larger files are still in the process of copying over.

This is my script as it stands:

#!/bin/bash
if pidof -o %PPID -x “rclone-cron.sh”; then
exit 1
fi
/media/dma/craftyclown/bin/rclone copy ~/private/rtorrent/data/ “gdrive:/The Skull/Feral” -v --min-age 1m


#13

At a guess, you are starting a new copy before the old one has finished.

This is what I normally use

LOCKFILE="/var/lock/`basename $0`"

(
    flock -n 9 || {
	echo "$0 already running"
	exit 1
    }

rclone commands go here

) 9>$LOCKFILE

#14

Thanks Nick. I thought that the above command would prevent the copy from being run multiple times?

So would my new script become:

#!/bin/bash LOCKFILE="/var/lock/`basename $0`"

(
flock -n 9 || {
echo “$0 already running”
exit 1
}

/media/dma/craftyclown/bin/rclone sync ~/private/rtorrent/data/ “gdrive:/The Skull/Feral” -v --min-age 1m --log-file=/media/dma/craftyclown/rclone-upload.log

) 9>$LOCKFILE


#15

It will, but you put it in the wrong place… Put it on the line after #!/bin/bash, also change the smart quotes into normal ".


#16

I did have it after #!/bin/bash. The below was my original script. Sorry, which quotes are wrong?


#17

These ones

“rclone-cron.sh”

should be

"rclone-cron.sh"

#18

Ahhhhhhhh. Apologies. I copied and pasted it from another thread here and didn't spot that. Do you think that is why I have been getting the duplicated runs then?

So which script would be better? Or are both essentially the same?

#!/bin/bash
if pidof -o %PPID -x “rclone-cron.sh”; then
exit 1
fi
/media/dma/craftyclown/bin/rclone copy ~/private/rtorrent/data/ “gdrive:/The Skull/Feral” -v --min-age 1m

or

#!/bin/bash
LOCKFILE="/var/lock/`basename $0`"

(
flock -n 9 || {
echo “$0 already running”
exit 1
}

/media/dma/craftyclown/bin/rclone sync ~/private/rtorrent/data/ “gdrive:/The Skull/Feral” -v --min-age 1m --log-file=/media/dma/craftyclown/rclone-upload.log

) 9>$LOCKFILE


#19

Quite possibly, yes.

For practical purposes they are both the same. So use whichever you like!


#20

Thanks Nick, that is super helpful. I can’t believe the quotes were the issue!

In future, if multiple RClone jobs have been accidentally run, is there a way of checking for that and potentially cancelling them?

On another note, the huge number of dupes has triggered a 24-hour ban with Google. Would you happen to know if there is a way of finding out when the ban was triggered and when it is due to end?
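The only thing I've worked out so far for the first question is checking for stray processes by hand, something like this (pgrep and pkill come with procps on most Linux systems):

```shell
# Show any rclone processes still running (PID + full command line)
pgrep -af rclone || echo "no rclone processes running"

# If duplicates are running, stop them all; an interrupted
# "rclone move" is safe to re-run later
pkill -f rclone 2>/dev/null || true
```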