Issue with rclone running from cron

I know this isn't strictly an rclone problem, but several of us use cron to automate our rclone jobs.

There's one particular job that just will not run from cron, even though it runs perfectly when executed manually.

My Cron Job:
30 * * * * /opt/rclone/scripts/gvault-move.sh

The Script:

#!/usr/bin/bash
if pidof -o %PPID -x "gvault-move.sh"; then
    echo "$(date "+%Y/%m/%d %H:%M:%S") WARN  : Cron attempted to start the move of the local data mirror to vault, but the process is already running." >> /opt/rclone/logs/vault-upload.log
    exit 1
fi

echo "$(date "+%Y/%m/%d %H:%M:%S") INFO  : Cron started the rclone move subroutine" >> /opt/rclone/logs/vault-upload.log
# Move older local files to the cloud
rclone move \
    /srv/dev-disk-by-label-Buffer/mirror/ vault: \
    --log-level INFO \
    --log-file /opt/rclone/logs/vault-upload.log \
    --stats 0 \
    --exclude-from /opt/rclone/exclude/gvault \
    --drive-chunk-size 64M \
    --delete-empty-src-dirs \
    --user-agent gvaultapp \
    --fast-list

And I've made the script executable with
chmod +x /opt/rclone/scripts/gvault-move.sh

When the script is due to run from cron, nothing appears in the logs. Can anyone help me figure out why it won't run? Other scripts in the same crontab run just fine.
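As an aside about the duplicate-run guard in the script: the `pidof -o %PPID -x` check has a small window between the test and the start of the job where two instances can slip through. `flock(1)` gives an atomic alternative; here is a minimal sketch, not the poster's actual script, using a hypothetical lock path:

```shell
#!/bin/bash
# Single-instance guard with flock: open a lock file on fd 9 and try to
# take an exclusive, non-blocking lock. The lock is released automatically
# when the script exits. /tmp/gvault-move.lock is a hypothetical path.
exec 9>/tmp/gvault-move.lock
if ! flock -n 9; then
    echo "$(date '+%Y/%m/%d %H:%M:%S') WARN  : previous run still active" >&2
    exit 1
fi
echo "lock acquired; safe to run the rclone move here"
# ... the rclone move command would go here ...
```

Unlike the pidof check, this cannot be fooled by a renamed script or a PID race, since the kernel arbitrates the lock.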

cron runs jobs with a minimal environment (PATH is typically just /usr/bin:/bin), so it may have no idea where rclone is.

You can use the full path to rclone in the script, and you can redirect with `> somelog.txt` in the cron entry to see what it is doing. I'm guessing it's failing due to the PATH though.
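You can simulate cron's stripped-down environment with `env -i` to see the effect for yourself; the /tmp paths and the fake tool below are purely illustrative:

```shell
# Install a fake binary somewhere outside cron's default PATH.
mkdir -p /tmp/fakebin
printf '#!/bin/sh\necho fake-tool ran\n' > /tmp/fakebin/fake-tool
chmod +x /tmp/fakebin/fake-tool

# Under a cron-like minimal PATH, the bare name is not found...
env -i PATH=/usr/bin:/bin sh -c 'fake-tool' 2>/dev/null \
    || echo "bare name not found under cron-like PATH"

# ...but the absolute path works regardless of PATH.
env -i PATH=/usr/bin:/bin sh -c '/tmp/fakebin/fake-tool'
```

The same logic applies to any binary a cron script calls: either use absolute paths or set PATH explicitly at the top of the script or crontab.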

The other cron job running is almost identical and works from the same crontab:

#!/bin/bash
if pidof -o %PPID -x "root_backup.sh"; then
    echo "$(date "+%Y/%m/%d %H:%M:%S") WARN  : Cron attempted to start the root backup, but an existing root cron job is still running." >> /srv/vaultfs/rootfs/storage/scripts/rclone/root/root_backup.log
    exit 1
fi
echo "$(date "+%Y/%m/%d %H:%M:%S") INFO  : Root backup started..." >> /srv/vaultfs/rootfs/storage/scripts/rclone/root/root_backup.log
# Echo the backup list
printf '%b\n' "$(cat /srv/vaultfs/rootfs/storage/scripts/rclone/root/filter.list)"
rclone sync / gdrive-root: \
    --track-renames \
    --copy-links \
    --log-level INFO \
    --log-file /srv/vaultfs/rootfs/storage/scripts/rclone/root/root_backup.log \
    --drive-use-trash \
    --filter-from /srv/vaultfs/rootfs/storage/scripts/rclone/root/filter.list \
    --stats 0
echo "$(date "+%Y/%m/%d %H:%M:%S") INFO  : Root backup completed." >> /srv/vaultfs/rootfs/storage/scripts/rclone/root/root_backup.log
exit

Cron entry:

30 * * * * /srv/vaultfs/rootfs/storage/scripts/rclone/root/root_backup.sh

Add a log file at the end of the cron entry and see what the issue is:

30 * * * * /srv/vaultfs/rootfs/storage/scripts/rclone/root/root_backup.sh >> /tmp/myjob.log 2>&1

Sorry for the late reply

Here's the output:

/bin/sh: 1: /opt/rclone/scripts/gvault-move.sh: not found

But the file does certainly exist:

root@omvdocker:/opt/rclone/scripts# pwd
/opt/rclone/scripts
root@omvdocker:/opt/rclone/scripts# ls
gvault-move.sh

Is it missing execute permissions?

ls -al /opt/rclone/scripts/gvault-move.sh

No

root@omvdocker:/opt/rclone/scripts# ls -al gvault-move.sh
-rwxr-xr-x 1 root root 687 Sep 17 14:19 gvault-move.sh

Hmm, that's definitely a bit odd.

Are you running as root? What OS is it running on? From the Linux side, that all looks very good.

running as root, running crontab -e as root.

Debian, with OMV (OpenMediaVault) installed.

Let me see if I can replicate.

Can you share the crontab -l as root? I want to make sure I test the same thing. I'm guessing it's something more OMV related but I've never used it before and wanted to see if I can replicate on a test server.

root@omvdocker:~# crontab -l
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/go/bin
52 * * * * /opt/rclone/scripts/gvault-move.sh >> /tmp/myjob.log 2>&1

OMV is just a set of packages, much like Webmin.

Definitely a strange one.

If the execute is missing I get:

/bin/sh: /opt/rclone/scripts/gvault-move.sh: Permission denied

If I have the file missing completely, I get:

/bin/sh: /opt/rclone/scripts/gvault-move.sh: No such file or directory

With both of those fixed, it executes fine for me in a test VM. This doesn't seem to be anything rclone specific, since cron can't even call the script; it looks more OS related, but I'm not sure what else offhand. I'll play around a little more and see if I get another idea.
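One more classic cause of `not found` on a file that clearly exists: the shebang names an interpreter that is missing (or the script has Windows CRLF line endings, which turns `/bin/bash` into `/bin/bash\r`). The kernel then fails the exec with ENOENT and the calling shell blames the script itself. A quick reproduction, using a deliberately broken script at a hypothetical /tmp path:

```shell
# Create an executable script whose shebang points at a nonexistent shell.
printf '#!/no/such/shell\necho hi\n' > /tmp/badshebang.sh
chmod +x /tmp/badshebang.sh

# The exec fails even though the file is present and executable:
/tmp/badshebang.sh 2>/dev/null || echo "exec failed despite file existing"

# Inspect the shebang (cat -v would also reveal a trailing ^M from CRLF):
head -1 /tmp/badshebang.sh
```

So checking the first line of the failing script, and whether the interpreter it names actually exists, is worth a try here.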

If you check root's crontab directly, are there any funky control characters or anything else hiding in there?

[root@gemini cron]# cat -vet root
$
7 7 * * * cp -p /var/lib/grafana/grafana.db /data/backups/grafana.db$
$
# Update Grafana Plugins$
15 3 * * * /usr/sbin/grafana-cli plugins update-all$
30 4 * * * /bin/systemctl restart grafana-server$
$
# rsnapshot backups$
12 1 * * * /usr/bin/ionice -c2 -n7 /usr/bin/rsnapshot daily$
[root@gemini cron]#

Which directory are you running that from?

Should be in:

/var/spool/cron/crontabs

root@omvdocker:/var/spool/cron/crontabs# cat -vet root
# DO NOT EDIT THIS FILE - edit the master and reinstall.$
# (/tmp/crontab.l2oR6G/crontab installed on Wed Sep 18 15:05:30 2019)$
# (Cron version -- $Id: crontab.c,v 2.13 1994/01/17 03:20:37 vixie Exp $)$
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/go/bin$
30 * * * * ({ /srv/vaultfs/rootfs/storage/scripts/rclone/root/root_backup.sh; } | tee /tmp/U9GhC8Gz09FCqrfo.stdout) 3>&1 1>&2 2>&3 | tee /tmp/U9GhC8Gz09FCqrfo.stderr$
52 * * * * /opt/rclone/scripts/gvault-move.sh >> /tmp/myjob.log 2>&1$
#51 * * * * env >> /tmp/myjob.log$

@jiru Make the script executable. As the root user, run:
chmod +x /opt/rclone/scripts/gvault-move.sh

Also:
service cron start

Is the path to bash in your shebang correct?
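The script's shebang is `#!/usr/bin/bash`, but on Debian systems without a merged /usr, bash is installed at `/bin/bash` and `/usr/bin/bash` may not exist, which would produce exactly this "not found" error. A quick check:

```shell
# Where does bash actually live, and which candidate paths exist?
command -v bash
for p in /bin/bash /usr/bin/bash; do
    if [ -x "$p" ]; then echo "$p exists"; else echo "$p missing"; fi
done
```

If `/usr/bin/bash` turns out to be missing, changing the shebang to `#!/bin/bash` (as in the working root_backup.sh) should fix the job.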

It is executable, and other cron jobs in the same file are running.