Automate sync - failed to load config file: permission denied

I am attempting to automate a sync command so that every night at 11:55 PM my EC2 instance syncs to my GDrive, backing up everything on the instance. The sync command below works great when run manually. But after I put it in a cronjob, it fails with "Failed to load config file: permission denied", both when the cronjob runs and when I then try to run it manually. Once I get this message I can no longer edit or make any changes to the rclone config, and eventually I have to remove it and restart the config process from scratch.

I am running rclone v1.53.3 on an AWS EC2 Cloud9 instance attempting to use Google Drive as cloud storage.

rclone sync /home/ec2-user/environment/radio_data_collection remote:RadioDataCollection --config /home/ec2-user/.config/rclone/rclone.conf

I don't understand how I can run rclone config file to get the path, pass it with the --config flag, and then suddenly lose access to it as soon as the command goes into a cronjob. I am part of a team; each of us has our own EC2 instance that collects data throughout the day, and every night we want to store that data on our team GDrive. The command above works for some of the team but not for others: some are hitting this permission issue and some are not, even though the instances are set up identically. My only guess is that cronjobs run as a different user, so I need to configure rclone to work under cron, but I am unsure how to do that. I have also tried moving the command into a .sh file and running the cronjob on that script, but I end up with the same issue.
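To test the different-user guess, I could point the cronjob at a small diagnostic script first. This is just a sketch; the config path is the one from my command above, and the output messages are arbitrary:

```shell
# Diagnostic: log which user cron runs as and whether that user can
# read the rclone config. Config path is from my sync command.
CONF=/home/ec2-user/.config/rclone/rclone.conf

echo "cron user: $(id -un)"
ls -l "$CONF" || echo "cannot even stat the config"

if [ -r "$CONF" ]; then
    echo "config is readable"
else
    echo "config is NOT readable by $(id -un)"
fi
```

Comparing that output between an instance where the cronjob works and one where it fails should show whether this is really a user/ownership difference.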

You can run rclone config file to get the path of your config file:

felix@gemini:~$ rclone config file
Configuration file is stored at:

Once you know the path, you can pass it explicitly with --config:

felix@gemini:~$ rclone lsf GD: --config /opt/rclone/rclone.conf
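A common cause of that exact error is the config file having been created or touched by root (for example via a one-off sudo rclone), which leaves it unreadable to your normal user. A quick check-and-fix sketch, assuming your user is ec2-user and the path from your command:

```shell
CONF=/home/ec2-user/.config/rclone/rclone.conf   # path from your command

# Who owns the config, and with what mode? An owner of "root" would
# explain "permission denied" for ec2-user.
stat -c '%U %a' "$CONF" 2>/dev/null || echo "config missing or not statable"

# If it exists but is unreadable, hand it back to your login user
# (sudo needed if root owns it) and keep it private, which is how
# rclone writes it by default.
if [ -e "$CONF" ] && [ ! -r "$CONF" ]; then
    sudo chown "$(id -un):" "$CONF"
    chmod 600 "$CONF"
fi
```

After that, both your interactive shell and the cronjob (if it runs as the same user) should be able to load the file again.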

In general though, are you running it as the same user? Here is the example upload script that I use; depending on how you are running it, you may need full paths.

felix@gemini:/opt/rclone/scripts$ cat upload_cloud
#!/bin/bash
# RClone upload script

# Local Drive
# This must be a local mount point on your server that is used for the source of files.
# WARNING: If you make this your rclone Google Drive mount, it will create a move loop
# and delete your files!
# Make sure to set this to the local path you are moving from!
LOCAL=/mnt/local

# Exit if already running
if [[ "$(pidof -x "$(basename "$0")" -o %PPID)" ]]; then exit; fi

# Check for excludes file
if [[ ! -f /opt/rclone/scripts/excludes ]]; then
    echo 'excludes is not there, aborting.'
    exit 1
fi

# Is $LOCAL actually a local disk?
if /bin/findmnt "$LOCAL" -o FSTYPE -n | grep fuse; then
    echo "fuse file system, exiting"
    exit 1
fi

# Move older local files to the cloud. I added --min-age 3d to let files sit
# a few days so initial analysis can happen locally.
/usr/bin/rclone move "$LOCAL" gcrypt: --log-file /opt/rclone/logs/upload.log -v --exclude-from /opt/rclone/scripts/excludes --delete-empty-src-dirs --fast-list --drive-stop-on-upload-limit --min-age 3d

Does the command you shared work when run by hand? If cron is running it as the same user, you generally do not need to specify a config file location at all, unless you moved the config from its default location (which I do).
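Once the config is readable, a cron entry along these lines should work. Absolute paths matter because cron runs with a minimal environment; the log path here is just an example:

```shell
# crontab entry (edit with: crontab -e, as ec2-user), runs 23:55 nightly.
# Use absolute paths for the binary, the config, and the log; cron's PATH is minimal.
55 23 * * * /usr/bin/rclone sync /home/ec2-user/environment/radio_data_collection remote:RadioDataCollection --config /home/ec2-user/.config/rclone/rclone.conf --log-file /home/ec2-user/rclone_sync.log -v
```

The -v plus --log-file gives you something to read the morning after a failure instead of a silent no-op.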

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.