I'm using rclone inside a shell script to send a CentOS 7 backup folder to Google Drive, the goal being to keep a remote clone of my server backups. The transfer is running, but the source folder is 48GB and the upload has already passed 102GB and is still going!
rclone copy -l --log-file /path/to/log -v --filter-from /path/to/excludes /backups/ mydestination:
I'm using the drive.file scope in the Google Drive config.
The source backups folder contains tar archives, SQL dumps and (probably more relevant) a folder with rsync backups of the complete system (using laurent22/rsync-time-backup). There are a great many symlinks and hard links in this folder.
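Since rsync-time-backup dedupes snapshots with hard links, the per-file size sum that a copier sees can be far larger than the actual disk usage. Here's a minimal demo in a temp dir (paths are made up for illustration) of how to see that gap with du, whose -l/--count-links flag counts every hard link separately:

```shell
# Demo: hard links make the sum of file sizes exceed on-disk usage.
tmp=$(mktemp -d)
mkdir -p "$tmp/snap1" "$tmp/snap2"
dd if=/dev/zero of="$tmp/snap1/data" bs=1M count=10 2>/dev/null
ln "$tmp/snap1/data" "$tmp/snap2/data"   # hard link: same inode, no extra disk use

du -sh "$tmp"    # counts each hard-linked file once (real on-disk usage)
du -shl "$tmp"   # --count-links: counts every link, roughly the volume a per-file uploader transfers
rm -rf "$tmp"
```

Running the same pair of du commands against the real backup folder shows how much of the inflated transfer is explained by hard-link duplication.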
On the first run I used the --copy-links switch and it resulted in a gigantic upload. I had misunderstood the flag: symlinks were followed and their targets copied as regular files/folders.
I'm now using the -l switch, assuming symlinks are copied as links (not followed). The log file shows thousands of .rclonelink files, so that part appears to work as intended.
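To sanity-check what the tree actually contains, I counted the link types with find. A sketch on a throwaway sample tree (a stand-in for the real backup folder); the same two find commands, pointed at the real path, show how many entries become .rclonelink files versus how many are hard-linked regular files:

```shell
# Sample tree with one symlink and one hard-link pair.
tmp=$(mktemp -d)
echo data > "$tmp/file"
ln -s "$tmp/file" "$tmp/symlink"
ln "$tmp/file" "$tmp/hardlink"

find "$tmp" -type l | wc -l            # symlinks: uploaded as .rclonelink with -l
find "$tmp" -type f -links +1 | wc -l  # hard-linked files: every link is a regular file
rm -rf "$tmp"
```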
Current stats from the log:

Transferred: 102.409G / 104.965 GBytes, 98%, 1.699 MBytes/s, ETA 25m40s
Errors: 318 (retrying may help)
Checks: 0 / 0, -
Transferred: 184907 / 194919, 95%
Elapsed time: 17h9m0s