Don't be me: really understand that sync will delete stuff

so, not thinking it through, I figured an easy way to move a few TBs of stuff to gdrive would be to call sync once a day until everything was copied.

seeing: sync is taking a while to start moving, but eh, it's probably doing some calculations to determine what to send
seeing: ah, sync is moving, and it dies when it hits limits
seeing: sync finished, so why isn't it exiting?

thinking: oh no! what have I done!

realizing: this is Google Drive, so I can restore from trash! (hint: switch the trash to list view; it's much less likely to hang Firefox)

as an aside, you're supposedly able to restore files from the admin console, but even if it works, it's much slower than restoring them manually as a user.

learn from my mistakes: make sure you know that sync will delete.
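If you just want to preview what a sync would do first, rclone's --dry-run flag lists the copies and deletes without performing them (the paths below are placeholders, not the poster's actual remotes):

```shell
# Show what "sync" would copy and delete, without changing anything.
# "source" and "gdrive:backup" are placeholder names for your own paths.
rclone sync source gdrive:backup --dry-run
```

In recent versions, pending deletions show up in the output as "Skipped delete" notices, so you can see exactly what a real run would remove.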


I made this mistake with rclone too. Luckily I didn't delete anything important because I was still testing rclone. Sync to me means bi-directional, so that if a file exists on the remote but not locally, it gets copied back to local; but as you mention, with rclone, sync will make the remote identical to the source, thus deleting the remote file. I am glad I know this should I ever need to restore data. I always use the 'copy' command now for my back-up jobs.
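The difference is easy to see side by side; these are placeholder paths, not a recommendation for any particular layout:

```shell
# "copy" only adds or updates files on the destination; it never deletes.
rclone copy /local/data remote:data

# "sync" makes the destination identical to the source, deleting any
# destination files that no longer exist locally.
rclone sync /local/data remote:data
```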

I'm sorry to hear this; know that you aren't the first @spotter :frowning:

A few releases ago I changed all the docs to add the -i flag to the examples which is great at catching this sort of error.

I always start out with rclone copy before moving onto rclone sync.


I learned about the -i flag on this forum, so I use it often whilst testing.

i find it most useful to use it together with a log file.
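A sketch of that combination (paths and the log file location are placeholders):

```shell
# -i (--interactive) asks for confirmation before each copy, delete,
# or overwrite, while the log file keeps a record of what happened.
rclone sync /local/data remote:data -i --log-level INFO --log-file=/tmp/rclone-sync.log
```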


well, i didn't lose any data (and also discovered that the supposed 30-day trash removal isn't true; nothing had been deleted from my trash since I opened it!)


Also, --backup-dir (docs) and/or --suffix (docs) are great to use if you have things running automatically. This way, if you screw up, it is unwindable. It may not be an easy unwinding but it is possible.

I use --backup-dir extensively since I prefer non-interactive calls to rclone, such as from crontab, or just because I don't want to think about it.

If you go the --suffix route, see the admonition about filters in the docs.
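For example (placeholder paths; check the --suffix docs for your version's exact behaviour):

```shell
# Files that would be overwritten or deleted are renamed instead,
# e.g. report.txt -> report.txt.bak
rclone sync /local/data remote:data --suffix .bak

# --suffix-keep-extension puts the suffix before the extension,
# e.g. report.txt -> report.bak.txt
rclone sync /local/data remote:data --suffix .bak --suffix-keep-extension
```

Without --backup-dir, the renamed files stay in the destination itself, which is why the filter admonition matters: later runs need to exclude the .bak files.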

you can add a timestamp to the --backup-dir; then each time it runs, rclone will create a new folder and keep each old version, giving you a sort of forever-forward incremental copy.

a simplified command would look like:

rclone sync source  remote:backup --backup-dir=remote:archive/20200525.191201

for me, the full command would be:

C:\data\rclone\scripts\rclone.exe  sync  "b:\calibre_20200525.191201\data\M\CalibrePortable" wasabieast2:en07/calibre/rclone/backup --backup-dir=wasabieast2:en07/calibre/rclone/archive/20200525.191201  --log-level DEBUG --log-file=C:\data\rclone\logs\calibre\20200525.191201\calibre_20200525.191201_rclone.log
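On Linux/macOS you can generate that timestamp inline instead of typing it by hand; here the command is just echoed so you can check it before running it (remote names are placeholders):

```shell
# Build a 20200525.191201-style timestamp for a per-run archive folder.
ts=$(date +%Y%m%d.%H%M%S)

# Print the sync command that would archive replaced/deleted files
# into their own dated folder (drop "echo" to actually run it).
echo rclone sync source remote:backup --backup-dir=remote:archive/$ts
```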
