I am trying to sync multiple remote directories, which reside on a ZFS pool on a FreeBSD system, to a matching set of local directories on an ext4 filesystem on a Linux (Ubuntu 22.04) host.
**I have to go this route because the FreeBSD system is too old for Rclone to run on it, so I am running Rclone on the destination to pull the data from the source. Also, Rsync performance is horrible compared with Rclone.**
Each remote directory contains one sub-directory and multiple symlinks to that sub-directory, like so:
$ ls -lah
drw-r--r-- 6 user staff 4.0K Jul 13 06:54 backup
drwxr-xr-x 3 user staff 4.0K Jul 29 05:51 music
drwxrwxrwx 6 user staff 4.0K Sep 25 02:11 plex
lrwxrwxrwx 1 user staff 5 Oct 23 16:08 plex2 -> plex/
lrwxrwxrwx 1 user staff 5 Oct 23 16:09 plex3 -> plex/
Rclone does fetch the data, but it treats each symlink on the source as a real directory, resulting in 3x the data being written to the target.
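For illustration, here is a minimal sketch of the kind of pull I am attempting, assuming an sftp remote named `freebsd:` and made-up paths. Newer rclone builds appear to have a `--sftp-skip-links` option on the sftp backend that skips symlinks on the source, which sounds like what I need here:

```
# Hypothetical sketch: pull one remote directory over SFTP.
# --sftp-skip-links (available in newer rclone builds) skips
# symlinks and other non-regular files on the source.
rclone sync freebsd:/pool/media/plex /mnt/data/plex \
    --sftp-skip-links \
    --progress
```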
Run the command 'rclone version' and share the full output of the command.
rclone v1.53.3-DEV
- os/arch: linux/amd64
- go version: go1.18.1
This seems to be the latest version available from my distribution's package repository.
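(For what it's worth, the rclone docs suggest the official install script as a way to get a newer build than what the distribution packages; I have not tried it yet:)

```
# From the rclone.org install instructions
sudo -v ; curl https://rclone.org/install.sh | sudo bash
```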
Which cloud storage system are you using? (eg Google Drive)
None. The sync is done over SSH (an sftp remote).
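The remote is a plain sftp remote, configured roughly like this in `rclone.conf` (the remote name, host, and key path below are placeholders):

```
[freebsd]
type = sftp
host = freebsd.example.com
user = user
key_file = ~/.ssh/id_ed25519
```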
The command you were trying to run (eg rclone copy /tmp remote:tmp)
Thank you @asdffdsa for the idea. Unfortunately, as stated in my original post, the FreeBSD system is beyond obsolete, and the datasets are set up in a complex way that would make attempting SMB exports a nightmare; I don't think I want to touch that. Still, thank you for the hint and the thorough guide. Bookmarking it for the future.