Mount FTP remotes for local mirroring - Sounds like a plan?

What is the problem you are having with rclone?

Not a problem but a question about rclone being fit for my goal

What is your rclone version (output from rclone version)

rclone v1.45

  • os/arch: linux/arm64
  • go version: go1.11.6

Which OS you are using and how many bits (eg Windows 7, 64 bit)

linux amd64 -> ARM64 aka aarch64 aka ARMv8

Which cloud storage system are you using? (eg Google Drive)

Several FTP servers

Hi there, this is not about any particular problem but instead a question about whether or not I should use rclone for this task.

I want to mount a handful of remote FTP accounts and back them up by locally mirroring their contents.

I had been using lftp for a long time until I found it had been failing silently by not fully downloading several files, so lftp is out of the question for me.

Then I tried to update my scripts with aria and wget and, while wget does a fine job, it doesn't purge files removed from the source.
To use wget I would have to implement some fancy way to compare local files against the ".listing" contents and purge the extras...
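For what it's worth, the purge half of that idea is not much code; here is a rough sketch, assuming the expected remote paths have already been extracted from the ".listing" data into a plain file (the parsing itself is the fiddly part and is not shown):

```shell
#!/bin/sh
# purge_extras MIRROR EXPECTED
# Deletes any file under MIRROR that is not listed in EXPECTED
# (one relative path per line, pre-extracted from wget's .listing files).
purge_extras() {
    mirror="$1"
    expected="$2"
    find "$mirror" -type f | while read -r f; do
        rel="${f#"$mirror"/}"
        # -x whole-line match, -F fixed string, -q quiet
        grep -qxF "$rel" "$expected" || rm -- "$f"
    done
}
```

(It would break on filenames containing newlines, but for typical FTP content that is usually acceptable.)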

Duplicity looked interesting, but it can't use remotes as sources (grrrr).

These remote servers don't have rsync and won't allow me to SSH into them, which rules out pretty much every other option I researched.

So, I am down to mounting the FTP servers locally and performing a "local to local" sync with... anything I find reasonable at this stage, really.

So my question is whether using rclone to script the mounting of the remotes seems like a reliable solution or not.

I intend to implement a few checks and mail notifications after the sync, but that would be beyond rclone's scope, I guess.

These backups don't tend to include large files, but the total size ranges widely, from 200 MB to 5 GB.

Thanks in advance for any feedback on this.
MK

hello and welcome to the forum,

that version of rclone is over two years old.
best to update from rclone.org website.

sure, rclone can download files from a remote ftp server to local storage.
and if you are looking for a backup solution, check out the --backup-dir flag with a date/time stamp.

first setup and test a remote pointing to the ftp server
https://rclone.org/ftp/

to backup, you can use

  • rclone copy ftpserver: /path/to/local
    or
  • rclone mount ftpserver: /path/to/local/ with FUSE and use standard linux tools to copy from that mount to local storage.
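for what it's worth, a cron-able wrapper along those lines might look like this (a sketch only; "ftpserver:" and the /backup paths are placeholders for whatever you configure):

```shell
#!/bin/sh
# mirror_ftp REMOTE DEST -- sketch of one mirror run.
# `rclone sync` makes DEST/current match the remote, deletions included,
# while --backup-dir moves anything deleted or overwritten into a dated
# archive directory instead of discarding it.
mirror_ftp() {
    remote="$1"                      # e.g. "ftpserver:" from `rclone config`
    dest="$2"                        # e.g. /backup/ftpserver
    stamp=$(date +%Y-%m-%d_%H%M%S)
    mkdir -p "$dest/current" "$dest/logs"
    rclone sync "$remote" "$dest/current" \
        --backup-dir "$dest/archive/$stamp" \
        --log-file "$dest/logs/$stamp.log" -v
}

# e.g. mirror_ftp ftpserver: /backup/ftpserver
```

note it uses sync rather than copy, since you want deletions to propagate; --backup-dir is what keeps them recoverable.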

Hi!

Yeah, I just realized it was old (didn't realize it was THAT old, though). I actually never used rclone on that system, but the truth is that my intention is to use this board for the task, so I pasted what is there.

That is the latest version apparently packaged for the distro I am testing on this SBC, but I suppose I will be able to work around that one way or another.

My main concerns are these:

  • first, that I assume rclone is much more tested as an UPLOAD solution than as a download one

  • the second, and more important one, is that I think FTP, being such an old protocol, might work unreliably in this scenario.

I discovered the lftp issue a long while ago now and I need to get this back on track, but every solution I have attempted so far has been blocked for the silliest reasons... I could add to my list above, but it would just be sad to read, lol.

why would you assume that?

yes, ftp is a problematic protocol for backup.
imho, ftp cannot be used for backups.

do the servers have any other protocol or can you install software such as sftp?
as rclone can perform checksum verification for uploaded/downloaded files.

given the small amount of data, you could get some cloud storage, aws, google drive.
and use rclone to upload from those servers, to cloud.
then rclone can perform md5 check-summing during copy.
now that would be a backup you could trust.

Just because I seem to read a lot more posts from people doing offsite backups of their local info than from people downloading remote contents for local backup. And FWIW I am talking about regular users, not the devs.

Not really. What I might be able to do is run some bash scripts on the servers to get the hashes of every file and check them once the files have been downloaded. Sounds like more work than I'd really want, but if that is what it takes to trust the backups, I could do that.
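In case it helps future readers, that scheme can stay fairly small: a manifest generated server-side by cron, then checked locally after each download. A minimal sketch (paths and the manifest name are hypothetical):

```shell
#!/bin/sh
# On each server, a cron job could rebuild an MD5 manifest of the data
# directory (paths kept relative so the manifest is portable):
#   cd /home/user/data && \
#       find . -type f ! -name MANIFEST.md5 -exec md5sum {} + > MANIFEST.md5

# Locally, after the mirror run, verify every file against the manifest.
# Runs in a subshell so the caller's working directory is untouched;
# exits non-zero if any file is missing or its hash differs.
verify_mirror() (
    cd "$1" || exit 1            # $1 = local mirror directory
    md5sum -c --quiet MANIFEST.md5
)
```

A non-zero exit from the check is then an easy trigger for the mail notification mentioned above.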

But AFAIK that is not present over the ftp protocol, right?

Thanks a Ton for your time BTW.

how would you run bash scripts on those remote ftp servers, ssh or what?

some ftp servers can do check-summing with a compatible ftp client.

At least some of them do allow me to upload bash scripts and use cron to execute them, but on second thought I am not sure all of them will allow me to do this.

On the other hand, I'd like to avoid an overcomplicated scheme that becomes unsustainable and/or has too many potential points of failure.

It's not that I am lazy, it's just that, IMHO, such schemes tend to become abandoned and untrustworthy...

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.