Strategy for uploading large(ish) library?

I have about 2.5 TB to upload, and with my current internet connection, it should take almost a week. I’m running rclone from my OMV server through an SSH connection. When I shut down the SSH connection, though, the process stops. Is there some way I can get it to run in the background? Or do I have to keep my SSH session up and running the entire time? Thanks.

Use screen:

screen -dmS rclone rclone copy remote: remote2:
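In case the flags aren't obvious: -d -m starts the session detached (nothing attached to your terminal), and -S names it so you can find it again later. To confirm it's still running after you log back in:

screen -ls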

Thanks. That’s exactly what I was looking for. (Unfortunately it looks like OMV doesn’t have screen, and I can’t install it with apt-get.)

Download it manually:
wget http://de.archive.ubuntu.com/ubuntu/pool/main/s/screen/screen_4.3.1-2build1_amd64.deb
and install it with:
dpkg -i screen_4.3.1-2build1_amd64.deb
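If dpkg then complains about unmet dependencies, apt can usually pull them in afterwards (assuming your package lists are reasonably current):

apt-get -f install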


Unfortunately my installation of OMV is old enough that it doesn't have the right dependencies (or versions) for this package. I don't want to risk messing anything up by doing an update. Instead I use an rpi to SSH in and run screen from there. I don't mind leaving the rpi running.

You could always run it in the background the old-fashioned Unix way, something like this:

nohup rclone .... </dev/null >/tmp/log 2>&1 &

This should remain in the process tree (see it with ps ax) after you log out.
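To unpack that line: </dev/null detaches stdin, >/tmp/log 2>&1 sends both stdout and stderr to /tmp/log, and the trailing & puts the job in the background, while nohup keeps it alive after your SSH session ends. You can check on it later with something like:

tail -f /tmp/log
ps ax | grep rclone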

If you use screen, is there any way to open the window again to see where it's at?

screen -r rclone

where rclone is the session name you passed with -S in the earlier command, screen -dmS rclone rclone copy remote: remote2:
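One thing worth knowing: if screen refuses to reattach because the session is still marked as attached (say, your old SSH connection died uncleanly), you can force-detach it and take it over in one go:

screen -rd rclone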

To detach, press:
Ctrl-a
d
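And if the rclone output has already scrolled off the top of the window, Ctrl-a [ puts screen into scrollback (copy) mode so you can page back through it; press Esc to return to the live output.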

Ahhhh! That explains why rclone was in the command twice.

Thanks!