Script to upload to two cloud drives at once?

Hi,

Does anyone have a script to upload to two drives at once?

I’m using a seedbox.

What I was doing was copying the files to a second folder on the seedbox, then uploading from the original location to one drive and from the second folder to a second drive, but that bogged down the seedbox too much.

Now I upload to one drive and use a second VPS to copy from that drive to the other. As my library grows, this is getting slower and slower.

A script to upload to two drives would be perfect.

This question is worded pretty poorly, but the main question I think you had was “Can I upload to two drives at once?”

Short answer: yes. Create two rclone upload processes to different “drives” (i.e., backend cloud storage providers).

I would personally just execute a script that puts an ampersand at the end of each rclone command:

rclone copy /mount/point/one driveone:/ &
rclone copy /mount/point/two drivetwo:/ &

That should spawn both rclone processes to the background and allow simultaneous upload, I hope!
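If you run those from a script and need it to block until both uploads finish, you can add a wait after the two commands (same mount points as above):

rclone copy /mount/point/one driveone:/ &
rclone copy /mount/point/two drivetwo:/ &
wait    # returns once both background rclone jobs have exited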


Oh I didn’t know about the & thing… I’ll give it a try…

Just before I do this, and muck everything up…

I want to use rclone move

If I use rclone move cloud1 & rclone move cloud2, and one fails, will the file remain local until both are complete?

The way the ‘rclone move’ command works is that it completes the copy of the file to the cloud storage provider and then deletes the local copy.

Running two “move” commands at the same time would result in one of the remotes having the file and the other one failing, because the local file was deleted before the second process completed.

I would suggest using the “copy” command rather than move if at all possible.

I have just changed it today to an rclone copy with --min-age 5m to cloud 1 & an rclone move with --min-age 30m to cloud 2.

Just testing to see how it works.
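Spelled out, that would be something like the following (a sketch; the local path /local/media is a placeholder for wherever your files actually live):

rclone copy --min-age 5m /local/media cloud1: &
rclone move --min-age 30m /local/media cloud2: &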

In my opinion this is not a good idea, because if the move command finishes before the copy command, your file gets deleted before the copy command can finish the upload.

You should implement some kind of callback so that you know when both uploads have finished, and only delete the files afterwards. Otherwise you could just run it synchronously.

How can I do that? Any help is appreciated
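One way to approximate that in plain bash (a sketch, assuming the files live in /local/media and the remotes are named cloud1: and cloud2:, as above): start both uploads in the background, wait on each one’s exit status, and only delete the local copies when both succeeded.

#!/bin/bash
# Start both uploads in the background and remember each PID.
rclone copy /local/media cloud1: &
pid1=$!
rclone copy /local/media cloud2: &
pid2=$!

# wait <pid> returns that background job's exit status.
wait "$pid1"; status1=$?
wait "$pid2"; status2=$?

# Only remove the local files once BOTH uploads succeeded.
if [ "$status1" -eq 0 ] && [ "$status2" -eq 0 ]; then
    rclone delete /local/media   # deletes the files, leaves the directories
fi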

Not exactly what you’re asking for, but I just do it like this:

# Upload
rclone copy -c /PATH/TO/LOCAL REMOTE1:
rclone copy -c /PATH/TO/LOCAL REMOTE2:
rclone move -c /PATH/TO/LOCAL REMOTE3:

It runs sequentially, so the final move only starts after both copies have finished, but it gets the job done.

Upload like normal using move, and then run a sync cron job on your VPS to sync gdrive1 to gdrive2?
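If you go that route, the cron entry might look something like this (a sketch; the every-6-hours schedule and the log path are just examples to adapt):

# sync gdrive1 to gdrive2 every 6 hours
0 */6 * * * rclone sync gdrive1: gdrive2: --log-file /home/user/rclone-sync.log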

Yeah, as it says, this is what I do, but it’s starting to get really slow as the library grows…

What is going slow?

What I was doing was copying the files to a second folder on the seedbox

Literally copying and not moving? If so, use hardlinks instead.
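Hardlinks point at the same data on disk, so creating them is instant and uses no extra space (both paths must be on the same filesystem). With GNU cp you can hardlink a whole tree in one go; the paths here are placeholders:

# -a preserves the tree and attributes, -l makes hardlinks instead of copying data
cp -al /seedbox/downloads /seedbox/upload-staging

The “second copy” is effectively free, so the seedbox never has to duplicate the actual file contents.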

Use a Google Compute instance to do the gdrive > gdrive sync. Ingress data is free, so that costs nothing. You get $300 of free credit when signing up, and you can make lots of accounts. Get a 2-core, ~3 GB RAM instance to do regular syncs.


Here’s how I’m doing it in parallel:

Install the parallel utility

sudo apt-get install parallel -y

Add your rclone commands to a text file, e.g.:

multicopy.txt

rclone copy /my/local/files Google1:/videos
rclone copy /my/local/files Google2:/videos
rclone copy /my/local/files GoogleN:/videos

Run all the lines in parallel. Output is grouped per command and printed in order, so it will look as though the commands ran sequentially (unlike using & at the end of each line, where all output is interleaved on the screen as it happens).

parallel < multicopy.txt

Things to remember:

You may need to tweak your rclone parameters so that you’re optimizing your connection. Remember that if you usually get optimal copying with 24 simultaneous transfers on a single rclone command (--transfers 24), you’ll want to give each line a fraction of that (i.e., for three simultaneous copies, use --transfers 8). The same goes for bandwidth-limiting parameters and such; see the tuned example after this list.

Obviously, make sure each line works on its own. If you haven’t gotten the copy to work yet, test each line separately before trying to run them in parallel.

Parallel will exit when all copies are complete. If you’re doing this as part of another script, the next line in your script will run only after parallel finishes the multi-copy. Thus, something like:

#!/bin/bash
echo Starting parallel copy
parallel < multicopy.txt
echo Done copying!

…won’t show “Done copying” until all lines in multicopy.txt have completed.
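For example, the multicopy.txt above tuned for three simultaneous copies might become the following (the --transfers 8 and --bwlimit 10M values are assumptions to adjust for your own connection):

rclone copy --transfers 8 --bwlimit 10M /my/local/files Google1:/videos
rclone copy --transfers 8 --bwlimit 10M /my/local/files Google2:/videos
rclone copy --transfers 8 --bwlimit 10M /my/local/files Google3:/videos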

omg that’s an awesome writeup!

Thank you so much!

I’ll give it a try.