Copyurl command to store multiple links at once

Hello everyone, newbie question here!

I have a long list of URLs that I would like to store via rclone on Windows to my Google Drive Business account. The URLs point directly to files (wav, zip, rar, flac, etc.). I've used the copyurl command to copy a single link to my gdrive and it worked like a charm:

rclone copyurl https://example.com gdrive:path [flags]

but I can't figure out which command I can use to copy the links in bulk instead of one by one.

THANKS!!

Do they all come from the same website?

If so, you can set up an http backend on the fly and use --files-from, like this:

rclone copy --files-from list.txt --http-url https://example.com :http: /destination/path

Where list.txt contains the paths of the files from the root of the site, so

file1.txt
dir/file2.txt

rclone combines each path with the --http-url, so it will fetch https://example.com/file1.txt, https://example.com/dir/file2.txt, and so on.

Thanks for your time.

Yes, they come from the same site, but setting up an http backend is too complicated for me :slightly_frowning_face:

Is there an alternative way to put that list.txt on the remote?
In your command line, is https://example.com the remote link of list.txt?

Great

Hopefully not.

Assuming you have a file with a list of links

https://example.com/file1.txt
https://example.com/file2.txt

Save a copy of this to list.txt, then do a search and replace of https://example.com/ with an empty string. This will produce a file like this:

file1.txt
file2.txt
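
If the list is long, you can script that search and replace; for example with sed, assuming the full links are saved in links.txt (just a sketch, adjust the file names to yours):

sed 's|^https://example.com/||' links.txt > list.txt

or, since you are on Windows, the same replace in PowerShell:

(Get-Content links.txt) -replace '^https://example\.com/', '' | Set-Content list.txt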

You can then feed this into the rclone command

rclone copy --files-from list.txt --http-url https://example.com :http: /destination/path

That's it! No extra config required.

Yes that is the base URL of the website.

Okay, my rclone command is as follows:

rclone copy --files-from list.txt --http-url https://40.rdeb.io/d/ :http: /gdrive:/ --verbose

And got this error:

2019/01/27 22:44:14 NOTICE: Local file system at : Replacing invalid characters in “\gdrive:” to “\gdrive_”
2019/01/27 22:44:15 ERROR : : error reading source directory: Stat failed: failed to stat: HTTP Error 404: 404 Not Found
2019/01/27 22:44:15 INFO : Local file system at \gdrive_: Waiting for checks to finish
2019/01/27 22:44:15 INFO : Local file system at \gdrive_: Waiting for transfers to finish
2019/01/27 22:44:15 ERROR : Attempt 1/3 failed with 1 errors
2019/01/27 22:44:15 ERROR : : error reading source directory: Stat failed: failed to stat: HTTP Error 404: 404 Not Found
2019/01/27 22:44:15 INFO : Local file system at \gdrive_: Waiting for checks to finish
2019/01/27 22:44:15 INFO : Local file system at \gdrive_: Waiting for transfers to finish
2019/01/27 22:44:15 ERROR : Attempt 2/3 failed with 1 errors
2019/01/27 22:44:15 ERROR : : error reading source directory: Stat failed: failed to stat: HTTP Error 404: 404 Not Found
2019/01/27 22:44:15 INFO : Local file system at \gdrive_: Waiting for checks to finish
2019/01/27 22:44:15 INFO : Local file system at \gdrive_: Waiting for transfers to finish
2019/01/27 22:44:15 ERROR : Attempt 3/3 failed with 1 errors
2019/01/27 22:44:15 INFO :
Transferred: 0 / 0 Bytes, -, 0 Bytes/s, ETA -
Errors: 1 (retrying may help)
Checks: 0 / 0, -
Transferred: 0 / 0, -
Elapsed time: 1.5s

I've created a list.txt file with a list of 10 links without the base URL, like this:

XYJMZGFRRBOPC/example.rar
5TKWL7TDWHWOO/example.rar
K5PF4D7WDRXQC/example.flac
K5PF4D7QDRXQC/example.wav
G5PF1D7WDRXJK/example.zip

etc…

and put it in the same local folder as rclone.exe.
What am I doing wrong?

That should be

rclone copy --files-from list.txt --http-url https://40.rdeb.io/d/ :http: gdrive:/ --verbose

there was a stray / in front of gdrive:, which made rclone treat /gdrive:/ as a local path (that is why the log says "Local file system at \gdrive_") instead of your gdrive remote.


It worked! I transferred four links for a total of about 5.5 GB in about 29 minutes. Maybe not great performance…

Transferred: 5.511G / 5.511 GBytes, 100%, 3.234 MBytes/s, ETA 0s
Errors: 0
Checks: 0 / 0, -
Transferred: 4 / 4, 100%
Elapsed time: 29m4.9s

C:\rclone>

Is it possible to increase the transfer speed? What does it depend on?
Actually I'm testing on my personal Google Drive, not Business.

Thanks for all.

You can try increasing --transfers - that might help.

Increasing --buffer-size and --drive-chunk-size might also help.
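
For example, something along these lines (the values here are only illustrative starting points, not tested recommendations):

rclone copy --files-from list.txt --http-url https://40.rdeb.io/d/ :http: gdrive:/ --verbose --transfers 8 --buffer-size 64M --drive-chunk-size 64M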

I tried to copy 8 links with this command

rclone copy --files-from list.txt --http-url https://40.rdeb.io/d/ :http: gdrive:/ --verbose --transfers 50 --buffer-size 1G --drive-chunk-size 128M

but there haven't been any improvements. Furthermore, the PC became unstable.
I have to say that I applied those values arbitrarily just to do some tests, and I have no idea which settings to apply to increase the speed. Any suggestions?

Try it with a smaller number of transfers, say 10, and the --drive-chunk-size flag; those will probably make the biggest difference.

Like this?

rclone copy --files-from list.txt --http-url https://40.rdeb.io/d/ :http: gdrive:/ --verbose --transfers 10 --drive-chunk-size 128M

Unfortunately the speed remains the same, around 3 MB/s. At least the PC does not freeze. Is there perhaps some bottleneck between the cloud source and gdrive? What speed should I expect when transferring a file remotely?

Now I have a problem with folder management.
The direct link to the file that I want to transfer has this structure:

https://40.rdeb.io/d/VGPQO33JMABPK/example.rar

with the base URL https://40.rdeb.io/d/ common to all links, but with a folder (VGPQO33JMABPK in the example above) that is different for each file. So if I have to transfer a series of links, the structure of the direct links generated by Real-Debrid is as follows:

VGPQO33JMABPK/example01.rar
MSP7KRZRGV6CE/example02.rar
YGKMCZEIRL6HM/example03.rar
SOFMOVIZACPI4/example04.rar
GGK5NXMHALO4O/example05.rar
ZAMZEQUJ5NLZ4/example06.rar
S7XBHWK4QRDY6/example07.rar

In the end I get 7 files in 7 different folders in gdrive with names like VGPQO33JMABPK, MSP7KRZRGV6CE, etc. This forces me to manually move these 7 files into a common folder (called EXAMPLE), which is a great waste of time!

So my question is as follows:
Is it possible to instruct rclone so that the 7 files are copied directly into the EXAMPLE folder of gdrive instead of being nested in 7 subfolders inside the EXAMPLE folder?

I mean, I’d like to have a structure like this:

gdrive:/
        EXAMPLE/
                example01.rar
                example02.rar
                example03.rar
                example04.rar
                example05.rar
                example06.rar
                example07.rar

instead of

gdrive:/
        EXAMPLE/
                VGPQO33JMABPK/
                              example01.rar
                MSP7KRZRGV6CE/
                              example02.rar
                YGKMCZEIRL6HM/
                              example03.rar
                SOFMOVIZACPI4/
                              example04.rar
                GGK5NXMHALO4O/
                              example05.rar
                ZAMZEQUJ5NLZ4/
                              example06.rar
                S7XBHWK4QRDY6/
                              example07.rar

Sorry if I have not been very clear. English is not my mother tongue.

rclone can't do this directly, but it is easy with some shell scripting, e.g.

find /dir1 -mindepth 2 -type f -exec mv -t /dir1 -i '{}' +
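
If the files have already landed on gdrive, a rough equivalent using rclone itself could look like this (just a sketch, assuming a Unix-like shell such as Git Bash, and that gdrive:EXAMPLE is the folder from your example; files with identical names would clash):

rclone lsf -R --files-only gdrive:EXAMPLE | while read -r f; do
  rclone moveto "gdrive:EXAMPLE/$f" "gdrive:EXAMPLE/$(basename "$f")"
done

rclone lsf lists every file under EXAMPLE recursively, and rclone moveto then moves each one to the top of EXAMPLE, dropping the intermediate folder.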

Thank you. I'll give it a try.

Report from two rclone commands running in parallel:

Transferred:        3.031G / 10.053 GBytes, 30%, 3.010 MBytes/s, ETA 39m48s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 12, 0%
Elapsed time:    17m11.1s
Transferring:

Transferred:       76.512M / 3.175 GBytes, 2%, 183.491 kBytes/s, ETA 4h55m18s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:      7m6.9s
Transferring:

Why am I getting these ridiculously low remote transfer speeds? :angry:

Do you get faster transfers if you transfer straight to the local disk?
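
For example, something like this, where C:\temp\test is just a placeholder local folder:

rclone copy --files-from list.txt --http-url https://40.rdeb.io/d/ :http: C:\temp\test --verbose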

Nope… same speed, around 3 MB/s.

Transferred:        1.066G / 1.551 GBytes, 69%, 3.030 MBytes/s, ETA 2m43s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:      6m0.1s
Transferring:

Maybe it is just how fast the external website goes? How fast does curl download stuff?
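
For a quick check, you could time a single download with curl (a sketch, assuming curl is installed; the URL is one of your example links, and on Windows write NUL instead of /dev/null):

curl -o NUL -w "average download speed: %{speed_download} bytes/s" "https://40.rdeb.io/d/VGPQO33JMABPK/example.rar"

If curl also tops out around 3 MB/s, the limit is the source website rather than rclone or gdrive.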