Copy Files from Multiple links to my drive

What is the problem you are having with rclone?

I have many individual links with me which are in a file called files.txt
What I would like to do is copy all the files inside these links to my own drive

What is your rclone version (output from rclone version)

rclone v1.51.0

  • os/arch: windows/amd64
  • go version: go1.13.7

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Windows 10 64 bit

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copyurl -a "files.txt" remote:

The rclone config contents with secrets removed.

Paste config here

A log from the command with the -vv flag

Paste log here

Hi Roshan, I'm not entirely sure I understand what you mean, but the --include-from flag may be useful to you? See here for the documentation on filtering. This assumes that you can control the contents of the files.txt file.

I should have given more clarity.
So I have around 1000 links like this

Currently I have saved all of them in a txt file called lists.txt
Is it possible to copy all of these links to my drive, rather than doing it one by one?

I could be wrong, but filtering compares files/folders for a given path.

I think the OP wants to feed a set of paths to the rclone copyurl command;
for that you would need to write a script.

You could write a little script like this.

If all the files are from the same domain+URL, you could use
rclone copy --files-from=lists.txt --http-url=<base url> :http: remote:

the lists.txt would contain just the filenames, not full paths
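For example, assuming a hypothetical base URL of https://example.com/files/ (substitute your own), lists.txt would look like:

```text
report1.pdf
report2.pdf
archive/data.zip
```

and the command would be:

```text
rclone copy --files-from=lists.txt --http-url=https://example.com/files/ :http: remote:
```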

If all the files are not from the same domain+URL, you could use (in a batch file; at an interactive prompt, use %u instead of %%u)
for /f %%u in (lists.txt) do rclone copyurl %%u remote: -a

the lists.txt would contain the full URL+filename for each file
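For anyone who prefers a cross-platform script over the batch loop, here is a minimal Python sketch of the same idea. It assumes lists.txt holds one full URL per line (blank lines ignored), that the rclone binary is on your PATH, and that "remote:" is a placeholder for your configured remote name.

```python
# Minimal sketch: run "rclone copyurl <url> remote: -a" for each URL
# in lists.txt, mirroring the batch-file loop above.
# Assumptions: lists.txt has one full URL per line; rclone is on PATH;
# "remote:" is a placeholder for your own remote name.
import shlex
import subprocess


def build_commands(urls, remote="remote:"):
    """Build one 'rclone copyurl' command per URL.

    The -a flag tells rclone to derive the destination filename
    from the URL, as in the batch example.
    """
    return [["rclone", "copyurl", u.strip(), remote, "-a"]
            for u in urls if u.strip()]


def run_from_file(path="lists.txt"):
    """Read URLs from the file and run rclone once per URL."""
    with open(path) as f:
        urls = f.readlines()
    for cmd in build_commands(urls):
        print("running:", shlex.join(cmd))
        subprocess.run(cmd, check=True)  # stop on the first failure
```

One rclone process per URL is slower than a single transfer, but copyurl only accepts one URL at a time, so a loop like this (or the batch one-liner above) is the straightforward way to batch it.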

Ah, sorry guys, my bad :see_no_evil:

@asdffdsa, those seem like good solutions. I have learnt something new again today...

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.