There was nothing to transfer

What is the problem you are having with rclone?

I have a list of URLs and would like to copy them to my drive. They all come from the same website: gdindex-demo.maple3142.workers.dev. I'm following this guide (rclone forum site): copyurl-command-for-store-links-simultaneously/8487/2, but I am getting the error THERE WAS NOTHING TO TRANSFER. It works just fine for other sites. What have I done wrong? Or is there another way I can run copyurl in batch?

What is your rclone version (output from rclone version)

os/arch: windows/amd64
go version: go1.14.3

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Windows 10, 64 bit

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy --files-from list.txt --http-url https://gdindex-demo.maple3142.workers.dev :http: gdrive:/test --verbose
Enter configuration password:
password:
2020/06/02 00:44:39 INFO  : There was nothing to transfer
2020/06/02 00:44:39 INFO  :
Transferred:             0 / 0 Bytes, -, 0 Bytes/s, ETA -
Elapsed time:         0.0s

This is my txt file:

epub/accessible_epub_3.epub
epub/konosuba01.epub
Sintel/Sintel.de.srt
sample.pdf

Doesn't seem like the site lists anything:

felix@gemini:~$ rclone ls --http-url https://gdindex-demo.maple3142.workers.dev :http: -vvv
2020/06/01 12:56:14 DEBUG : rclone: Version "v1.52.0" starting with parameters ["rclone" "ls" "--http-url" "https://gdindex-demo.maple3142.workers.dev" ":http:" "-vvv"]
2020/06/01 12:56:14 DEBUG : Using config file from "/opt/rclone/rclone.conf"
2020/06/01 12:56:14 DEBUG : 3 go routines active

You can test with something like:

 rclone lsd --http-url https://beta.rclone.org :http:

It looks like it's down to the way the site is written. You can see what it really returns with curl.

curl https://gdindex-demo.maple3142.workers.dev
<!DOCTYPE html><html lang=en><head><meta charset=utf-8><meta http-equiv=X-UA-Compatible content="IE=edge"><meta name=viewport content="width=device-width,initial-scale=1"><title>GDIndex</title><link href="/~_~_gdindex/resources/css/app.css" rel=stylesheet></head><body><script>window.props = { title: 'GDIndex', default_root_id: 'root', api: location.protocol + '//' + location.host, upload: false }</script><div id=app></div><script src="/~_~_gdindex/resources/js/app.js"></script></body></html>
[user@user ~]$ curl https://gdindex-demo.maple3142.workers.dev/sample.pdf

pulls back something that looks very much like a PDF file

Yes, he could pass the individual files to rclone in a loop. But the overall page doesn't expose the links to curl; you'd need JavaScript.


Yes, I picked the special case...

I'm sorry, but what do you mean by "you'd need JavaScript"?

The page you're trying to gather links from requires JavaScript to render the links. So at present rclone won't find those links, as evidenced by what curl outputs.

Oh I see. So is there any other way to copy all of them into my drive?

Not with rclone. You'd have to extract the URLs from the page and use those directly.

You'd have to feed in each link individually to copy it, I'd assume, as rclone is not able to parse the site.
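
For instance (a hypothetical single-file command, assuming the same site and the gdrive:/test destination used above):

 rclone copyurl https://gdindex-demo.maple3142.workers.dev/sample.pdf gdrive:/test/sample.pdf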

To parse this sort of JavaScript-heavy site you'd probably need to automate it with headless Chrome or something like that...

So I've created a batch file to execute all the copyurl commands instead. Thank you everyone for your support.
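
For reference, the batch file looks roughly like this (a minimal sketch, not the exact script; it assumes list.txt holds the relative paths shown earlier and gdrive:/test as the destination):

 @echo off
 rem Read each relative path from list.txt and run one copyurl per file.
 rem Assumes entries like "epub/konosuba01.epub" relative to the site root.
 for /f "usebackq delims=" %%f in ("list.txt") do (
     rclone copyurl "https://gdindex-demo.maple3142.workers.dev/%%f" "gdrive:/test/%%f" --verbose
 )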

