HTTP not listing from "directory lister"


I am trying to list files from a website that uses “Directory Lister” to list files,
but sadly rclone isn’t able to list them.

Would be really glad if anyone could help.

Using rclone version:
rclone v1.45

  • os/arch: windows/amd64
  • go version: go1.11


Looking at the directory lister example

This then links to files like

This won’t work with the current scheme as rclone is expecting the links to look like this

If you extract the names from the directory listings, then rclone can fetch the files for you or mount them or whatever…

eg put this into files.txt


The names can have directories in, but should start from the root
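For example (reconstructing plausible contents from the transfer log below; the exact names and the flat layout are assumptions), files.txt might look like:

```
hello-world.c
hello-world.css
hello-world.html
```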


$ rclone --http-url --files-from files.txt lsf :http:

$ rclone -v --http-url --files-from files.txt copy :http: code-copy
2018/12/02 14:06:27 INFO  : Local file system at /tmp/code-copy: Waiting for checks to finish
2018/12/02 14:06:27 INFO  : Local file system at /tmp/code-copy: Waiting for transfers to finish
2018/12/02 14:06:27 INFO  : hello-world.css: Copied (new)
2018/12/02 14:06:27 INFO  : hello-world.c: Copied (new)
2018/12/02 14:06:27 INFO  : Copied (new)
2018/12/02 14:06:27 INFO  : hello-world.html: Copied (new)
2018/12/02 14:06:27 INFO  : 
Transferred:   	       659 / 659 Bytes, 100%, 447 Bytes/s, ETA 0s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            4 / 4, 100%
Elapsed time:        1.4s

Hope that helps!


Thank you so much for your reply.

I had another question.

Can we skip the files that don’t exist on the http remote but are present in the --files-from files.txt?
I’ve extracted all of the files into a text file, but some of the files on the remote no longer exist,
so rclone doesn’t do anything and puts up this error:
Failed to lsf: Stat failed: failed to stat: HTTP Error 404: 404 Not Found

Can I suppress this error and still continue with the files that are present both on the remote and in files.txt?


I think rclone should be doing that already…

In fact it looks like a bug…

Try this (uploaded in 15-30 mins)


You are right,

This does work.

Thanks a lot.


Thanks for testing. I’ll merge that to the latest beta now - it will be there in 15-30 mins.



I’ve done just that: a scrapy spider crawls the website to list all the links and stores them in a file for the remote I am trying to fetch with rclone.
Both the crawler and rclone run on a schedule, once every 6 hours,
but rclone is taking very long to start copying (about 2-4 hours), depending on the number of entries in the file passed to --files-from.

Without --files-from, rclone performs much better.
Is it supposed to take this long to start?
(The file has around 4k-5k entries.)
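For what it’s worth, the extraction step can be sketched in plain shell too, as a minimal stand-in for the scrapy spider. The listing.html sample and the href pattern below are assumptions about the Directory Lister markup, not the actual site:

```shell
# A hypothetical saved copy of the index page.
cat > listing.html <<'EOF'
<a href="/hello-world.c">hello-world.c</a>
<a href="/hello-world.css">hello-world.css</a>
EOF

# Pull out the href targets and strip the leading slash so the
# names are root-relative, one per line, ready for --files-from.
grep -o 'href="[^"]*"' listing.html \
  | sed -e 's/^href="//' -e 's/"$//' -e 's|^/||' > files.txt

cat files.txt
```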

Rclone version I’ve tried on:
rclone: Version “v1.45-031-ge7684b7e-beta”


Hmm, yes, rclone checks that each file in --files-from exists. However, it does this in a very inefficient way, one at a time! Rclone should really be parallelising this with --checkers threads at once, like it does everything else.
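The idea can be sketched generically with a shell fan-out, where xargs stands in for rclone’s internal checker pool (the file names and the echo “check” are purely illustrative):

```shell
# Run one "check" per input name, up to 8 at a time (-P 8),
# the way --checkers would parallelise rclone's per-file stats.
printf '%s\n' a.txt b.txt c.txt > names.txt
xargs -P 8 -n 1 echo checked < names.txt > checked.log
sort checked.log
```

With 4k-5k entries, doing these stats 8 or 16 at a time instead of sequentially is where the startup delay should shrink.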

This would be relatively easy to implement.

The code is here:

Can you please make a new issue on github about that and we can have a go at fixing it. Maybe you’d like to help?


I’ve created a new issue,

I don’t think I would be much help right now as I don’t know Go at all;
currently I am learning the basics of it.
I will try to help once I understand what’s going on in the project.


I’ve posted a beta in the issue for you to try :smile:

This issue isn’t a good one for people new to Go as anything involving concurrency is always difficult!