HTTP not listing from "directory lister"

It doesn’t yet!

I’d need to add another parameter to the copyurl rc API too.
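For reference, a copyurl call via the rc API currently looks something like this (a sketch, with placeholder remote name, path and URL):

rclone rc operations/copyurl fs=gdrive_remote: remote=path/to/file.zip url=https://example.com/file.zip

The extra parameter would sit alongside fs, remote and url.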

It sounds like rclone is already maxing out the website - you said you were downloading 10-20 files simultaneously at 10-15 KB/s each.

The problem seems to be that most of the files are erroring out.

That probably means the website is rate limiting you as a security measure.

Does it help if you lower the number of simultaneous transfers?
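For example (a sketch with src: and dest: as placeholders for your actual remotes; the default is --transfers 4):

rclone copy src: dest: --transfers 2 -vv

If the errors go away at 1 or 2 transfers, that points at the website throttling concurrent connections.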

Maybe rclone’s downloader just isn’t sophisticated enough for this website and you should use a different download tool, but save the data into an rclone mount?
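On Windows that might look something like this (a sketch; rclone mount needs WinFsp installed, and X: and gdrive_remote: are placeholders):

rclone mount gdrive_remote: X: --vfs-cache-mode writes

Your other download tool can then save into X: and rclone will upload the files as they are written.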

It was an issue with the website.
Thanks for your support.

On another note,

Parallelization works perfectly, but I have noticed a new issue:
When I run rclone copy http_remote: gdrive_remote: --files-from files1.txt --checkers 100 -vv with files1.txt containing around 5k links, the transfer starts really late because the log is flooded with low level retries:

2019/01/10 20:03:23 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=7595#####250, userRateLimitExceeded)
2019/01/10 20:03:23 DEBUG : pacer: Rate limited, sleeping for 16.885667094s (5 consecutive low level retries)

I am using my own project and my quota limits are:

  • Courtesy daily quota: 1,000,000,000 queries per day (QPD)
  • Courtesy quota: 10,000 queries per 100s
  • Per User Limit: 1,000 queries per 100s

Stats for the command with ~5k links:

Transferred: 0 / 0 Bytes, -, 0 Bytes/s, ETA -
Errors: 0
Checks: 4552 / 4552, 100%
Transferred: 0 / 0, -
Elapsed time: 18m35.4s

Stats for the command with ~15k links:

Transferred: 0 / 0 Bytes, -, 0 Bytes/s, ETA -
Errors: 0
Checks: 14942 / 14942, 100%
Transferred: 2 / 2, 100%
Elapsed time: 3hr17m :scream:

Needless to say, a lot of pacing and sleeping was done.

This does not happen when I don’t use the --files-from flag.

Using:
rclone v1.45-058-g9d16822c-beta

  • os/arch: windows/amd64
  • go version: go1.11

That is a fact of life using Google Drive, alas. You can set all the rate limits you like in your account, but you’ll still get those 403 errors.

I suggest you add a --tpslimit 10 to your command to slow things down a bit.
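That matches your per-user limit: 1,000 queries per 100s works out to 10 queries per second. Applied to your command it would look like this:

rclone copy http_remote: gdrive_remote: --files-from files1.txt --checkers 100 --tpslimit 10 -vv

Lowering --checkers may help too, since each checker issues its own API queries.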