Can't use HTTP remote with port other than 80

What is the problem you are having with rclone?

http remote not working with port other than 80

Run the command 'rclone version' and share the full output of the command.

rclone v1.62.2

  • os/version: slackware 14.2+ (64 bit)
  • os/kernel: 5.10.28-Unraid (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.20.2
  • go/linking: static
  • go/tags: none

Which cloud storage system are you using? (eg Google Drive)

http

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone lsd --http-url http://user:pass@websiteurl.com:85 :http:

Setting up an http remote also doesn't work.

The rclone config contents with secrets removed.

[glasto-niki]
type = http-test
url = http://user:pass@websiteurl.com:85/

A log from the command with the -vv flag

rclone lsd --http-url http://user:pass@websiteurl.com:85 :http: -vv
2023/05/21 18:55:09 DEBUG : rclone: Version "v1.62.2" starting with parameters ["rclone" "lsd" "--http-url" "http://user:pass@websiteurl.com:85" ":http:" "-vv"]
2023/05/21 18:55:09 DEBUG : Creating backend with remote ":http:"
2023/05/21 18:55:09 DEBUG : Using config file from "/mnt/user/test/rclone.conf"
2023/05/21 18:55:09 DEBUG : :http: detected overridden config - adding "{owxIE}" suffix to name
2023/05/21 18:55:09 DEBUG : Root: http://user:pass@websiteurl.com:85/
2023/05/21 18:55:09 DEBUG : fs cache: renaming cache item ":http:" to be canonical ":http{owxIE}:"
2023/05/21 18:55:09 ERROR : : error listing: error listing "": failed to readDir: HTTP Error: 401 Authorization Required
2023/05/21 18:55:09 DEBUG : 2 go routines active
2023/05/21 18:55:09 Failed to lsd with 2 errors: last error was: error listing "": failed to readDir: HTTP Error: 401 Authorization Required

Have you tried to curl that URL from your server? The log shows the http server itself returning 401 Authorization Required; just trying to rule out whether rclone is the actual issue here.
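For example, something like this (using the placeholder credentials and hostname from your post) should show whether the server accepts basic auth outside a browser:

curl -v -u user:pass http://websiteurl.com:85/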


perhaps type = http-test should be type = http
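For reference, that would make the config from your post:

[glasto-niki]
type = http
url = http://user:pass@websiteurl.com:85/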

Okay, so I just tried this. Weirdly, I do get the 401 from curl too. However, in Chrome and Firefox it works fine, and it's normal HTTP authentication.

The issue might be that the website uses Tiny File Manager 2.5.2. Does anyone have experience with how it serves files, as opposed to plain Apache?

based on my own past experience and forum posts:
rclone works with static .html directory listings.
rclone has issues with accessing dynamic content such as PHP.

for example, https://tinyfilemanager.github.io/demo/ works in firefox but not rclone

rclone ls :http,url='https://tinyfilemanager.github.io/': -vv
DEBUG : rclone: Version "v1.62.2" starting with parameters ["rclone" "ls" ":http,url='https://tinyfilemanager.github.io/':" "-vv"]
DEBUG : Creating backend with remote ":http,url='https://tinyfilemanager.github.io/':"
DEBUG : Using config file from "c:\\data\\rclone\\rclone.conf"
DEBUG : :http: detected overridden config - adding "{uZzg9}" suffix to name
DEBUG : Root: https://tinyfilemanager.github.io/
DEBUG : fs cache: renaming cache item ":http,url='https://tinyfilemanager.github.io/':" to be canonical ":http{uZzg9}:"

and if you want to see more details
rclone ls :http,url='https://tinyfilemanager.github.io/': --dump=headers,bodies,responses


Thanks. So I guess this isn't currently supported by rclone, unfortunately?

not an expert with http remote; perhaps someone more experienced can opine?
there are a number of forum topics like yours about dynamic content.
i never saw a solution.

not sure of your use case, but one possible workaround:
use rclone as an http server, as in the sketch below,
or
use a script that pre-generates a set of static .html files.
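a minimal sketch of the first option; the path, port, and credentials are all placeholders:

rclone serve http /path/to/data --addr :8080 --user myuser --pass mypass

that serves the directory as plain static listings, which the http remote can read.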


I'm trying to copy data from an HTTP server not controlled by me, so unfortunately there doesn't seem to be any solution.

For the record though, it doesn't seem that the port is the issue.

i imagine that is what the host wants, to prevent mass downloads.

you can download individual files using direct link
rclone copyurl https://tinyfilemanager.alwaysdata.net/Audio/horse.mp3 ./ --auto-filename


Is there a way to do that with a list of URLs?

Unfortunately the dl links look like:

http://user:pass@url.com:85/index.php/?p=dir&dl=vid.mp4

And it just downloads an index.php instead of the file. Same with wget and curl.

Thanks anyway

you can download files using direct links, not the .php

rclone copyurl "https://tinyfilemanager.alwaysdata.net/Audio/horse.mp3" ./dest/ --auto-filename  -vv
DEBUG : rclone: Version "v1.62.2" starting with parameters ["rclone" "copyurl" "https://tinyfilemanager.alwaysdata.net/Audio/horse.mp3" "./dest/" "--auto-filename" "-vv"]
DEBUG : Creating backend with remote "./dest/"
DEBUG : Using config file from "c:\\data\\rclone\\rclone.conf"
DEBUG : fs cache: renaming cache item "./dest/" to be canonical "//?/c:/data/rclone/dest"
DEBUG : horse.mp3: File name found in url
INFO  :
Transferred:       28.237 KiB / 28.237 KiB, 100%, 0 B/s, ETA -
Transferred:            1 / 1, 100%
Elapsed time:         0.6s

rclone ls ./dest
    28915 horse.mp3

Ah, you mean generating your own direct links? Let me try.

How can I pass a text file with a list of URLs to copyurl?

yes, this has been discussed in the forum; i have posted an example or two, so search the forum for it.
in the meantime, something like the sketch below should work.
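a minimal sketch, assuming urls.txt holds one direct link per line:

#!/bin/sh
# read each direct link from urls.txt and fetch it with copyurl
while read -r url; do
  rclone copyurl "$url" ./dest/ --auto-filename
done < urls.txt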
