Where are the temporary files when using "rclone copyurl" or "rclone copy --http-url"?

I tried to download a 1 TB file from a URL and upload it directly to Ceph storage without saving it anywhere else.

The link I tried to download was on "amazonaws.com", and there was no specific file name at the end of the URL, so I couldn't use this command:

rclone copy -v --http-url https:// :http:  ceph:bucket/

Then I decided to use this command instead:

rclone copyurl --auto-filename https:// ceph:backups/

My rclone version is:

rclone v1.53.3-DEV
- os/arch: linux/amd64
- go version: go1.15.9

I have some questions here:

1- Did I understand correctly that I cannot use "rclone copyurl" when there is no specific file name at the end of the URL?

2- The file is very big, so it has to be downloaded somewhere temporarily to complete the process. Where is the temporary file (on the server where I ran the command, or in the destination bucket)?

3- Downloading the file with "rclone copy --http-url" is very slow compared with "wget" or "axel". Are there any options for speeding up the download or increasing the number of download connections?

Thank you so much for this great software.

hello and welcome to the forum,

https://rclone.org/commands/rclone_config_paths

--- best to update to latest stable v1.57.0, not use some custom compiled old version.
https://rclone.org/downloads/#script-download-and-install
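as a sketch, the script install from that page is usually run like this (assumes internet access and sudo; check the downloads page for the current command):

```shell
# fetch and run the official rclone install script (needs sudo)
curl https://rclone.org/install.sh | sudo bash

# confirm which version is now on the PATH
rclone version
```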

--- best to post a debug log, add -vv to the command and post the output.

i think there are some forum topics about that.
i could be wrong, but copyurl is going to be slow.

Thank you "asdffdsa" for your response.

rclone config paths
--- best to update to latest stable v1.57.0, not use some custom compiled old version.

actually, I installed rclone from the official Debian repository, which is here:
http://ftp.debian.org/debian/pool/main/r/rclone/

I think the version is the reason that I have no "paths" option under "rclone config". Is that correct?

i think there are some forum topics about that.
i could be wrong, but copyurl is going to be slow.

Thanks, but I used "rclone copy --http-url" to download from the URL and upload to Ceph directly, and it was very slow.

--- best to update to latest stable v1.57.0, not use some custom compiled old version.
the only way to get the official up to date rclone is at
https://rclone.org/downloads/#script-download-and-install

--- best to post a debug log, add -vv to the command and post the output.

might want to figure out which part is slow and focus on that.
might want to test on a smaller file
--- rclone copy to local --- local to ceph:bucket
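as a sketch, the two-step test could look like this (example.com, the local path, and the bucket name are placeholders, not values from this thread):

```shell
# step 1: time the download from the url to local disk only
rclone copy -v --http-url https://example.com/path/ :http: /tmp/rclone-test/

# step 2: time the upload of the local copy to ceph
rclone copy -v /tmp/rclone-test/ ceph:bucket/
```

whichever step dominates is the one worth tuning.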

might want to figure out which part is slow and focus on that.
might want to test on a smaller file
--- rclone copy to local --- local to ceph:bucket

Sure, I'll check it.

Are there any options to list incomplete files on "Ceph:"?

when you first posted, there was a template of questions.
as you are new to the forum, we overlooked that.

so update, test again and if the problem remains, post the debug log.

Thanks for your explanation.

But my question was not a new question; it was directly related to my second question, "Where is the temporary or incomplete file?"

BTW, I need to know: can "rclone ls" show files which are not downloaded completely (temporary or incomplete files)?

You can give a file name on the command line, or use --auto-filename to use the one in the url.
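For example (the URL and destination paths here are illustrative):

```shell
# give the destination file name explicitly on the command line
rclone copyurl https://example.com/download ceph:backups/backup.tar.gz

# or let rclone derive the file name from the url itself
rclone copyurl --auto-filename https://example.com/download ceph:backups/
```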

rclone copyurl doesn't make a temporary copy; it streams the data from the source to the destination.

Because rclone copyurl streams the data, it only uses one connection.

Assuming you are using the s3 protocol with Ceph you can use

rclone backend list-multipart-uploads Ceph:

and

rclone backend cleanup Ceph:

See help here: Amazon S3
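As a sketch, checking for and cleaning up unfinished multipart uploads on an S3-compatible remote looks like this (the remote and bucket names are illustrative; the -o max-age option is described in the S3 backend docs):

```shell
# list unfinished multipart uploads - incomplete transfers live here,
# not in the normal object listing shown by "rclone ls"
rclone backend list-multipart-uploads ceph:bucket

# remove unfinished multipart uploads older than 24h (the default)
rclone backend cleanup ceph:bucket

# or use a different age threshold
rclone backend cleanup -o max-age=1h ceph:bucket
```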
