What is the problem you are having with rclone?
I'm getting an error when I try to access B2 through Cloudflare. However, when I try to view the same URL / file with curl, it works perfectly:
What is your rclone version (output from rclone version)?
Which OS you are using and how many bits (eg Windows 7, 64 bit)
Which cloud storage system are you using? (eg Google Drive)
The command you were trying to run (eg rclone copy /tmp remote:tmp)
rclone cat --http-url https://<Cloudflare_DNS_CNAME_URL>/file/<bucket>/test.txt :http:
The rclone config contents with secrets removed.
I'm calling the URL directly in the command, so it is not using the config file.
A log from the command with the -vv flag
ERROR : : error listing: error listing "": directory not found
Failed to cat with 2 errors: last error was: error listing "": directory not found
Hello @TowerBR - nice to see you again
The http backend requires directory listings to work, which don't work with a B2 bucket exported via Cloudflare.
If you want to use rclone to download a single file whose name you know, then use
rclone copyurl https://<Cloudflare_DNS_CNAME_URL>/file/<bucket>/test.txt
If you want more "syncing" abilities then investigate the --b2-download-url parameter
Hope that helps!
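Putting those two suggestions together, a minimal sketch might look like the following. The destination path, the bucket name, the remote name b2:, and the CDN domain cdn.example.com are placeholders, not values from the thread; note that the rclone docs say --b2-download-url should be the bare domain, without a trailing slash or the /file/<bucket> subpath.

```shell
# Download one known file through the Cloudflare CNAME,
# writing it to ./test.txt locally.
rclone copyurl https://cdn.example.com/file/<bucket>/test.txt ./test.txt

# Sync a whole bucket: the B2 API is still used for listings,
# but the file bodies are fetched via the Cloudflare URL.
rclone sync b2:<bucket> /local/path --b2-download-url https://cdn.example.com
```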
Hello @ncw, Rclone is my daily tool, my real "Swiss army knife of cloud", and sometimes I try to do something "different" and I encounter problems.
I actually want to copy / move hundreds of GB out of B2, and I would like to avoid download charges.
I've already done some tests with the --b2-download-url parameter, but I wasn't sure whether the download was really coming from the custom endpoint.
For example, when testing the command
rclone lsd <b2_remote_in_config_file>: --b2-download-url=https://<Cloudflare_DNS_CNAME_URL>/file/<bucket>/
it returned all the buckets (matching the first parameter, where I specified the whole remote rather than a single bucket), and not just the bucket in the download URL.
How can I check if the downloads really come from the custom endpoint?
When you use --b2-download-url, rclone uses the normal B2 endpoints for all the API calls but uses the Cloudflare URL only for downloading files.
So what you found above is what I'd expect.
If you want to double check where the downloads come from, use -vv --dump headers, which will show the HTTP headers in use - you can check the downloads from there.
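A concrete way to do that check might look like this. The remote name b2:, the bucket, and the CDN domain are placeholder assumptions; the idea is that the Host header on the file GET should be the Cloudflare domain rather than a *.backblazeb2.com host.

```shell
# Download one file with header dumping enabled, then look at the
# Host headers: the file body request should go to the CDN domain.
rclone copy b2:<bucket>/test.txt /tmp/ \
  --b2-download-url https://cdn.example.com \
  -vv --dump headers 2>&1 | grep -i 'Host:'
```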
So you want to set up your Cloudflare domain cdn.example.com with a CNAME pointing to f000.backblazeb2.com (or whatever the right name is for you).
Then you pass --b2-download-url https://cdn.example.com to rclone.
Interestingly, I found a very small bug which meant you couldn't download HTML files via Cloudflare, as it strips the Content-Length from the header, which rclone was relying on.
I just merged a fix for that to the latest beta.
Rclone is the best!
Transferred: 36.402G / 48.048 GBytes, 76%, 264.796 MBits/s
A Google Cloud VM, transferring from B2 to Wasabi.
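For reference, a transfer like the one above might be run as follows. The remote names b2: and wasabi:, the bucket names, and the CDN domain are illustrative assumptions, not taken from the thread.

```shell
# Copy from B2 to Wasabi via an intermediate VM, pulling the file
# bodies through the Cloudflare CDN to avoid B2 download charges.
rclone copy b2:<bucket> wasabi:<bucket> \
  --b2-download-url https://cdn.example.com \
  --progress
```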