TowerBR
(TowerBR)
March 23, 2021, 9:45pm
1
What is the problem you are having with rclone?
I'm getting an error when I try to access B2 through Cloudflare. However, when I try to view the same URL / file with curl, it works perfectly:
curl https://<Cloudflare_DNS_CNAME_URL>/file/<bucket>/test.txt
What is your rclone version (output from rclone version)
rclone v1.54.1
Which OS you are using and how many bits (eg Windows 7, 64 bit)
Debian 4.19.0-14
Which cloud storage system are you using? (eg Google Drive)
Backblaze B2
The command you were trying to run (eg rclone copy /tmp remote:tmp)
rclone cat --http-url https://<Cloudflare_DNS_CNAME_URL>/file/<bucket>/test.txt :http:
The rclone config contents with secrets removed.
I'm calling the URL directly in the command, so it is not using the config file.
A log from the command with the -vv flag
ERROR : : error listing: error listing "": directory not found
Failed to cat with 2 errors: last error was: error listing "": directory not found
ncw
(Nick Craig-Wood)
March 24, 2021, 10:12am
2
Hello @TowerBR - nice to see you again
The http backend requires directory listings to work, which aren't available for a B2 bucket exported via Cloudflare.
If you want to use rclone to download a single file you know the name of, then use
rclone copyurl https://<Cloudflare_DNS_CNAME_URL>/file/<bucket>/test.txt <local_destination>/test.txt
If you want more "syncing" abilities then investigate the --b2-download-url parameter.
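For example, a minimal sketch of what that could look like (the remote name and local path here are placeholders, not something from this thread):
rclone copy <b2_remote_in_config_file>:<bucket> /local/dir --b2-download-url https://<Cloudflare_DNS_CNAME_URL>
That keeps the normal B2 API for listings and metadata but fetches the actual file contents through Cloudflare.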
Hope that helps!
TowerBR
(TowerBR)
March 24, 2021, 11:28am
3
Hello @ncw, rclone is my daily tool, my real "Swiss army knife of the cloud", and sometimes when I try to do something "different" I run into problems.
I actually want to copy / move hundreds of GB out of B2, and I would like to avoid download charges.
I've already done some tests with the --b2-download-url parameter, but I wasn't sure whether the downloads were really coming from the custom endpoint.
For example, when testing the command
rclone lsd <b2_remote_in_config_file>: --b2-download-url=https://<Cloudflare_DNS_CNAME_URL>/file/<bucket>/
it returned all buckets (consistent with the first parameter, where I specified the whole remote rather than a single bucket), and not just the objects in <bucket>.
How can I check if the downloads really come from the custom endpoint?
ncw
(Nick Craig-Wood)
March 24, 2021, 5:11pm
4
TowerBR:
I've already done some tests with the --b2-download-url parameter, but I wasn't sure whether the downloads were really coming from the custom endpoint.
For example, when testing the command
rclone lsd <b2_remote_in_config_file>: --b2-download-url=https://<Cloudflare_DNS_CNAME_URL>/file/<bucket>/
it returned all buckets (consistent with the first parameter, where I specified the whole remote rather than a single bucket), and not just the objects in <bucket>.
When you use --b2-download-url, rclone uses the normal B2 endpoints for all the API stuff but uses the Cloudflare URL only for downloading files.
So what you found above is what I'd expect.
If you want to double-check where the downloads come from, use -vv --dump headers, which will show the HTTP headers in use - you can check the downloads from there.
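For example, a sketch of such a check (remote, bucket, and URL are placeholders):
rclone copy <b2_remote_in_config_file>:<bucket>/test.txt /tmp -vv --dump headers --b2-download-url https://<Cloudflare_DNS_CNAME_URL>
In the dumped request headers, the Host: header of the GET that fetches the file should show your Cloudflare hostname rather than an fNNN.backblazeb2.com download host.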
So you want to set up your Cloudflare domain cdn.example.com which has a CNAME pointing to f000.backblazeb2.com (or whatever the right name is for you).
Then you pass --b2-download-url https://cdn.example.com
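As a sketch, the DNS record would be something like this (example names only - use the download host shown in your own bucket's friendly URL):
cdn.example.com.  300  IN  CNAME  f000.backblazeb2.com.
The record needs to be proxied through Cloudflare for the traffic to actually go via their network.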
Interestingly, I found a very small bug which meant you couldn't download HTML files via Cloudflare, as it strips the Content-Length from the header, which rclone was relying on.
I just merged a fix for that to the latest beta.
TowerBR
(TowerBR)
March 25, 2021, 12:09am
6
Rclone is the best!
Transferred: 36.402G / 48.048 GBytes, 76%, 264.796 MBits/s
^^^^^^^^^^^^^^^
TowerBR
(TowerBR)
March 26, 2021, 12:24pm
8
A Google Cloud VM, transferring from B2 to Wasabi.
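For reference, that sort of transfer would look roughly like this (remote names are placeholders; -P just prints the live stats quoted above):
rclone copy <b2_remote_in_config_file>:<bucket> <wasabi_remote>:<bucket> --b2-download-url https://<Cloudflare_DNS_CNAME_URL> -P
The VM downloads from B2 via Cloudflare and re-uploads to Wasabi, which is what avoids the normal B2 egress charges.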
system
(system)
Closed
March 29, 2021, 12:24pm
9
This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.