TL;DR: Adding a Worker to the Cloudflare proxy, as in the official tutorial, appears to conflict with rclone's authentication. Below is how I came to suspect this.
I was setting up rclone with Backblaze B2, with RCLONE_B2_DOWNLOAD_URL pointed at Cloudflare to proxy download requests. Meanwhile, I followed the official Backblaze article "How to allow Cloudflare to fetch content from a Backblaze B2 private bucket – Backblaze Help" (couldn't post links) to enable public access to a private bucket.
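For reference, this is the rclone setting involved; the hostname below is a placeholder, not my real Cloudflare zone.

```shell
# Point B2 downloads at the Cloudflare-proxied hostname
# (files.example.com is a placeholder domain).
export RCLONE_B2_DOWNLOAD_URL="https://files.example.com"
```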
This worked on a VPS, but on multiple other clients rclone showed errors when copying an object:
2020/12/28 14:18:46 DEBUG : fs cache: adding new entry for parent of "<remote:file>", "<remote:path>"
2020/12/28 14:18:46 DEBUG : <file_name>: Need to transfer - File not found at Destination
2020/12/28 14:18:46 DEBUG : B2 bucket <bucket_name>: Unauthorized: Unknown 401 Unauthorized (401 bad_auth_token)
2020/12/28 14:18:46 DEBUG : pacer: low level retry 1/10 (error Unknown 401 Unauthorized (401 bad_auth_token))
2020/12/28 14:18:46 DEBUG : pacer: Rate limited, increasing sleep to 20ms
I did some searching and found one post concerning Backblaze with Cloudflare: "Correct format for RCLONE_B2_DOWNLOAD_URL variable? - Help and Support - rclone forum". It wasn't the same case, but it occurred to me that the difference might be version-related, so I did a binary search over the releases. It turns out that some commit between v1.52.3 and v1.53.0 breaks it. I looked through the commits in that range and located a seemingly relevant one, 957311. I confirmed that reverting this commit resolves the problem, but applying the same revert on top of the latest commit does not.
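The bisection over releases can be sketched as an ordinary binary search: keep a known-good bound and a known-broken bound, test the midpoint, and narrow toward the first broken version. The version list and predicate here are illustrative stand-ins, not the actual rclone test harness.

```javascript
// Sketch of bisecting releases: find the first version for which
// isBroken(version) is true, assuming versions are ordered and the
// breakage, once introduced, persists in later versions.
function firstBroken(versions, isBroken) {
  let lo = 0;                       // everything before lo is assumed good
  let hi = versions.length - 1;     // hi is known broken
  while (lo < hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (isBroken(versions[mid])) {
      hi = mid;                     // breakage is at mid or earlier
    } else {
      lo = mid + 1;                 // breakage is after mid
    }
  }
  return versions[hi];
}
```

With a stand-in predicate marking v1.53.0 and later as broken, this converges on v1.53.0 in a handful of probes rather than testing every release.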
I added some debug output and confirmed that the issue comes from the two styles of the B2 API: fetching an object by ID versus by name. Looking at the errors again, it occurred to me that this might have something to do with Cloudflare tampering with the authentication headers. The worker script from the tutorial is as follows.
addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

async function handleRequest(request) {
  // Inject the bucket's auth token into every request forwarded to B2
  let authToken = '<b2 auth token>'
  let b2Headers = new Headers(request.headers)
  b2Headers.append("Authorization", authToken)
  const modRequest = new Request(request.url, {
    method: request.method,
    headers: b2Headers
  })
  const response = await fetch(modRequest)
  return response
}
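Note what `append` does here: if the client (rclone) already sent its own Authorization header, `Headers.append` keeps both values, and per the Fetch spec `get` combines them into one comma-joined header, which would not be a valid B2 token. The token strings below are placeholders; this snippet just demonstrates the header semantics and runs in any runtime with the Fetch `Headers` class (e.g. Node 18+).

```javascript
// What happens when the worker appends a token to a request that
// already carries one: both values survive in the forwarded header.
const incoming = new Headers({ Authorization: "rclone-token" }); // as sent by the client
const b2Headers = new Headers(incoming);
b2Headers.append("Authorization", "worker-token"); // as done by the worker

// The upstream request now carries both tokens in a single combined
// Authorization header, which B2 cannot parse as a valid token.
console.log(b2Headers.get("Authorization"));
```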
I removed the worker and everything worked smoothly again. It took me a whole evening. Whew!
I'm posting this to share my solution with anyone who runs into the same problem. I have some questions, too. The script enables public access to a private bucket by setting the 'Authorization' header, and I confirmed that with the worker in place I can fetch the files publicly through Cloudflare. So how does that conflict with the credentials in rclone? The files are requested by file name, not by file ID, so is there some other authentication scheme involved?
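If my suspicion is right and the clash is between rclone's own Authorization header and the one the worker appends, a minimal change would be to inject the bucket token only when the client sent no credentials of its own. This is an untested sketch, not something from the tutorial; the `addEventListener`/`handleRequest` wiring would stay as in the original script, with `handleRequest` calling this helper instead of `append`.

```javascript
// Hypothetical variant of the tutorial's header handling: inject the
// bucket's auth token only for anonymous requests, so authenticated
// clients like rclone keep using their own credentials.
function buildB2Headers(incomingHeaders, authToken) {
  const headers = new Headers(incomingHeaders);
  if (!headers.has("Authorization")) {
    headers.set("Authorization", authToken); // anonymous: add bucket token
  }
  return headers; // authenticated: leave the client's token untouched
}

// In handleRequest, the forwarded request would then be built as:
//   const modRequest = new Request(request.url, {
//     method: request.method,
//     headers: buildB2Headers(request.headers, '<b2 auth token>')
//   })
```

I haven't verified this against B2, so treat it as a sketch of the idea rather than a confirmed fix.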
I don't know Go, so apologies if I've missed anything. Thanks in advance!