Cloudflare R2 unauthorized

What is the problem you are having with rclone?

rclone was working fine for months until last week. Now I get authorization errors when I try any rclone command with Cloudflare R2.

Run the command 'rclone version' and share the full output of the command.

rclone v1.65.0
- os/version: darwin 12.5 (64 bit)
- os/kernel: 21.6.0 (arm64)
- os/type: darwin
- os/arch: arm64 (ARMv8 compatible)
- go/version: go1.21.4
- go/linking: dynamic
- go/tags: none

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy myfile.mp3 r2:path/to/directory

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

[r2]
type = s3
provider = Cloudflare
access_key_id = XXX
secret_access_key = XXX
endpoint = https://5db001e3b81b42dcc6cfe425ce8376a8.r2.cloudflarestorage.com
acl = private
region = auto
2023/12/15 12:52:03 ERROR : Attempt 1/3 failed with 1 errors and: Unauthorized: Unauthorized
	status code: 401, request id: , host id: 
2023/12/15 12:52:03 ERROR : Attempt 2/3 failed with 1 errors and: Unauthorized: Unauthorized
	status code: 401, request id: , host id: 
2023/12/15 12:52:03 ERROR : Attempt 3/3 failed with 1 errors and: Unauthorized: Unauthorized
	status code: 401, request id: , host id: 
2023/12/15 12:52:03 Failed to copy: Unauthorized: Unauthorized
	status code: 401, request id: , host id: 

This all worked fine up until a week or so ago. Since rclone didn't change, I assume something changed on the R2 backend that is now rejecting my authorization.

Thank you for any suggestions you might have.

I would start with validating your credentials.

Try connecting with a different tool than rclone - e.g. the aws CLI or any other S3-compatible program you prefer.
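For example, something like this with the aws CLI (the bucket name and account-id endpoint are placeholders - substitute your own values, and export your R2 key pair first):

export AWS_ACCESS_KEY_ID=<your-access-key>
export AWS_SECRET_ACCESS_KEY=<your-secret-key>
aws s3 ls s3://<your-bucket> --endpoint-url https://<accountid>.r2.cloudflarestorage.com

If that lists the bucket with the same key pair, the credentials themselves are probably fine.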

Hi, thanks for the reply.

I can access the R2 buckets just fine with s3cmd, using the exact same config settings (same key).

If it isn't the credentials, then it could be that rclone is connecting to a different server.

So check DNS. Is rclone connecting via IPv6 or IPv4? (Try --bind 0.0.0.0)

Try adding -vv --dump bodies and see whether it looks OK.
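Something along these lines, reusing the copy command from your first post (-vv, --dump bodies and --bind are all standard rclone flags; --dump bodies prints the full HTTP requests and responses):

rclone copy myfile.mp3 r2:path/to/directory -vv --dump bodies --bind 0.0.0.0

That should show exactly which request the server is rejecting.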

Are you using a scoped API token on the Cloudflare side? I have noticed the same thing: Object Read & Write is not enough; you need Admin Read & Write.

I just dealt with this. It turns out my R2 token expired :smiley:

Thanks everyone for the responses.

I'm still having issues. I have the same keys and config settings for s3cmd and rclone. s3cmd works just fine, but rclone says Access Denied.

I went ahead and made new keys in Cloudflare and updated both the s3cmd and rclone settings, and I get the same result: s3cmd works fine, but rclone says Access Denied.

I've tried with --bind 0.0.0.0, but still get the same thing.

I tried dumping bodies and get no extra info, just that it fails with a 403 (edited below):

2024/01/05 16:48:46 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2024/01/05 16:48:46 DEBUG : <myfile>.mp3: Need to transfer - File not found at Destination
2024/01/05 16:48:46 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2024/01/05 16:48:46 DEBUG : HTTP REQUEST (req 0x14000219c00)
2024/01/05 16:48:46 DEBUG : PUT /<myfolder> HTTP/1.1
Host: <path>.r2.cloudflarestorage.com
User-Agent: rclone/v1.65.0
Content-Length: 148
Authorization: XXXX
X-Amz-Acl: private
X-Amz-Content-Sha256: XXXXXXXXXXXXXXXXXXXXXXXXXXXX
X-Amz-Date: 20240106T004846Z
Accept-Encoding: gzip

<CreateBucketConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/"><LocationConstraint>auto</LocationConstraint></CreateBucketConfiguration>
2024/01/05 16:48:46 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2024/01/05 16:48:46 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2024/01/05 16:48:46 DEBUG : HTTP RESPONSE (req 0x14000219c00)
2024/01/05 16:48:46 DEBUG : HTTP/1.1 403 Forbidden
Transfer-Encoding: chunked
Cf-Ray: 840ff513ab55db62-LAX
Connection: keep-alive
Content-Type: application/xml
Date: Sat, 06 Jan 2024 00:48:46 GMT
Server: cloudflare
Vary: Accept-Encoding

6e
<?xml version="1.0" encoding="UTF-8"?><Error><Code>AccessDenied</Code><Message>Access Denied</Message></Error>
0

might try --s3-no-check-bucket
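the dump above shows rclone sending a PUT with a CreateBucketConfiguration body, i.e. trying to create the bucket before copying, and a scoped token may not be allowed to do that. if the flag helps, you can also set it permanently in the config - this is just your existing [r2] section plus the config-file form of --s3-no-check-bucket (no_check_bucket), with the endpoint redacted:

[r2]
type = s3
provider = Cloudflare
access_key_id = XXX
secret_access_key = XXX
endpoint = https://<accountid>.r2.cloudflarestorage.com
acl = private
region = auto
no_check_bucket = true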

yup, that was it.

rclone --s3-no-check-bucket copy <file> r2:<path> works fine

rclone copy <file> r2:<path> gives the 403

Any idea why this is happening?

well, initially, the error was 401 and no one is sure about the cause of that. more testing is needed.

after that, "I went ahead and made new keys", which led to a new error: 403.

perhaps, when you created those new keys, you limited the permissions to "Object Read & Write" and/or "Apply to specific buckets only" - is that correct?

Thanks for the reply and your patience.

You are correct. I created the key with limited permissions. If I upgrade the key to "Admin Read & Write", then rclone copy <file> r2:<path> works without issues.

The original error and keys still give a 401. But s3cmd now gives the same error with them, so it is likely a key issue. The key was and still is set to TTL "forever" in the Cloudflare UI, so I don't understand what is/was causing the 401. Maybe it's just an issue with their keys.

In any case, I am up and running again. Thank you.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.