Failed to link: GCS bucket doesn't support public links

What is the problem you are having with rclone?

Unable to get public links for Google Cloud Storage and Backblaze.
Trying to get a link for a Google Cloud Storage file gives this error:
Failed to link: GCS bucket swxxxxx doesn't support public links
But the file is actually available for public access.

The second issue is more of a question: with Backblaze, newly created buckets are set to 'private' by default, and after manually changing a bucket's permission to 'public', the rclone link command returns the "Native URL".
Is it possible to make the new buckets public, and get the S3 URL instead?

Run the command 'rclone version' and share the full output of the command.

rclone v1.62.2
- os/version: Microsoft Windows 10 Pro 21H2 (64 bit)
- os/kernel: 10.0.19044.2728 Build 19044.2728.2728 (x86_64)
- os/type: windows
- os/arch: amd64
- go/version: go1.20.2
- go/linking: static
- go/tags: cmount

Which cloud storage system are you using? (eg Google Drive)

Google Cloud Storage / Backblaze B2

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone.exe link "gCloud:swxxxxx/index.html"

The rclone config contents with secrets removed.

[gCloud]
type = google cloud storage
project_number = fine-command-123123
object_acl = publicRead
bucket_acl = publicRead
token = {"access_token":""}

A log from the command with the -vv flag

2023/03/23 23:13:47 DEBUG : rclone: Version "v1.62.2" starting with parameters ["rclone.exe" "link" "gCloud:swxxxxx/index.html" "-vv"]
2023/03/23 23:13:47 DEBUG : Creating backend with remote "gCloud:swxxxxx/index.html"
2023/03/23 23:13:47 DEBUG : Using config file from "C:\\Users\\tomaj\\AppData\\Roaming\\rclone\\rclone.conf"
2023/03/23 23:13:47 DEBUG : fs cache: adding new entry for parent of "gCloud:swxxxxx/index.html", "gCloud:swxxxxx"
2023/03/23 23:13:47 DEBUG : 3 go routines active
2023/03/23 23:13:47 Failed to link: GCS bucket swxxxxx doesn't support public links

Thank you for this great tool, just starting to use it and love it already!

This is a bit missing from the GCS backend. It would be easy to add if you wanted to have a go? You'd have to implement the PublicLink method.
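For context, rclone backends that support `rclone link` implement the `fs.PublicLinker` interface, i.e. a `PublicLink` method taking the remote path plus expiry/unlink options. For GCS objects with a `publicRead` ACL, the link would presumably be the well-known `https://storage.googleapis.com/<bucket>/<object>` form. Here is a minimal, standalone sketch of just that URL construction — the function name is made up for illustration and this is not rclone's actual code:

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// gcsPublicLink builds the well-known public URL for an object in a
// GCS bucket that has a publicRead ACL. Purely illustrative of what a
// PublicLink implementation in the googlecloudstorage backend could
// return; the name is hypothetical.
func gcsPublicLink(bucket, object string) string {
	// Escape each path segment individually, keeping "/" separators.
	segments := strings.Split(object, "/")
	for i, s := range segments {
		segments[i] = url.PathEscape(s)
	}
	return fmt.Sprintf("https://storage.googleapis.com/%s/%s",
		bucket, strings.Join(segments, "/"))
}

func main() {
	fmt.Println(gcsPublicLink("swxxxxx", "index.html"))
	// prints https://storage.googleapis.com/swxxxxx/index.html
}
```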

Why do you want the S3 URL? Are you accessing the buckets via the S3 interface? Or is that just for public access?

I think any of the URLs will work for public access.

Note that you could set this option, and rclone link would then give a URL based on the download URL:


Custom endpoint for downloads.

This is usually set to a Cloudflare CDN URL as Backblaze offers
free egress for data downloaded through the Cloudflare network.
Rclone works with private buckets by sending an "Authorization" header.
If the custom endpoint rewrites the requests for authentication,
e.g., in Cloudflare Workers, this header needs to be handled properly.
Leave blank if you want to use the endpoint provided by Backblaze.

The URL provided here SHOULD have the protocol and SHOULD NOT have
a trailing slash or specify the /file/bucket subpath as rclone will
request files with "{download_url}/file/{bucket_name}/{path}".


(No trailing "/", "file" or "bucket")


  • Config: download_url
  • Type: string
  • Required: false
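As an example (the remote name, credentials, and URL below are hypothetical), a B2 remote with a custom download endpoint could be configured like this in rclone.conf, after which rclone link would return links rooted at that endpoint:

```ini
[b2]
type = b2
account = XXXXXXXXXXXX
key = XXXXXXXXXXXX
# Hypothetical Cloudflare-fronted endpoint: protocol included,
# no trailing "/" and no "/file/<bucket>" subpath.
download_url = https://files.example.com
```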

Thank you Nick, yes, I will surely give it my best!
Please let me know what needs to be done, thank you.

The first step would be to look at the PublicLink function in the other backends to see how it is done.

Once you have seen those, implement it in the googlecloudstorage backend.

I think the SDK should be able to give you the URL.
