Rclone azure blob storage auth with container level sas token

What is the problem you are having with rclone?

We have a copy job that copies from Azure Blob Storage to a GCP GCS bucket.
The way we set up authentication is shown in the command section below.
It works fine when I set up a storage-account-level SAS token, but I could not find a way to set up container-level SAS token authentication.

I got a 403 error.
Any ideas?

Run the command 'rclone version' and share the full output of the command.

output of rclone version:

rclone v1.64.0

  • os/version: centos 7.9.2009 (64 bit)
  • os/kernel: 3.10.0-1160.59.1.el7.x86_64 (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.21.1
  • go/linking: static
  • go/tags: none

Which cloud storage system are you using? (eg Google Drive)

azure blob, gcs

The command you were trying to run (eg rclone copy /tmp remote:tmp)

export RCLONE_CONFIG_SRC_TYPE="azureblob"
export RCLONE_CONFIG_SRC_ENDPOINT="http://blob.core.windows.net"
export RCLONE_CONFIG_SRC_SAS_URL="container sas token url"

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

Paste config here

A log from the command that you were trying to run with the -vv flag

2023/11/17 19:25:55 Failed to ls: GET https://storageaccount.blob.core.windows.net/container

RESPONSE 403: 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.

ERROR CODE: AuthenticationFailed

<?xml version="1.0" encoding="utf-8"?>
<Error>
  <Code>AuthenticationFailed</Code>
  <Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Time:2023-11-17T19:25:55.8260265Z</Message>
  <AuthenticationErrorDetail>sr is mandatory. Cannot be empty</AuthenticationErrorDetail>
</Error>

You didn't include the command you are using...

You must include the container name in the rclone command if using container level SAS, so

rclone copy blah azureblob:container
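Putting that together with the env-var setup from the original post, a minimal sketch might look like the following (the remote name `src` matches the post; the account, container, and path names are placeholders, not values from this thread):

```shell
# Configure the azureblob remote entirely through environment variables.
export RCLONE_CONFIG_SRC_TYPE="azureblob"
# A container-level SAS URL already names the account and container, e.g.
# https://<account>.blob.core.windows.net/<container>?<sas-token>
export RCLONE_CONFIG_SRC_SAS_URL="https://storageaccount.blob.core.windows.net/container?<sas-token>"

# With a container-level SAS the container name must still appear in the
# remote path, otherwise rclone tries account-level operations and gets 403:
rclone copy /local/data src:container/datafolder -vv
```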

the command that I used was:
rclone copy src:${CONTAINER}${AZURE_DATA_SUB_PATH}/datafolder dst:${GCS_BUCKET}/destination_folder -v --bwlimit ${BWLIMIT}"M"

Why don't you run rclone with -vv to make a debug log? Then we can see exactly what arguments you passed.

That will tell you if there is anything wrong with those env vars.

Sure, let me get a log with -vv.

Great! That will tell us what

rclone copy src:${CONTAINER}${AZURE_DATA_SUB_PATH}/datafolder dst:${GCS_BUCKET}/destination_folder -v --bwlimit ${BWLIMIT}"M"

actually arrived as in the first line, eg

2023/11/21 10:47:55 DEBUG : rclone: Version "v1.65.0-beta.7523.15747cc9f.fix-s3-gcs-looping" starting with parameters ["rclone" "lsf" "azureblobsas:rclone" "-vv"]

Also I just verified that the container level SAS URL is working OK for me.

Hi Nick, it turns out the issue was that the SAS token URL was not right.
I've made it work on my end. Thank you for your input.

BTW, we are exploring using an Azure service principal (application) to authenticate to Blob Storage.
Would the auth setup be something like the below, or different?



No worries!

You would use these env vars

You can also use the standard Azure variables, in which case you would need to set RCLONE_AZUREBLOB_ENV_AUTH=true as well.
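A sketch of both options, reusing the `src` remote name from the original post (the tenant/client values are placeholders; the option names come from the rclone azureblob backend):

```shell
# Option 1: rclone-specific variables for the "src" remote,
# mapping to the backend's account/tenant/client_id/client_secret options.
export RCLONE_CONFIG_SRC_TYPE="azureblob"
export RCLONE_CONFIG_SRC_ACCOUNT="storageaccount"
export RCLONE_CONFIG_SRC_TENANT="your-tenant-id"
export RCLONE_CONFIG_SRC_CLIENT_ID="your-app-client-id"
export RCLONE_CONFIG_SRC_CLIENT_SECRET="your-client-secret"

# Option 2: standard Azure SDK variables, picked up by rclone
# when env_auth is enabled.
export RCLONE_AZUREBLOB_ENV_AUTH=true
export AZURE_TENANT_ID="your-tenant-id"
export AZURE_CLIENT_ID="your-app-client-id"
export AZURE_CLIENT_SECRET="your-client-secret"
```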

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.