Rclone s3 copy fails when aws s3 cp works

What is the problem you are having with rclone?

When trying to copy a local file to S3, I get a Forbidden (status code: 403) error, but the equivalent awscli command works fine with no issues.

Works - aws s3 cp test.csv s3://international/to_transfer/ --profile international --dryrun
Fails - rclone copy test.csv s3://international/to_transfer/ --s3-profile international --dry-run

Please advise me if I am missing anything here.

What is your rclone version (output from rclone version)

rclone v1.56.0

  • os/version: darwin 10.15.7 (64 bit)
  • os/kernel: 19.6.0 (x86_64)
  • os/type: darwin
  • os/arch: amd64
  • go/version: go1.16.6
  • go/linking: dynamic
  • go/tags: cmount

Which OS you are using and how many bits (eg Windows 7, 64 bit)

MacOS v10.15.7

Which cloud storage system are you using? (eg Google Drive)

AWS S3

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy test.csv s3://international/to_transfer/ --s3-profile international --dry-run

The rclone config contents with secrets removed.

[s3]
type = s3
provider = AWS
region = us-east-1
env_auth = true

AWS Config

> cat ~/.aws/credentials
[international]
aws_access_key_id = XXXXXXXX
aws_secret_access_key = YYYYYYYY

> cat ~/.aws/config
[profile international]
region = eu-west-1

A log from the command with the -vv flag

2021/08/16 21:24:02 DEBUG : rclone: Version "v1.56.0" starting with parameters ["rclone" "copy" "test.csv" "s3://international/to_transfer/" "--s3-profile" "international" "--dry-run" "-vvv"]
2021/08/16 21:24:02 DEBUG : Creating backend with remote "test.csv"
2021/08/16 21:24:02 DEBUG : Using config file from "/Users/aparthiban/.config/rclone/rclone.conf"
2021/08/16 21:24:02 DEBUG : fs cache: adding new entry for parent of "test.csv", "/Users/aparthiban/Downloads"
2021/08/16 21:24:02 DEBUG : Creating backend with remote "s3://international/to_transfer/"
2021/08/16 21:24:02 DEBUG : s3: detected overridden config - adding "{xGogr}" suffix to name
2021/08/16 21:24:02 DEBUG : fs cache: renaming cache item "s3://international/to_transfer/" to be canonical "s3{xGogr}:international/to_transfer"
2021/08/16 21:24:03 NOTICE: S3 bucket cnnintl-dp-tdcid path eds/to_warnermedia: Switched region to "eu-west-1" from "us-east-1"
2021/08/16 21:24:03 DEBUG : pacer: low level retry 1/10 (error BucketRegionError: incorrect region, the bucket is not in 'us-east-1' region at endpoint ''
        status code: 301, request id: , host id: )
2021/08/16 21:24:03 DEBUG : pacer: Rate limited, increasing sleep to 10ms
2021/08/16 21:24:03 DEBUG : pacer: Reducing sleep to 0s
2021/08/16 21:24:03 ERROR : Attempt 1/3 failed with 1 errors and: Forbidden: Forbidden
        status code: 403, request id: R758D29V3EJ8DYBN, host id: ukigO3tQdGC3z1tP3wGEv8/o037RIf4+jWS47uIv9tHmR2XptnFddHDR/PYA2+u1uE9tNNjG64Y=
2021/08/16 21:24:03 ERROR : Attempt 2/3 failed with 1 errors and: Forbidden: Forbidden
        status code: 403, request id: NMH6JV33C8DRGEGK, host id: eHlmaevB56/X0mptxoVp8AE9UI3NH1f8/+X6CzjR7uKLwgQaWkDXVAQNN8g4Dasj1CaE+iarDZk=
2021/08/16 21:24:03 ERROR : Attempt 3/3 failed with 1 errors and: Forbidden: Forbidden
        status code: 403, request id: NMHD7HX7Q7WG5QGT, host id: IaJ+TwTkhGqpzMZLiHK8fGRPdc2CbSRtSw3a99s78dU/UGo0BEVW6qPDU3ISgGuiosI4z39Kvq4=
2021/08/16 21:24:03 NOTICE: 
Transferred:              0 / 0 Byte, -, 0 Byte/s, ETA -
Errors:                 1 (retrying may help)
Elapsed time:         1.3s

2021/08/16 21:24:03 DEBUG : 10 go routines active
2021/08/16 21:24:03 Failed to copy: Forbidden: Forbidden
        status code: 403, request id: NMHD7HX7Q7WG5QGT, host id: IaJ+TwTkhGqpzMZLiHK8fGRPdc2CbSRtSw3a99s78dU/UGo0BEVW6qPDU3ISgGuiosI4z39Kvq4=

hello,

as per the debug log:
incorrect region, the bucket is not in 'us-east-1'

rclone is told to use us-east-1; however, I assume the bucket is actually stored in eu-west-1

  • the rclone config uses region = us-east-1
  • the aws config uses region = eu-west-1
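
one way to avoid the mismatch is to make the rclone remote's region match the bucket — a sketch, assuming the bucket really lives in eu-west-1:

```ini
[s3]
type = s3
provider = AWS
region = eu-west-1
env_auth = true
```

alternatively, `--s3-region eu-west-1` on the command line overrides the configured region for a single run.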

When I set the region explicitly, the region-specific error goes away, but I still get the same 403 response.

> rclone copy test.csv s3://international/to_transfer/ --s3-profile international --s3-region eu-west-1 --dry-run -vvv
2021/08/17 10:20:11 DEBUG : rclone: Version "v1.56.0" starting with parameters ["rclone" "copy" "test.csv" "s3://international/to_transfer/" "--s3-profile" "international" "--s3-region" "eu-west-1" "--dry-run" "-vvv"]
2021/08/17 10:20:11 DEBUG : Creating backend with remote "test.csv"
2021/08/17 10:20:11 DEBUG : Using config file from "/Users/aparthiban/.config/rclone/rclone.conf"
2021/08/17 10:20:11 DEBUG : fs cache: adding new entry for parent of "test.csv", "/Users/aparthiban/Downloads"
2021/08/17 10:20:11 DEBUG : Creating backend with remote "s3://international/to_transfer/"
2021/08/17 10:20:11 DEBUG : s3: detected overridden config - adding "{L7M58}" suffix to name
2021/08/17 10:20:11 DEBUG : fs cache: renaming cache item "s3://international/to_transfer/" to be canonical "s3{L7M58}:international/to_transfer"
2021/08/17 10:20:12 ERROR : Attempt 1/3 failed with 1 errors and: Forbidden: Forbidden
        status code: 403, request id: 1WRWAJ1QC9YZH48P, host id: LyRv7lpwlcLw1iAJXTupo46IYOoy8S4O6LpK6Fy4IPduKePxD/NVo0C7h6MMucMxoMIUOQImMF4=
2021/08/17 10:20:12 ERROR : Attempt 2/3 failed with 1 errors and: Forbidden: Forbidden
        status code: 403, request id: 9RGGB2909MYM1HC7, host id: eDDvamhVgzxB50g6QH/2yiV1J7iBseBUg1TgltHsQnlonE+KrR2VVTU4zyAdlDlfUlGJCDMFcBY=
2021/08/17 10:20:12 ERROR : Attempt 3/3 failed with 1 errors and: Forbidden: Forbidden
        status code: 403, request id: 9RGR41V2FZ7TRVBP, host id: GP0xjqPALl2qoWsMrta4ncKOAWWGiDENokIqQW6T6vo9pvE+fMS5DagBGJJ+IT3bm2lHb//Ax10=
2021/08/17 10:20:12 NOTICE: 
Transferred:              0 / 0 Byte, -, 0 Byte/s, ETA -
Errors:                 1 (retrying may help)
Elapsed time:         1.0s

2021/08/17 10:20:12 DEBUG : 6 go routines active
2021/08/17 10:20:12 Failed to copy: Forbidden: Forbidden
        status code: 403, request id: 9RGR41V2FZ7TRVBP, host id: GP0xjqPALl2qoWsMrta4ncKOAWWGiDENokIqQW6T6vo9pvE+fMS5DagBGJJ+IT3bm2lHb//Ax10=

I wonder if rclone needs more S3 permissions than PutObject. Are the permissions below enough to perform this copy operation?

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetBucketLocation"
            ],
            "Resource": "arn:aws:s3:::international"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::international/to_transfer/",
                "arn:aws:s3:::international/to_transfer/*"
            ]
        }
    ]
}

this is documented at https://rclone.org/s3/#s3-permissions

i suggest using that template and seeing if it works.
if it does, then test different s3 bucket policies for your use-case.

I can only see the required S3 permissions for the rclone sync and rclone lsd commands at the link above. I am looking for the rclone copy command, where I have access only to copy/push the file (not even ls access).

the policy below will do that with an rclone command such as:
rclone copy file.txt remote:en07 --s3-no-check-bucket --s3-no-head-object --s3-no-head

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::en07/*",
        "arn:aws:s3:::en07"
      ]
    }
  ]
}

The above solution works just fine. But I wonder why the two commands below behave differently. Can you please advise if I am missing anything here?

> rclone copy /Users/aparthiban/Downloads/test.csv s3://international/to_transfer/ --s3-no-check-bucket --s3-no-head-object --s3-no-head --ignore-checksum (WORKS !)

> rclone copy --include test.csv /Users/aparthiban/Downloads s3://international/to_transfer/ --s3-no-check-bucket --s3-no-head-object --s3-no-head --ignore-checksum (FAILS !!)

2021/08/18 21:59:53 DEBUG : rclone: Version "v1.56.0" starting with parameters ["rclone" "copy" "--ignore-checksum" "--include" "test.csv" "/Users/aparthiban/Downloads" "s3://international/to_transfer/" "-vvv"]
2021/08/18 21:59:53 DEBUG : Creating backend with remote "/Users/aparthiban/Downloads"
2021/08/18 21:59:53 DEBUG : Using config file from "/Users/aparthiban/.config/rclone/rclone.conf"
2021/08/18 21:59:53 DEBUG : Creating backend with remote "s3://international/to_transfer/"
2021/08/18 21:59:53 DEBUG : fs cache: renaming cache item "s3://international/to_transfer/" to be canonical "s3:international/to_transfer"
2021/08/18 21:59:54 ERROR : S3 bucket international path to_transfer: error reading destination root directory: AccessDenied: Access Denied
        status code: 403, request id: TYR5673CSXV34622, host id: ZcLYLwv+y1xLY2tcJxaC1U9ao6J+RoJB5J5wjAsw6TDgJrfkma8+e63qGr+TaEP8wVhZcw2X+oY=
2021/08/18 21:59:54 DEBUG : S3 bucket international path to_transfer: Waiting for checks to finish
2021/08/18 21:59:54 DEBUG : S3 bucket international path to_transfer: Waiting for transfers to finish
2021/08/18 21:59:54 ERROR : Attempt 1/3 failed with 1 errors and: AccessDenied: Access Denied
        status code: 403, request id: TYR5673CSXV34622, host id: ZcLYLwv+y1xLY2tcJxaC1U9ao6J+RoJB5J5wjAsw6TDgJrfkma8+e63qGr+TaEP8wVhZcw2X+oY=
2021/08/18 21:59:54 ERROR : S3 bucket international path to_transfer: error reading destination root directory: AccessDenied: Access Denied
        status code: 403, request id: TYR4DGER2PJP27A2, host id: SdcC21kV1mfFkRTDOc2RLMT56usGSZqBFPzKc7Cbuub5vsnplBfWLAa4tL7WXFStxznMPWNSWGE=
2021/08/18 21:59:54 DEBUG : S3 bucket international path to_transfer: Waiting for checks to finish
2021/08/18 21:59:54 DEBUG : S3 bucket international path to_transfer: Waiting for transfers to finish
2021/08/18 21:59:54 ERROR : Attempt 2/3 failed with 1 errors and: AccessDenied: Access Denied
        status code: 403, request id: TYR4DGER2PJP27A2, host id: SdcC21kV1mfFkRTDOc2RLMT56usGSZqBFPzKc7Cbuub5vsnplBfWLAa4tL7WXFStxznMPWNSWGE=
2021/08/18 21:59:54 ERROR : S3 bucket international path to_transfer: error reading destination root directory: AccessDenied: Access Denied
        status code: 403, request id: TYR2JMD4TKDC8J27, host id: lpnlrOZ1++WpmQP5EGY1yDiulM1uFxy7JXnjt/wzyFm5CKA1DLh/Xg1XFZLRhRf6bgAhoAIjFZw=
2021/08/18 21:59:54 DEBUG : S3 bucket international path to_transfer: Waiting for checks to finish
2021/08/18 21:59:54 DEBUG : S3 bucket international path to_transfer: Waiting for transfers to finish
2021/08/18 21:59:54 ERROR : Attempt 3/3 failed with 1 errors and: AccessDenied: Access Denied
        status code: 403, request id: TYR2JMD4TKDC8J27, host id: lpnlrOZ1++WpmQP5EGY1yDiulM1uFxy7JXnjt/wzyFm5CKA1DLh/Xg1XFZLRhRf6bgAhoAIjFZw=
2021/08/18 21:59:54 INFO  : 
Transferred:              0 / 0 Byte, -, 0 Byte/s, ETA -
Errors:                 1 (retrying may help)
Elapsed time:         1.1s

2021/08/18 21:59:54 DEBUG : 6 go routines active
2021/08/18 21:59:54 Failed to copy: AccessDenied: Access Denied
        status code: 403, request id: TYR2JMD4TKDC8J27, host id: lpnlrOZ1++WpmQP5EGY1yDiulM1uFxy7JXnjt/wzyFm5CKA1DLh/Xg1XFZLRhRf6bgAhoAIjFZw=

the first command copies a single file.

the second command copies a folder.
in this case, rclone needs to compare the source and dest; rclone will try to GET a listing of the files in the dest folder.
based on the s3 bucket policy, that is not permitted, which leads to the 403

to work around this, add --no-traverse
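
putting that together, the failing folder copy from the post above would become (a sketch; the local path, bucket, and prefix are taken from the earlier commands):

```shell
rclone copy --include test.csv /Users/aparthiban/Downloads s3://international/to_transfer/ \
  --s3-no-check-bucket --s3-no-head-object --s3-no-head --no-traverse
```

with --no-traverse rclone skips listing the destination directory, so the PutObject-only policy is sufficient. the alternative would be to also grant s3:ListBucket on arn:aws:s3:::international (optionally restricted with an s3:prefix condition), which would permit the listing instead.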


This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.