Required S3 Perms

We are trying to use Rclone to push files to S3 using this command

rclone -v --config /path/to/rclone.conf sync /path/to/local/directory S3Remote:bucket-name/path/in/bucket

We are trying to protect the files so they can only be uploaded once and cannot be deleted or overwritten. We also have a lifecycle policy in place so S3 deletes files older than X days. The idea behind this is that if we are hit with ransomware, the backups in S3 will survive.
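For reference, the expiration part can be set up with aws-cli. This is just a sketch; the bucket name and the 30-day window are placeholders:

```shell
# Hypothetical lifecycle rule: expire objects some number of days after upload.
# BUCKET_NAME and the 30-day window are placeholder values.
aws s3api put-bucket-lifecycle-configuration \
  --bucket BUCKET_NAME \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "expire-old-backups",
      "Status": "Enabled",
      "Filter": {"Prefix": ""},
      "Expiration": {"Days": 30}
    }]
  }'
```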

We found a related post (Doc Improvement: Required S3 Perms · Issue #1455 · rclone/rclone · GitHub) and it recommends this policy:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::USER_SID:user/USER_NAME"
            },
            "Action": [
                "s3:ListBucket",
                "s3:DeleteObject",
                "s3:PutObject",
                "s3:PutObjectAcl"
            ],
            "Resource": [
              "arn:aws:s3:::BUCKET_NAME/*",
              "arn:aws:s3:::BUCKET_NAME"
            ]
        }
    ]
}

but it doesn't appear to work.
It seems we need a few other permissions to get rclone sync to work.
Can anyone help point us to where we are going wrong?

welcome to the forum,

that is from the year 2017.
might start with the "official" rclone s3 policy and tweak that.

can remove s3:DeleteObject
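for example, a sketch of the policy above with delete removed, applied as an inline user policy. the user name, policy name and ARNs are placeholders, and the exact action list may need tweaking per the rclone docs:

```shell
# Sketch: allow rclone sync without any delete permission.
# USER_NAME, rclone-sync and BUCKET_NAME are placeholders.
aws iam put-user-policy \
  --user-name USER_NAME \
  --policy-name rclone-sync \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetObject",
        "s3:PutObject",
        "s3:PutObjectAcl"
      ],
      "Resource": [
        "arn:aws:s3:::BUCKET_NAME/*",
        "arn:aws:s3:::BUCKET_NAME"
      ]
    }]
  }'
```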

not sure it is possible to prevent overwrites using user/bucket policies.
can use versioning, object lock, or compliance mode as workarounds
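rough sketch of setting that up with aws-cli. bucket name and the 30-day retention are placeholders, and object lock is easiest to enable at bucket creation:

```shell
# Sketch: create a bucket with Object Lock enabled (this also enables versioning).
# BUCKET_NAME is a placeholder; non-us-east-1 regions also need
# --create-bucket-configuration.
aws s3api create-bucket --bucket BUCKET_NAME --object-lock-enabled-for-bucket

# Default retention: lock every new object in COMPLIANCE mode for 30 days
# (example value).
aws s3api put-object-lock-configuration --bucket BUCKET_NAME \
  --object-lock-configuration '{
    "ObjectLockEnabled": "Enabled",
    "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}}
  }'
```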

and would be helpful to know the s3 provider?

keep in mind that rclone itself provides minimal protection against that.
anyone with the rclone config can overwrite the files.

That's just my two pennies' worth, but you cannot achieve strong ransomware protection using rclone and S3 policies alone.

As already mentioned, in order to do this right you have to enable S3 Object Lock (which implies versioning) and probably object retention, unless you are happy to pay for forever-growing storage space.

But what rclone does not support (at least today) is updating the locking period. When you upload a new file it can be locked automatically for a specified time. However, for the protection to be effective, the lock has to be refreshed periodically - otherwise after some time old files won't be protected. It can be easily scripted though using aws-cli (aws s3api put-object-retention ...).
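Something along these lines (a sketch only; the bucket name and 30-day window are placeholders, and it assumes aws-cli v2 and GNU date). Note that a COMPLIANCE retention date can only be extended, never shortened:

```shell
# Sketch: re-extend the retention of every current object version by 30 days.
# BUCKET_NAME and the 30-day window are placeholder values.
BUCKET=BUCKET_NAME
UNTIL=$(date -u -d "+30 days" +%Y-%m-%dT%H:%M:%SZ)

aws s3api list-object-versions --bucket "$BUCKET" \
  --query 'Versions[].[Key,VersionId]' --output text |
while read -r key version; do
  aws s3api put-object-retention --bucket "$BUCKET" \
    --key "$key" --version-id "$version" \
    --retention "{\"Mode\":\"COMPLIANCE\",\"RetainUntilDate\":\"$UNTIL\"}"
done
```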

What rclone can help with (beyond uploading new data) is accessing past bucket versions. I have tried it myself a few times and it is brilliant to be able to see and use my bucket as it was at some point in the past, using the rclone I am familiar with.
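For example (recent rclone versions have these S3 backend flags; the remote path and date are placeholders):

```shell
# Sketch: browse the bucket as it was at a point in the past.
rclone ls --s3-version-at "2024-01-01 00:00:00" S3Remote:bucket-name/path/in/bucket

# Or list every stored version of each object.
rclone ls --s3-versions S3Remote:bucket-name/path/in/bucket
```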

Now, for a "cheap" substitute protection you can use versioning and restricted access:

   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Sid": "Stmt1480692207000",
               "Effect": "Deny",
               "Action": [
                   "s3:DeleteBucket",
                   "s3:DeleteBucketPolicy",
                   "s3:DeleteBucketWebsite",
                   "s3:DeleteObjectVersion"
               ],
               "Resource": [
                   "arn:aws:s3:::*"
               ]
           }
       ]
   }

But it will only be effective against basic ransomware.

Compliance-mode lock, when properly implemented, will protect you from everything but your S3 account being terminated.
