So I have read the docs on the required S3 permissions and done some testing with IAM users who are supposed to be restricted to a subfolder within a bucket.
It would be super useful if rclone could work with permissions restricted to a subfolder within a bucket, say with a policy such as the following:
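Roughly, the policy I'm using is along these lines (a minimal sketch with placeholder bucket and subfolder names; the real policy has a few more details):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::BUCKET_NAME",
      "Condition": {
        "StringLike": { "s3:prefix": [ "SUBFOLDER_NAME/" ] }
      }
    },
    {
      "Effect": "Allow",
      "Action": [ "s3:GetObject", "s3:PutObject", "s3:DeleteObject" ],
      "Resource": "arn:aws:s3:::BUCKET_NAME/SUBFOLDER_NAME/*"
    }
  ]
}
```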
However, it doesn’t seem to work with this kind of permission. When doing a directory listing of the subfolder using rclone, it complains about
AccessDenied: User: arn:aws:sts::100--------:----- is not authorized to perform: s3:ListBucket on resource: arn:aws:s3:::BUCKET_NAME
I could be missing something, but rclone should be able to do everything it needs to complete the action using the given permissions, since I can perform the same operations manually via the S3 API.
Is this maybe to do with the way rclone uses prefixes? If so, could this be adjusted so that it works with a subfolder under tight permissions?
Many thanks Nick. The -vv --dump bodies flags revealed the problem, which was easily fixed by changing the policy.
The thing is that the policy I posted above does not allow the ListBucket action in sub-sub-folders, which the --dump bodies output showed is necessary. Simply expanding the prefix condition to SUBFOLDER_NAME/* instead of SUBFOLDER_NAME/ solved the problem.
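For reference, the corrected ListBucket statement looks roughly like this (again a sketch with placeholder names):

```json
{
  "Effect": "Allow",
  "Action": "s3:ListBucket",
  "Resource": "arn:aws:s3:::BUCKET_NAME",
  "Condition": {
    "StringLike": { "s3:prefix": [ "SUBFOLDER_NAME/*" ] }
  }
}
```

The trailing `*` in the prefix is what lets listing work at any depth under the subfolder, not just at the top level.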
The only remaining thing I can see is that the first call rclone makes is HEAD /BUCKET_NAME/SUBFOLDER_NAME
This call fails with 403 Forbidden, since the HEAD action is only allowed on /BUCKET_NAME/SUBFOLDER_NAME/*
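If one wanted to allow that initial HEAD as well, my guess (untested) is that adding the bare subfolder key alongside the wildcard in the object-level statement's resources would do it:

```json
"Resource": [
  "arn:aws:s3:::BUCKET_NAME/SUBFOLDER_NAME",
  "arn:aws:s3:::BUCKET_NAME/SUBFOLDER_NAME/*"
]
```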
It doesn’t seem to affect the rest of the command processing, so I am not sure if anything needs to be looked at there.