AWS S3 Configuration

I want to trial S3, or more specifically S3 Glacier Deep Archive.

I've created a root user on the AWS website, but I'm having difficulty working out what exactly I need to do to create the rclone config. I believe I need an IAM user, is that right?

The config tool and website also mention several different providers that the S3 storage system works with. Is there any benefit in using a third-party S3 provider rather than AWS S3 directly? Or do I lose anything by using another S3 provider and not AWS S3 direct?

Also, am I right in thinking that if I choose the 48-hour (bulk) retrieval option for anything I may have in Deep Archive storage, there is no retrieval cost?

Also, when we ask for a retrieval, do the files themselves stay in Deep Archive while a copy is placed temporarily in a different S3 storage class for X number of days and then removed?

Right. Create the user, and AWS will display the access key ID and secret access key. Put that info into a remote.
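For reference, the resulting remote in rclone.conf might look something like this (a sketch: the remote name, keys, and region are placeholders; `storage_class = DEEP_ARCHIVE` makes uploads go straight to Deep Archive):

```ini
[aws]
type = s3
provider = AWS
access_key_id = XXXXXXXXXXXXXXXXXXXX
secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
region = eu-west-2
storage_class = DEEP_ARCHIVE
```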

For basic usage, the main difference is cost.

AWS pricing is very complicated.
Amazon S3 Simple Storage Service Pricing - Amazon Web Services shows bulk retrieval at $0.0025/GB

The AWS Pricing Calculator shows the same cost:
1,024 GB per month x 0.0025 USD = 2.56 USD (Cost for Glacier Deep Archive Data Retrieval (Bulk))
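The calculator's figure is just per-GB arithmetic, which is easy to sanity-check:

```shell
# Bulk retrieval from Deep Archive: GB retrieved x $0.0025/GB
awk 'BEGIN { printf "%.2f\n", 1024 * 0.0025 }'
# prints 2.56
```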

Correct, you pay for storage twice: the restored copy is billed alongside the archived original for as long as it exists.
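For completeness, rclone can drive that restore itself via the S3 backend command (a sketch, not run here: `aws:mybucket/path` is a hypothetical remote and path; `priority=Bulk` selects the cheap ~48-hour tier, and `lifetime` is how many days the temporary copy is kept before S3 removes it):

```shell
# Ask S3 to stage a temporary restored copy; the originals
# remain in Deep Archive throughout.
rclone backend restore aws:mybucket/path -o priority=Bulk -o lifetime=7

# Once the restore has completed (up to ~48h for Bulk),
# download the files as normal:
rclone copy aws:mybucket/path /local/path
```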

Do you know what policies I need to assign to the IAM user? I can't seem to figure it out, and the search function is no help.

I'm getting these errors after logging in as the IAM user.

User: arn:aws:iam::REDACTED:user/REDACTED
Action: iam:ListMFADevices
Context: no identity-based policy allows the action

User: arn:aws:iam::REDACTED:user/REDACTED
Action: iam:ListAccessKeys
Context: no identity-based policy allows the action

User: arn:aws:iam::REDACTED:user/REDACTED
Action: iam:ListSigningCertificates
Context: no identity-based policy allows the action

User: arn:aws:iam::REDACTED:user/REDACTED
Action: iam:GetLoginProfile
Context: no identity-based policy allows the action

Yes, I agree; I tried that AWS pricing calculator. So do I also get charged for uploading data to Glacier Deep Archive?

I'm thinking of just making a copy of my data to Deep Archive for now whilst trialling it. As it's low cost, I would set up a separate account and deposit maybe £10-£15 into it each month, and then use that as a buffer as and when I need to get access to something. Some of my data I haven't needed to access for 5+ years, but until recently it sat in a Google Workspace account.

Example policy
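Something along these lines is the usual minimal shape for rclone (a sketch, assuming a single bucket named `MYBUCKET`; tighten or extend the actions to taste):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::MYBUCKET"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject", "s3:RestoreObject"],
      "Resource": "arn:aws:s3:::MYBUCKET/*"
    }
  ]
}
```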

Oh, I have to make my own policy? I expected to be able to just add what I need using the website.

Correct, and you add the policy using the website.

This is an unexpected learning curve moving from Google to S3.

But it's all good, I'm getting there slowly. On another note, do AWS charge you for uploading content to Glacier Deep Archive, or have I totally misunderstood the confusing pricing calculator?

Nothing is really free with AWS, but on the positive side you only pay for what you use.

In terms of uploads (including Glacier) you do not pay for network bandwidth, but you do pay for API transactions. From a practical perspective it is better to have a few big files than many small ones.
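To put a number on that, assume a PUT request price of $0.05 per 1,000 requests for Deep Archive (an illustrative figure; check the current rate). Uploading the same data as a million small files versus a thousand large ones:

```shell
# Request fees only, at an assumed $0.05 per 1,000 PUTs:
awk 'BEGIN { printf "%.2f\n", 1000000 / 1000 * 0.05 }'   # 1,000,000 files -> prints 50.00
awk 'BEGIN { printf "%.2f\n", 1000 / 1000 * 0.05 }'      # 1,000 files     -> prints 0.05
```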

Yes, it is.
With gdrive, basically, your client ID is the root user, so it is simple to use, due to the lack of security features.
With S3, there are two types of users:

  1. Root user: does not require a policy; in production it should never be used.
  2. Non-root user, which is the typical IAM user.

FWIW, for now, create access keys for the root user, forget about Deep Archive, and make your mistakes on small text files.
Creating access keys for the root user
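A first trial along those lines could be as small as this (a sketch; the remote name `aws` and the bucket name are placeholders, and bucket names must be globally unique):

```shell
# Create a bucket, upload a throwaway file, and list it back:
rclone mkdir aws:my-rclone-test-bucket
echo "hello" > test.txt
rclone copy test.txt aws:my-rclone-test-bucket
rclone ls aws:my-rclone-test-bucket
```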

Whatever the calculator tells you.
In general, providers do not charge for upload bandwidth, but with AWS you pay for API calls.

How many TiB of data?

I originally had 125TB of data in Google Workspace; however, I have been pretty ruthless and got it down to approx 8TB that I would hate to lose, which is now in JottaCloud. I have, I think, approx a further 15-20TB that wouldn't be the end of the world if I lost it, albeit some of it I would never be able to get again. Of the 8TB mentioned, I think approx 600GB to 1TB could go in Deep Archive.

OK, but given the very small amount of data, why bother with the complexity of Deep Archive, policies, and complex cost analysis?
With iDrive e2 it's $2.50/TiB/month, and the first year is half-price.

I hadn't looked at that one mainly because it's paid in USD rather than GBP.

I did look at IONOS as that was mentioned in the rclone config, but their pricing seemed just as complex: https://cloud.ionos.co.uk/storage/object-storage#pricing
