Rclone copy for AWS S3

Hi guys, I need some help. I have a cron job that copies files to AWS S3 every day at 12AM and 7PM. The problem is that there are many small files (XML, PDF, etc.), so the daily cost is getting very expensive. Could someone help me with which parameters to use? I'm currently testing these:
--fast-list --size-only
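For context, a minimal sketch of the kind of command involved (the remote name `s3:` and the paths are hypothetical, not from the original post):

```shell
# Hypothetical example: copy a local directory to an S3 bucket.
# --fast-list batches directory listing into fewer, larger API calls
#   (at the cost of more memory), reducing LIST charges.
# --size-only compares files by size alone, avoiding extra metadata
#   lookups when deciding what to transfer.
rclone copy /data/exports s3:my-bucket/exports --fast-list --size-only
```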

Check out:

Amazon S3 (rclone.org)

hello and welcome to the forum,

aws s3 can be expensive, as they charge for api calls and downloads.
some providers, such as wasabi, do not charge those fees.

depending on your use case,
to reduce the number of api calls, you might try a filter such as
--max-age=24h
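A sketch of a daily run with that filter (remote name and paths are again assumptions for illustration):

```shell
# --max-age=24h excludes any file whose modification time is older than
# 24 hours, so rclone only considers recently created or changed files
# and makes far fewer API calls per run.
rclone copy /data/exports s3:my-bucket/exports --max-age=24h --fast-list
```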

Would that only pick up files created within the last 24 hours, or modified ones too?
For example, would a file that has been there for a week but was just modified be uploaded, or only newly created files?

that file will be uploaded and will overwrite the existing file in aws.

note:
if you move a file from one dir to another and that file is older than 24 hours,
then rclone will not sync/copy it,
so the source and dest will be out of sync.

Understood, so if I just move files between directories it won't work.
So if I use --max-age every day and do a full copy once a week, I think it will reduce costs. Do you agree?

yes, i agree...
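The schedule discussed above could be sketched as crontab entries (the paths, remote name, and the weekly slot are assumptions for illustration):

```shell
# Daily runs at 12AM and 7PM: only files changed in the last 24 hours.
0 0,19 * * * rclone copy /data/exports s3:my-bucket/exports --max-age=24h --fast-list --size-only
# Weekly full copy (here, Sunday 1AM) to catch moved or older files
# that the daily --max-age filter would skip.
0 1 * * 0    rclone copy /data/exports s3:my-bucket/exports --fast-list --size-only
```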

Thank you so much for your help and your time.

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.