How to use Rclone to copy local files to S3 and delete files older than N days?

What is the problem you are having with rclone?

  • How to use Rclone to copy local files to S3 and delete files older than N days?

Run the command 'rclone version' and share the full output of the command.

  • rclone 1.63.1

Which cloud storage system are you using? (eg Google Drive)

  • AWS S3

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy src dst
rclone delete src --min-age 5d

The rclone config contents with secrets removed.

Paste config here

A log from the command with the -vv flag

"When implementing the task of copying source files to the target object storage, I first use the copy command to copy the files, and then call the delete command to remove files that are N days old. However, if the source directory is very large and cannot be copied in one run, I am worried that the delete command will remove files that have not yet been uploaded. I cannot use "rclone sync", because the target bucket contains many files that were not synchronized from the source. Can you advise me on how to build the rclone command properly?"

What about replacing it with

rclone move src dst --min-age 5d

"I'm sorry if I expressed myself incorrectly. What I want to do is copy the latest files to S3 and then delete files that are N days old. Using move with --min-age will only copy and delete files that are at least N days old. Is it possible to accomplish all of the above with just one command?"

No, it is not possible to do this with a single command.

I suggest using move instead of delete as a safety measure if you suspect there might be reasons the copy was incomplete. During a move, files already present in the destination are deleted from the source, and any files not yet present are uploaded first and then deleted.

"If it is not possible to solve the problem with a single command, I suggest using the 'move' command to copy and delete files from 5 days ago, and then using the 'copy' command to copy the latest files. This should partially solve the problem. However, I am concerned that if there is an error with the 'move' command, it may delete files that have not been uploaded. My current approach is to use the 'copy' command to copy the data first, and then use the 'delete' command to clean up only if there are no errors with the 'copy' command."

A file is only deleted from the source after a successful upload - this is how move works. If something goes wrong, the file remains in the source.

You still have to run an initial copy, but after that you can rely on e.g. a daily move.
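That schedule could look something like this (cron syntax; the paths, remote name, and run time are placeholder assumptions):

```shell
# One-time initial backfill (placeholder paths):
#   rclone copy /data/src remote:bucket

# Daily cron entry: move files older than 5 days. move uploads any
# file still missing from the destination before deleting it from
# the source, so an earlier incomplete copy is not a data-loss risk.
# 0 2 * * * rclone move /data/src remote:bucket --min-age 5d
```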

My current setup is a scheduled task with a daily interval that drives the rclone run. thx :slight_smile:

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.