How to move directories/contents older than x?

Basically I just want to archive to S3.
So far I have this script:


find /mnt/path/to/Log/rClone -ctime +183 -delete

rclone move /mnt/Path/to/files/ S3-foo:bucketname/Path/to/files/ \
    -v --fast-list --bwlimit 2M --progress --update --use-server-modtime \
    --log-file=/mnt/DS/db/Log/`date +\%Y-\%m-\%d_\%H:\%M:\%S`-rClone_Starting_Daily_foo_To_S3_Archive.log \
    >> /mnt/Path/to/Log/rClone/`date +\%Y-\%m-\%d_\%H:\%M:\%S`-rClone_Daily_foo_Archive_Task.log 2>&1

However I’m missing the part where it selects directories older than whatever days, uploads them intact, & deletes the source after.

Here’s an example of the structure:

[screenshot: one directory per day, named by date]

etc, etc.

Are you seeing empty directories on the source? Is that what you mean? If so, try --delete-empty-src-dirs

No, I don’t know how to make it select & move directories & their contents that are over x days old.

Rclone only pays attention to the modification time of files, not directories. If that is OK then you can use --min-age and --max-age, e.g. rclone ls --min-age 5d remote:path to list all files older than 5 days.
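Put together, a first pass with those flags might look like the following (remote:path and the bucket layout are placeholders taken from the earlier post, not a tested command line):

```shell
# Dry check first: list the files that would qualify
# (modified more than 5 days ago).
rclone ls --min-age 5d remote:path

# Then move only those files; --delete-empty-src-dirs removes the source
# directories that the move leaves empty (paths are placeholders):
rclone move /mnt/Path/to/files/ S3-foo:bucketname/Path/to/files/ \
    --min-age 5d --delete-empty-src-dirs -v
```

Note the age test is still per file, not per directory.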

That’s the problem I was having with ‘aws s3’, which led me to try rclone instead.

I’m able to copy all the files older than foo, but their names are simply the time of day plus file extension. The days of the year are only defined by their containing folders.

When they all get lumped together in a single destination directory they become useless.

I’m not quite following what you want to happen.

Can you share what would happen in your first example and what would get moved and what you expect it to look like when done?

They don’t get lumped together. The directory tree is preserved.

If you need to use the directory names as the date to move, then you can script it: select the directories you need, put their names in an include file, and add that file to your rclone move command with --include-from.
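A minimal sketch of that scripting step, assuming one directory per day named YYYY-MM-DD directly under the source (the paths, cutoff, and helper name are illustrative, not from the thread):

```shell
#!/bin/sh
# Write rclone include patterns for day directories (named YYYY-MM-DD)
# older than a cutoff date. Helper name and layout are assumptions.
select_old_dirs() {
    src=$1
    cutoff=$2
    include=$3
    : > "$include"
    for dir in "$src"/*/; do
        [ -d "$dir" ] || continue
        name=$(basename "$dir")
        # Lexical comparison works because YYYY-MM-DD sorts chronologically.
        if expr "$name" \< "$cutoff" > /dev/null; then
            printf '/%s/**\n' "$name" >> "$include"
        fi
    done
}

# Example wiring (not run here; GNU date syntax):
# select_old_dirs /mnt/Path/to/files "$(date -d '7 days ago' +%Y-%m-%d)" /tmp/include.txt
# rclone move /mnt/Path/to/files S3-foo:bucketname/Path/to/files/ \
#     --include-from /tmp/include.txt --delete-empty-src-dirs -v
```

Because whole directories are matched, the files travel with their containing day directory, and --delete-empty-src-dirs reclaims the emptied source directories afterwards.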

Is that what you’re looking for: moving based on the directory-name date, or on the directory creation date?

Relevant screenshots:

First is the directory in question before anything is done. Files within each are timestamped with the time of day only, not date information.

Next is the directories over a week old selected for archiving. Every way I’ve tried to use the ‘find’ command, it acts on the files rather than the directories, so instead of moving directories with their contents intact I end up with a huge lump of files alone, without their associated directories.

Last is a picture of the resulting directory after the old directories have been archived and the associated space reclaimed.

With regard to selecting directories by name or by date, either will suffice, as they’re both expected to be the same.
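If selecting by the directories' own timestamps is acceptable, the find form that keeps directories intact is -mindepth 1 -maxdepth 1 -type d, which stops find from descending into the files. A hedged sketch, with illustrative paths and helper name:

```shell
# List only the top-level day directories older than N days, judged by
# the directory's own mtime; -maxdepth 1 prevents recursing into files.
old_dirs() {
    find "$1" -mindepth 1 -maxdepth 1 -type d -mtime "+$2"
}

# Usage (path illustrative):
# old_dirs /mnt/Path/to/files 7
```

The output can feed the include-file approach, or mv/tar directly; note that -mtime +7 means strictly more than 7 whole 24-hour periods.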


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.