Rclone only pays attention to the modification time of files, not directories. If that's OK, you can use --min-age and --max-age; for example, rclone ls --min-age 5d remote:path lists all files older than 5 days.
That’s the problem I was having with ‘aws s3’, which led me to try rclone instead.
I’m able to copy all the files older than foo, but their names are simply the time of day plus a file extension; the date is only recorded in the name of the containing folder.
When they all get lumped together in a single destination directory, they become useless.
They don’t get lumped together. The directory tree is preserved.
If you need to use the directory names as the dates to select on, you can script it: pick out the directories you need and put them in an include file to pass to your rclone move command.
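A minimal sketch of that scripting step, assuming the directories are named YYYY-MM-DD (adjust the matching to your actual naming scheme) and using hypothetical paths. It writes an rclone filter rule of the form name/** for every directory whose name-date falls on or before a cutoff, so the directory and everything under it move together:

```shell
#!/usr/bin/env bash
# Sketch, not a drop-in solution: assumes GNU date and YYYY-MM-DD directory
# names; all paths below are hypothetical examples.

# Emit an rclone include rule ("name/**") for each top-level directory of
# SRC whose date-name sorts on or before CUTOFF, into OUTFILE.
build_include_list() {    # usage: build_include_list SRC CUTOFF OUTFILE
    local src=$1 cutoff=$2 out=$3 d name
    : > "$out"
    for d in "$src"/*/; do
        [ -d "$d" ] || continue
        name=$(basename "$d")
        # Lexicographic comparison is chronological for YYYY-MM-DD names.
        if [[ $name < $cutoff || $name == "$cutoff" ]]; then
            echo "$name/**" >> "$out"
        fi
    done
}

# Example usage (paths and remote name are hypothetical):
# cutoff=$(date -d '7 days ago' +%F)
# build_include_list /data/captures "$cutoff" /tmp/archive-dirs.txt
# rclone move /data/captures remote:archive --include-from /tmp/archive-dirs.txt
```

Because the rules end in /**, rclone moves each matched directory with its contents intact, and the tree is preserved on the remote.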
Is that what you’re looking for: moving based on the date in the directory name, or on the directory creation date?
First is the directory in question before anything is done. The files within each subdirectory are timestamped with the time of day only, not the date.
Next, the directories over a week old are selected for archiving. Every way I’ve tried to use the ‘find’ command, it ends up operating on the files alone, which become a huge lump without their associated directories, rather than on the directories themselves with their contents intact.
Last is a picture of the resulting directory after the old directories have been archived and the associated space reclaimed.
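For the second step above, find can be pointed at the directories themselves rather than the files inside them. A sketch, assuming GNU find and a hypothetical source path:

```shell
#!/usr/bin/env bash
src=/data/captures   # hypothetical source path; adjust to yours

# -mindepth 1 skips $src itself, -maxdepth 1 stops find from descending
# into (and listing the contents of) the dated directories, -type d
# matches directories only, and -mtime +7 means the directory's
# modification time is more than 7 days old.
if [ -d "$src" ]; then
    find "$src" -mindepth 1 -maxdepth 1 -type d -mtime +7
fi

# Each match could then be archived intact, e.g.:
# find "$src" -mindepth 1 -maxdepth 1 -type d -mtime +7 \
#     -exec tar czf /archive/old-dirs.tgz {} +
```

The -maxdepth 1 test is what keeps the output as whole directories instead of the lump of individual files; without it, find recurses and lists every file separately.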
As for selecting directories by name or by date, either will suffice, as they’re both expected to be the same.