Can you try running it with --dry-run and -vv and see what the output looks like? I think you might be hitting an issue with the dates and age: even though you say 30 days old, rclone keeps the timestamp from when the file was created.
For example, I just uploaded a hosts file that was changed on 7-1, and if I use a 20 day filter I can still see it, since it was created 24 days ago even though I uploaded it today.
I was hoping that when the folder /old/xxx was created, the --min-age flag would use that date. So from what I understand, it uses the creation date and not the date when the files were moved to /old/?
I assume delete + rmdirs with --min-age will work the same way?
Going back to your use case, though, what do you want to accomplish?
/old/2019-24-7-1800
/old/2019-25-7-1800
Is the goal to remove the whole folder "/old/2019-24-7-1800" once it's past a certain number of days, so that basically you'd have 30 days' worth of backups and keep the last 30 folders?
It's very easy to script on a mount, as you have a few more options there.
Offhand, I can't think of one or two rclone commands that would make it happen, but I'm gonna play a bit and see what I can come up with.
The trick is that directories are timestamped when they are created, and I believe the flow is: check all the directories and remove/delete any full directory that is older than 30 days.
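On a mount, a minimal sketch of that flow could look like the following. The paths are placeholders (a temp dir stands in for the mounted /old/ folder so this is safe to run), and it assumes the mount preserves directory mtimes:

```shell
# Hypothetical sketch: prune dated backup folders older than 30 days
# via a mount. A temp dir stands in for the mounted /old/ folder.
backup_root=$(mktemp -d)
mkdir -p "$backup_root/2019-24-7-1800" "$backup_root/2019-25-7-1800"
# make one folder look 40 days old (GNU touch)
touch -d "40 days ago" "$backup_root/2019-24-7-1800"
# remove any top-level folder whose mtime is older than 30 days
find "$backup_root" -mindepth 1 -maxdepth 1 -type d -mtime +30 \
    -exec rm -rf {} +
ls "$backup_root"
```

After this runs, only the newer folder should remain under the stand-in root.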
I did find one script that kind of does what I want. The author of the script has three versions of it spread across the first two pages of this thread.
Using this method I can get it working, but it won't give me a folder named with the date-time of when it got moved to /old/; instead it uses predefined folder names like old1, old2, old3...
The last version of the script handles the old folders like this, and when old1 is at the top it gets overwritten. I tried this one, but the part of the script that rotates bak.list won't work properly for me, so old1 never gets removed from the top.
old3
old2
old1
old7 Wed Dec 12 13:42:40 CET 2018
old6 Thu Dec 13 10:00:01 CET 2018
old5 Thu Dec 13 10:05:01 CET 2018
old4 Fri Dec 14 01:30:02 CET 2018
An older version of his script had a different approach and moved files on the remote. This depends on whether gdrive supports that operation; in /old/ the folders were named version1, version2...
Mon Dec 10 08:40:51 CET 2018
Moving version 2 to 3...
... done
Mon Dec 10 09:07:43 CET 2018
Moving version 1 to 2...
... done
Mon Dec 10 09:08:11 CET 2018
Performing backup to amazonS3:karoline...
...done
Mon Dec 10 09:13:46 CET 2018
I tried both of them, but the scripts just threw a bunch of errors at me, so I eventually gave up.
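For reference, the version rotation in that log boils down to renaming from the oldest slot downward. A sketch, with placeholder remote paths and the rclone commands echoed rather than executed so it only prints what it would do:

```shell
# Print the moveto commands for rotating versionN -> versionN+1,
# oldest first, as in the log above. remote:old/ is a placeholder.
rotate() {
    local versions=$1 i
    for ((i = versions - 1; i >= 1; i--)); do
        echo "Moving version $i to $((i + 1))..."
        echo rclone moveto "remote:old/version$i" "remote:old/version$((i + 1))"
    done
}
rotate 3
```

Whether moveto happens server-side depends on the backend; Google Drive does support server-side move, so this kind of rotation shouldn't re-upload anything there.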
--min-age only works on files, not on directories, so this approach won't work.
Also, purge does not obey the filters anyway.
You can use delete + rmdirs, but this may not do exactly what you want...
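As a rough sketch (remote:backup/old is a placeholder, and the commands are printed with echo rather than executed, so it reads as a dry run), delete + rmdirs would look like:

```shell
# Delete files older than 30 days, then sweep up the now-empty
# directories. Echoed so nothing is actually removed.
prune() {
    echo rclone delete remote:backup/old --min-age 30d
    echo rclone rmdirs remote:backup/old --leave-root
}
prune
```

--leave-root keeps /old/ itself in place; note that since --min-age applies to files, the directory dates themselves are never consulted here.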
What you really want is to write a script which enumerates the backups you want to keep, then excludes those and deletes the rest.
Here are some things which might help:
# get a list of the 30 most recent dates
for i in `seq 0 30`; do date +%Y-%m-%d-\* -d "now -$i days" ; done | sort > dirs-to-keep
# list all directories
rclone lsf --dirs-only --dir-slash=false remote:${dest}/old | sort > all-dirs
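A sketch of gluing those two lists together with comm. Stand-in data replaces the rclone lsf output, the folder names are assumed to be plain YYYY-MM-DD (adjust the date format to match your real folder names), and the purge commands are echoed rather than run:

```shell
# Build dirs-to-keep (today plus the previous 30 days) and compare it
# against a stand-in all-dirs list; whatever is only in all-dirs is stale.
for i in $(seq 0 30); do date +%Y-%m-%d -d "now -$i days"; done | sort > dirs-to-keep
printf '%s\n' 2019-06-01 2019-07-20 "$(date +%Y-%m-%d)" | sort > all-dirs
# comm -23: lines present only in the first (sorted) file
comm -23 all-dirs dirs-to-keep > dirs-to-delete
while read -r dir; do
    echo rclone purge "remote:backup/old/$dir"   # echoed as a dry run
done < dirs-to-delete
```

Since each purge targets one named folder, no filters are involved, which sidesteps the purge/filters caveat above.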
Can any of these solutions work?
Create a script for rclone. Inside this script I want the following to happen:
When rclone creates a new folder inside /old/"date-time", the date-time gets written to old.list. Only the date-time that matches the folder name is written to this list, nothing else.
The next time it moves stuff to /old/, it adds a line at the top of old.list with the new date-time.
Then, in the same script, it looks for line 31 in old.list. If lines 31-100 are present, it copies their values and feeds them to the delete and rmdirs commands.
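That plan could be sketched roughly like this. The paths, the remote name, and the stamp format are hypothetical, the rclone commands are echoed rather than run, and purge stands in for the delete + rmdirs step:

```shell
# Prepend the new date-time to old.list, then treat everything from
# line 31 down as expired. old.list and remote:old/ are placeholders.
keep=30
list=old.list
stamp=$(date +%Y-%m-%d-%H%M)   # assumed format; mirror your /old/<date-time> names
touch "$list"
{ printf '%s\n' "$stamp"; cat "$list"; } > "$list.tmp" && mv "$list.tmp" "$list"
# lines keep+1 onward name the folders due for deletion
tail -n +"$((keep + 1))" "$list" | while read -r dir; do
    echo rclone purge "remote:old/$dir"   # echoed as a dry run
done
# trim the list so it never grows past the keep count
head -n "$keep" "$list" > "$list.tmp" && mv "$list.tmp" "$list"
```

Run once per backup, this keeps old.list at 30 lines at most, with the newest date-time at the top, which matches the rotation described above.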