Delete only folders older than xx days but leave the rest?

I am doing daily backups to Google Drive using

"${dest}/recent" --backup-dir "${dest}/old/${date}"

This works really well. All deleted/modified files are moved to

/old/2019-24-7-1800
/old/2019-25-7-1800
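
For context, the full sync step looks roughly like this (simplified; ${src} is just a stand-in for the actual source path, and the date format is a guess at how the folder names get generated):

# rough sketch of the daily job; ${src} and the exact date format are placeholders
date=$(date "+%Y-%d-%-m-%H%M")   # gives names like 2019-24-7-1800
rclone sync "${src}" "${dest}/recent" --backup-dir "${dest}/old/${date}"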

I want to keep all backups until they are 30 days old. After 30 days I want to delete only the folders inside /old/ that are older than 30 days.

rclone purge --min-age 30d "${dest}/old/"

I thought this would delete everything INSIDE /old/ that was older than 30 days, but it deleted the entire /old/ folder.

So is what I want even possible to do using rclone?


Can you try running it with --dry-run and -vv and see what the output looks like? I think you might be hitting an issue with the dates and ages: rclone keeps the original timestamp from when the file was created, so "30 days old" is measured against that rather than against when the file was uploaded.

For example, I just uploaded a hosts file that was last changed on 7-1. If I use a 20-day filter, I can still see it, since by its timestamp it is 24 days old even though I uploaded it today.

felix@serenity:~ rclone lsl GD:testing --min-age 20d
      213 2019-07-01 19:56:08.748000000 hosts
felix@serenity:~ rclone lsl GD:testing
      213 2019-07-01 19:56:08.748000000 hosts
      243 2019-07-09 15:28:30.502000000 3/hosts
      243 2019-07-09 15:28:26.432000000 2/hosts

You need to run 2 commands:

rclone delete --min-age 30d /old/
rclone rmdirs --min-age 30d /old/

delete will only delete files, not directories.
rmdirs will remove empty directories.

purge will just wipe everything and won't obey the min/max age filters.

Run with the --dry-run flag first and see:

rclone delete --min-age 30d /old/ --dry-run
rclone rmdirs --min-age 30d /old/ --dry-run

Good catch on purge as I don't use it personally so thanks!

If his goal is to remove whole backup folders, he does not want to use --min-age 30d with delete, as that works on file timestamps and will remove every matching file everywhere, not just the old folders.

I'd use purge but you'd need to get a bit savvy and create a list of directories to delete.

felix@serenity:~ rclone ls GD:testing
      213 hosts
      243 delete/hosts
      243 2/hosts
      243 3/hosts

felix@serenity:~ rclone purge GD:testing/delete

felix@serenity:~ rclone lsl GD:testing
      213 2019-07-01 19:56:08.748000000 hosts
      243 2019-07-09 15:28:26.432000000 2/hosts
      243 2019-07-09 15:28:30.502000000 3/hosts

I see. So I can't use rclone the way I thought?

I was hoping that when the folder /old/xxx was created, rclone would use that date for the --min-age flag. So from what I understand it uses the files' original creation/modification date and not the date they were moved to /old/?

I assume it will work the same way using delete + rmdirs with --min-age?

Is there some way to change this?

If you go back to your use case, what do you want to accomplish though?

/old/2019-24-7-1800
/old/2019-25-7-1800

Is the goal to remove the whole folder "/old/2019-24-7-1800" once it's past a certain number of days, so you'd basically have 30 days' worth of backups and want to keep the last 30 folders?

Yes, that is exactly what I want to accomplish.

In /hostname/current/ there will be a "fresh" backup from the last day.

In /hostname/old/date I want to keep individual folders named with the date they were moved to /old/.

This way I will have a complete backlog of my server for 30 days.

When 30 days have passed I want to delete ONLY the folder that was moved to /old/ 30 days ago and leave all the other folders in /old/ intact.

Do you use a mount at all or this is all via rclone copy/sync uploads basically?

I have a mount for gdrive, but in the script I run it with rclone sync src dest and purge dest

Let me play around and think about it a bit.

It's very easy to script on a mount as you have a few more options there.

Offhand, I can't think of one or two rclone commands that would make it happen, but I'm gonna play around a bit and see what I can come up with.

The trick is that directories are time-stamped when they are created, and I believe the flow you want is to check all the directories and remove/delete any whole directory that is older than 30 days (rough sketch below).
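
As a rough sketch of that flow, assuming the folder names keep the year-day-month-time pattern from your example and GNU date is available, something like this could enumerate the old folders and pick out the ones past 30 days:

#!/bin/bash
# sketch only: purge /old/ subfolders whose name encodes a date older than 30 days
dest="GD:backup"                      # placeholder, use your own remote path
cutoff=$(date -d "30 days ago" +%s)

rclone lsf --dirs-only --dir-slash=false "${dest}/old" | while read -r dir; do
    # folder names look like 2019-24-7-1800 (year-day-month-HHMM)
    IFS=- read -r y d m hm <<< "$dir"
    ts=$(date -d "${y}-${m}-${d}" +%s) || continue
    if [ "$ts" -lt "$cutoff" ]; then
        echo "would purge ${dest}/old/${dir}"
        # rclone purge "${dest}/old/${dir}"   # uncomment once the dry output looks right
    fi
done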

I did find one script that kind of does what I want. The author of the script has 3 versions of it spread across the first two pages of the thread.

Using this method I can get it working, but it won't give me a folder with the date-time it was moved to /old/; instead it uses predefined folder names like old1, old2, old3....

The last version of the script works like this for the old folders, and when old1 is at the top it gets overwritten. I tried this one, but the part of the script that rotates bak.list doesn't work properly for me, so old1 never gets removed from the top.

old3
old2
old1
old7 Wed Dec 12 13:42:40 CET 2018
old6 Thu Dec 13 10:00:01 CET 2018
old5 Thu Dec 13 10:05:01 CET 2018
old4 Fri Dec 14 01:30:02 CET 2018

An older version of his script had a different approach and moved files on the remote. This depends on whether gdrive supports this operation, and in /old/ the folders were named version1, version2.... (The rotation is sketched below the log.)

Mon Dec 10 08:40:51 CET 2018
Moving version 2 to 3...
... done
Mon Dec 10 09:07:43 CET 2018
Moving version 1 to 2...
... done
Mon Dec 10 09:08:11 CET 2018
Performing backup to amazonS3:karoline...
...done
Mon Dec 10 09:13:46 CET 2018
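
If I understood it right, the rotation in that older version boils down to server-side moves, roughly like this (the remote name and paths here are just examples):

# the oldest copy presumably gets purged first, then everything shifts down one slot
rclone purge remote:backup/old/version3
rclone moveto remote:backup/old/version2 remote:backup/old/version3
rclone moveto remote:backup/old/version1 remote:backup/old/version2
rclone sync /local/data remote:backup/current --backup-dir remote:backup/old/version1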

I tried both of them but the script just throws a bunch of errors at me so I eventually gave up.

--min-age only works on files not on directories so this approach won't work :frowning:

Also purge does not obey the filters anyway.

You can use delete + rmdirs but this may not do exactly what you want...

What you really want is to write a script which enumerates the backups you want to keep then excludes those and deletes the rest.

Here are some things which might help

# get a list of the 30 most recent dates
for i in `seq 0 30`; do date +%Y-%m-%d-\* -d "now -$i days" ; done | sort > dirs-to-keep

# list all directories
rclone lsf --dirs-only --dir-slash=false "${dest}/old" | sort > all-dirs
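
And then, as an untested sketch, you could diff the two lists and purge whatever isn't on the keep list (note the date format in dirs-to-keep would need to match however your folders are actually named):

# drop the trailing "-*" from the keep patterns and use them as prefixes to filter out
grep -v -F -f <(sed 's/-\*$/-/' dirs-to-keep) all-dirs > dirs-to-delete

# dry run first, then remove --dry-run once it lists the right folders
while read -r dir; do
    rclone purge --dry-run "${dest}/old/${dir}"
done < dirs-to-delete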

Ok, so purge is out of the question?

Can any of these solutions work?
Create a script for rclone. Inside this script I want the following to happen.
When rclone creates a new folder inside /old/"date-time", that date-time gets written to old.list. Only the date-time that matches the folder name is written to this list, nothing else.
The next time it moves stuff to /old/, it adds a line at the top of old.list with the new date-time.
Then in the same script it looks at line 31 of old.list. If lines 31-100 are present, it copies their values and passes them to the delete and rmdirs commands (rough sketch below).
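
Roughly what I picture, as a sketch (old.list and the variables are just examples, and everything is dry-run first):

# prepend the new folder name to old.list after each sync
touch old.list
{ echo "${date}"; cat old.list; } > old.list.new && mv old.list.new old.list

# anything from line 31 onward is older than 30 backups and can go
tail -n +31 old.list | while read -r dir; do
    rclone delete --dry-run "${dest}/old/${dir}"
    rclone rmdirs --dry-run "${dest}/old/${dir}"
done

# trim the list back down to the newest 30 entries
head -n 30 old.list > old.list.new && mv old.list.new old.list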

It is too much of a blunt instrument!

That looks like it would work.
