Delete folders older than x days?

What is the problem you are having with rclone?

I can't find this answered in other similar topics... but how does one remove the folders (not files!) in my /inc/ folder on Google Drive that are older than x days?

Run the command 'rclone version' and share the full output of the command.

rclone v1.59.0

  • os/version: centos 7.9.2009 (64 bit)
  • os/kernel: 5.17.5-x86_64-linode154 (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.18.3
  • go/linking: static
  • go/tags: none

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone sync --verbose --delete-during /myfolder/ "GoogleDrive:myfiles/latest" --backup-dir="GoogleDrive:myfiles/inc/$(date +%Y-%m-%d.%I-%M-%S)"

The rclone config contents with secrets removed.

type = drive
client_id = xxxxxx
client_secret = xxxxxx
scope = drive
token = {"access_token":"xxxxxx","token_type":"Bearer","refresh_token":"xxxxxx","expiry":"2022-08-05T13:58:30.573112+01:00"}
team_drive = 

Thanks

Rclone generally works on files, not folders.

You can delete empty folders, but I'm not sure that's what you are trying to do.

Deleting all the files and then deleting the empty folders works fine.
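
For example, a minimal sketch of that two-step approach, assuming a 14-day cutoff and the inc path from the sync command above (both are just placeholders to adjust):

# delete all files older than 14 days under the incremental folder
rclone delete GoogleDrive:myfiles/inc --min-age 14d
# then remove the now-empty dated folders, keeping inc itself
rclone rmdirs GoogleDrive:myfiles/inc --leave-root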

I am just a bit surprised it's not an option, considering that after some time I would have a massive amount of data and Google would be charging me loads of money :roll_eyes:

The flag for this is documented here, under delete:

rclone delete

You aren't really paying anything for a directory. It's the files that cost money, so having a bunch of empty directories makes no real difference in terms of cost.

I'm still trying to understand how sync plus deleting folders more than x days old fits together. What are you trying to make happen?

I read that rclone delete only deletes files, though, not folders?

I am backing up all my website files. Say I have 200 websites in the /home/ folder; I am backing up that folder. I am doing it incrementally, so every changed file goes into an /inc/DATE folder. I keep incremental files because sometimes we need to revert to an old file, and normally we only need to do that within the last week or so. So over time I will have hundreds of /inc/DATE folders. I could delete all the files and leave the empty folders, but it's still really messy to have hundreds of empty folders inside my /inc/ folder. I only need to retain files for, say, 7 or 14 days; I do not need to retain them for years.

I also have many servers, so did not want to manually delete all the old files from every server every so often.

Before Rclone I used my own rsync script to save them to a USB drive on a Raspberry Pi. I had some issues with it, hence wanting to move to Rclone, but in that setup I would run a bash script to delete the folders older than a certain date. I can't see how to do that in Rclone. I am surprised Rclone does not allow it and makes you keep them all.

Thanks

Based on that format, I'd imagine you'd write up a script to parse the dates and feed that set of directories into rclone delete to remove/prune the older folders.
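
As a rough sketch of that idea, assuming the dated folder names follow the %Y-%m-%d.%I-%M-%S format from your sync command and a 14-day retention (rclone purge removes a folder and everything inside it, so it stands in for delete here):

#!/bin/bash
# Cut-off date, 14 days ago, in the same YYYY-MM-DD prefix the folders use
cutoff=$(date -d '14 days ago' +%Y-%m-%d)
# List only the top-level dated folders under inc
rclone lsf --dirs-only "GoogleDrive:myfiles/inc" | while read -r dir; do
    # The first 10 characters of the folder name are the date
    folder_date=${dir:0:10}
    if [[ "$folder_date" < "$cutoff" ]]; then
        rclone purge "GoogleDrive:myfiles/inc/$dir"
    fi
done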

Here is an example:

rclone mount GoogleDrive: /path/to/mountpoint
and then run your bash script against /path/to/mountpoint.
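
For instance, a sketch of that combination (the mount point, path, and 14-day window are just placeholders; note that folder modification times seen through the mount may not match when the backup ran, so matching on the dated folder name may be safer):

rclone mount GoogleDrive: /path/to/mountpoint --daemon
# remove top-level dated folders under inc that are older than 14 days
find /path/to/mountpoint/myfiles/inc -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec rm -rf {} +
fusermount -u /path/to/mountpoint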

Ooh OK will try that, thanks.

You might need to add --allow-other.
