Restrict destination directory to X Size

Hi there, first time poster. Apologies if this is a duplicate or silly idea.

I use Rclone to back up a OneDrive for Business account to local storage. This storage is limited to 969GB, and it would be great if Rclone could be set up to sync all files to it within a predefined size limit, say 920GB to allow a buffer, with any files in excess of that limit discarded oldest first.

The suggestion would be:

  • implement --max-dest-size, with an optional flag to customise how files in excess of the set size are deleted, e.g. delete the oldest files first.

If you think Rclone can already do this, I'm happy to hear suggestions! I am currently settling for --max-age 9y --delete-excluded, which happens to keep the destination around 900GB, but this needs regular adjustment as the source is constantly growing.

It would be most helpful to my use case if something like this could be done.

Thanks

You could use rclone size --max-age 9y as part of a script to find the correct cutoff age automatically.

You can get it to output in --json for easy parsing.
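For example, something along these lines could probe --max-age values until the total fits. This is just a sketch: "remote:" and the 920GB budget are placeholders taken from the post above, and it assumes the bytes field that rclone size --json reports.

# Sketch only: probe successively shorter --max-age windows until the
# total reported by rclone size fits under the budget.
import json
import subprocess

REMOTE = "remote:"          # assumed remote name
BUDGET = 920 * 1024**3      # 920 GiB in bytes

def size_within(max_age):
    """Total bytes of files on the remote newer than max_age."""
    out = subprocess.run(
        ["rclone", "size", REMOTE, "--max-age", max_age, "--json"],
        check=True, capture_output=True, text=True,
    ).stdout
    return json.loads(out)["bytes"]

# Try progressively shorter windows (in years) and keep the longest one
# whose total still fits under the budget.
for years in range(12, 0, -1):
    cutoff = f"{years}y"
    if size_within(cutoff) <= BUDGET:
        print(f"use --max-age {cutoff}")
        break

You could then feed the chosen value into your existing --max-age ... --delete-excluded sync.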

Other ideas - use rclone lsf -R --files-only --format tsp . - sort the output so the files are in date order, then write a simple program that outputs lines until the running total reaches 920GB. Then feed those lines into rclone sync --files-from newfile --delete-excluded.

I am of no help to you, but out of sheer curiosity, why would someone back up a cloud drive locally?

You can lose your cloud data

  • The provider can lose the data
  • You can get locked out of the account for some reason beyond your control
  • The provider can go bust or get bought

If you care about your data you need multiple copies in multiple locations.


Thanks Nick, I'll give that a shot

For anyone else who needs to do something like this, here's how I solved this problem using Nick's advice from above and ChatGPT.

Run via cmd to create a text file listing all files, reverse sorted by date:
rclone lsf remote: -R --files-only --format tsp --progress=false | SORT /r /o ReverseSort.txt

Run in PowerShell: read ReverseSort.txt and write the paths whose cumulative size stays under 945 GB to a new file, FileList.txt:

# Delete any previous FileList.txt so we start with a fresh output file
Remove-Item FileList.txt -Force -ErrorAction SilentlyContinue

# Define the maximum size in bytes (945 GiB)
$maxsize = 945 * 1073741824

# Initialize the cumulative size
$size = [Int64]0

# Read each line from ReverseSort.txt (newest files first)
foreach ($line in Get-Content ReverseSort.txt) {
  # Split the line into timestamp, size and path
  # (limit of 3 so any semicolons inside the path are preserved)
  $values = $line -split ';', 3
  # Convert the size to an Int64 and add it to the cumulative size
  $size += [Int64]$values[1]
  # Stop once the cumulative size reaches the maximum
  if ($size -ge $maxsize) {
    break
  }
  # Otherwise append the path to FileList.txt
  Add-Content FileList.txt $values[2]
}

I call this script (saved as Modify.ps1) from a batch file using Powershell.exe -executionpolicy remotesigned -File Modify.ps1

Last step, run rclone sync using FileList.txt:
rclone sync src: dst: --files-from FileList.txt --delete-excluded

Hope this helps someone else. There's probably a better way of doing it, but for right now this works, thanks Nick!


Good work both of you 🙂

That's pretty much what I would have done, but I'd have done it in python since I don't know powershell!
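For anyone who does want a Python version, here's a rough sketch of the same cumulative filter. It's untested and assumes the ReverseSort.txt / FileList.txt names, the 945GB budget, and the tsp (timestamp;size;path) format used above:

# Rough Python take on the PowerShell filter: read the reverse-sorted
# tsp listing and keep paths until the running total hits the budget.
BUDGET = 945 * 1024**3  # 945 GiB in bytes

total = 0
with open("ReverseSort.txt", encoding="utf-8") as src, \
     open("FileList.txt", "w", encoding="utf-8") as dst:
    for raw in src:
        line = raw.rstrip("\n")
        if not line:
            continue
        # Split at most twice so semicolons inside the path survive
        _, size, path = line.split(";", 2)
        total += int(size)
        if total >= BUDGET:
            break
        dst.write(path + "\n")

The resulting FileList.txt is then used with rclone sync --files-from exactly as in the post above.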
