Extracting data from log files on Linux

What is the problem you are having with rclone?

I need to identify large files from the logs for a restore. Logging the output while using the --dry-run switch gets me the filenames and sizes; I'm just unsure how best to either sort them from largest to smallest and/or list only those above a given size.

Log entries look something like this:

2024/03/11 13:42:59 NOTICE: 2022-03-01/videos/Family/Xmas2020/GX85/P1290689.MP4: Skipped copy as --dry-run is set (size 514.984Mi)
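
Presumably the manual way would be to pull the size out of the trailing parentheses, convert the Ki/Mi/Gi suffix to bytes and sort on that, so roughly the shape of the sketch below (which assumes the log is saved as rclone.log and the size is always the final field), but I don't know if that's a sensible way to go about it:

  awk '/Skipped copy/ {
      sz = $NF                              # last field, e.g. "514.984Mi)"
      gsub(/[()]/, "", sz)                  # strip the bracket -> "514.984Mi"
      n = sz + 0                            # numeric part, e.g. 514.984
      unit = sz; gsub(/[0-9.]/, "", unit)   # suffix part, e.g. "Mi"
      mult = 1
      if (unit == "Ki") mult = 1024
      else if (unit == "Mi") mult = 1024 * 1024
      else if (unit == "Gi") mult = 1024 * 1024 * 1024
      else if (unit == "Ti") mult = 1024 * 1024 * 1024 * 1024
      printf "%.0f\t%s\n", n * mult, $0     # prefix each line with its size in bytes
  }' rclone.log | sort -rn | cut -f2-       # sort largest-first, then drop the prefix

A size threshold could presumably go in there too, e.g. only printing lines where n * mult is above some number of bytes, but it feels like there should be a cleaner way.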

Run the command 'rclone version' and share the full output of the command.

rclone v1.65.2

  • os/version: debian 11.9 (64 bit)
  • os/kernel: 6.1.0-0.deb11.13-amd64 (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.21.6
  • go/linking: static
  • go/tags: none

Which cloud storage system are you using? (eg Google Drive)

Google Drive

Linux has plenty of tools you can use (awk, sed, etc.) if you enjoy scripting, but why not use the filtering functionality rclone already has built in? Among many others, there are --min-size and --max-size flags available.
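
For example, something along these lines limits a transfer (or a --dry-run pass) to files above a given size; the remote name, destination path and 250M threshold are just placeholders:

  rclone copy gdrive:backup /local/restore --min-size 250M --dry-run

Size values take the usual rclone suffixes (K, M, G, meaning KiB, MiB, GiB), and --max-size works the same way in the other direction. Alternatively, rclone lsl prints the size in bytes followed by the path for every object, so it pipes straight into sort -rn if a largest-first listing is all you need.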

Thanks, I ended up using --max-size, which will do the job for this requirement.

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.