What is the problem you are having with rclone?
rclone lsf on the Local Filesystem (a local directory) is taking a long time. Are there any flags I can add to increase its processing speed and make it more performant?
Run the command 'rclone version' and share the full output of the command.
rclone v1.66.0
- os/version: debian bookworm/sid (64 bit)
- os/kernel: 6.5.0-28-generic (x86_64)
- os/type: linux
- os/arch: amd64
- go/version: go1.22.1
- go/linking: static
- go/tags: none
Which cloud storage system are you using? (eg Google Drive)
Local Filesystem
The command you were trying to run (eg rclone copy /tmp remote:tmp)
rclone lsf --format "hstp" --hash "SHA1" --files-only --recursive --csv --links --human-readable --exclude-from "~/Downloads/Rclone/exclude-from-file.txt" --ignore-case --log-file "~/Downloads/Rclone/rclone-lsf-log-file.log" --verbose=2 "/media/laweitech/MY-DRIVE/Z-LARGE-DIRECTORY" > ~/Downloads/Rclone/local-metadata.csv
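In addition to the command above, to help narrow down where the time is going I am planning to also time the same listing without the hash column. This is just a sketch on my side (local-metadata-nohash.csv is only a scratch filename), otherwise the same excludes and options:

rclone lsf --format "stp" --files-only --recursive --csv --links --human-readable --exclude-from "~/Downloads/Rclone/exclude-from-file.txt" --ignore-case "/media/laweitech/MY-DRIVE/Z-LARGE-DIRECTORY" > ~/Downloads/Rclone/local-metadata-nohash.csv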
A log from the command that you were trying to run with the -vv flag
2024/05/07 15:32:00 DEBUG : rclone: Version "v1.66.0" starting with parameters ["rclone" "lsf" "--format" "hstp" "--hash" "SHA1" "--files-only" "--recursive" "--csv" "--config" "/home/laweitech/.config/rclone/rclone.conf" "--links" "--human-readable" "--exclude-from" "/home/laweitech/Downloads/Rclone/exclude-from-file.txt" "--ignore-case" "--log-file" "/home/laweitech/Downloads/Rclone/rclone-lsf-log-file.log" "--verbose=2" "/media/laweitech/MY-DRIVE/Z-LARGE-DIRECTORY"]
2024/05/07 15:32:00 DEBUG : Creating backend with remote "/media/laweitech/MY-DRIVE/Z-LARGE-DIRECTORY"
2024/05/07 15:32:00 DEBUG : Using RCLONE_CONFIG_PASS password.
2024/05/07 15:32:00 DEBUG : Using config file from "/home/laweitech/.config/rclone/rclone.conf"
2024/05/07 15:32:00 DEBUG : local: detected overridden config - adding "{b6816}" suffix to name
2024/05/07 15:32:00 DEBUG : fs cache: renaming cache item "/media/laweitech/MY-DRIVE/Z-LARGE-DIRECTORY" to be canonical "local{b6816}:/media/laweitech/MY-DRIVE/Z-LARGE-DIRECTORY"
2024/05/07 15:38:14 DEBUG : Scoop/persist/bitwarden/bitwarden-appdata/Cache: Excluded
2024/05/07 15:39:58 DEBUG : Scoop/persist/brave/User Data/GrShaderCache: Excluded
Further Information
"/media/laweitech/MY-DRIVE/Z-LARGE-DIRECTORY" contains 1,348,080 items, totalling 415.0 GB.
After running the above command, it takes 4 hours, 35 minutes, and 29 seconds to complete and save the results to local-metadata.csv.
Is there any way I can reduce the processing time to something lower than 4 hours, 35 minutes, and 29 seconds? Do I need to add a flag like --fast-list?
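For context, 4 hours, 35 minutes, and 29 seconds is roughly 16,500 seconds, so 415.0 GB works out to only about 25 MB/s. That makes me suspect most of the time is rclone reading every file to compute the SHA1 hashes rather than the directory walk itself. From what I have read, --fast-list only applies to bucket-based remotes (S3, B2, etc.) and is ignored on the local filesystem, so I am not sure it would help, but please correct me if that is wrong. Would raising the number of checkers make a difference for lsf on the local backend, for example something like the following (the value 16 is just a guess on my part), or is this simply limited by the drive's read speed?

rclone lsf --format "hstp" --hash "SHA1" --checkers 16 --files-only --recursive --csv --links --human-readable --exclude-from "~/Downloads/Rclone/exclude-from-file.txt" --ignore-case "/media/laweitech/MY-DRIVE/Z-LARGE-DIRECTORY" > ~/Downloads/Rclone/local-metadata.csv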