What is the problem you are having with rclone?
rclone lsf on a pCloud remote is taking very long to list a large directory.
Run the command 'rclone version' and share the full output of the command.
rclone v1.66.0
- os/version: debian bookworm/sid (64 bit)
- os/kernel: 6.5.0-28-generic (x86_64)
- os/type: linux
- os/arch: amd64
- go/version: go1.22.1
- go/linking: static
- go/tags: none
Which cloud storage system are you using? (eg Google Drive)
pCloud
The command you were trying to run (eg rclone copy /tmp remote:tmp)
rclone lsf --hash "SHA1" --files-only --recursive --csv --checkers=8 --fast-list --links --human-readable --ignore-errors --ignore-case --checksum --low-level-retries=100 --retries=3 --tpslimit 100 --verbose=2 --log-file "$HOME/Downloads/Rclone/rclone-lsf-log-file.log" "PcloudChunker:/Z-LARGE-DIRECTORY" > ~/Downloads/Rclone/remote-metadata.csv
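For comparison, I could first run a names-only listing to check whether the SHA1 lookup is what is slow (just a sketch of what I might try; the log and output file names here are placeholders):

rclone lsf --files-only --recursive --csv --tpslimit 100 --verbose=2 --log-file "$HOME/Downloads/Rclone/rclone-lsf-nohash.log" "PcloudChunker:/Z-LARGE-DIRECTORY" > ~/Downloads/Rclone/remote-names-only.csv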
Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.
[Pcloud]
type = pcloud
client_id = XXX
client_secret = XXX
hostname = eapi.pcloud.com
username = XXX
password = XXX
token = XXX
[PcloudChunker]
type = chunker
remote = Pcloud:
description = Chunker for my pCloud remote; each chunk is the size given by the chunk_size parameter in rclone.conf
chunk_size = 1Gi
hash_type = sha1
### Double check the config for sensitive info before posting publicly
A log from the command that you were trying to run with the -vv flag
2024/04/26 17:11:09 INFO : Starting transaction limiter: max 100 transactions/s with burst 1
2024/04/26 17:11:09 DEBUG : rclone: Version "v1.66.0" starting with parameters ["rclone" "lsf" "--hash" "SHA1" "--files-only" "--recursive" "--csv" "--checkers=8" "--fast-list" "--links" "--human-readable" "--ignore-errors" "--ignore-case" "--checksum" "--low-level-retries=100" "--retries=3" "--tpslimit" "100" "--verbose=2" "--log-file" "/home/laweitech/Downloads/Rclone/rclone-lsf-log-file.log" "--config" "/home/laweitech/.config/rclone/rclone.conf" "PcloudChunker:/Z-LARGE-DIRECTORY"]
2024/04/26 17:11:09 DEBUG : Creating backend with remote "PcloudChunker:/Z-LARGE-DIRECTORY"
2024/04/26 17:11:16 DEBUG : Using config file from "/home/laweitech/.config/rclone/rclone.conf"
2024/04/26 17:11:16 DEBUG : Creating backend with remote "Pcloud:/Z-LARGE-DIRECTORY"
2024/04/26 17:11:17 DEBUG : fs cache: renaming cache item "Pcloud:/Z-LARGE-DIRECTORY" to be canonical "Pcloud:Z-LARGE-DIRECTORY"
2024/04/26 17:11:17 DEBUG : Reset feature "ListR"
2024/04/26 17:11:18 DEBUG : 5 go routines active
The directory 'Pcloud:Z-LARGE-DIRECTORY' contains 1,348,080 items, totalling 415.0 GB
I let it run for 12 hours 56 minutes, and ~/Downloads/Rclone/remote-metadata.csv contains only 54,280 records so far.
How do I get it to run faster? Should I increase the checkers, e.g. --checkers=8000?
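For example, one variant I am considering (untested; the numbers are guesses on my part) raises --checkers moderately and drops --tpslimit, since capping at 100 transactions/s would limit the request rate no matter how many checkers run:

rclone lsf --hash "SHA1" --files-only --recursive --csv --checkers=32 --fast-list --links --human-readable --ignore-errors --ignore-case --low-level-retries=100 --retries=3 --verbose=2 --log-file "$HOME/Downloads/Rclone/rclone-lsf-log-file.log" "PcloudChunker:/Z-LARGE-DIRECTORY" > ~/Downloads/Rclone/remote-metadata.csv

I also dropped --checksum here since, as far as I understand, it only affects comparisons in transfer commands like sync/copy/check and does nothing for lsf.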