What is the problem you are having with rclone?
rclone size on a large Dropbox remote with millions of files takes a very long time. After leaving it running for an hour, I lost patience. There are also "too many requests" errors along the way that slow it down further. Interestingly, this was not an issue with Google Drive.
In the interim, is there a way to get just the size of a remote directory? That should be much faster, since the web UI can calculate a folder's size in a few seconds.
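To illustrate what I mean, here is the sort of command I'd like to be fast (MyFolder is a hypothetical example path; as far as I can tell, rclone size still has to list every object under it):

rclone size dbox:MyFolder
rclone size dbox:MyFolder --json    # JSON output, handy for scripting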
Run the command 'rclone version' and share the full output of the command.
rclone v1.62.2
- os/version: darwin 13.3 (64 bit)
- os/kernel: 22.4.0 (arm64)
- os/type: darwin
- os/arch: arm64 (ARMv8 compatible)
- go/version: go1.20.2
- go/linking: dynamic
- go/tags: cmount
Which cloud storage system are you using? (eg Google Drive)
Dropbox
The command you were trying to run (eg rclone copy /tmp remote:tmp)
rclone size dbox:
rclone size dbox: --tpslimit 12 --tpslimit-burst 12
The latter command doesn't run into the "too many requests" error but is still slow.
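If I understand the docs correctly, the Dropbox backend supports recursive listing (ListR), so my next experiment is to add --fast-list in the hope of reducing the number of API calls, though I haven't verified how much it helps here:

rclone size dbox: --fast-list --tpslimit 12 --tpslimit-burst 12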
The rclone config contents with secrets removed.
[dbox]
type = dropbox
token = {"access_token":"<REDACTED>","token_type":"bearer","refresh_token":"<REDACTED>","expiry":"<REDACTED>"}
client_id = <REDACTED>
client_secret = <REDACTED>
A log from the command with the -vv flag
2023/06/15 04:18:21 DEBUG : rclone: Version "v1.62.2" starting with parameters ["rclone" "size" "dbox:" "-vv"]
2023/06/15 04:18:21 DEBUG : Creating backend with remote "dbox:"
2023/06/15 04:18:21 DEBUG : Using config file from "/Users/fusionx/.config/rclone/rclone.conf"