What is the problem you are having with rclone?
My typical rclone move command "failed" with a batch commit error and deleted my files locally.
2023-01-03 13:06:17 DEBUG : Local file system at /mnt/d1/linux/isos/: deleted 16 directories
Transferred: 56.837 GiB / 56.837 GiB, 100%, 135.897 MiB/s, ETA 0s
Checks: 332 / 332, 100%
Deleted: 166 (files), 16 (dirs)
Renamed: 166
Transferred: 166 / 166, 100%
Elapsed time: 2m52.0s
2023/01/03 13:06:17 INFO :
Transferred: 56.837 GiB / 56.837 GiB, 100%, 135.897 MiB/s, ETA 0s
Checks: 332 / 332, 100%
Deleted: 166 (files), 16 (dirs)
Renamed: 166
Transferred: 166 / 166, 100%
Elapsed time: 2m52.0s
2023/01/03 13:06:17 DEBUG : 34 go routines active
2023/01/03 13:06:17 INFO : Dropbox root 'dev/诲⚂縫疋ꜙ瞏泟响鱟/彆ᚣ䴧䭘絯嚫堺燭ڿ': Committing uploads - please wait...
2023/01/03 13:06:19 DEBUG : pacer: Reducing sleep to 28.476562ms
2023/01/03 13:06:19 ERROR : Dropbox root 'dev/诲⚂縫疋ꜙ瞏泟响鱟/彆ᚣ䴧䭘絯嚫堺燭ڿ': async batch commit: failed to commit batch length 166: batch had 166 errors: last error: too_many_write_operations
Run the command 'rclone version' and share the full output of the command.
rclone v1.62.0-DEV
- os/version: debian 11.6 (64 bit)
- os/kernel: 5.10.0-19-amd64 (x86_64)
- os/type: linux
- os/arch: amd64
- go/version: go1.18.1
- go/linking: dynamic
- go/tags: none
Latest Git commit 98fa93f6d17909e9c6b9778a4dc9e67c58151f6a
Which cloud storage system are you using? (eg Google Drive)
Dropbox + crypt (a crypt remote layered on top of Dropbox)
The command you were trying to run (eg rclone copy /tmp remote:tmp)
rclone move $PWD dbc2:/vvvvvv/9999/ --delete-empty-src-dirs --create-empty-src-dirs --exclude-from ~/.config/rclone/excludes.txt --tpslimit 12 --tpslimit-burst 12 --disable-http2 --transfers 10 --checkers 10 -P -vv --min-size 1B --stats-file-name-length 0
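For reference, once this is understood I was thinking of retrying with synchronous batch commits forced on the command line, so that a failed commit surfaces before the source files get deleted. This is only a sketch based on my reading of the docs; I'm assuming the --dropbox-batch-* backend flags override the batch settings in my config (shown below):

# same move, but commit batches synchronously and in smaller chunks
rclone move $PWD dbc2:/vvvvvv/9999/ --delete-empty-src-dirs --create-empty-src-dirs \
  --dropbox-batch-mode sync --dropbox-batch-size 100 \
  --tpslimit 12 --tpslimit-burst 12 --transfers 10 --checkers 10 -P -vv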
The rclone config contents with secrets removed.
[db]
type = dropbox
client_id = 123
client_secret = 456
chunk_size = 128Mi
batch_mode = async
batch_size = 700
token = {"access_token":"yes","token_type":"bearer","refresh_token":"no","expiry":"2023-01-22T16:48:26.524930825+01:00"}
[dbc]
type = crypt
remote = db:/dev
password = yeahprobably
password2 = isurehopeso
filename_encoding = base32768
The base32768 filename encoding is as per Base32768 to compress filename length - #12 by Max-Sum, used to get around Dropbox's path/file length restrictions, and I suspect the problem lies somewhere in here...
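To test that suspicion, the encrypted names can be inspected directly with cryptdecode in reverse mode and their lengths compared against Dropbox's limits. A minimal example (the filename below is made up, and I'm using the dbc remote from the config above):

# print the encrypted (base32768) form of a hypothetical long plaintext name
rclone cryptdecode --reverse dbc: "a fairly long plaintext file name.iso"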
A log from the command with the -vv flag
2023-01-03 13:06:13 DEBUG : x/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx/|CCCCCCEEE X XXCDDDDXX][ 11X|| 12354QC|: Making directory
2023-01-03 13:06:14 DEBUG : x/yyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy/|CCCCCCEEE X XXCDDDDXX|| 11X|| 12345QC|: Making directory
2023-01-03 13:06:14 DEBUG : pacer: low level retry 1/10 (error too_many_write_operations/.)
2023-01-03 13:06:14 DEBUG : pacer: Rate limited, increasing sleep to 20ms
2023-01-03 13:06:15 DEBUG : pacer: low level retry 2/10 (error too_many_write_operations/...)
2023-01-03 13:06:15 DEBUG : pacer: Rate limited, increasing sleep to 40ms
2023-01-03 13:06:15 DEBUG : pacer: Reducing sleep to 30ms
2023-01-03 13:06:15 DEBUG : x/zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz: Making directory
2023-01-03 13:06:15 DEBUG : pacer: Reducing sleep to 22.5ms
2023-01-03 13:06:16 DEBUG : pacer: low level retry 1/10 (error too_many_write_operations/)
2023-01-03 13:06:16 DEBUG : pacer: Rate limited, increasing sleep to 45ms
2023-01-03 13:06:16 DEBUG : Dropbox root 'dev/诲⚂縫疋ꜙ瞏泟响鱟/彆ᚣ䴧䭘絯嚫堺燭ڿ': Batch idle for 10s so committing
2023-01-03 13:06:16 DEBUG : Dropbox root 'dev/诲⚂縫疋ꜙ瞏泟响鱟/彆ᚣ䴧䭘絯嚫堺燭ڿ': Committing async batch length 166 starting with: /dev/诲⚂縫疋ꜙ瞏泟响鱟/彆ᚣ䴧䭘絯嚫堺燭ڿ/燫澙ᆕ凖㭣颙㽜㑮江/燼䶉檶擴㩭玴哯曺汖摆餙⢩柷ꐺ㖆ᨼ渡幀䄍呔哯ꐕ艃䭥镒㝿/零躝嗱嚓磴咙䶅⺴崿䙓ᖽ㶂鳊ᇨ覛峄㕻洮拗韩䆾蛚ꉭ蔥鯂悴嬋匚瘴ꍈ㑸长㿲瞈ɿ
2023-01-03 13:06:16 DEBUG : pacer: Reducing sleep to 33.75ms
2023-01-03 13:06:16 DEBUG : E/eeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee/EEEEEE: Making directory
2023-01-03 13:06:16 DEBUG : pacer: Reducing sleep to 25.3125ms
2023-01-03 13:06:17 DEBUG : pacer: low level retry 1/10 (error too_many_write_operations/..)
2023-01-03 13:06:17 DEBUG : pacer: Rate limited, increasing sleep to 50.625ms
2023-01-03 13:06:17 DEBUG : pacer: Reducing sleep to 37.96875ms
2023-01-03 13:06:17 DEBUG : Encrypted drive 'dbc2:/vvvvvv/9999/': copied 16 directories
...
2023-01-03 13:06:17 INFO : D: Removing directory
2023-01-03 13:06:17 DEBUG : Local file system at /mnt/d1/linux/isos/: deleted 16 directories
Transferred: 56.837 GiB / 56.837 GiB, 100%, 135.897 MiB/s, ETA 0s
Checks: 332 / 332, 100%
Deleted: 166 (files), 16 (dirs)
Renamed: 166
Transferred: 166 / 166, 100%
Elapsed time: 2m52.0s
2023/01/03 13:06:17 INFO :
Transferred: 56.837 GiB / 56.837 GiB, 100%, 135.897 MiB/s, ETA 0s
Checks: 332 / 332, 100%
Deleted: 166 (files), 16 (dirs)
Renamed: 166
Transferred: 166 / 166, 100%
Elapsed time: 2m52.0s
2023/01/03 13:06:17 DEBUG : 34 go routines active
2023/01/03 13:06:17 INFO : Dropbox root 'dev/诲⚂縫疋ꜙ瞏泟响鱟/彆ᚣ䴧䭘絯嚫堺燭ڿ': Committing uploads - please wait...
2023/01/03 13:06:19 DEBUG : pacer: Reducing sleep to 28.476562ms
2023/01/03 13:06:19 ERROR : Dropbox root 'dev/诲⚂縫疋ꜙ瞏泟响鱟/彆ᚣ䴧䭘絯嚫堺燭ڿ': async batch commit: failed to commit batch length 166: batch had 166 errors: last error: too_many_write_operations
Plaintext filenames have been censored, but let me know if that causes any issues.
I did try running a cryptcheck command (rclone cryptcheck db:/诲⚂縫疋ꜙ瞏泟响鱟/彆ᚣ䴧䭘絯嚫堺燭ڿ/ dbc:/vvvvvv/9999/ -vvv), but the output was not helpful:
2023/01/03 13:29:06 ERROR : Dropbox root '诲⚂縫疋ꜙ瞏泟响鱟/彆ᚣ䴧䭘絯嚫堺燭ڿ': error reading source root directory: directory not found
2023/01/03 13:29:06 NOTICE: Encrypted drive 'dbc:/vvvvvv/9999/': 1 differences found
2023/01/03 13:29:06 NOTICE: Encrypted drive 'dbc:/vvvvvv/9999/': 1 errors while checking
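Possibly that failure is my own mistake rather than anything useful: the crypt remote wraps db:/dev, and the move log shows its Dropbox root as 'dev/诲⚂縫疋ꜙ瞏泟响鱟/彆ᚣ䴧䭘絯嚫堺燭ڿ', while my cryptcheck source path left out the dev/ prefix. Something like this (the path is my guess) should at least confirm whether that encrypted directory still exists:

rclone lsf "db:/dev/诲⚂縫疋ꜙ瞏泟响鱟/彆ᚣ䴧䭘絯嚫堺燭ڿ/" -vv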
The files I was moving have been removed from my local disks, and I can't find them on Dropbox using their web interface or search.
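One thing I realise while writing this: the names stored on Dropbox are encrypted (base32768), so searching for the plaintext names on the Dropbox website will never match anything. To check whether any of the data actually landed despite the batch error, I could look through the crypt remote instead, e.g.:

# list and size the destination through the crypt remote, which decrypts the names
rclone lsf -R dbc2:/vvvvvv/9999/ | head
rclone size dbc2:/vvvvvv/9999/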