Rclone cannot complete upload to B2, restarts upload frequently

What is the problem you are having with rclone?

Rclone is not able to complete file uploads to my Backblaze B2 bucket. After about 50-60 minutes (consistently), the upload is abandoned and restarted, leaving dozens of incomplete large file uploads in B2 (as seen in my Backblaze portal).

The latest backup attempt involved only 1 file roughly 1.11 TB in size.

Run the command 'rclone version' and share the full output of the command.

rclone v1.61.1
- os/version: Microsoft Windows 10 Education 20H2 (64 bit)
- os/kernel: 10.0.19042.1526 (x86_64)
- os/type: windows
- os/arch: amd64
- go/version: go1.19.4
- go/linking: static
- go/tags: cmount

Which cloud storage system are you using? (eg Google Drive)

Backblaze B2

The command you were trying to run (eg rclone copy /tmp remote:tmp)

I use a service manager to automatically mount Backblaze B2 on Windows startup using the following command:

rclone --log-file=RCLONE_LOG.txt --transfers=16 --vfs-cache-mode full --cache-dir=D:\ --config=C:\Users\Dave\AppData\Roaming\rclone\rclone.conf mount BackBlaze: B:

The rclone config contents with secrets removed.

[BackBlaze]
type = b2
account = #REDACTED#
key = #REDACTED#

A log from the command with the -vv flag

This log is a bit long. I let it run until rclone abandoned and restarted the upload. Note the runtime error at the end...

https://pastebin.com/raw/kZ6HKBFy

What happens if you directly use sync or copy or copyto instead of mounting?

If you stick with mounting, how big is your cache drive?

I'm happy to try a sync or copy, but I really want to use a mount long term. The mounted drive is set as the destination for my backup software. My cache drive is 2 TB (1.81 TB formatted) and is used almost exclusively for caching, not storage.

The error is this:

panic: runtime error: slice bounds out of range [:122683392] with capacity 100663296

goroutine 61 [running]:
github.com/rclone/rclone/backend/b2.(*largeUpload).Upload.func2()
	github.com/rclone/rclone/backend/b2/upload.go:453 +0x385

Which is a bug.

The relevant logs are these:

2023/01/21 15:39:52 DEBUG : Backup/6F039FE9BFB937ED-Full-January-20-2023--00-00.mrimg: size: 1.112Ti, parts: 10000, default: 96Mi, new: 117Mi; default chunk size insufficient, returned new chunk size
2023/01/21 16:31:23 DEBUG : Backup/6F039FE9BFB937ED-Full-January-20-2023--00-00.mrimg: Starting upload of large file in 9962 chunks (id "4_z84d34c99e0b5dfdf76c20b18_f20789b095f789965_d20230121_m213121_c002_v0001118_t0046_u01674336681551")

For context, here is everything logged for that file:

ncw@dogger:/tmp$ grep Backup/6F039FE9BFB937ED-Full-January-20-2023--00-00.mrimg b2-upload-bug.txt | grep -v RemoveNotInUse
2023/01/21 15:39:47 DEBUG : Backup/6F039FE9BFB937ED-Full-January-20-2023--00-00.mrimg: vfs cache: truncate to size=1222159055049 (not needed as size correct)
2023/01/21 15:39:47 DEBUG : Backup/6F039FE9BFB937ED-Full-January-20-2023--00-00.mrimg: vfs cache: setting modification time to 2023-01-21 00:16:52.8900445 -0500 EST
2023/01/21 15:39:47 INFO  : Backup/6F039FE9BFB937ED-Full-January-20-2023--00-00.mrimg: vfs cache: queuing for upload in 5s
2023/01/21 15:39:52 DEBUG : Backup/6F039FE9BFB937ED-Full-January-20-2023--00-00.mrimg: vfs cache: starting upload
2023/01/21 15:39:52 DEBUG : Backup/6F039FE9BFB937ED-Full-January-20-2023--00-00.mrimg: size: 1.112Ti, parts: 10000, default: 96Mi, new: 117Mi; default chunk size insufficient, returned new chunk size
2023/01/21 16:31:23 DEBUG : Backup/6F039FE9BFB937ED-Full-January-20-2023--00-00.mrimg: Starting upload of large file in 9962 chunks (id "4_z84d34c99e0b5dfdf76c20b18_f20789b095f789965_d20230121_m213121_c002_v0001118_t0046_u01674336681551")

So it looks like the chunk-size calculator correctly worked out that it needed a larger chunk size (117 MiB), but the uploader still allocated its buffers at the default size (96 MiB).
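The numbers in the panic bear this out. Here is a small sketch (not rclone's actual code) of how a B2-style chunk-size calculation arrives at 117 MiB for this file: B2 caps a large file at 10,000 parts, so files over about 937 GiB need a larger-than-default chunk, rounded up to a whole MiB. The constants below are taken from the log and the panic message.

```python
# Sketch (not rclone source) of B2-style chunk sizing.
B2_MAX_PARTS = 10_000
DEFAULT_CHUNK = 96 * 1024 * 1024          # 96 MiB = 100663296 bytes

def required_chunk_size(file_size: int) -> int:
    """Smallest whole-MiB chunk size that fits file_size into B2_MAX_PARTS parts."""
    if file_size <= DEFAULT_CHUNK * B2_MAX_PARTS:
        return DEFAULT_CHUNK
    mib = 1024 * 1024
    per_part = -(-file_size // B2_MAX_PARTS)   # ceil division
    return -(-per_part // mib) * mib           # round up to a whole MiB

size = 1_222_159_055_049                       # the 1.112 TiB file from the log
chunk = required_chunk_size(size)
parts = -(-size // chunk)

print(chunk)          # 122683392 -- the slice upper bound in the panic (117 MiB)
print(DEFAULT_CHUNK)  # 100663296 -- the slice capacity in the panic (96 MiB)
print(parts)          # 9962      -- matches "Starting upload ... in 9962 chunks"
```

Slicing a buffer with 96 MiB capacity out to 117 MiB is exactly the `slice bounds out of range [:122683392] with capacity 100663296` panic above.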

Please try this fix:

v1.62.0-beta.6697.351fc609b.fix-b2-upload-chunk on branch fix-b2-upload-chunk (uploaded in 15-30 mins)

It looks like the beta has fixed the problem! I managed to upload my entire 1.11 TB backup without issue today, on the first try. Thank you so much for the help!! 🥳

Hopefully the bugfix will benefit others as well.

I've merged this to master now, which means it will be in the latest beta in 15-30 minutes and released in v1.62.

If we do a v1.61.2 then I'll put it in that too.

Thank you for testing. Hopefully it continues working. If not, let me know!

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.