What is the problem you are having with rclone?
Copying a tree of 35 files with rclone copy from local disk to Backblaze B2, I see frequent corrupted on transfer: sizes differ errors with multipart file uploads. In each case, rclone reports that the destination file size is zero, e.g.:
2023/09/18 17:20:13 ERROR : catalog_sales/20221122_161347_00795_djagr_3a042953-d0a2-4b8d-8c4e-6a88df245253: corrupted on transfer: sizes differ 244695498 vs 0
Six files in the tree are big enough for multipart uploads; three transfer correctly and three fail with corrupted on transfer: sizes differ. On each of the two subsequent retries, the same three files succeed and the same three fail.
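For context, the multipart threshold here should be the default --b2-upload-cutoff, which I believe is 200 MiB; the failing file above is 244695498 bytes (~233 MiB), comfortably over it. Assuming that default, a quick way to list the candidate files is something like:
# rclone lsl tpcds-benchmark | awk '$1 > 200*1024*1024'
(rclone lsl prints the size in bytes as the first field, so the awk filter keeps only files over the cutoff.)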
Digging deeper: helpfully, rclone just hides the files in B2 on failure, so I can download them. They all match the source files. Further, if I use --ignore-size, the copy succeeds with no errors, and a subsequent rclone check ... --download shows that all of the files in the destination are identical to those in the source.
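For anyone reproducing this, the workaround and verification commands were along these lines (same source tree and destination as the failing copy):
# rclone copy --ignore-size tpcds-benchmark b2remote:metadaddy-rclone/tpcds-benchmark
# rclone check --download tpcds-benchmark b2remote:metadaddy-rclone/tpcds-benchmark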
So, is the b2_finish_large_file API erroneously reporting zero file size? I ran with --dump headers,responses, and the b2_finish_large_file response contains the expected contentLength:
2023/09/18 17:20:13 DEBUG : catalog_sales/20221122_161347_00795_djagr_3a042953-d0a2-4b8d-8c4e-6a88df245253: Finishing large file upload with 3 parts
2023/09/18 17:20:13 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2023/09/18 17:20:13 DEBUG : HTTP REQUEST (req 0xc000c74c00)
2023/09/18 17:20:13 DEBUG : POST /b2api/v1/b2_finish_large_file HTTP/1.1
Host: api004.backblazeb2.com
User-Agent: rclone/v1.64.0
Content-Length: 259
Authorization: XXXX
Content-Type: application/json
Accept-Encoding: gzip
2023/09/18 17:20:13 DEBUG : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2023/09/18 17:20:13 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2023/09/18 17:20:13 DEBUG : HTTP RESPONSE (req 0xc000c74c00)
2023/09/18 17:20:13 DEBUG : HTTP/1.1 200
Content-Length: 854
Cache-Control: max-age=0, no-cache, no-store
Content-Type: application/json;charset=UTF-8
Date: Mon, 18 Sep 2023 17:20:13 GMT
{
    "accountId": "15f935cf4dcb",
    "action": "upload",
    "bucketId": "91a5cf5973f5dc4f848d0c1b",
    "contentLength": 244695498,
    "contentMd5": null,
    "contentSha1": "none",
    "contentType": "application/octet-stream",
    "fileId": "4_z91a5cf5973f5dc4f848d0c1b_f221926aaeb9f362c_d20230918_m172005_c004_v0402008_t0028_u01695057605512",
    "fileInfo": {
        "src_last_modified_millis": "1669152496000",
        "large_file_sha1": "af2db424fe5092a7933e1ca128c0613a1953342f"
    },
    "fileName": "tpcds-benchmark/catalog_sales/20221122_161347_00795_djagr_3a042953-d0a2-4b8d-8c4e-6a88df245253",
    "fileRetention": {
        "isClientAuthorizedToRead": false,
        "value": null
    },
    "legalHold": {
        "isClientAuthorizedToRead": false,
        "value": null
    },
    "serverSideEncryption": {
        "algorithm": null,
        "mode": null
    },
    "uploadTimestamp": 1695057605512
}
2023/09/18 17:20:13 DEBUG : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2023/09/18 17:20:13 ERROR : catalog_sales/20221122_161347_00795_djagr_3a042953-d0a2-4b8d-8c4e-6a88df245253: corrupted on transfer: sizes differ 244695498 vs 0
It seems that, somehow, that size is sometimes being zeroed out between receiving the API response and rclone's size check.
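One more data point that might help narrow it down: asking the remote what size it thinks the finished file has, e.g. (possibly with --b2-versions, since rclone hid the failed files):
# rclone lsjson b2remote:metadaddy-rclone/tpcds-benchmark/catalog_sales
Given that rclone check --download already passed, I'd expect the Size field here to show the full 244695498 bytes, which would point at the post-transfer size comparison on the rclone side rather than at B2.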
Run the command 'rclone version' and share the full output of the command.
# rclone version
rclone v1.64.0
- os/version: debian 11.7 (64 bit)
- os/kernel: 5.10.0-23-amd64 (x86_64)
- os/type: linux
- os/arch: amd64
- go/version: go1.21.1
- go/linking: static
- go/tags: none
Which cloud storage system are you using? (eg Google Drive)
Backblaze B2
The command you were trying to run (eg rclone copy /tmp remote:tmp)
rclone --no-check-dest copy tpcds-benchmark b2remote:metadaddy-rclone/tpcds-benchmark
I added --dump headers,responses -vv during my investigation.
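So the full invocation while capturing the log above was approximately (rclone writes its log to stderr, hence the redirect):
# rclone --no-check-dest --dump headers,responses -vv copy tpcds-benchmark b2remote:metadaddy-rclone/tpcds-benchmark 2>rclone.log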
The rclone config contents with secrets removed.
{
    "b2crypt": {
        "password": "*****",
        "remote": "b2remote:metadaddy-rclone",
        "type": "crypt"
    },
    "b2remote": {
        "account": "*****",
        "key": "*****",
        "type": "b2"
    },
    "s3remote": {
        "access_key_id": "*****",
        "location_constraint": "us-west-1",
        "provider": "AWS",
        "region": "us-west-1",
        "secret_access_key": "*****",
        "type": "s3"
    }
}