Sync B2 US -> B2 EU unexpected EOF

#### What is the problem you are having with rclone?
Since the OVH datacenter fire, I decided to make a copy of my B2 buckets in their EU datacenter.

- I am running this on a VM in Los Angeles and the B2 DC is in LA area as well.
- B2 EU seems to be in Amsterdam 
- The VM has 10Gbit.

I used:

```
rclone sync -P --transfers 45 --max-size 25M b2:me b2eu:me2
```

This worked well for ~60k files, averaging ~35 MB/s.

Then I switched to `--transfers 12` for the larger files and the problems began. For anything over 1 GB I would repeatedly get `Reopening on read failure after x bytes: retry y/10: unexpected EOF`.
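If I understand that message right, the `y/10` counter comes from rclone's `--low-level-retries` flag, which defaults to 10. A sketch of raising it while testing (the remote names are from my config; the flag value is arbitrary):

```shell
# Assumption: the "retry y/10" counter is governed by --low-level-retries
# (rclone's default is 10), so bumping it gives each chunk more chances
# to survive a dropped read.
rclone sync -P --transfers 12 --low-level-retries 20 b2:me b2eu:me2
```
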

With 12 transfers going I would see file 'completions' of 200-350% - this was eating up my bandwidth charges, so I set out to figure out what was going on.

I tried `--bwlimit 3M --transfers 1` and still got the same `Reopening on read failure after x bytes: retry y/10: unexpected EOF` errors.

I picked one file the problem was happening on and used `copyto` to download it locally; that worked fine - 20 MB/s, no read errors. I then used `copyto` from local to the remote, and that worked fine as well: 3.5 MB/s to an encrypted bucket and 5.5 MB/s to a non-encrypted one.
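For reference, the isolation test looked roughly like this (the path and file name are placeholders, not the real ones):

```shell
# Step 1: download the problem file from the US bucket to local disk.
# This worked fine (~20 MB/s, no read errors).
rclone copyto -P b2:me/some/dir/bigfile ./bigfile

# Step 2: upload the same file from local disk to the EU bucket.
# Also fine (~3.5 MB/s encrypted, ~5.5 MB/s non-encrypted).
rclone copyto -P ./bigfile b2eu:me2/some/dir/bigfile
```
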

So something seems amiss with `sync` - maybe a latency problem to the EU?

#### What is your rclone version (output from `rclone version`)
I used the stock 1.50.2 that came with Ubuntu 20.10, then after discovering the problem installed the v1.55.0 deb from GitHub releases.

#### Which OS you are using and how many bits (eg Windows 7, 64 bit)
Ubuntu 20.10 amd64

####  Which cloud storage system are you using? (eg Google Drive)
Backblaze B2


#### The command you were trying to run (eg `rclone copy /tmp remote:tmp`)  

```
rclone sync b2:me b2eu:me2
```



#### The rclone config contents with secrets removed.  

```
[b2]
type = b2
account = d
key = d

[b2eu]
type = b2
account = b
key = b
```




#### A log from the command with the `-vv` flag  

```
rclone sync -P -vv b2:me2/files/thatarebig/2gbfile.omg b2eu:metest/bl-2.omg
2021/04/01 05:31:59 DEBUG : Using RCLONE_CONFIG_PASS password.
2021/04/01 05:31:59 DEBUG : Using config file from "/home/me/.config/rclone/rclone.conf"
2021/04/01 05:31:59 DEBUG : rclone: Version "v1.55.0" starting with parameters ["rclone" "sync" "-P" "-vv" "b2:me2/files/thatarebig/2gbfile.omg" "b2eu:metest/bl-2.omg"]
2021/04/01 05:31:59 DEBUG : Creating backend with remote "b2:me2/files/thatarebig/2gbfile.omg"
2021/04/01 05:31:59 DEBUG : fs cache: adding new entry for parent of "b2:me2/files/thatarebig/2gbfile.omg", "b2:me2/files/thatarebig"
2021/04/01 05:31:59 DEBUG : Creating backend with remote "b2eu:metest/bl-2.omg"
2021/04/01 05:32:00 DEBUG : Couldn't decode error response: EOF
2021/04/01 05:32:00 DEBUG : fs cache: renaming cache item "b2eu:metest/bl-2.omg" to be canonical "b2eu:metest/bl-2.omg"
2021-04-01 05:32:00 DEBUG : Couldn't decode error response: EOF
2021-04-01 05:32:00 DEBUG : 2gbfile.omg: Need to transfer - File not found at Destination
2021-04-01 05:32:03 DEBUG : 2gbfile.omg: Starting upload of large file in 21 chunks (id "4_zd1b53b6282228fb47a88051e_f2021a929d5e7f570_d20210401_m043203_c003_v0312006_t0021")
2021-04-01 05:32:04 DEBUG : 2gbfile.omg: Sending chunk 1 length 100663296
2021-04-01 05:32:07 DEBUG : 2gbfile.omg: Sending chunk 2 length 100663296
2021-04-01 05:32:10 DEBUG : 2gbfile.omg: Sending chunk 3 length 100663296
2021-04-01 05:32:13 DEBUG : 2gbfile.omg: Sending chunk 4 length 100663296
2021-04-01 05:32:22 DEBUG : 2gbfile.omg: Done sending chunk 2
2021-04-01 05:32:23 DEBUG : 2gbfile.omg: Sending chunk 5 length 100663296
2021-04-01 05:32:36 DEBUG : 2gbfile.omg: Done sending chunk 3
2021-04-01 05:32:37 DEBUG : 2gbfile.omg: Sending chunk 6 length 100663296
2021-04-01 05:32:44 DEBUG : 2gbfile.omg: Done sending chunk 1
2021-04-01 05:32:45 DEBUG : 2gbfile.omg: Sending chunk 7 length 100663296
2021-04-01 05:33:16 DEBUG : 2gbfile.omg: Done sending chunk 5
2021-04-01 05:33:17 DEBUG : 2gbfile.omg: Reopening on read failure after 729473884 bytes: retry 1/10: unexpected EOF
2021-04-01 05:33:18 DEBUG : 2gbfile.omg: Sending chunk 8 length 100663296
2021-04-01 05:33:25 DEBUG : 2gbfile.omg: Done sending chunk 4
2021-04-01 05:33:26 DEBUG : 2gbfile.omg: Sending chunk 9 length 100663296
2021-04-01 05:33:30 DEBUG : 2gbfile.omg: Done sending chunk 7
2021-04-01 05:33:31 DEBUG : 2gbfile.omg: Sending chunk 10 length 100663296
2021-04-01 05:34:06 DEBUG : 2gbfile.omg: Done sending chunk 10
2021-04-01 05:34:06 DEBUG : 2gbfile.omg: Reopening on read failure after 1031566739 bytes: retry 2/10: unexpected EOF
2021-04-01 05:34:09 DEBUG : 2gbfile.omg: Sending chunk 11 length 100663296
2021-04-01 05:34:10 DEBUG : 2gbfile.omg: Done sending chunk 6
2021-04-01 05:34:12 DEBUG : 2gbfile.omg: Sending chunk 12 length 100663296
2021-04-01 05:34:15 INFO : Signal received: interrupt
2021-04-01 05:34:15 DEBUG : 2gbfile.omg: Cancelling large file upload
2021-04-01 05:34:16 INFO : Exiting...
```


BTW, I spent 10 minutes finding which parts of my post counted toward the 2-link limit for new users so I could de-link them and get this posted.

Hmm, this is probably a proxy or similar producing an error.

Can you try running with `-vv --dump headers` and see what produces that error?
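For example, against the same single file from the log above:

```shell
# Same single-file sync as before, but with HTTP headers dumped
# so we can see which request/response triggers the EOF.
rclone sync -P -vv --dump headers \
  b2:me2/files/thatarebig/2gbfile.omg b2eu:metest/bl-2.omg
```
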

Trying now. Earlier today B2 had a switch failure; people lost the ability to access their files and log in to their accounts on the website (I was one of them). Now I am running `--transfers 8` and getting 52 MB/s syncing to the EU (2.4 GB RAM utilized). So far so good.

I will update if I find more.


Very strange. The issue just went away. Thanks for your help.
