Help with b2 chunk size and sync to backblaze


#1

Sorry if this is a silly question, but I am really having a difficult time understanding how the chunk size and upload cutoff arguments are to be used.

I have some large files I want to upload to Backblaze (backup archives 500GB+ in size). Here is the command I have constructed:

rclone sync -v B: backblazeb2:bucket1 --exclude "/System Volume Information/**" --exclude "$RECYCLE.BIN" --b2-chunk-size=250M --b2-upload-cutoff=5000M --transfers=6

I expected the -v output to show 250M chunks of files uploading, and fairly quickly (1Gbps connection), but I do not. Any insight would be helpful!

Thanks!


#2

--b2-chunk-size=250M means that you want the b2 chunks to be 250M. Remember these are buffered in memory, so you'll need --transfers worth of them, i.e. 6*250M = 1.5G of memory.
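
A quick sketch of that memory arithmetic (the 100M / 4-transfer variant at the end is only an illustrative assumption, not a recommendation):

# Chunk buffers use roughly (--transfers) x (--b2-chunk-size) of RAM.
# With the settings above: 6 x 250M = about 1.5G.
# An example of a lower-memory variant, if RAM is tight:
rclone sync -v B: backblazeb2:bucket1 --b2-chunk-size=100M --transfers=4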

--b2-upload-cutoff=5000M means that files < 5000M will be transferred as single files rather than with chunked upload.

You’ll need -vv to see the chunks transferring.
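
For example, the command from the first post re-run with -vv (everything else unchanged) should log each chunk as it is uploaded:

rclone sync -vv B: backblazeb2:bucket1 --exclude "/System Volume Information/**" --exclude "$RECYCLE.BIN" --b2-chunk-size=250M --b2-upload-cutoff=5000M --transfers=6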


#3

Thank you, kind sir. I was worried I had misinterpreted some of the documentation. I'll try with -vv to see if I get the type of feedback I am looking for.

The large files seem to spend a lot of time reading the disk but not transferring. I am assuming this process has to read the entire file on disk and then check the cloud version before the transfer will start?


#4

B2 needs the sha1sum of the file before uploading, so rclone will be calculating that first. This can take some time for very large files.
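
If you want a rough feel for how long that hashing step takes on one of these archives, rclone can hash a local file directly (the path below is just a placeholder for one of your backup files):

rclone sha1sum "B:/path/to/large-backup-archive.vbk"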


#5

I am getting closer, but am still getting an error message on one of my larger files. The B2 documentation seems to say they support files up to 10TB, and this file is well short of that.

File size too big: 5235605504 (400 bad_request)
2018/08/14 12:46:08 ERROR : B2 bucket Veeam1: not deleting files as there were IO errors
2018/08/14 12:46:08 ERROR : B2 bucket Veeam1: not deleting directories as there were IO errors
2018/08/14 12:46:08 ERROR : Attempt 3/3 failed with 1 errors and: File size too big: 5235605504 (400 bad_request)
2018/08/14 12:46:08 Failed to sync: File size too big: 5235605504 (400 bad_request)

Screenshot at: https://vgy.me/EzdFSl.png


#6

Can you try with -vv --dump responses and see if that gives a clue?
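
For reference, that is the original sync command with the two debugging options added (the --log-file flag is an optional extra I'd suggest so the large response dump goes to a file rather than the console):

rclone sync -vv --dump responses B: backblazeb2:bucket1 --exclude "/System Volume Information/**" --exclude "$RECYCLE.BIN" --b2-chunk-size=250M --b2-upload-cutoff=5000M --transfers=6 --log-file=rclone-debug.log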