B2 Transactions upload

Alright, so I started with an empty bucket and 50GB to upload.

Caps were set to zero, and the command I used was:

rclone copy wasabi:icloud-photos-f B2:icloud-photos-f --fast-list -v

This is what happened after 47GB of transfer to B2:

Daily Download Bandwidth Caps
The first 1 GB are free.
Today: $0.00 (0 bytes)

Daily Class B Transactions Caps
Class B transactions are related to download. The first 2,500 are free.
Today: $0.13 (316,188)

Daily Class C Transactions Caps
The first 2,500 are free.
Today: $0.00 (1,138)

Any ideas on this? This is really not going to work with B2. 50GB was just my initial test; I have over 600GB to upload, and the upload is supposed to be "free".

This is the response I got from the folks at B2, just so you all know.


Thank you for writing in.

What you are seeing is that the integration called our servers to list all the files 316,188 times during that day.
We charge $0.004 per 10,000 calls.

To avoid this, you may need to alter your integration to limit the number of get-file-info calls (HeadObject) so that you are not charged for an excessive amount of calls.
I would suggest that you set the limit to $0 for Class B transactions until the settings are correct. As you mentioned, if you plan to store much more data and rclone continues to make excessive calls, the charge for listing all the files in your bucket can balloon quickly.
The B2 Caps & Alerts section is an area for you to limit the charges that can occur per day, due to storage, downloads, or transaction usage.

The prices for B2 calls can be found here: https://www.backblaze.com/b2/b2-transactions-price.html

If you have any further questions, please let us know!

Best regards,

The Backblaze Team
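The arithmetic in their reply checks out against the caps page. A quick sketch of the Class B billing, using the numbers from the email (2,500 free calls per day, $0.004 per 10,000 calls):

```python
# Reproduce B2's Class B charge from the figures in the support reply.
FREE_CALLS = 2_500        # first 2,500 Class B transactions per day are free
PRICE_PER_10K = 0.004     # $0.004 per 10,000 calls

def class_b_cost(calls: int) -> float:
    """Daily Class B charge in dollars for the given number of calls."""
    billable = max(calls - FREE_CALLS, 0)
    return billable / 10_000 * PRICE_PER_10K

print(round(class_b_cost(316_188), 2))  # → 0.13, matching the $0.13 on the caps page
```

So the $0.13 really is 316,188 calls; the question is why rclone made that many for a 47GB copy.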

i am curious,
given that wasabi does not have those fees, caps, limits, excessive calls, transactions, charges and so on;
why are you copying your data from wasabi to b2?

the reason i ask is that recently, wasabi has not been as reliable in terms of consistent upload speeds as it once was, and i am getting a little frustrated sometimes.

I use b2. I have 30,000 files and I run a sync once a week. I just uploaded a bunch of files this weekend. And these are my hits.

This isn't consistent with your results. I copy from Google Drive to b2. I use all defaults except adding --fast-list.

What rclone version are you running?

I personally just want a 3rd provider to hold important docs and such. And this only costs me like $1.

me too,

so how many GB for $1.00


This is my version of Rclone

rclone v1.51.0

  • os/arch: darwin/amd64
  • go version: go1.13.7

That's why I'm trying to see what others are getting. I shouldn't be getting this many transactions. I do not understand.

Almost 300. Not a lot. This is my "I hope I never need to touch it" backup of very important things.

I agree. I've never had an issue. That's a lot of API calls.

Hey, I just want to use B2 as a third option for my backups. I already use the unlimited plan for an iMac in my house but I want to keep my photos and some other stuff in two different locations.

I'm worried about the reliability issues you mention. Where did you find that info? I would like to look into it. I have a lot of data on wasabi and plan to use B2 as a mirror.

Care to share the exact command you are using to sync (or copy) data between two remotes (with B2 as the destination)? And maybe the rclone version you are using?


I'm using the latest beta and literally no flags except --fast-list -vv.


are you running this using the S3 API or the regular B2 API?

recently, with the covid craziness, wasabi upload speeds are not as consistent.
to be clear, in terms of data reliability, i run paranoid and wasabi is rock solid.

but then again, the whole internet seems to have slowed down at times recently.
i guess i noticed it more than most as i have a 1Gbps fiber optic internet connection.

if you want to track wasabi, you can go to https://status.wasabi.com/
if you click subscribe, you can receive emails about status changes.
and i have to say, to their credit, wasabi sends plenty of emails every time there is a slowdown or maintenance.

Ah well, how much data do you have on wasabi? How long have you been using them?

Regular b2. You?

Dang it, the new S3 API.
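For context, the difference between the two setups is which rclone backend talks to Backblaze. A sketch of the two remote definitions in rclone.conf (remote names, account IDs, keys, and the endpoint below are all placeholders). The support reply above mentions HeadObject, which is an S3-API call, so routing through the S3-compatible endpoint can generate per-object calls that the native b2 backend does not:

```ini
# rclone.conf sketch -- all values are placeholders
[b2-native]
# native B2 API backend
type = b2
account = 000xxxxxxxxxx0000000001
key = K000xxxxxxxxxxxxxxxxxxxxxxxxxxx

[b2-via-s3]
# same bucket through Backblaze's S3-compatible endpoint
type = s3
provider = Other
access_key_id = 000xxxxxxxxxx0000000001
secret_access_key = K000xxxxxxxxxxxxxxxxxxxxxxxxxxx
endpoint = s3.us-west-002.backblazeb2.com
```

If the high Class B count only shows up with the S3-style remote, that would explain why others using the native b2 backend aren't seeing it.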

  • years=3
  • TB=4

i have a 350+ line python script to coordinate it.
i have a local backup server and i upload veeam backup files to wasabi.
i stream my media stored in a rclone crypt.
i compress lots of small local files into .7z and upload that.
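The "compress lots of small files, upload one archive" step can be sketched roughly like this. This uses Python's stdlib zipfile in place of 7z (since .7z needs an external tool), and the directory, archive path, and remote name are placeholders; the rclone call is only assembled, not run, when dry_run is set:

```python
import pathlib
import subprocess
import zipfile

def bundle_and_upload(src_dir: str, archive_path: str, remote: str,
                      dry_run: bool = True) -> list[str]:
    """Pack all files under src_dir into one archive, then hand it to rclone.

    Bundling many small files into a single object avoids paying a
    per-file transaction cost on the remote.
    """
    src = pathlib.Path(src_dir)
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for p in sorted(src.rglob("*")):
            if p.is_file():
                zf.write(p, p.relative_to(src))

    cmd = ["rclone", "copy", archive_path, remote, "-v"]
    if dry_run:
        return cmd  # show what would run instead of running it
    subprocess.run(cmd, check=True)
    return cmd
```

One archive means one upload call instead of thousands, which is exactly the kind of thing that keeps transaction counts down on providers that bill per call.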

i am simple minded.
all those api and transaction costs give me a headache.