Question re: API calls on cloud storage providers


I’m looking to upload files (varying sizes, bigger the better) to cloud storage and then download parts of them every 5-10 minutes.

I spoke with someone at Box and they limit their file sizes to 15 GB, which would put me at thousands of API calls per day per TB…

Dropbox Enterprise would work, but it starts at $33k USD/yr.

I need many, many (50? 100?) TB.

The files can be 1 TB+ in size each, though, if that lessens the load…

Google Cloud Storage for business breaks after a while because I’ve downloaded the files too much?

Any ideas?

Using rclone mount, mind you

So you just download parts of the file, as in an HTTP Range request.

I’m pretty sure S3, Google Cloud Storage (not Google Drive) or Azure Blob Storage will cope with the load, but for a price! They are much more expensive than Google Drive, Box, etc.

50 TB will cost you approx. $1500/month with them.
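For context, that figure is in line with typical object-storage list prices of roughly $0.02–0.03 per GB-month. The rates below are illustrative assumptions, not quotes from any provider, and exclude egress and per-request charges:

```python
# Back-of-the-envelope storage cost for 50 TB at assumed per-GB-month
# rates of $0.020-$0.030 (illustrative only; real pricing varies by
# provider, storage class and region, and excludes egress/API charges).
gb = 50 * 1024  # 50 TB expressed in GB
for rate in (0.020, 0.026, 0.030):
    print(f"${rate:.3f}/GB-month -> ${gb * rate:,.0f}/month")
```

So ~$1000–1500/month is the right ballpark before you add download traffic.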

rclone mount does range requests to read data from the middle of files. It could be more efficient, though, as it does an open-ended range request, so it potentially downloads more data than you need. You’d want to set --buffer-size 0 to stop that.
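The difference between a bounded and an open-ended range request is easy to see with a toy HTTP server. The handler below is a minimal sketch for illustration only (real object stores implement Range handling for you, with proper validation, multi-range support, etc.):

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

DATA = bytes(range(256)) * 16  # 4096 bytes of sample "file" content


class RangeHandler(BaseHTTPRequestHandler):
    """Toy handler honouring "bytes=start-end" and open-ended "bytes=start-"."""

    def do_GET(self):
        rng = self.headers.get("Range")
        if rng and rng.startswith("bytes="):
            start_s, _, end_s = rng[len("bytes="):].partition("-")
            start = int(start_s)
            end = int(end_s) if end_s else len(DATA) - 1  # open-ended -> EOF
            body = DATA[start:end + 1]
            self.send_response(206)  # Partial Content
            self.send_header("Content-Range", f"bytes {start}-{end}/{len(DATA)}")
        else:
            body = DATA
            self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass


server = HTTPServer(("127.0.0.1", 0), RangeHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/file"

# Bounded range: fetch exactly bytes 100-199 (100 bytes).
req = urllib.request.Request(url, headers={"Range": "bytes=100-199"})
with urllib.request.urlopen(req) as resp:
    part = resp.read()
    print(resp.status, len(part))  # 206 100

# Open-ended range: everything from byte 100 to EOF -- far more data
# than a 100-byte read actually needed.
req = urllib.request.Request(url, headers={"Range": "bytes=100-"})
with urllib.request.urlopen(req) as resp:
    rest = resp.read()
    print(len(rest))  # 3996

server.shutdown()
```

The second request pulls down everything after the offset, which is why an open-ended range from a mount can waste bandwidth if you only want a small slice.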