Anyone using Microsoft's OneDrive for Business? How well is it working with rclone?

Hello everyone,

As per the topic's title. I'm not keen (big understatement) on anything from Micro$oft, but they have reasonable prices for OneDrive for Business with Unlimited Storage (just $10/mo, according to them) and, if it works well, it would be a feasible way to get my backups on another cloud provider besides Google (where I currently have an Unlimited account). All other options that I could find for my volume of data (currently around 30TB) are way out of reach financially...

I can see in rclone's docs that it's currently supported, but the question is, how well is it working? How many GBs are we allowed to upload per day? Any transactions/second limits as in GDrive? Any other quotas or limitations, hidden or otherwise?

Of course everything I would put there would be encrypted using rclone's encrypted remote, so I'm not too worried about privacy/security/selling all my data to a brothel in Cairo (which I think is one of the main concerns regarding anything from M$). My main concern is with their authorization scheme getting hacked and someone logging in as me and deleting everything (which would not be bone-breaking, as all my data would also be in GDrive; my reason for a 2nd provider is just for the improbable case Google pulls an Amazon Cloud Drive on us, i.e. GDrive suddenly going down the drain without giving me enough time to migrate the data to another provider).

Thanks in advance for all your comments, tips etc!

Cheers,
-- Durval.

I have a Onedrive account I've done some testing on, and it works fine overall. Onedrive has a few quirks you should be aware of, like ignoring certain temp-file extensions - but those are pretty much all documented in the backend docs and are fairly minor.

Fewer options and features than Gdrive for sure, but serviceable. Transactions per second seem to be on par with or slightly above Gdrive, at least on upload.

I do not know a lot about the specific limitations on upload, API transaction quotas etc. like I do for Gdrive. I would be interested to know more, so someone please mention @thestigma if more detailed info like that pops up in this thread :slight_smile:

Here is some technical info that should be useful though, as Onedrive business shares most of its limitations with SharePoint:
https://docs.microsoft.com/en-us/office365/servicedescriptions/sharepoint-online-service-description/sharepoint-online-limits?redirectSourcePath=%2farticle%2fSharePoint-Online-and-OneDrive-for-Business-software-boundaries-and-limits-8f34ff47-b749-408b-abc0-b605e1f6d498

Hello @thestigma,

A business one with unlimited storage?

If you have an unlimited account, just authorize an access token for me and I will let you know very quickly how well it works :stuck_out_tongue:

From the "## Features ##" matrix listed in rclone/MANUAL.md, I see the main issues are "[the business version] using its own hash algorithm" (not sure how serious that would be, apart from having another stupid M$ particularity/idiosyncrasy going around), and not supporting duplicate files (which for me is a blessing instead, given all the trouble I've had with GDrive's "support" for them and the need to run dedupes etc).

OK, so for my millions-of-files backups, I will have to use something like borg-backup and then back the repository up to the cloud, just like with GDrive. No big deal...
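A minimal sketch of what I mean (the repo path and remote name are hypothetical):

```bash
# Pack the millions of small files into a borg repository first...
borg init --encryption=repokey /backups/repo
borg create /backups/repo::mydataset-2019-11 /data/mydataset

# ...then push the repo (few, large files) to the cloud.
# "onedrive-crypt:" would be a hypothetical encrypted remote.
rclone sync /backups/repo onedrive-crypt:borg/repo
```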

My main concerns are the following, quoting from rclone's docs:

The entire path, including the file name, must contain fewer than 400 characters for OneDrive, OneDrive for Business and SharePoint Online. If you are encrypting file and folder names with rclone, you may want to pay attention to this limitation because the encrypted names are typically longer than the original ones.

Of course I would use encryption, especially for anything connected to M$. So, how serious would that be? A maximum directory depth of N levels, or what?

Cheers,
-- Durval.

5TB business. I would assume it runs on the same system and shares all the same restrictions as unlimited.

Oh, you mean you want to run tests? Sure, I could probably lend you temp access to run some. I do not have a significant amount of important data on there right now anyway, so it wouldn't be too disruptive for me. I will look and see if there is a way to grant an access token without full admin rights that I can revoke later. If you know how, please save me the time and tell me, as I have spent limited time with Onedrive so far. I'll wait for your response since it's late now anyway, but depending on your answer I can look at it tomorrow if you remind me.

It's mainly an issue insofar as you can't compare hashes between Gdrive and Onedrive. This removes some options like --track-renames, which is very useful for running a backup without having to re-transfer everything just because some folders got renamed or moved.
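For illustration, this is the kind of command that's affected (paths are hypothetical). Since Gdrive only has MD5 and Onedrive only has its QuickXorHash, there is no common hash, and as far as I know rclone will then just fall back to a plain sync:

```bash
# Rename detection needs a hash that both sides support;
# between Gdrive (MD5) and Onedrive (QuickXorHash) there is none,
# so --track-renames gets disabled and renames become re-transfers.
rclone sync gdrive:backup onedrive:backup --track-renames --progress
```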

I suspect you'd have to use some sort of archiving or merging anyway due to the maximum files limit.

Only as much of a problem as you make it, I think. 400 chars is a good bit less than Gdrive's ample limit, but it should still be more than sufficient provided you structure the data somewhat reasonably. Avoiding excessive folder nesting usually does the trick. It really depends on how you tend to structure your data generally, so hopefully you already have good habits :slight_smile:

This topic goes into a fair bit of detail about exactly how much longer crypt makes filenames, if you want exact info (you'll need to scroll down a bit and be ready for some simple math :slight_smile: )
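If you just want a ballpark without reading that whole thread: as I understand it, crypt pads each name to 16-byte blocks and then base32-encodes the result, so you can estimate the encrypted length like this (a back-of-the-envelope sketch, not gospel):

```bash
# Estimate rclone crypt's encrypted name length for an N-byte name.
# Assumes PKCS#7 padding to 16-byte blocks (always adds 1-16 bytes),
# then unpadded base32 (5 bits per output character).
n=20                              # example plaintext name length
padded=$(( (n / 16 + 1) * 16 ))   # 20 bytes -> 32 bytes
enc=$(( (padded * 8 + 4) / 5 ))   # 32 bytes -> 52 base32 chars
echo "a ${n}-char name becomes ~${enc} chars encrypted"
```

In other words, each path segment grows by roughly 1.6x plus padding overhead, which is why deeply nested paths blow past 400 characters sooner than you'd expect.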

There seems to be quite a lot of small print attached to that unlimited deal...

Unlimited personal cloud storage for qualifying plans for subscriptions of five or more users, otherwise 1 TB/user. Microsoft will initially provide 1 TB/user of OneDrive for Business storage, which admins can increase to 5 TB/user. Request additional storage by contacting Microsoft support. Storage up to 25 TB/user is provisioned in OneDrive for Business. Beyond 25 TB, storage is provisioned as 25 TB SharePoint team sites to individual users.

Yea, the fine print is sure worth studying ... good catch
25TB is a good amount though and may not be a problem for some. It really depends on just how much of a hoarder you are :slight_smile:

Another unknown that may be pretty important is data-quotas. I have no idea of what those are on Onedrive, and I would not jump in with both feet before having some idea of that.

Hello @thestigma, @ncw,

Thanks for the detailed answer. More, below:

This agrees with my reading of the available info on the M$ site.

Thanks! I was kinda joking, but if you are willing, I can and will happily run some tests up to those 5TB to see how it performs, and post the results here so you and everyone can see.

Unfortunately I have exactly zero experience with OneDrive :frowning: And please take your time, I'm going out on a trip today and will have limited internet access for the next 10-15 days, so no hurry.

Hummrmrmr... From my quick reading of the M$ materials, I was under the impression the maximum files limit was per directory, no?

I have some deeply nested directories here that, even before encryption, would already exceed the 400-char limit... historical reasons and all that. But those are in the millions-of-files datasets that (given the TPS limits) would have to be borg'ed anyway, so it wouldn't be much of a problem.

Good catch, @ncw! So the minimum price would be $10*5 = $50/month for unlimited. About the same as GDrive, which (at least officially) also [imposes a 5-user minimum for unlimited at $12/mo/user = $60/mo](https://gsuite.google.com/pricing.html)...

Cheers,
-- Durval.

OK, assuming it's possible to do without granting an access level that would potentially be unsafe for me, I will do you that favor and PM you about it within a day or two. It is useful information to have, both for me and the community, so I get something in return out of it.

I see what you are referencing there. I really don't know - I merely skimmed the docs.
I suppose it's something you could test for, given some time to run it. It's easy to make a script that generates a few million tiny test files - something like the sketch below.
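For example (adjust the counts to taste; this makes a million tiny files, a thousand per directory, under a local ./testdata folder):

```bash
# Generate 1,000,000 tiny test files under ./testdata,
# 1000 files in each of 1000 directories.
mkdir -p testdata
for d in $(seq -w 0 999); do
  mkdir -p "testdata/dir$d"
  for f in $(seq -w 0 999); do
    echo "$d-$f" > "testdata/dir$d/file$f.txt"
  done
done
```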

I'm not sure I read it that way. Or at least, it seems like additional storage beyond 25TB might need to be added as team sites rather than one huge storage pool. If that is the case it may not be an issue, but it could potentially be an organizational hurdle, depending on your needs. I suppose you can always ask Microsoft to clarify some of these things once you've gathered up the few important questions you need answered before you commit.

If I come across any other reasonable alternatives to Onedrive for unlimited storage, I'll try to remember to post an update here. Since I'm primarily a Google guy myself, I don't have a full overview of everything currently on the market though.

I have a OneDrive Business account already grown to 20TB. So far it works pretty well. I use it as a backup for my Google Drive unlimited, and this is what I've noticed so far:

  1. I didn't notice any upload or download limits; I was able to upload around 6TB in one day without a single error.
  2. I created my own Client ID and have yet to hit a single throttle.
  3. Since I use encryption, I was facing a lot of issues with the 400-character limit; I ended up settling for obfuscated filenames instead of encrypted ones to work around it, and it hasn't been a problem since (see the config sketch after this list).
  4. There is a maximum file size of 15GB. I believe this can be worked around with rclone 1.50+'s chunker, but I have yet to figure out how to set it up and whether it's compatible with OneDrive.
  5. The only way to empty the trash can is via the web interface; if you move/modify a lot of files, the deleted copies add up very quickly.
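For reference, the obfuscation from point 3 is just a crypt remote setting. Roughly what the relevant part of the config looks like (remote names and paths are illustrative; the password is set interactively via `rclone config`):

```
[onedrive]
type = onedrive
# token etc. created by "rclone config"

[onedrive-crypt]
type = crypt
remote = onedrive:backup
filename_encryption = obfuscate   # rotated names, much shorter than "standard"
directory_name_encryption = true  # directory names get obfuscated too
# password = (set via "rclone config")
```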

Otherwise I have yet to face a single problem and it's working great. I'll have to see how to work around the 25TB limit once I get there; right now I've still got about 2TB left, so I'm not really in a rush.

Hope this helps!

Very useful info! Thanks :slight_smile:

Yes, that may be a tolerable workaround. Obfuscation is not encryption, but it's something, and ultimately, as long as the contents of the files are still encrypted, that's usually the most important part.

It almost certainly is. The chunker should be completely agnostic to what backend it is using.

There is one more thing that I remember now and want to jot down before I forget... I think in my earlier research on this I found that Onedrive has file revisions enabled by default - and keeps quite a lot of them. This will tend to eat up your space fast if files change a lot.

There either is - or was - some workaround to disable this, but I never got around to implementing or testing it. Is this something you have done yourself? If so, I would appreciate it if you shared the method you used :slight_smile: MS has for some reason removed the option to do it in the GUI, so it's not trivial anymore. Just wondering if it is still possible. Having dozens of revisions forcibly applied is something that would annoy me to no end.

Any clues or tips on how to implement this? Should this layer be added before or after encryption? Do I need to re-upload everything, or can I basically just remove the "--max-size 15G" flag and keep the current files? What would be the best hash type for OneDrive? And how would rclone handle already-uploaded files larger than the chunk size?

I remember the same, though I don't have that problem myself, as I basically upload/sync static files. There is still an easy way to change this via the GUI: open OneDrive (SharePoint) and click the gear at the top right, then select "Site contents". Press the 3 dots next to Documents (actions), select Settings, and click "Versioning settings". In there, limit it to 1 major version and press Save. I'm not sure whether this limits it to one version or two, but it should greatly reduce the problem.

Either before or after works, but before encryption is preferable, because otherwise it becomes obvious which chunks belong together, as they get a chunker extension at the end. This only matters for obfuscation/security purposes, however, which may not matter to you - there's no significant detriment to doing it the other, cleaner way.

Keep the current files. Chunker does not assume all files are chunked; chunked files get an extension that identifies them as such, and chunker only acts on those. You would have to remove --max-size, yes, or rclone would just ignore larger files before they even reach the chunker.

The hash type chunker uses is not really relevant to the backend you use it on. That hash is internal and only works within the chunker anyway; it is only there as a workaround for the fact that the backend can't calculate its own hashes for chunked files - so the chunker does it for you on upload instead. The ideal hash is therefore "any hash your local system supports". Since you probably don't use a filesystem that natively stores hashes for files, it is irrelevant, as any format will be CPU-hashed anyway. I'd probably just use md5; it's fast, and more than good enough for the job.

Rclone will just pass the whole file from the OS to the chunker, the chunker splits it, and then the chunks get uploaded. As far as all rclone commands and parameters are concerned, the files are their "real", whole size. The whole idea is that the process should be transparent to the end user.
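To put that all together, here is a sketch of the stacking I'd suggest (remote names are illustrative): pointing the chunker at the crypt remote means the chunk suffixes get encrypted along with everything else, as discussed above.

```
[onedrive-crypt]
type = crypt
remote = onedrive:backup
# password set via "rclone config"

[onedrive-chunk]
type = chunker
remote = onedrive-crypt:
chunk_size = 10G   # comfortably under the 15GB per-file cap
hash_type = md5    # chunker computes this itself on upload
```

Then you just copy to `onedrive-chunk:` and anything over 10G gets split transparently; smaller files pass straight through untouched.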

I think I tried that and found that you couldn't set it any lower than something silly like 20. I'll re-check it in case I missed something though. Thanks.

Hi bro, do you have the second plan, the $10 one plus tax?

I currently have the Office 365 E3 pack. It's a small business account and I have over 20 accounts activated, so I'm not sure whether the minimum is enforced if you have fewer than 5 accounts.

Does each account pay €19.70/month?
