Google Drive 'unlimited' - is there any limit/problem after storing a lot of data?

Hey there everyone!

I was playing around a lot with rclone over the last few weeks and it's just a fantastic program - thanks a lot for that.

What I was wondering, as I have a GDrive unlimited now, is how much data people are storing on their unlimited GDrive, and whether it would be possible to have, say, 250TB (for Plex only), or if I will get in trouble after hitting 50/100/150TB etc.

The question I am asking myself is whether I should fully commit to GDrive now, or whether it would be better to spend the money directly on local storage, as GDrive is 10€/month (in my country) and I am planning for the long term.

How much storage are you guys using with your GDrive, and do you think Google will enforce the 1TB/user limit for accounts with fewer than 5 users in the future?

I personally use a single user now and have about 100TB of data out there. Worst case is that they enforce the 5 users; I would simply add 4 more users and go on my merry way.

You never know what can happen though, as unlimited can change in the blink of an eye, so depending on your needs, you'd have to plan for the worst case. For me, if I lost my media, it would be annoying, but it is not mission-critical data.


Thanks for your fast response, again.
Yeah, I mean it wouldn't be too hard losing the media data.
I could buy 4 new 8TB Seagate Archive v2 drives these days for 400€ + shipping, which would give me 24TB (or even 32TB if I go the completely unsafe route), and I was wondering whether I should rather stay with my 4x3TB + GDrive or go with those HDDs, as the price equals 40 months of GDrive.

But 100TB is a good number for me to hear, actually. Even if 100TB were the magic number, 100TB of RAID storage is quite a lot and absolutely worth the price.

Using 800TB :yawning_face: single user.
As long as you're a G Suite customer and don't share copyrighted material publicly, you're safe.

Do you store everything encrypted?

The dedupe factor of unencrypted data is much higher, and my thinking about the storage was that the effective storage after deduplication counts more than the raw number "used".


Then run dedupe once a week.

Nah, I meant dedupe on the server side. Just because I am not uploading the same files multiple times doesn't mean they are not dedupeable from the server side.

I don’t understand what you mean.

He means that Google cannot use deduplication on their side if the files are encrypted: if we both have the same file and we each encrypt it, it counts twice.

If it wasn't encrypted, Google would basically dedupe it and only keep 1 copy of the file.

That being said, I wouldn't ever store my data anywhere unencrypted, as it's my data and they can't see it :slight_smile:

Basically that's what I mean. But even encrypted data will get deduped at block level; the factor is just much lower, as encrypted data looks random.

That's not right, as encryption breaks dedupe since the blocks don't look the same. If we have the same file and we both encrypt it with different keys, you get 2 files that differ at the block level because of the encryption.

What you are writing is correct, but it would only apply if dedupe worked on whole files. Modern dedupe compares blocks, so even if you have a 40GB encrypted file and I have a 2GB encrypted file, they can be deduped, as there is the possibility of identical blocks.

If we both have a file that is the exact same and we encrypt it with different keys, the blocks on the disk are going to be different and you won't be able to dedupe them.

Otherwise, encryption would not work very well if we were able to see the same data at the block level.

There are ways around that but we are doing software encryption before the data is written to disk so it's all going to look different.
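The point being argued here is easy to demonstrate. A minimal stdlib-only sketch (this uses a toy XOR keystream, not real crypto, and the file contents and key names are made up): the same plaintext encrypted under two different keys ends up with no ciphertext blocks in common, so fixed-size block dedupe has nothing to merge.

```python
# Toy illustration (NOT real crypto): the same plaintext encrypted
# under two different keys shares no ciphertext blocks, so
# block-level dedupe finds nothing to merge.
import hashlib


def keystream(key: bytes, length: int) -> bytes:
    # Simple counter-mode keystream derived with SHA-256 (toy only).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR stream cipher: applying it twice with the same key decrypts.
    ks = keystream(key, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))


def blocks(data: bytes, size: int = 16) -> set:
    # Split into fixed-size chunks, the unit a block deduper compares.
    return {data[i:i + size] for i in range(0, len(data), size)}


plaintext = b"the exact same media file contents " * 1000

ct_a = encrypt(b"key-of-user-A", plaintext)
ct_b = encrypt(b"key-of-user-B", plaintext)

# The repetitive plaintext dedupes down to a handful of blocks;
# the two ciphertexts share no blocks at all.
print(len(blocks(plaintext)))           # few dozen distinct blocks
print(len(blocks(ct_a) & blocks(ct_b))) # 0 shared blocks
```

With realistic block sizes (4 KiB and up), the chance of two independently keyed ciphertexts sharing even one block is negligible, which is the whole argument.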

Here is an article that talks about it:

I mean, this is off topic from my initial question, actually.
But I was not talking about 2 files encrypted with different passphrases - I was talking about block dedupe, which is not interested in the file itself.
Every file is just binary values (encrypted and non-encrypted, of course) (10011001100111101011...), and if there is a matching block, it can be deduped, no matter whether the corresponding file is encrypted or not, big, small, or whatever format.

There are many studies about the dedupe process, especially on encrypted files, and of course it works.

PS: and even if you have totally random data blocks, you have a specific block size (like 2^xy bits), so with basic math you get a certain number of different block possibilities; if you store more blocks than that number, you can be 100% sure (pigeonhole) that there is a matching block. If the dedupe process were really perfect, you would never store more than this number of distinct blocks, but you would need a lot of CPU power and RAM...
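To put numbers on the "certain amount of different block possibilities", here is a quick sketch (the block sizes are my own picks for illustration): the space grows as 2^(8·bytes), so for any realistic block size it dwarfs any amount of stored data by thousands of orders of magnitude.

```python
# How many distinct blocks can exist for a given block size?
# With b bytes per block there are 2**(8*b) possibilities; the
# pigeonhole argument only bites once you store more blocks than that.
import math

for block_bytes in (4, 16, 4096):
    possibilities = 2 ** (8 * block_bytes)
    print(f"{block_bytes:>5}-byte blocks: 2^{8 * block_bytes} "
          f"= about 10^{int(math.log10(possibilities))} possibilities")

# For comparison: 800 TB split into 4 KiB blocks is only ~10^11 blocks,
# nowhere near the ~10^9864 possible distinct 4 KiB blocks.
blocks_stored = 800 * 10**12 // 4096
print(f"800 TB holds about 10^{int(math.log10(blocks_stored))} blocks")
```

So the pigeonhole guarantee is mathematically true but never reached in practice; dedupers rely on data being non-random, not on this bound.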

Have any links to share that show it working with encrypted files? I'd love to read them.

I'm not talking about randomly deduping on a global scale.

The use case I shared was we each have 1 file. We each encrypt it with different keys.

You can't dedupe that at a block level as the blocks are different.

Which was the high-level point about encrypting your data in your Google Drive: 1 person might have 1 TV show, and if 20,000 people all have that 1 show and each use different encryption, you can't deduplicate it effectively.

If it was not encrypted, it would have just 1 copy.

I used unlimited Google accounts bought via eBay. I had no trouble for more than 2 years, and then one fine day, the account was closed!
So I took 2 other accounts, and 2 MONTHS later, it happened again :frowning:

I therefore think the most sensible solution is to switch to a 1-user G Suite account and take advantage of the fact that, for the moment, it does not cap us at 1TB.

Can you avoid the more expensive monthly subscription by signing up via a VPN?
And when there are several users on the same pro account, are the API call limits per user or per account?
Per user, I think ^^

Sorry for my late response; I was on my phone the other day and didn't have much time afterwards.
As we discussed this at university, I have no online paper at hand I could refer to, but I can look something up for you over the weekend.
But basically, after my last post it should be clear why it works on the block level, where files don't matter. The dedupe rate is then nearly the same (if you have good encryption) as for random data, and it's only a matter of the block size you are checking. There are even methods with variable block size deduplication. For this scenario, have a look at the birthday paradox.
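The birthday-paradox figure mentioned above can be worked out with the standard approximation p ≈ 1 − exp(−n(n−1)/2N) for n blocks drawn from a space of N possibilities (the block sizes and data volumes below are my own example numbers):

```python
# Birthday-paradox estimate: probability that any two of n random
# blocks collide when drawn from a space of 2**block_bits values:
#     p ~ 1 - exp(-n*(n-1) / (2 * 2**block_bits))
import math


def collision_probability(n_blocks: int, block_bits: int) -> float:
    space = 2 ** block_bits
    return 1 - math.exp(-n_blocks * (n_blocks - 1) / (2 * space))


# Tiny toy space: 400 draws from 2**16 values collide quite often.
print(collision_probability(400, 16))                # ~ 0.70

# 1 PB of data in 4 KiB (= 32768-bit) blocks: collisions between
# random (encrypted) blocks are, for practical purposes, impossible.
print(collision_probability(10**15 // 4096, 32768))  # 0.0 (underflows)
```

This cuts both ways: the birthday paradox makes collisions in small spaces surprisingly likely, but at real block sizes the space is so vast that random-looking (encrypted) blocks essentially never match.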

If you search for "block level deduplication encryption", you will find good material as well.

I mean, it should be clear that you won't get an unlimited GDrive for "2$ for a lifetime account", and I wouldn't put any data on such a drive.
Personally, I think you should either pay the price they charge in your country/currency or just not go this route. It costs time as well, and since I like to have a well-running system in the end, I wouldn't take the risk of getting kicked (in your case, "kicked again, and again"). Comparing the normal rate and some cheaper countries' rates, you can only save around 2$, and I don't think that's worth it, is it?
(For an unlimited drive, 12$/month is not really expensive.)

If you only need a couple of TB, you can go with Office 365: you can buy 1-year licenses for around 40$ on sale and get 6TB of OneDrive (you can connect the 5 co-accounts with the main one and have, for example, 5 folders of 1TB each under the main account).

You don't have anything specific to share?

Here are a few examples of how it doesn't work well with encryption:

Carbonite also gives a great example in their product documentation of how they handle encrypting data and get around the dedupe challenge of encrypted data, as this picture represents it well:

If they change the "unlimited" on one user, they will likely change it for other amounts of users as well.

In other words, there's no guarantee that you will just be able to add 4 more users and go about your day. Ending unlimited storage will likely happen concurrently with changes across the whole account pricing structure.

If you have tons of data, I would not use Google Drive and assume it will be there indefinitely.

You asked about local vs. cloud, and for 250TB, my answer is local. Run a backup to the cloud if you are worried about data security. However, 250TB or 1PB (seriously, dude?) is asking for trouble. Amazon pulled unlimited Amazon Drive; Google inevitably will as well. Amazon even released a graph of the distribution of storage space used per user. You know how many users used 1PB? ONE GUY. ONE.

I'm not saying you are ruining it for all of us, but at the same time... you maybe kind of are? Even on Glacier that works out to $4200/mo. Google probably doesn't care about a couple TB. But you are costing them literally $50,000 per year. Eventually it will get eliminated. I wouldn't be surprised if users taking up 1PB are the reason they end up axing it. Sorry, that's my soapbox...
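The Glacier figure above can be sanity-checked with a quick sketch, assuming the roughly $0.004 per GB-month standard Glacier storage rate of the time (prices change, and this ignores retrieval and request costs):

```python
# Rough monthly/annual storage cost of 1 PB in S3 Glacier at
# ~$0.004 per GB-month (rate assumed from the era of this thread).
gb_per_pb = 1024 ** 2          # 1 PiB = 1,048,576 GiB
rate_per_gb_month = 0.004
monthly = gb_per_pb * rate_per_gb_month
print(f"${monthly:,.0f}/month, ${monthly * 12:,.0f}/year")
# -> $4,194/month, $50,332/year
```

Which lands right on the ~$4200/mo and ~$50,000/yr figures quoted above.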

Based on what information? I've been on Google Drive now for about a year and a half. There is no reason not to use it while it's available, as there is no way to know if/when they would change a policy, or even what that policy would be if changed.

The question comes down to whether you want to wait and see, and how important the data is.

Dropbox has unlimited data as a backup option as well for $60 a month, which, if Google did choose to enforce 5 users, puts it at the same price point that Google would be charging.

I can use my unlimited drive however I choose, and you really have no say; passing judgment on other folks for how they use it is not helpful.