Google Drive "abusing"

it's completely irrelevant at the BLOCK level.
a block is just bits, no matter whether the data is encrypted or not.
just search for it online if you'd like more information...

hi,

if you have a link that shows google is de-duping across its entire cloud, please share...

it is relevant, as each rclone block is encrypted with a password and hopefully a salt.

the odds that another block on google has the same hash are close to nil,
and the amount of cpu and overhead needed to compare each and every block across the entire
google cloud would be huge for so little gain in saved space.
that would mean google has a global database of every block on planet earth, with its unique id and its unique sha256 checksum.
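
to put rough numbers on that, a quick birthday-bound estimate of the chance that any two random blocks share a sha256 hash, with a deliberately generous, made-up block count:

```python
# birthday bound: P(collision) ≈ n^2 / 2^(b+1), valid while n is far below 2^(b/2)
n = 10 ** 21   # assumed number of blocks stored worldwide (made-up, generous)
b = 256        # sha256 output bits
p = n * n / 2 ** (b + 1)
print(f"collision probability ≈ {p:.1e}")   # ≈ 4.3e-36
```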

first of all, the blocks encrypted by rclone do not need to be the same size as the blocks google uses for its deduplication algorithm.
second, google is definitely doing a lot of computation on the data while it is uploaded, checking how that data can best be handled in order to save space.

as google has the biggest server farms on earth, and incredibly large storage pools for gdrive, youtube etc., they definitely have the best deduplication system you can think of.
youtube alone needs several PB of new space per day.

if the algorithm is really good, and the analysis of the incoming data as well, you can get a really high rate of deduplication even on encrypted data with no patterns.
and there is only a limited number of bit permutations a block can take.
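
just to put a number on "limited", a one-liner counting the possible bit patterns in a single block (the 4 KiB block size is an assumption, purely for illustration):

```python
import math

BLOCK_BYTES = 4096                # assumed block size, for illustration only
bits = BLOCK_BYTES * 8
print(f"possible blocks: 2^{bits} ≈ 10^{int(bits * math.log10(2))}")
# possible blocks: 2^32768 ≈ 10^9864
```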

if you have a link that shows google is de-duping across its entire cloud, please share.

why are you mixing up 2 independent statements of mine?
and neither of them has anything to do with your question.

the fact that you started writing right when i had just finished my post shows that you are not actually reading the stuff.

i assure you i read your posts before i reply, and i do not intend to mix up your posts.

you claim that google is de-duping across its entire portfolio of gdrive, gcloud, youtube, gmail and so on.
sorry, i cannot imagine that.
based on your post, you told me to search; i tried, and i could not find that.

please, if you have that link, share it.

Dedupe works by detecting identical files or blocks of data. Such duplicates occur only with negligible probability (i.e., not at all) in random data, and encrypted data should be indistinguishable from randomness.
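
To make that concrete, a tiny sketch (using the Python cryptography package purely as an illustration, not what rclone or Google actually run): encrypting the identical block twice with fresh random nonces yields ciphertexts with different hashes, so block-level dedupe has nothing to match on.

```python
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
block = b"the exact same block of user data"

# two uploads of the identical block, each with a fresh random nonce
ct1 = AESGCM(key).encrypt(os.urandom(12), block, None)
ct2 = AESGCM(key).encrypt(os.urandom(12), block, None)

print(hashlib.sha256(ct1).hexdigest())
print(hashlib.sha256(ct2).hexdigest())
print(ct1 == ct2)   # False -> no identical blocks for a provider to dedupe
```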

So yeah, at first thought you can't dedupe encrypted data...but eh, maybe you can? This basically says 'magic!', but carbonite claims they do it after encryption.
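
For the curious, the usual trick behind "dedupe after encryption" claims is convergent encryption: derive the key (and nonce) from the content itself, so identical plaintext always produces identical ciphertext. A minimal sketch, again with the Python cryptography package and purely illustrative (no idea whether this is what carbonite actually does):

```python
import hashlib

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def convergent_encrypt(block: bytes) -> bytes:
    digest = hashlib.sha256(block).digest()
    key = digest           # content-derived 32-byte key
    nonce = digest[:12]    # deterministic nonce is fine here: the key is unique per content
    return AESGCM(key).encrypt(nonce, block, None)

ct1 = convergent_encrypt(b"same block of user data")
ct2 = convergent_encrypt(b"same block of user data")
print(ct1 == ct2)   # True -> identical ciphertext, so the provider can dedupe it
```

The trade-off is a confirmation-of-content attack: anyone who can guess the plaintext can check whether you stored it, which is why rclone crypt (password-derived key, random nonces) deliberately does not work this way.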

Guys, this thread was created to discuss the topic of abuse, as in circumventing Google's policies and bypassing quotas by using service accounts and/or other methods. While this has already led to Google changing their business model around G Suite and Photos, it will likely (but hopefully not) also lead to the eventual demise of unlimited storage, both loophole and legit.


very interesting, that was a good read, thanks.

i could see that working in the limited context of an enterprise, where it requires a program running on the client that does the encryption and calculates the hash.

i do not see that as something google is doing, planet-wide, across all its products.
would love to see a weblink to confirm/deny that.

i depend on wasabi and veeam for my company server backups, and i made sure there was no such de-duping going on behind the scenes.
i contacted their tech support, and they have since posted this:
https://wasabi-support.zendesk.com/hc/en-us/articles/360019062892-Does-Wasabi-perform-data-deduplication-or-compression-
and veeam relies on dedicated, locally run third-party servers to handle dedup.

aws s3 does not dedupe across its products, and if there were a good reason to, they would do it.
in fact, they offer an additional product that layers on top of s3 and runs client-side to do de-dupe.

hi there,

so this is too off-topic for an off-topic post?

@Friday13th raised an interesting point, that google might de-dupe.

Most def! I'd say it's an abuse of this thread :wink:

yeah, i get what you wrote, very witty.

but enjoy that compliment while you can, as this off topic post of an off topic post will self-destruct.

and i think you are trying to hijack another of my posts. :upside_down_face:

please start a new post, so i know you read this, and once you do, i will delete this post.


LOL I was here first! Besides, I use this post to vent and worry about what's gonna happen to my gigglebytes :sob:

Anyway, carry on fellas!

Oh, and I take that back. You were here before me, as usual :stuck_out_tongue:


nope. there is no way to de-dupe encrypted data at any scale that would make a difference. You are probably worse off attempting to de-dupe encrypted data, just based on the compute/time needed and the end result being close to zero gains.


I do not trust Google with my privacy anymore and that's sad. I even had to learn how to backup Google Drive with the help of 3rd party solutions.

I personally love Digital Ocean Spaces. If you don't have any ADULT content, then check out Pricing – Wasabi


remember to read number 3

Pricing FAQs – Wasabi

really would not matter if an rclone crypt is used.
that is what i do with wasabi, always crypt the data.
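
for anyone curious how that looks, a crypt remote layered over a wasabi S3 remote in rclone.conf is roughly this (remote names, bucket and credentials are placeholders):

```
[wasabi]
type = s3
provider = Wasabi
access_key_id = YOUR_ACCESS_KEY
secret_access_key = YOUR_SECRET_KEY
endpoint = s3.wasabisys.com

[wasabi-crypt]
type = crypt
remote = wasabi:my-bucket
password = <obscured password>
password2 = <obscured salt>
```

point all uploads at wasabi-crypt: and only ciphertext ever reaches wasabi.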

hey, I have to talk about it! Do you use this provider? Due to their fair use policy, monthly egress has to stay below the total stored data, so I can't use it for my website. They told me to use a CDN, but that's not a solution for big websites!
Did you try it on your site, or just for personal use?

wasabi is not a good choice for websites, as it can only handle static webpages.

i use wasabi, many TB stored there.

I think I need to use it along with Digital Ocean Spaces for backups! But my problem is still not resolved (I posted a help request on the rclone forum). I need to learn more about crypt, and how it decrypts! (later)
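
for when you get to the crypt part: decryption is transparent, you just read through the crypt remote (using the hypothetical wasabi-crypt remote from the config above):

```
rclone ls wasabi-crypt:backups            # lists files with their real (decrypted) names
rclone copy wasabi-crypt:backups /restore # downloads and decrypts on the fly
```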