Chrome version 147 silently downloads Gemini Nano’s weights.bin file to local storage, sparking major privacy, data, and legal concerns.
I don’t use chrome on my personal computer.
People are shocked when the bad boyfriend (Google) that people warned them about does highly unethical things. That boyfriend only exists to use & abuse you; it has no other purpose anymore.
That “boyfriend” just makes money, he doesn’t know or even care about your existence.
That boyfriend was once sweet, caring, quirky, and innovative. He tried many things and was better than his two older brothers, the hip one who charged too much and the lame one your parents were forced to deal with.
Somewhere along the way, he forgot his roots and became Jennifer Lopez.
His name is Jennifer Lopez, he likes burritos, and tacos
Not on my machine lol.
Bloatware, in my Google Chrome!?
No it isn’t. Because I don’t run anything from Google.
Let me guess, you run Linux?
No, I just don’t use anything from Google. What do they offer that is good anymore? Zero things.
I use Arch btw
My arch system runs an arch VM.
I use arch btw
Well, I’m glad Google decided to bite the bullet and face the immense, Biblical backlash, so that Mozilla Corporation has a few new thoughts to ponder.
Their idiot CEO will move forward with forcing AI onto people too. CEOs don’t learn.
What if you keep the file around but write to it and zero size it? Does chrome still download the file again?
Eventually it’ll update.
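One crude way to experiment with this (a sketch, not a recommendation; the real model path varies by OS and Chrome channel, so the location mentioned in the comments is an assumption): truncate the file to zero bytes in place, and optionally mark it immutable so Chrome can’t silently write into it again. Demonstrated here on a throwaway temp file rather than the real `weights.bin`:

```shell
# Demo on a throwaway file; Chrome's actual model lives somewhere like
# ~/.config/google-chrome/OptGuideOnDeviceModel/ (path is an assumption).
WEIGHTS="$(mktemp)"
head -c 1048576 /dev/zero > "$WEIGHTS"   # stand-in for the 4GB weights.bin
: > "$WEIGHTS"                           # truncate to zero bytes in place
stat -c %s "$WEIGHTS"                    # prints 0
# To block rewrites you could make it immutable (ext* filesystems, needs root):
# sudo chattr +i "$WEIGHTS"
rm -f "$WEIGHTS"
```

Even immutable, nothing stops Chrome from simply downloading a fresh copy into a new versioned directory on the next component update, which matches the reply above.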
It’s roughly twice the size of the base app, if it’s the same as Edge, which on my machine is 1.82GB. It’s shady as hell, but “massive” is doing a lot of lifting in this headline.
That is absolutely massive for a browser. Like multiple orders of magnitude too big. Like what are we even doing
And I don’t care if everyone is doing it, it’s still too big!
I wonder if they do it on phones as well. Not everyone has a 256GB+ device, and 4GB would be a significant chunk on even that.
On Android phones I believe this is called Gemini Nano, which is part of the Gemini app for Android and the Android AICore app. The AICore app supposedly eats up like 4–12GB of storage. I believe it’s a system app that you can’t uninstall.
Might be worth a look at your phone apps especially if you have a pixel phone.
From what little I’ve read about this, it seems to only be on desktop. For now.
Some suit in the C Suite is having a wet dream though imagining the not so far off future where they can run shit like this on your phone.
What’s wrong with this? Isn’t it preferable to run the LLM locally for privacy reasons? Plus Google’s open-weight models are pretty good for what they are.
It downloads the 4GB local model and then doesn’t use it. Instead, Chrome’s LLM usage is routed through their online LLM service.
Chrome’s most recent release, version 147, now includes an AI Mode pill in the omnibox; however, this routes queries to cloud-based AI servers. The local model is not used by that feature; instead it powers things like “Help me write”.
They don’t tell you they are doing it, for one.
Even though this 4GB is local, the “omnibox” at the top that also uses Gemini shit doesn’t use this 4GB and still goes to the web to determine the slop it wants to provide.
But…
Isn’t that a good thing?
I mean, running an LLM locally is much more private than running it somewhere in the cloud at a provider that gets your raw data, isn’t it?
All your data stays on your device, while making it much, much harder for Google to argue why it should be uploaded to their data centers.
You don’t seriously believe that, do you? They just use your device’s memory and CPU, and thus your electricity, to shovel through your data and then send all the valuable data to their servers.
For clarity’s sake, that’s not what’s happening here. (Don’t misunderstand this comment as defending Google; I could write a book about how much they suck.)
The model downloaded is an LLM called Gemini Nano, and it’s used for things like “Help me write”, checking if an incoming message is a scam, summaries, etc.
Don’t worry about it being spyware itself. It’s not; but for argument’s sake, if we assume that it was: they already know a lot about you through their usual apps and services, and get far more info out of you through them. This LLM would hardly move that needle.
The actual issue is that they download it for everyone, even to devices that don’t meet the minimum requirements. And without consent. And to enable it, you need to go through several menus, as the default behaviour is to use the cloud (this could change eventually; my understanding is that in this update they’re just laying the foundation).
But, it’s Google that we’re talking about. Last year they were sentenced to pay a fine for spying on users despite them having their tracking settings off. And it wasn’t the first time iirc. This kind of behaviour is par for the course with them
It’s already been pointed out in multiple threads that the terms of service specify that even if it uses the on-board model, it still sends your queries to Google.
Yeah, it’s what I said. Right now it’s defaulting to the cloud
It’s not a good thing if you don’t want a freaking LLM to begin with. Hidden 4GB download for a feature I can’t give a single fuck about is ridiculous.
If the reporting is accurate, your data is still sent to Google’s servers for processing. This doesn’t appear to improve privacy, it’s more like an extension of the user surveillance business model that Google has pursued in the past decade.
I’m reminded of when they pinky swore that they weren’t dissecting your data in incognito tabs.
They lied. And nothing ever really happened to them for it. Proof is that they still have the audacity to do anti-consumer shit like this and not even think twice.
Also if I was someone who wanted to run an LLM locally there are many options other than whatever crap google is putting out. You can’t trust them at all with even a morsel of your data.
If someone chooses to do that then yes, it’s a better option, but 4GB of LLM shouldn’t just be shipped in a browser.
If they are doing this without user knowledge, I wouldn’t trust that everything the LLM ingests stays local either, until proven otherwise. Also, not everyone wants to have a local LLM running on their browser eating up 4GB of space.
If I choose to install and use an LLM on my device, sure. That doesn’t mean Google should take it upon themselves to ship one baked into the browser, with no way to opt out or remove it without it being re-downloaded.
Assuming Google will respect privacy is certainly a take.
Say the following single line below out loud:
I am the product.
The model could interact with everything on the PC, with no connection or server overhead and no user consent, and then report back compressed summaries. And who knows, maybe even train the model in a distributed way on users’ interactions with the PC.
Sure, but privacy isn’t the only issue. It still consumes a ton of energy all for basically nothing. So you are paying that electric bill, as well as the wear and tear on your GPU.