Disclosure checklist, with file paths and a contrast case

Local AI in your browser, the silent install: a disclosure checklist

In May 2026 Alexander Hanff published forensic notes showing that Google Chrome silently writes a roughly 4 GB Gemini Nano weights file into your user profile, with no install-time prompt, and re-downloads it if you delete it. Edge ships the same pipeline. If you arrived here from an X thread about that, the practical part is two terminal lines below. The interesting part is the four-question disclosure test you can run on any tool that says the words local AI, and what a clean answer to those four questions looks like in a real app on the same machine.

Matthew Diakonov

Direct answer (verified 2026-05-13)

If you use Chrome on macOS and your machine meets the hardware bar, yes, your browser has very likely written a roughly 4 GB on-device AI model into your user profile without prompting you. The path is ~/Library/Application Support/Google/Chrome/<Profile>/OptGuideOnDeviceModel/weights.bin. Microsoft Edge ships the same Optimization Guide pipeline in its own profile folder. Brave Leo does not do this; Leo's local model lane is opt-in via Bring Your Own Model. To stop Chrome from re-downloading after a delete, you have to open Chrome Settings, find the on-device AI section, and toggle it off. Source for these facts: Hanff's published forensics at thatprivacyguy.com, corroborated by Snopes and Malwarebytes.

Step 1, check your disk

Two commands. The first asks the file system whether the folder exists for any of your Chrome profiles and how big it is. The second confirms the actual weights file. No accounts, no extensions, no third-party tools. If both come back empty, your machine either never qualified for the rollout or you already disabled the toggle.

Local check, no Chrome restart required
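
A minimal sketch of the two checks, assuming the default macOS profile layout; the second command's Default path is the single-profile case, so swap in Profile 1, Profile 2, and so on if you use multiple Chrome profiles.

    # 1. Any profile with the on-device model folder, and its size
    du -sh "$HOME/Library/Application Support/Google/Chrome/"*/OptGuideOnDeviceModel 2>/dev/null

    # 2. Confirm the weights file itself (single-profile install)
    ls -lh "$HOME/Library/Application Support/Google/Chrome/Default/OptGuideOnDeviceModel/weights.bin"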

A note on the Edge path: it includes a literal space in Microsoft Edge, so when you adapt the command for Edge make sure your shell quoting preserves the space. Many copy-pasted Chrome guides break at this exact step on macOS.
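
For Edge, the same check looks like this. A sketch only, assuming Edge keeps its user data under the default ~/Library/Application Support/Microsoft Edge path mentioned in the FAQ below.

    # The literal space in "Microsoft Edge" must stay inside the double quotes
    du -sh "$HOME/Library/Application Support/Microsoft Edge/"*/OptGuideOnDeviceModel 2>/dev/null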

Step 2, what the file actually is, and what to do about it

weights.bin is the serialized Gemini Nano model. It is what Chrome's Built-in AI APIs are supposed to call when a page or Chrome surface asks for a local-only completion. Hanff's forensics noted that some Chrome surfaces still route through Google's cloud even when the local model is resident on disk, so users were paying the storage cost without consistently getting the local-only privacy benefit. That decoupling, model on disk versus request stays local, is the part that makes a disclosure gap feel sharper than it would in a clean opt-in design.

To actually remove and prevent re-download, the trash bin is not enough. The Optimization Guide will refetch on the next eligible session. The full sequence is: Chrome Settings, AI or System subsection (the label has moved between Chrome 125 and 130), find the on-device AI toggle, switch it off, restart Chrome, then remove the folder. On older Chrome builds where the toggle has not landed yet, the workaround is chrome://flags, search for Optimization Guide On Device Model, set to Disabled, restart Chrome, then trash the folder. Edge has no consumer-facing toggle today; the enterprise lever is the Windows GenAILocalFoundationalModelSettings policy.
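
Once the toggle (or the chrome://flags entry on older builds) is off and Chrome has restarted, the folder removal itself is one line. A sketch, not an official cleanup path; run it only after the toggle change, or Chrome will refetch the folder on the next eligible session.

    # Remove the on-device model folder from every Chrome profile
    rm -rf "$HOME/Library/Application Support/Google/Chrome/"*/OptGuideOnDeviceModel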

Step 3, the four-question disclosure test

Any tool that uses the words local AI in its marketing can be put through four small questions. The questions are not opinions; they are observable facts about how the tool ships and runs. The Chrome-on-Mac case fails three of the four. A well-disclosed local AI tool on the same machine should pass all four. Run them in order:

Question 1

Does the tool ship a multi-gigabyte model weight as part of its normal operation, and if so, does the user get a prompt before it hits disk?

Chrome fails this. The roughly 4 GB write happens without any install-time dialog, just background eligibility checks against free disk and RAM. A passing answer is either no bundled model (the tool brings glue, the user brings the model) or yes with a macOS dialog the OS draws, not one the app draws.

Question 2

If I delete the artifact, does it come back?

Chrome fails this too. weights.bin refetches after deletion. A passing answer is no, or only after the user takes a clearly named action to opt back in.
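
If you want to observe the refetch yourself rather than take Hanff's word for it, a crude version of his test is to delete the file and poll for it to reappear while you browse. This sketch assumes a single Default profile and the on-device toggle still on; Hanff's own method tailed fseventsd events, which is more precise.

    W="$HOME/Library/Application Support/Google/Chrome/Default/OptGuideOnDeviceModel/weights.bin"
    rm -f "$W"
    # Poll once a minute until Chrome pulls the model back
    while [ ! -f "$W" ]; do sleep 60; done
    echo "refetched: $(ls -lh "$W")"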

Question 3

Where can I read what the tool does, in code, in five minutes?

Chromium is open, but the relevant behavior is spread across OptimizationGuide internals and is not a five-minute read. A passing answer is a single file or a short page that names every permission, every model endpoint, and every persistent artifact. MIT-licensed apps with a single permissions module are the easy case here.

Question 4

Where can a non-technical reader see, in plain English, what the tool can and cannot do?

A passing answer is a public, linked-from-the-product safety page that enumerates capabilities and limits. Chrome's earlier privacy claim about Gemini Nano was quietly removed per Decrypt's May 2026 reporting, which is the opposite move. The fix is structural: link from the in-app permission screen out to a public page that lives at a stable URL.

The contrast case: what a clean local AI tool on the same Mac looks like

Fazm is a macOS app I work on. It is a computer-use agent that drives the browser, writes code, edits documents, and runs Google Apps. It is in the same category as the on-device AI surface Chrome is shipping, in the sense that it runs locally, sees the screen, and takes actions. The interesting comparison is not the feature set, it is the disclosure shape, because both live on the same disk and both need permission to be useful. Here is the side by side, anchored on observable, file-level facts:

Chrome on-device AI versus Fazm (consumer disclosure shape)

Where the model lands on disk
- Chrome: OptGuideOnDeviceModel/weights.bin inside the Chrome profile, no prompt before write
- Fazm: no bundled local model; users plug in their own Claude, Codex, or local endpoint they already trust

Consent surface before download
- Chrome: none at install; an opt-out toggle in Settings began rolling out Feb 2026
- Fazm: macOS TCC dialogs for microphone and screen recording on first launch, drawn by the OS, not the app

Re-download if you delete the file
- Chrome: yes, Chrome refetches weights.bin on the next eligible session
- Fazm: no persistent model files to refetch; revoking permission in System Settings stops the app cold

Source you can audit
- Chrome: Chromium is open, but the on-device model fetcher behavior is buried in OptimizationGuide internals
- Fazm: MIT-licensed app source on GitHub; PermissionsPage.swift is one file, ~270 lines, readable in five minutes

Public doc that says what is local and what is not
- Chrome: the privacy claim about Gemini Nano was quietly removed from Chrome docs per Decrypt's May 2026 reporting
- Fazm: a public /safety page enumerating what the app can and cannot do, linked from the in-app Permissions screen

The Fazm column is not aspirational. Every row maps to a file you can read or a dialog you can see. The permissions screen lives at Desktop/Sources/MainWindow/Pages/PermissionsPage.swift in the open source repo. The public safety page is at fazm.ai/safety, linked directly from the in-app permission screen.

Scale, briefly

One person finding a 4 GB file in a profile folder is a footnote. Multiply by a Chrome install base in the billions and the same write becomes a real number. Hanff worked out the distribution side using public Chrome user estimates and standard datacenter carbon math:

~4 GB

Size of weights.bin written into your Chrome profile, measured on Apple Silicon in Hanff's report

GWh-scale

Energy estimate for distributing the model once to a billion Chrome devices (Hanff, May 2026)

Tonnes of CO2e

Implied carbon for the same one-shot distribution, before any re-download cycles

Numbers are Hanff's; they assume a one-time distribution without accounting for delete-and-refetch cycles, which on the current Chrome behavior would push the real number higher.
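
For readers who want to reproduce the shape of that math rather than the exact figures, here is the back-of-envelope version. The 4 GB payload and the billion-device install base come from the article; the 0.01 kWh per GB transfer intensity is an illustrative placeholder, not Hanff's number, and published estimates of network energy intensity vary widely.

    # Back-of-envelope distribution math (illustrative assumptions, see note above)
    awk 'BEGIN {
      gb_per_device = 4; devices = 1e9; kwh_per_gb = 0.01
      total_gb = gb_per_device * devices
      printf "data moved once: %.0f EB\n", total_gb / 1e9
      printf "energy at %.2f kWh/GB: %.0f GWh\n", kwh_per_gb, (total_gb * kwh_per_gb) / 1e6
    }'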

"Chrome silently downloads a ~4 GB Gemini Nano weights file to your device. If you delete it, Chrome re-downloads it. There is no clear consent flow."
— Alexander Hanff, privacy researcher, May 2026 forensic report

What about the law

The argument Hanff and follow-up coverage make is narrow and worth stating cleanly. Article 5(3) of the ePrivacy Directive, 2002/58/EC, requires prior informed consent before storing data on a user's terminal equipment, with a tightly drawn strictly necessary exception. A 4 GB language model is not strictly necessary to load a web page. Article 5(1) of the GDPR requires lawfulness, fairness, and transparency. Article 25 requires data protection by design and by default. Opt-out toggles added eight months after rollout do not retroactively create consent for the install that already happened. Whether a regulator brings a formal case is a separate question, but the analytical gap between silent install and Article 5(3) is not subtle.

For consumer apps that ship outside the browser, the same principles push in a useful direction even when the apps fall outside the ePrivacy directive's exact scope. The cheapest way to stay safe is structural: route every persistent artifact through a macOS-drawn permission dialog, document what each one does in a public file, and make the in-app screen link to that file. None of that is hard. It is just a different default than the one Chrome picked.

Want a local AI tool that passes all four disclosure questions?

A short call walks through your stack, what you would replace, and what disclosure shape would survive a privacy-team review.

Frequently asked questions

Where exactly does Chrome put the 4 GB file on macOS?

Inside your Chrome user data folder, in a subfolder called OptGuideOnDeviceModel, in a file called weights.bin. On a default macOS install with one profile the full path is ~/Library/Application Support/Google/Chrome/Default/OptGuideOnDeviceModel/weights.bin. If you use Chrome profiles, each one gets its own copy under Profile 1, Profile 2, and so on. The file is the Gemini Nano weights, not user data, but it still lives in your user profile and counts against your free disk.

Will Chrome download it again if I just trash the file?

Yes. Chrome's Optimization Guide subsystem treats the absent file as missing-state and re-fetches it on the next eligible session. Privacy researcher Alexander Hanff documented this in May 2026 by deleting weights.bin on a clean Apple Silicon profile and tailing fseventsd kernel events as Chrome silently pulled it back. To stop the re-download you have to disable the on-device AI surface in Settings, which is the toggle Google began rolling out in February 2026. Removing the file without flipping the toggle is a temporary cleanup at best.

Does Microsoft Edge do the same thing?

Edge is Chromium-based and inherits the same Optimization Guide pipeline. It stores its own copy of the on-device model in its own OptGuideOnDeviceModel folder inside the Edge user data dir. On Windows IT admins can block this with the GenAILocalFoundationalModelSettings group policy. Microsoft has not pushed a public consumer disclosure or opt-in dialog for this; the model is shipped on the same silent-install pattern as Chrome.

Is Brave Leo doing this too?

No. Brave Leo's local model lane is opt-in via the Bring Your Own Model setting. The user goes to Settings, Leo, Add New Model, and types in an Ollama endpoint or another local server. Nothing gets downloaded automatically. Brave's docs explicitly warn users that local models still have their own privacy implications. This is the cleanest disclosure pattern among mainstream browsers in 2026.
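
For concreteness, the kind of endpoint Leo's Bring Your Own Model setting expects is a local server you started yourself. A sketch assuming Ollama, whose default endpoint is http://localhost:11434; the model name is just an example.

    ollama pull llama3   # fetch a model you chose, explicitly
    ollama serve         # serve it at http://localhost:11434, the address you paste into Leo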

Is the on-device model the thing that powers Help me write?

Partially. The Gemini Nano weights are what the Built-in AI APIs are supposed to call when a website or Chrome surface asks for a local-only completion. Hanff's forensics showed that some Chrome surfaces, including parts of Help me write, still route through Google's cloud even when the local model is fully resident on disk, so users were paying the storage cost without getting the promised local-only privacy benefit in every case. The decoupling of model-on-disk and request-stays-local is the part that makes the disclosure problem worse.

What did Google say about consent?

Google's public response in early 2026 was that they added an opt-out toggle in Chrome Settings to disable and remove the model. Critics, including Hanff and follow-up coverage in The Register, Decrypt, and Malwarebytes, pointed out that this is opt-out, not opt-in, so it does not satisfy ePrivacy Directive Article 5(3), which requires prior informed consent before storing data on terminal equipment. Decrypt also reported that Google quietly removed an earlier privacy claim from its Chrome AI docs. The toggle helps individuals reclaim disk and stop the re-download, but it does not fix the consent question.

How does Fazm handle the same situation, since it also runs locally?

Fazm is a Mac app you download from fazm.ai. It does not ship with a multi-gigabyte local model bundled inside. The model that powers reasoning is your own: a Claude API key, a Codex OAuth login, or a local endpoint you configured yourself. The only models Fazm runs locally out of the box are tiny voice components for transcription, and even those sit behind the standard macOS microphone permission. On first launch Fazm forces the user through the macOS TCC permission dialogs for microphone and screen recording, and the Permissions page in the app links to a public safety page that documents what each permission does. The contrast is structural, not just messaging: a silent profile-folder write versus a system permission dialog the OS draws.

I am on macOS. What is the one command that tells me if I have the file?

Run du -sh "$HOME/Library/Application Support/Google/Chrome/"*/OptGuideOnDeviceModel 2>/dev/null in Terminal. If any line prints a size around 4G followed by a path, you have the model on disk. To check Edge, swap the path for ~/Library/Application Support/Microsoft Edge/. If nothing prints, your profile either never met the hardware bar or you already disabled the on-device AI surface. The hardware bar Google publishes is roughly 22 GB free disk, 16 GB RAM, and 4 CPU cores or more.
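
If you want to check the hardware bar itself rather than the file, the published thresholds map roughly to three stock macOS commands. A sketch only; Google's exact eligibility logic is internal, so treat this as an approximation of the published numbers.

    df -h /                # free disk on the startup volume (bar: ~22 GB free)
    sysctl -n hw.memsize   # RAM in bytes (16 GB = 17179869184)
    sysctl -n hw.ncpu      # logical CPU cores (bar: 4 or more)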
