Google Chrome deployed a 4GB Gemini Nano AI model to user devices without explicit consent, and the installation persists even after deletion: according to Decrypt's investigation, the browser automatically re-downloads the model if users remove it. The mechanism operates independently of Chrome's visible AI Mode button, which users encounter in the interface. The rollout affects eligible devices running recent Chrome versions, expanding Google's on-device AI infrastructure without transparent user control.
The silent installation marks a shift in how tech giants approach AI deployment. Rather than waiting for user opt-in, Google embeds machine learning models directly into widely used software. The 4GB footprint consumes significant storage and bandwidth on affected machines. Users cannot permanently disable the download through standard Chrome settings, leaving them in a cycle where deletion triggers automatic reinstallation.
The disconnect between the visible AI Mode button and the actual Gemini Nano model raises questions about Google's intent. Users clicking the AI Mode button interact with a different system entirely, leaving the 4GB download operating in the background for undisclosed purposes. This architectural choice mirrors patterns in other Google products, where machine learning models run silently in the background, typically justified by the privacy or performance benefits of on-device processing.
Privacy advocates flagged the practice as problematic. Users lose granular control over what runs on their machines, and the automatic reinstallation prevents meaningful opt-out. Storage-constrained devices face particular strain from the involuntary 4GB allocation.
Google hasn't clearly communicated the model's purpose or function in public documentation, relying instead on quiet deployment across Chrome's billion-user base. The strategy reflects confidence in Chrome's dominance but generates friction around user autonomy and transparency.
THE TAKEAWAY: Google's covert Gemini Nano deployment in Chrome reveals the tech industry's willingness to install infrastructure first and disclose later, prioritizing AI capability expansion over user choice and transparency.
