WeSearch

The cost of Google's AI defaults and the illusion of choice

9 min read
#ai privacy · #google gemini · #data collection · #user consent · #dark patterns
⚡ TL;DR · AI summary

Google emphasizes user privacy in its AI development, stating that it does not use personal content from Gmail or Drive to train Gemini's foundational models. The AI may still process user data for specific tasks, however, and it retains inputs and outputs that can include personal information. Users can opt out of data collection through settings such as Gemini Apps Activity, but the process is complex and often obscured by confusing interfaces and dark patterns that discourage disengagement. And while Google claims to filter personal data from training sets, users have no transparent way to verify how effectively this is done, raising concerns about the true extent of privacy protection.

Original article
Ars Technica
Read full at Ars Technica →
Opening excerpt (first ~120 words)

AI’ll be watching you: The hidden cost of Google’s AI defaults and the illusion of choice
Google says it respects user privacy in AI, but the reality is not so black and white.
Ryan Whitwam – Apr 30, 2026 7:00 am
Gemini's privacy controls are multifaceted and often confusing. Credit: Aurich Lawson

Many people are hoping—nay, praying—that the potential AI bubble will burst soon. But to hear Google tell it, generative AI is the future, and the company’s products have to change to keep up with the technical reality.

Excerpt limited to ~120 words for fair-use compliance. The full article is at Ars Technica.

