

AI Is Breaking Google
May 29, 2024
Lily Ray, a 15-year SEO veteran, discusses Google's Search Generative Experience and its negative impact on search results. The episode covers AI-generated misinformation, ethical SEO practices, and the need to hold Google accountable.
African Countries Starting With K
- Google's generative AI initially claimed that no African country starts with the letter K.
- It cited a forum post quoting a ChatGPT hallucination, along with a website that mistakenly listed Kenya.
More AI Hallucinations
- Google's AI suggested adding Elmer's glue to pizza to keep the cheese from sliding off.
- It also claimed a dog had played in the NBA and misidentified a sex toy as a workout device.
Google's AI Doesn't Comprehend
- Google's AI doesn't comprehend content; it predicts answers based on patterns in its training data.
- This leads to hallucinations, like recommending eating a rock daily, sourced from a satirical article in The Onion.