
Using AI at Work: AI in the Workplace & Generative AI for Business Leaders
Episode 82: Using AI at Work to Win in Search and LLM Discovery with Zak Ali
Dec 15, 2025
Zak Ali, General Manager at Finder US, touches on the shifting landscape of search as AI technology evolves. He emphasizes that SEO is far from dead, explaining how LLMs like ChatGPT select sources based on credibility signals. Zak reveals Finder's strategy of focusing on long-tail queries to boost visibility in AI searches and shares insights on fostering a culture of AI experimentation within non-technical teams. With a mix of practical tips and a forward-looking perspective, he highlights the significant impact of AI on content strategy and user discovery.
AI Snips
How LLMs Choose Sources
- Large language models surface sources they find most credible based on training data and web signals like editorial standards and external citations.
- If you rank on page one of search engines via good SEO, you become a candidate for being cited inside LLM answers.
Triangulate Model Queries With Inspect Tools
- Inspect ChatGPT's network requests while it runs a web search (inspect element → Network tab → copy the response JSON) to see the exact search terms it issued; a parsing sketch follows this list.
- Use those terms to protect existing citations and to target the additional keywords the model queries.
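As a rough illustration of the step above, here is a minimal Python sketch for pulling query-like strings out of JSON copied from the Network tab. The file name and the key names ("query", "search_query", "q") are assumptions; ChatGPT's internal payload format is undocumented and can change, so the script walks the whole JSON tree rather than relying on a fixed schema.

```python
# Minimal sketch, not an official ChatGPT API. It assumes you saved the JSON
# copied from the browser's Network tab to a local file. Key names below are
# hypothetical guesses at where search terms might live in the payload.
import json
from typing import Any, Iterator

QUERY_KEYS = {"query", "search_query", "q"}  # guesses at likely key names


def find_queries(node: Any) -> Iterator[str]:
    """Recursively yield string values stored under query-like keys."""
    if isinstance(node, dict):
        for key, value in node.items():
            if key.lower() in QUERY_KEYS and isinstance(value, str):
                yield value
            else:
                yield from find_queries(value)
    elif isinstance(node, list):
        for item in node:
            yield from find_queries(item)


if __name__ == "__main__":
    # "chatgpt_network_capture.json" is a placeholder name for your saved copy.
    with open("chatgpt_network_capture.json", encoding="utf-8") as f:
        payload = json.load(f)
    for term in sorted(set(find_queries(payload))):
        print(term)
```

Running it against a saved capture prints a de-duplicated list of terms you can compare against the keywords your pages already target.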
Use Recency As A Trust Signal
- Add clear recency signals, such as visible updated dates and "2025" in titles where appropriate, to increase trust for LLMs; see the markup sketch after this list.
- Keep page information accurate, because LLMs weigh recency to reduce hallucination risk.
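One common way to expose an updated date to crawlers is schema.org Article markup with datePublished and dateModified. The episode discusses recency signals only in general terms, so treat this as an illustrative sketch rather than Finder's documented method; the headline, URL, and dates below are hypothetical.

```python
# Sketch of emitting JSON-LD Article markup that advertises an update date.
# The schema.org approach is a common SEO practice, not something the episode
# prescribes; all page-specific values here are made up for illustration.
import json
from datetime import date


def article_jsonld(headline: str, url: str, published: date, modified: date) -> str:
    """Return a JSON-LD script block carrying publish and update dates."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "url": url,
        "datePublished": published.isoformat(),
        "dateModified": modified.isoformat(),
    }
    return ('<script type="application/ld+json">\n'
            + json.dumps(data, indent=2)
            + "\n</script>")


# Hypothetical page: the headline carries the year, the markup carries the dates.
print(article_jsonld(
    headline="Best High-Yield Savings Accounts (Updated December 2025)",
    url="https://www.example.com/savings-accounts",
    published=date(2024, 3, 1),
    modified=date(2025, 12, 1),
))
```

Keeping the visible "Updated" label and the machine-readable dateModified in sync ties the recency signal to the accuracy point in the second bullet.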
