David Bombal cover image

#515: Phishing the AI: Zero-Click NIGHTMARE

David Bombal

00:00

What Is Prompt Injection and How Attackers Evade Guardrails

Pascal defines prompt injection, shows how attackers rephrase queries to bypass LLM safety, and explains intent detection challenges.

Transcript
Play full episode

The AI-powered Podcast Player

Save insights by tapping your headphones, chat with episodes, discover the best highlights - and more!
App store bannerPlay store banner
Get the app