

Hiding in plain sight with vibe coding.
23 snips Jun 14, 2025
Ziv Karliner, Co-Founder and CTO of Pillar Security, dives into the world of AI-driven coding tools like GitHub Copilot and Cursor. He reveals alarming new vulnerabilities that hackers can exploit through Vibe Coding. Specifically, he discusses the 'Rules File Backdoor,' a technique that allows attackers to embed malicious code within seemingly legitimate instructions. This highlights the urgent need for developers to adopt new security strategies as they navigate the dual-edged sword of rapid AI adoption in software development.
AI Snips
Chapters
Transcript
Episode notes
Rule Files as Attack Vectors
- Rule files onboard AI coding assistants with project-specific best practices and context.
- Attackers can embed harmful instructions in these files, creating backdoors in generated code.
Marketplace Rule File Attack Example
- An attacker can submit seemingly legitimate rule files with hidden Unicode characters to marketplaces.
- These files then cause AI assistants to insert malicious code, fooling developers and bypassing review.
AI Used to Mislead Developers
- Attack instructions can use AI's intelligence to mislead developers about malicious code additions.
- This challenges the concept of humans effectively supervising AI actions when the attack is invisible.