

8th Layer Insights
Perry Carpenter | N2K Networks
Get ready for a deep dive into what cybersecurity professionals often refer to as the "8th Layer" of security: HUMANS. Welcome to 8th Layer Insights (8Li). This podcast is a multidisciplinary exploration into how the complexities of human nature affect security and risk. Author, security researcher, and behavior science enthusiast Perry Carpenter taps experts for their insights and illumination. Topics include cybersecurity, psychology, behavior science, communication, leadership, and more.
Episodes

Jun 9, 2025 • 14min
The Costume of Truth: Why We Trust the Logo, the Lab Coat, and the Lanyard
Discover the art of 'credibility theater' and how superficial trust cues can enable deception. Explore historical and modern scams that exploit our tendency to trust uniforms and logos more than facts. Delve into the psychology behind this phenomenon and learn to recognize manipulation in everyday life. Uncover the risks posed by advanced AI on authenticity and join in on humorous discussions about creative projects, such as a fake documentary that turns animated ham sandwiches into societal disruptors!

May 28, 2025 • 13min
Feel First, Think Never: Your Emotions Are the Exploit
Explore the dark arts of emotional hijacking, where fear and urgency bypass rational thought. Learn how scammers exploit your emotions to manipulate decisions, from panic phishing to viral outrage. Delve into historical examples, like the Reichstag fire, revealing how strong feelings often cloud critical scrutiny. The discussion also touches on AI's role in ethical dilemmas and manipulative behavior, emphasizing the importance of recognizing and questioning your reactions. A captivating look at the intersection of emotion, decision-making, and technology!

May 20, 2025 • 12min
Too Easy to Be True: The Fluency Trap and the Lie That Slides Right Past You
Explore the fine line between truth and deception as cognitive fluency makes fake information seem all too believable. Discover why grainy photos trick us into trust, while slogan-driven disinformation thrives on simplicity. Learn about the dangers of AI impersonation scams and how even experts can be fooled by staged leaks. Armed with insights on fluency bias, you’ll find strategies to sharpen your critical thinking and guard against misleading narratives in today’s information landscape.

May 14, 2025 • 12min
The Plausibility Effect
Explore the intriguing dynamics of the plausibility effect, where belief trumps truth, leading people to accept even the most ridiculous misinformation. Delve into historical examples and learn strategies to sharpen your critical thinking skills. The conversation also touches on the rise of deepfakes and their role in disinformation campaigns, highlighting the urgent need to understand and combat these modern threats. Get ready to engage with the sneaky cognitive biases that shape our perceptions of reality!

May 5, 2025 • 14min
👁️ Look Here: Why Every Great Deception Starts with Stolen Attention
Explore the intriguing world of attention theft, where con artists and tricksters vie for your focus. Delve into historical deception techniques, like Operation Fortitude, and learn how modern scams employ urgency and authority to distract and manipulate. Discover practical defenses to guard your cognitive space and sharpen your critical thinking skills. This engaging discussion not only uncovers the art of deception but also empowers you to reclaim your attention in an era of constant distractions.

Apr 29, 2025 • 12min
Artifacts of Deception (Deceptive Minds - issue #2)
Explore the intricacies of deception and how narratives shape our perception. Discover why flashy tricks, like deepfakes, can distract from deeper manipulations at play. Learn about historical deceptions that reveal the mechanics behind misleading information. The discussion emphasizes the importance of critical thinking in today's digital age, alongside practical strategies to navigate this minefield. Stay vigilant against AI misuse and uncover ways to better evaluate what you see and hear.

Apr 29, 2025 • 7min
Deceptive Minds (the audio experience): Issue #1
Dive into the fascinating world of deception, where the line between reality and illusion blurs. Explore how both external and internal factors shape our beliefs and perceptions. Discover powerful tools to combat misinformation and self-deception. Gain insight into how to navigate the complexities of being fooled in everyday life. This engaging audio experience challenges you to confront the tricks your mind plays and empowers you to think critically.

Dec 27, 2024 • 38min
The FAIK Files | Holiday Special: AI Safety Update
Note: We're posting Perry's new show, "The FAIK Files", to this feed through the end of 2024. This will give you a chance to get a feel for the new show and subscribe to the new feed if you want to keep following in 2025.
Welcome back to the show that keeps you informed on all things artificial intelligence and natural nonsense. In our holiday episode, Mason opens a rather unique Christmas present from Perry, we invite a special guest to help explain the infamous "Paperclip Maximizer" thought experiment, and we discuss an interesting (and somewhat disturbing) new AI safety paper from Apollo Research.
Want to leave us a voicemail? Here's the magic link to do just that: https://sayhi.chat/FAIK
You can also join our Discord server here: https://discord.gg/cThqEnMhJz
*** NOTES AND REFERENCES ***
An interesting cluster of new AI safety research papers:
Apollo Research: Frontier Models are Capable of In-context Scheming (Dec 5, 2024)
YouTube Video: Apollo Research - AI Models Are Capable Of In Context Scheming Dec 2024
YouTube Video: Cognitive Revolution - Emergency Pod: o1 Schemes Against Users, with Alexander Meinke from Apollo Research
OpenAI o1 System Card (Dec 5, 2024)
Anthropic: Alignment Faking in Large Language Models (Dec 18, 2024)
Anthropic: Sycophancy to subterfuge: Investigating reward tampering in language models (June 17, 2024)
Fudan University: Frontier AI systems have surpassed the self-replicating red line (Dec 9, 2024)
Other Interesting Bits:
The Paperclip Maximizer thought experiment explanation
Theory of Instrumental Convergence
iPhone Game: Universal Paperclips
VoxEU: AI and the paperclip problem
Real Paperclips! 500 Pack Paper Clips (assorted sizes)
OpenAI Announces New o3 Reasoning Model:
OpenAI's "12 Days of Ship-mas" announcement page
YouTube video: OpenAI's announcement of their o3 Model
TechCrunch: OpenAI announces new o3 models
Wired: OpenAI Upgrades Its Smartest AI Model With Improved Reasoning Skills
TechCrunch: OpenAI trained o1 and o3 to ‘think’ about its safety policy
Matthew Berman YouTube video: OpenAI Unveils o3! AGI ACHIEVED!
NewScientist: OpenAI's o3 model aced a test of AI reasoning – but it's still not AGI
Yahoo Finance: OpenAI considers AGI clause removal for Microsoft investment
*** THE BOILERPLATE ***
About The FAIK Files: The FAIK Files is an offshoot project from Perry Carpenter's most recent book, FAIK: A Practical Guide to Living in a World of Deepfakes, Disinformation, and AI-Generated Deceptions.
Get the Book: FAIK: A Practical Guide to Living in a World of Deepfakes, Disinformation, and AI-Generated Deceptions (Amazon Associates link)
Check out the website for more info: https://thisbookisfaik.com
Check out Perry & Mason's other show, the Digital Folklore Podcast:
Apple Podcasts: https://podcasts.apple.com/us/podcast/digital-folklore/id1657374458
Spotify: https://open.spotify.com/show/2v1BelkrbSRSkHEP4cYffj?si=u4XTTY4pR4qEqh5zMNSVQA
Other: https://digitalfolklore.fm
Want to connect with us? Here's how:
Connect with Perry:
Perry on LinkedIn: https://www.linkedin.com/in/perrycarpenter
Perry on X: https://x.com/perrycarpenter
Perry on BlueSky: https://bsky.app/profile/perrycarpenter.bsky.social
Connect with Mason:
Mason on LinkedIn: https://www.linkedin.com/in/mason-amadeus-a853a7242/
Mason on BlueSky: https://bsky.app/profile/pregnantsonic.com
Learn more about your ad choices. Visit megaphone.fm/adchoices

Dec 20, 2024 • 59min
The FAIK Files | The Butcher Will Scam You Now
Note: We're posting Perry's new show, "The FAIK Files", to this feed through the end of the year. This will give you a chance to get a feel for the new show and subscribe to the new feed if you want to keep following in 2025.
Welcome back to the show that keeps you informed on all things artificial intelligence and natural nonsense. Warning: today's episode gets a bit dark as we chat with seasoned prosecutor and founder of Operation Shamrock, Erin West, about a devastating combination of attacks known as "Pig Butchering" scams. We go deep into how they work and what we can do about them.
Want to leave us a voicemail? Here's the magic link to do just that: https://sayhi.chat/FAIK
You can also join our Discord server here: https://discord.gg/cThqEnMhJz
*** NOTES AND REFERENCES ***
Learn more about Erin West:
Erin's LinkedIn Profile
Operation Shamrock
Pig Butchering Scams:
CNN Story featuring Erin West: Killed by a scam: A father took his life after losing his savings to international criminal gangs. He’s not the only one
CNN Story: Hear how this man lost $1M in a 'pig butchering' crypto scam
CNN Story: Myanmar-based gangs force trafficking victims to scam Americans online
YouTube video: John Oliver episode
FBI Internet Crime Complaint Center (IC3)
Ok, doomer! Let's talk P(doom):
NY Times article: Silicon Valley Confronts a Grim New A.I. Metric
FastCompany article: P(doom) is AI's latest apocalypse metric. Here's how to calculate your score
Wikipedia entry on P(doom)
PauseAI P(doom) records

Dec 13, 2024 • 53min
The FAIK Files | AI Gone Wild: Worrisome Leaks, Misguided Conspiracies, and More
Join an engaging discussion about AI's surprising roles, like how ChatGPT helped solve tech issues in a home studio. Dive into the mystery of 'David Mayer,' whose name sparked wild conspiracy theories and raised important privacy questions. Explore Tencent's innovative open-source video AI that's shaking up the landscape. And don't miss the latest drama of leaked OpenAI models, adding fuel to the AI dumpster fire. This mix of humor and insights makes for a fascinating exploration of modern tech and its quirks!