
The Shifting Privacy Left Podcast

Latest episodes

Aug 22, 2023 • 1h 4min

S2E24: "Cloud-Native Privacy Engineering via DevPrivOps" with Elias Grünewald (TU Berlin)

This week’s guest is Elias Grünewald, Privacy Engineering Research Associate at Technical University of Berlin, where he focuses on cloud-native privacy engineering, transparency, accountability, distributed systems, & privacy regulation. In this conversation, we discuss the challenge of designing privacy into modern cloud architectures; how shifting left into DevPrivOps can embed privacy within agile development methods; how to blend privacy engineering & cloud engineering; the Hawk DevOps framework; and what the Shared Responsibilities Model for cloud lacks.

Topics Covered:
- Elias' courses at TU Berlin: "Programming Practical Privacy: Web-based Application Engineering & Data Management" & "Advanced Distributed Systems Prototyping: Cloud-native Privacy Engineering"
- Elias' 2022 paper, "Cloud Native Privacy Engineering through DevPrivOps" - his approach, findings, and framework
- The Shared Responsibilities Model for cloud and how to improve it to account for privacy goals
- Defining DevPrivOps & how it works with agile development
- How DevPrivOps can enable formal privacy-by-design (PbD) & default strategies
- Elias' June 2023 paper, "Hawk: DevOps-Driven Transparency & Accountability in Cloud Native Systems," which helps data controllers align cloud-native DevOps with regulatory requirements for transparency & accountability
- Engineering challenges in determining the details of personal data processing when responding to access & deletion requests
- A deep dive into the Hawk 3-phase approach for implementing privacy into each DevOps phase: Hawk Release, Hawk Operate & Hawk Monitor
- How the open-source project TOUCAN is documenting conceptual best practices for corresponding phases in the SDLC, and a call for collaboration
- How privacy engineers can convince their management to adopt a DevPrivOps approach

Read Elias' papers, talks & projects:
- Cloud Native Privacy Engineering through DevPrivOps
- Hawk: DevOps-driven Transparency and Accountability in Cloud Native Systems
- CPDP Talk: Privacy Engineering for Transparency & Accountability
- TILT: A GDPR-Aligned Transparency Information Language & Toolkit for Practical Privacy Engineering (see the short sketch after these notes)
- TOUCAN

Guest Info:
- Connect with Elias on LinkedIn
- Contact Elias at TU Berlin
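To make the "transparency as code" idea concrete: TILT (linked above) defines a machine-readable format for the information a GDPR-compliant privacy policy must convey. Below is a loose, hypothetical sketch of that idea in Python; the field names are invented for illustration and do not reproduce the actual TILT schema.

```python
import json

# Hypothetical, TILT-inspired transparency record. Field names are
# invented for this sketch and are NOT the actual TILT schema.
transparency_record = {
    "controller": {"name": "Example GmbH", "country": "DE"},
    "data_disclosed": [
        {
            "category": "email address",
            "purposes": ["account management"],
            "legal_basis": "GDPR Art. 6(1)(b)",
            "storage_duration": "P2Y",  # ISO 8601 duration: two years
            "recipients": ["internal CRM"],
        }
    ],
    "data_subject_rights": ["access", "rectification", "erasure"],
}

# The DevPrivOps idea: keep a record like this in version control next to
# the service code and validate it in CI whenever data flows change.
print(json.dumps(transparency_record, indent=2))
```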
Aug 15, 2023 • 46min

S2E23: "Navigating the Privacy Engineering Job Market" with George Ratcliffe (Stott & May)

This week, my guest is George Ratcliffe, Head of the Privacy GRC & Cryptography Executive Search Practice at recruitment firm Stott & May. In this conversation, we discuss the current market climate & hiring trends for technical privacy roles; the need for higher technical capabilities across the industry; pay ranges within different technical privacy roles; and George’s tips and tools for applicants interested in entering and/or transitioning into the privacy industry.

Topics Covered:
- Whether hiring trends are picking back up for technical privacy roles
- The three 'Privacy Engineering' roles that companies seek to hire for, and their core competencies: Privacy Engineer, Privacy Software Engineer & Privacy Research Engineer
- The demand for 'Privacy Architects'
- IAPP's new Privacy Engineering infographic & whether it maps to how companies approach hiring
- Overall hiring trends for privacy engineers & technical privacy roles
- Advice for technologists who want to grow into Privacy Engineer, Researcher, or Architect roles
- Capabilities that companies need or want in candidates but can't seem to find, & whether some roles are harder to fill because of a lack of candidates & skill sets
- Whether a PhD is necessary to become a 'Privacy Research Engineer'
- Typical pay ranges across technical privacy roles: Privacy Engineer, Privacy Software Engineer, Privacy Researcher & Privacy Architect
- Differences in pay for a Privacy Engineering Manager vs. an Individual Contributor (IC), and the web apps for crowd-sourced info about roles & salary ranges
- Whether companies seek to fill entry-level positions for technical privacy roles
- How privacy technologists can stay up to date on hiring trends

Resources Mentioned:
- Check out episode S2E11: Lessons Learned as a Privacy Engineering Manager with Menotti Minutillo (ex-Twitter & Uber)
- IAPP Defining Privacy Engineering Infographic
- Check out Blind and Levels for compensation benchmarking

Guest Info:
- Connect with George on LinkedIn
- Reach out to Stott & May for your privacy recruiting needs
Aug 1, 2023 • 37min

S2E22: Why You Need an 'Outside-In' Approach to Privacy Risk Monitoring with Sanjay Saini (Privaini)

Get ready for an eye-opening conversation with Sanjay Saini, the founder and CEO of Privaini, a groundbreaking privacy tech company. Sanjay's journey is impressive not only for his role in creating high-performance teams that have built entirely new product categories, but also for the invaluable lessons he learned from his grandfather about the pillars of successful companies: trust and human connections. In our discussion, Sanjay shares how Privaini is raising the privacy bar by constructing the world's largest repository of company privacy policies and practices. It's a fascinating dive into the future of privacy risk management.

Imagine being able to gain full coverage of your external privacy risks with continuous monitoring. Wouldn't that revolutionize your approach to risk management? That's exactly what Privaini is doing. Sanjay explains how Privaini uses AI to analyze, standardize, and derive meaningful "privacy views" and insights from vast volumes of publicly available data. Listen in to understand how Privaini's innovative approach is helping companies gain visibility into their entire business network to make quicker, more informed decisions.

Topics Covered:
- What motivated Sanjay to found companies that bring trusted systems to market, and why he founded Privaini to focus on continuous privacy risk monitoring
- How to quantitatively analyze & monitor privacy risk throughout an entire 'business network', and what Sanjay means by 'business network'
- Which stakeholders benefit from using the Privaini platform
- The benefits of calculating a "quantified privacy risk score" for each company in your business network to effectively monitor privacy risk
- How Privaini leverages AI to discover external data about companies' privacy posture, and why it must be used in a responsible and deliberate way
- Why effective privacy risk monitoring of a company's business network requires an "outside-in" approach
- The importance of continuous monitoring & the benefits of using an 'outside-in' approach
- What it takes to set up an enterprise's network with Privaini for full coverage of external privacy risks
- The recent Criteo fines and how Privaini could have helped Criteo surface privacy risks about its vendors
- Why Sanjay believes learning about the "right side" of the equation is necessary in order to "shift privacy left"

Guest Info:
- Connect with Sanjay on LinkedIn
- Learn more about Privaini
Jul 11, 2023 • 56min

S2E21: Containing Big Tech, Federal Privacy Law, & Investing in Privacy Tech with Tom Kemp (Kemp Au Ventures)

This week’s guest is Tom Kemp: author; entrepreneur; former Co-Founder & CEO of Centrify (now called Delinea), a leading cybersecurity cloud provider; and a Silicon Valley-based seed investor and policy advisor. Tom led campaign marketing efforts in 2020 to pass California Proposition 24, the California Privacy Rights Act (CPRA), and is currently co-authoring the California Delete Act bill. In this conversation, we discuss chapters within Tom’s new book, Containing Big Tech: How to Protect Our Civil Rights, Economy, and Democracy; how big tech is using AI to feed the attention economy; what should go into a U.S. federal privacy law and how it should be enforced; and a comprehensive look at some of Tom’s privacy tech investments.

Topics Covered:
- Tom's new book, Containing Big Tech: How to Protect Our Civil Rights, Economy and Democracy
- How and why Tom’s book is centered around data collection, artificial intelligence, and competition
- U.S. state privacy legislation that Tom helped get passed & what he's working on now, including the CPRA, the California Delete Act, & the Texas Data Broker Registry
- Whether there will ever be a U.S. federal, omnibus privacy law; what should be included in it; and how it should be enforced
- Tom's work as a privacy tech and security tech seed investor with Kemp Au Ventures, and what inspires him to invest in a startup or not
- What inspired Tom to invest in PrivacyCode, Secuvy & Privaini
- Why a strong team and market size are things Tom looks for when investing
- The importance of designing for privacy from a 'user-interface perspective' so that it’s consumer friendly
- How consumers looking to trust companies are driving a shift-left movement
- Tom's advice for how companies can better shift left in their orgs & within their business networks

Resources Mentioned:
- The California Consumer Privacy Act (as amended by the CPRA)
- The California Delete Act

Guest Info:
- Follow Tom on LinkedIn
- Kemp Au Ventures
- Pre-order Containing Big Tech: How to Protect Our Civil Rights, Economy, and Democracy
Jul 5, 2023 • 43min

S2E20: Location Privacy, Data Brokers & Privacy Datasets with Jeff Jockisch

This week’s guest is Jeff Jockisch, Partner at Avantis Privacy and co-host of the weekly LinkedIn Live event Your Bytes = Your Rights, a town-hall-style discussion around ownership, digital rights, and privacy. Jeff is currently a data privacy researcher at PrivacyPlan, where he focuses specifically on privacy datasets. In this conversation, we delve into current risks to location privacy; how precise location data really is; how humans can have more control over their data; and what organizations can do to protect people's data privacy. For access to a dataset of data resources and privacy podcasts, check out Jeff’s robust database; the Shifting Privacy Left podcast was recently added.

Topics Covered:
- Jeff’s approach to creating privacy datasets and what "gaining insight into the privacy landscape" means
- How law enforcement can be a threat actor to someone’s privacy, using the example of Texas' abortion law
- Whether data brokers are getting exact location information or are inferring someone’s location
- Why geolocation brokers had not considered themselves data brokers
- Why anonymization is insufficient for location privacy (see the short sketch after these notes)
- How 'consent theater' coupled with location leakage is an existential threat to our privacy
- How people can protect themselves from having data collected and sold by data and location brokers
- Why app permissions should be more specific when notifying users about personal data collection and use
- How Apple and Android devices treat the Mobile Ad ID (MAID) differently, and how that affects your historical location data
- How companies can protect data by using broader geolocation information instead of precise geolocation information
- More information about Jeff's LinkedIn Live show, Your Bytes = Your Rights

Resources Mentioned:
- Avantis Privacy
- PrivacyPlan
- Threat modeling episode with Kim Wuyts
- "Your Bytes = Your Rights" LinkedIn Live
- The California Delete Act
- Privacy Podcast Database
- Containing Big Tech

Guest Info:
- Follow Jeff on LinkedIn
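To make the anonymization point above concrete: research on mobility data has repeatedly shown that a handful of spatio-temporal points is enough to uniquely identify most individuals, so stripping names from a location trace does little. Here is a toy sketch (all data invented for illustration):

```python
from collections import Counter

# Toy "anonymized" trace: timestamped, coarsened (lat, lon) points with
# the name stripped. All values are invented.
trace = [
    ("Mon 08:10", (40.71, -74.00)), ("Mon 18:40", (40.75, -73.98)),
    ("Tue 08:05", (40.71, -74.00)), ("Tue 19:02", (40.75, -73.98)),
    ("Wed 08:12", (40.71, -74.00)),
]

# The two most-visited locations are usually home and work; that pair is
# close to unique per person, so the "anonymous" trace re-identifies.
home_work = [loc for loc, _ in Counter(loc for _, loc in trace).most_common(2)]
print("quasi-identifier:", home_work)
```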
Jun 27, 2023 • 45min

S2E19: Privacy Threat Modeling - Mitigating Privacy Threats in Software with Kim Wuyts (KU Leuven)

This week's guest is Kim Wuyts, Senior Postdoctoral Researcher at the DistriNet Research Group in the Department of Computer Science at KU Leuven. Kim is one of the leading minds behind the development and extension of LINDDUN, a privacy threat modeling framework that helps mitigate privacy threats in software systems. In this conversation, we discuss threat modeling based on the Threat Modeling Manifesto, which Kim co-authored; the benefits of using the LINDDUN privacy threat modeling framework; and how to bridge the gap between privacy-enhancing technologies (PETs) in academia and the commercial world.

Topics Covered:
- Kim's career journey & why she moved into threat modeling
- The definition of 'threat modeling', who should threat model, and what's included in the Threat Modeling Manifesto
- The connection between threat modeling & a 'shift left' mindset / strategy
- Design patterns that benefit threat modeling & anti-patterns that inhibit it
- Benefits of using the LINDDUN privacy threat modeling framework for mitigating privacy threats in software, including the 7 'privacy threat types' (listed in the sketch after these notes), associated 'privacy threat trees', and examples
- How 'privacy threat trees' refine each threat type into concrete threat characteristics, examples, criteria & impact info
- Benefits of & differences between LINDDUN GO and LINDDUN PRO
- How orgs can combine threat modeling approaches with PETs to address privacy risk
- Kim's work as Program Chair for the International Workshop on Privacy Engineering (IWPE), highlighting some anticipated talks
- The overlap of privacy & AI threats, and Kim's recommendation of The Privacy Library of Threats 4 AI ("PLOT4AI") threat modeling card deck
- Recommended resources for privacy threat modeling, privacy engineering & PETs
- How the LINDDUN model & methodologies have been adopted by global orgs
- How to bridge the gap between the academic & commercial worlds to advance & deploy PETs

Resources Mentioned:
- The Threat Modeling Manifesto
- LINDDUN Privacy Threat Model
- STRIDE threat model
- Threat Modeling Connect Community
- Elevation of Privilege card game
- PLOT4AI (privacy & AI threat modeling) card deck
- International Workshop on Privacy Engineering (IWPE)

Guest Info:
- Follow Kim on LinkedIn
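For quick reference, the seven LINDDUN threat types mentioned above spell out the framework's name. A minimal sketch of how a team might keep them at hand during an elicitation session (the data structure is illustrative, not part of the official LINDDUN tooling):

```python
# The seven LINDDUN privacy threat types (descriptions paraphrased).
LINDDUN_THREAT_TYPES = [
    ("Linkability", "two items of interest can be linked to the same person"),
    ("Identifiability", "an individual can be singled out within a set"),
    ("Non-repudiation", "a person cannot deny an action or a claim"),
    ("Detectability", "an observer can infer that an item of interest exists"),
    ("Disclosure of information", "personal data is exposed to unauthorized parties"),
    ("Unawareness", "data subjects do not know how their data is processed"),
    ("Non-compliance", "processing deviates from policy, regulation, or consent"),
]

# A team walks each data-flow-diagram element past every threat type and
# records the ones that apply.
for name, description in LINDDUN_THREAT_TYPES:
    print(f"{name}: {description}")
```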
May 16, 2023 • 48min

S2E18: Making Digital Contact Cards Private, Shareable & Updatable with Brad Dominy (Neucards)

I am delighted to welcome my next guest, Brad Dominy. Brad is a macOS and iOS developer and the Founder & Inventor of Neucards, a privacy-preserving app that enables secure, shareable, and updatable digital contacts. In this conversation, we delve into why personally managing our digital contacts has been so difficult, and Brad's novel approach to securely managing our contacts, architected with privacy by design and default.

Contacts have always been the "junk drawer" of digital data: information that people want to keep up to date but rarely can with current technology. The vCard standard is outdated, but it is the only standard that works across iOS, Android, and Microsoft platforms. It is still the most commonly used contact format, yet it lacks any capacity for updating contacts. Once someone exchanges their contact information with you, it falls on you to keep it up to date. This is why Brad created Neucards: to gain the benefits of sharing information easily and privately (with E2EE) while receiving updates across all platforms.

Topics Covered:
- Why it is difficult to keep our digital contacts up to date across devices and platforms
- Brad's career journey that inspired him to invent Neucards; the problems Neucards solves; and why this became his passion project for over a decade
- Why companies haven’t innovated more in the digital contacts space
- The 3 main features that make Neucards different from other contact apps
- How Neucards enables you to share digital contact data easily & securely
- Neucards' privacy-by-design-and-default approach to sharing and updating digital contacts
- How you can use NFC tap tags with Neucards to make sharing digital contacts much easier
- Whether Neucards can solve the "New phone, who dis?" problem
- Whether we will see an update to the vCard standard or new standards for digital contacts
- Neucards' roadmap, including a 'mask communications' feature
- The importance of language: the difference between 'privacy-preserving' & 'privacy-enabling' architectural approaches

Resources Mentioned:
- Learn about Neucards
- Download the Neucards iOS app

Guest Info:
- Follow Brad on LinkedIn
May 9, 2023 • 46min

S2E17: Noise in the Machine: How to Assess, Design & Deploy 'Differential Privacy' with Damien Desfontaines (Tumult Labs)

In this week’s episode, I speak with Damien Desfontaines, also known by the pseudonym “Ted”, Staff Scientist at Tumult Labs, a startup leading the way on differential privacy. Earlier in his career, Damien led an anonymization consulting team at Google; he specializes in making it easy to safely anonymize data. Damien earned his PhD at ETH Zurich and holds a Master's degree in Mathematical Logic and Theoretical Computer Science.

Tumult Labs’ platform makes differential privacy useful by making it easy to create innovative, privacy-enabled data products that can be safely shared and used widely. In this conversation, we focus our discussion on differential privacy techniques, including what’s next in their evolution, common vulnerabilities, and how to implement differential privacy on your platform. When it comes to protecting personal data, Tumult Labs takes a three-stage approach: Assess, Design, and Deploy. Damien takes us on a deep dive into each, with use cases provided.

Topics Covered:
- Why there's such a gap between academia and the corporate world
- How differential privacy's strong privacy guarantees are a result of strong assumptions, and why the biggest blockers to DP deployments have been education & usability
- When to use 'local' vs. 'central' differential privacy techniques (a minimal code sketch follows these notes)
- Advancements in technology that enable the private collection of data
- Tumult Labs' Assess approach to deploying differential privacy, where a customer defines its 'data publication' problem or question
- How the Tumult Analytics platform can help you build differential privacy algorithms that satisfy 'fitness for use' requirements
- Why using gold-standard techniques like differential privacy to safely release, publish, or share data has value far beyond compliance
- How data scientists can make analysis & design more robust to better preserve privacy, and the tradeoff between utility on very specific tasks & the number of tasks you can possibly answer
- Damien's work assisting the IRS & the Department of Education in deploying differential privacy to safely publish and share data publicly via the College Scorecard project
- How to address security vulnerabilities (i.e., potential attacks) on differentially private datasets
- Where you can learn more about differential privacy
- How Damien sees this space evolving over the next several years

Resources Mentioned:
- Join the Tumult Labs Slack
- Learn about Tumult Labs

Guest Info:
- Connect with Damien on LinkedIn
- Learn more on Damien’s website
- Follow 'Ted' on Twitter
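For listeners new to the topic, here is a minimal sketch of the central-model Laplace mechanism: a trusted curator adds calibrated noise to an aggregate statistic before release. This illustrates the general technique only; it is not Tumult Analytics' actual API.

```python
import numpy as np

def dp_count(true_count: int, epsilon: float) -> float:
    """Central-model DP count.

    A counting query has L1 sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon yields epsilon-differential privacy.
    """
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# Example: release the number of opted-in users with a privacy budget of 0.5.
print(dp_count(true_count=8742, epsilon=0.5))
```

Smaller epsilon means more noise and stronger privacy. In the local model, by contrast, each user randomizes their own record before it ever reaches the curator.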
May 2, 2023 • 60min

S2E16: Words with Impact; Communication Tips for Privacy Technologists with Melanie Ensign (Discernible)

I'm delighted to welcome my guest, Melanie Ensign, Founder and CEO of Discernible, where she helps organizations adopt effective communication strategies to improve risk-related outcomes. She's managed security & privacy communications for some of the world's most notable brands, including Facebook, Uber & AT&T. Melanie counsels executives and technical teams to cut through internal politics, dysfunctional inertia & meaningless metrics. For the past 10 years, she's also led the press department & communication strategy for DEF CON. An accomplished scuba diver, Melanie brings lessons learned from preventing, preparing for & navigating unexpected high-risk underwater incidents to her work in security & privacy. Today's discussion focuses on the importance of communication strategies and tactics for privacy engineering teams.

Topics Covered:
- Melanie's career journey, and how she leveraged her experience in shark science to help executives get over their initial fears of the unknown in security & privacy
- How Melanie guides and supports technical teams at Discernible on effective communications
- How to prevent 'privacy outrage'
- The value of preventing privacy snafus rather than focusing only on crisis comms
- How companies can use technical communication strategies & tactics to earn trust with the public
- The problem with incentives: why most social media metrics have been bullshit for far too long
- Why Melanie decided to leave big tech to start Discernible
- Insight into the 7 Arthur W. Page Society Principles, a 'code of ethics' for communications professionals
- What makes for a good PR story that the media would want to cover
- Why press releases are mostly ineffective, except for announcing funding raises
- The importance of educating the community for which you're building
- Melanie's advice to Elon Musk, who does not invest in a comms team
- What OpenAI could have done differently, and whether their go-to-market strategy was effective
- The importance of elevating Compliance teams to business advisors in the eyes of stakeholders

Resources Mentioned:
- Subscribe to the Discernible newsletter
- Discover GitHub's ReadME newsletter
- Learn about the Arthur W. Page Principles

Guest Info:
- Follow Melanie on LinkedIn
- Follow Melanie on Mastodon
Apr 18, 2023 • 40min

S2E15: 'Watching the Watchers: Transparency & Control Research' with Umar Iqbal, PhD (University of Washington)

This week's guest is Umar Iqbal, PhD, a Postdoctoral Scholar at the Paul G. Allen School of Computer Science & Engineering at the University of Washington, working in the Security and Privacy Research Lab. Umar focuses his research on two themes: 1) bringing transparency to data collection and usage practices, and 2) enabling individuals to control their own data by identifying & restricting the privacy-invasive data collection & usage practices of online services. His long-term research vision is to create an environment where users can reap the benefits of technology without losing their privacy, by enabling preemptive privacy protections and establishing 'checks & balances' on the Internet. In this episode, we discuss his previous and current research, with the goal of empowering people to protect their privacy on the Internet.

Topics Covered:
- Why Umar focused his research on transparency
- Umar's research relating to transparency, data collection & use, with a focus on Amazon's smart speaker & metadata privacy, and potential EU regulatory enforcement
- His transparency-related work on browsers & APIs, and the growing problem of using fingerprinting techniques to track people without consent
- How Umar plans to give individuals control by restricting online privacy-invasive data collection
- How he used an ML technique to detect browser fingerprinting scripts based on their functionality
- Umar's research to determine the prevalence of online tracking & measure how effective currently available tracker-detection tools are
- His research on early detection of emerging privacy threats (e.g., 'browser fingerprinting' & 'navigational tracking') and his investigation of privacy issues related to IoT (e.g., smart speakers & health-and-fitness bands that analyze people's voices)
- How we can ensure strong privacy guarantees and make the Internet more accountable
- Why regulations need technological support to be effective for enforcement
- Umar's advice to developers / hackers looking for 'privacy bugs' via dynamic code analysis, and a discussion of the future of 'privacy bug bounties'

Resources Mentioned:
- Read Umar's papers: Google Scholar Citations

Guest Info:
- Learn about Umar on his website
- Connect with Umar on LinkedIn
- Follow Umar on Twitter
