In this week’s episode, I speak with Damien Desfontaines, also known by the pseudonym “Ted,” a Staff Scientist at Tumult Labs, a startup leading the way on differential privacy. Earlier in his career, Damien led an anonymization consulting team at Google, and he specializes in making it easy to safely anonymize data. Damien earned his PhD at ETH Zurich and holds a Master’s degree in Mathematical Logic and Theoretical Computer Science.
Tumult Labs’ platform makes differential privacy practical, enabling innovative data products that can be safely shared and used widely. In this conversation, we focus on differential privacy techniques: what’s next in their evolution, common vulnerabilities, and how to implement differential privacy on your own platform.
When it comes to protecting personal data, Tumult Labs takes a three-stage approach: Assess, Design, and Deploy. Damien takes us on a deep dive into each stage, illustrated with use cases.
Topics Covered:
- Why there's such a gap between academia and the corporate world
- How differential privacy's strong privacy guarantees are a result of strong assumptions; and why the biggest blockers to DP deployments have been education & usability
- When to use "local" vs "central" differential privacy techniques
- Advancements in technology that enable the private collection of data
- Tumult Labs' Assessment approach to deploying differential privacy, where a customer defines its 'data publication' problem or question
- How the Tumult Analytics platform can help you build differential privacy algorithms that satisfy 'fitness for use' requirements
- Why using gold standard techniques like differential privacy to safely release, publish, or share data has value far beyond compliance
- How data scientists can make the analysis & design more robust to better preserve privacy; and the tradeoff between utility on very specific tasks & number of tasks that you can possibly answer
- Damien's work assisting the IRS & DOE in deploying differential privacy to safely publish and share data publicly via the College Scorecards project
- How to address security vulnerabilities (i.e. potential attacks) to differentially private datasets
- Where you can learn more about differential privacy
- How Damien sees this space evolving over the next several years
Privado.ai: Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.
Shifting Privacy Left Media: Where privacy engineers gather, share, & learn
Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.
Copyright © 2022 - 2024 Principled LLC. All rights reserved.