

Privacy-Preserving Decentralized Data Science with Andrew Trask - TWiML Talk #241
Mar 21, 2019
Andrew Trask, a PhD student at the University of Oxford and leader of the OpenMined Project, discusses privacy-preserving AI. He traces his path from music to machine learning and explains why differential privacy and secure multi-party computation matter. Trask highlights progress in the OpenMined community and points to future developments in decentralized data science, particularly in finance and healthcare, underscoring the need for robust privacy tools in a data-driven world.
Andrew's Introduction to Machine Learning
- Andrew Trask's interest in machine learning began in a college AI course.
- His experience at Digital Reasoning exposed him to the challenges of machine learning on private data.
The Magic of Homomorphic Encryption
- Homomorphic encryption allows computation on encrypted data, opening new possibilities for privacy-preserving machine learning (see the sketch after this snip).
- Trask's research combines cryptography, statistics, and machine learning to address this underserved area.
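To make "computation on encrypted data" concrete, here is a minimal sketch using the `phe` (python-paillier) package; the library choice and the toy values are assumptions for illustration, not details from the episode. Paillier is additively homomorphic, so ciphertexts can be added and scaled by plaintext constants without ever being decrypted:

```python
# Toy additively homomorphic encryption demo (assumes: pip install phe)
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# Encrypt two private values
enc_a = public_key.encrypt(3.5)
enc_b = public_key.encrypt(2.0)

# Arithmetic happens directly on ciphertexts; whoever runs this step
# never sees 3.5 or 2.0 in the clear
enc_sum = enc_a + enc_b   # ciphertext + ciphertext
enc_scaled = enc_a * 4    # ciphertext * plaintext constant

# Only the private-key holder can recover the results
print(private_key.decrypt(enc_sum))     # 5.5
print(private_key.decrypt(enc_scaled))  # 14.0
```

Paillier supports only addition and plaintext scaling; fully homomorphic schemes that also multiply ciphertexts are far more expensive, which is the cost the next snip alludes to.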
Core Concepts in Privacy-Preserving ML
- Differential privacy, secure multi-party computation (MPC), and federated learning are key for privacy-preserving machine learning.
- Homomorphic encryption, while powerful, is slow, making MPC a faster alternative in practice (a minimal secret-sharing sketch follows below).
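For a sense of why secret sharing keeps MPC comparatively cheap, here is a minimal additive-secret-sharing sketch in plain Python; the three-party setup and values are illustrative assumptions, not details from the episode. Each party holds a uniformly random-looking share, yet by adding shares locally the parties can jointly compute a sum without anyone learning the private inputs:

```python
# Toy additive secret sharing, the building block behind secret-sharing MPC
import random

Q = 2**62  # working modulus; individual shares are uniform mod Q


def share(secret, n_parties=3):
    """Split an integer into n shares that sum to the secret mod Q."""
    shares = [random.randrange(Q) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % Q)
    return shares


def reconstruct(shares):
    """Recombine shares; any proper subset reveals nothing about the secret."""
    return sum(shares) % Q


# Two private inputs, e.g. held by two different data owners
x_shares = share(25)
y_shares = share(17)

# Each party adds the shares it holds -- purely local, cheap arithmetic
sum_shares = [(xs + ys) % Q for xs, ys in zip(x_shares, y_shares)]

print(reconstruct(sum_shares))  # 42
```

Addition costs only local modular arithmetic, and multiplications need just an extra round of communication, rather than the heavyweight ciphertext operations of homomorphic encryption, which is where the speed advantage comes from.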