

AI on your phone? Tim Dettmers on quantization of neural networks — #41
Aug 10, 2023
Tim Dettmers, a leading researcher in neural network quantization, discusses his background and research program. Topics include large language models, quantization, the democratization of AI technology, and the future of AI. Other chapters cover the challenges of dyslexia, progress in neural networks, quantization as a route to efficiency, the cost and efficiency of consumer GPUs for AI models, compression and algorithms in network transfer, and the limitations of neural networks.
Chapters
Introduction
00:00 • 3min
Overcoming Challenges and Pursuing a Career in Mathematical Software Development
02:36 • 13min
Reflection on Kaggle and the Progress of Neural Networks
15:14 • 3min
Improving Efficiency Through Quantization
18:03 • 22min
Cost and Efficiency of Consumer GPUs for AI Models
40:13 • 13min
Exploring Compression, Quantization, and Algorithms in Neural Network Transfer
53:40 • 2min
Reusing Mining Rigs and Limitations of Neural Networks
55:57 • 11min