This episode covers the founding of MosaicML and the barriers that make training large language models inaccessible to most organizations. It contrasts the efficiency of the human brain with today's AI systems, examines the cost and difficulty of training models such as GPT-3, and underscores why efficiency and scalability matter in the development of large language models.
The Sunday Times’ tech correspondent Danny Fortson brings on Naveen Rao, founder of MosaicML, to talk about the efficiency of the human brain (3:30), slashing the cost to train AI models (8:00), how this is like the evolution of the car (11:40), selling to Databricks (14:30), how the AI market will evolve (17:00), the fallacy of AI doomerism (21:00), growing up in eastern Kentucky (22:30), plunging into the dotcom boom (24:50), why he studied neuroscience (27:10), selling his previous startup to Intel (32:00), solving intelligence (34:40), and what he tells his kids about the future (40:00).