Scaling Theory

#8 – Sara Hooker: Big AI, The Compute Frenzy, and Grumpy Models

Aug 5, 2024
Sara Hooker, VP of Research at Cohere and a recognized AI innovator, shares insights on scaling laws and their limits, emphasizing how smaller models can sometimes outperform larger ones. She discusses the balance between open-source and proprietary models, highlighting the need for inclusivity, particularly in multilingual capabilities. Sara also tackles data-accessibility challenges and copyright issues affecting AI training, and reflects on how her diverse upbringing informs her approach to innovative research practices. Expect a thought-provoking conversation on AI's future!