The Twenty Minute VC (20VC): Venture Capital | Startup Funding | The Pitch

20VC: Spending $2M to Train a Single AI Model: What Matters More; Model Size or Data Size | Hallucinations: Feature or Bug | Will Everyone Have an AI Friend in the Future & Raising $150M from a16z with Noam Shazeer, Co-Founder & CEO @ Character.ai

Aug 31, 2023
Noam Shazeer, co-founder and CEO of Character.AI, is a celebrated computer scientist known for his work in AI and NLP, including his role at Google Brain. He dives into the critical balance between model size and data size, questioning their true value in the evolving AI landscape. Shazeer shares insights on barriers to AI, his journey at Google, and the unique role startups can play amidst larger corporations. He also contemplates the future of AI, emphasizing the importance of fostering real human connections through technology.
35:09

Podcast summary created with Snipd AI

Quick takeaways

  • The size of the model and the amount of computation required are the main challenges in AI.
  • Character.ai prioritizes building versatile and usable models.

Deep dives

The Challenge of Model Size and Computation

The main challenge in AI is the size of the model and the amount of computation required to train it. Ideally you would train a larger model for longer, but the number of computational operations needed is the limiting factor: one model trained last summer took $2 million worth of compute cycles. Advances in hardware therefore play a crucial role in addressing these challenges and pushing AI technology forward.
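To see why training cost scales so quickly with model and data size, a rough estimate can be sketched with the commonly cited approximation of about 6 FLOPs per parameter per training token. All concrete numbers below (parameter count, token count, price of compute) are illustrative assumptions, not figures from the episode:

```python
# Back-of-envelope training-compute estimate using the common
# "FLOPs ≈ 6 * N * D" rule of thumb (N parameters, D training tokens).
# The specific numbers below are illustrative assumptions only.

def training_flops(params: float, tokens: float) -> float:
    """Approximate total training FLOPs via the 6*N*D rule of thumb."""
    return 6.0 * params * tokens

def training_cost_usd(flops: float, flops_per_dollar: float) -> float:
    """Convert total FLOPs to dollars given an assumed price of compute."""
    return flops / flops_per_dollar

# Hypothetical example: a 70B-parameter model trained on 1.4T tokens,
# at an assumed 1e17 effective FLOPs per dollar of GPU time.
flops = training_flops(70e9, 1.4e12)
cost = training_cost_usd(flops, 1e17)
print(f"{flops:.2e} FLOPs, roughly ${cost:,.0f}")
```

Doubling either the parameter count or the token count doubles the estimated cost, which is why compute, not ideas, is often the binding constraint Shazeer describes.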
