20VC: Raising $500M To Compete in the Race for AGI | Will Scaling Laws Continue: Is Access to Compute Everything | Will Nvidia Continue To Dominate | The Biggest Bottlenecks in the Race for AGI with Eiso Kant, CTO @ Poolside
Oct 7, 2024
Eiso Kant, Co-founder and CTO of Poolside.ai, dives into the competitive world of Artificial General Intelligence (AGI) funding, sharing insights on the company's recent $500M raise. He discusses how Poolside differentiates itself from general-purpose AI models and whether Nvidia will continue to dominate the compute landscape. The conversation touches on scaling laws, the future of model performance, and whether $600M is sufficient to remain relevant in the rapidly evolving AI landscape. Kant also explores the dynamics of AI infrastructure and the emotional journey of entrepreneurship.
Poolside is uniquely positioned in the AGI race by focusing on AI tools specifically for software development, leveraging vast code datasets.
The interplay between compute accessibility and data generation is crucial, with Poolside generating synthetic data to enhance AI model performance.
Startups must carve out unique niches amid aggressive corporate spending and price wars in AI development to ensure survival and growth.
Deep dives
The Race for AGI
The journey towards Artificial General Intelligence (AGI) is framed as a race, emphasizing urgency and commitment. Companies are competing to establish themselves in the field, and the current period is viewed as a foundational moment that will shape the future. Investment rounds, like Poolside's recent $500 million raise, are critical for companies to gain traction in this race. The speakers stress the importance of capitalizing on this moment by giving everything they have towards significant advances in AI.
Poolside's Unique Approach
Poolside aims to bridge the gap between human capability and machine intelligence by focusing on software development. Its approach differs from other AI companies by concentrating on AI tools built specifically for coding, harnessing the vast amount of code the world has already generated. A critical insight is that while current language models perform strongly on general tasks, they often fall short in specialized areas like reasoning and planning. By leveraging a dataset of approximately three trillion tokens of usable code, Poolside is uniquely positioned to create advanced AI solutions tailored for software developers.
Challenges of Learning Models
The complexity of how these models learn is highlighted, particularly their dependency on large datasets and the inefficiencies inherent in their design. While AI models demonstrate strong capabilities in some domains, they struggle in others where foundational data is scarce. The conversation covers the importance of incorporating intermediate reasoning processes into the learning environment so that AI improves in real-world applications. The models must learn not only from final outputs but also from the iterations and failures experienced during the coding process, a significant gap in current AI training strategies.
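To make the idea of learning from intermediate attempts concrete, here is a minimal, self-contained sketch. It is purely illustrative and not Poolside's actual system: the stub model, the toy `run_tests` check, and the `collect_trajectory` helper are hypothetical placeholders for an LLM, a real test suite, and a training-data pipeline.

```python
# Illustrative sketch only: capture every attempt at a coding task, including
# failures and the error feedback that prompted a revision, as training signal.
# All names here are hypothetical placeholders, not Poolside's actual system.
from dataclasses import dataclass


@dataclass
class Attempt:
    code: str
    passed: bool
    feedback: str


def run_tests(code: str) -> Attempt:
    # Stand-in for executing a real test suite against candidate code.
    try:
        env: dict = {}
        exec(code, env)
        ok = env["add"](2, 3) == 5
        return Attempt(code, ok, "" if ok else "add(2, 3) != 5")
    except Exception as exc:
        return Attempt(code, False, repr(exc))


def stub_model(prompt: str, error: str = "") -> str:
    # Hypothetical placeholder for an LLM call: returns a buggy draft first,
    # then a corrected one once it is shown the failure feedback.
    if not error:
        return "def add(a, b):\n    return a - b"  # deliberate bug
    return "def add(a, b):\n    return a + b"


def collect_trajectory(prompt: str, max_attempts: int = 3) -> list:
    # Keep every attempt, pass or fail, rather than only the final solution.
    trajectory = []
    error = ""
    for _ in range(max_attempts):
        candidate = stub_model(prompt, error)
        attempt = run_tests(candidate)
        trajectory.append(attempt)
        if attempt.passed:
            break
        error = attempt.feedback  # condition the next attempt on the failure
    return trajectory


if __name__ == "__main__":
    for step, attempt in enumerate(collect_trajectory("write add(a, b)")):
        print(step, attempt.passed, attempt.feedback)
```

The design point is simply that the failing draft and its error feedback are retained alongside the final passing solution, rather than discarded, so the learning signal includes the intermediate iterations the episode describes.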
Compute and Data as Bottlenecks
The relationship between compute power and data accessibility is a focal point of the conversation, revealing that compute is a primary bottleneck in advancing AI models. Recent advancements have improved learning efficiencies, but the need for further data generation remains critical. Poolside’s strategy includes generating synthetic data to enhance model performance and build better reinforcement learning algorithms. Given the rapid pace of development in AI, the race hinges on combining improvements in compute with innovative data generation to maintain a competitive edge.
Future Trends and Market Dynamics
The discussion addresses the evolving landscape of AI development and the anticipated consolidation of smaller companies in the market. With large corporations setting aggressive spending trends, it is essential for startups to carve out unique niches to survive. An impending price war in the AI space creates both challenges and opportunities, as companies strive to offer more efficient and lower-cost solutions. Ultimately, the synergy between human talent, innovative research, and efficient infrastructure is crucial for nurturing AI advancement and sustaining long-term growth.
Eiso Kant is the Co-Founder and CTO of Poolside.ai, building next-generation AI for software engineering. Just last week, Poolside announced their $500M Series B valuing the company at $3BN. Prior to Poolside, Eiso founded Athenian, a data-enabled engineering platform. Before that, he built source{d} - the world’s first company dedicated to applying AI to code and software.
1. Raising $600M to Compete in the AGI Race:
What is Poolside? How does Poolside differentiate from other general-purpose LLMs?
How much of Poolside’s latest raise will be spent on compute?
How does Eiso feel about large corporates making up a significant part of startup LLM providers' funding rounds?
Why did Poolside choose to only accept investment from Nvidia?
Is $600M really enough to compete with the mega war chests of other LLMs?
2. The Big Questions in AI:
Will scaling laws continue? Have we reached a stage of diminishing returns in model performance for LLMs?
What is the biggest barrier to the continued improvement in model performance: data, algorithms or compute?
To what extent will Nvidia’s Blackwell chip create a step function improvement in performance?
What will OpenAI’s GPT-5 need to have to be a game-changer once again?
3. Compute, Chips and Cash:
Does Eiso agree with Larry Ellison: “you need $100BN to play the foundation model game”? What does Eiso believe is the minimum entry price?
Will we see the continuing monopoly of Nvidia? How does Eiso expect the compute landscape to evolve?
Why are Amazon and Google best placed when it comes to reducing cost through their own chip manufacturing?
Does Eiso agree with David Cahn @ Sequoia, “you will never train a frontier model on the same data centre twice”?
Can the speed of data centre establishment and development keep up with the speed of foundation model development?
4. WTF Happens to The Model Layer: OpenAI and Anthropic…
Does Eiso agree we are seeing foundation models become commoditised?
What would Eiso do if he were Sam Altman today?
Is $6.6BN really enough for OpenAI to compete against Google, Meta etc…?
OpenAI at $150BN, Anthropic at $40BN and X.ai at $24BN. Which would Eiso choose to buy and why?