This chapter explores the rise of Cerebras Systems as a major contender against NVIDIA in the AI hardware sector, emphasizing their innovative wafer-scale chips and enhanced inference capabilities. It details the launch of Cerebras' inference cloud service and compares performance metrics with traditional services, raising questions about future dynamics in AI technology. Additionally, it highlights the broader competitive landscape and market implications for both companies amid evolving investment trends and product advancements.
Our 181st episode with a summary and discussion of last week's big AI news!
With hosts Andrey Kurenkov and Jeremie Harris
Read our text newsletter and comment on the podcast at https://lastweekin.ai/
If you would like to become a sponsor for the newsletter, podcast, or both, please fill out this form.
Email us your questions and feedback at contact@lastweekinai.com and/or hello@gladstone.ai
In this episode:
- Google's AI advancements with Gemini 1.5 models and AI-generated avatars, along with Samsung's lithography progress.
- Microsoft's Inflection usage caps for Pi, and new AI inference services by Cerebras Systems competing with Nvidia.
- Biases in AI, prompt leak attacks, transparency in models, and distributed training optimizations, including the DisTrO optimizer.
- AI regulation discussions including California's SB 1047, China's AI safety stance, and new export restrictions impacting Nvidia's AI chips.
Timestamps + Links:
- (00:00:00) Intro / Banter
- (00:03:08) Response to listener comments / corrections
- Tools & Apps
- Applications & Business
- Projects & Open Source
- Research & Advancements
- Policy & Safety
- Synthetic Media & Art
- (02:14:06) Outro