Open source models enable community contributions and progress in machine learning.
Deepening the collaboration with AWS and using purpose-built hardware accelerators improves ML deployment and reduces costs.
Deep dives
Collaboration with AWS for Open Source Models and Infrastructure
Hugging Face expresses gratitude to AWS for their continuous support and sponsorship. AWS offers a range of ML and AI services that make ML accessible across industries, prioritizing innovation in infrastructure, tools like Amazon SageMaker, and AI services such as Amazon CodeWhisperer. Purpose-built ML accelerators for training and inference are also available on AWS.
Interview with Jeff Boudier from Hugging Face
Host Sam Charrington interviews Jeff Boudier, Head of Product at Hugging Face, on topics like open source and generative AI. Jeff discusses his background and Hugging Face's origins building an AI that could hold fun, interactive conversations. They reflect on the recent surge of AI news and how the AI landscape is evolving, and emphasize the importance of open source in advancing machine learning and making it accessible to all.
GPT-4 and the Open Source Landscape of Models
The release of GPT-4, alongside other recent announcements, marks a shift toward new models being released in a more commercialized manner. Hugging Face emphasizes the importance of open source models in enabling the community to contribute and make progress together. They discuss choosing the right tool for a given task, as well as the potential cost and performance limitations of large-scale LLMs. Hugging Face highlights its commitment to providing accessible open source models and collaborating with other organizations.
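As a concrete illustration of the "right tool for the task" point, a small task-specific open model can be pulled straight from the Hugging Face Hub in a few lines. The sketch below uses the transformers library; the model ID is an illustrative choice, not one named in the episode.

```python
# Sketch: using a small, task-specific open model from the Hugging Face Hub
# instead of a large general-purpose LLM. Requires the `transformers` library.
from transformers import pipeline

# The model ID is an illustrative example, not one discussed in the episode.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("Open source models keep getting better."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```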
Collaboration with AWS for Training and Deployment
Hugging Face is deepening its collaboration with AWS to train new foundation models on an AWS supercomputing cluster, and works closely with AWS engineering teams to drive the adoption of machine learning within companies. They describe efforts to provide a seamless developer experience between Hugging Face and Amazon SageMaker, letting customers easily deploy and fine-tune models while keeping costs under control. They also discuss integration with hardware accelerators like AWS Trainium and Inferentia, which offer significant performance improvements and cost savings.
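The Hugging Face–SageMaker deployment flow described above might look roughly like the sketch below, using the sagemaker Python SDK's Hugging Face integration. The model ID, container versions, and instance type are assumptions for illustration, not details from the episode.

```python
# Sketch: deploying a Hub model to a SageMaker endpoint via the
# sagemaker SDK's Hugging Face integration (assumed versions and instance type).
import sagemaker
from sagemaker.huggingface import HuggingFaceModel

role = sagemaker.get_execution_role()  # assumes running inside a SageMaker environment

hf_model = HuggingFaceModel(
    env={
        "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",  # illustrative model
        "HF_TASK": "text-classification",
    },
    role=role,
    transformers_version="4.26",  # versions are assumptions; match a supported container
    pytorch_version="1.13",
    py_version="py39",
)

predictor = hf_model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",  # Inferentia/Trainium instances require Neuron-compatible containers
)

print(predictor.predict({"inputs": "Deploying from the Hub in a few lines."}))
predictor.delete_endpoint()  # clean up the endpoint to stop incurring cost
```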
Today we’re joined by Jeff Boudier, head of product at Hugging Face 🤗. In our conversation with Jeff, we explore the current landscape of open-source machine learning tools and models, the recent shift towards consumer-focused releases, and the importance of making ML tools accessible. We also discuss the growth of the Hugging Face Hub, which currently hosts over 150k models, and how formalizing their collaboration with AWS will help drive the adoption of open-source models in the enterprise.
The complete show notes for this episode can be found at twimlai.com/go/624