Unsupervised Learning

by Redpoint Ventures
Apr 4, 2023 • 33min

Ep 4: Fixie.ai CEO Matt Welsh on How LLMs Will Change the Way We Work

Jason and Erica chat with Fixie.ai co-founder and former Harvard CS professor Matt Welsh about his journey from researcher to Google engineering leader to startup life before founding Fixie, a new platform for building LLM-based apps (and a Redpoint portfolio company). We also talk about having Mark Zuckerberg in his CS class, ChatGPT Plugins and the AI ecosystem, and what the future might look like for our kids with AGI. You can find Matt on Twitter (@mdwelsh) and learn more about Fixie at https://www.fixie.ai/

(1:06) - Matt talks about his early days at Berkeley and Harvard
(6:45) - Making the jump to the startup world
(8:15) - Matt explains what Fixie does
(11:30) - How Matt thinks about use cases for Fixie
(14:20) - How work might change with AI's integration
(17:04) - The future of LLMs
(20:30) - Matt's take on the race to AGI
(25:20) - How Matt thinks about the future of the world for his kids and how they might use AI
(29:24) - Quick fire round

With your co-hosts:
@jasoncwarner - Former CTO GitHub, VP Eng Heroku & Canonical
@ericabrescia - Former COO GitHub, Founder Bitnami (acquired by VMware)
@patrickachase - Partner at Redpoint, Former ML Engineer LinkedIn
@jacobeffron - Partner at Redpoint, Former PM Flatiron Health
Mar 21, 2023 • 48min

Ep 3: NEAR CEO Illia Polosukhin on the Origins of the Transformer Paper and The Overlap Between AI and Crypto

Jacob and Jason sit down with NEAR CEO and Transformer paper author Illia Polosukhin. They discuss Illia's fascinating journey from Ukraine to Google and from AI to crypto, the origin story behind the "Attention Is All You Need" paper, and the overlap between AI and crypto. Illia also shares his thoughts on AGI and the problems that excite him most in AI right now. You can find Illia on Twitter (@ilblackdrago) and learn more about NEAR (@nearprotocol).

(00:39) - Welcoming Illia; how he became interested in AI, transitioning into crypto, and explaining NEAR
(02:40) - Walking through Illia's story in more depth
(07:46) - How the Transformer paper came to be
(11:24) - Understanding the Transformer paper's impact
(18:28) - The overlap of crypto and AI and how Illia sees them developing together
(26:47) - Illia's views on AGI
(30:46) - Optimism vs. pessimism about the future of machine learning as a tool
(41:32) - What problems Illia sees in AI right now
(45:03) - Rapid fire questions
(47:36) - Where to learn more about NEAR and Illia

With your co-hosts:
@jasoncwarner - Former CTO GitHub, VP Eng Heroku & Canonical
@ericabrescia - Former COO GitHub, Founder Bitnami (acquired by VMware)
@patrickachase - Partner at Redpoint, Former ML Engineer LinkedIn
@jacobeffron - Partner at Redpoint, Former PM Flatiron Health
Mar 7, 2023 • 46min

Ep 2: Databricks CTO Matei Zaharia on scaling and orchestrating large language models

Patrick and Jacob sit down with Matei Zaharia, co-founder and CTO at Databricks and professor at Stanford. They discuss how companies are training and serving models in production with Databricks, where LLMs fall short for search and how to improve them, state-of-the-art AI research at Stanford, and how the size and cost of models is likely to change with technological advances in the coming years.

(0:00) - Introduction
(2:04) - Founding story of Databricks
(6:03) - PhD classmates using an early version of Spark for the Netflix competition
(6:55) - Building applications with MLflow
(9:55) - LLMs and ChatGPT
(12:05) - Working with and fine-tuning foundation models
(13:00) - Is prompt engineering here to stay or temporary?
(15:12) - Matei's research at Stanford: the Demonstrate-Search-Predict (DSP) framework
(17:42) - How LLMs will be combined with classic information retrieval systems for world-class search
(19:38) - LLMs writing programs to orchestrate LLMs
(20:36) - Using LLMs in the Databricks cloud product
(24:21) - Scaling LLM training and serving
(27:29) - How much will the cost to train LLMs come down in the coming years?
(29:22) - How many parameters is too many?
(31:14) - Open source vs. closed source?
(35:19) - Stanford AI research: Snorkel, ColBERT, and more
(38:58) - Matei getting a $50 Amazon gift card for weeks of work
(43:23) - Quick-fire round

With your co-hosts:
@jasoncwarner - Former CTO GitHub, VP Eng Heroku & Canonical
@ericabrescia - Former COO GitHub, Founder Bitnami (acquired by VMware)
@patrickachase - Partner at Redpoint, Former ML Engineer LinkedIn
@jacobeffron - Partner at Redpoint, Former PM Flatiron Health
Feb 22, 2023 • 49min

Ep 1: Hugging Face CEO Clem Delangue on The Future of Open vs Closed Source in AI

Jacob and Jason sit down with Hugging Face CEO Clem Delangue to discuss trends in who's using Hugging Face, the future of closed source vs. open source in machine learning, why Clem compares large closed foundation models to Formula One cars, how enterprise AI teams will evolve, and AI safety.

(0:00) - Introduction
(1:37) - Welcome Clem
(1:57) - Starting Hugging Face
(5:42) - Influence of ChatGPT
(15:47) - Use cases of large vs. small platforms
(18:44) - Should large language models be open?
(30:20) - What's next for Hugging Face?
(43:06) - Rapid fire
(47:02) - Learn more about Hugging Face

With your co-hosts:
@jasoncwarner - Former CTO GitHub, VP Eng Heroku & Canonical
@ericabrescia - Former COO GitHub, Founder Bitnami (acquired by VMware)
@patrickachase - Partner at Redpoint, Former ML Engineer LinkedIn
@jacobeffron - Partner at Redpoint, Former PM Flatiron Health
Feb 16, 2023 • 2min

Unsupervised Learning: Trailer
