Open source LLM wave + How YC companies build with LLMs

Practically Intelligent

Introduction

The hosts discuss the development of open source models and how people are distilling and quantizing them for self-hosted use. They highlight a team of researchers who distilled and quantized Meta's open source LLaMA model into a 7-billion-parameter version, achieving performance similar to ChatGPT.
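
The episode itself contains no code; as a rough illustration of the self-hosting workflow mentioned above, here is a minimal sketch of loading an open source 7B model in quantized form with the Hugging Face transformers and bitsandbytes libraries. The model ID and settings are illustrative assumptions, not taken from the episode.

```python
# Minimal sketch: load a 7B open source model with 4-bit quantization for
# local self-hosting. Model ID and parameters are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-2-7b-hf"  # hypothetical choice of a 7B open model

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # quantize weights to 4-bit
    bnb_4bit_compute_dtype=torch.float16,  # run compute in fp16 for speed
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # spread layers across available GPU/CPU memory
)

prompt = "Explain model distillation in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Quantizing to 4-bit shrinks a 7B model's memory footprint enough to fit on a single consumer GPU, which is what makes this kind of self-hosting practical.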
