LLMs Are the Key to Unlocking the Next Generation of Search

The Data Exchange with Ben Lorica

CHAPTER

The Future of User Experience

The advantage of the architecture we sketched out is that the validated answers reside in your own knowledge base or vector database, and you're just using the LLM to do some summarization at the back end. So at that point, you may not need this one-trillion-parameter model; maybe you can get by with a low-single-digit-billion-parameter model. Today we actually stream, because the generative step is much more expensive and has higher latency. But I think our current latencies are not representative of the latencies you'll see six months from now. It's going to get much quicker across the
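To picture the pattern being described, here is a minimal, self-contained Python sketch of that kind of pipeline: validated answers are retrieved from your own vector store, and a small model is used only to summarize them, with the output streamed because generation is the high-latency step. All names here (toy_embed, TinyVectorDB, stream_summary) are illustrative placeholders, not the speakers' actual system or any specific product's API.

from typing import Iterator, List, Tuple
import math


def toy_embed(text: str) -> List[float]:
    """Stand-in embedding: a tiny normalized bag-of-characters vector."""
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


class TinyVectorDB:
    """In-memory stand-in for the knowledge base of validated answers."""

    def __init__(self) -> None:
        self.items: List[Tuple[List[float], str]] = []

    def add(self, validated_answer: str) -> None:
        self.items.append((toy_embed(validated_answer), validated_answer))

    def search(self, query: str, top_k: int = 2) -> List[str]:
        # Retrieval: a cheap nearest-neighbour lookup over your own data.
        qv = toy_embed(query)
        scored = [(sum(a * b for a, b in zip(qv, vec)), text) for vec, text in self.items]
        scored.sort(reverse=True)
        return [text for _, text in scored[:top_k]]


def stream_summary(passages: List[str]) -> Iterator[str]:
    """Placeholder for a small (low-single-digit-billion-parameter) model:
    it only has to summarize the retrieved passages, and it streams the
    output word by word, mimicking token streaming that masks latency."""
    for passage in passages:
        for word in passage.split():
            yield word + " "


def answer(db: TinyVectorDB, query: str) -> Iterator[str]:
    # 1. Retrieval: the validated answers come from the vector database.
    passages = db.search(query)
    # 2. Generation: the LLM only summarizes, so a huge model isn't needed;
    #    stream the result because generation is the expensive step.
    return stream_summary(passages)


# Usage (toy data): db = TinyVectorDB(); db.add("Answer A"); db.add("Answer B")
# then consume the stream with: print("".join(answer(db, "some question")))

The point of the sketch is the division of labor the speaker describes: retrieval against curated answers does the heavy lifting, so the generative model can be small and its latency hidden by streaming.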
