
161: Leveraging Generative AI Models with Hagay Lupesko


CHAPTER

How to Leverage LLMs to Build a Website

The second way is to take an open source model and either use it as is or fine-tune it for your needs. It's fairly accessible, fairly cheap, and available. And so what about serving? I want to use the time we have left to talk about that. Typically, serving today is done with FP16, or more specifically, BF16. So seven billion parameters times two bytes, that's, you know, 14 gigabytes. That actually does fit on pretty good GPUs today, like the NVIDIA A100 40 gig, or even the A10s.
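For a back-of-the-envelope check of the memory math described here, a minimal sketch in Python (the list of model sizes and the GPU capacities in the comment are illustrative assumptions, not figures from the episode):

```python
# Rough estimate of GPU memory needed just to hold model weights for inference.
# Assumes BF16/FP16 serving (2 bytes per parameter); real deployments also need
# headroom for activations and the KV cache, which this estimate ignores.

BYTES_PER_PARAM_BF16 = 2

def weight_memory_gb(num_params: float) -> float:
    """Approximate weight memory in gigabytes at BF16 precision."""
    return num_params * BYTES_PER_PARAM_BF16 / 1e9

# Illustrative parameter counts (assumptions for the example).
for name, params in [("7B model", 7e9), ("13B model", 13e9), ("70B model", 70e9)]:
    print(f"{name}: ~{weight_memory_gb(params):.0f} GB of weights at BF16")

# A 7B model at BF16 comes out to ~14 GB, which fits on a single
# A100 40 GB (or, with less headroom, an A10 24 GB).
```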
