
154: Serving Up AI with Sean Moriarity

Thinking Elixir Podcast

CHAPTER

How to Train a Model to Follow Instructions

The original GPT-3 model was a completion model. So essentially you weren't telling the model to follow your instructions; you had to phrase the prompt so that completing it produced what you wanted. And so they started training models specifically to follow instructions. I'm of the opinion that prompt engineering is not necessarily useful, but it seems like one of those things people have too quickly accepted as normal.
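To make the distinction concrete: a base completion model only continues text, so "prompting" it means shaping the input so that its natural continuation is the answer, whereas an instruction-tuned model can be told what to do directly. Here is a minimal sketch of the completion-style workaround using Bumblebee, Sean's Elixir inference library; the model choice and prompts are illustrative assumptions, not taken from the episode:

```elixir
# A minimal sketch, assuming Bumblebee and EXLA are installed; gpt2 is
# used only because it is a small base (non-instruct) model.
Mix.install([{:bumblebee, "~> 0.5"}, {:exla, ">= 0.0.0"}])
Nx.global_default_backend(EXLA.Backend)

{:ok, model_info} = Bumblebee.load_model({:hf, "gpt2"})
{:ok, tokenizer} = Bumblebee.load_tokenizer({:hf, "gpt2"})
{:ok, generation_config} = Bumblebee.load_generation_config({:hf, "gpt2"})

serving = Bumblebee.Text.generation(model_info, tokenizer, generation_config)

# A base model only continues text, so the task is smuggled in as a
# pattern whose natural continuation is the answer:
completion_prompt = "English: cheese\nFrench:"

# An instruction-tuned model could instead be told directly:
# "Translate 'cheese' to French." A base model would just keep rambling.
%{results: [%{text: continuation}]} = Nx.Serving.run(serving, completion_prompt)
IO.puts(continuation)
```

The shift Sean describes is exactly this: instruction tuning moves the burden from crafting completion-bait prompts to simply stating the instruction.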

