
Amazon Bedrock & BabyAGI (with Jon Turow)



How to Chain Together Prompts and Build Context for Machine Learning

There are two kinds of concepts here. One is called chaining, where I can ask a model a sequence of prompts, and the model remembers the questions and answers that came before as context. So think about asking ChatGPT: who were the last five presidents of the United States? For a follow-up, you don't need to repeat the question you just asked; it's going to know that you're referring to the presidents of the U.S. The other concept is an ensemble, where instead of using one single model for everything, we can use a combination of models for slightly different parts of our application.
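The chaining idea above can be sketched in a few lines: each turn's prompt and answer are appended to a shared history, so a follow-up question can lean on earlier context without restating it. This is a minimal illustration, not any particular provider's API; `stub_model` is a hypothetical stand-in for a real LLM call.

```python
def stub_model(history):
    # Hypothetical stand-in for an LLM. It sees the full conversation
    # history, which is what makes the follow-up answerable at all.
    last = history[-1]["content"]
    if "presidents" in last:
        return "Biden, Trump, Obama, Bush, Clinton"
    if "most recent" in last:
        # The follow-up never repeats the word "presidents";
        # the accumulated context supplies that meaning.
        return "Biden"
    return "(no answer)"

def chat(history, prompt):
    # Append the new prompt, query the model with the whole history,
    # then append the model's answer so the next turn can see it too.
    history.append({"role": "user", "content": prompt})
    answer = stub_model(history)
    history.append({"role": "assistant", "content": answer})
    return answer

history = []
chat(history, "Who were the last five presidents of the United States?")
followup = chat(history, "Which of them is the most recent?")
```

The second call works only because `history` already contains the first question and answer; drop the history and the follow-up becomes unanswerable, which is the whole point of chaining.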

