Amazon Bedrock & BabyAGI (with Jon Turow)

The Cerebral Valley Podcast

How to Chain Together Prompts and Build Context for Machine Learning

There are two kinds of concepts here. One is called chaining, where I can ask a sequence of prompts of a model, and the model will remember, as context, the questions and answers that came before. So think about asking ChatGPT, "Who were the last five presidents of the United States?" and then following up. You don't need to repeat the question you just asked; it's going to know that you're referring to the presidents of the U.S. The other concept is an ensemble, where instead of using one single model for everything, we can use a combination of models for slightly different parts of our application.
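Both ideas can be sketched in a few lines of Python. This is a minimal illustration, not any particular vendor's API: `fake_model` is a hypothetical stand-in for a real chat-completion endpoint, and the `ensemble` routing table is invented for the example.

```python
# Hypothetical stand-in for a real chat-completion API. It just reports
# the latest prompt and how many prior turns of context it was given.
def fake_model(messages):
    return f"answer to: {messages[-1]['content']} ({len(messages) - 1} prior turns)"

# --- Chaining: resend the full conversation history with every new prompt,
# so the model can resolve references like "which of them" to earlier turns.
def chat(history, prompt):
    history.append({"role": "user", "content": prompt})
    reply = fake_model(history)  # the model sees the whole conversation
    history.append({"role": "assistant", "content": reply})
    return reply

history = []
chat(history, "Who were the last five presidents of the United States?")
# The follow-up never repeats the question; the history carries the context.
followup = chat(history, "Which of them served two terms?")

# --- Ensemble: route different parts of the application to different models
# (here both entries point at the same stub, but in practice each task
# could use a model tuned for it).
ensemble = {"summarize": fake_model, "classify": fake_model}

def run(task, prompt):
    model = ensemble[task]  # pick the model suited to this sub-task
    return model([{"role": "user", "content": prompt}])
```

The key point of the chaining sketch is that "memory" lives in the application, not the model: each call passes the accumulated `history`, which is why the follow-up question works without restating the subject.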
