Local GenAI LLMs with Ollama and Docker

DevOps and Docker Talk: Cloud Native Interviews and Tooling

NOTE

Inaccuracies in model predictions

A demonstration at DockerCon 2023 showcased a scenario where a model was asked how to read PDFs using LangChain. The model's answer had no real-world basis; it essentially made up the information. This highlights a core challenge: when a model lacks relevant data, it still predicts the most likely continuation from what it has seen, producing confident output even when it is not factual.
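One common way to reduce this kind of made-up answer is to hand the model the relevant source text directly, so it answers from supplied context rather than guessing. The sketch below is a minimal illustration of that idea with a local Ollama server, not the DockerCon demo itself: it assumes Ollama is running on its default port (11434) with a model such as "llama3" already pulled, and "manual.pdf" is a hypothetical document used for the example.

```python
# Minimal sketch: ground a local Ollama model with text pulled from a PDF,
# so answers come from supplied context instead of the model's guesses.
# Assumes a local Ollama server on port 11434 and a pulled "llama3" model.
import requests
from pypdf import PdfReader


def pdf_text(path: str) -> str:
    """Concatenate the extractable text of every page in the PDF."""
    reader = PdfReader(path)
    return "\n".join(page.extract_text() or "" for page in reader.pages)


def ask_with_context(question: str, context: str, model: str = "llama3") -> str:
    """Ask the local Ollama REST API, instructing the model to stick to the context."""
    prompt = (
        "Answer the question using only the context below. "
        "If the answer is not in the context, say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    # "manual.pdf" is a placeholder document for this example.
    context = pdf_text("manual.pdf")
    print(ask_with_context("How do I configure the service?", context))
```

Stuffing the whole document into the prompt only works for small files; for anything larger, the same idea is usually implemented with chunking and retrieval (for example via LangChain), returning only the passages relevant to the question.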
