AI Engineering Podcast

Building Semantic Memory for AI With Cognee

Nov 25, 2024
Vasilije Markovich, a data engineer and AI specialist from Montenegro, discusses enhancing large language models with memory. He highlights the challenges of context window limitations and forgetting in LLMs, introducing hierarchical memory to improve performance. Vasilije dives into his creation, Cognee, which manages semantic memory, emphasizing its potential applications and the blend of cognitive science with data engineering. He shares insights from building an AI startup, the importance of user feedback, and future developments in open-source AI technology.
INSIGHT

LLM Memory as In-Context Learning

  • "Memory" in LLMs is based on in-context learning, managing context windows like a feature store.
  • This differs from training or fine-tuning, offering cheaper and potentially unlimited context (a minimal sketch of the pattern follows below).
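A minimal Python sketch of that pattern, under the assumption that facts live outside the model and are injected into the prompt at query time, much like features pulled from a feature store. The names `MemoryStore` and `build_prompt` are illustrative, not Cognee's API.

```python
# "Memory" as in-context learning: nothing is learned into the weights; remembered
# facts are fetched at query time and ride along in the context window.
from dataclasses import dataclass, field


@dataclass
class MemoryStore:
    facts: dict[str, str] = field(default_factory=dict)  # key -> remembered fact

    def put(self, key: str, fact: str) -> None:
        self.facts[key] = fact

    def lookup(self, keys: list[str]) -> list[str]:
        # Like a feature store: fetch only the facts this query needs.
        return [self.facts[k] for k in keys if k in self.facts]


def build_prompt(store: MemoryStore, keys: list[str], question: str) -> str:
    # The model never "learns" these facts; they are re-supplied on every call.
    context = "\n".join(store.lookup(keys))
    return f"Context:\n{context}\n\nQuestion: {question}"


store = MemoryStore()
store.put("user_city", "The user lives in Podgorica.")
print(build_prompt(store, ["user_city"], "What timezone am I in?"))
```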
INSIGHT

Forgetting in LLMs

  • LLMs exhibit "forgetting" due to context window limits and prioritization of recent information.
  • This is especially problematic in chain-of-thought prompting, where earlier reasoning steps get lost (see the sketch below).
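A toy illustration of context-window forgetting, assuming a crude word-count stand-in for a real tokenizer: messages are kept most-recent-first until the budget runs out, so the earliest chain-of-thought steps are silently dropped.

```python
# When the conversation exceeds the context window, the oldest messages are dropped
# first, so early reasoning steps disappear even though later steps depend on them.
def fit_to_window(messages: list[str], max_tokens: int) -> list[str]:
    kept: list[str] = []
    used = 0
    # Walk backwards so the most recent messages are prioritized.
    for msg in reversed(messages):
        cost = len(msg.split())  # crude stand-in for a real tokenizer
        if used + cost > max_tokens:
            break  # everything older than this point is "forgotten"
        kept.append(msg)
        used += cost
    return list(reversed(kept))


steps = [
    "Step 1: define variables x and y.",
    "Step 2: derive the constraint x + y = 10.",
    "Step 3: substitute into the objective.",
    "Step 4: so what was x defined as again?",
]
print(fit_to_window(steps, max_tokens=18))  # steps 1 and 2 are already gone
```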
INSIGHT

Multi-Turn Interactions and Context

  • Multi-turn interactions complicate LLM memory because shared context must be managed explicitly.
  • Without it, every turn hits the same stateless system, and the sense of continuity across turns is an illusion (see the sketch below).
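A minimal sketch of shared context management, assuming a hypothetical `call_llm` backend rather than any specific API: because the model is stateless, the application replays the accumulated history on every turn to preserve continuity.

```python
# The model itself keeps no state between calls; the application threads the prior
# turns (or a summary of them) into every new request.
from typing import Callable


class Session:
    def __init__(self, call_llm: Callable[[list[dict]], str]) -> None:
        self.call_llm = call_llm
        self.history: list[dict] = []  # shared context carried across turns

    def ask(self, user_message: str) -> str:
        self.history.append({"role": "user", "content": user_message})
        reply = self.call_llm(self.history)  # every turn sees the full history
        self.history.append({"role": "assistant", "content": reply})
        return reply


# Fake backend so the example runs without an API key.
def echo_llm(messages: list[dict]) -> str:
    return f"(seen {len(messages)} messages so far)"


session = Session(echo_llm)
print(session.ask("My name is Ana."))
print(session.ask("What is my name?"))  # works only because history is replayed
```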