
Vector Podcast Trey Grainger - Wormhole Vectors
Nov 7, 2025
Trey Grainger, lead author of "AI Powered Search" and founder of Search Kernel, dives into the cutting-edge concept of Wormhole Vectors. He explains how these vectors interconnect various types of data spaces, enhancing hybrid search capabilities. Trey simplifies complex ideas, detailing behavioral embeddings derived from user interactions and the roles of semantic knowledge graphs. He shares practical applications and innovative methods to combine dense and sparse vectors, all while emphasizing the transformative potential of wormholes in search technology.
Embeddings Represent Meaning As Points
- Embeddings are points in a continuous vector space that capture meaning via latent features.
- Averaging embeddings produces meaningful interpolations between concepts, such as a 'puppy Darth Vader' midway between 'puppy' and 'Darth Vader'.
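The interpolation idea above can be sketched in a few lines of NumPy. This is a minimal illustration with made-up 4-dimensional vectors (real embedding models use hundreds or thousands of dimensions); the values and names are assumptions, not output from any actual model.

```python
import numpy as np

# Hypothetical embeddings for two concepts (toy values, 4-d for readability).
puppy = np.array([0.9, 0.1, 0.8, 0.2])
darth_vader = np.array([0.1, 0.9, 0.2, 0.6])

# The average of two embeddings is itself a point in the same space,
# representing a blend of the two concepts -- the 'puppy Darth Vader'.
blend = (puppy + darth_vader) / 2
print(blend)  # [0.5 0.5 0.5 0.4]
```

A nearest-neighbor search around `blend` would then surface items that mix attributes of both concepts, which is what makes interpolation in embedding space useful.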
Dense And Sparse Spaces Are Complementary
- Dense embeddings compress semantic attributes into latent dimensions, while sparse vectors map directly to lexical terms.
- Both dense and sparse vectors are valid vector spaces but support different query paradigms and strengths.
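The contrast between the two representations can be made concrete with a small sketch: a dense vector is a fixed-length array of latent dimensions, while a sparse vector maps lexical terms to weights. The dimensions, terms, and weights below are invented for illustration.

```python
import numpy as np

# Dense embedding: every dimension is a latent (learned, opaque) feature.
dense = np.array([0.12, -0.53, 0.88, 0.05])

# Sparse vector: every dimension is an explicit lexical term with a weight.
sparse = {"wormhole": 2.4, "vector": 1.1, "search": 0.7}

def cosine(a, b):
    """Dense similarity: cosine over all latent dimensions."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def sparse_dot(query, doc):
    """Sparse similarity: dot product over the terms the vectors share."""
    return sum(w * doc[t] for t, w in query.items() if t in doc)

# Dense supports fuzzy semantic matching; sparse supports exact term matching.
print(cosine(dense, dense))                    # 1.0 (identical vectors)
print(sparse_dot({"wormhole": 2.0}, sparse))   # 4.8 (2.0 * 2.4)
```

The dense form excels at "means roughly the same thing" queries; the sparse form excels at "contains exactly this term" queries, which is why the two paradigms complement rather than replace each other.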
Hybrid Search Often Treats Spaces Separately
- Hybrid search often runs sparse and dense searches separately and fuses the result lists, e.g., with Reciprocal Rank Fusion (RRF).
- Treating spaces separately misses opportunities to translate meaning between them for better recall and relevance.


