"The Cognitive Revolution" | AI Builders, Researchers, and Live Player Analysis cover image

E5: The Embedding Revolution: Anton Troynikov on Chroma, Stable Attribution, and future of AI

"The Cognitive Revolution" | AI Builders, Researchers, and Live Player Analysis

NOTE

Linear Transformations and Embedding Spaces in Machine Learning

There has been a long-standing belief that a linear transform is insufficient to map one embedding space onto another, even when the two spaces encode similar semantics, but recent research suggests this may not be the case.
Models tend to learn similar representations of the same concepts, which makes mapping between their embedding spaces straightforward.
Machine learning research could benefit from a more empirical approach: observing and demonstrating findings such as the invariance of embeddings under rotations and scaling.
This invariance of vector-space operations opens up the possibility of performing computations on data without knowing its specific content, potentially enabling homomorphic computation.
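The claim that one embedding space can be mapped onto another with a single linear transform can be sketched with the classic orthogonal Procrustes problem. The example below is a minimal NumPy illustration on synthetic data (the dimensions, the scale factor, and the random rotation are all assumptions for demonstration, not anything from the episode): it simulates a second "model" whose embeddings are a rotated and scaled copy of the first, then recovers the linear map via an SVD.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical source embeddings: 100 points in 8 dimensions.
X = rng.normal(size=(100, 8))

# Simulate a second embedding space that represents the same data,
# differing only by a rotation and a uniform scale -- the kind of
# invariance the note describes.
Q, _ = np.linalg.qr(rng.normal(size=(8, 8)))  # random orthogonal matrix
Y = 2.0 * X @ Q                               # rotated and scaled copy

# Orthogonal Procrustes: the W minimizing ||X W - Y||_F over orthogonal
# matrices is U V^T, where U S V^T is the SVD of X^T Y.
U, S, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

# Recover the uniform scale separately (scaled-Procrustes formula).
scale = S.sum() / np.trace(X.T @ X)

# If the two spaces really are linearly related, the residual is ~0.
residual = np.linalg.norm(scale * X @ W - Y)
print(residual)
```

Here the linear map is recovered exactly because the second space was constructed as a rotated, scaled copy; with two independently trained real models the residual would be nonzero but, per the research discussed, often surprisingly small.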

