3min snip


Brainstorming ChatGPT Business Ideas With Billionaire Dharmesh Shah

My First Million

NOTE

Reducing Concepts Down to Vectors

Vector embeddings are a way to represent concepts as numbers in many dimensions. In one dimension, a point on a line is described by its distance from the origin. In two dimensions, a point is defined by its coordinates, and the distance between two points can be calculated mathematically. The same holds in three-dimensional space. In an abstract space with, say, 1,000 dimensions, any concept, such as a paragraph or a tweet, can be reduced to a point described by a list of 1,000 numbers called a vector. The distance between two vectors then represents semantic distance, the meaning-based relationship between the concepts, rather than mere keyword overlap. Vector embeddings therefore make it possible to measure how similar two concepts are even when they use entirely different words. This is a significant opportunity in AI: moving away from crude keyword-based matching toward more intelligent, nuanced search.
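As a rough illustration of the idea (not tied to any particular embedding model), here is a minimal Python sketch. The tiny 4-dimensional vectors below are hand-written stand-ins for the 1,000-dimensional embeddings a real model would produce; the point is only to show how cosine similarity scores meaning-based closeness rather than keyword overlap.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: near 1.0 means similar meaning, near 0 means unrelated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional "embeddings" standing in for real 1,000-dimensional vectors.
# In practice these would come from an embedding model, not be hand-written.
embeddings = {
    "The cat sat on the mat":     np.array([0.9, 0.1, 0.0, 0.2]),
    "A kitten rested on the rug": np.array([0.8, 0.2, 0.1, 0.3]),
    "Quarterly revenue grew 12%": np.array([0.0, 0.9, 0.8, 0.1]),
}

query = embeddings["The cat sat on the mat"]
for text, vec in embeddings.items():
    print(f"{cosine_similarity(query, vec):.3f}  {text}")

# The kitten sentence scores close to 1.0 despite sharing no keywords,
# while the revenue sentence scores much lower: similarity by meaning, not words.
```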
