6min chapter


#59 - Jeff Hawkins (Thousand Brains Theory)

Machine Learning Street Talk (MLST)

CHAPTER

Achieve Near-Linear Speed-Up in Sparse Networks With Cerebras

The Cerebras chip has 850,000 cores and forty gigabytes of memory on board. The Cerebras cores never multiply by zero; all the zeros are filtered out, which provides a performance advantage by doing useful work during those cycles, not to mention the power and efficiency savings. But is there something fundamentally special about sparsity? Numenta certainly seem to think so. They anecdotally point to the brain as being sparse. But the brain is probably sparse for the same reason that I don't get up every morning and travel to every city in the UK. It doesn't seem like a profound insight, to be honest.
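The zero-skipping idea is easy to illustrate in software, even though on the Cerebras chip it happens in the hardware dataflow rather than in a loop. The sketch below (plain NumPy, illustrative only, not Cerebras' actual execution model) counts multiplies in a dense matrix-vector product versus one that only feeds non-zero weights to the multiplier; with roughly 90% sparse weights the multiply count, and hence the cycles that could be reclaimed for useful work, drops by about 10x.

```python
import numpy as np

def dense_matvec(W, x):
    """Baseline: multiply every weight, including the zeros."""
    y = np.zeros(W.shape[0])
    multiplies = 0
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            y[i] += W[i, j] * x[j]
            multiplies += 1
    return y, multiplies

def zero_skipping_matvec(W, x):
    """Toy zero-skipping: only non-zero weights reach the multiplier,
    so the multiply count shrinks in proportion to the sparsity."""
    y = np.zeros(W.shape[0])
    multiplies = 0
    for i in range(W.shape[0]):
        for j in np.nonzero(W[i])[0]:
            y[i] += W[i, j] * x[j]
            multiplies += 1
    return y, multiplies

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.standard_normal((256, 256))
    W[rng.random(W.shape) < 0.9] = 0.0   # make ~90% of the weights zero
    x = rng.standard_normal(256)

    y_dense, n_dense = dense_matvec(W, x)
    y_sparse, n_sparse = zero_skipping_matvec(W, x)

    assert np.allclose(y_dense, y_sparse)   # same result, far fewer multiplies
    print(f"dense multiplies:        {n_dense}")
    print(f"zero-skipping multiplies: {n_sparse}")
    print(f"potential speed-up:       {n_dense / n_sparse:.1f}x")
```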

