Thomas Dietterich: From the Foundations

The Gradient: Perspectives on AI

CHAPTER

Error Correcting Output Codes and Multi-Class Learning

The speakers discuss a paper on solving multi-class learning problems via error-correcting output codes. They explore the idea of representing output phonemes with a distributed representation and draw an analogy to error-correcting codes. They also mention other ensemble techniques such as boosting, bagging, and random forests, and discuss whether deep learned representations might exhibit error-correcting behavior.
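To make the idea concrete, here is a minimal sketch of error-correcting output codes in the spirit of the approach discussed in this chapter: each class gets a binary codeword, one binary classifier is trained per codeword bit, and predictions are decoded to the nearest codeword in Hamming distance. The random code matrix, logistic-regression base learner, and digits dataset are illustrative choices, not the paper's exact setup.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# A 10-class problem to stand in for a multi-class task.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

n_classes = 10
code_length = 15  # number of binary subproblems; longer codes give more error correction

# Assign each class a binary codeword (one row of the code matrix).
# Designed codes maximize Hamming distance between rows; random codes
# are a simple stand-in here.
code_matrix = rng.integers(0, 2, size=(n_classes, code_length))

# Regenerate any degenerate (constant) columns so every bit defines a
# genuine binary problem.
for bit in range(code_length):
    while code_matrix[:, bit].min() == code_matrix[:, bit].max():
        code_matrix[:, bit] = rng.integers(0, 2, size=n_classes)

# Train one binary classifier per codeword bit.
classifiers = []
for bit in range(code_length):
    binary_targets = code_matrix[y_train, bit]  # relabel classes as 0/1 for this bit
    clf = LogisticRegression(max_iter=2000).fit(X_train, binary_targets)
    classifiers.append(clf)

# Predict all bits, then decode each example to the class whose codeword
# is nearest in Hamming distance -- the "error-correcting" step.
predicted_bits = np.column_stack([clf.predict(X_test) for clf in classifiers])
hamming = (predicted_bits[:, None, :] != code_matrix[None, :, :]).sum(axis=2)
y_pred = hamming.argmin(axis=1)

print("accuracy:", (y_pred == y_test).mean())
```

Because decoding picks the nearest codeword rather than requiring every bit to be correct, a few misclassified bits can be tolerated, which is the analogy to error-correcting codes drawn in the conversation.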
