Some people say that all that’s necessary to improve the capabilities of AI is to scale up existing systems: that is, to use more training data, to build larger models with more parameters, and to use more computer chips to crunch through that data. However, in this episode, we’ll be hearing from a computer scientist who thinks there are many other options for improving AI. He is Alexander Ororbia, a professor at the Rochester Institute of Technology in New York State, where he directs the Neural Adaptive Computing Laboratory.
David had the pleasure of watching Alex give a talk at the AGI 2024 conference in Seattle earlier this year, and found it fascinating. We hope that, after hearing this episode, you reach a similar conclusion.
Selected follow-ups: