This episode explores the practical uses and advancements of small language models, particularly in chip design and on-device integration, and highlights how companies like Microsoft are leveraging these models to deliver efficient, intelligent solutions locally, without relying on cloud servers.
In 2023, the AI industry spent an estimated $50 billion on Nvidia chips to train AI models. The payoff for all that spending, according to Sequoia Capital, is $3 billion in revenue. Is that a return worth bragging about?
Ricky Mulvey talks with Fool analyst Asit Sharma about how investors might think about companies’ AI spend. They also discuss:
- The rate of improvement for AI models
- How non-Mag 7 companies are using AI
- One company that’s spending smartly on the new technology
Take a look at the Gartner Hype Cycle.
Host: Ricky Mulvey
Guest: Asit Sharma
Producer: Mary Long
Engineer: Tim Sparks
Companies discussed: GOOG, MSFT, NVDA, ARM, AMD, ORCL
Learn more about your ad choices. Visit megaphone.fm/adchoices