Dr. Lisa Su, CEO of AMD, talks about the chip supply chain and AMD's competition with Nvidia's H100 chip. Su discusses AMD's new MI300 chip, which aims to match the H100's speed. She also shares insights on AMD's use of AI, manufacturing capacity, and the challenges of regulating AI. The podcast concludes with a discussion of the Xilinx acquisition and the future of gaming.
The chip supply chain is facing tight availability of high-end GPUs for AI models, prompting AMD to develop the MI300 chip to compete with Nvidia's H100 in terms of speed.
Creating hardware-agnostic software frameworks like PyTorch enables developers to write code that can run on different hardware platforms, fostering innovation and flexibility in AI programming.
Deep dives
AI, Chip Supply Chain, and the MI300 Chip
The podcast episode explores the current state of the chip supply chain, particularly in relation to AI and the high demand for GPUs. Dr. Lisa Su, CEO of AMD, discusses the global chip shortage and the spike in demand for high-end GPUs that power AI models. She mentions that while overall supply and demand are in good balance, the availability of GPUs for large language model training and inference is tight. AMD is working on a new chip called the MI300, which aims to compete with Nvidia's H100 chip in terms of speed, and efforts are being made to ensure software compatibility across Nvidia and AMD hardware. Dr. Su also touches on the CHIPS and Science Act, which aims to build chip manufacturing capacity in the United States, and discusses the importance of diversifying the supply chain and increasing manufacturing capacity to meet the growing demand for chips.
AI Software Stacks and the Open Approach
The podcast episode delves into the topic of AI software stacks and the different approaches taken by companies. Dr. Lisa Su highlights the importance of hardware-agnostic software and the role of frameworks like PyTorch in enabling developers to write code that can run on different hardware platforms. She emphasizes the significance of an open approach, allowing for innovation and collaboration across the industry. The goal is to make AI programming more seamless and provide developers with choice and flexibility in utilizing different hardware infrastructures for their AI applications.
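To illustrate what hardware-agnostic code looks like in practice, here is a minimal PyTorch sketch (not from the episode): the same script runs on an Nvidia GPU with the CUDA build or an AMD GPU with the ROCm build, because PyTorch's ROCm build also exposes accelerators through the "cuda" device API, so the developer does not have to target either vendor explicitly.

import torch

# Pick whatever accelerator is available; falls back to CPU otherwise.
# On a ROCm (AMD) build of PyTorch, torch.cuda.is_available() reports the AMD GPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(1024, 1024).to(device)   # model weights live on the selected device
x = torch.randn(8, 1024, device=device)          # input tensor allocated on the same device
y = model(x)                                     # PyTorch dispatches to the right backend

print(f"Ran forward pass on: {device}")

The framework, not the application code, decides which vendor's kernels to call, which is the kind of flexibility Su describes.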
Regulation and Considerations for AI
The podcast episode touches upon the topic of AI regulation and the considerations surrounding it. Dr. Lisa Su acknowledges the importance of safety and privacy in AI development while recognizing the challenges in implementing regulatory measures effectively. She expresses willingness to be part of the discussion and emphasizes the need for collaboration and public-private partnerships to address concerns related to data privacy, bias prevention, and adherence to export controls. Dr. Su acknowledges the complexity of this issue and the ongoing learning process required to strike the right balance between technological advancement and responsible AI usage.
Long-Term Bets and Vision for the Future
The podcast episode concludes with a discussion of long-term bets and strategic decisions in the chip industry. Dr. Lisa Su mentions that AMD designs with a five-plus-year cycle in mind, aiming to anticipate future trends in computing. She highlights the company's focus on AI and its commitment to delivering high-performance chips for various applications, including gaming. Dr. Su expresses confidence in the gaming market, emphasizing the importance of providing choice and catering to different types of gaming platforms, such as PCs, consoles, and cloud-based gaming services.
Today, we’re bringing you something a little different. The Code Conference was this week, and we had a great time talking live onstage with all of our guests. We’ll be sharing a lot of these conversations here in the coming days, and the first one we’re sharing is my chat with Dr. Lisa Su, the CEO of AMD.
Lisa and I spoke for half an hour, and we covered an incredible number of topics, especially about AI and the chip supply chain. The balance of supply and demand is overall in a pretty good place right now, Lisa told us, with the notable exception of these high-end GPUs powering all of the large AI models that everyone’s running. The hottest GPU in the game is Nvidia’s H100 chip. But AMD is working to compete with a new chip Lisa told us about called the MI300 that should be as fast as the H100. You’ll also hear Lisa talk about what companies are doing to increase manufacturing capacity.
Finally, Lisa answered questions from the amazing Code audience and talked a lot about how much AMD is using AI inside the company right now. It’s more than you think, although Lisa did say AI is not going to be designing chips all by itself anytime soon.