Groq uses special chips connected in a network, where each chip acts both as a processing unit and as a router communicating with neighboring chips. The firm's software minimizes latency variation, enabling a group of chips to function as a single efficient unit, which simplifies programming and makes performance highly predictable. This approach allows models to be processed roughly ten times faster than on leading alternatives. The concept is likened to a conductor in a supermarket coordinating movement efficiently, reducing wait times and optimizing operations. Major players such as Google, Amazon and Meta are also investing in AI chips for cloud computing.
When it comes to the chips used in artificial intelligence, one firm has the market locked up. We look at the rivals minded to steal Nvidia’s crown. The death toll from the war in Gaza has been disputed since the start; we cut through the numbers to find a reliable estimate (10:19). And our correspondent examines the great rematches of fiction (16:07).