
00:00
Speaker 2
No, I think it's a smart point to make, because I think a lot of what is missing from so many investment discussions, especially as it pertains to tech and AI and all the excitement and hoopla and, kind of, the "hopium," to borrow a term from the cannabis industry, that's happening over there. I think it behooves investors to think about the nuance and the specificity of different stocks and the different tiers in AI. So speaking to the different tiers, if we could zoom out a couple of layers, how would you think about, or how do you define, the different AI tiers?
Speaker 1
Yeah, so it's a great thing to visualize, or at least mentally understand as you go to invest, that AI has different layers. You have the players who are in the hardware business who are driving it. So you have to think about artificial intelligence: the name implies it's artificial, so you require some kind of silicon to run these data computations. You need a compute platform. So you start at the ground level and you say, I need chips, silicon, storage, memory to even think about processing the data that AI requires. And the data is immense. We're talking tens of billions of parameters for these AI models, especially generative AI. But there's going to be more than generative AI, and there already is. There's plenty of other facets of the end result. You've got anything from recommendation systems to machine reading, healthcare and diagnostics and things like that, reading images and trying to diagnose things using machine learning. So you have a lot of different facets of it, but it all starts with the hardware aspect. It all starts with, how do you process all that data?
And there's a lot of players in it. You have the NVIDIAs, the AMDs and the Intels. They're all competing for the GPU side of it, the accelerator part of it. But those accelerators are made up of chips, which are made by Taiwan Semiconductor. They have memory made by Micron and SK Hynix and Samsung. You have networking and the interconnects that go with them: Mellanox, which is an NVIDIA subsidiary and makes InfiniBand. But you have others like Arista Networks who are creating that build-out capability of an entire data center, not just linking two GPUs or eight GPUs together, but linking servers and linking nodes of servers together. That requires fast switches; that requires hardware. But what Arista is finding is that it also requires a great software stack. So while they're in the hardware business, they have a great software stack, and that goes back to NVIDIA too, which is one of the reasons I favor NVIDIA over AMD or Intel: they have the software stack. They have everything from the operating system all the way down to the libraries and the frameworks that these AI applications run on. You can't exclude any one thing. But on a high level, those are the guys that are building the foundation. They're the ones essentially selling the picks and the shovels of a gold rush, to borrow an analogy. And just like in a gold rush, the ones who are making the most money were the ones selling those foundational tools to go and dig out the gold, or in this case to go and compute the data or provide it with data. And I'd be remiss to miss Western Digital, also, who provides solid-state drives, on top of Micron and several other companies. So you have the GPUs, you have the motherboards, you have the interconnects, you have the networking, you have the memory, you have the storage. So that all comes together.
Now you go one level to the right, and somebody's got to put all that together. So you have what I call the AI middlemen. You have the Super Micros, who are putting together these servers in racks; you have the Dells, the Lenovos, the HP Enterprises. They're all then building all of that into a product that gets shipped off and actually installed in a data center. So you have this layer that is basically the supply chain further down the line, not the ones necessarily supplying the chips and the silicon like Taiwan Semiconductor might be.
But you have these guys who are then taking all that, putting it together and making a product. They may be overclocking some of these components, the memory, the other pieces of the systems on a chip. But they're packaging it together and then providing it to the AWS side of things, the cloud providers. But more importantly, even to the enterprise and the private cloud providers, because NVIDIA can contract and get their own servers built; that's not a big deal. But take a company, a global company that does IT work, so a Deloitte or something like that, or a General Dynamics IT. They have many data centers that they own that run their own products. They're going to need to purchase those racks and those servers for generative AI, or whatever AI they're looking to use or get value out of. So these middlemen provide that product. It contains the AMD chip, or it contains the NVIDIA chip, or the Intel chip, and it comes ready to slide into a rack: plug in the networking, configure it, and you're good to go. So they serve a purpose. They can't be left out, because without them you don't get from point A to point C. They're kind of that point B to carry those products across.
And then finally, you get to what I call the retail side of AI, which is really the end user, where you get the output, the data output and the result that you would expect with Microsoft Copilot or Bing, or Gemini with Google. And that's how the whole story comes together: somebody has to create the application, run the model, decide on the data, generate the output, the inferencing that then has to happen to give the answer or the result to the end user, somebody like us sitting on the other side of the screen typing something into Google.
But then you have others, like I talked about before, Meta Platforms, that are using it mainly internally as a recommendation system so that they have better engagement with their users, so that the users are seeing stuff that's relevant to them, so that they keep scrolling. And you know, whether you agree or not, their business is to keep you glued to that screen. And from there, it's to serve advertisements. So they not only recommend more content that keeps you engaged, but it then provides great targeting for advertisements that are relevant, and then provides the analytics of those ads for the marketer who wanted to run them. So they have, like, a three-pronged approach: we've got to have more and better content that's relevant and surface it to the user so we keep them engaged and on the app longer, on Instagram, Facebook, you know, across all their family of apps. Then we have to recommend relevant ads that are going to get clicked and that are going to be seen. And then we need to analyze: what did that do? Did that lead to a sale? Did that lead to a visit? How long did they look at it? How many impressions was that? Where did they come from? And provide those analytics back to the marketers. And they're using AI to provide that better and faster, and this is why Meta Platforms was able to lay off, you know, not only just content moderators, but people who were manually going out and working with, like, the Coca-Colas or the Pepsis or the General Mills of advertising and building, you know, manual ad products or manual ad campaigns.
The AI is now able to do that, and to understand and provide marketing content in the sense of: hey, we can run an ad and it looks like this, it sounds like this, it's going to have this content the AI can generate. So they have the generative AI on top of it for content creation for advertisers.
So you go through these three tiers and you see how everyone has a benefit. For example, the Microsofts, the Meta Platforms, they're paying the NVIDIAs and the Super Micros and the Dells that money, but then they're turning around and selling a better product so that they can get more money from their user base or their advertisers. And you have this whole AI economy that has sprung up. And this is why I tell my readers and my subscribers, you can't listen to people saying that AI is just hype, or that it's vaporware, or, you know, that it's going to go away. This is a revolution, truly revolutionary, and everybody goes, well, this time can't be different. Yeah, but once every couple of decades there is that situation: you have the internet onset, you have the iPhone moment, and now you have the AI moment. It's just the next one of those, three in the last three decades, to bring about this new revolution. We can only go forward at this point. We can't back out of it, because too many other competitors are now doing the same thing, where if you don't do it, you fall behind and you risk going out of business. And if you want to get in business, you'd better have a better AI approach than the next guy. So this isn't something that's just going to be fleeting or going to go away.
Now, will it digest? Sure. Yeah, I think it's going to digest. I think there's going to be tons of spending, hundreds of billions of dollars of spending, over these next two years, and even from the past year, that's going to have to get digested. But without going too far into the weeds here, we're also cannibalizing other aspects of the same data center environment. We're not going toward CPU-based computing anymore; we're going to GPU-based computing. And the CEO of NVIDIA is right that this is the time that we go to a different compute mechanism. This is why AMD is struggling even though it has more AI revenue, especially across this year: its CPU side is hurting. There's a reason why Intel's CPU business is hurting. Nobody wants to admit it, but there is a cannibalization going on. And I think that's going to mitigate some of the cooling down, or some of the consolidation, of this capital spending, because we're actually taking over another industry in and of itself, and that's the CPU-based industry.
So, yeah, there's a lot that goes in there. It's a long-winded way of saying I have built out three tiers, where you have the hardware guys, the middlemen and the retailers. And there's winners in every one, and there's losers in every one. It's the ones that are going to get left behind, because they didn't keep up, or they didn't find the next big thing, or they didn't skate to where the puck is going as opposed to where the puck
Speaker 2
already is. Yeah, there's so much to digest here. As you say, when life changes so radically, there's good parts and there's bad parts. Speaking to the cannibalization of the sector that it finds itself in, is there going to be a time in the near future where you will favor some of the tiers over others in terms of where we are in the cycle?
Speaker 1
I'm not sure I will favor one tier over the other, necessarily. To your point, though, if companies stop spending on hardware and want to digest what they've already built out, and these data centers need some time to consolidate that volume, then yeah, maybe the semiconductor side is not going to be where you want to be. The hardware side may see a cooling off while the retail side cuts back on spending and increases profit. So you might have that give and take across both of them. But, I mean, again, I would be more specific: it's a ticker-by-ticker basis as to who's benefiting. So even in the worst of times for the semiconductors, NVIDIA might still be doing better than the rest and might correct less than the rest as far as stock returns go. Just as Meta Platforms might be better than the rest and might correct less than the rest, because it has the best approach to AI and it's taking market share hand over fist from Google and whoever else, the big competitors, in advertising. It cuts back on capital expenditures and investors are happy because they're not spending as much. But, again, it really depends on where it is in the cycle on the stock chart as well, as much as what individual winners and losers are going to see in each tier.
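To make the guest's three-tier framing concrete, here is a minimal, purely illustrative sketch in Python that organizes the companies named in this conversation into the hardware, middleman and retail tiers. The groupings and tickers are assumptions drawn only from this discussion; it is not a complete industry map, a definitive taxonomy, or an investment recommendation.

# Illustrative sketch only: the three AI tiers as described in this conversation,
# populated with the companies the guest names. Assumed groupings, not advice.
AI_TIERS = {
    "hardware": {  # the picks-and-shovels foundation
        "gpus_accelerators": ["NVDA", "AMD", "INTC"],
        "foundry": ["TSM"],                          # Taiwan Semiconductor fabricates the chips
        "memory": ["MU", "SK Hynix", "Samsung"],
        "networking": ["Mellanox (NVIDIA)", "ANET"],  # InfiniBand, data-center switches
        "storage": ["WDC"],
    },
    "middlemen": {  # system integrators building rack-ready servers
        "server_integrators": ["SMCI", "DELL", "Lenovo", "HPE"],
    },
    "retail": {  # end-user applications and services
        "assistants_and_search": ["MSFT", "GOOGL"],   # Copilot/Bing, Gemini
        "recommendations_and_ads": ["META"],
    },
}

def tier_of(name):
    """Return the tier a named company or ticker appears in, or None."""
    for tier, groups in AI_TIERS.items():
        if any(name in members for members in groups.values()):
            return tier
    return None

print(tier_of("ANET"))   # hardware
print(tier_of("SMCI"))   # middlemen
print(tier_of("META"))   # retail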
