

Let's Know Things
Colin Wright
A calm, non-shouty, non-polemical, weekly news analysis podcast for folks of all stripes and leanings who want to know more about what's happening in the world around them. Hosted by analytic journalist Colin Wright since 2016. letsknowthings.substack.com
Episodes

Oct 10, 2023 • 15min
Nvidia
This week we talk about AMD, graphics processing units, and AI.
We also discuss crypto mining, video games, and parallel processing.
Recommended Book: The Story of Art Without Men by Katy Hessel
Transcript
Founded in 1993 by an engineer who previously designed microprocessors for semiconductor company AMD, an engineer from Sun Microsystems, and a graphics chip designer and senior engineer from Sun and IBM, NVIDIA was focused on producing graphics-optimized hardware because of a theory held by those founders that this sort of engineering would allow computers to tackle new sorts of problems that conventional computing architecture wasn't very good at. They also suspected that the video game industry, which was still pretty nascent, but rapidly growing, this being the early 90s, would become a big deal, and the industry was already running up against hardware problems, computing-wise, both in terms of development, and in terms of allowing users to play games that were graphically complex and immersive.
So they scrounged about $40k between them, started the company, and then fairly quickly were able to attract serious funding from Silicon Valley VCs, initially to the tune of $20 million. It took them a little while, about half a decade, to get their first real-deal product out the door, but a graphics accelerator chip they released in 1998 did pretty well, and their subsequent product, the GeForce 256, which empowered consumer-grade hardware to do impressive new things, graphically, made their company, and their GeForce line of graphics cards, into an industry standard piece of hardware for gaming purposes.
Graphics cards, those of the dedicated or discrete variety, which basically means the card is a separate piece of hardware from the motherboard, the main computer hardware, give a computer or other device enhanced graphics powers, lending it the ability to process graphical stuff separately, with tech optimized for that purpose, which in turn means you can play games or videos or whatnot that would otherwise be sluggish or low-quality, or in some cases, it allows you to play games and videos that your core system simply wouldn't be capable of handling.
These cards are circuit boards that are installed into a computer's expansion slot, or in some cases attached using a high-speed connection cable.
Many modern video games require dedicated graphics processors of this kind in order to function, or in order to function at a playable speed and resolution; lower-key, simpler games work decently well with the graphics capabilities included in the core hardware, but the AAA-grade, high-end, visually realistic stuff almost always needs this kind of add-on to work, or to work as intended.
And these sorts of add-ons have been around since personal computers have been around, but they really took off on the consumer market in the 1980s, as PCs started to become more visual—the advent of Windows and the Mac made what was previously a green-screen, number and character-heavy interface a lot more colorful and interactive and intuitive for non-programmer users, and as those visual experiences became more complex, the hardware architecture had to evolve to account for that, and often this meant including graphics cards alongside the more standard components.
A huge variety of companies make these sorts of cards, these days, but the majority of modern graphics cards are designed by one of two companies: AMD or Nvidia.
What I'd like to talk about today is the latter, Nvidia, a company that seems to have found itself in the right place at the right time, with the right investments and infrastructure, to take advantage of a new wave of companies and applications that desperately need what it has to offer.
—
Like most tech companies, Nvidia has been slowly but surely expanding its capabilities and competing with other entities in this space by snapping up other businesses that do things it would like to be able to do.
It bought-out the intellectual assets of 3dfx, a fellow graphics card-maker, in late-2000, grabbed several hardware designers in the early 2000s, and then it went about scooping-up a slew of graphics-related software-makers, to the point where the US Justice Department started to get anxious that Nvidia and its main rival, AMD, might be building monopolies for themselves in this still-burgeoning space that was increasingly important to the computing and gaming industries.
Nvidia was hit hard by lawsuits related to defects in its products in the late 20-aughts, and it invested heavily in producing mobile-focused systems on a chip—holistic, small form-factor microchips that ostensibly include everything device-makers might need to build smartphones or gaming hardware—and even released its own gaming pseudo-console, the Nvidia Shield, in the early 20-teens.
The company continued to expand its reach in the gaming space in the mid-to-late-20-teens, while also expanding into the automobile media center industry—a segment of the auto-industry that was becoming increasingly digitized and connected, removing buttons and switches and opting for touchscreen interfaces—and it also expanded into the broader mobile device market, allowing it to build chips for smartphones and tablets.
What they were starting to realize during this period, though—and this is something they began looking into and investing in, in earnest, back in 2007 or so, through the early 20-teens—is that the same approach they used to build graphics cards, basically lashing a bunch of smaller chip cores together, so they all worked in parallel, which allowed them to do a bunch of different stuff, simultaneously, also allowed them to do other things that require a whole lot of parallel functionality—and that's in contrast to building chips with brute strength, but which aren't necessarily capable of doing a bunch of smaller tasks in parallel to each other.
So in addition to being able to show a bunch of complex, resource-intensive graphics on screen, these parallel-processing chip setups could also allow them to, for instance, do complex math, as is required for physics simulations and heavy-duty engineering projects; they could simulate chemical interactions, like pharmaceutical companies need to do; or—and this turned out to be a big, important use-case—they could run the sorts of massive data centers tech giants like Google and Apple and Microsoft were beginning to build all around the world, to crunch all the data being produced and shuffled here and there for their cloud storage and cloud computing architectures.
In the years since, that latter use-case has far surpassed the revenue Nvidia pulls in from its video game-optimized graphics processing units.
And another use-case for these types of chip architectures, that of running AI systems, looks primed to take the revenue crown from even those cloud computing setups.
Nvidia's most recent quarterly report showed that its revenue tied to its data-center offerings more than doubled over the course of just three months, and it's generally expected that this revenue will more than quadruple, year-over-year, and all of this despite a hardware crunch caused by a run on its highest-end products by tech companies wanting to flesh-out their AI-related, number-crunching setups; it hasn't been able to meet the huge surge in demand that has arisen over the past few years, but it's still making major bank.
Part of why Nvidia's hardware is so in demand for these use-cases is that, back in 2006, it released the Compute Unified Device Architecture, or CUDA, a parallel computing platform and programming model that allows users to write applications for GPUs, graphics processing units, rather than conventional computing setups.
This is what allows folks to treat these gobs of parallel-linked graphics processing units like highly capable computers, and it's what allows them to use gaming-optimized hardware for simulating atoms or managing cloud storage systems or mining Bitcoin.
CUDA now has 250 software libraries, which is huge compared to its competitors, and that allows AI developers—a category of people who are enjoying the majority of major tech investment resources at the moment—to perch their software on hardware that can handle the huge processing overhead necessary for these applications to function.
Other companies in this space are making investments in their software offerings, and the aforementioned AMD, which is launching AI-focused hardware, as well, uses open source software for its tech, which has some benefits over Nvidia's largely proprietary libraries.
Individual companies, too, including Amazon, Microsoft, and Google, are all investing in their own, homegrown, alternative hardware and software, in part so they can be less dependent on companies like Nvidia, which has been charging them an arm-and-a-leg for their high-end products, and which, again, has been suffering from supply shortages because of all this new demand.
So these big tech companies don't want to be reliant on Nvidia for their well-being in this space, but they also want to optimize their chips for their individual use-cases, so they're throwing tons of money at this problem, hoping to liberate themselves from future shortages and dependency issues, and to maybe even build themselves a moat in the AI space in the future, if they can develop hardware and software for their own use that their competition won't be able to match.
And for context, a single system with eight of Nvidia's newest, high-end GPUs for cloud data center purposes can cost upward of $200,000, which is about 40-times the cost of buying a generic server optimized for the same purposes; so this is not a small amount of money, considering how many of those systems these companies require just to function at a base level, but these companies are still willing to pay those prices, and are in fact scrambling to do so, hoping to get their hands on more of these scarce resources, which further underlines why they're hoping to make their own, viable alternatives to these Nvidia offerings, sooner rather than later.
Despite those pressures to move away to another option, though, Nvidia enjoys a substantial advantage in this market, right now, because of the combination of its powerful hardware and the CUDA language library.
That's allowed it to rapidly climb the ranks of highest-value global tech companies, recently becoming the first semiconductor company to hit the $1 trillion valuation mark, bypassing Tesla and Meta and Berkshire Hathaway, among many other companies along the way, and something like 92% of AI models are currently written in PyTorch—a machine learning framework that uses the Torch library, and which is currently optimized for use on Nvidia chips because of its cross-compatibility with CUDA; so this advantage is baked-into the industry for the time-being.
That may change at some point, as the folks behind PyTorch are in the process of evolving it to support other GPU platforms, like those run by AMD and Apple.
But at the moment, Nvidia is the simplest default system to work with for the majority of folks working in AI; so they have a bit of a head start, and that head start was in many ways enabled and funded by their success in the video game industry, and then the few years during which they were heavily funded by the crypto-mining industry, all of which provided them the resources they needed to reinforce that moat and build-out their hardware and software so they were able to become the obvious, default choice for AI purposes, as well.
So Nvidia is absolutely killing it right now, their stock having jumped from about $115 a share a year ago to around $460 a share, today, and they're queued up to continue selling out every product they make as fast as they can make them.
But we're entering a period, over the next year or two, during which that dominance will start to be challenged, with more AI code becoming transferable to software and hardware made by other companies, and more of their customers building their own alternatives; so a lot of what's fueling their current success may start to sputter if they aren't able to build some new competitive advantages in this space, sometime very soon, despite their impressive, high-flying, stock-surging, valuation-ballooning performance over these past few years.
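To make the PyTorch-and-CUDA relationship described above a bit more concrete, here is a minimal, illustrative sketch (not anything from the episode) of how the same PyTorch code targets an Nvidia GPU through CUDA when one is available and falls back to the CPU otherwise; it assumes the torch package is installed, and the layer and batch sizes are arbitrary placeholders.

```python
# Illustrative only: the same PyTorch code runs on an Nvidia GPU via CUDA when
# one is present, and on the CPU otherwise. Model and batch sizes are arbitrary.
import torch

# Use the CUDA device if PyTorch was built with CUDA support and a GPU is visible.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print("Running on:", device)

# A toy model and a batch of inputs; .to(device) and device= move the underlying
# buffers into GPU memory, where the matrix multiply runs across many cores in parallel.
model = torch.nn.Linear(in_features=1024, out_features=256).to(device)
batch = torch.randn(64, 1024, device=device)

with torch.no_grad():
    output = model(batch)

print(output.shape)  # torch.Size([64, 256]), regardless of which device did the work
```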
Show Notes
* https://www.wsj.com/articles/SB10001424052702304019404577418243311260010
* https://www.wsj.com/articles/SB121358204084776309
* https://www.wsj.com/tech/ai/how-nvidia-got-hugeand-almost-invincible-da74cae1
* https://www.reuters.com/technology/chatgpt-owner-openai-is-exploring-making-its-own-ai-chips-sources-2023-10-06/
* https://www.theinformation.com/articles/microsoft-to-debut-ai-chip-next-month-that-could-cut-nvidia-gpu-costs
* https://en.wikipedia.org/wiki/PyTorch
* https://innovationorigins.com/en/amd-gears-up-to-challenge-nvidias-ai-supremacy/
* https://techcrunch.com/2023/10/07/how-nvidia-became-a-major-player-in-robotics/
* https://en.wikipedia.org/wiki/Graphics_card
* https://en.wikipedia.org/wiki/Nvidia

Oct 3, 2023 • 17min
Methane
This week we talk about natural gas, plumes, and satellites.
We also discuss firedamp, AI detection, and emission numbers.
Recommended Book: Excellent Advice for Living: Wisdom I Wish I'd Known Earlier by Kevin Kelly
Transcript
Methane, the name for a chemical made up of one part carbon, four parts hydrogen, is incredibly abundant on earth as it's formed by both geological and biological processes—the former when organic materials are heated up and have massive amounts of pressure applied to them, underground, and the latter through a process called methanogenesis, which basically means certain types of Archaea, a type of microbial life, exhaling methane.
That sort of respiration mostly occurs in organic-breakdown situations, where these microscopic organisms live: so landfills and the bottoms of lakes, where dead stuff falls and is torn apart at a microscopic level by these tiny creatures, but also in the guts of cows and termites and similar beasties, which rely upon their symbiosis with these archaea to help them process the stuff they eat—which they otherwise wouldn't be able to break up and use on their own.
Methane was originally discovered, in the sense that it was noted and quantified, back in the late-18th century, when the Italian physicist and chemist Alessandro Volta—who among other things also lent his name to a unit of electrical measurement and is credited with inventing the battery—was studying marsh gas, marshes being a huge natural source of methane, as they're filled with the sorts of critters that break apart biological materials and release methane as a byproduct. We've known about this gas for a while, then, and history is filled with examples of different cultures making use of it in relatively simple ways, as an energy source. And on that note, methane is the primary constituent of what we today call natural gas, though the name methane was only coined in 1866, by a German chemist, August Wilhelm von Hofmann, who derived the term from methanol, the flammable, colorless liquid often called wood alcohol, from which the gas was first detected and isolated. Before that, different cultures referred to it only adjacently, usually because it caused issues they couldn't quite quantify—deaths in coal mines, for instance, where the gas-pocket-laden air, referred to as firedamp before methane became an officially named thing, was scary because it could suffocate everyone, or it could explode.
Today, methane, mostly as a constituent of natural gas, is harvested and shuttled all over the world to be burned as a fossil fuel; and similar to other fossil fuels, like oil and coal, that burning releases energy, producing heat, which is used to spin a turbine or heat water in a steam generator.
Natural gas is, in the modern world, generally considered to be superior to other fossil fuel options because it burns relatively cleanly, in terms of pollution, compared to other options, which is nice for folks in the areas where this burning is taking place, and it also releases relatively less CO2 into the atmosphere per unit of heat it produces when it's used for energy, so although it's still very much a fossil fuel and emits greenhouse gases into the atmosphere, it's the best of bad options in many ways, and can be stored and transported in forms that make it quite versatile and even more energy-dense—it can be refined and pressurized into a liquid, for instance, which makes transport substantially easier and each unit of natural gas more useful, but that also allows it to be used as rocket fuel and for similar high-intensity utilities, which is not something that can be said of otherwise comparable options.
What I'd like to talk about today is the role of methane in a world that's shifting toward renewable energy, and why this fossil fuel, which is generally superior to other fossil fuel options, is associated with some unique problems that we're scrambling to solve.
—
Back in June of 2023, scientists announced that they had discovered evidence of a massive methane plume in Kazakhstan.
This plume—the consequence of a leak at a methane prospecting site in this methane-rich country—was later confirmed to be the result of an accident at one of a local energy company's wells at a gas field on June 9, and the company said they were doing what they could to address the issue, and that the purported gas plume was actually just hot clouds of vapor containing minimal amounts of methane; a misidentification, in other words.
The scientists who flagged the plume, though, said this wasn't the case: the satellites they used to identify it contain high spectral resolution imaging hardware, and they don't tend to mistake water vapor for methane—that may have been possible with previous technologies, but these new ones aren't prone to that type of false-positive.
The satellites noted at least nine individual instances of methane plumes erupting from this single site in the month leading up to July 23, alone, and those findings were then confirmed by scientists using similar technologies with the SRON Netherlands Institute for Space Research—and that's alongside the original group's use of two different satellites, the EU's Sentinel-5P and the Italian Space Agency's PRISMA satellite, the former of which used a spectrometer that was designed specifically to detect methane in this way.
These researchers, using these findings, were able to estimate an emission-rate of somewhere between 35 and 107 metric tons of methane, per hour, into the atmosphere, from this one leak, alone, which has thus caused the same amount of short-term climate damage, in terms of heat-amplifying greenhouse effects, as the annual emissions of somewhere between 814,000 and nearly 2.5 million US cars, making it the worst confirmed methane leak from a single source in all of 2023—so far, at least.
And "so far" is doing a lot of work, there, as these sorts of satellites have become increasingly effective tools in researchers' toolkits for identifying these types of leaks, and the software they use to crunch the raw data provided by these increasingly sophisticated detection tools has led to a small revolution in the ability to both notice and pinpoint the source of methane plumes, globally, even in areas where such plumes would have previously gone un-noted, and thus, unaddressed.
And this is important, if you're the sort of person who cares about the amplifying effects of human industry and other endeavors on climate change, because methane, in addition to its explosive volatility and capacity to degrade air quality and mess with ecosystems at ground-level, is thought to be responsible for about 30% of the total greenhouse effects we're seeing, today, because—despite only sticking around in the atmosphere for about 7 to 12 years, compared to potentially hundreds of years for CO2—methane is also about 80-times more potent than CO2, in this regard.
So in the short-term, which in this case means the decade or so a given methane molecule persists in the atmosphere, it's way, way worse in terms of heat-trapping, compared to CO2.
And though that effect will subside faster than CO2, which can stick around for many generations, rather than a decade or so, we're still churning a lot of methane up there, so this isn't a one-off, temporary thing, it's persistent, the methane that goes away being replaced by more of the same, and those temporary impacts can have long-term repercussions, like melting ice caps, contributing to droughts and floods and extreme storms, and drying up areas that would periodically see irregular wildfires, causing much larger and more potent versions of the same, which in turn churns all the CO2 contained in those trees or peatlands or whatever else that are now burning, into the atmosphere.
So temporary boosts of this magnitude in greenhouse gas effects are not temporary—they can last far past the period in which the gases are actually up there, because of how substantially, and in practical terms, permanently, they change the circumstances on the earth, below.
All of which has led to waves of investment in being able to detect methane leaks, because while many energy companies are incentivized to cap leaky wells, in part because doing so potentially gives them a source of natural gas they can then turn around and sell as fuel, some such entities are more than happy to allow these leaks to just keep leaking, because the cost of identifying and handling leaks is higher than what they can expect to get from capturing and selling that gas, or in some cases because the entities in question are beyond strict regulations that would necessitate they care or act to begin with; there are no consequences for such atmospheric pollution in many parts of the world.
The same is generally true even in more dense and ostensibly regulation-rich areas like Russia, which is expected to churn out by far the most CO2-equivalents' worth of methane into the atmosphere from leaks and gas burning of any country—though the US comes in second, followed by Qatar, Iran, Saudi Arabia, and China at a distant sixth.
This is an issue in fairly remote and rural places like Kazakhstan, then, where there's a lot of energy and mining infrastructure, but not so many people, or regulatory bodies with teeth, but also in places like the US, where methane gas leaks are estimated to pump something like 6.5 million metric tons of this gas into the atmosphere every single year, which is roughly the equivalent of the yearly emissions of about 2.5 million US passenger vehicles.
There are means of addressing this issue, and they're generally referred to as "methane abatement," a term that encompasses everything from plugging or tapping those leaks to changing what cattle are fed—cows emitting a lot of methane because of how they're bred, kept, and fed, and how their microbiota processes that feed.
Fundamental to these abatement options, though, is figuring out where and how to apply them in the first place.
Governments around the world are thus beginning to aggregate the data they have, providing local governance and businesses with the resources they need to start addressing this issue, but the rollout has been slow, in part because the resolution of our view has been quite low, until just recently.
The data collectively generated by a trio of satellites—the aforementioned Sentinel-5P, alongside Sentinel-3 and Sentinel-2—paired with machine learning, a type of what we broadly might call artificial intelligence software, has allowed researchers to produce a wealth of automatically generated data on this subject, at a far more granular level than has been possible until now, which in turn has allowed governing bodies to parse that data and identify super-emitters, the worst of the worst in terms of these leaks, while also providing more specific information—down to the individual well in an oil facility or, in some cases, the specific location on a pipeline—about where these leaks are occurring; these satellites can also provide estimates as to how much methane is being leaked at a given location, which in turn can help nations, organizations, and corporations prioritize their abatement efforts, accordingly.
We're still in the frontier-stage of this sort of detection and amelioration, but there's more on the way, with satellites optimized for methane detection of this kind launching in the coming years—one of them, the $90 million MethaneSAT, is meant to help global regulators pinpoint hotspots and identify potential underreporting by various entities, which in turn should help put more pressure on those that are intentionally concealing their leaks: something that'll be especially important for holding companies like those in Russia, which are supported in this concealing by their government, to account for their chronically underreported emissions.
These satellites and similar detection tools, though, aren't of much use without efforts to act upon their findings at ground level, just as all the good intentions in the world wouldn't be enough to staunch the upward flow of this gas into the atmosphere, lacking the data required to tell us where to look and what needs to be done.
What we're really looking at, then, is a moment in time, beginning in 2023, but really kicking into high gear in 2024 through 2030, which is when many countries' first-step, big-deal climate commitments come due, a moment in which a confluence of detection and remediation efforts and techniques is finally emerging, and this confluence could allow us to significantly reduce this category of greenhouse gas emissions, which is great, because up to 75% of methane emissions are thought to be solvable in this way.
Such efforts, in turn, could reduce the rise in global temperatures from greenhouse gases by something like 25%, all on their own; an incredible win, if we can keep the momentum going and incentives aligned as these new resources begin to spin-up and interoperate and give the folks trying to solve this particular problem the tools they need to do so.
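As a rough back-of-the-envelope illustration of the figures above (the roughly 80-times short-term potency multiplier and the 35 to 107 metric tons per hour leak estimate), here is a simple sketch of how a continuous methane leak rate translates into CO2-equivalents; real inventories use standardized global warming potential values and time horizons, so the constant and the resulting numbers are illustrative rather than official.

```python
# Back-of-the-envelope sketch using figures mentioned above: a continuous leak
# rate in tonnes of methane per hour, converted to tonnes of CO2-equivalent per
# year. The ~80x multiplier approximates methane's short-term (roughly 20-year)
# potency relative to CO2; official inventories use standardized GWP values, so
# treat these outputs as illustrative only.
HOURS_PER_YEAR = 24 * 365
METHANE_SHORT_TERM_POTENCY = 80  # approximate short-term warming potency vs. CO2

def annual_co2_equivalent(tonnes_ch4_per_hour: float) -> float:
    """Tonnes of CO2-equivalent per year for a leak that runs continuously."""
    return tonnes_ch4_per_hour * HOURS_PER_YEAR * METHANE_SHORT_TERM_POTENCY

for rate in (35, 107):  # the low and high estimates cited for the Kazakhstan leak
    print(f"{rate} t CH4/hour -> about {annual_co2_equivalent(rate):,.0f} t CO2e/year")
```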
Show Notes
* https://en.wikipedia.org/wiki/Methane
* https://www.epa.gov/gmi/importance-methane
* https://archive.ph/ODvEK
* https://www.iea.org/energy-system/fossil-fuels/methane-abatement
* https://www.iea.org/fuels-and-technologies/methane-abatement
* https://www.esa.int/Applications/Observing_the_Earth/Copernicus/Sentinel-5P/Tropomi
* https://www.bbc.com/news/science-environment-66811312
* https://www.sciencedirect.com/science/article/pii/S0034425723002675
* https://acp.copernicus.org/articles/23/9071/2023/
* https://en.wikipedia.org/wiki/Methane_emissions
* https://www.edf.org/climate/methane-crucial-opportunity-climate-fight
* https://climate.mit.edu/ask-mit/how-much-does-natural-gas-contribute-climate-change-through-co2-emissions-when-fuel-burned
* https://www.theguardian.com/environment/2023/mar/06/revealed-1000-super-emitting-methane-leaks-risk-triggering-climate-tipping-points
* https://climate.nasa.gov/vital-signs/methane/
* https://www.state.gov/publication-of-u-s-government-funded-methane-abatement-handbook-for-policymakers/
* https://www.esa.int/Applications/Observing_the_Earth/Copernicus/Trio_of_Sentinel_satellites_map_methane_super-emitters
* https://www.cpr.org/2023/08/17/methane-satellite-ball-aerospace-boulder/

Sep 26, 2023 • 22min
Video Game Engines
This week we talk about Unity, Unreal, and Godot.
We also discuss fee structures, user revolts, and indie game-makers.
Recommended Book: How Big Things Get Done by Bent Flyvbjerg and Dan Gardner
Show Notes
* https://www.statista.com/outlook/dmo/digital-media/video-games/worldwide
* https://www.billboard.com/pro/ifpi-global-report-2023-music-business-revenue-market-share/
* https://www.cnbc.com/2022/07/07/video-game-industry-not-recession-proof-sales-set-to-fall-in-2022.html
* https://en.wikipedia.org/wiki/Video_game_industry
* https://www.washingtonpost.com/video-games/2022/08/22/are-video-games-recession-proof-sort-experts-say/
* https://www.gamedeveloper.com/blogs/unity-s-pricing-changes-are-trying-to-solve-too-many-problems-at-once
* https://www.gamedeveloper.com/business/unity-apologizes-to-devs-reveals-updated-runtime-fee-policy
* https://www.theverge.com/2023/9/22/23882768/unity-new-pricing-model-update
* https://www.theverge.com/2023/9/15/23875396/unity-mobile-developers-ad-monetization-tos-changes
* https://www.theverge.com/2023/9/12/23870547/unit-price-change-game-development
* https://www.investopedia.com/articles/investing/022216/how-microtransactions-are-evolving-economics-gaming.asp
* https://digitalcommons.uri.edu/srhonorsprog/902/
* https://www.investopedia.com/articles/investing/053115/how-video-game-industry-changing.asp
* https://finmodelslab.com/blogs/operating-costs/video-game-company-operating-costs
* https://www.makeuseof.com/ways-the-rising-costs-of-games-affect-the-industry/
* https://codeswholesale.com/blog/5-ways-to-make-money-in-the-gaming-industry/
* https://gamemaker.io/en/blog/cost-of-making-a-game
* https://www.gamedesigning.org/learn/video-game-cost/
* https://www.reuters.com/technology/video-gaming-revenue-grow-26-2023-console-sales-strength-report-2023-08-08/
Transcript
Depending on how inclusive you are with your measurements and the specific numbers you're tallying, the global video game market is expected to pull in somewhere between $187.7 and $334 billion in revenue in 2023.
That's somewhere between 2.6% and 13.4% above 2022 numbers—and again, those figures are pretty far apart because different entities keeping tabs on this industry measure different things, some only looking at direct sales of video games and in-game items, while others look at connected sub-industries, like e-gaming events and service jobs that do customer support for game companies.
Whichever end of that spectrum you look at, though, the global video game industry is a behemoth that's growing every year, and its income surpassed that of the music and film industries, combined, years ago; the global film industry is expected to bring in around $92.5 billion in 2023, while the global music industry pulls in a paltry $26.2 billion.
The video game market is continuing to grow at a fairly stellar pace, compared to other entertainment categories, as well.
And while it was shown not to be entirely recession-proof—as had been claimed since the financial crisis of 2007 and 2008, when it remained one of the few industries still growing steadily—with growth balking a bit in 2022, when the industry contracted by 1.2%, it grew substantially at the beginning of the COVID-19 pandemic, and has largely maintained that growth since, which has allowed entities operating in this space to claim more and more entertainment-related marketshare, which in turn has shifted the center of gravity in the media world toward video games and away from other leisure options, including things like travel, vacations, and other things you wouldn't typically think of as being competitors of the video game market.
Since video games really took off, hitting the mainstream in the 1980s, and becoming a big deal in the 1990s with the emergence of user-friendly consoles and 3D graphics, the economics of video games have changed substantially.
Once, video game companies sold games that would play on a user's computer; then consoles—which are basically gaming-focused mini-computers that plug into a customer's TV, or can be carried around in their pocket—quickly became the new default for many gamers, creating a more optimized gaming experience, though also introducing a new cost for game-makers, as they typically need to pay something to the console-maker to use their tech and have their products work on these platforms.
Retail stores became increasingly important to the gaming industry's budgetary concerns around this time, as they would need to take a cut of the sale price of everything they sold, but also have the flexibility to offer deals to their customers, to incentivize purchases and lure them away from other game stores.
And further toward the base of the development stack, as games became more sophisticated and refined, game-makers had to spend more money on high-end hardware, but also higher-end software tools that would allow them to develop the games, polish them so they could compete with other offerings, and in some cases use what's often called "middleware" to serve as a scaffolding for their game projects—software tools that are sometimes referred to as game engines.
All of which has made the process of producing video games a lot more complex and expensive, and as the industry has become more popular, roping-in more and more customers, more and more entities have popped up, intent on making their own games; and that's fed a spiral toward higher costs and more complex game-making processes, leading to a lot of enrichment in some cases, and quite a few new business models optimized for different platforms and styles of game, but also quite a few bankruptcies and hostile takeovers, even seemingly successful video game companies sometimes falling short or investing too much in a game that flops, leaving them with insufficient resources to keep the lights on or produce their next product.
What I'd like to talk about today is a recent scandal in the video game industry related to one of those middleware, game engine-making companies, and how they're scrambling to make things right after seemingly losing much of their goodwill and credibility essentially overnight.
—
In early September, 2023, a game engine company called Unity announced that it would be changing its pricing structure, effective Jan 1, 2024, and that set off a wave of outrage and anger from its users, most of whom are individual game-makers and game-making companies.
To understand why this response was so widespread and vehement, it's helpful to understand a bit about how game engines work and their role in the modern video game industry.
Fundamentally, a game engine is a piece of software that serves as a framework for making video games.
So while it's not a simple "click a button, get a game" sort of setup, it does dramatically reduce the amount of time and effort required to produce a finished game product, giving users—game-developers of all shapes and sizes—level-editors, physics engines, rendering engines that help them more easily produce and edit 2D and 3D graphics, and collision detection tools, which basically track and control how things bump up against each other in the game and what happens when they do, alongside more basic media tools like those that allow for the creation and editing of audio, animations, video content, text, and the like.
Modern game engines also help developers keep the size of their games moderated without losing too much quality, they help with memory management on the developers' computers, and they can provide artificial intelligence tools and software that helps them build-out multiplayer functionality—it's a really big and powerful toolkit, so the engine that game-makers choose to use is important, and it shapes every other decision they make, and in some ways the final product, too, because of how easy or difficult things are to do within their specific scaffolding.
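To give a sense of the kind of service those collision detection tools provide, here is a deliberately simple, engine-agnostic sketch of the idea using axis-aligned bounding boxes; it is not Unity's actual API, and real engines expose far richer physics systems than this.

```python
# An engine-agnostic sketch of the simplest form of collision detection a game
# engine might handle for you: axis-aligned bounding-box (AABB) overlap checks.
# Illustrative only, not Unity's actual physics API.
from dataclasses import dataclass

@dataclass
class AABB:
    x: float       # left edge
    y: float       # bottom edge
    width: float
    height: float

def collides(a: AABB, b: AABB) -> bool:
    """Return True if the two boxes overlap on both the x and y axes."""
    return (a.x < b.x + b.width and a.x + a.width > b.x and
            a.y < b.y + b.height and a.y + a.height > b.y)

player = AABB(x=0.0, y=0.0, width=1.0, height=2.0)
crate = AABB(x=0.5, y=1.0, width=1.0, height=1.0)
print(collides(player, crate))  # True: the boxes overlap, so an engine would fire a collision event
```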
Unity makes a very popular game engine that was originally released in 2005 as a Mac-specific product, but it has since become multiplatform, allowing developers to make games for all sorts of computers, consoles, mobile devices, and virtual reality interfaces.
It's perhaps most popular in the mobile gaming space, as it's relatively easy to learn compared to other engines, and is fairly lightweight; and because the mobile gaming space has been growing so rapidly, that's meant Unity has become increasingly popular and widespread as a tool, which in turn has had the spillover effect of making it more popular on other platforms, as well—because folks making a mobile game might go on to make a PlayStation game next, and may decide to stick with the engine they know, or a gaming company might decide to perch all their games upon the same game engine because that's just a lot easier, both in terms of keeping things simple for developers, and in terms of the costs associated with using a bunch of different engines.
The pricing models used by these game engines vary quite a bit from company to company, but typically they make money by selling licenses to use their products; there's generally a free tier for folks learning to use their tools and who make games below a certain threshold of popularity and profit, but at a certain point they'll need to buy the right to use the engine, which generally also comes with a few bonus perks, like better analytics and error reporting options.
This system has worked for everyone for a long time now, and though some developers have balked at the idea of paying Unity and similar companies for their engines, opting for free and open source options like Godot, instead, the larger gaming industry has generally oriented itself around just a few primary, paid options, including the Unreal Engine owned by Epic Games, the maker of Fortnite, among many other offerings, and Unity, which since its release has been used to make more than 750,000 games, alongside non-game offerings, like augmented reality experiences in Microsoft's HoloLens headset, about 90% of Samsung Gear VR content, machine learning programs like Google's TensorFlow, and even film content, like the backgrounds for the 2019 real-life version of The Lion King, and engineering blueprints, like those for cars and buildings.
All of which partially explains why so many people were up in arms about the changes Unity announced, seemingly out of nowhere, to their fee structure in early September.
The old Unity model, again, included a free version of Unity for folks operating below a certain threshold—that threshold has been $200,000 for a while now—and after that folks would pay a subscription fee to use the engine, and that fee would typically cost about $400 per year per game, though it varied quite a bit as folks paid per seat—that is, per developer using the engine—and based on the size of the studio and game they're working on.
Unity's newly announced pricing model, in contrast, would keep a free tier, but would remove some of the cheaper payment options, nudging people up to higher yearly rates, while, importantly, also tacking-on a small fee, somewhere between a cent and twenty cents, for each installation of a game that uses the Unity engine, after a threshold has been crossed.
The announcement also said that Unity would use a secret, internal method of determining download numbers, and folks would be on the hook, in some cases, for something closer to $2,000 a year per game, rather than $400-ish, though that number would also vary wildly based on a game's popularity and reach.
This sparked all kinds of concerns, as it was an additional fee on top of existing fees, costing game-makers more over time, and without providing any new value in exchange, and because it was retroactive, so everyone who had ever used Unity for any game would be on the hook for this new payment structure—meaning, all those 750,000 games or so would potentially be new sources of revenue for Unity, but would be burdened with new expenses for the folks who made them.
All sorts of immediate concerns bubbled to the surface of the gaming community, ranging from worries that small, indie devs would be priced out of the market—folks without big bank accounts to draw upon, and who aren't making games that bring in tons of revenue—to concerns related to the concept of putting a price-tag on downloads: would trolls be able to aim hefty fees at developers they don't like by repeatedly installing and uninstalling their games? Would Unity's tracking software be legit? Would it differentiate new downloads from redownloads, or would someone who buys a game, paying for it once, conceivably be a drain on the developer's bank account forever into the future, because they might install it over and over again, over time, on multiple devices?
This outcry was also laden with a heavy sense of betrayal because it seemed to violate Unity's terms of service, and that outcry grew even louder and more betrayal-laden when it became clear, as folks went back to check the end-user license agreement they'd signed, that Unity had quietly, in the preceding months, gone through and edited its EULA to basically allow themselves to do what they had done, even though previous versions said they would never do such a thing.
The first week after this announcement, as the gaming world unified against Unity, the company's stock tumbled around 16.5% from where it was before the announced change, which is the opposite of what the company had hoped to accomplish—industry analysis suggests that the company is trying to shore-up its numbers, never having been profitable, but finding itself especially pressed for cash right now, and hoping to avoid being in the same situation in the future.
What seems to have happened is they tried to do too much at once, essentially grabbing at immediate cash as much as possible, while also trying to scale-up their future prospects by giving themselves a means of benefitting from the success of the games that use their engine; this isn't an entirely novel concept, as their competitor, Unreal, charges a 5% revenue share from game-makers using their engine, but because this was new, came out of nowhere, seemed to come about without the folks running Unity checking in with anyone in the gaming industry to see if it would be alright, and if so to see what sorts of numbers would be tenable for their business models, and because it was retroactively applied using a seemingly pretty skeezy, secretive method of basically giving themselves permission, on the down-low, after swearing up and down they would never do exactly this—all of it went over quite badly, the gaming world revolted against them, near-universally, and this has led to a huge exodus from Unity to other platforms, including the free and open source Godot, which has quite suddenly received a wave of funds from some of the more successful indie game studios out there, and newfound attention from folks who are learning they can relatively simply port their games from Unity to Godot, saving them the future hassle and expense of dealing with the former.
The alternative floated by some gaming studios and individual makers was to simply pull their games from shelves, and this was also threatened, especially in cases where the games are free to play, and thus tend to garner huge numbers of downloads, but don't make money on all the people who install their game—which means their work would become huge weights around their ledgers, losing them money each year, rather than earning them money.
It took more than a week, but the higher-ups at Unity eventually made some noises about having heard the game-making world and feeling bad about releasing this new model without first seeking their input, and they said they would take another stab at things and get back to them.
They then released a new plan, a new pricing model, that seems to have infuriated people substantially less—a revamp that still includes changes, but apparently less catastrophic ones.
The new plan says they'll rely on game-maker-reported numbers to tally downloads, and they've raised the revenue cap at which folks need to upgrade to $200,000, so below that you can keep the low-tier Unity Personal plan, which is excluded from this new pricing model, and that roughly lines up with where things were before—and any game that makes less than $1 million in 12 months will also be exempt from the additional, per-install fee.
Perhaps most importantly, though, Unity is now saying games made with previous versions of their engine won't be beholden to this new pricing model, nor would they need to abide by the new terms of service, which among other things say their games need to include a big, Made with Unity splash screen at the beginning, and only those that use the new version being released in 2024 would be required to pay based on downloads, though developers can choose to pay a 2.5% revenue share rather than using the per-installation model—and there's some indication that if they report install numbers, the company will choose whichever fee is lower for them, automatically, and charge them that.
All of which seems to have cooled things down quite a lot, though a fair bit of damage has already been done to the company's reputation in the industry; many game-makers are still saying they're intending to port their games away as soon as they're able, and that they won't use Unity in the future, because the people in charge of the company have shown their true colors, have shown that they're willing to renege on previous commitments and promises, and burn the goodwill they've earned over the years, in order to pull in more money, to fill the gaps in their balance sheets.
The company is investing in a big PR push to try to win people back and polish their now-tarnished brand, but it could be a while before they manage to do so, if indeed they do manage to do so.
In the meantime, industry alternatives have seen a big boost in attention and use, and there's a chance we could see more entrants in this space, popping up to take advantage of the hole left by Unity's flub, and introducing entirely new business models that may further innovate on what we've already seen, and allow entirely new game-world business models to arise and flourish.
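To make the shape of that revised pricing logic a bit more concrete, here is a rough, illustrative sketch of the "lesser of a per-install fee or a 2.5% revenue share, with exemptions below certain thresholds" structure described above; the per-install rate and the install-count gate used here are simplified placeholders rather than Unity's actual published rate card.

```python
# A rough sketch of the *shape* of the revised model described above: games
# below certain thresholds pay nothing extra, and above them the charge is the
# lesser of a per-install fee or a 2.5% revenue share. The per-install rate and
# the install-count gate are placeholders, not Unity's actual rate card.
def estimated_runtime_fee(annual_revenue: float,
                          new_installs: int,
                          per_install_rate: float = 0.20,   # placeholder within the "cent to twenty cents" range
                          revenue_share: float = 0.025,
                          revenue_threshold: float = 1_000_000,
                          install_threshold: int = 1_000_000) -> float:
    """Return an illustrative annual fee under the revised model."""
    if annual_revenue < revenue_threshold or new_installs < install_threshold:
        return 0.0  # exempt: below the revenue/install gates
    install_fee = new_installs * per_install_rate
    share_fee = annual_revenue * revenue_share
    return min(install_fee, share_fee)  # charge whichever of the two is lower

print(estimated_runtime_fee(annual_revenue=2_500_000, new_installs=3_000_000))
# 62500.0 -> the 2.5% revenue share wins out over 3,000,000 installs * $0.20
```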

Sep 19, 2023 • 22min
Antiretroviral Therapies
This week we talk about HIV, AIDS, and ART.
We also discuss HAART, the Berlin Patient, and potential future cures.
Recommended Book: Allergic by Theresa MacPhail
Show Notes
* https://www.unaids.org/en/resources/fact-sheet
* https://hivinfo.nih.gov/understanding-hiv/fact-sheets/hiv-treatment-basics
* https://clinicalinfo.hiv.gov/en/glossary/antiretroviral-therapy-art
* https://www.paho.org/en/topics/antiretroviral-therapy
* https://journals.lww.com/jaids/fulltext/2010/01010/declines_in_mortality_rates_and_changes_in_causes.13.aspx
* https://link.springer.com/article/10.1007/s13181-013-0325-8
* https://academic.oup.com/jac/article/73/11/3148/5055837?login=false
* https://journals.lww.com/jaids/fulltext/2016/09010/narrowing_the_gap_in_life_expectancy_between.6.aspx
* https://en.wikipedia.org/wiki/Tenofovir_disoproxil
* https://en.wikipedia.org/wiki/Management_of_HIV/AIDS
* https://www.verywellhealth.com/cart-hiv-combination-antiretroviral-therapy-48921
* https://www.cdc.gov/hiv/risk/art/index.html
* https://www.freethink.com/health/cured-of-hiv
* https://www.jstor.org/stable/3397566?origin=crossref
* https://www.nytimes.com/1982/05/11/science/new-homosexual-disorder-worries-health-officials.html
* https://pubmed.ncbi.nlm.nih.gov/23444290/
* https://my.clevelandclinic.org/health/diseases/4251-hiv-aids
* https://web.archive.org/web/20080527201701/http://data.unaids.org/pub/EPISlides/2007/2007_epiupdate_en.pdf
* https://www.thelancet.com/journals/lanhiv/article/PIIS2352-3018(23)00028-0/fulltext
Transcript
In mid-May of 1981, the queer community-focused newspaper, the New York Native, published what would become the first-ever article on a strange disease that seemed to be afflicting community members in the city.
What eventually became known as AIDS, but which was at the time discussed by medical professionals primarily in terms of its associated diseases, was clinically reported upon for the first time less than a month later, five official cases having been documented in an interconnected group of gay men and users of injectable drugs, who came to the attention of doctors for not being inherently immunocompromised, but still somehow contracting a rare type of pneumonia that only really impacted folks with severely impaired immune systems.
In subsequent years, doctors started using a range of different terms for HIV and AIDS, calling them at different times and in different contexts the lymphotropic retrovirus, Kaposi's sarcoma and opportunistic infections, and the 4H disease, referring to heroin users, hemophiliacs, homosexuals, and Haitians, the four groups that seemed to make up almost all of the confirmed afflicted patients.
The acronym GRID, for gay-related immune deficiency, was also used for a time, but that one was fairly rapidly phased out when it became clear that this condition wasn't limited to the gay community—though those earlier assumptions and the terminology associated with them did manage to lock that bias into mainstream conversation and understanding of AIDS and HIV for a long time, and in some cases and in some locations, to this day.
By the mid-80s, two research groups had identified different viruses that seemed to be associated with or responsible for cases of this mysterious condition, and it was eventually determined (in 1986) that they were actually the same virus, and that virus was designated HIV.
HIV, short for Human Immunodeficiency Virus, is a retrovirus that, if left untreated, leads to Acquired Immunodeficiency Syndrome, or AIDS, in about 50% of patients within ten years of infection.
So HIV is the virus, AIDS is a condition someone with HIV can develop after their immune system is severely damaged by the infection, and there are a bunch of diagnostic differentiations that determine when someone has transitioned from one category to the other, but in general folks with HIV will experience moderate flu- or mono-like symptoms, alongside swollen lymph nodes and rashes and throat problems and sores across their bodies in the early stages of infection, and as things progress, they develop opportunistic infections of the kind that can only really latch onto a human when their immune system is weakened or shut down. While AIDS, arriving after the immune system is well and truly damaged, brings with it a slew of opportunistic infections and associated issues, the afflicted person potentially developing all sorts of cancers, sarcomas, persistent infections, and extreme versions of the flu-like, mono-like symptoms they may have suffered earlier on.
We don't know for certain how and where HIV originated—and that's true of both kinds, as there's an HIV-1 and HIV-2 virus, the former of which accounts for most infections, the latter of which is less common, and less overall infectious—but both HIV types seem to have been transmitted to humans from non-human primates somewhere in West-central Africa in the early 20th century, possibly from chimpanzees in southern Cameroon, but that's pretty speculative, and there's some evidence that these diseases may have made the leap several times; so while there's a pretty good chance, based on what we know now, that the disease made it into humans and mutated approximately somewhere in that vicinity, sometime in the early 20th century, possibly via chimps hunted and eaten by locals as bushmeat, we really don't know for certain.
There are reports of what was probably HIV as far back as 1959 in the Belgian Congo, but that's a bit speculative, too, and based on imperfect notes from the time.
Back then, though, and through the 1980s, folks who contracted HIV and who were not treated would typically die within 11 years of being infected, and more than half of those diagnosed with AIDS in the US from 1981 through 1992 died within 2 years of their diagnosis; such a diagnosis was a death sentence, basically; it was a really horrible and scary time.
Today, the outlook for folks who contract HIV is substantially better: the life expectancy of someone who contracts the virus and who is able to get treatment is about the same as someone who is not infected; the disease isn't cured, but the level of HIV virus in the blood of a person receiving treatment is so small that it's no longer transmissible, or even detectable.
What I'd like to talk about today is a new therapy that's making those sorts of outcomes possible, how some few people have now been cured of HIV entirely, and what's on the horizon in this space.
—
Antiretroviral therapy, or ART, typically consists of a combination of drugs based on those that were originally combined in this way in 1996 by researchers who announced their findings at the International AIDS Conference in Vancouver—they called their approach highly active antiretroviral therapy, or HAART, and this combo was based on findings from earlier drugs that addressed one of HIV's seven stages of development—but because they all hit that same, single stage, the virus was rapidly developing an immunity to them, and they were universally pretty toxic, with horrible side-effects.
What's more, this drug cocktail increased patients' life expectancy by about 24 months, on average—which is a lot, about two years, but considering all those side effects, which included severe liver problems and anemia, the extra months of life generally weren't very pleasant extra months.
In 1995, a class of drugs called protease inhibitors was introduced, which prevented HIV from making copies of itself using the body's structural proteins.
That, combined with the effects of other, existing retrovirals, which hindered the virus's ability to hijack the body's cells to make more of itself, turned out to be a substantial improvement over just one or the other approach.
The announcement in 1996 was notable because the researchers involved were able to knock the viral load in their patients down to an undetectable level, and then keep it there, by using three drugs drawn from those two antiviral classes, those two different approaches.
So HAART was a major improvement over what came before, but it was still imperfect; deaths tied to HIV plummeted by 50% in the US and Europe in just three years, but the life expectancy of folks using this therapy was still low compared to other people; someone who contracted HIV in their 20s and went on this therapy was still only expected to live till their early 50s; way better than a two-year increase, but still plenty of room for improvement.
In addition to that lifespan duration limitation, the HAART bundle of therapies was just really difficult to maintain.
Some people experienced a dramatic redistribution of body fat, some developed heart arrhythmias or insulin resistance or peripheral neuropathy or lactic acidosis—which is basically a toxic buildup of the acid that results from metabolism, which is usually cleared naturally, but when it isn't, it's potentially deadly.
Anything less than absolutely perfect adherence to the treatment schedule was also potentially deleterious to the desired outcomes; it wasn't a forgiving regimen, with some of the drugs requiring three capsules be taken every 8 hours, and there was a chance that if a portion of a dose of one drug was missed, or not administered on time, the virus could develop an immunity to it and the whole thing would fall apart.
Consequently, the HAART regimen was generally reserved until things got really bad, and that meant it didn't have a very large effect on the infected population, and those who did benefit from it suffered consequences, alongside those benefits.
The change in terminology from HAART to ART arrived in 2001 when a drug called Viread, the brand name for tenofovir disoproxil, was released and added into the mix, replacing some of the most toxic and cumbersome of the previous therapies with a single pill per day, and one that came with far fewer, and far less extreme, side effects.
In 2005 it was finally demonstrable, with a bunch of data, that beginning this type of therapy early rather than waiting until things get really bad was worth the trade-offs—researchers showed that if folks received access to ART upon diagnosis, severe HIV-associated and non-HIV-associated illnesses were reduced by 61%.
As of 2016 there was still an average life expectancy gap of about 8 years between folks with HIV who received early care and people who were not infected, but that gap has been steadily closing with the introduction of new, easier-to-use, less side-effect-prone drugs—drugs that tend to attack the virus at different stages, and which take different approaches to hindering and blocking it—alongside innovations in how the drugs are delivered, like introducing substances that are converted by the body into the desired drug, which massively cuts the requisite dosage, in turn lessening the strain on the body's organs and the potential side effects associated with taking a higher dose of the drug, itself.
We've also seen the advent of fixed-dose combination drugs, which are exactly what they sound like: a single pill containing the entire combination of drugs one must take each day, which makes a combination therapy much easier to administer and stick with, which in turn has substantially reduced the risk of severe side effects, and prevented mutations that might otherwise make a patient's virus more immune to some component of the drug cocktail.
Some newer options just use two drugs, too, compared to the previous three-or-more, and most of these have been shown to be just as effective as the earlier, more bodily stressful combinations, and a recent, 2021 drug is injectable, rather than deliverable in pill-form, and can be administered just once a month—though a version of this drug, sold under the name Cabenuva, has been approved for administration every other month.
So things in this corner of the medical world are looking pretty good, due to new approaches and innovations in existing therapy models.
These models remain imperfect, but they're getting better every year, and contracting HIV is no longer a death sentence, nor does it mean you'll always be infectious, or even detectably infected: the amount of HIV virus in one's blood can be kept undetectably low for essentially one's entire life, so long as one is able to get on the right therapy or combination of therapies and stick with it.
That said, the global HIV pandemic is far from over, and access to these drugs—many of which are pricey, if you don't have insurance that will cover them—is not equally distributed.
As of late-2022, the UN's official numbers indicate that about 39 million people, globally, have HIV, about 1.3 million were infected in 2022, and about 630,000 died from AIDS-related illnesses that year.
That said, of those 39 million or so who are infected, nearly 30 million are receiving some kind of antiretroviral therapy, and about 86% of people who are estimated to be infected know their status, so they can seek such therapies, and/or take other precautions to protect themselves and others; though that also means about 5.5 million people, globally, have HIV and don't realize it.
Here's a really remarkable figure, though: among people who are infected and know they are infected, about 93% of them were virally suppressed as of 2022.
That's astonishing; 93% of people who have HIV and are aware of it are on some kind of therapy that has allowed them to suppress the virus so that it's nearly undetectable—the difference between the two, by the way, is that suppressed means fewer than 200 copies of the HIV virus per milliliter of blood, while undetectable is generally considered to be less than 50 copies per milliliter.
So huge leaps in a relatively short period of time, and a massive improvement in both duration and quality of life for folks who might otherwise suffer mightily, and then die early, because of this virus and its associated symptoms.
cured people, often referred to as the Berlin Patient, received a stem cell transplant from a bone marrow donation database that contained a genetic mutation called CCR5 Delta 32, which makes those who have it essentially immune to HIV infection.Three months after he received the transplant and stopped taking ART, doctors were unable to find any trace of the virus in his blood.He died from cancer in 2020, but there didn't seem to be any HIV in his blood from when he received the stem cell transplant, onward, and that happened in the early 2000s, and was formally announced to the medical community in 2008.At least two other people—two that we know about, anyway—have been cured of HIV using the same method; though at the moment at least, this option is severely limited as it requires that patients have a bone marrow match in donor databases, and that one of those donors have that specific, relatively rare mutation; so with existing science and techniques, at least, this is unlikely to be a widespread solution to this problem—though a 2017 experiment used stem cells derived from umbilical cord blood from a baby with that mutation to treat a woman' leukemia and cure her HIV, so there's a chance other approaches that make use of the same basic concept might be developed, opening this up to more people.Cancer drugs may also help some people with HIV: a drug that's been approved to treat several cancers called Venetoclax seems to also bind to a protein that helps HIV-infected T cells dodge the body's immune system and survive, and that realization has led to a series of experiments that showed HIV was suppressed in mice receiving this drug—though it bounced back a week later, and two weeks later in mice receiving both this drug and ART.This is unlikely to be a solution unto itself, then, but there's a chance either an adjusted version of this drug, or this drug in combination with other therapies, might be effective; and there's a clinical trial testing the efficacy of Venetoclax in human HIV patients at the end of this year, and another in 2024, so we may soon know if its safe and desirable to use this drug alongside ART, and that may, in turn, lead to a better understanding of how to amplify the drug's effects, or apply this method of hindering HIV from a different angle.CRISPR, the gene-editing technology borrowed from bacteria that allows for the cutting and removing and adding of genetic information, has enabled the development of several new potential HIV cures, one of which, called EBT-101, basically enters the body, finds helper T cells, and then cuts out chunks of the HIV virus's DNA, which prevents it from being able to replicate itself or hide away, reemerging later after another treatment has suppressed it.The benefit of this approach is that it could kill the viral reservoirs that otherwise allow HIV to persist in people who have undergone treatments, and a version of it that targets SIV, which is similar to HIV, but found in non-human primates—performed exactly as they hoped it would, finding and editing the targeted DNA, raising hopes than an HIV-targeting variation may manage similar wonders in human patients.This would be great if it ends up working, as one injection would theoretically clear all HIV from a person's system in relatively short-order, but the trials done so far have been small and on monkeys, and because of the nature of the research, it's not clear the monkeys were cured of HIV—just that the treatment got where it was supposed to go and made some DNA edits.A human 
trial of EBT-101 will finish up in March of 2025, though the researchers plan to follow up with their subjects for up to 15 years following the trial, to assess any long-term effects from their treatment, since CRISPR and this approach to messing with genes is still such a new thing.So while this may be a solution at some point, there's a good chance it won't be a real-deal, available option for another decade, minimum.So we've come a long way in a very short period of time with HIV and AIDS treatments, and the future is looking pretty good, with even more options and approaches on the horizon, including some actual cures, alongside high-quality, actually useable treatments.But there's still room to grow in terms of infection awareness, there are still distribution issues for some of these drugs, and there's still a fair bit of prejudice, the consequence of ignorance and historical misunderstandings and biases, keeping folks and institutions from doing as much as they otherwise could in many parts of the world; so a lot to be proud of, a lot to look forward to, but still plenty of room for improvement across the board. This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit letsknowthings.substack.com/subscribe

Sep 12, 2023 • 17min
China Standard Map
This week we talk about China’s standard map, the nine-dash line, and shoals.
We also discuss WWIII, undersea minerals, and realities on the ground.
Recommended Book: Outlive by Peter Attia
Show Notes
* https://www.chinadaily.com.cn/a/202308/28/WS64ec91c2a31035260b81ea5b.html
* https://www.uscc.gov/research/south-china-sea-arbitration-ruling-what-happened-and-whats-next
* https://amti.csis.org/island-tracker/china/
* https://globalvoices.org/2023/09/05/the-chinese-2023-map-has-nothing-new-but-why-are-chinas-neighbours-mad-about-it/
* https://en.wikipedia.org/wiki/Government_of_China
* https://www.reuters.com/world/asia-pacific/philippines-taiwan-malaysia-reject-chinas-latest-south-china-sea-map-2023-08-31/
* https://theconversation.com/what-is-the-nine-dash-line-and-what-does-it-have-to-do-with-the-barbie-movie-209043
* https://en.wikipedia.org/wiki/Republic_of_China_(1912%E2%80%931949)
* https://en.wikipedia.org/wiki/Nine-dash_line
* https://hir.harvard.edu/vietnam-and-china-conflicting-neighbors-stuck-in-nationalism-and-memory/
Transcript
In the wake of some stunning defeats to European powers in the 19th century, and its place on the winning side of WWII, the Chinese government saw quite a lot of territory disappear, but then gained a fair bit back, following that global conflict, and this necessitated the redrawing of many maps, most of which were substantially outdated, because of the relative rapidity with which their territory was changing during this period—they lost Vietnam as a tributary state, for instance, but also added a fair number of former Japanese islands to their collection, including Taiwan, which it took from Japan in 1945, and where the former Chinese government fled following Mao's revolution, which is what led to modern-day Taiwan existing as a separate state from Mainland China, by Taiwan's reckoning, at least; Mainland China doesn't agree.
And as is the case with Taiwan, not everyone in the area agrees about which other islands and bodies of water belong to whom, and the huge number of islands of varying sizes in the South China Sea are especially fraught, in terms of ownership claims, as many of them are worthless for the purpose of building real-deal settlements, but could be useful in terms of military infrastructure, allowing ships to dock and refuel, serving as weapons platforms for missiles and anti-aircraft equipment; that sort of thing.
These island-related controversies have sparked or been components of several recent conflicts in the region, including clashes between the Chinese and Vietnamese militaries in 1974 and 1988, and as an apparent effort to lock in their claim to some of these territories, the Chinese government, in December of 1947, published a map called the Location Map of South Sea Islands, which showed the South China Sea, along with an eleven-dash line that encompassed a huge, U-shaped portion of the region, with the implication that everything within that line belonged to China, though the Chinese government never outright said "all of this is ours, stay out."
Beginning in the early 1950s, this line used only nine dashes, and had changed shape a bit, no longer including the Gulf of Tonkin, as a concession to the now-independent North Vietnamese government.
But the former Chinese government, the one that was now occupying and governing from Taiwan, continued to use an eleven-dash line on their official map, the implication being that they don't recognize the changes to Chinese territory made by the successor Chinese government that usurped them back in the mid-20th century.
However many dashes are used, and whatever their specific expanse, though, the significance of this line on what have become known as the Chinese standard maps, released at a regular cadence by the government, has become the topic of furious debate, as the Chinese government has never really clarified what they're saying when they publish these things, allowing the implication to be that this is their home turf, their islands and ocean, but never taking the next step that would be required to formalize that claim.
The implication of any territorial barrier is the violence required to defend it, so the presumed rationale here is that this vagueness, like the similar vagueness around Taiwan's status, allows everyone involved to be unhappy and to hold their own opinions, but to not feel like they need to go to war over the issue, because no hands have been forced in that regard. Taiwan's status is in an official sort of superposition right now, the Chinese government claiming it as their own, the Taiwanese government claiming independence, and everyone else just kind of making positive or negative noises while seldom taking a firm stance one way or the other; a stronger stance and a more formal declaration of independence by Taiwan, supported by other nations, would presumably necessitate military action from China, while the same sort of concrete move by China to retake the island by force would probably trigger action from its opposition, as well.
Leaving things flexible and vague, though, keeps everything nebulous enough that nothing needs to be blown up and no one has to die.
The same seems to be true with this larger pseudo-claim of territory from the Chinese government, these maps showing an area that looks a lot like it belongs to China, but the Chinese government never formally saying "this is ours, and thus, if you want to go to these islands, travel these waters, you'll need our permission, and we'll blast you to kingdom come if you step over the line we've drawn here."
What I'd like to talk about today are the implications of this sort of intentional geographic uncertainty, and the response to a new standard map the Chinese government recently released.
—
The 2023 edition of China's standard map, which usually displays its now-famous nine-dash line alongside other information about the country, like its territorial delineations, capital cities, and the like, has created a moderate uproar throughout the area, in part due to the addition of a tenth dash, and in part because China seems to have added to its collection of territory at the expense of many of its neighbors.
Among those who are upset about these new visual claims is the Russian government, which has become increasingly close with the Chinese government following its invasion of Ukraine, which has left Russia a bit of a pariah, globally, and in many ways propped up and sustained economically by trade with China; but even they made a statement of distaste about this map, which seems to show that an island that was previously divided between Chinese and Russian control is now just China's.
India is also pissed that highly disputed areas along its border with China have seemingly been folded into its neighbor's official collection of territories with the advent of this new map, and Vietnam, the Philippines, Malaysia, Indonesia, and of course, Taiwan, have also spoken out against what this new map implies—Taiwan perhaps more than most, as that additional tenth dash seems to more firmly embrace it than previous maps did, implying that Taiwan is becoming more China's than ever before, which in the current geopolitical context represents a potential military threat.
But those other nations are also pretty peeved, as islands they claim as their own have been looped into this large U-shaped area, portrayed as being China's and China's alone; and although in many cases that's been true of previous versions of the map, as well, the context surrounding this version's release is substantially different than the context in previous years.
So in response to this hubbub of outcries, the Chinese government has said, basically, calm down, this is the same map, what are you all so upset about?
And to some degree that's true: most of these claims were on previous maps as well, but that additional dash does seem pretty aggressive in a world in which the Chinese government has made pretty clear that it both intends to retake Taiwan at some point, and that it's willing and able to do so, militarily, and in which the government has been feverishly investing in more guns, ships, jets, and missiles, and rapidly building out its military presence in these contested areas, including military bases high in the mountains along its border with India, in territory both nations claim as their own, and the construction of ship docks and turrets and missile launchers on tiny little islands in the middle of the ocean, which other nations claim, as well, but which China is physically occupying, punctuating its map-based claims with real-world threats toward anyone who challenges them; realities on the ground, to use the defense world parlance for building military assets of this kind in contested territory.
In 2016, the Permanent Court of Arbitration in The Hague ruled that China's nine-dash line didn't have any basis in international law, and that this region is mostly international waters, usable by anyone, anytime, for any reason, more or less.
China dismissed this ruling and said it would ignore it, basically, so while other nations in the area, like the Philippines, have continued to fish in traditional areas, like the shoals surrounding the Spratly Islands, located between the Philippines, Malaysia, and Vietnam in the South China Sea, China has been building artificial islands atop coral reefs on this island chain, dredging sand onto the reefs and then pouring concrete over that sand, allowing it to build permanent military structures and install radar systems, missile silos, and aircraft hangars, where it also now bases military aircraft.
This has been a huge investment and a lot of work for the Chinese government, but it's allowing them to convert the soft, vague claims printed on their maps into hard realities in the real world; the international arbitrators in The Hague would not honor what China considered to be its historic, national territorial claims, so it went out and made them real; the equivalent of putting up fences around a parcel of land near your home with unclear ownership—it might still be legally debatable who owns that land, but it becomes very clear who has control over it and access to it and who can use it after a fence is put up; and that's even more the case when you begin to deny others access and imply that you are willing and able to defend it if someone decides to step into what is now, on a very practical level, your turf.
This carving out of new territory from international waters and in contested regions by the Chinese government has become an even more substantial issue over the past decade or so, as the race to claim and develop undersea resources has become more frantic, with governments around the world scrambling to secure the minerals and other raw materials that will underpin the next, post-fossil fuel paradigm, and many of these resources, from lithium to nickel to cobalt, are contained in hard-to-reach areas, like, in some cases, underwater continental shelves.
So just as the Arctic has become a hotbed for exploration and infrastructural development, everyone with borders touching the Arctic Ocean doing what they can to build out their ship-based capacity, military bases, and knowledge of what's underneath all that water, for if and when they can eventually justify stepping in to start building and harvesting those raw materials, the South China Sea is also rich with such assets, and this line on this map, and all this real-world building and hardening of military defenses in the area, is meant to allow China, if and when it wants, to start claiming these resources as its own, as it will have already established clear ownership of the territory surrounding these stockpiles, and the ability to defend these assets if anyone else challenges its claim.
Physical conflict related to such claims has already broken out a few times, mostly related to fishing at the moment—the Chinese Coast Guard shooting high-powered water cannons at vessels owned by Philippines-based companies and Vietnamese fishing boats in order to drive them away and, again, implicitly partition off these rich areas, over time redefining them as being for exclusive Chinese use.
But the big concern is that at some point these measures might become more serious and deadly, and this type of conflict, if it escalates, could spiral into something truly global.
The disagreement between China and Taiwan about who owns the island and whether the Taiwanese government is legit or not is generally seen as one of the most volatile flashpoints on the planet, in terms of the potential to accidentally set off WWIII, because of who's allied with whom, and what everyone involved has to gain or lose by engaging in such a conflict.
It's possible, though, that something seemingly lower-level, like a scuffle over fishing grounds, or the development of undersea mineral extraction infrastructure, could be what sets off such a fight, as China defending international waters as if they were its own, putting up a fence on public property, basically, and then shooting anyone who approaches, becomes a test of the international system, and that could lead to a direct conflict between China and, let's say, the Philippines, and that could pull other regional entities like Vietnam and Indonesia, and maybe even India, into the fight, which in turn would potentially bring the US and EU into the conflict, directly or indirectly, alongside Russia and Iran on China's side, again, directly or indirectly.
All of which could compound into something incredibly devastating, all because China is attempting to expand in a manner that is considered illegal by international bodies, because what we might think of as the Western bloc, the US and EU and India and their allies, are trying to box China in, as a response, which China doesn't like and which is probably amplifying its efforts in this regard, and because all of that is making this area a potential tinderbox for conflict—no one wanting to give ground, everyone aware the world is changing around them, economically, climatically, and so on, and everyone trying to set themselves up to be in the best possible position mid-century or so, doing the math and maybe even deciding a big conflict would be worth it, so long as that would make them a bigwig in the rapidly impending, next-step geopolitical paradigm.

Sep 5, 2023 • 23min
Gerontocracy
The podcast discusses the life and legacy of a famous singer, explores the challenges of gerontocracy in positions of power, and raises concerns about the performance of Mitch McConnell due to his age.

Aug 29, 2023 • 15min
BRICS
This week we talk about BRIC, BRICs, and BRICS+.We also discuss the USD, sanctions, and alternative global financial systems.Show notes/transcript: letsknowthings.com

Aug 22, 2023 • 21min
Coup Belt
This week we talk about ECOWAS, Niger, and proxy conflicts.We also discuss military dictatorships, Wagner, and colonies.Show notes/transcript: letsknowthings.com

Aug 15, 2023 • 23min
Bidenomics
Topics covered in this podcast include the Inflation Reduction Act, October Surprises, Hunter Biden's laptop scandal, the 2024 US Presidential election, Trump's legal woes, inflation, President Biden's economic approach, economic policies of the Biden administration, challenges of inflation and Russia's invasion of Ukraine, and President Biden's approval ratings and potential challenges ahead.

Aug 8, 2023 • 18min
Room-Temperature Superconductors
This week we talk about LK-99, mercury, and resistance.We also discuss online citizen science, physics, and replication issues.Show notes/transcript: letsknowthings.com