
The New Stack Podcast
The New Stack Podcast is all about the developers, software engineers and operations people who build at-scale architectures that change the way we develop and deploy software.
For more content from The New Stack, subscribe on YouTube at: https://www.youtube.com/c/TheNewStack
Latest episodes

Jan 3, 2024 • 24min
2023 Top Episodes - What’s Platform Engineering?
Platform engineering “is the art of designing and binding all of the different tech and tools that you have inside of an organization into a golden path that enables self service for developers and reduces cognitive load,” said Kaspar von Grünberg, founder and CEO of Humanitec, in this episode of The New Stack Makers podcast.
This structure is important for individual contributors, von Grünberg said, as well as backend engineers: “If you look at the operation teams, it reduces their burden to do repetitive things. And so platform engineers build and design internal developer platforms, and help and serve users.”
This conversation, hosted by Heather Joslyn, TNS features editor, dove into platform engineering: what it is, how it works, the problems it is intended to solve, and how to get started in building a platform engineering operation in your organization. It also debunks some key fallacies around the concept.
Learn more from The New Stack about Platform Engineering and Humanitec:
Platform Engineering Overview, News, and Trends
The Hype Train Is Over. Platform Engineering Is Here to Stay
9 Steps to Platform Engineering Hell
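To make the “golden path” idea concrete, here is a small hypothetical sketch (not Humanitec's actual model or API): a developer supplies only the handful of fields they care about, and the platform expands that request into a full deployment manifest with organization-wide defaults baked in, which is where the reduction in cognitive load comes from.

```python
from dataclasses import dataclass

# Hypothetical self-service request: the developer fills in only what they care about.
@dataclass
class WorkloadRequest:
    name: str
    image: str
    env: str = "dev"

def render_deployment(req: WorkloadRequest) -> dict:
    """Expand a small self-service request into a full Kubernetes Deployment manifest.

    The platform, not the developer, owns labels, replica counts, and resource
    defaults (the "golden path"). This is a generic illustration, not Humanitec's API.
    """
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": req.name, "labels": {"env": req.env, "managed-by": "platform"}},
        "spec": {
            "replicas": 2 if req.env == "prod" else 1,
            "selector": {"matchLabels": {"app": req.name}},
            "template": {
                "metadata": {"labels": {"app": req.name}},
                "spec": {
                    "containers": [{
                        "name": req.name,
                        "image": req.image,
                        # Platform-enforced defaults so every team doesn't re-decide them.
                        "resources": {"requests": {"cpu": "100m", "memory": "128Mi"}},
                    }]
                },
            },
        },
    }

print(render_deployment(WorkloadRequest(name="orders-api", image="ghcr.io/acme/orders:1.0")))
```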

Dec 27, 2023 • 32min
2023 Top Episodes - The End of Programming is Nigh
Is the end of programming nigh? That's the big question posed in this episode recorded earlier in 2023. It was very popular among listeners, and with the topic being as relevant as ever, we wanted to wrap up the year by highlighting this conversation again.
If you ask Matt Welsh, he'd say yes, the end of programming is upon us. As Richard MacManus wrote on The New Stack, Welsh is a former professor of computer science at Harvard who spoke at a virtual meetup of the Chicago Association for Computing Machinery (ACM), explaining his thesis that ChatGPT and GitHub Copilot represent the beginning of the end of programming.
Welsh joined us on The New Stack Makers to discuss his perspectives about the end of programming and answer questions about the future of computer science, distributed computing, and more.
Welsh is now the founder of fixie.ai, a platform his company is building to let companies develop applications on top of large language models and extend them with different capabilities.
For 40 to 50 years, programming language design has had one goal: make it easier to write programs, Welsh said in the interview. Still, programming languages are complex, Welsh said, and no amount of work is going to make them simple.
Learn more from The New Stack about AI and the future of software development:
Top 5 Large Language Models and How to Use Them Effectively
30 Non-Trivial Ways for Developers to Use GPT-4
Developer Tips in AI Prompt Engineering

Dec 21, 2023 • 16min
The New Age of Virtualization
KubeVirt, a relatively new capability within Kubernetes, signifies a shift in the virtualization landscape, allowing operations teams to run KVM virtual machines nested in containers behind the Kubernetes API. This integration means that the Kubernetes API now encompasses the concept of virtual machines, enabling VM-based workloads to operate seamlessly within a cluster behind the API. This development addresses the challenge of transitioning traditional virtualized environments into cloud-native settings, where certain applications may resist containerization or require substantial investments for adaptation.
The emerging era of virtualization simplifies the execution of virtual machines without concern for the underlying infrastructure, presenting various opportunities and use cases. Noteworthy advantages include simplified migration of legacy applications without the need for containerization, thereby reducing associated costs.
KubeVirt 1.1, discussed at KubeCon in Chicago by Red Hat's Vladik Romanovsky and Nvidia's Ryan Hallisey, introduces features like memory hotplug and vCPU hotplug, emphasizing the stability of KubeVirt. The platform's stability now allows for the implementation of features that were previously constrained.
Learn more from The New Stack about KubeVirt and the Cloud Native Computing Foundation:
The Future of VMs on Kubernetes: Building on KubeVirt
A Platform for Kubernetes
Scaling Open Source Community by Getting Closer to Users
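As a rough illustration of what “virtual machines behind the Kubernetes API” looks like in practice, the sketch below defines a KubeVirt VirtualMachine custom resource and submits it with the official Kubernetes Python client. It assumes the KubeVirt CRDs are installed in the cluster and a working local kubeconfig; the manifest mirrors the small demo VM the KubeVirt project publishes, not anything specific to the 1.1 release discussed in the episode.

```python
# Minimal sketch (assumes KubeVirt is installed and kubectl/kubeconfig access works):
# define a VirtualMachine custom resource and create it through the Kubernetes API.
from kubernetes import client, config

vm = {
    "apiVersion": "kubevirt.io/v1",
    "kind": "VirtualMachine",
    "metadata": {"name": "demo-vm", "namespace": "default"},
    "spec": {
        "running": True,  # start the VM as soon as the object is created
        "template": {
            "spec": {
                "domain": {
                    "devices": {"disks": [{"name": "rootdisk", "disk": {"bus": "virtio"}}]},
                    "resources": {"requests": {"memory": "128Mi"}},
                },
                "volumes": [{
                    "name": "rootdisk",
                    # Tiny demo image published by the KubeVirt project.
                    "containerDisk": {"image": "quay.io/kubevirt/cirros-container-disk-demo"},
                }],
            }
        },
    },
}

config.load_kube_config()
api = client.CustomObjectsApi()
api.create_namespaced_custom_object(
    group="kubevirt.io", version="v1", namespace="default",
    plural="virtualmachines", body=vm,
)
print("VirtualMachine 'demo-vm' submitted; the KVM guest runs inside a pod on the cluster.")
```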

Dec 13, 2023 • 20min
Kubernetes Goes Mainstream? With Calico, Yes
The Kubernetes landscape is evolving, shifting from the domain of visionaries and early adopters to a more mainstream audience. Tigera, represented by CEO Ratan Tipirneni at KubeCon North America in Chicago, recognizes the changing dynamics and the demand for simplified Kubernetes solutions. Tigera's open-source Calico security platform has been updated with a focus on mainstream users, presenting a cohesive and user-friendly solution. This update encompasses five key capabilities: vulnerability scoring, configuration hardening, runtime security, network security, and observability.
The aim is to provide users with a comprehensive view of their cluster's security through a zero to 100 scoring system, tracked over time. Tigera's recommendation engine suggests actions to enhance overall security based on the risk profile, evaluating factors such as egress traffic controls and workload isolation within dynamic Kubernetes environments. Tigera emphasizes the importance of understanding the actual flow of data across the network, using empirical data and observed behavior to build accurate security measures rather than relying on projections. This approach addresses the evolving needs of customers who seek not just vulnerability scores but insights into runtime behavior for a more robust security profile.
Learn more from The New Stack about Tigera and Cloud Native Security:
Cloud Native Network Security: Who’s Responsible?
Turbocharging Host Workloads with Calico eBPF and XDP
3 Observability Best Practices for Cloud Native App Security
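The episode describes a zero to 100 score aggregated from the five capability areas. Tigera's actual scoring logic isn't covered in this summary, so the sketch below is purely a hypothetical illustration of how such a weighted aggregate over per-area scores might be computed; the weights and formula are invented for the example.

```python
# Hypothetical illustration only: a 0-100 cluster security score computed as a
# weighted average of per-capability scores. Not Tigera's algorithm.
WEIGHTS = {
    "vulnerability_scoring": 0.30,
    "configuration_hardening": 0.20,
    "runtime_security": 0.20,
    "network_security": 0.20,
    "observability": 0.10,
}

def cluster_security_score(per_area: dict[str, float]) -> float:
    """Combine per-area scores (each 0-100) into a single 0-100 score."""
    return round(sum(WEIGHTS[area] * per_area.get(area, 0.0) for area in WEIGHTS), 1)

print(cluster_security_score({
    "vulnerability_scoring": 55,
    "configuration_hardening": 80,
    "runtime_security": 70,
    "network_security": 60,
    "observability": 90,
}))  # -> 67.5
```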

Dec 12, 2023 • 19min
Hello, GitOps -- Boeing's Open Source Push
Boeing, with around 6,000 engineers, is emphasizing open source engagement by focusing on three main themes, according to Damani Corbin, who heads Boeing's Open Source office. He joined our host, Alex Williams, for a discussion at KubeCon+CloudNativeCon in Chicago.
The first priority Corbin talks about is simplifying the consumption of open source software for developers. Second, Boeing aims to facilitate developer contributions to open source projects, fostering involvement in communities like the Cloud Native Computing Foundation and the Linux Foundation. The third theme involves identifying opportunities for "inner sourcing" to share internally developed solutions across different groups.
Boeing is actively working to break down barriers and encourage code reuse across the organization, promoting participation in open source initiatives. Corbin highlights the importance of separating business-critical components from those that can be shared with the community, prioritizing security and extending efforts to enhance open source security practices. The organization is consolidating its open source strategy by collaborating with legal and information security teams.
Corbin emphasizes the goal of making open source involvement accessible and attractive, with a phased approach to encourage meaningful contributions and ultimately enabling the compensation of engineers for open source work in the future.
Learn more from The New Stack about Boeing and CNCF open source projects:
How Boeing Uses Cloud Native
How Open Source Has Turned the Tables on Enterprise Software
Scaling Open Source Community by Getting Closer to Users
Mercedes-Benz: 4 Reasons to Sponsor Open Source Projects

Dec 7, 2023 • 18min
How AWS Supports Open Source Work in the Kubernetes Universe
At KubeCon + CloudNativeCon North America 2022, Amazon Web Services (AWS) revealed plans to mirror Kubernetes assets hosted on Google Cloud, addressing the Cloud Native Computing Foundation's (CNCF) egress costs. A year later, the project, led by AWS's Davanum Srinivas, redirects image requests to the nearest cloud provider, reducing egress costs for users.
AWS's Todd Neal and Jonathan Innis discussed this on The New Stack Makers podcast recorded at KubeCon North America 2023. Neal explained the registry's functionality, allowing users to pull images directly from the respective cloud provider, avoiding egress costs.
The discussion also highlighted AWS's recent open source contributions, including beta features in kubectl, the prerelease of containerd 2.0, and Microsoft's support for Karpenter on Azure. Karpenter, an AWS-developed Kubernetes cluster autoscaler, simplifies node group configuration, dynamically selecting instance types and availability zones based on running pods.
The AWS team encouraged developers to contribute to Kubernetes ecosystem projects and join the sig-node CI subproject to enhance kubelet reliability. The conversation in this episode emphasized the benefits of open development for rapid feedback and community collaboration.
Learn more from The New Stack about AWS and Open Source:
Powertools for AWS Lambda Grows with Help of Volunteers
Amazon Web Services Open Sources a KVM-Based Fuzzing Framework
AWS: Why We Support Sustainable Open Source
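The registry in question is registry.k8s.io, which serves image manifests itself and redirects blob downloads toward a backend near the client. The exploratory sketch below (not from the episode) fetches an image index anonymously, then requests a layer blob without following redirects so the Location header shows which cloud backend would serve the bytes. It assumes anonymous pulls are allowed, that the pause image is multi-arch, and that the redirect behavior stays roughly as described; if the registry ever requires a token, an auth step would be needed.

```python
# Exploratory sketch: observe how registry.k8s.io redirects blob downloads.
# Assumes anonymous pulls against this public registry; behavior may vary by network.
import requests

REGISTRY = "https://registry.k8s.io"
IMAGE = "pause"
TAG = "3.9"
ACCEPT = ", ".join([
    "application/vnd.oci.image.index.v1+json",
    "application/vnd.docker.distribution.manifest.list.v2+json",
    "application/vnd.oci.image.manifest.v1+json",
    "application/vnd.docker.distribution.manifest.v2+json",
])

# Manifests/indexes come back from the registry path itself.
doc = requests.get(f"{REGISTRY}/v2/{IMAGE}/manifests/{TAG}", headers={"Accept": ACCEPT}).json()

# Multi-arch images return an index; resolve one platform-specific manifest.
if "manifests" in doc:
    digest = doc["manifests"][0]["digest"]
    doc = requests.get(f"{REGISTRY}/v2/{IMAGE}/manifests/{digest}", headers={"Accept": ACCEPT}).json()

# Blob requests get redirected; don't follow the redirect so we can inspect the target.
layer = doc["layers"][0]["digest"]
resp = requests.get(f"{REGISTRY}/v2/{IMAGE}/blobs/{layer}", allow_redirects=False)
print(resp.status_code, resp.headers.get("Location"))
```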

Dec 6, 2023 • 22min
2024 Forecast: What Can Developers Expect in the New Year?
In the past year, developers have faced both promise and uncertainty, particularly in the realm of generative AI. Heath Newburn, global field CTO for PagerDuty, joins TNS host Heather Joslyn to talk about the impact AI and other topics will have on developers in 2024.
Newburn anticipates a growing emphasis on DevSecOps in response to high-profile cyber incidents, noting a shift in executive attitudes toward security spending. The rise of automation-centric tools like Backstage signals a changing landscape in the link between development and operations tools. Notably, there's a move from focusing on efficiency gains to achieving new outcomes, with organizations seeking innovative products rather than marginal coding speed improvements.
Newburn highlights the importance of experimentation, encouraging organizations to identify areas for trial and error, learning swiftly from failures. The upcoming year is predicted to favor organizations capable of rapid experimentation and information gathering over perfection in code writing.
Listen to the full podcast episode as Newburn further discusses his predictions related to platform engineering, remote work, and the continued impact of generative AI.
Learn more from The New Stack about PagerDuty and trends in software development:
How AI and Automation Can Improve Operational Resiliency
Why Infrastructure as Code Is Vital for Modern DevOps
Operationalizing AI: Accelerating Automation, DataOps, AIOps

Dec 5, 2023 • 20min
How to Know If You’re Building the Right Internal Tools
Rob Skillington, co-founder and CTO of Chronosphere, discusses building internal tools and the challenges engineers face. He emphasizes the importance of understanding project abstractions and considering build or buy decisions. Skillington shares lessons from his experience at Uber, highlighting the importance of knowing the audience and customer base. He addresses the 'not invented here syndrome' prevalent in organizations like Microsoft and suggests younger companies explore external solutions. The conversation provides insights into Skillington's experiences and considerations in developing internal tools and platforms.

Nov 30, 2023 • 26min
Hey Programming Language Developer -- Get Over Yourself
Jean Yang, founder of API observability company Akita Software, emphasizes that programming languages should be shaped by software development needs and data, rather than philosophical ideals. Yang, a former assistant professor at Carnegie Mellon University, believes that programming tools and processes should be influenced by actual use and data, prioritizing the developer experience over the language creator's beliefs. With a background in programming languages, Yang advocates for a shift away from the outdated notion that language developers are building solely for themselves.
In this discussion on The New Stack Makers, Yang underscores the importance of understanding the reality of developers' needs, especially as developer tools have evolved into a full-time industry. She argues for a focus on UX design and product fundamentals in developing tools, moving beyond the traditional mindset where developer tools were considered side projects.
Yang founded Akita to address the challenges of building reliable software systems in a world dominated by APIs and microservices. The company transitioned to API observability, recognizing the crucial role APIs play in enhancing the understandability of complex systems. Yang's commitment to improving software correctness and the belief in APIs as key to abstraction and ease of monitoring align with Postman's direction after acquiring Akita. Postman aims to serve developers worldwide, emphasizing the significance of APIs in complex systems.
Check out more episodes from The Tech Founder Odyssey series:
How Byteboard’s CEO Decided to Fix the Broken Tech Interview
A Lifelong ‘Maker’ Tackles a Developer Onboarding Problem
How Teleport’s Leader Transitioned from Engineer to CEO

Nov 28, 2023 • 12min
Docker CTO Explains How Docker Can Support AI Efforts
Docker CTO Justin Cormack reveals that Docker has been a go-to tool for data scientists in AI and machine learning for years, primarily in specialized areas like image processing and prediction models. However, the release of OpenAI's ChatGPT last year sparked a significant surge in Docker's popularity within the AI community.
The focus shifted to large language models (LLMs), with a growing interest in the retrieval-augmented generation (RAG) stack. Docker's collaboration with Ollama enables developers to run Llama 2 and Code Llama locally, simplifying the process of starting and experimenting with AI applications. Additionally, partnerships with Neo4j and LangChain allow for enhanced support in storing and retrieving data for LLMs.
Cormack emphasizes the simplicity of getting started locally, addressing challenges related to GPU shortages in the cloud. Docker's efforts also include building an AI solution using its own data, aiming to assist users in Dockerizing applications through an interactive notebook in Visual Studio Code. This tool leverages LLMs to analyze applications, suggest improvements, and generate Dockerfiles tailored to specific languages and applications.
Docker's integration with AI technologies demonstrates a commitment to making AI and Docker more accessible and user-friendly.
Learn more from The New Stack about AI and Docker:
Artificial Intelligence News, Analysis, and Resources
Will GenAI Take Jobs? No, Says Docker CEO
Debugging Containers in Kubernetes — It’s Complicated
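As a rough illustration of the local workflow mentioned above, the sketch below sends a prompt to a locally running Ollama server over its default REST endpoint. It assumes Ollama is installed and the llama2 model has already been pulled (for example with `ollama pull llama2`); this is a generic example of exercising a local model, not the Docker/Ollama integration itself.

```python
# Minimal sketch: prompt a locally running Ollama server (default port 11434).
# Assumes `ollama pull llama2` has been run beforehand.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama2",
        "prompt": "Suggest a Dockerfile for a small Python Flask service.",
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])
```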