Chris Aniszczyk, co-founder and CTO of the Cloud Native Computing Foundation (CNCF), argues that AI agents resemble microservices at a surface level, though they differ in how they are scaled and managed. In an interview ahead of KubeCon/CloudNativeCon Europe, he emphasized that being “AI native” requires being cloud native by default. Cloud-native technologies such as containers, microservices, Kubernetes, gRPC, Prometheus, and OpenTelemetry provide the scalability, resilience, and observability needed to support AI systems at scale. Aniszczyk noted that major AI platforms like ChatGPT and Claude already rely on Kubernetes and other CNCF projects.
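To make the observability point concrete, here is a minimal sketch (not from the interview) of instrumenting an inference call with the OpenTelemetry Python SDK. The service name, span attribute, and console exporter are illustrative assumptions; a production setup would export spans to a collector running alongside Prometheus metrics.

```python
# A minimal sketch, assuming the opentelemetry-sdk package is installed.
# The attribute key and the placeholder model call are illustrative only.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Print spans to stdout; real deployments would use an OTLP exporter
# pointed at a collector instead of the console exporter.
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
tracer = trace.get_tracer("inference-service")

def run_inference(prompt: str) -> str:
    # Each model call becomes a span, so latency and errors show up in the
    # same observability stack used for the rest of the cluster.
    with tracer.start_as_current_span("model.generate") as span:
        span.set_attribute("prompt.length", len(prompt))
        return f"(model output for: {prompt!r})"  # placeholder for a real model call

if __name__ == "__main__":
    print(run_inference("Summarize the KubeCon Europe keynotes"))
```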
To address the growing complexity of running generative and agentic AI workloads, the CNCF has launched efforts to extend its conformance programs to AI. New requirements, such as dynamic resource allocation for GPUs and TPUs and specialized networking for inference workloads, are currently handled inconsistently across the industry; the CNCF aims to establish a compatibility baseline that preserves vendor neutrality. Aniszczyk also highlighted CNCF incubation projects such as Metal³ for bare-metal Kubernetes and OpenYurt for managing Kubernetes deployments at the edge.
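As a hedged illustration of the GPU-scheduling requirement, the sketch below uses the Kubernetes Python client to build a Pod that requests GPUs through the device-plugin extended resource `nvidia.com/gpu`. The function, image name, and GPU count are hypothetical; Kubernetes' newer dynamic resource allocation (DRA) API generalizes this device-plugin model and is not shown here.

```python
# A minimal sketch, assuming the "kubernetes" Python client is installed and
# that a cluster with the NVIDIA device plugin would eventually run the Pod.
from kubernetes import client

def gpu_inference_pod(name: str, image: str, gpus: int = 1) -> client.V1Pod:
    """Build a Pod manifest that requests whole GPUs via the extended
    resource exposed by the device plugin (illustrative helper)."""
    container = client.V1Container(
        name="inference",
        image=image,
        resources=client.V1ResourceRequirements(
            limits={"nvidia.com/gpu": str(gpus)},  # GPUs are requested in whole units
        ),
    )
    return client.V1Pod(
        api_version="v1",
        kind="Pod",
        metadata=client.V1ObjectMeta(name=name),
        spec=client.V1PodSpec(containers=[container], restart_policy="Never"),
    )

if __name__ == "__main__":
    pod = gpu_inference_pod("llm-inference", "ghcr.io/example/llm-server:latest")
    # Serialize and print the manifest; applying it would use
    # client.CoreV1Api().create_namespaced_pod(...) against a real cluster.
    print(client.ApiClient().sanitize_for_serialization(pod))
```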
Learn more from The New Stack about CNCF and what to expect in 2026:
Why the CNCF’s New Executive Director Is Obsessed With Inference
CNCF Dragonfly Speeds Container, Model Sharing with P2P