Kubernetes Locally, Docker Compose Locally?
I think that both of us are hitting, in a way, a brick wall, and looking at that same wall from different sides, right? I have Knative, and I'm using it as an example of something that makes it extremely easy for anybody to define how their application should run in production. Somebody else set it up, but it's extremely easy to define how it will run in production. Yet I cannot run it locally. And so that's why you need that in-between piece to turn Docker Compose into a different target for the application itself. Where does the Open Application Model fit into this? Or will it ever?
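To make the "in-between piece" idea concrete, here is a hypothetical sketch of the translation the speaker is describing: the same application defined once as a Knative Service for production, and hand-converted into a Docker Compose file as a local target. The service name, image, and port are invented for illustration; only the general shape of each format is taken from the respective specifications.

```yaml
# Hypothetical Knative Service for production (names and image are assumptions).
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello
spec:
  template:
    spec:
      containers:
        - image: ghcr.io/example/hello:latest
          env:
            - name: TARGET
              value: "world"
```

A local-development equivalent, written by hand as a Docker Compose target for the same container:

```yaml
# Hypothetical Docker Compose translation of the service above, for local runs.
services:
  hello:
    image: ghcr.io/example/hello:latest
    environment:
      TARGET: "world"
    ports:
      - "8080:8080"   # assumed port; Knative injects PORT at runtime in production
```

The point of the question about the Open Application Model is that OAM separates the application definition from the deployment target, so in principle one OAM component could be rendered to either of the two files above instead of maintaining both by hand.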