Cloud Native and AI: Why Open Source Needs Standards Like MCP
The past decade of cloud native was about scaling microservices with Kubernetes and GitOps. The next will be about how these systems work with AI.
Earlier this year, the Argo Project community received a major contribution: a Model Context Protocol (MCP) server for Argo CD, donated by Akuity and now maintained as a community project. This shows how open standards like MCP, combined with open source collaboration, are becoming critical as AI and cloud native converge.
Why Standards Matter for AI and Infrastructure
Cloud native technology has advanced through standards. Containers became practical with the Open Container Initiative (OCI) specification. Service meshes gained adoption once interoperability became a priority. GitOps scaled through common practices in projects like Argo CD and Flux.
AI is at a similar stage. Models and agents are powerful, but connecting them to infrastructure tools such as deployment platforms, observability stacks or security scanners often requires bespoke, one-off integration code.
MCP defines a consistent way for AI systems to connect with these tools. It serves as a universal adapter for AI in cloud native environments.
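To make that concrete, here is a minimal sketch of what exposing an infrastructure capability through MCP can look like, using the MCP TypeScript SDK (@modelcontextprotocol/sdk). The tool name, its parameter and the stubbed handler are illustrative assumptions for this example, not the API of any real server.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Describe the server to clients that connect to it.
const server = new McpServer({ name: "example-infra-server", version: "0.1.0" });

// Register a tool that MCP clients can discover and invoke.
// The name, parameter and response below are illustrative assumptions.
server.tool(
  "get_deployment_status",                           // hypothetical tool name
  { name: z.string().describe("Deployment name") },
  async ({ name }) => ({
    // A real server would query the underlying platform's API here.
    content: [{ type: "text", text: `Status of ${name}: Healthy (stubbed)` }],
  })
);

// Serve over stdio so any MCP-capable host can launch and talk to it.
await server.connect(new StdioServerTransport());
```

Because the tool is described through the protocol itself, any MCP-capable client can discover and call it without bespoke glue code.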
A GitOps Example: Argo CD Meets MCP
GitOps provides a practical example. Argo CD, the most widely used GitOps operator, keeps Kubernetes workloads synchronized with their Git-defined desired state for thousands of organizations.
In the past, adding AI to GitOps meant building custom integrations. With the new MCP server for Argo CD, AI agents can work directly with core workflows such as checking status, syncing deployments and pulling logs. This demonstrates how AI-assisted operations can reduce manual effort and simplify troubleshooting.
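As a rough illustration of the agent side, the sketch below connects to a locally running MCP server over stdio and calls one of its tools, using the client helpers from the MCP TypeScript SDK. The launch command (argocd-mcp) and the tool name (sync_application) are assumptions made for the example; check the argoproj-labs/mcp-for-argocd documentation for the actual commands and tool names.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server as a child process and speak MCP to it over stdio.
const transport = new StdioClientTransport({
  command: "argocd-mcp",              // hypothetical launch command
  args: ["stdio"],
});

const client = new Client({ name: "gitops-agent", version: "0.1.0" });
await client.connect(transport);

// Discover the tools the server exposes, then call one by name.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

const result = await client.callTool({
  name: "sync_application",           // hypothetical tool name
  arguments: { applicationName: "guestbook" },
});
console.log(result.content);

await client.close();
```

In practice an AI host would make these same discovery and tool calls on the agent's behalf, but the flow is identical: list what the server offers, then invoke it by name.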
Donating the Argo CD MCP Server to the Community
For MCP to succeed, projects need to be developed in the open. That is why the Argo CD MCP Server, first built by engineers at Akuity, was donated to the Argo Project community. It now lives under argoproj-labs/mcp-for-argocd, where anyone can contribute.
The project has already gained traction. Users are testing it, filing feature requests and submitting pull requests. What began as an experiment is now a community-owned effort that connects AI and GitOps in practical ways.
Beyond GitOps: A Bigger Movement
The same idea applies across the cloud native stack. MCP could enable AI agents to:
- Query metrics or traces from observability tools.
- Inspect traffic flows or apply policies in service meshes.
- Run compliance checks in security scanners.
In every case, MCP lowers integration friction and creates a shared foundation for experimentation. Enterprises benefit from faster and safer adoption. The community benefits by avoiding fragmentation and accelerating innovation, just as the OCI specification did for containers.
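To illustrate why the friction drops, the sketch below drives three different MCP servers through exactly the same client code; only the launch commands differ, and all three commands here are placeholders rather than real published servers.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// All three commands are placeholders; swap in whichever MCP servers you run.
const servers = [
  { label: "observability", command: "metrics-mcp" },   // hypothetical
  { label: "service mesh", command: "mesh-mcp" },        // hypothetical
  { label: "security", command: "scanner-mcp" },         // hypothetical
];

for (const { label, command } of servers) {
  const client = new Client({ name: "platform-agent", version: "0.1.0" });
  await client.connect(new StdioClientTransport({ command }));

  // The discovery call is identical no matter what the server wraps.
  const { tools } = await client.listTools();
  console.log(`${label}: ${tools.map((t) => t.name).join(", ")}`);

  await client.close();
}
```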
Why Open Source Is Key
Cloud native progresses fastest when the community works in the open: sharing code, aligning on standards and solving problems together.
The donation of the Argo CD MCP Server reflects this approach. Under community ownership, its development will be guided by shared needs rather than by a single vendor's priorities. In cloud native, open source is more than a licensing model; it's how real progress happens.
Looking Ahead
It is still early for MCP, but the trajectory looks familiar:
- Containers became mainstream once standards like OCI emerged.
- GitOps scaled because projects like Argo CD and Flux rallied the community.
- Service mesh adoption accelerated once interoperability was prioritized.
AI in infrastructure will likely follow the same path. Open standards and open source projects will make it safe, consistent, and scalable.
A Call to the Community
MCP’s success depends on community experimentation. Whether you are a contributor, an operator or simply curious about AI in Kubernetes, now is the time to get involved.
The Argo CD MCP Server donation is one example of how open source contributions can accelerate progress. By making projects like this community-owned, the ecosystem can shape the way AI and cloud native come together.
Shared standards changed how we build and run applications. They can now change how we operate them in the age of AI. Community-driven approaches, not proprietary solutions, will make that possible.
KubeCon + CloudNativeCon North America 2025 is taking place Nov. 10-13 in Atlanta, Georgia. Register now.