MCPs, CLIs, and APIs Are All Just the Same
Programmatic access to resources has always been non-negotiable. So why is the industry losing its mind?
Every few years, the software industry collectively discovers something it already knew and treats it like a revelation. Right now, that something is MCP.
The Model Context Protocol has been called “USB-C for AI.” It has been called a paradigm shift. Every vendor I talk to is racing to ship an MCP server. Every pitch deck has an MCP slide. The discourse is loud. And most of it misses the point entirely.
MCP is a way for a program to discover and invoke capabilities on a remote system using a structured, self-describing interface. That is also what an API is. That is also what a CLI is. That is what SOAP was. That is what CORBA was, if you were unlucky enough to use it. The mechanism changes. The principle does not.
Programmatic access to resources is not new. It is the foundation of every distributed system ever built. The only thing that changes, decade over decade, is the consumer.
The industry keeps rediscovering the same idea
In 2009, Sun Microsystems submitted WADL to the W3C. It was an XML-based format that described REST services in a machine-readable way: resources, relationships, parameters, methods. That is exactly what an MCP tool definition does. WADL described what a service could do so a machine could figure out how to call it. The W3C never standardized it. Not because the idea was wrong, but because the timing was. There was no consumer sophisticated enough to benefit from it at scale. Before WADL, there was WSDL for SOAP. Before that, IDL for CORBA. MCP is the latest entry. It will not be the last.
Roy Fielding’s original REST dissertation was fundamentally about this same idea. The architecture was designed so that a client could discover what to do next by examining the current representation. You land on a resource, and the resource tells you where its parents are, where its siblings are, what actions you can take, what state transitions are available. The resource graph was navigable. You did not need an out-of-band specification to understand the API because the API told you what it could do, in real time, through its own structure.
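The hypermedia idea is easy to sketch. Below is an illustrative, made-up resource representation (HAL-style field names, not from any real API) and a client that discovers its available actions from the representation itself rather than from an out-of-band spec:

```python
# Illustrative only: a hypermedia-style resource. Field names ("_links",
# "href", "method") follow common HAL-like conventions; the order/payment
# domain is invented for this example.
order = {
    "id": "order-42",
    "status": "pending",
    "_links": {
        "self":   {"href": "/orders/42"},
        "parent": {"href": "/customers/7/orders"},
        "cancel": {"href": "/orders/42/cancel", "method": "POST"},
        "pay":    {"href": "/orders/42/payment", "method": "POST"},
    },
}

def available_actions(resource):
    """Discover state transitions from the current representation,
    not from documentation the client read ahead of time."""
    return {
        rel: link
        for rel, link in resource["_links"].items()
        if link.get("method") == "POST"
    }

# The client learns at runtime that "cancel" and "pay" are the legal
# next moves -- the resource graph is navigable.
print(sorted(available_actions(order)))
```

A client written this way never hardcodes URLs or transitions; when the server adds or removes an action, the client sees it in the next response.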
The industry took REST, stripped out the hypermedia, ignored the self-describing semantics, and built what Fielding himself called “RPC with pretty URLs.” We turned a semantic architecture into a syntactic one and then spent fifteen years complaining that APIs are hard to discover.
Now MCP shows up and says: what if the service described itself to the consumer? What if the consumer could discover available tools and their schemas dynamically?
That is just REST done right. We had this. We chose not to use it.
The consumer changed, not the concept
The reason MCP matters is not the protocol. The protocol itself is JSON-RPC with tool definitions and a handshake. Nothing groundbreaking. What changed is the consumer.
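To make "JSON-RPC with tool definitions and a handshake" concrete, here is a simplified sketch of the exchange. The message shapes are abridged from the MCP specification, and the tool itself ("get_weather") is invented for illustration:

```python
import json

# Simplified sketch of the MCP exchange: a JSON-RPC handshake, then tool
# discovery. Shapes abridged from the spec; the tool is a made-up example.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {"protocolVersion": "2025-03-26", "capabilities": {}},
}

list_tools = {"jsonrpc": "2.0", "id": 2, "method": "tools/list"}

# What a server might answer: each tool is a name, a description, and a
# JSON Schema for its arguments -- structurally the same information that
# WSDL, WADL, and OpenAPI have always carried.
list_tools_result = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "tools": [
            {
                "name": "get_weather",
                "description": "Current weather for a city",
                "inputSchema": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            }
        ]
    },
}

# A consumer -- human or model -- needs nothing beyond this payload to
# know how to call the tool correctly.
tool = list_tools_result["result"]["tools"][0]
print(tool["name"], "takes", tool["inputSchema"]["required"])
```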
For the first time, the primary consumer of a service description is not a human developer. It is a language model. And language models are now sophisticated enough to take a structured tool definition, understand what it does, and invoke it correctly without someone writing integration code.
That is interesting. But it is a shift in the consumer, not in the concept. CLIs have been doing this forever: run --help and you get a structured description of available commands, arguments, types, defaults. APIs have been doing this forever: OpenAPI specs describe endpoints, parameters, request bodies, response schemas. A developer reads the spec and writes the client. An LLM reads the spec and writes the client. Same interface. Different reader.
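The same self-description idea in two older costumes, both invented here for illustration: a CLI that generates its own structured help, and an OpenAPI fragment describing the equivalent HTTP operation.

```python
# 1. A CLI describes itself: argparse generates a structured --help
#    listing commands, arguments, types, and defaults.
import argparse

parser = argparse.ArgumentParser(prog="weather")
parser.add_argument("city", help="City to look up")
parser.add_argument("--units", choices=["metric", "imperial"], default="metric")
help_text = parser.format_help()

# 2. A hypothetical OpenAPI fragment describes the same operation for
#    HTTP consumers: endpoint, parameters, types, required fields.
openapi_fragment = {
    "paths": {
        "/weather": {
            "get": {
                "parameters": [
                    {"name": "city", "in": "query", "required": True,
                     "schema": {"type": "string"}},
                    {"name": "units", "in": "query",
                     "schema": {"type": "string",
                                "enum": ["metric", "imperial"]}},
                ]
            }
        }
    }
}

# Either description tells any reader -- developer or model -- the
# operation's name, its parameters, their types, and which are required.
print("--units" in help_text)
```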
MCP formalizes this for a specific consumer: AI agents talking to external systems over a standardized transport. That is useful. It is not a revolution.
Protocols do not solve the problems your business actually has
I have spent the last several months talking to operations leaders at some of the largest companies in the world. Financial services firms processing trust documents. Airlines validating flight data. Logistics companies routing shipments. Construction firms digitizing handwritten safety plans. Not one of them has ever asked me what protocol we use.
Every single one of them asks the same question: is this accurate? Can I trust it? Am I going to get fired for deploying this?
That is the question the industry should be obsessing over. Not which protocol wraps the call, but whether the system on the other end actually works when the data is messy, the formats are inconsistent, and the stakes are high.
MCP has real problems that matter to these buyers. It has no enforced authentication. It has no audit trail. It has context bloat issues where dozens of tool definitions consume the model’s attention before a single useful action is taken. The 2026 roadmap is essentially a list of things that need to be fixed before enterprises can take it seriously. Security researchers at RSA this year are demonstrating how MCP vulnerabilities can enable remote code execution and full tenant takeover.
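The context-bloat problem is easy to estimate on the back of an envelope. The sketch below uses a hypothetical tool definition and the rough rule of thumb of about four characters per token; both are assumptions, not measurements of any real server.

```python
import json

# Back-of-envelope sketch of "context bloat": every tool definition is
# serialized into the model's context before any work starts.
# ~4 characters per token is a rough rule of thumb, not a measurement.
CHARS_PER_TOKEN = 4

def make_tool(i):
    # Hypothetical tool definition, shaped like an MCP tools/list entry.
    return {
        "name": f"tool_{i}",
        "description": "Does one narrow thing in an external system, "
                       "documented well enough for a model to call it.",
        "inputSchema": {
            "type": "object",
            "properties": {"target_id": {"type": "string"},
                           "dry_run": {"type": "boolean"}},
            "required": ["target_id"],
        },
    }

tools = [make_tool(i) for i in range(40)]  # "dozens of tool definitions"
payload = json.dumps(tools)
estimated_tokens = len(payload) // CHARS_PER_TOKEN
print(f"{len(tools)} tools ~ {estimated_tokens} tokens of context, "
      "spent before the first useful action")
```

Thousands of tokens of overhead before the model does anything is exactly the attention tax the roadmap has to address.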
These are not novel problems. They are the same problems every service description protocol has faced. WSDL had security gaps. CORBA had complexity problems. WADL had adoption problems. The pattern is always the same: a new protocol arrives, the industry gets excited, people bolt it onto everything, the security researchers find holes, the enterprises ask about governance.
Meanwhile, the companies that actually need to move data between systems just use APIs. They use webhooks. They use SDKs. They use whatever works. Because the job was never about the protocol. The job is getting data from point A to point B, accurately, reliably, every time. The models by themselves are not going to save anyone. You need the bricks. What those bricks look like matters far more than the protocol that calls them.
What actually matters
When we talk to enterprise buyers at bem, the conversation is never about MCP or REST or any protocol. The conversation is about accuracy: what is the confidence score on each field, how do corrections feed back into the system, can they see the reasoning behind every extraction. It is about deployment: does this run in their VPC, is data encrypted at rest and in transit, does it comply with their egress policies, can they explain this to their InfoSec team on day one. It is about trust: not “trust me, it works,” but “here are the metrics, here is the audit trail, here is what the system got wrong and here is how it learned from it.”
We have been API-first since day one. Adding an MCP server is, frankly, trivial for us. Our API already describes what it does. Our primitives are already composable. Our schemas are already structured. Wrapping that in MCP is texture. It is not transformation.
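Why is the wrapping trivial? Because the MCP tool definition is a re-projection of information an OpenAPI spec already carries. The sketch below is a simplified illustration of that mapping; it handles only JSON request bodies, and the operation shown is hypothetical, not bem's actual API.

```python
# Simplified sketch: deriving an MCP-style tool definition from an
# OpenAPI operation. Handles only application/json request bodies;
# illustrative, not anyone's production code.

def openapi_operation_to_mcp_tool(path, method, operation):
    body_schema = (
        operation.get("requestBody", {})
        .get("content", {})
        .get("application/json", {})
        .get("schema", {"type": "object"})
    )
    return {
        "name": operation.get("operationId", f"{method}_{path.strip('/')}"),
        "description": operation.get("summary", ""),
        "inputSchema": body_schema,
    }

# Hypothetical OpenAPI operation for illustration.
operation = {
    "operationId": "extract_document",
    "summary": "Extract structured fields from an uploaded document",
    "requestBody": {
        "content": {
            "application/json": {
                "schema": {
                    "type": "object",
                    "properties": {"document_url": {"type": "string"}},
                    "required": ["document_url"],
                }
            }
        }
    },
}

tool = openapi_operation_to_mcp_tool("/extract", "post", operation)
print(tool["name"])  # same contract, new wrapper
```

Everything the tool definition needs was already in the spec; the wrapper adds a transport, not a capability.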
MCP is a way for Anthropic to compete with the orchestration frameworks of the world. Credit to them, it is working. OpenAI adopted it. Google adopted it. It is becoming a standard. Standards are good. But let us not confuse the standardization of an interface with the invention of a new paradigm.
The paradigm is and always has been: give programs structured access to resources. The interface evolves. The transport evolves. The consumer evolves. The principle is the same.
If your system is well-designed, if your API surface is clean, if your primitives are composable, and if your service can describe itself, then you are already ready for whatever protocol comes next. You were ready for WADL. You were ready for REST. You are ready for MCP. You will be ready for whatever replaces it.
Your buyers are not going to ask you which protocol you support. They are going to ask you if the data is right, if the system is secure, and if it gets better the more they use it. Answer those questions first. The protocols will catch up.