Why MCP uses JSON-RPC 2.0 instead of gRPC
The Model Context Protocol (MCP), introduced by Anthropic, is designed to connect AI models (like Claude) to local data and tools. While gRPC is the "gold standard" for high-performance microservices (it powers much of Kubernetes' internal communication, for example), the creators of MCP chose JSON-RPC 2.0 for very specific reasons related to the AI developer ecosystem.
Here is why MCP uses JSON-RPC 2.0 instead of gRPC:
1. The "stdio" First Approach (Local Tools)
The most common way to run an MCP server today is as a local process where the AI client talks to the server via Standard Input/Output (stdio).
- JSON-RPC: It is just text. You can pipe a JSON string into a script and read the output. It’s trivial to implement over a simple process pipe.
- gRPC: It is a binary protocol (Protocol Buffers) that strictly requires HTTP/2 as a transport. You cannot easily "pipe" gRPC into a local Python script without setting up a full network stack, managing ports, and handling certificates.
- Why it matters for AI: Many AI "tools" are just local scripts (e.g., a script that reads your local files). JSON-RPC allows these tools to be "plug-and-play" without worrying about port conflicts or firewall rules on a developer's laptop.
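A minimal sketch of what "trivial to implement over a pipe" means in practice. This is not the official MCP SDK; the `list_files` method and the file names are hypothetical, and a real MCP server would implement the spec's actual methods and message framing.

```python
import json
import sys

def handle_request(line: str) -> str:
    """Handle one JSON-RPC 2.0 request read from a pipe, return the response.

    `list_files` is a made-up tool method for illustration only.
    """
    req = json.loads(line)
    if req.get("method") == "list_files":
        result = ["notes.txt", "todo.md"]  # stand-in for a real directory listing
        return json.dumps({"jsonrpc": "2.0", "id": req.get("id"), "result": result})
    # Unknown method: standard JSON-RPC error object
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req.get("id"),
        "error": {"code": -32601, "message": "Method not found"},
    })

# The whole "transport" is a loop over stdin, one JSON object per line:
# for line in sys.stdin:
#     print(handle_request(line), flush=True)
```

No ports, no TLS, no network stack: the client simply launches this script as a subprocess and reads/writes its pipes.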
2. Lower Barrier to Entry (Zero Code Gen)
For a protocol to become a "standard," it has to be easy for everyone to implement—from a hobbyist with a 10-line Python script to a massive enterprise.
- JSON-RPC: Every programming language has a JSON library. You don't need special tools to start; you just compose a dictionary/object and print it.
- gRPC: Requires a schema file (.proto), a compiler (protoc), and generated code for every language involved. This adds a "build step" that often frustrates AI researchers and script-heavy developers.
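The "zero code gen" point in one snippet: a complete, valid request built with nothing but the standard library. The `read_file` tool name and the file path are illustrative; `tools/call` follows MCP's method-naming convention.

```python
import json

# Build a JSON-RPC 2.0 request with no schema file, no compiler, no codegen:
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "read_file", "arguments": {"path": "notes.txt"}},
}
wire = json.dumps(request)  # serializing the dict is the entire "build step"
print(wire)
```

The gRPC equivalent would require writing a .proto file, running protoc, and importing the generated stubs before a single request could be sent.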
3. LLMs' Native Language Is JSON
LLMs are exceptionally good at generating and parsing JSON. While the LLM doesn't "speak" the protocol directly (the client/app does), the data being passed back and forth (tool definitions, resource content, prompts) is almost always JSON-structured data.
- By using JSON-RPC, the protocol format matches the data format. There is no translation layer needed between the "wire format" and the "AI format."
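To make "no translation layer" concrete: an MCP-style tool definition is itself JSON, and wrapping it in a JSON-RPC response only adds an envelope. The schema shape below is illustrative, not a verbatim copy of the MCP spec.

```python
import json

# The payload the LLM reasons about (a tool definition) is already JSON:
tool = {
    "name": "read_file",
    "description": "Read a local file",
    "inputSchema": {"type": "object", "properties": {"path": {"type": "string"}}},
}

# The wire message is the same data plus a thin JSON-RPC envelope:
response = {"jsonrpc": "2.0", "id": 7, "result": {"tools": [tool]}}

# One codec handles both; nothing is re-encoded between "wire" and "AI" formats:
assert json.loads(json.dumps(response))["result"]["tools"][0]["name"] == "read_file"
```

With a binary protocol, the same tool definition would be serialized to Protobuf on the wire and then converted back to JSON before the model could see it.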
4. Human Readability & Debugging
AI development involves a lot of "hallucination debugging." When a tool call fails, developers need to see exactly what was sent.
- JSON-RPC: You can open a terminal, look at the logs, and read exactly what happened: {"method": "list_resources", "params": {}}.
- gRPC: The logs are binary blobs. You need specialized tools (like Wireshark or specific gRPC debuggers) to understand what the messages contain.
5. Transport Agnosticism
While gRPC is tied to HTTP/2, JSON-RPC 2.0 is completely transport-agnostic.
- MCP is designed to work over stdio (local), HTTP/SSE (web-based), and potentially WebSockets or postMessage (in-browser).
- JSON-RPC fits all of these perfectly. It doesn't care how the string gets from Point A to Point B.
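A sketch of that transport-agnosticism: the same serialized string works on any byte channel. `io.StringIO` stands in for a process pipe so the example is self-contained, and the HTTP endpoint in the comment is hypothetical.

```python
import io
import json

# One message, independent of how it travels:
message = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "ping"}) + "\n"

# Over stdio: write the line to the subprocess's stdin
# (StringIO stands in for the real pipe here).
pipe = io.StringIO()
pipe.write(message)

# Over HTTP: the exact same bytes become a POST body, e.g. (hypothetical URL):
# requests.post("http://localhost:8080/mcp", data=message,
#               headers={"Content-Type": "application/json"})

# The payload is byte-for-byte identical either way:
assert pipe.getvalue() == message
```

gRPC, by contrast, bakes its framing into HTTP/2 itself, so it cannot be dropped onto a plain pipe without reimplementing that layer.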
6. The Bottleneck Argument
The primary reason to choose gRPC is performance (low latency, high throughput).
- In the AI world, the "bottleneck" is the inference time of the LLM. If it takes 2 seconds for Claude to generate a response, saving 2 milliseconds by using a binary protocol instead of JSON is irrelevant.
- The developers of MCP traded raw binary performance for flexibility and ease of use.
Comparison Summary
| Feature | JSON-RPC (MCP) | gRPC |
|---|---|---|
| Data Format | Human-readable JSON | Binary (Protobuf) |
| Primary Transport | stdio, HTTP/SSE, WebSockets | HTTP/2 (HTTP/1.1 only via gRPC-Web proxies) |
| Setup Effort | Low (no compiler needed) | High (requires protoc) |
| Debugging | Easy (plain text logs) | Hard (binary logs) |
| Local Tooling | Excellent (simple pipes) | Overkill (requires local networking) |
| Best For | AI Tools & Interop | Internal Microservices |
Conclusion: MCP chose JSON-RPC because it prioritizes the developer experience and local tool integration. It allows anyone to turn a simple script into an AI-capable tool in minutes without needing to understand complex networking or binary serialization.