MCP Hit 97 Million Installs: The Protocol War Is Over

 

In November 2024, Anthropic quietly open-sourced a protocol for connecting AI models to external tools. Sixteen months later, Model Context Protocol (MCP) hit 97 million monthly SDK downloads — and every major AI provider ships MCP-compatible tooling. The protocol war everyone expected never happened. MCP just won.

Here's how we got here, what the ecosystem looks like now, and what it means for developers building AI-powered tools in 2026.


📋 What You'll Need

  • Basic understanding of AI agents — how LLMs interact with external tools and APIs
  • Familiarity with MCP concepts — if you're new, start with MCP Servers Explained
  • Optional — experience building MCP servers (Build a Custom MCP Server)

📈 The Numbers: From 2 Million to 97 Million

When Anthropic launched MCP in November 2024, it saw roughly 2 million downloads in its first month. Respectable for a new protocol. Not world-changing.

Then OpenAI adopted it in March 2025. Google followed a month later. The growth curve went vertical.

Date       Monthly Downloads   Milestone
Nov 2024   ~2M                 Anthropic launches MCP with Python/TypeScript SDKs
Mar 2025   ~5M                 OpenAI adopts MCP across ChatGPT, Agents SDK
Apr 2025   ~8M                 Google DeepMind confirms Gemini MCP support
Nov 2025   ~40M                Major spec update: async ops, statelessness, server identity
Dec 2025   ~60M                Anthropic donates MCP to Linux Foundation's AAIF
Mar 2026   97M                 Current milestone

That's 4,750% growth in 16 months. RedMonk called it "the fastest adopted standard we have ever seen." For comparison, Kubernetes took nearly four years to reach comparable deployment density.

The official MCP registry now lists 6,400+ servers — from database connectors and CI/CD tools to Slack integrations and blockchain APIs.


🌐 Who's Using MCP: The Full Ecosystem

The question isn't "who supports MCP" anymore. It's "who doesn't?"

AI Providers

Provider     MCP Integration
Anthropic    Claude Desktop, Claude Code — the original creator
OpenAI       ChatGPT desktop, Agents SDK, Responses API
Google       Gemini support confirmed by Demis Hassabis
Microsoft    GitHub Copilot in VS Code, Copilot Studio, Azure AI
Amazon/AWS   Linux Foundation membership, AWS integrations

Developer Tools

Every major AI-powered IDE now speaks MCP:

  • Cursor — deep MCP integration, one of the most popular AI-first editors
  • Windsurf (Codeium) — went all-in on MCP
  • VS Code — GitHub Copilot uses MCP for tool integration
  • Replit — AI coding assistants with real-time project context via MCP
  • Sourcegraph — code intelligence served through MCP

Enterprise & Beyond

  • Pinterest — runs a production-scale MCP ecosystem with domain-specific servers and a central registry, saving thousands of engineering hours per month
  • Zapier — announced MCP support for workflow automation
  • Elgato — Stream Deck 7.4 (April 2026) added native MCP support, making it the first consumer hardware with MCP integration

Tip: The Elgato integration is significant — it signals MCP moving beyond developer tools into consumer hardware. AI assistants can now control physical devices through the same protocol they use to query databases.

🔧 Why MCP Won: The M×N Problem

Before MCP, connecting AI tools to data sources was a combinatorial nightmare.

If you had N AI tools (Claude, Copilot, Cursor, etc.) and M data sources (GitHub, Slack, Postgres, etc.), you needed N × M custom integrations. Every tool vendor built their own connectors. Nothing was reusable.

Before MCP:

  Claude   ────►  GitHub     (custom integration)
  Claude   ────►  Slack      (custom integration)
  Copilot  ────►  GitHub     (different custom integration)
  Copilot  ────►  Slack      (different custom integration)
  Cursor   ────►  GitHub     (yet another custom integration)

    N tools × M sources = N×M integrations

MCP collapsed this to N + M:

After MCP:
┌──────────┐     ┌──────────┐     ┌──────────┐
│  Claude   │     │          │     │  GitHub  │
│  Copilot  │────►│   MCP    │◄────│  Slack   │
│  Cursor   │     │ Protocol │     │ Postgres │
└──────────┘     └──────────┘     └──────────┘
    N clients + M servers = N+M integrations

Build one MCP server for GitHub, and every MCP-compatible client can use it. Build one MCP client, and it can talk to every MCP server. That's the pitch that convinced the entire industry.
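
The arithmetic behind that pitch is easy to check. A small illustration in Python — the client and source names are just examples, not an exhaustive list:

```python
# Illustrative counts for the M×N integration problem.
clients = ["Claude", "Copilot", "Cursor"]              # N AI tools
sources = ["GitHub", "Slack", "Postgres", "Jira"]      # M data sources

# Pre-MCP: every client needs its own connector for every source.
point_to_point = len(clients) * len(sources)

# With MCP: one client implementation plus one server per source.
with_mcp = len(clients) + len(sources)

print(point_to_point)  # 12 custom integrations
print(with_mcp)        # 7 MCP implementations
```

Adding a fourth client to the pre-MCP world costs four new connectors; in the MCP world it costs exactly one.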


🏗️ Technical Evolution: What Changed in 2026

MCP isn't the same protocol that launched in 2024. The spec has matured significantly.

Protocol Architecture

MCP uses a client-server model with JSON-RPC 2.0 as the wire format. The spec defines four capability types; servers expose the first three, while sampling lets a server route a request back through the client's LLM:

Capability   Purpose              Example
Resources    Read-only data       File contents, database records
Tools        Executable actions   Run a query, create a PR, send a message
Prompts      Reusable templates   Pre-built prompt patterns for common tasks
Sampling     Reverse LLM calls    Server asks the client's LLM to generate text
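
On the wire, these capabilities are exercised as JSON-RPC 2.0 messages. Here is a sketch of a tools/call exchange built with Python's standard library; the tool name create_pr, its arguments, and the response text are hypothetical:

```python
import json

# Hypothetical client request invoking a tool via the standard
# MCP "tools/call" JSON-RPC method.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_pr",
        "arguments": {"repo": "octo/demo", "title": "Fix typo"},
    },
}

# A matching server response: tool results come back as content blocks.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [{"type": "text", "text": "Pull request created"}],
        "isError": False,
    },
}

wire = json.dumps(request)          # what actually travels over the transport
print(json.loads(wire)["method"])   # tools/call
```

The same envelope carries resources/read, prompts/get, and sampling requests; only the method and params change.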

Transport Layer Changes

The biggest technical shift: Streamable HTTP is replacing Server-Sent Events (SSE).

  • Original transports: stdio (local) and HTTP+SSE (remote)
  • Current: Streamable HTTP for remote connections (SSE deprecated, sunset June 2026)
  • Challenge: Stateful sessions fight with load balancers — horizontal scaling needs workarounds

The 2026 roadmap prioritizes making Streamable HTTP work statelessly across multiple server instances behind load balancers. This is the biggest remaining technical hurdle for enterprise adoption.
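
To see why stateful sessions fight load balancers, consider a sketch where session state lives in a shared store keyed by the spec's Mcp-Session-Id header, so any server instance can serve any request. Everything below except that header name is invented for illustration:

```python
# Stand-in for an external store (e.g. a cache) shared by all instances.
# With per-instance in-memory state, a load balancer would break sessions.
shared_store: dict[str, dict] = {}

def handle(instance: str, headers: dict) -> str:
    """Serve a request on any instance by loading state from the shared store."""
    session = headers.get("Mcp-Session-Id")
    state = shared_store.setdefault(session, {"calls": 0})
    state["calls"] += 1
    return f"{instance} served call {state['calls']}"

# The load balancer routes successive requests to different instances,
# yet the session continues seamlessly:
print(handle("server-a", {"Mcp-Session-Id": "abc"}))  # server-a served call 1
print(handle("server-b", {"Mcp-Session-Id": "abc"}))  # server-b served call 2
```

This is the essence of the roadmap item: once no single instance owns the session, horizontal scaling stops needing sticky-session workarounds.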

November 2025 Spec Update

The major spec revision added:

  • Async operations — long-running tools that don't block the connection
  • Statelessness support — servers that don't require session affinity
  • Server identity — cryptographic verification of server authenticity
  • Community-driven registry — official directory of verified MCP servers

Warning: If you built MCP servers using SSE transport, start migrating to Streamable HTTP now. SSE is officially deprecated and sunsets in June 2026.

🤝 MCP vs A2A: Complementary, Not Competing

The biggest misconception in the AI protocol space: that Google's A2A (Agent-to-Agent) protocol competes with MCP. It doesn't.

              MCP                                    A2A
Purpose       Agent-to-tool communication            Agent-to-agent communication
Created by    Anthropic (Nov 2024)                   Google (Apr 2025)
Governed by   Linux Foundation (AAIF)                Linux Foundation
Backed by     OpenAI, Google, Microsoft, AWS, Block  OpenAI, Anthropic, Microsoft, AWS, Block
Use case      AI reads a database, triggers CI/CD,   Two AI agents coordinate on a
              searches code                          multi-step task

MCP connects agents to tools. A2A connects agents to each other. In production multi-agent systems, you'll use both.

IBM's Agent Communication Protocol merged into A2A in August 2025, further consolidating the landscape. The "protocol war" narrative is over — the industry landed on two complementary standards backed by the same companies.


🏭 Production Reality: Pinterest's MCP Deployment

The best public case study for MCP at scale comes from Pinterest, which runs a full production MCP ecosystem.

Their setup:

  • Domain-specific MCP servers — each team owns servers for their domain (ads, content, search)
  • Central registry — a single directory of all available MCP servers
  • Human-in-the-loop approval — agents request access, humans approve sensitive operations
  • Result: Thousands of engineering hours saved per month

This is the pattern emerging for enterprise MCP: don't give agents blanket access to everything. Build scoped servers per domain, require approval for high-impact actions, and centralize discovery.
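
That pattern can be sketched in a few lines. This is not Pinterest's actual code; every name below is invented to illustrate scoped discovery plus an approval gate:

```python
# Hypothetical central registry: each domain team registers only its own tools.
registry = {
    "ads": ["ads.get_campaign", "ads.update_budget"],
    "search": ["search.query_index"],
}

# Hypothetical set of high-impact tools that require human sign-off.
SENSITIVE = {"ads.update_budget", "content.delete_pin"}

def discover(domain: str) -> list[str]:
    """Scoped discovery: an agent only sees tools for its domain."""
    return registry.get(domain, [])

def call_tool(tool: str, approved: bool = False) -> str:
    """Human-in-the-loop gate: sensitive tools wait for explicit approval."""
    if tool in SENSITIVE and not approved:
        return "pending_approval"
    return "executed"

print(discover("ads"))                                # scoped tool list
print(call_tool("ads.update_budget"))                 # pending_approval
print(call_tool("ads.update_budget", approved=True))  # executed
```

The important design choice is that the deny path is the default: an agent can discover a sensitive tool, but executing it requires a human to flip the approval flag.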


🗺️ The 2026 Roadmap

The MCP team published their 2026 priorities. Four themes:

  1. Transport scalability — stateless Streamable HTTP that works behind load balancers without session affinity
  2. Task lifecycle management — standardized patterns for long-running, multi-step agent tasks
  3. Governance maturation — Working Groups, Spec Enhancement Proposals, and community-driven decision-making under the Linux Foundation
  4. Enterprise readiness — audit trails, SSO-integrated authentication, and compliance tooling

The donation to the Agentic AI Foundation (AAIF) under the Linux Foundation in December 2025 was the key governance move. Anthropic no longer controls MCP alone — it's now co-governed with Block, OpenAI, and the broader community.

Tip: Watch the MCP Dev Summit recordings for implementation details. The Python SDK (v1.27.0+) includes StreamableHTTP idle timeout, RFC 8707 OAuth resource validation, and TasksCallCapability backport.

🚀 What's Next

  • 🔌 Build your first MCP server — follow our Python & TypeScript tutorial to get started
  • 🔒 Secure your MCP servers — with 97M installs, security matters. Read about AI agent security risks
  • 📡 Migrate from SSE to Streamable HTTP — the June 2026 deadline is real, start now
  • 🏢 Explore enterprise patterns — Pinterest's domain-scoped server pattern is the emerging best practice
  • 🎮 Watch consumer hardware — Elgato's Stream Deck integration hints at MCP becoming a universal device protocol

Already using MCP? Dive deeper with our guide on how MCP servers actually work and learn to build your own custom MCP server.




