A Brief History of MCP

In the world of GenAI, few innovations have reshaped the ecosystem as rapidly and profoundly as the Model Context Protocol. Born from a simple insight at Anthropic in mid-2024, MCP addressed a fundamental challenge: how might we connect the growing constellation of AI models to the digital tools and data they need to be truly useful? MCP has quietly revolutionized AI integration, transforming isolated models into collaborative participants in our digital workflows. This paper traces MCP's remarkable six-month journey from an open-source experiment to the industry standard embraced by fierce competitors like OpenAI and Google. Beyond a mere tech spec, MCP represents a watershed moment in AI development: one where community-driven innovation triumphs over proprietary silos, and where the vision of truly context-aware AI assistants began to materialize.

July–Nov 2024: Origins and Open-Source Launch of MCP

Conception (Mid-2024)

The idea for MCP took shape at Anthropic in mid-2024. Engineers David Soria Parra and Justin Spahr-Summers conceived a solution to the “M×N” integration problem for AI tools, drawing inspiration from the Language Server Protocol. By August 2024 they had early prototypes, aiming to let AI assistants (like Anthropic’s Claude) seamlessly access IDEs, databases, and other systems via a unified protocol.

Official Launch (Nov 25, 2024)

Anthropic publicly open-sourced the Model Context Protocol (MCP) on November 25, 2024. MCP was introduced as an open standard – “a single protocol to connect AI systems with the data they need”. The project’s GitHub organization was created with a formal specification and multi-language SDKs (Python, TypeScript, Java, C#, Kotlin). Anthropic’s Claude Desktop app (for Windows/Mac) immediately gained local MCP server support, allowing Claude to connect to MCP servers running on a user’s machine. To jumpstart adoption, Anthropic released a repository of reference MCP servers (connectors) for common tools and data sources. At launch, pre-built servers were provided for Google Drive, Slack, GitHub, Git, PostgreSQL, and Puppeteer (web browser), among others. These reference implementations (hosted under Anthropic’s GitHub, in partnership with Docker) showcased how MCP gives LLMs controlled access to files, databases, web search, etc.
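Under the hood, the protocol is built on JSON-RPC 2.0 carried over a transport such as stdio. As a rough sketch of what a client-to-server tool invocation looks like on the wire (the tool name and arguments below are illustrative, not from any specific reference server):

```python
import json

# MCP messages are JSON-RPC 2.0. A client invokes a server-side tool
# with a "tools/call" request; "query_database" and its arguments are
# hypothetical placeholders for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_database",
        "arguments": {"sql": "SELECT count(*) FROM users"},
    },
}

# Serialize for transmission over the chosen transport (e.g. stdio).
wire_message = json.dumps(request)
print(wire_message)
```

The server replies with a JSON-RPC response carrying the tool’s result, which the host application then feeds back to the model as context.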

Early Endorsements

The open-source release came with strong community-oriented messaging. Anthropic emphasized MCP as a collaborative project “open to contributions from the entire community”. Companies that had early access praised it: Block, Inc. (formerly Square) integrated MCP into internal systems prior to launch and lauded it as an open bridge connecting AI to real applications. Block’s CTO Dhanji Prasanna called open technologies like MCP “the foundation… to build agentic systems” that free users from mundane tasks. Another early adopter, Apollo (a fintech platform), also integrated MCP by launch. Meanwhile, developer tool companies Zed (code editor), Replit, Codeium, and Sourcegraph announced they were working to support MCP in their products. This broad backing signaled that MCP was intended to be an industry standard, not an Anthropic-only project. Within days, excitement spread in AI dev circles – “everyone from Copilot to Cognition to Cursor [was] adding support” in the initial flurry.

Dec 2024: First Open-Source Implementations and Clients

The open-source community moved quickly to experiment with MCP. Within a week of launch, Continue, an open-source AI coding assistant for VS Code, became the first client to fully support all MCP features. Continue’s team mapped MCP’s core concepts (Resources, Prompts, Tools) to its own plugin architecture and enabled easy configuration of external MCP servers. This allowed developers to, say, spin up a local SQLite database MCP server and have Continue query it from VS Code by simply updating a config.
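Client configuration followed a common pattern: point the app at a command that launches the server process. For example, registering the reference SQLite server in Claude Desktop’s `claude_desktop_config.json` looked roughly like this (the database path is illustrative):

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "/path/to/example.db"]
    }
  }
}
```

Clients like Continue adopted similar declarative configs, which is what made “MCP-ify a tool, then use it everywhere” such a low-friction workflow.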

Other projects quickly followed suit in late 2024:

  • Cursor IDE (open-source AI code editor) added MCP integration, letting its in-IDE assistant use tools like file system access. By December, Cursor could connect to MCP servers so that codebase queries and web searches were handled via standard MCP tools.

  • LibreChat (an open-source ChatGPT alternative) introduced an “Agents” mode using MCP, so users could connect chatbots to internet search or local data through MCP servers (the LibreChat Agents docs highlight one-click activation of MCP integrations).

  • ChatMCP (community project) provided a simple desktop UI for chatting with any LLM via MCP-connected tools.

  • Several devs created MCP client bridges for existing AI apps – e.g. an MCP-Bridge adapter was published to connect any OpenAI-API-based app to MCP servers. This allowed experimentation with ChatGPT itself (via API) controlling MCP tools.

On GitHub, activity surged. Anthropic’s MCP servers repository began receiving community contributions in December. Dozens of community-built MCP servers emerged for niche systems – from controlling Ableton Live (audio software) to automating Blender for 3D scenes. By year’s end, the community had shown that virtually any tool or API could be “MCP-ified.”

Jan–Feb 2025: Ecosystem Proliferation – Servers, Clients, and Aggregators

Explosion of MCP Servers

Early 2025 saw a rapid expansion in the number and variety of MCP servers. Anthropic’s reference connector repo grew to cover a wide range of use cases:

  • DevOps/Cloud: connectors for AWS resources, Azure Data Explorer, and Cloudflare were built.

  • Datastores: servers for databases like PostgreSQL and SQLite, vector DBs like Chroma and Milvus, and the graph DB Neo4j became available.

  • Productivity Apps: integrations for Slack, GitHub, Atlassian, Google, Notion, etc. were implemented, many by the community or the respective companies.

  • Web/Research: tools like Brave Search (web search API) and Browserbase (cloud browser automation) let agents browse the web.

  • AI/LLM Utilities: servers like Memory provided a persistent knowledge graph memory for chats, and Sequential Thinking enabled chain-of-thought reasoning via stepwise tool invocation.

  • Creative/Other: EverArt allowed image generation via AI models, AlphaVantage provided stock market data, and even Minecraft had an MCP interface developed to let agents act in-game (anecdotally demonstrated by enthusiasts).

By February, third-party companies began officially maintaining MCP servers for their platforms. For example, Apify released an MCP server giving access to its library of web scrapers, Box, Inc. built one for its content management API, and Neon (a serverless Postgres provider) published a hosted MCP server for querying databases in the cloud. The growing corporate support showed that vendors saw value in making their services accessible through the open MCP standard.

Rise of MCP Clients

On the client side, MCP compatibility became a must-have for AI apps. VS Code and JetBrains IDE plugins gained MCP support through projects like Continue, Cline, and Sourcegraph’s Cody, allowing AI coding assistants to tap into resources like database schemas or documentation via MCP. Standalone AI desktop apps proliferated: Anthropic’s Claude Desktop natively supported MCP at launch, and community apps like HyperChat and 5ire (open-source chat UIs for local LLMs) let users connect to MCP servers for extended functionalities. Even specialized domains joined in – for example, the open-source Home Assistant project (home automation) added MCP support to its voice assistant, enabling smart home voice commands to trigger web searches or fetch external data via MCP.

Public Aggregators and Indexes

With hundreds of tools and integrations blooming, community aggregators arose to index and organize the MCP ecosystem:

  • Awesome-MCP-Servers – a curated list launched on GitHub to catalog “awesome” MCP servers, both official and experimental. By early 2025 it listed dozens of servers across categories (file systems, databases, APIs, etc.), helping developers discover new integrations.

  • PulseMCP – a community-driven website that tracked MCP adoption. By Jan 2025 it listed over 200 public MCP servers and 200+ MCP-enabled clients with descriptions. PulseMCP allowed filtering by category and even popularity, serving as a “store” for MCP extensions.

  • Smithery – launched as a central MCP server registry and installer. Smithery provided a searchable registry of ~200 MCP servers and an open-source CLI to install them in one step. It simplified adding new connectors to Claude Desktop or agent frameworks. Smithery also collected usage stats, enabling ranking of the most popular MCP servers (e.g. those for Slack, Google Drive, and Filesystem were heavily installed). By aggregating metrics, it highlighted which integrations the community found most useful.

  • A dedicated r/MCP subreddit formed for discussion and support, reflecting the growing community interest.

Metrics of Growth

The proliferation was dramatic – as of April 2025, one registry (likely Smithery) listed over 4,400 distinct capabilities exposed via MCP servers. In a few months, the ecosystem had grown from a handful of connectors to thousands of tools, from code assistants to email plugins. This explosive growth led one tech observer to note “it’s rare to see such fast adoption across all major IDEs” and predicted commercial vendors would “scramble to add MCP servers” given the clear developer demand.

Feb–Mar 2025: Community Inflection Points and Accelerating Adoption

By early 2025, MCP had momentum, but a key inflection point came with concerted community events and endorsements:

AI Engineer Summit (Feb 26–27, 2025)

At an AI engineering summit, Anthropic’s team hosted a live MCP Workshop that became a viral sensation in the AI dev community. Lead engineer Mahesh Murag (author of many MCP servers) gave a deep dive into the protocol’s design and upcoming features. Live-tweets from the workshop “started going viral”, especially as the team announced a forthcoming official MCP registry and showcased advanced usage patterns. The 2-hour workshop attracted tens of thousands of online viewers (with nearly 300k combined views of the released recording). This event reignited excitement; developers who had only dabbled in MCP now rushed to build on it, seeing the breadth of tools and an official index on the horizon. The workshop underscored how “LLMs are most useful when connecting to the data you already have and software you already use”, as Anthropic’s CPO tweeted, reinforcing the MCP vision.

Community Endorsements

Influential figures in AI began explicitly backing MCP. The Latent Space podcast/blog published “Why MCP Won” in March 2025, analyzing Anthropic’s remarkably successful open-source strategy. It highlighted that MCP solved a real pain point (integrating context) in a way that previous attempts at standards had failed, and credited the open, community-driven approach for its victory. The Pragmatic Engineer newsletter similarly featured MCP as a major new “AI dev tools building block,” expecting it to boost developer productivity and agent capabilities across the industry. Such coverage further validated MCP in the eyes of both open-source developers and enterprise adopters.

Ecosystem Toolkits

Around this time, more agent development frameworks incorporated MCP. For instance, FastAgent launched as an MCP-native agent orchestration CLI, allowing developers to rapidly compose multi-step agents using any MCP-accessible tools. Existing AI workflow tools like Langflow and Superinterface updated to let users include MCP tools in their pipelines. This made it even easier to experiment with MCP without coding from scratch. An emerging pattern was combining MCP with new Agent-to-Agent (A2A) protocols to enable agents that not only use tools (via MCP) but also communicate with each other – a trend noted in technical forums exploring “AI agent protocol wars” (MCP for tool access vs. A2A for agent comms).

All these developments set the stage for MCP’s next big milestones: endorsement by the major AI model providers themselves.

Mar 26, 2025: OpenAI Announces Official MCP Support

A pivotal moment came when OpenAI – Anthropic’s rival and the maker of ChatGPT – publicly embraced MCP. On March 26, 2025, CEO Sam Altman announced that OpenAI would “add support for Anthropic’s Model Context Protocol” across its products. This news, first shared on Altman’s X (Twitter) account and then detailed by the tech press, marked a significant convergence in the AI industry. OpenAI immediately integrated MCP into its new Agents SDK (released for building AI agents), meaning developers using OpenAI’s SDK could plug into any MCP server for tools. Altman noted, “People love MCP and we are excited to add support across our products… available today in the Agents SDK, with ChatGPT desktop app and API support coming soon.” In practical terms, this meant ChatGPT’s forthcoming desktop application and API could leverage the entire ecosystem of MCP connectors – a user might allow ChatGPT to, say, retrieve data from a Notion document or run a SQL query via MCP, rather than being limited to OpenAI’s proprietary plugins.

The reception in the community was enthusiastic. Anthropic’s team welcomed OpenAI: “Excited to see the MCP love spread to OpenAI – welcome!” wrote Anthropic’s CPO. Observers noted that OpenAI adopting rival Anthropic’s standard was unprecedented – a testament to MCP’s widespread adoption and technical merit. TechCrunch dubbed it “OpenAI adopts rival’s standard”, emphasizing how MCP had emerged as a neutral, open solution valuable enough even for competitors to join. Within days of the announcement, developers reported that OpenAI’s agent examples could indeed interface with MCP servers. This move by OpenAI effectively put its stamp of approval on MCP as “the agentic AI standard”, as one analyst wrote. It also triggered a spike in community activity – many OpenAI developers, previously on the sidelines, began exploring MCP servers now that they could be used in familiar OpenAI workflows.

Apr 2025: Google and Industry-Wide Alignment

Just weeks after OpenAI’s news, Google also came on board. On April 9, 2025, Demis Hassabis (CEO of Google DeepMind) announced that Google’s upcoming Gemini AI models and its Agent Development Kit would support MCP. While Google had been developing its own Agent-to-Agent (A2A) protocol for multi-agent communication, it recognized MCP’s role for connecting agents to tools and data. Hassabis wrote “MCP is a good protocol and it’s rapidly becoming an open standard for the AI agentic era… Look forward to developing it further with the MCP team and others in the industry.” Google did not give an exact timeline, but the commitment was clear: MCP compatibility would be baked into Google’s AI offerings (e.g. enabling Gemini-powered assistants to call MCP tools for enterprise data access). At Google Cloud Next ’25, Google even showcased how its Agent Toolkit could use MCP to let Gemini-based agents securely access various Google Cloud services and third-party apps in a unified way.

By Spring 2025, all three major AI labs – Anthropic, OpenAI, and Google – had coalesced around MCP. What started as an open-source project was now a de facto industry standard. Competitors agreed it was better to collaborate on a common ecosystem of tools than to fragment the developer community with separate plugin frameworks. This consensus quelled fears of a protocol war and shifted focus to security and governance of MCP integrations (topics raised by enterprise analysts as MCP’s adoption grew).

Community Reaction and Next Steps

MCP’s mainstream acceptance further energized the open-source community. Many more MCP servers were created following Google’s announcement, often integrating Google’s own services (a flurry of Google Calendar, Sheets, and Gmail MCP connectors appeared on GitHub, unofficial but functional, anticipating future support). The number of public aggregators grew as well – e.g. MCP.so began ranking the most popular servers by install count, and OpenTools.ai published demonstrations on how combining MCP with agent frameworks can “supercharge the agentic AI revolution.” The focus started to shift from “can we connect X via MCP?” (almost everything was connected by now) to “how do we manage so many tools?” Companies like Solo.io even launched an “MCP Gateway” to help orchestrate and secure multiple MCP servers in enterprise settings.

Foray Favorites and Use Cases

Here are some of our favorite MCP Servers at Foray, and how we think they can be used.

  • Brave Web Search: Gives LLMs web search capabilities. We mostly use this for broad queries to identify URLs for further investigation.

  • FireCrawl: Scrapes and crawls URLs, converting them to markdown. We often use this as “step 2” in a deep research workflow, instructing our LLM to use Firecrawl to scrape the top 5-10 URLs identified in a previous Brave Web Search step.

  • Memory Graph: There are lots of memory MCPs out there, but this one is uniquely deep. The GitHub repo details powerful use cases for both humans and AIs consuming results from this tool.

  • Sequential Thinking: Instead of connecting to an external system, this inventive tool uses stepwise tool invocation to force chain-of-thought reasoning. We often use this as a “reflection step” in agentic planning workflows.

  • YouTube Transcripts: This is fantastic for quickly transcribing long videos to markdown for interaction with LLMs.

  • Zoom Transcripts: This MCP (developed by Foray Consulting) is a sophisticated tool for indexing and interacting with Zoom Cloud recording content, enabling LLM adopters to skip expensive middleware solutions like Gong in favor of advanced, context-rich cross-conversation synthesis.

  • Google Workspace: This magnificently-designed MCP fully extends Gmail, Google Calendar, Contacts, and Google Drive across multiple Google Accounts (with great OAuth handling). This MCP can be used to power an “Executive Assistant” agent, when combined with good prompt engineering.

  • Atlassian Cloud: This is a cluster of three MCP servers (Jira, Confluence, and JSM Assets) that allow users to read, write, and admin the core components of the Atlassian Cloud. We have used this for everything from Backlog Refinement to Customer Service Chatbots to Asset Management.

  • GitHub Official: After much ado, GitHub themselves finally released an official MCP server. One of our favorite use cases here is a bit meta – we’ll grab two or more distinct MCP servers and use a code-smart LLM (like Claude 3.7 Sonnet) to Frankenstein them together, combining bits and pieces to accomplish a very specific desired result.

  • Git: Distributed version control is an obvious use case for developers, but it is surprisingly powerful for business users too. When you start thinking in terms of “X as Code” (where “X” could be anything from company policy to knowledge articles to IT configurations, you name it), Git becomes a flexible yet powerful change control mechanism.

Conclusion

In less than half a year, MCP evolved from an open-source proposal into a unifying protocol adopted across the AI landscape. Key milestones – Anthropic’s open-source release (Nov 2024), rapid community uptake in open-source projects (late 2024), the flourishing of MCP servers/clients and aggregator hubs (early 2025), and landmark endorsements by OpenAI and Google (Mar–Apr 2025) – all contributed to its proliferation. MCP’s success can be attributed to its timely solution for context integration, an open governance model inviting community innovation, and the network effects of a shared standard. By spring 2025, MCP had become a single, standardized port through which any AI application – from a coding assistant in VS Code to a chatbot like ChatGPT – can access an ever-growing array of tools and data sources. The open-source ecosystem around MCP now teems with thousands of connectors and applications, all interoperable. This chronology underscores how a well-designed open protocol, championed by community and industry alike, sparked a rapid transformation in the capabilities of AI systems – empowering them with context and tools as never before. The MCP journey continues, but its early history already stands as a case study in collaborative innovation, marking a major inflection point in the pursuit of more useful, integrated AI.
