The future of AI innovation isn’t driven by larger models but by standardization, a more subtle yet powerful force.
Introduced by Anthropic in November 2024, the Model Context Protocol (MCP) aims to standardize the interaction between AI applications and the external world beyond their training data. Similar to how HTTP and REST standardized web application connections to services, MCP establishes a uniform way for AI models to connect with tools.
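To make the analogy concrete: MCP messages are JSON-RPC 2.0, and the method names below (`tools/list`, `tools/call`) come from the protocol itself. Everything else in this sketch — the weather tool, its schema, and the handler — is invented for illustration; a real server would be built on one of the official SDKs rather than hand-rolled like this.

```python
# A toy sketch of the JSON-RPC 2.0 message shapes MCP standardizes on.
# The methods (tools/list, tools/call) follow the MCP spec; the weather
# tool and its handler are invented here purely for illustration.
import json

TOOLS = [{
    "name": "get_weather",
    "description": "Return current weather for a city.",
    "inputSchema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}]

def handle(request: dict) -> dict:
    """Dispatch one JSON-RPC request the way an MCP server would."""
    if request["method"] == "tools/list":
        result = {"tools": TOOLS}
    elif request["method"] == "tools/call":
        args = request["params"]["arguments"]
        # A real server would call the vendor's public API here.
        result = {"content": [{"type": "text",
                               "text": f"Sunny in {args['city']}"}]}
    else:
        return {"jsonrpc": "2.0", "id": request["id"],
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

# Any MCP-speaking client can discover and invoke tools the same way.
listing = handle({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
call = handle({"jsonrpc": "2.0", "id": 2, "method": "tools/call",
               "params": {"name": "get_weather",
                          "arguments": {"city": "Oslo"}}})
print(json.dumps(call["result"], indent=2))
```

The point of the standard is that the client never needs to know which vendor sits behind `handle` — discovery and invocation look identical for every server.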
While many articles explain what MCP is, they often miss the crucial point: MCP is a standard. Standards don't just organize technology; they create growth opportunities. Those who adopt them early ride the wave of progress, while those who ignore them fall behind. This article delves into why MCP matters now, the challenges it introduces, and how it is already reshaping the ecosystem.
How MCP Shifts from Chaos to Context
Meet Lily, a product manager at a cloud infrastructure company. She manages projects across various tools like Jira, Figma, GitHub, Slack, Gmail, and Confluence, struggling to keep up with updates.
By 2024, Lily recognized the potential of large language models (LLMs) in synthesizing information. She saw an opportunity to automate updates, communications, and responses by feeding all her team’s tools into a model. However, each model had its unique way of connecting to services, tying her deeper to specific vendor platforms. Integrating new tools like Gong transcripts meant creating more custom connections, complicating a potential switch to a different LLM in the future.
With the launch of MCP by Anthropic, an open protocol for standardizing context flow to LLMs, Lily’s workflow transformed. Backed by OpenAI, AWS, Azure, Microsoft Copilot Studio, and soon Google, MCP gained rapid adoption. Official SDKs are available for Python, TypeScript, Java, C#, Rust, Kotlin, and Swift, with community SDKs for Go and others following suit.
Today, Lily efficiently manages her tasks through Claude, connected to her work apps via a local MCP server. Reports draft themselves, and updates are just a prompt away. She can seamlessly switch between models without losing integrations, enhancing her productivity and flexibility.
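Concretely, a local setup like Lily's is often little more than configuration: desktop MCP clients spawn each server as a subprocess and talk to it over stdio. The fragment below follows the shape of Claude Desktop's `claude_desktop_config.json`; the server packages and the placeholder token are illustrative, not an endorsement of specific servers.

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "<token>" }
    },
    "slack": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-slack"]
    }
  }
}
```

Swapping the underlying model later leaves this file — and every integration it describes — untouched.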
The Impact and Implications of a Standard
Lily’s story highlights a universal truth: fragmented tools and vendor lock-ins are undesirable. MCP offers the freedom to choose the best tools without the burden of rebuilding integrations every time a model changes.
However, with standards come consequences.
First, SaaS providers without robust public APIs risk obsolescence: MCP servers are built on top of those APIs, so a strong public API is now table stakes for supporting AI applications.

Second, AI application development cycles accelerate sharply. Instead of writing custom integration code, developers can plug ready-made MCP servers into readily available clients, streamlining testing and deployment.

Third, switching costs shrink. Because integrations are decoupled from any single model, organizations can migrate between LLMs without rebuilding infrastructure — and new LLM providers can focus on raw performance while leveraging the existing MCP ecosystem.
Addressing Challenges with MCP
Every standard introduces new challenges or leaves existing ones unresolved, and MCP is no exception.
Trust is Crucial: With numerous MCP registries offering community-maintained servers, knowing who controls the server you connect to is essential — share context with the wrong party and a data breach can follow. SaaS companies should provide official servers, and developers should prefer those official, trusted sources over unvetted alternatives.
Quality Varies: APIs evolve, and a poorly maintained MCP server can fall out of sync with them, degrading the metadata LLMs depend on to choose and call tools correctly. The remedy mirrors the trust problem: SaaS companies should update official servers alongside their API changes, and developers should rely on those official sources.
Optimizing MCP Servers: Overloading a single server with multiple tools increases costs and decreases utility, overwhelming models with excessive choices. Task-focused, smaller servers are more effective in enhancing performance and reducing confusion.
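A rough way to see the cost side of that argument: every tool a server exposes adds its name, description, and schema to the manifest the model must read before picking a tool. The figures below are invented, but the scaling is real — a do-everything server hands the model several times more schema to wade through than a task-focused one.

```python
# Back-of-envelope illustration: manifest size grows with every exposed tool.
# Tool names and schemas here are invented purely for illustration.
import json

def tool(name: str) -> dict:
    return {
        "name": name,
        "description": f"Does {name.replace('_', ' ')}.",
        "inputSchema": {"type": "object",
                        "properties": {"query": {"type": "string"}}},
    }

big_server = [tool(f"tool_{i}") for i in range(40)]   # one do-everything server
small_server = [tool(f"tool_{i}") for i in range(5)]  # one task-focused server

big_bytes = len(json.dumps({"tools": big_server}))
small_bytes = len(json.dumps({"tools": small_server}))
print(big_bytes, small_bytes)  # the focused manifest is several times smaller
```

Smaller manifests mean fewer tokens spent per request and fewer near-duplicate tools for the model to confuse.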
Authorization and Identity Challenges: MCP does not solve authorization and identity on its own — an AI application acting through an MCP server inherits whatever access its credentials grant. Human oversight therefore remains necessary for consequential actions, where a model's misinterpretation of a prompt could otherwise turn into an irreversible mistake.
The Future of MCP
MCP represents a significant shift in AI application infrastructure, offering a self-reinforcing cycle of growth and innovation. As new tools, platforms, and registries emerge, simplifying the development and integration of MCP servers, teams embracing the protocol will enjoy faster product deployment with seamless integration capabilities. Companies supporting public APIs and official MCP servers can play a pivotal role in this integration journey, while late adopters may struggle to stay relevant.
Noah Schwartz serves as the head of product at Postman.