Model Context Protocol (MCP) is emerging as a standard for connecting AI models to external tools and data sources. Instead of building custom integrations for every AI system, MCP provides a common protocol. It’s like USB for AI.
Here’s what MCP means for building AI applications.
What Is MCP?
The Problem It Solves
integration_problem:
  before_mcp:
    - Custom integration per AI provider
    - Different APIs for each tool
    - Duplicated effort across projects
    - Inconsistent capabilities
  with_mcp:
    - Standard protocol for tool access
    - Write once, work with any MCP client
    - Consistent interface
    - Ecosystem of pre-built connectors
MCP Architecture
mcp_architecture:
  components:
    client:
      description: "AI application that needs tools"
      examples: ["Claude Desktop", "VS Code extension", "Custom apps"]
    server:
      description: "Provides tools and data to clients"
      examples: ["File system", "Database", "API connector"]
    protocol:
      description: "Standard communication format"
      features: ["Tool discovery", "Invocation", "Response handling"]
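Under the hood, clients and servers exchange JSON-RPC 2.0 messages: the client asks what tools exist, then asks the server to run one. A minimal sketch of what that looks like on the wire, with illustrative values; check the MCP spec for the exact message schema:

import json

# Illustrative JSON-RPC 2.0 messages, expressed as Python dicts.
# Field values are examples only; the spec defines the full schema.
list_tools_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

call_tool_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "query",
        "arguments": {"sql": "SELECT count(*) FROM users"},
    },
}

print(json.dumps(call_tool_request, indent=2))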
Building MCP Servers
Basic Server
import json

from mcp import Server, Tool, Resource


class DatabaseServer(Server):
    """MCP server exposing database operations."""

    def __init__(self, db_connection):
        super().__init__(name="database")
        self.db = db_connection

    def list_tools(self) -> list[Tool]:
        return [
            Tool(
                name="query",
                description="Execute a read-only SQL query",
                parameters={
                    "type": "object",
                    "properties": {
                        "sql": {
                            "type": "string",
                            "description": "SQL SELECT query"
                        }
                    },
                    "required": ["sql"]
                }
            ),
            Tool(
                name="describe_table",
                description="Get schema for a table",
                parameters={
                    "type": "object",
                    "properties": {
                        "table_name": {"type": "string"}
                    },
                    "required": ["table_name"]
                }
            )
        ]

    async def call_tool(self, name: str, arguments: dict) -> str:
        if name == "query":
            # Validate read-only before touching the database
            if not self._is_read_only(arguments["sql"]):
                return "Error: Only SELECT queries allowed"
            result = await self.db.execute(arguments["sql"])
            return json.dumps(result)
        elif name == "describe_table":
            schema = await self.db.get_schema(arguments["table_name"])
            return json.dumps(schema)
        return f"Error: Unknown tool '{name}'"

    def _is_read_only(self, sql: str) -> bool:
        # Simple prefix check; a production server should use a SQL parser
        return sql.strip().lower().startswith("select")

    def list_resources(self) -> list[Resource]:
        return [
            Resource(
                uri="db://tables",
                name="Available Tables",
                description="List of all tables in the database"
            )
        ]
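To expose this server to a client, it needs an entry point. Here is a minimal sketch; connect_to_database and run_stdio are assumptions for illustration, not part of any published API, so substitute whatever your SDK and database layer actually provide:

import asyncio

# Hypothetical entry point for the illustrative DatabaseServer above.
# `connect_to_database` and `run_stdio` are placeholder names.
async def main():
    db = await connect_to_database()   # assumed async DB helper
    server = DatabaseServer(db)
    await server.run_stdio()           # assumed stdio transport loop

if __name__ == "__main__":
    asyncio.run(main())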
Server Configuration
{
  "mcpServers": {
    "database": {
      "command": "python",
      "args": ["-m", "myapp.mcp_server"],
      "env": {
        "DATABASE_URL": "postgresql://..."
      }
    },
    "filesystem": {
      "command": "mcp-server-filesystem",
      "args": ["--root", "/path/to/project"]
    },
    "github": {
      "command": "mcp-server-github",
      "env": {
        "GITHUB_TOKEN": "..."
      }
    }
  }
}
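A host application reads this file and launches each configured server as a child process, then speaks the protocol over its stdin/stdout. Hosts like Claude Desktop do this for you; the rough sketch below shows the idea, with the config file name as an assumption:

import json
import os
import subprocess

# Hypothetical loader: spawn each configured server as a subprocess.
with open("mcp_config.json") as f:       # file name is an assumption
    config = json.load(f)

processes = {}
for name, spec in config["mcpServers"].items():
    env = {**os.environ, **spec.get("env", {})}
    processes[name] = subprocess.Popen(
        [spec["command"], *spec.get("args", [])],
        stdin=subprocess.PIPE,           # protocol messages flow over stdio
        stdout=subprocess.PIPE,
        env=env,
    )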
Using MCP in Applications
Client Integration
from mcp import Client


class AIAssistant:
    """AI assistant using MCP for tool access."""

    def __init__(self, llm, mcp_servers: list[str]):
        self.llm = llm
        self.mcp_client = Client()
        self.servers = mcp_servers

    async def connect(self):
        for server in self.servers:
            await self.mcp_client.connect(server)
        # Discover available tools across all connected servers
        self.tools = await self.mcp_client.list_all_tools()

    async def chat(self, message: str) -> str:
        # Include MCP tools in the LLM context
        response = await self.llm.generate(
            message=message,
            tools=self._format_tools_for_llm()
        )

        # Execute any tool calls via MCP
        if response.tool_calls:
            results = []
            for call in response.tool_calls:
                result = await self.mcp_client.call_tool(
                    call.name,
                    call.arguments
                )
                results.append(result)
            # Continue the conversation with the tool results
            return await self.llm.continue_with_results(response, results)

        return response.content
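Putting it together, usage might look like the snippet below. The my_llm object and the server names are placeholders for whatever model wrapper and MCP servers you actually run:

import asyncio

async def demo():
    # `my_llm` and the server names are placeholders for your own stack
    assistant = AIAssistant(my_llm, ["database", "filesystem"])
    await assistant.connect()
    answer = await assistant.chat("How many users signed up last week?")
    print(answer)

asyncio.run(demo())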
MCP Ecosystem
mcp_ecosystem:
  official_servers:
    - Filesystem access
    - Git operations
    - Database queries
    - Web browsing
  community_servers:
    - Slack integration
    - Jira/Linear
    - Cloud APIs
    - Custom internal tools
  benefits:
    - Pre-built integrations
    - Security reviewed
    - Consistent interface
    - Easy to extend
Key Takeaways
- MCP standardizes AI-tool integration
- Write connectors once, use everywhere
- Growing ecosystem of pre-built servers
- Security is enforced at the server boundary rather than per integration
- Reduces integration effort significantly
- Enables composable AI applications
- Watch adoption across AI providers
- Consider building MCP servers for internal tools
MCP is infrastructure for AI integration. Learn it now.