Create a Python MCP server to expose APIs for ChatGPT and Claude Desktop integration
The MCP Python Server project is an API wrapper designed to expose tools through the Model Context Protocol (MCP). This protocol provides a standardized way for MCP-capable AI applications, such as Claude Desktop or ChatGPT Desktop, to interact with specific data sources or tools via an external API. The server lets developers create and serve custom tools that integrate directly into these AI platforms, extending their functionality.
At its core, the MCP Python Server exposes custom tools through the protocol and wires them into desktop clients via a claude.json configuration file. The architecture is built on the official mcp[cli] package, available via pip or uv, which handles the interaction between the server and the client. For example, a typical tool could be defined as follows:
import httpx
from mcp.server.fastmcp import FastMCP

tool = FastMCP("API Wrapper")
@tool.tool(description="A tool for consulting external APIs.")
async def consultar_api(param: str) -> str:
    """Asynchronously consult an external API with the provided parameter."""
    async with httpx.AsyncClient() as client:
        response = await client.get("https://api.example.com/consulta", params={"q": param})  # placeholder endpoint; point this at the API you want to expose
        return response.text
This implementation ensures that tools are both functional and understandable, adhering closely to MCP standards.
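To run the file as a standalone script, the server typically ends with an entry point. A minimal sketch, assuming the FastMCP instance above is named tool:

if __name__ == "__main__":
    # Serve over stdio so desktop clients can launch the server as a subprocess.
    tool.run()

The mcp dev and mcp run commands shown later can also launch the server without this block, since they import the server object themselves.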
pip install "mcp[cli]"
uv init mcp-api-server
cd mcp-api-server
uv add "mcp[cli]"
mcp install mi_script.py
You can extend this with environment variables if necessary:
mcp install mi_script.py -f .env
Ensure that all dependencies are installed as well:
pip install -r requirements.txt
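The repository's actual requirements.txt is not reproduced here; as a rough sketch, a minimal version for the examples in this guide would only need the MCP SDK and the HTTP client:

mcp[cli]
httpx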
Create a .env file for optional environment variable setup:
API_KEY=your-api-key-here
API_URL=https://api.yourdomain.com
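Inside server.py these values can be read from the process environment; they are injected either by mcp install -f .env or by the env block of the client configuration shown later. A minimal sketch using the variable names from the example above:

import os

API_KEY = os.getenv("API_KEY", "")
API_URL = os.getenv("API_URL", "https://api.yourdomain.com")  # falls back to the example URL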
Assume you want to build a tool that fetches real-time financial data from an external API. You could define such a function in the server.py file and register it with the @tool.tool decorator, as follows:
@tool.tool(description="Fetch real-time stock prices for a ticker symbol.")
async def obtener_precios_acciones(param: str) -> str:
    async with httpx.AsyncClient() as client:
        response = await client.get(f"https://api.stockmarket.com/prices?symbol={param}")
        return response.text  # return the JSON payload as text to match the declared return type
This example demonstrates fetching stock prices, which can then be displayed or used within the AI application to inform financial decision-making.
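Because the decorator registers the coroutine and, in current SDK versions, returns it unchanged, you can exercise the function in a quick scratch script without any MCP client; the AAPL ticker is only an example:

import asyncio
from server import obtener_precios_acciones  # assumes the tool lives in server.py

print(asyncio.run(obtener_precios_acciones("AAPL")))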
A tool that generates translations in multiple languages is another natural fit for this server. The following stub shows how such a tool might look:
@tool.tool(description="Translate text into a target language.")
async def traducir(texto: str, lang: str) -> str:
    # Stub implementation; replace with a call to a real translation service.
    return f"Translated: {texto} → {lang}"
This allows the AI to seamlessly integrate language translation capabilities into its workflow.
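If you later want real translations, the stub can be replaced with a version that delegates to an HTTP translation service. The endpoint, payload, and response fields below are placeholders rather than a real provider's API; the sketch only shows how the .env credentials and httpx fit together:

import os
import httpx

@tool.tool(description="Translate text into a target language.")
async def traducir(texto: str, lang: str) -> str:
    async with httpx.AsyncClient() as client:
        # Hypothetical endpoint and schema; substitute your translation provider here.
        response = await client.post(
            "https://api.example-translate.com/v1/translate",
            json={"text": texto, "target": lang},
            headers={"Authorization": f"Bearer {os.getenv('API_KEY', '')}"},
        )
        return response.json().get("translation", "")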
To enable seamless integration with AI clients such as Claude Desktop and ChatGPT Desktop, ensure that the appropriate configuration is in place. For example:
claude.json configuration:
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem"],
      "env": {"API_KEY": "MI_API_KEY"}
    },
    "myServer": {
      "command": "uv",
      "args": ["run", "--with", "mcp[cli]", "mcp", "run", "server.py"],
      "env": {"API_KEY": "SECRETEXAMPLE"}
    }
  }
}
Place this file in the respective application's configuration directory:
Windows: %APPDATA%\Claude\claude.json
macOS/Linux: ~/.claude/claude.json
This setup allows your custom tools to be directly invoked by the AI application, enhancing its capabilities.
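Before restarting the client, it can be worth confirming that the file parses as valid JSON; one quick check from a terminal using Python's built-in tool:

python -m json.tool claude.json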
The following table outlines compatibility with different MCP clients and their features:
MCP Client | Resources | Tools | Prompts | Status |
---|---|---|---|---|
Claude Desktop | ✔️ | ✔️ | ✔️ | Full Support |
Continue | ✔️ | ✔️ | ❌ | Partial Support |
Cursor | ❌ (No MCP) | ✔️ | ❌ | Tool Only |
This compatibility matrix helps you choose the appropriate setup for each client, balancing resource usage and functionality.
Creating an .env file to hold sensitive information like API keys is crucial for security. This file contains settings such as:
API_KEY=your-secret-key-here
You can run the server in development mode to quickly test changes:
mcp dev server.py
For production, use:
mcp run server.py
Or with uv, which resolves the dependency on the fly:
uv run --with "mcp[cli]" mcp run server.py
Q: How do I integrate an external API into the AI application?
A: Define a tool in server.py, such as an asynchronous function that queries the API, and register it with the @tool.tool decorator.
Q: What does the MCP configuration file look like for integrating my server?
A: See the claude.json example above: add an entry under mcpServers with the command, args, and env needed to launch your server.
Q: Can I run the server on Windows/Linux/MacOS?
A: Yes. The server is plain Python, so it runs on any platform where Python and the mcp package are available.
Q: How do I secure sensitive data during setup?
A: Keep API keys in a .env file or in the env section of the client configuration rather than hard-coding them in server.py.
Q: Are there any limitations with the integration across different MCP clients?
A: Support varies by client; see the compatibility table above. For example, not every client supports resources or prompts.
To contribute new tools, define each one as an asynchronous function in server.py and register it with the @tool.tool decorator, following the examples above. For deeper learning, explore the official Model Context Protocol documentation and the MCP Python SDK.
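As a reference for contributors, a new tool follows the same shape as the examples above; here is a deliberately trivial, hypothetical tool added to server.py:

@tool.tool(description="Return the length of a text (trivial example of a contributed tool).")
async def longitud_texto(texto: str) -> str:
    # Hypothetical example; real contributions should wrap a useful API or computation.
    return str(len(texto))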
With the MCP Python Server, developers can enhance their AI applications with custom tools and resource integrations. By following this guide, you’ll be well on your way to building powerful, extensible solutions.