Expose MCP tools as secure, OpenAPI-compatible HTTP servers for seamless AI integration and interoperability
mcpo is a lightweight, open-source utility designed to expose any Model Context Protocol (MCP) server as an OpenAPI-compatible HTTP server. It simplifies the process of integrating custom AI tools and data sources into broader workflows by leveraging well-established web standards like HTTP and OpenAPI. This means that your MCP tools can now be seamlessly used with various AI applications, including Claude Desktop, Continue, Cursor, and others, without requiring any complex setup or customization.
mcpo excels in its simplicity and flexibility, making it an ideal tool for developers looking to quickly integrate custom AI servers into their workflows. Its key features include:
Instant OpenAPI Compatibility: mcpo automatically converts your native MCP command-line tools into standard RESTful OpenAPI services, eliminating the need for manual adaptation or protocol-specific code.
Enhanced Security and Stability: By operating over HTTP, mcpo ensures that communication between AI applications and custom servers is secure, stable, and scalable. It supports industry-standard features such as authentication, error handling, and interactive documentation.
Auto-Generated Documentation: mcpo leverages the power of OpenAPI to auto-generate comprehensive and interactive documentation for every MCP tool it hosts. This makes it easy for users to understand and utilize your custom tools without additional setup or configuration.
Customizable Command Options: The utility supports a wide range of command-line options, letting you control how tools are exposed, including API keys, ports, server transport types (e.g., stdio, SSE), and more.
At the heart of mcpo lies its implementation of the Model Context Protocol (MCP). The protocol enables seamless interaction between AI applications and custom backend services by standardizing various aspects of communication. Here’s a breakdown of how it works:
MCP Client Interaction: An MCP client, such as Claude Desktop or Continue, initiates a request to an mcpo instance hosting an MCP server.
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
MCP Data Flow: The request travels through the mcpo server, which relays it to the underlying data source or tool over MCP. mcpo manages this interaction, ensuring that all communication remains well-defined and standardized.
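If you want something concrete to point mcpo at, here is a minimal sketch of such an MCP server written with the FastMCP helper from the official Python SDK (the `mcp` package). The server name and tool are illustrative assumptions rather than anything mcpo requires; any stdio-based MCP server is wrapped the same way.

```python
# minimal_server.py: a tiny MCP server that mcpo can wrap (illustrative sketch)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-tools")  # hypothetical server name

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers and return the sum."""
    return a + b

if __name__ == "__main__":
    # Runs over stdio by default, which is how mcpo launches it as a subprocess
    mcp.run()
```

Exposing it then follows the quick-start pattern shown below, e.g. `uvx mcpo --port 8000 --api-key "top-secret" -- python minimal_server.py`, after which the add tool appears as a regular HTTP endpoint.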
Getting started with mcpo is straightforward, with both simple command-line options and Docker support to choose from based on your preference:
Using uv for Optimal Performance:
uvx mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command
Python Installation:
pip install mcpo
mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command
Docker Deployment with Pre-Built Image:
docker run -p 8000:8000 ghcr.io/open-webui/mcpo:main --api-key "top-secret" -- your_mcp_server_command
These methods provide a range of options to serve any MCP tool with ease and speed.
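Once any of these commands is running, every tool on the wrapped MCP server is reachable as a plain REST endpoint, and mcpo's interactive docs (served at /docs, following the FastAPI conventions mcpo builds on) list the real route names. As a minimal sketch, assuming the wrapped server happens to expose a tool named get_current_time (a hypothetical name used only for illustration), a call looks like this:

```bash
# Invoke one exposed tool over HTTP; the route name mirrors the MCP tool name
curl -X POST http://localhost:8000/get_current_time \
  -H "Authorization: Bearer top-secret" \
  -H "Content-Type: application/json" \
  -d '{}'
```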
Imagine an application that requires real-time data processing and natural language generation capabilities. mcpo can be used to expose a custom server that generates dynamic text based on user inputs, making it compatible with popular MCP clients.
For example:
python text_generator.py --api-key "your-api-key"
By wrapping this command with mcpo, you can ensure seamless integration into workflows involving Claude Desktop or Continue.
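Assuming text_generator.py speaks MCP over stdio (an assumption made here for illustration, since the script itself is not shown), wrapping it follows the same pattern as the quick-start commands above:

```bash
# Expose the hypothetical text generator as an OpenAPI-compatible HTTP service
uvx mcpo --port 8000 --api-key "top-secret" -- python text_generator.py --api-key "your-api-key"
```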
A large enterprise might have specific data sources and tools that are not directly accessible via HTTP but still need to be integrated into an AI application. mcpo provides a secure bridge between these internal systems and external MCP clients.
mcpo --port 8001 --api-key "top-secret" --server-type "sse" -- http://127.0.0.1:5000/sse
This setup ensures that the custom backend server can be securely accessed by MCP clients, maintaining both security and functionality.
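To verify the bridge and discover which tools the SSE backend exposes, you can inspect the auto-generated schema and interactive documentation on the proxy port. The paths below follow the FastAPI conventions mcpo builds on, so treat this as a sketch to confirm against your own deployment:

```bash
# Machine-readable schema listing every proxied tool
curl -H "Authorization: Bearer top-secret" http://localhost:8001/openapi.json

# Interactive documentation: open http://localhost:8001/docs in a browser
```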
mcpo supports a broad range of MCP clients, ensuring compatibility with various AI workflows:
MCP Client Compatibility Matrix:
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ❌ | Partial Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
This matrix highlights the robust support for tools and resources across popular MCP clients, making it easier to integrate custom servers into existing workflows.
mcpo is designed to perform well under a range of load conditions, making it suitable for both small-scale projects and enterprise-level deployments where high availability is critical.
Advanced configuration options allow you to fine-tune the behavior of mcpo:
API Key Management: Securely manage API keys through command-line arguments, environment variables, or a server configuration file, for example:
```json
{
  "mcpServers": {
    "[server-name]": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-[name]"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
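If you keep server definitions in a file like the one above, mcpo can also be started from that config instead of a single inline command. The flag and route layout below reflect recent mcpo releases; treat them as an assumption to confirm against the version you have installed.

```bash
# Start mcpo from a config file (path and port are examples)
uvx mcpo --port 8000 --api-key "top-secret" --config /path/to/config.json

# Each configured server is then typically mounted under its own route, e.g.
# http://localhost:8000/<server-name>/docs
```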
Server Types: Support for different MCP server transport types (e.g., stdio, SSE), ensuring flexibility and compatibility with various backends.
Q1: Can I use mcpo with a custom Python server?
A1: Yes. mcpo can seamlessly integrate any custom Python server that adheres to the Model Context Protocol; you simply wrap the server's launch command with mcpo's straightforward command-line options.
Q2: How is communication between clients and servers secured?
A2: mcpo supports various authentication mechanisms, including API keys and Basic Auth, ensuring secure communication between clients and servers.
Q3: Does mcpo work with tools written in languages other than Python?
A3: Absolutely. mcpo is language-agnostic and can be used with any command-line tool or server that implements the Model Context Protocol.
Q4: How does mcpo handle high traffic or large-scale deployments?
A4: By leveraging standard HTTP protocols, mcpo benefits from wide-ranging optimizations that support high concurrency and large-scale deployments. It can handle thousands of requests per second with minimal performance degradation.
Q5: Can I customize the generated OpenAPI documentation?
A5: Yes. mcpo auto-generates the most common fields, and you can refine the output with standard OpenAPI tools and libraries; detailed or specialized information may require manual adjustments.
We welcome contributions from the community to enhance the capabilities of mcpo:
Fork the Repository:
git clone https://github.com/open-webui/mcpo.git
Set Up Your Environment:
uv sync --dev
Run Tests:
uv run pytest
Open a Pull Request: Share your developments or improvements with the community.
mcpo is part of a broader ecosystem dedicated to standardizing and streamlining AI tool integration. By leveraging mcpo, you can transform your custom AI tools into interoperable components that plug into a wide array of MCP clients.