Convert OpenAPI specs to MCP configurations with this easy-to-use CLI tool
The OpenAPI to MCP Server tool converts OpenAPI specifications into configurations compatible with the Model Context Protocol (MCP). This allows developers and integrators to leverage the versatility of OpenAPI definitions in order to deploy APIs seamlessly on various platforms that support MCP, such as Higress. The resulting configuration can be used by a wide array of AI applications, enhancing their ability to interact with data sources or tools via a standardized protocol.
One of the key features is its capability to read and understand OpenAPI specifications (JSON or YAML), converting them into an equivalent MCP configuration. This ensures that any API built on modern standards can be accessed using MCP, making it easy for AI applications to integrate new services without complex integration work.
The tool accepts OpenAPI specifications written in either JSON or YAML, so existing APIs can be converted regardless of how they are defined.
From the converted configuration, MCP tools are automatically generated based on the paths defined in the OpenAPI spec. Each tool is configured with detailed settings like parameter descriptions and types, ensuring that the AI application has a clear understanding of how each endpoint operates.
The tool intelligently sets positions for parameters (path, query, header, body) based on their definitions within the OpenAPI specification. This ensures that tools are configured to match the exact API endpoints they represent, enhancing both usability and reliability.
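As a sketch of this mapping (the exact YAML keys follow the tool-definition example shown later in this document; the pet-store operation itself is invented for illustration), an OpenAPI operation with a path and a query parameter might convert to an MCP tool like this:

```yaml
# Hypothetical sketch: OpenAPI operation -> MCP tool definition.
# Key names follow the tool-definition example later in this document.
tools:
  - name: Get Pet By Id
    method: GET
    path: /pets/{petId}
    description: Returns a single pet.
    parameters:
      - name: petId
        in: path          # position taken from the OpenAPI "in: path" definition
        required: true    # path parameters are always required
        type: string
      - name: verbose
        in: query         # position taken from the OpenAPI "in: query" definition
        type: boolean
```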
Validation of the OpenAPI specification can be enabled or disabled as needed through configuration settings. If enabled, it helps catch any structural issues early in the development process, ensuring a clean conversion to MCP.
The Model Context Protocol (MCP) defines a set of steps and rules for how AI applications can connect to data sources or tools. This diagram illustrates the flow:
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
This flow diagram shows how the MCP client embedded in an AI application communicates, via the MCP protocol, with an MCP server that fronts an external tool or data source.
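On the wire, MCP messages are JSON-RPC 2.0. As an illustration (the tool name and arguments here are invented for the example), an MCP client invoking a converted tool sends a request shaped roughly like this:

```python
import json

# Sketch of a JSON-RPC 2.0 "tools/call" request as used by MCP.
# The tool name "getPetById" and its arguments are hypothetical.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "getPetById",
        "arguments": {"petId": "42"},
    },
}

payload = json.dumps(request)
print(payload)
```

The MCP server resolves the named tool to the corresponding HTTP endpoint from the converted configuration and places each argument in its declared position (path, query, header, or body).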
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
This table details the compatibility of various MCP clients with different aspects of MCP. While Claude Desktop and Continue offer full support for resources, tools, and prompts, Cursor currently supports tools only.
Installation is straightforward via the Go toolchain:

```shell
go install github.com/jumpyweapon/openapi-to-mcpserver/cmd/openapi-to-mcp@latest
```
Once installed, you can run the tool with specific configuration options to match your API definitions. For example:
```shell
openapi-to-mcp --input path/to/swagger-petstore.json --output petstore-mcp.yaml --server-name petstore-config
```
Imagine an intelligent chatbot that needs to fetch real-time data from a weather API. By defining the OpenAPI spec for this API and converting it to MCP, you can enable the bot to seamlessly request current weather conditions based on user queries.
A news aggregator could use an article recommendation API powered by an OpenAPI specification. Converting this API to MCP would allow the aggregator's backend to query relevant articles based on user preferences and interactions, providing highly personalized content.
The OpenAPI to MCP server can be seamlessly integrated into any AI application that supports MCP clients like Claude Desktop. The generated configuration ensures that the endpoints are correctly mapped to tools or data sources, allowing for smooth communication through the MCP protocol.
```json
{
  "mcpServers": {
    "petstore-config": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-petstore"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
This sample configuration shows how a client can be set up to use the converted MCP server.
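Because the client configuration is plain JSON, it can also be inspected or sanity-checked programmatically. A minimal Python sketch (using the same server entry as the example above):

```python
import json

# The client configuration from the example above, embedded for illustration.
config_text = """
{
  "mcpServers": {
    "petstore-config": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-petstore"],
      "env": {"API_KEY": "your-api-key"}
    }
  }
}
"""

config = json.loads(config_text)

# List the configured MCP servers and check each entry has a launch command.
for name, server in config["mcpServers"].items():
    assert "command" in server, f"{name} is missing a command"
    print(name, "->", server["command"])
```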
The tool is optimized for performance and compatibility with various OpenAPI definitions, ensuring that a wide range of specifications can be easily translated into MCP configurations. The compatibility matrix detailed earlier further underscores its robustness across different AI clients.
Here's an example of the tool definitions generated for a more complex API whose endpoints combine path, header, and body parameters:
```yaml
tools:
  - name: Get User Profile
    method: GET
    path: /user/{id}
    description: Retrieves user profile information.
    parameters:
      - name: id
        in: path
        required: true
        type: string
      - name: token
        in: header
        required: true
        type: string
  - name: Update User Profile
    method: PUT
    path: /user/{id}
    description: Updates user profile information.
    parameters:
      - name: id
        in: path
        required: true
        type: string
      - name: body
        in: body
        required: true
        schema:
          type: object
          properties:
            name:
              type: string
            email:
              type: string
```
For enhanced security, the generated configuration can be patched using a template. This allows you to add common headers such as API keys or other customizations:
```yaml
server:
  config:
    apiKey: ""
tools:
  requestTemplate:
    headers:
      - key: Authorization
        value: "APPCODE {{.config.apiKey}}"
      - key: X-Ca-Nonce
        value: "{{uuidv4}}"
```
When applied, this template adds an Authorization header carrying the API key, plus a per-request nonce, to every tool in the configuration.
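At runtime, the serving layer expands the Go-template placeholders in the patched headers. A rough Python illustration of that substitution (this rendering logic is a stand-in for explanation, not the tool's actual implementation):

```python
import uuid

# Stand-in for the template expansion performed when a request is sent:
# {{.config.apiKey}} is replaced with the configured key, {{uuidv4}} with a fresh UUID.
def render_header(template: str, config: dict) -> str:
    rendered = template.replace("{{.config.apiKey}}", config.get("apiKey", ""))
    rendered = rendered.replace("{{uuidv4}}", str(uuid.uuid4()))
    return rendered

headers = {
    "Authorization": render_header("APPCODE {{.config.apiKey}}", {"apiKey": "secret"}),
    "X-Ca-Nonce": render_header("{{uuidv4}}", {}),
}
print(headers["Authorization"])
```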
Q: Can I use custom templates to patch the generated MCP server configuration?
A: Yes. As shown in the template example above, a patch template can add common settings such as authentication headers to every generated tool.

Q: What if my API includes sensitive data that needs specific handling?
A: Keep credentials out of the generated configuration itself; inject API keys and similar secrets through a template patch or environment variables, as in the examples above.

Q: Is it possible to change HTTP methods for certain endpoints during the conversion process?
A: The method for each tool is taken directly from the OpenAPI spec, so change the spec before conversion or edit the generated YAML afterwards.

Q: Can I validate my OpenAPI spec before generating MCP config?
A: Yes. Validation can be enabled through the tool's configuration settings, catching structural issues before conversion.

Q: How do I ensure seamless integration with existing AI applications like Claude Desktop or Continue?
A: Both clients offer full MCP support (see the compatibility table above); point them at the generated server configuration and the converted tools become available.
Developers interested in contributing can find contribution guidelines in the project repository.
The OpenAPI to MCP Server integrates seamlessly into the broader MCP ecosystem, providing valuable tools and resources for building and deploying robust AI applications. For more information on MCP and its usage, see the official Model Context Protocol documentation.
By leveraging this tool, you give your AI applications a robust, standards-based entry point into the MCP ecosystem, ready to connect to data sources and tools through a single protocol.