OpenAPI endpoint discovery and request tool with semantic search for seamless API integration
The MCP (Model Context Protocol) Server integrates AI applications, specifically those using MCP clients such as Claude Desktop, Continue, and Cursor, with backend data sources through standardized APIs. It gives developers a scalable mechanism for endpoint discovery and API request execution. Because it uses OpenAPI JSON documents as its primary source, it requires no local file system access or manual updates, keeping it adaptable and maintainable.
At the heart of the MCP Server is a semantic search capability built on a lightweight but capable 43MB version of the MiniLM-L3 model. Users can query endpoints in natural language, and the server returns accurate API documentation complete with full parameters. The system processes OpenAPI specifications up to 10MB by chunking them into individual endpoints, ensuring no context is lost during indexing.
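The chunking step described above can be sketched with the standard library alone. This is an illustrative sketch, not the server's actual code: the `chunk_endpoints` helper and the tiny sample spec are assumptions standing in for a real (up to 10MB) document.

```python
import json

def chunk_endpoints(spec: dict) -> list[dict]:
    """Split an OpenAPI spec into one document per endpoint (path + method)."""
    chunks = []
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            chunks.append({
                "id": f"{method.upper()} {path}",   # unique identifier: path + method
                "summary": op.get("summary", ""),
                "parameters": op.get("parameters", []),
                "responses": op.get("responses", {}),
            })
    return chunks

# A tiny spec standing in for a real OpenAPI document
spec = json.loads("""{
  "paths": {
    "/users": {
      "get":  {"summary": "List users"},
      "post": {"summary": "Create a user"}
    }
  }
}""")
print([c["id"] for c in chunk_endpoints(spec)])  # → ['GET /users', 'POST /users']
```

Because each chunk is a self-contained endpoint record, indexing never has to truncate a large spec mid-operation.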
The MCP Server runs on a lightweight yet robust FastAPI framework, fully supporting asynchronous operations. This setup ensures that even complex queries and requests can be handled efficiently without compromising performance. The async nature of the server also supports simultaneous processing, making it ideal for environments where multiple API calls are made frequently.
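The benefit of asynchronous handling can be illustrated with a small asyncio sketch. The `handle_request` coroutine below is a made-up stand-in for the server's real handlers; it simulates non-blocking I/O so the concurrent requests overlap instead of queuing.

```python
import asyncio

async def handle_request(endpoint: str) -> str:
    # Simulate non-blocking I/O, e.g. an upstream API call
    await asyncio.sleep(0.05)
    return f"200 OK from {endpoint}"

async def main() -> list[str]:
    # Three requests processed concurrently rather than sequentially:
    # total wall time stays near 0.05 s instead of roughly 0.15 s.
    return await asyncio.gather(
        handle_request("/users"),
        handle_request("/orders"),
        handle_request("/items"),
    )

results = asyncio.run(main())
print(results)
```

This is the same pattern FastAPI applies per request: while one call awaits upstream I/O, the event loop serves the others.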
To enhance performance, the MCP Server employs an in-memory FAISS vector search, allowing for near-instantaneous endpoint discovery. This feature significantly reduces latency and enhances user experience by quickly providing relevant API documentation based on natural language queries.
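The mechanics of an in-memory flat vector index can be sketched with NumPy alone. FAISS's `IndexFlatIP` performs the same exhaustive inner-product scan; this NumPy stand-in, with made-up toy embeddings, shows the idea without the FAISS dependency.

```python
import numpy as np

# Toy 4-dimensional embeddings for three indexed endpoints (illustrative values)
endpoint_ids = ["GET /users", "POST /orders", "GET /orders/{id}"]
index = np.array([
    [0.9, 0.1, 0.0, 0.1],
    [0.1, 0.9, 0.1, 0.0],
    [0.1, 0.8, 0.2, 0.1],
], dtype=np.float32)
# Normalize rows so inner product equals cosine similarity
index /= np.linalg.norm(index, axis=1, keepdims=True)

def search(query_vec: np.ndarray, k: int = 2) -> list[str]:
    q = query_vec / np.linalg.norm(query_vec)
    scores = index @ q                      # exhaustive inner-product scan
    top = np.argsort(scores)[::-1][:k]      # best k matches first
    return [endpoint_ids[i] for i in top]

# A query embedding close to the "orders" endpoints
print(search(np.array([0.1, 0.9, 0.1, 0.05], dtype=np.float32)))
```

Because the whole index lives in memory and the scan is a single matrix-vector product, lookups over a few thousand endpoints complete in microseconds.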
The MCP Server is built with a clear focus on efficiency and compatibility with various MCP clients. The architecture leverages the Model Context Protocol (MCP) to establish a standardized communication interface between AI applications and backend systems such as APIs, databases, or other tools. This protocol ensures seamless integration by providing a common language that can be understood across different platforms and environments.
Unlike traditional document-level analysis, which struggles with large OpenAPI specs, the MCP Server indexes individual endpoints by unique identifier (path + method), with parameter-aware embeddings and response schema context. This approach lets the server handle up to 5,000 endpoints while maintaining context integrity.
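One way to make embeddings parameter-aware is to fold parameter names, their locations, and response descriptions into the text that gets embedded. The `endpoint_to_text` helper below is an illustrative assumption, not the server's implementation.

```python
def endpoint_to_text(path: str, method: str, op: dict) -> str:
    """Build the text that will be embedded for one endpoint."""
    parts = [f"{method.upper()} {path}", op.get("summary", "")]
    # Parameter-aware: include each parameter's name and location
    for p in op.get("parameters", []):
        parts.append(f"param {p['name']} in {p['in']}")
    # Response schema context: include response codes and descriptions
    for code, resp in op.get("responses", {}).items():
        parts.append(f"response {code}: {resp.get('description', '')}")
    return " | ".join(part for part in parts if part)

op = {
    "summary": "Get account balance",
    "parameters": [{"name": "account_id", "in": "path"}],
    "responses": {"200": {"description": "Balance in cents"}},
}
print(endpoint_to_text("/accounts/{account_id}/balance", "get", op))
```

A query like "how do I check a balance" then matches on the summary, the parameter name, and the response description, not just the raw path.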
The server supports multi-architecture builds, ensuring compatibility across platforms. By default it supports linux/amd64 and linux/arm64, making it accessible to a wide range of users regardless of their system. Users can build and push the server image using Docker's buildx functionality:
# Build and push using buildx
docker buildx create --use
docker buildx build --platform linux/amd64,linux/arm64 \
-t buryhuang/mcp-server-any-openapi:latest \
--push .
To integrate the Scalable OpenAPI Endpoint Discovery and API Request Tool with Claude Desktop, you can automate the setup with Smithery, or add the following configuration manually:
{
"mcpServers": {
"any_openapi": {
"command": "docker",
"args": [
"run", "-i", "--rm",
"-e OPENAPI_JSON_DOCS_URL=https://api.example.com/openapi.json",
"-e MCP_API_PREFIX=finance",
"buryhuang/mcp-server-any-openapi:latest"
]
}
}
}
EndpointSearcher: the core class responsible for indexing OpenAPI endpoints and serving semantic search queries over them.
Server Implementation: the FastAPI application that exposes the MCP tools for endpoint discovery and API request execution.
In the context of developing applications, the MCP Server acts as a bridge between developers and backend systems. For instance, imagine an e-commerce platform where developers need to quickly find and integrate new APIs into their application. Without this server, they might spend hours parsing through large openAPI specs or consulting documentation, which can slow down development cycles. The MCP Server simplifies this process by allowing them to query endpoints in natural language and receive detailed API documentation immediately.
AI assistants like Claude Desktop can benefit immensely from the MCP Server by providing more accurate and contextually relevant information. For example, an assistant tasked with generating content for social media posts might use the server to quickly identify appropriate APIs for gathering data or images related to trending topics. This integration allows the assistant to generate higher-quality outputs faster, enhancing user satisfaction.
The MCP Server is compatible with a spectrum of popular MCP clients:
| MCP Client | Resources | Tools | Prompts |
|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ |
| Continue | ✅ | ✅ | ✅ |
| Cursor | ❌ | ✅ | ❌ |
This matrix shows which MCP features (resources, tools, and prompts) each client supports, so developers can tailor their integrations accordingly.
The MCP Server is built for low latency: the in-memory FAISS index and asynchronous FastAPI stack let it handle the high-volume traffic and dynamic usage patterns typical of modern applications.
The server is compatible with a wide range of operating systems, including Linux (linux/amd64 and linux/arm64) and Docker environments. Users can run the server locally or deploy it in cloud-based solutions for added flexibility.
python -m mcp_server_any_openapi
This command initiates the server directly from the source code, providing developers with full control over its configuration.
Users can customize environment variables such as OPENAPI_JSON_DOCS_URL and MCP_API_PREFIX during deployment:
docker run \
-e OPENAPI_JSON_DOCS_URL=https://api.example.com/openapi.json \
-e MCP_API_PREFIX=finance \
buryhuang/mcp-server-any-openapi:latest
To ensure the security of your environment, consider implementing SSL/TLS for encrypted communication and using secure API keys or authentication mechanisms.
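A minimal sketch of the API-key idea, using only the standard library. The header name `X-API-Key` and the `check_api_key` helper are assumptions for illustration; `hmac.compare_digest` is used because it resists timing side-channels.

```python
import hmac

# In a real deployment, load the key from an environment variable or secret store
EXPECTED_KEY = "change-me"

def check_api_key(headers: dict) -> bool:
    """Constant-time comparison of the presented key against the expected one."""
    presented = headers.get("X-API-Key", "")
    return hmac.compare_digest(presented, EXPECTED_KEY)

print(check_api_key({"X-API-Key": "change-me"}))  # → True
print(check_api_key({"X-API-Key": "wrong"}))      # → False
```

In a FastAPI deployment such a check would typically run in a dependency or middleware before any tool call is executed.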
How does the MCP Server handle large OpenAPI specifications?
It chunks specifications (up to 10MB) into individual endpoints before indexing, so context is preserved even across thousands of endpoints.
What are the supported platforms for running the MCP Server?
It supports linux/amd64 and linux/arm64 natively, with Docker building and pushing capabilities available.
Can I customize the environment variables for my deployment?
Yes. Set variables such as OPENAPI_JSON_DOCS_URL and MCP_API_PREFIX during container startup to tailor the server's behavior.
What is the expected response time from the MCP Server?
The in-memory FAISS vector search provides near-instantaneous endpoint discovery, keeping query latency low.
How do I ensure the security of my deployment?
Implement SSL/TLS for encrypted communication and use secure API keys or other authentication mechanisms.
To contribute, create a feature branch with git checkout -b feature/new-feature and push it with git push origin feature/new-feature. Once you have implemented new features or fixed issues, open a pull request for review and merge.
For more information about the broader MCP ecosystem and related resources, visit:
By utilizing this comprehensive documentation, developers can effectively integrate the MCP Server into their AI workflows, enhancing functionality and performance.