Simplify MCP API and CLI integration with support for multiple LLM providers and MCP-compatible servers
The Model Context Protocol (MCP) Server serves as a versatile adapter that allows various AI applications to interact seamlessly with specific data sources through a standardized protocol. This server acts as the bridge between the diverse AI applications and the underlying tools or data sources, enabling them to communicate efficiently. By leveraging MCP, developers can ensure interoperability among different AI tools and platforms, promoting a more cohesive ecosystem for AI development.
MCP is designed to be application-agnostic and flexible, supporting multiple AI applications while maintaining consistent behavior. This server is compatible with popular AI applications such as Claude Desktop, Continue, and Cursor, ensuring that these tools can connect to the desired data sources or perform specific computations. The compatibility matrix below details which features each application supports, highlighting areas where this server excels in facilitating seamless integration.
The Model Context Protocol (MCP) Server supports a wide range of MCP-compatible servers, including SQLite and Brave Search, as well as custom servers added through the `mcp-server-config.json` file. This flexibility ensures that the server works seamlessly with different backend systems, giving developers the freedom to choose their preferred data sources and tools.
The server is integrated with LangChain, which enables it to execute large language model (LLM) prompts efficiently. It supports multiple LLM providers whose APIs offer function calling, including OpenAI, Claude, Gemini, AWS Nova, Groq, and Ollama. This allows the server to coordinate across MCP servers when handling a query, ensuring that the response is accurate and efficient.
The architecture of the Model Context Protocol (MCP) Server is built around the core principles of interoperability and simplicity. It uses a robust protocol implementation that ensures seamless communication between AI applications and the targeted data sources or tools. The protocol flow involves the following steps:
The MCP protocol flow can be represented with the following Mermaid diagram:
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
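To make the diagram concrete, here is a toy in-process simulation of the same flow, where an AI application reaches a data source only through the client/server pair. The `MCPClient` and `MCPServer` classes are illustrative stand-ins, not the real MCP SDK.

```python
# Illustrative simulation of the MCP flow above:
# AI application -> MCP client -> MCP server -> data source/tool.
# These classes are hypothetical, not the actual MCP protocol implementation.

class MCPServer:
    def __init__(self):
        self.tools = {}

    def register_tool(self, name, fn):
        self.tools[name] = fn

    def handle(self, request):
        """Dispatch a tool-call request to the registered data source/tool."""
        tool = self.tools.get(request["tool"])
        if tool is None:
            return {"error": f"unknown tool: {request['tool']}"}
        return {"result": tool(**request.get("args", {}))}

class MCPClient:
    """The AI application talks to the server only through this client."""
    def __init__(self, server):
        self.server = server

    def call_tool(self, tool, **args):
        return self.server.handle({"tool": tool, "args": args})

# A toy lookup stands in for a real data source such as SQLite or Brave Search.
server = MCPServer()
server.register_tool("capital", lambda country: {"India": "New Delhi"}.get(country))

client = MCPClient(server)
print(client.call_tool("capital", country="India"))  # {'result': 'New Delhi'}
```

The key property the real protocol shares with this sketch is the indirection: the application never touches the data source directly, so tools can be swapped behind the server without changing the application.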
To begin using this MCP Server, start by cloning the repository from GitHub:
```bash
git clone https://github.com/rakesh-eltropy/mcp-client.git
cd mcp-client
```
You need to set environment variables for your API keys and configurations, either in a terminal session or in the `mcp-server-config.json` file:
```bash
export OPENAI_API_KEY=your-openai-api-key
export BRAVE_API_KEY=your-brave-api-key
```
You can start the server using the command-line interface (CLI) by executing:
```bash
uv run cli.py
```
This will allow you to interact with the server directly from the console.
To interact with the server using a REST API, first start the server:
```bash
uvicorn app:app --reload
```
You can then use `curl` to send requests to the server for interaction with an LLM:
```bash
curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?"}' http://localhost:8000/chat
```
For real-time responses, use streaming capabilities:
```bash
curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?", "streaming": true}' http://localhost:8000/chat
```
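A client consuming the streaming endpoint has to reassemble the chunks it receives. The sketch below assumes, purely for illustration, that the server emits newline-delimited JSON chunks with a `content` field; the actual wire format may differ, and the `assemble_stream` helper is hypothetical.

```python
import json

def assemble_stream(lines):
    """Assemble a streamed /chat response, assuming (hypothetically) that
    the server emits newline-delimited JSON chunks with a "content" field."""
    parts = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        chunk = json.loads(line)
        parts.append(chunk.get("content", ""))
    return "".join(parts)

# Simulated wire data; a real client would iterate over the HTTP response body
# (e.g. line by line) instead of a hard-coded list.
raw = ['{"content": "The most expensive "}', '{"content": "product is ..."}']
print(assemble_stream(raw))  # The most expensive product is ...
```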
Imagine a scenario where an e-commerce platform needs to provide detailed information about products to customers. Using this server, the platform can integrate with various data sources such as SQLite databases or search engines like Brave. When a customer asks for product details, the server uses LLMs to query these sources in real time.
Example queries:

- What is the capital city of India?
- Search for the most expensive product in the database, then find more details about it on Amazon.
Consider a content creation platform that requires real-time generation of articles based on current events or market trends. The platform can leverage this MCP Server to integrate with various tools such as data feeds, chatbots, and LLMs to generate detailed and relevant content.
The server supports the following AI applications through MCP clients:
| MCP Client | Resources | Tools | Prompts |
|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ |
| Continue | ✅ | ✅ | ✅ |
| Cursor | ❌ | ✅ | ❌ |
This compatibility matrix helps developers understand which features are supported by each application, ensuring that the integration process is as smooth as possible.
The server has also been tested and optimized with the following MCP-compatible servers:
| Server Name | Command | Env Variables | Status |
|---|---|---|---|
| SQLite | `npx -y @modelcontextprotocol/server-sqlite` | `API_KEY=your-api-key` | Full Support |
| Brave Search | `npx -y @modelcontextprotocol/server-brave` | `BRAVE_API_KEY=your-api-key` | Full Support |
You can configure the server by modifying the `mcp-server-config.json` file:
```json
{
  "mcpServers": {
    "[server-name]": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-[name]"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
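A minimal sketch of how a tool might load and sanity-check this configuration file is shown below; the `load_servers` helper and its validation rule are illustrative, not part of the project.

```python
import json

# Sample config matching the mcp-server-config.json structure above.
SAMPLE = """
{
  "mcpServers": {
    "sqlite": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-sqlite"],
      "env": {"API_KEY": "your-api-key"}
    }
  }
}
"""

def load_servers(text):
    """Parse the config and check that every server declares a command."""
    config = json.loads(text)
    servers = config.get("mcpServers", {})
    for name, spec in servers.items():
        if "command" not in spec:
            raise ValueError(f"server {name!r} is missing 'command'")
    return servers

servers = load_servers(SAMPLE)
print(servers["sqlite"]["command"])  # npx
```

Failing fast on a malformed entry at startup is cheaper than discovering it when the server first tries to spawn the subprocess.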
To ensure the security of your server and its data, we recommend implementing measures such as API key validation, rate limiting, and secure authentication.
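For example, rate limiting is often implemented with a token bucket. The sketch below is a generic, framework-agnostic illustration of that technique, not code from this server.

```python
import time

class TokenBucket:
    """Simple token-bucket rate limiter, sketching one of the recommended
    protections (rate limiting) for an endpoint like /chat."""
    def __init__(self, rate, capacity):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        """Refill tokens based on elapsed time, then try to spend one."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Refills slowly (one token every 10 seconds), allowing bursts of 2.
bucket = TokenBucket(rate=0.1, capacity=2)
print(bucket.allow())  # True
print(bucket.allow())  # True
print(bucket.allow())  # False (burst capacity exhausted)
```

In practice a server would keep one bucket per API key or client IP and reject over-limit requests with HTTP 429.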
Q: Does this server support multiple LLM providers?

A: Yes. Through its LangChain integration, it supports providers whose APIs offer function calling, including OpenAI, Claude, Gemini, AWS Nova, Groq, and Ollama.

Q: How do I integrate this MCP Server with my custom data source?

A: Add your custom server to the `mcp-server-config.json` file, or use the command specified in the configuration.

Q: Can this server handle large-scale deployments?

Q: What are the supported AI applications that can use this MCP Server?

A: Claude Desktop, Continue, and Cursor are supported through MCP clients; see the compatibility matrix above for per-feature details.

Q: How do I secure the API keys for different servers?

A: Keep keys in environment variables or in the `env` section of `mcp-server-config.json`, and avoid committing them to version control.
If you want to contribute improvements, bug fixes, or new features to this MCP Server, feel free to submit issues and pull requests. Ensure your contributions align with the project's guidelines and are well-documented to benefit other developers.
For further information on the Model Context Protocol (MCP) and its ecosystem, refer to the official MCP documentation.
By leveraging this MCP Server, developers can enhance AI application integration and workflow efficiency in diverse environments.