Bridge MCP tools with the OpenAI API for seamless integration and enhanced AI capabilities
MCP-Bridge serves as a bridge between the OpenAI API and MCP (Model Context Protocol) tools, enabling developers to harness MCP-driven tools through the familiar interface of the OpenAI API. This lets AI applications such as Open Web UI, Claude Desktop, Continue, and Cursor use MCP tools without each application needing explicit MCP support.
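As a quick sketch of what this looks like in practice, the snippet below points the standard OpenAI Python client at a locally running MCP-Bridge instance; the base URL, port, API key, and model name are placeholders chosen for illustration, not values prescribed by the project.

```python
# Illustrative sketch: base_url, api_key, and model name are placeholders,
# not values prescribed by MCP-Bridge.
from openai import OpenAI

# Point the familiar OpenAI client at the bridge instead of the upstream API.
client = OpenAI(
    base_url="http://localhost:9090/v1",  # wherever your MCP-Bridge instance listens
    api_key="not-needed-locally",
)

# The bridge attaches the tools exposed by its configured MCP servers, so a
# plain chat completion request can trigger tool use transparently.
response = client.chat.completions.create(
    model="local-model",  # whichever model your inference server provides
    messages=[{"role": "user", "content": "Fetch https://example.com and summarize it."}],
)

print(response.choices[0].message.content)
```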
MCP-Bridge offers an extensive array of features that cater to both developers and end-users of MCP tools.
MCP-Bridge works by sitting between an OpenAI-compatible inference engine and the configured MCP servers, handling the complexity of translating tool definitions and tool calls between the two. Here is a closer look at its architecture:
The following Mermaid diagram illustrates this flow:
```mermaid
sequenceDiagram
    participant OpenWebUI as Open Web UI
    participant MCPProxy as MCP Proxy
    participant MCPserver as MCP Server
    participant InferenceEngine as Inference Engine

    OpenWebUI ->> MCPProxy: Request
    MCPProxy ->> MCPserver: list tools
    MCPserver ->> MCPProxy: list of tools
    MCPProxy ->> InferenceEngine: Forward Request
    InferenceEngine ->> MCPProxy: Response
    MCPProxy ->> MCPserver: call tool
    MCPserver ->> MCPProxy: tool response
    MCPProxy ->> InferenceEngine: llm uses tool response
    InferenceEngine ->> MCPProxy: Response
    MCPProxy ->> OpenWebUI: Return Response
```
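To make the sequence above concrete, here is a deliberately simplified, hypothetical sketch of the loop the proxy performs; the function names and data shapes are placeholders for illustration and do not correspond to MCP-Bridge's actual internals.

```python
# Hypothetical, heavily simplified sketch of the flow in the diagram above.
# The stub functions stand in for MCP-Bridge's real internals.

def list_mcp_tools() -> list[dict]:
    # Would query every configured MCP server for its tool definitions.
    return [{"name": "fetch", "description": "Fetch a URL"}]

def call_mcp_tool(name: str, arguments: dict) -> str:
    # Would forward the call to the MCP server that owns the tool.
    return f"(result of {name} with {arguments})"

def call_inference_engine(messages: list[dict], tools: list[dict]) -> dict:
    # Would forward an OpenAI-style chat completion request upstream.
    return {"content": "final answer", "tool_calls": []}

def handle_chat_request(messages: list[dict]) -> dict:
    tools = list_mcp_tools()                            # MCP Proxy -> MCP Server: list tools
    response = call_inference_engine(messages, tools)   # forward request to the inference engine
    while response["tool_calls"]:                       # the model asked for a tool
        for call in response["tool_calls"]:
            result = call_mcp_tool(call["name"], call["arguments"])   # call tool on the MCP server
            messages = messages + [{"role": "tool", "content": result}]
        response = call_inference_engine(messages, tools)             # llm uses tool response
    return response                                     # return the final response to the client

print(handle_chat_request([{"role": "user", "content": "Fetch example.com"}]))
```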
At a higher level, MCP-Bridge fits into the standard MCP architecture, in which an AI application reaches data sources and tools through MCP servers:

```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
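For readers less familiar with the lower half of this diagram, the sketch below uses the official MCP Python SDK (the `mcp` package) to launch the same `mcp-server-fetch` server referenced in the configuration examples below and list its tools. It is independent of MCP-Bridge and follows the SDK's documented stdio client usage; treat the exact API as an assumption to verify against the SDK version you have installed.

```python
# Minimal MCP client sketch: launch mcp-server-fetch over stdio and list the
# tools it exposes, the same handshake MCP-Bridge performs per configured server.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(command="uvx", args=["mcp-server-fetch"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

asyncio.run(main())
```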
To install and run MCP-Bridge with Docker:

1. Clone the repository:

   ```bash
   git clone https://github.com/your-repo/mcp-bridge.git
   ```

2. Edit the compose.yml file; the configuration can be supplied as a mounted file, an HTTP URL, or inline JSON via these environment variables:

   ```yaml
   environment:
     - MCP_BRIDGE__CONFIG__FILE=config.json # mount the config file for this to work
     - MCP_BRIDGE__CONFIG__HTTP_URL=http://10.88.100.170:8888/config.json
     - MCP_BRIDGE__CONFIG__JSON={"inference_server":{"base_url":"http://example.com/v1","api_key":"None"},"mcp_servers":{"fetch":{"command":"uvx","args":["mcp-server-fetch"]}}}
   ```

3. Start the application:

   ```bash
   docker-compose up --build -d
   ```
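If you use MCP_BRIDGE__CONFIG__FILE, the config file has to be mounted into the container. The compose excerpt below is a hypothetical sketch of such a mount; the service name, build settings, port, and container path are placeholders rather than values from the project's actual compose.yml.

```yaml
# Hypothetical excerpt: service name, port, and container path are placeholders.
services:
  mcp-bridge:
    build: .
    ports:
      - "8000:8000"                         # expose whichever port your config uses
    environment:
      - MCP_BRIDGE__CONFIG__FILE=config.json
    volumes:
      - ./config.json:/app/config.json      # mount the config file for this to work
```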
To install and run manually:

1. Clone the repository:

   ```bash
   git clone https://github.com/your-repo/mcp-bridge.git
   ```

2. Set up dependencies:

   ```bash
   uv sync
   ```

3. Create a config.json file in the root directory:
   ```json
   {
     "inference_server": {
       "base_url": "http://localhost:8000/v1",
       "api_key": "None"
     },
     "mcp_servers": {
       "fetch": {
         "command": "uvx",
         "args": ["mcp-server-fetch"]
       }
     },
     "network": {
       "host": "0.0.0.0",
       "port": 80
     }
   }
   ```
4. Launch the server with the project's start command (shown here as a placeholder):

   ```bash
   your-app-command
   ```
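Once the server is up, you can sanity-check it with a plain, non-streaming chat completion request against the bridge's OpenAI-compatible endpoint. The host and port below assume the network block from the config.json above, and the route and model name are assumptions based on the standard OpenAI API shape.

```bash
# Assumes the network settings from the config.json above (0.0.0.0:80) and an
# OpenAI-compatible chat completions route; the model name is a placeholder.
curl -s http://localhost:80/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "local-model",
        "messages": [{"role": "user", "content": "Fetch https://example.com and summarize it."}]
      }'
```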
MCP-Bridge supports various clients:

| Client | Resources | Tools | Prompts |
|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ |
| Continue | ✅ | ✅ | ❌ |
| Cursor | ❌ | ✅ | ❌ |
MCP-Bridge ensures optimal performance and compatibility across different AI applications:
| Feature | Latency (ms) | Tool Compatibility | Configuration Effort |
|---|---|---|---|
| Chat Completions | 10-50 | High | Medium |
| Non-Streaming Completions | Below 30 | Full | Low |
| Sampling | Under 25 | Variable | High |
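As an illustration of the streaming path in the table above, the sketch below requests a streamed chat completion through the same assumed OpenAI-compatible endpoint; the base URL, API key, and model name are placeholders.

```python
# Streaming sketch: base_url, api_key, and model name are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:9090/v1", api_key="not-needed-locally")

# stream=True yields incremental chunks instead of a single response object.
stream = client.chat.completions.create(
    model="local-model",
    messages=[{"role": "user", "content": "Fetch https://example.com and summarize it."}],
    stream=True,
)

for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```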
An example configuration registering multiple MCP servers, each with its own command and environment variables:

```json
{
  "mcpServers": {
    "elexisserver": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-elexis"],
      "env": {
        "API_KEY": "123456"
      }
    },
    "cursorserver": {
      "command": "npx",
      "args": ["@modelcontextprotocol/server-cursor"],
      "env": {
        "API_KEY": "abcdefg"
      }
    }
  },
  "network": {
    "hostname": "localhost",
    "port": 3000
  }
}
```
Keep sensitive values such as API_KEY protected through proper configuration practices.

How does MCP-Bridge differ from generic AI APIs? MCP-Bridge differentiates itself through its compatibility with MCP tools, offering a seamless integration experience for users of these platforms.
Is it difficult to set up MCP-Bridge for new tools? The setup process can vary based on tool requirements but is generally manageable with the provided documentation and examples.
How does MCP-Bridge handle real-time data in complex applications like finance? MCP-Bridge is designed to keep request latency low, so real-time data analysis remains efficient and reliable in such applications.

Why might Continue not fully support prompts? The compatibility matrix reflects current limitations or untested features in the specific version of Continue that was evaluated.

Can MCP-Bridge be deployed in cloud environments too? Yes, MCP-Bridge is designed to be flexible and can be migrated to various cloud platforms for scalability and performance optimization.
When contributing, create a feature branch off main with a unique name (e.g., add-chat-completion-feature).

MCP-Bridge is part of a broader ecosystem where developers and organizations can integrate various AI tools more effectively through standard protocols. Explore the official documentation for detailed setup guides, APIs, and additional resources.
By harnessing the power of MCP-Bridge, developers and users alike can enhance their workflows with more flexibility and efficiency in integrating diverse AI tools.