Unified Docker image for multiple MCP servers with Kubernetes support and secure remote access
MCP-Collection (Model Context Protocol collection) is a Docker image that bundles multiple Model Context Protocol (MCP) servers, allowing developers to easily switch between server implementations at runtime. This flexibility enables seamless integration of various AI applications with data sources and tools through a standardized protocol. The bundle comes packaged with supergateway, which exposes each server over protocols such as HTTP/SSE or WebSocket, making it remote-ready out of the box.
MCP-Collection simplifies deployment by providing several key features:
- Runtime server selection: choose which bundled MCP server to launch when the container starts.
- Remote-ready transport: the bundled supergateway bridges each server's `stdio` interface to more familiar protocols like HTTP/SSE or WebSocket.
- Locked dependencies: all packages are pinned via `uv.lock`, preventing run-time installation or modification.

These features make MCP-Collection an ideal choice for developers working on complex AI workflows that require integration with multiple tools and data sources through a single consistent interface.
MCP-Collection leverages the Model Context Protocol to enable seamless communication between various AI applications. The protocol's architecture supports a rich set of commands, making it highly adaptable to different types of interactions such as prompting, fetching, and managing data sources. The core structure of each MCP server within the collection is implemented using both Python and Node.js environments, ensuring broad compatibility across diverse use cases.
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
This diagram illustrates that AI applications use the MCP client to communicate with the Model Context Protocol, which then routes commands and data to the appropriate server (here, one of the servers bundled in MCP-Collection), ensuring secure and efficient interactions between the application and underlying data sources.
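To make the client-server link above concrete: MCP messages are JSON-RPC 2.0 objects exchanged over the server's `stdio` (or bridged to HTTP/SSE by supergateway). The sketch below assembles a minimal `initialize` request by hand; the method name and field layout follow the MCP specification, while the client name, version, and protocol-version string are placeholder values chosen for illustration.

```python
import json

def make_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request of the kind MCP exchanges over stdio."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# The initialize handshake every MCP session starts with.
init = make_request(1, "initialize", {
    "protocolVersion": "2024-11-05",  # illustrative spec revision
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1.0"},  # placeholder
})

# The stdio transport sends one JSON object per line on the server's stdin.
wire_line = json.dumps(init)
print(wire_line)
```

The same envelope carries every other interaction (listing tools, invoking them, fetching resources); only `method` and `params` change.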
To get started using MCP-Collection, follow these simple steps:
Pull the Multi-Arch Image:
```shell
docker pull ghcr.io/erhardtconsulting/mcp-collection:1.0.0
```
List Available MCP Servers (shows help):
```shell
docker run --rm ghcr.io/erhardtconsulting/mcp-collection:1.0.0
```
Run the Fetch Server with Extra Flags for the Underlying CLI:
```shell
docker run --rm -p 8080:8080 \
  ghcr.io/erhardtconsulting/mcp-collection:1.0.0 mcp-server-fetch --test
```
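Since everything after the image name is handed to the selected server's CLI, invocations follow a predictable shape. The helper below is an illustrative sketch (not part of the image) that assembles such a command line; the default tag and port mirror the examples above.

```python
def docker_command(server, port=8080, extra_args=(), tag="1.0.0"):
    """Assemble a docker run invocation for one of the bundled MCP servers.

    Arguments placed after the server name are forwarded to the underlying CLI.
    """
    cmd = [
        "docker", "run", "--rm",
        "-p", f"{port}:{port}",
        f"ghcr.io/erhardtconsulting/mcp-collection:{tag}",
        server,
    ]
    cmd.extend(extra_args)
    return cmd

print(" ".join(docker_command("mcp-server-fetch", extra_args=["--test"])))
```

Building the argument list programmatically like this is handy when the same image is launched from scripts or CI with different servers and ports.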
MCP-Collection can significantly enhance the development and deployment of AI workflows by simplifying interactions with various tools and data sources. A representative use case is feeding AI workflows from multiple data backends:
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Collection]
    B --> C[MCP Server Fetch]
    C --> D[SQL Database]
    D --> E[Real-Time Database]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
    style E fill:#d4ffce
```
In this diagram, the AI application communicates with MCP Collection, which then routes commands to specific servers. For instance, data fetching capabilities can draw from SQL or real-time databases, ensuring that the AI workflows are well-equipped with necessary information.
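The routing described above can be sketched as a small dispatcher. Everything here is a hypothetical stand-in to illustrate the idea, not MCP-Collection's actual API: tool names map to the server that implements them, and unknown tools are rejected.

```python
# Hypothetical handlers standing in for real MCP servers.
def fetch_handler(url):
    return f"document fetched from {url}"      # stands in for mcp-server-fetch

def sql_handler(query):
    return f"rows matching: {query}"           # stands in for a SQL-backed server

HANDLERS = {
    "fetch": fetch_handler,
    "query": sql_handler,
}

def route(tool, argument):
    """Send a tool call to the server that implements it."""
    try:
        handler = HANDLERS[tool]
    except KeyError:
        raise ValueError(f"unknown tool: {tool}") from None
    return handler(argument)

print(route("fetch", "https://example.com"))
```

In the real system this dispatch happens inside the MCP protocol layer; the AI application only names a tool and supplies arguments.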
MCP-Collection ensures compatibility with a wide array of AI applications and clients through its standardized protocol; key clients include Claude Desktop, Continue, and Cursor (see the compatibility matrix below).
Here’s an example of how you might configure the MCP servers within MCP-Collection:
```json
{
  "mcpServers": {
    "git-fetch": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-fetch"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
This JSON snippet provides a basic configuration for the `git-fetch` server, integrating it with MCP-Collection and specifying an API key for authentication.
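Before handing a configuration like the one above to a client, it can be useful to sanity-check its shape. The sketch below parses the snippet and verifies that each server entry carries a `command`; the validation rule is an assumption for illustration, not a published schema.

```python
import json

CONFIG = """
{
  "mcpServers": {
    "git-fetch": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-fetch"],
      "env": {"API_KEY": "your-api-key"}
    }
  }
}
"""

def validate(config_text):
    """Return the server map if every entry names a command; raise otherwise."""
    servers = json.loads(config_text)["mcpServers"]
    for name, spec in servers.items():
        if "command" not in spec:
            raise ValueError(f"server {name!r} is missing a command")
    return servers

servers = validate(CONFIG)
print(sorted(servers))
```

A check like this catches malformed entries before the client attempts to spawn a server process.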
MCP-Collection is designed to be highly compatible across a wide range of AI applications. Here’s a compatibility matrix detailing the support status for each client:
| MCP Client | Resources | Tools | Prompts |
|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ |
| Continue | ✅ | ✅ | ✅ |
| Cursor | ❌ | ✅ | ❌ |
The container runs as a non-root user (`1000:1000`), reducing potential security risks.

For enhanced security and external accessibility, consider deploying MCP-Collection behind a reverse proxy or API gateway. These solutions can enforce TLS and authentication mechanisms to secure data transmission.
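As an illustration of the kind of check such a gateway performs, the sketch below validates a bearer token before a request would be forwarded to MCP-Collection. The `Authorization` header format is standard HTTP; the token value is a placeholder, and in practice it would come from a secret store.

```python
import hmac

EXPECTED_TOKEN = "replace-me"  # placeholder; load from a secret store in practice

def authorized(headers):
    """Accept the request only if it carries the expected bearer token."""
    value = headers.get("Authorization", "")
    if not value.startswith("Bearer "):
        return False
    token = value[len("Bearer "):]
    # Constant-time comparison avoids leaking the token via timing.
    return hmac.compare_digest(token, EXPECTED_TOKEN)

print(authorized({"Authorization": "Bearer replace-me"}))
```

Real gateways (nginx, Traefik, cloud API gateways) implement this same pattern, typically combined with TLS termination in front of the container.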
Q: How do I configure different MCP servers within the image?
A: Server configuration lives in the `entrypoint.sh` file, allowing you to define and modify server parameters as needed.

Q: Can I add new MCP clients or update existing ones?
A: MCP-Collection speaks the standard protocol, so any MCP-compatible client can connect; see the compatibility matrix above for the clients tested so far.

Q: What environments does MCP-Collection support?
A: The bundled servers run in both Python and Node.js environments, and the image is published as a multi-arch build.

Q: How do I handle authentication and authorization with different servers?
A: Credentials such as API keys are supplied per server via environment variables, as shown in the configuration example above; each server's dependencies are fixed in `pyproject.toml`/`package.json`.

Q: Can MCP-Collection be deployed on Kubernetes?
A: Yes. The image is built with Kubernetes support in mind; for external access, pair it with a reverse proxy or API gateway as described above.

To add a new server, declare its dependencies in `root/app/package.json` (Node) or `root/app/pyproject.toml` (Python), and update the corresponding configuration in `entrypoint.sh`. Then use the `make test` command to run local tests and ensure everything is working as expected.

MCP-Collection is part of a broader ecosystem aimed at standardizing interactions between AI applications and data sources. By contributing to this project, you can actively shape how future AI workflows integrate with diverse tools and platforms. For more information on the MCP Protocol and its applications, visit the official Model Context Protocol documentation.
By leveraging MCP-Collection, developers can streamline their AI application development process by ensuring consistent and reliable communication between various servers and clients. This document aims to provide a comprehensive guide for both new users and seasoned contributors to fully utilize the capabilities of MCP-Collection in building robust and scalable AI ecosystems.