Discover tools to extract and submit MCP servers over the internet with easy configuration and integration
The `mcp-server-collector` is a specialized MCP (Model Context Protocol) server designed to facilitate the integration of various AI applications by collecting, analyzing, and submitting MCP servers. It acts as a bridge between diverse AI clients, such as Claude Desktop, Continue, and Cursor, and other applications that need to access or submit model contexts. By leveraging a standard protocol, it ensures seamless interaction among the different components of an AI ecosystem.
The `mcp-server-collector` implements several core features:
- Data Collection: extracts MCP servers from URLs and raw content via command-line tools.
- Submission: submits MCP servers to directories such as mcp.so, enabling wider distribution and usage.
The server is equipped with three main tools:
- `extract-mcp-servers-from-url`: extracts MCP servers from a given URL.
- `extract-mcp-servers-from-content`: extracts MCP servers directly from provided content, such as text or raw data.
- `submit-mcp-server`: submits an MCP server to a designated directory.

These tools are essential for maintaining and disseminating model contexts across various platforms and environments.
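To confirm the exact tool names and input schemas at runtime, an MCP client can simply ask the server. Here is a minimal sketch using the official MCP Python SDK, assuming the `mcp` package is installed and `uvx` can launch the server (the server may also require the environment variables shown in the configuration section below):

```python
# Minimal sketch: list the tools exposed by mcp-server-collector.
# Assumes the `mcp` Python SDK is installed and `uvx mcp-server-collector` works;
# the server may additionally require the OPENAI_* variables shown later.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def list_collector_tools() -> None:
    params = StdioServerParameters(command="uvx", args=["mcp-server-collector"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.list_tools()
            for tool in result.tools:
                # Expected: extract-mcp-servers-from-url,
                # extract-mcp-servers-from-content, submit-mcp-server
                print(f"{tool.name}: {tool.description}")

asyncio.run(list_collector_tools())
```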
MCP operates on a standardized protocol that ensures seamless interaction between AI applications, servers, and data sources. The following Mermaid diagram illustrates this flow:
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
To highlight its versatility, the `mcp-server-collector` supports a wide range of MCP clients. Here is the compatibility matrix:
| MCP Client | Resources | Tools | Prompts |
|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ |
| Continue | ✅ | ✅ | ✅ |
| Cursor | ❌ | ✅ | ❌ |
This table indicates that tools are available in every listed client, while resources and prompts are supported only by certain applications.
The Claude Desktop configuration file is located at:

- On macOS: `~/Library/Application\ Support/Claude/claude_desktop_config.json`
- On Windows: `%APPDATA%/Claude/claude_desktop_config.json`

Here is an example of the configuration for a local development checkout:
```json
{
  "mcpServers": {
    "fetch": { "command": "uvx", "args": ["mcp-server-fetch"] },
    "mcp-server-collector": {
      "command": "uv",
      "args": ["--directory", "path-to/mcp-server-collector", "run", "mcp-server-collector"],
      "env": {
        "OPENAI_API_KEY": "sk-xxx",
        "OPENAI_BASE_URL": "https://api.openai.com/v1",
        "OPENAI_MODEL": "gpt-4o-mini",
        "MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project"
      }
    }
  }
}
```
For the published package installed via uvx, the configuration remains largely the same, with only minor adjustments:
```json
{
  "mcpServers": {
    "fetch": { "command": "uvx", "args": ["mcp-server-fetch"] },
    "mcp-server-collector": {
      "command": "uvx",
      "args": ["mcp-server-collector"],
      "env": {
        "OPENAI_API_KEY": "sk-xxx",
        "OPENAI_BASE_URL": "https://api.openai.com/v1",
        "OPENAI_MODEL": "gpt-4o-mini",
        "MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project"
      }
    }
  }
}
```
Suppose an AI developer, Alice, wants to integrate a new model context into her application. She can use the `extract-mcp-servers-from-url` tool on the `mcp-server-collector` server to gather data from an external source:
extract-mcp-servers-from-url "https://example.com/model-context"
Once extracted, she can then submit it using:
submit-mcp-server "https://example.com/model-context" "path/to/avatar.jpg"
For dynamic applications like live chatbots or real-time data processing systems, the `mcp-server-collector` ensures that all necessary model contexts are available in a standardized format. This makes it easier for developers to integrate new models without extensive code changes.
To ensure compatibility and ease of integration, the `mcp-server-collector` supports both known and future MCP clients:
```json
{
  "mcpServers": {
    "fetch": { "command": "uvx", "args": ["mcp-server-fetch"] },
    "mcp-server-collector": {
      "command": "uv",
      "args": ["run", "mcp-server-collector"],
      "env": {
        "OPENAI_API_KEY": "sk-xxx",
        "OPENAI_BASE_URL": "https://api.openai.com/v1",
        "OPENAI_MODEL": "gpt-4o-mini",
        "MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project"
      }
    }
  }
}
```

Note that `command` must be a single string; the executable's arguments belong in the `args` array.
The `mcp-server-collector` is optimized for performance and compatibility across different environments. It supports a wide range of platforms, including Windows, macOS, and Linux.
For instance, real-time data streaming applications can rely on the `mcp-server-collector` to quickly adapt to incoming model contexts without downtime, ensuring smooth operation and high throughput.
Ensuring security and performance requires setting up environment variables correctly:
```bash
OPENAI_API_KEY="sk-xxx"
OPENAI_BASE_URL="https://api.openai.com/v1"
OPENAI_MODEL="gpt-4o-mini"
MCP_SERVER_SUBMIT_URL="https://mcp.so/api/submit-project"
```
Sensitive information, such as API keys and URLs, should be stored securely to prevent exposure.
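One way to follow that advice is to read the values from the environment at launch time rather than committing them to a config file. The helper below is a hypothetical sketch, not part of the package:

```python
# Hypothetical sketch: fail fast if a required secret is missing, instead of
# hardcoding values such as OPENAI_API_KEY in files under version control.
import os

REQUIRED_VARS = (
    "OPENAI_API_KEY",
    "OPENAI_BASE_URL",
    "OPENAI_MODEL",
    "MCP_SERVER_SUBMIT_URL",
)

def load_collector_env() -> dict[str, str]:
    """Return the variables mcp-server-collector expects, or raise if any are unset."""
    missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
    if missing:
        raise RuntimeError("Missing environment variables: " + ", ".join(missing))
    return {name: os.environ[name] for name in REQUIRED_VARS}
```

The returned dict can then be passed as the `env` argument when launching the server, as in the SDK examples above.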
Q1: Can the `mcp-server-collector` handle dynamic and static model contexts?
A: Yes, it supports both. The `extract-mcp-servers-from-url` tool can handle dynamic updates, while `submit-mcp-server` ensures static contexts are submitted accurately.
Q2: Is there a limit to how many model contexts can be processed at once?
A: There is no specific processing limit; however, optimal performance may require tuning based on the system architecture and network conditions.
Q3: How do I ensure data integrity during submission?
A: Implement checksum verification or use versioned submissions to maintain data integrity.
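As a hedged illustration of the checksum approach, a client could hash the content before submission and compare digests after retrieving the stored copy; this helper is hypothetical, not an API of the package:

```python
# Hypothetical sketch: SHA-256 checksum verification around a submission.
import hashlib

def sha256_hex(content: str) -> str:
    """Return the hex SHA-256 digest of the given text."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest()

payload = "name: example-server\nurl: https://example.com/model-context"
digest_before = sha256_hex(payload)

# ... submit `payload`, then fetch the stored copy back ...
stored_copy = payload  # placeholder for the retrieved content
assert sha256_hex(stored_copy) == digest_before, "Checksum mismatch after submission"
```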
Q4: Do these tools support cross-platform integration?
A: Yes, they are designed to work across multiple platforms, including Windows, macOS, and Linux.
Q5: Are there any known issues with the `submit-mcp-server` tool?
A: Users have reported occasional delays in submission due to network conditions. Monitoring and optimizing for high connectivity can mitigate these issues.
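A common client-side mitigation for such transient delays is retrying with exponential backoff. The wrapper below is a hypothetical sketch that could surround a `submit-mcp-server` call made through the SDK, as in the earlier example:

```python
# Hypothetical sketch: retry a flaky async submission with exponential backoff.
import asyncio

async def submit_with_retry(submit, attempts: int = 3, base_delay: float = 1.0):
    """Await `submit()`, retrying on failure with delays of 1s, 2s, 4s, ..."""
    for attempt in range(attempts):
        try:
            return await submit()
        except Exception as exc:  # e.g. timeouts or connection resets
            if attempt == attempts - 1:
                raise
            delay = base_delay * (2 ** attempt)
            print(f"Submission failed ({exc}); retrying in {delay:.0f}s")
            await asyncio.sleep(delay)
```

Here `submit` would be a zero-argument callable returning the submission coroutine, for example `lambda: session.call_tool("submit-mcp-server", arguments=...)`.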
To distribute this package, follow these steps:
```bash
uv sync
uv build
uv publish
```
Ensure you have set your PyPI credentials via environment variables or flags.
Due to the nature of stdio, debugging can be challenging. We recommend using the MCP Inspector.
To launch the MCP Inspector:
```bash
npx @modelcontextprotocol/inspector uv --directory path-to/mcp-server-collector run mcp-server-collector
```
Upon launching, the Inspector will provide a URL to access via your browser.
By utilizing this MCP server, developers and AI application creators can establish robust connectivity standards that enhance interoperability across various platforms.