Connects to MCP servers using SSE, providing an interactive chat UI for testing and tool integration
The Model Context Protocol (MCP) Server is a universal adapter designed to enable seamless integration of AI applications with various data sources and tools through a standardized protocol. This server acts as a bridge, allowing AI applications such as Claude Desktop, Continue, and Cursor to interact dynamically with external resources and tools for web scraping, data extraction, and more. The MCP Server supports Server-Sent Events (SSE) for real-time data streaming, with communication over HTTPS or WebSocket.
The core features of the Model Context Protocol include a standardized interface between AI applications and external data sources, real-time streaming over SSE, and dynamic tool calls driven by the context of user queries.
The MCP Server is compatible with leading AI applications such as Claude Desktop, Continue, and Cursor; see the compatibility matrix below for details.
The MCP Server architecture is built around a modular protocol stack that makes it straightforward to integrate a variety of AI applications. The protocol flow and data architecture are illustrated below:
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
```mermaid
graph LR
    subgraph MCP Server
        B[MCP Server] --> C[Data Source]
        C --> D[Tool Integration]
    end
    subgraph AI Application
        A[AI App] --> E[Query/Command]
    end
    E -->|HTTP Request| B
```
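To make the flow above concrete, the sketch below shows the rough shape of a tool-call request an MCP client passes to the server. It assumes a JSON-RPC-style `tools/call` message; the tool name and arguments are hypothetical and will differ between servers.

```python
import json

# A sketch of the kind of request an MCP client forwards to the MCP Server.
# The "web-scraper" tool and its arguments are hypothetical examples, not the
# exact wire format of any particular server.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "web-scraper",
        "arguments": {"url": "https://example.com", "maxPages": 1},
    },
}

print(json.dumps(request, indent=2))
```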
To start using the MCP Server, you can follow these steps:

1. Clone the Repository:

   ```bash
   git clone https://github.com/apify/tester-mcp-server.git
   cd tester-mcp-server
   ```

2. Install Dependencies:

   ```bash
   npm install
   ```

3. Configure Environment Variables: Create a `.env` file with the following content (refer to the `.env.example` file for guidance):

   ```bash
   APIFY_TOKEN=YOUR_APIFY_TOKEN
   LLM_PROVIDER_API_KEY=YOUR_LLM_API_KEY
   ```

4. Start the Server:

   ```bash
   npm start
   ```

5. Access the Interface: Navigate to `http://localhost:3000` in your browser or use a remote MCP client.
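Once the server is running, a quick way to confirm the SSE endpoint is reachable is to stream a few raw events from it. The sketch below assumes the default local port 3000 and an `/sse` path (matching the `MCP_URL` example further down) and uses the third-party `requests` package.

```python
import requests  # third-party: pip install requests

# Minimal reachability check for the SSE endpoint; the port and path are
# assumptions based on the defaults shown in this guide.
url = "http://localhost:3000/sse"

with requests.get(url, stream=True, timeout=30) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines(decode_unicode=True):
        # SSE frames arrive as "event: ..." / "data: ..." text lines
        if line.startswith("data:"):
            print(line[len("data:"):].strip())
            break  # one event is enough to confirm the stream works
```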
Imagine an AI application tasked with collecting data from multiple social media platforms using a pre-defined set of actors. The MCP Server can dynamically call the appropriate tools based on user queries, allowing for efficient scraping and analysis.
```python
# Example integration code (illustrative: MCPClient and ToolCall are
# placeholder wrappers, not a published SDK)
def scrape_social_media(platforms):
    # Initialize the MCP Server client
    mcp_client = MCPClient()

    # Define the prompt and the tool call to forward to the server
    prompt = "Scrape recent posts from these platforms: " + ", ".join(platforms)
    tool_call = ToolCall("InstagramScraper", {"queries": ["#marketing"]})

    # Send the request to the MCP Server and print the response
    response = mcp_client.send_request(prompt, [tool_call])
    print(response)
```
An AI content generation system can use the MCP Server to seamlessly integrate with various text processing tools. By dynamically selecting tools based on user queries, the server ensures that the content is optimized for specific contexts and platforms.
```python
# Example integration code (illustrative: MCPClient and ToolCall are
# placeholder wrappers, not a published SDK)
def generate_optimized_content(prompt):
    # Initialize the MCP Server client
    mcp_client = MCPClient()

    # Define the tool call for the text-generation tool
    tool_call = ToolCall("GPT-3", {"prompt": prompt})

    # Send the request to the MCP Server and print the response
    response = mcp_client.send_request(prompt, [tool_call])
    print(response)
```
The MCP Client provides an interactive chat interface for testing and integrating AI applications. It connects to the MCP Server over SSE, allowing real-time interaction and dynamic tool usage.
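As a rough illustration of what that chat interface does, the sketch below reads user input in a loop and prints whatever the server streams back; `send_message()` is a hypothetical helper standing in for the client's actual SSE transport.

```python
# A minimal sketch of an interactive chat loop. send_message() is a
# hypothetical helper that forwards the user's text to the MCP Server and
# yields streamed response chunks (the real client ships its own chat UI).
def chat_loop(send_message):
    while True:
        user_input = input("you> ").strip()
        if user_input.lower() in {"exit", "quit"}:
            break
        for chunk in send_message(user_input):
            print(chunk, end="", flush=True)
        print()
```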
For custom configurations, refer to the following sample code:
```json
{
  "mcpServers": {
    "[server-name]": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-[name]"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
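Replace `[server-name]` and the package name with the server you want to run; for Claude Desktop, for example, this block typically lives in its `claude_desktop_config.json` file.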
The MCP Server supports multiple AI applications and tools. The following compatibility matrix details the status of integration.
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
Advanced configurations include setting up API keys, configuring environment variables, and specifying tool parameters. The server supports secure connections through HTTPS or WebSocket to ensure data privacy. A typical environment configuration looks like this:
```bash
APIFY_TOKEN=YOUR_APIFY_TOKEN
LLM_PROVIDER_API_KEY=YOUR_LLM_API_KEY
MCP_URL=https://example.mcp.server.com/sse
SYSTEM_PROMPT="Your initial context here..."
```
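A server or client process typically reads these values from the environment at startup. The sketch below shows one way to do that in Python; `python-dotenv` is an assumed, optional dependency for loading the `.env` file, and the fallback URL is an assumption based on the defaults above.

```python
import os

# Optional: load variables from a local .env file (assumes the third-party
# python-dotenv package; exporting the variables in the shell works too).
try:
    from dotenv import load_dotenv
    load_dotenv()
except ImportError:
    pass

apify_token = os.environ["APIFY_TOKEN"]               # required
llm_api_key = os.environ["LLM_PROVIDER_API_KEY"]      # required
mcp_url = os.getenv("MCP_URL", "http://localhost:3000/sse")  # assumed default
system_prompt = os.getenv("SYSTEM_PROMPT", "")
```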
**How does the MCP Server ensure data privacy?** Connections run over HTTPS or secure WebSocket, so all interactions are encrypted in transit.

**Can I integrate custom tools with the MCP Server?** Yes. The server supports dynamic tool calls based on context and user queries, and custom tools can be integrated by defining them in your configuration.

**How does the Pay-per-event pricing model work?** You pay for specific events such as Actor start, running time, and query responses. Detailed guidelines are provided in the documentation.

**Can I modify the MCP Client's source code?** Yes. The client is open source, and you can review or modify the code to suit your needs.

**What tools does the MCP Server support out of the box?** The server supports a wide range of tools, including web scraping, data extraction, and natural language processing (NLP). See the compatibility matrix above for client support details.
By leveraging the Model Context Protocol, developers can build powerful and flexible AI applications with seamless tool integration.