Discover how to integrate LangChain with MCP adapters for multi-server math and weather queries in Python
The Math and Weather Integration MCP Server demonstrates how LangChain can be combined, through MCP adapters, with multiple Model Context Protocol (MCP) servers to handle both mathematical calculations and weather queries. The project shows how an AI application can delegate work to MCP servers, covering tasks that range from basic arithmetic to looking up current weather conditions.
It runs two MCP servers: a Math Server that performs mathematical calculations and a Weather Server that retrieves current weather information. Because MCP is a standardized protocol, both servers expose their capabilities to the AI application in the same way, whatever data source or tool sits behind them.
The project uses LangChain to manage the AI workflow inside the Python scripts: an OpenAI model generates the natural-language responses, while MCP supplies the tools the model calls to answer math and weather requests.
The client code is asynchronous, so multiple queries and tool calls can run concurrently without blocking one another. This matters for performance and responsiveness once several servers and requests are in play.
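As a rough sketch of how these pieces might fit together, the snippet below wires both servers into a single LangChain agent and runs two queries concurrently. It assumes the langchain-mcp-adapters, langgraph, and langchain-openai packages, an OPENAI_API_KEY in the environment, a recent adapters release where MultiServerMCPClient.get_tools() can be awaited directly on the client, and the math_server.py / weather_server.py script names used in this project; the model choice is also an assumption.

```python
# Illustrative multi-server sketch, not the repository's exact langchain_client.py.
# Assumes: langchain-mcp-adapters (recent release), langgraph, langchain-openai,
# and OPENAI_API_KEY set in the environment.
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent


async def main() -> None:
    client = MultiServerMCPClient(
        {
            # Each entry describes how to reach one MCP server.
            "math": {
                "command": "python",
                "args": ["math_server.py"],
                "transport": "stdio",
            },
            "weather": {
                "command": "python",
                "args": ["weather_server.py"],
                "transport": "stdio",
            },
        }
    )

    # Collect the tools exposed by both servers into one LangChain tool list.
    tools = await client.get_tools()
    agent = create_react_agent("openai:gpt-4o-mini", tools)

    # asyncio lets independent queries run concurrently instead of serially.
    math_task = agent.ainvoke({"messages": "What is (3 + 5) * 12?"})
    weather_task = agent.ainvoke({"messages": "What is the weather in Berlin?"})
    math_reply, weather_reply = await asyncio.gather(math_task, weather_task)

    print(math_reply["messages"][-1].content)
    print(weather_reply["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(main())
```

Because the two queries are awaited through asyncio.gather, the math and weather requests proceed in parallel rather than one after the other.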
The server configuration can be customized using environment variables, providing flexibility for setting up the application based on specific requirements or deployment scenarios.
The repository includes one script per server: math_server.py for the math tools and weather_server.py for the weather tools. A minimal sketch of the math server is shown below; weather_server.py follows the same pattern.
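The repository's server code is not reproduced here, but a minimal math server built on the FastMCP helper from the official MCP Python SDK could look roughly like this; the "Math" label and the tool names are illustrative assumptions.

```python
# math_server.py -- illustrative sketch, not the repository's exact code.
# Assumes the official MCP Python SDK (`pip install mcp`) and its FastMCP helper.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math")


@mcp.tool()
def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b


@mcp.tool()
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b


if __name__ == "__main__":
    # Speak the MCP protocol over stdio so a local client can spawn this script.
    mcp.run(transport="stdio")
```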
To set up and run the Math and Weather Integration MCP Server, follow these steps:
1. Clone the repository:
   git clone https://github.com/your-repo-url.git
2. Create and activate a virtual environment:
   python -m venv .venv
   source .venv/bin/activate  # On Windows: .venv\Scripts\activate
3. Install dependencies:
   pip install -r requirements.txt
4. Configure environment variables: create a .env file in the root directory and add your OpenAI API key:
   OPENAI_API_KEY=your_api_key_here
The Math and Weather Integration MCP Server can be used to integrate real-time weather data into various AI workflows. For instance, developers can build predictive models that require up-to-date climate information to generate accurate forecasts.
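As with the math server, the project's actual weather code is not shown here. A hedged sketch of what weather_server.py might expose is below, with the live weather lookup stubbed out so no particular external API is assumed.

```python
# weather_server.py -- illustrative sketch; the real project may differ.
# Assumes the official MCP Python SDK's FastMCP helper, as in the math sketch.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Weather")


@mcp.tool()
async def get_weather(city: str) -> str:
    """Return the current weather for a city.

    A real implementation would call a weather API here; this stub returns
    canned text so the end-to-end MCP wiring can be tested offline.
    """
    return f"It is currently sunny and 22°C in {city}."


if __name__ == "__main__":
    # stdio keeps the example self-contained; an HTTP transport could be used
    # instead if several clients need to share one running weather server.
    mcp.run(transport="stdio")
```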
Hosting mathematical operations on a dedicated server lets the main application offload complex computations without degrading its own performance or user experience. This setup suits scenarios where calculations must happen in real time.
The Math and Weather Integration MCP Server works with a range of MCP clients. The diagram below shows the general MCP architecture, and the table that follows summarizes client support:
graph TD
A[AI Application] -->|MCP Client| B[MCP Protocol]
B --> C[MCP Server]
C --> D[Data Source/Tool]
style A fill:#e1f5fe
style C fill:#f3e5f5
style D fill:#e8f5e8
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
Here's an example mcpServers configuration block for registering the two servers with an MCP client:
{
"mcpServers": {
"math_server": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-math"],
"env": {
"API_KEY": "your-api-key"
}
},
"weather_server": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-weather"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
Environment variables such as OPENAI_API_KEY can be used to configure the application without hard-coding secrets. Ensure that sensitive information such as API keys is stored securely and kept out of version control.
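One common way to keep the key out of source code is to load the .env file at startup. The sketch below assumes the python-dotenv package, which is not confirmed by this project's requirements.

```python
# Illustrative sketch: load secrets from .env instead of hard-coding them.
# Assumes python-dotenv is installed (`pip install python-dotenv`).
import os

from dotenv import load_dotenv

load_dotenv()  # reads the .env file from the current working directory

api_key = os.environ.get("OPENAI_API_KEY")
if not api_key:
    raise RuntimeError("OPENAI_API_KEY is not set; add it to your .env file.")
```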
How do I set up the Math and Weather Integration MCP Server?
Can this server handle complex mathematical operations without impacting performance?
Is there support for other AI applications besides Claude Desktop, Continue, and Cursor?
What is the difference between the single-server implementation (main.py) and the multi-server implementation (langchain_client.py)?
How can I optimize performance when running multiple servers together?
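This page does not reproduce main.py or langchain_client.py, but the single-server versus multi-server distinction the last two questions touch on usually comes down to this: a single-server script opens one stdio session and loads that server's tools, while the multi-server client (as in the earlier sketch) manages sessions for several servers at once. A hedged single-server sketch, with the script name, model, and prompt as assumptions:

```python
# Illustrative single-server sketch, contrasting with the multi-server client above.
# Assumes the mcp SDK, langchain-mcp-adapters, langgraph, and langchain-openai.
import asyncio

from langchain_mcp_adapters.tools import load_mcp_tools
from langgraph.prebuilt import create_react_agent
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # One server, one stdio session: simpler wiring, but math tools only.
    server_params = StdioServerParameters(command="python", args=["math_server.py"])
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await load_mcp_tools(session)
            agent = create_react_agent("openai:gpt-4o-mini", tools)
            reply = await agent.ainvoke({"messages": "What is (3 + 5) * 12?"})
            print(reply["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(main())
```

For performance when several servers are involved, the multi-server client combined with asyncio.gather (shown earlier) is usually the simpler route to running math and weather queries concurrently.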
To contribute to this project, follow the contribution guidelines in the repository. Beyond that, the broader MCP ecosystem offers additional resources, tools, and integrations worth exploring.
By leveraging this Math and Weather Integration MCP Server, developers can enhance their AI application's capabilities while maintaining a clean and efficient architecture.