WeCom Bot MCP Server supports multiple message types, @mentions, and integrates easily with Python for automation
The WeCom Bot MCP Server is an advanced implementation of the Model Context Protocol (MCP) designed specifically for integrating AI applications with WeCom (WeChat Work). This server acts as a bridge, enabling seamless communication between various AI tools and the WeCom platform. By adhering to the MCP standard, it ensures that different AI clients can interoperate with a unified protocol, making the integration process both efficient and reliable.
The WeCom Bot MCP Server offers a variety of features tailored to the needs of AI developers: support for multiple message types (text, markdown, image, file), @mention functionality, and a comprehensive logging mechanism. It also ships with full type annotations and Pydantic-based data validation, so message payloads are checked for structure before they are sent.
The server handles a wide range of message types, including text, markdown, images, and files.
The @mention feature allows targeted notifications within a message: users can be mentioned by user ID or by phone number.
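Under the hood, a mention maps onto the standard WeCom group-bot webhook payload, where mention fields sit inside the `text` object. A minimal sketch of that payload shape (`build_text_payload` is a hypothetical helper for illustration, not part of this server's API):

```python
def build_text_payload(content, mentioned_list=None, mentioned_mobile_list=None):
    """Build a WeCom group-bot webhook 'text' payload with optional mentions."""
    text = {"content": content}
    if mentioned_list:
        # User IDs to @mention; "@all" mentions everyone in the group
        text["mentioned_list"] = mentioned_list
    if mentioned_mobile_list:
        # Phone numbers to @mention, for users whose IDs are unknown
        text["mentioned_mobile_list"] = mentioned_mobile_list
    return {"msgtype": "text", "text": text}
```

The server's `mentioned_list` parameter corresponds directly to this field in the webhook request body.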
A configurable logging system ensures traceability and ease of debugging. The server leverages `platformdirs.user_log_dir()` to manage log files across operating systems, storing them in OS-specific directories such as `C:\Users\<username>\AppData\Local\hal\wecom-bot-mcp-server` on Windows or `~/Library/Application Support/hal/wecom-bot-mcp-server` on macOS.
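As a rough stdlib-only illustration of the per-OS paths listed above (a sketch only; the server itself calls `platformdirs.user_log_dir()`, and `default_log_dir` is a hypothetical helper):

```python
import os
import sys


def default_log_dir(app: str = "wecom-bot-mcp-server", author: str = "hal") -> str:
    """Return a per-OS log directory mirroring the documented paths."""
    if sys.platform == "win32":
        # C:\Users\<username>\AppData\Local\hal\wecom-bot-mcp-server
        base = os.environ.get("LOCALAPPDATA") or os.path.expanduser(r"~\AppData\Local")
        return os.path.join(base, author, app)
    if sys.platform == "darwin":
        # ~/Library/Application Support/hal/wecom-bot-mcp-server
        return os.path.expanduser(os.path.join("~/Library/Application Support", author, app))
    # Linux and other POSIX systems: fall back to the XDG data directory
    base = os.environ.get("XDG_DATA_HOME") or os.path.expanduser("~/.local/share")
    return os.path.join(base, author, app)
```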
With full type annotations and Pydantic-based validation, the server enforces robust data handling, reducing errors and improving the overall reliability of AI application interactions.
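A sketch of what Pydantic-based validation can look like for such a server (a hypothetical model for illustration, not the server's actual schema):

```python
from typing import Literal

from pydantic import BaseModel, Field, ValidationError


class Message(BaseModel):
    """Hypothetical message schema: content is required, msg_type is constrained."""

    content: str = Field(min_length=1)
    msg_type: Literal["text", "markdown", "image", "file"] = "text"
    mentioned_list: list[str] = Field(default_factory=list)
```

With a model like this, an unsupported `msg_type` or empty `content` raises a `ValidationError` before any request reaches the WeCom API.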
The WeCom Bot MCP Server is built to comply with the MCP protocol, ensuring compatibility across various clients. Its design allows seamless, dynamic interaction between AI applications and services hosted on WeCom through a single, well-defined interface.
To get started, developers have multiple installation options available.

Via Smithery (for Claude):

```bash
npx -y @smithery/cli install wecom-bot-mcp-server --client claude
```

Or via pip:

```bash
pip install wecom-bot-mcp-server
```
For Windsurf configuration:

```json
{
  "mcpServers": {
    "wecom": {
      "command": "uvx",
      "args": ["wecom-bot-mcp-server"],
      "env": {
        "WECOM_WEBHOOK_URL": "your-webhook-url"
      }
    }
  }
}
```
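Other MCP clients that read an `mcpServers` map, such as Claude Desktop via its `claude_desktop_config.json`, can reuse the same entry (shown here as an assumption about your client's configuration format):

```json
{
  "mcpServers": {
    "wecom": {
      "command": "uvx",
      "args": ["wecom-bot-mcp-server"],
      "env": {
        "WECOM_WEBHOOK_URL": "your-webhook-url"
      }
    }
  }
}
```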
The WeCom Bot MCP Server excels in various use cases within AI workflows, providing a robust platform for seamless integrations. Here are two practical examples:
```python
import asyncio

import mcp


# Send a weather update as a markdown message
async def send_weather():
    return await mcp.send_message(
        content="Shenzhen Weather:\n- Temperature: 25°C\n- Weather: Sunny\n- Air Quality: Good",
        msg_type="markdown",
    )


# Send a meeting reminder and @mention two users by ID
async def send_reminder():
    return await mcp.send_message(
        content="## Project Review Meeting Reminder\n\nTime: Today 3:00 PM\nLocation: Meeting Room A\n\nPlease be on time!",
        msg_type="markdown",
        mentioned_list=["zhangsan", "lisi"],
    )


# Both coroutines can be driven with, e.g., asyncio.run(send_weather())
```
The WeCom Bot MCP Server provides a broad MCP client compatibility matrix:
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
This ensures that the server can be effectively integrated with various AI tools, enhancing their functionality and user experience.
Support for core capabilities varies by operating system. Here's an overview:
| Environment | API Key Handling | Message Queuing | Event Triggers |
|---|---|---|---|
| Windows | Fully Supported | Partial | Limited |
| Linux | Partial Support | Full | Full |
| macOS | Limited Support | Full | Full |
Advanced configuration is handled through environment variables:

```powershell
# On Windows PowerShell
$env:WECOM_WEBHOOK_URL = "your-webhook-url"
$env:MCP_LOG_LEVEL = "DEBUG"  # Log levels: DEBUG, INFO, WARNING, ERROR, CRITICAL
$env:MCP_LOG_FILE = "path/to/custom/log/file.log"
```
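As an illustration of how a server could consume `MCP_LOG_LEVEL` (a hypothetical helper sketch, not the server's actual implementation):

```python
import logging
import os


def resolve_log_level(default: str = "INFO") -> int:
    """Translate the MCP_LOG_LEVEL environment variable into a logging constant.

    Unknown level names fall back to INFO rather than raising.
    """
    name = os.environ.get("MCP_LOG_LEVEL", default).upper()
    return getattr(logging, name, logging.INFO)
```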
Q: How does the WeCom Bot MCP Server integrate with different AI clients?
Q: Can I customize message templates in real-time?
Q: Is there a limit to the number of concurrent requests supported by the server?
Q: How can I secure my environment variable settings?
Q: Are there any known limitations across operating systems?
For developers interested in contributing to the WeCom Bot MCP Server, the following steps outline the process:

```bash
git clone https://github.com/your-repo.git
cd wecom-bot-mcp-server
pip install -e .
```
The WeCom Bot MCP Server is part of a larger ecosystem that includes other MCP clients and tools. Developers can explore documentation, community support, and additional resources to further enhance their projects.
By leveraging the WeCom Bot MCP Server, AI application developers can easily integrate and scale their solutions on the WeCom platform while ensuring compatibility with various external systems.
This documentation covers the technical details, real-world use cases, and contribution guidelines developers need to build MCP-compliant integrations with WeCom.