Slack MCP Server enables seamless Slack workspace integration with secure, permission-free message retrieval and channel management
The Slack Model Context Protocol (MCP) server functions as a universal adapter, enabling AI applications to connect to and interact with Slack workspaces through a standardized protocol. The integration supports both Stdio and SSE transports, simplifies proxy settings, and requires no bots or permissions to be created or approved by workspace admins, making it a straightforward choice for any organization looking to integrate AI applications with Slack.
A core feature of the Slack MCP Server is real-time data exchange between AI applications and Slack. Using the Stdio or SSE transport mechanisms, the server facilitates instantaneous communication, making it well suited to interactive, responsive applications such as chatbots, workflow tools, and analytics dashboards.
The Slack MCP Server is architected around the Model Context Protocol standard. The architecture is as follows:
graph TD
A[AI Application] -->|MCP Client| B[MCP Protocol]
B --> C[MCP Server]
C --> D[Data Source/Tool]
style A fill:#e1f5fe
style C fill:#f3e5f5
style D fill:#e8f5e8
Applied to Slack, the same flow routes requests through the MCP server to the Slack API:
graph LR
A[AI Application] --> B[MCP Protocol]
B --> C(MCP Server)
C --> D{Slack API}
D --> E[Data Source/Tool]
style A fill:#b2e7e6
style C fill:#f3e5f5
style E fill:#e8f5e8
The Slack MCP Server can be installed using either npx or Docker. Here are detailed steps to get it up and running.
For a quick installation, you can use npx, which ships with Node.js. This method is ideal for a fast setup without installing additional dependencies.
{
  "mcpServers": {
    "slack": {
      "command": "npx",
      "args": [
        "-y",
        "slack-mcp-server@latest",
        "--transport",
        "stdio"
      ],
      "env": {
        "SLACK_MCP_XOXC_TOKEN": "xoxc-...",
        "SLACK_MCP_XOXD_TOKEN": "xoxd-..."
      }
    }
  }
}
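To verify the server launches outside of an MCP client, you can run the equivalent command directly in a terminal (a minimal sketch built from the same command, arguments, and environment variables as the configuration above; the token values are placeholders):

# Export the same credentials the client configuration passes via "env".
export SLACK_MCP_XOXC_TOKEN=xoxc-...
export SLACK_MCP_XOXD_TOKEN=xoxd-...

# Launch the server over stdio; it will wait for MCP messages on standard input.
npx -y slack-mcp-server@latest --transport stdio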
For a more robust setup, running the Slack MCP Server in a Docker container is recommended. This ensures that all dependencies are managed within an isolated environment.
export SLACK_MCP_XOXC_TOKEN=xoxc-...
export SLACK_MCP_XOXD_TOKEN=xoxd-...
docker pull ghcr.io/korotovsky/slack-mcp-server:latest
docker run -i --rm \
  -e SLACK_MCP_XOXC_TOKEN \
  -e SLACK_MCP_XOXD_TOKEN \
  ghcr.io/korotovsky/slack-mcp-server:latest --transport stdio
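The same image can also be run in SSE mode so that clients connect over the network rather than stdio. The published port below (13080) is an assumption for illustration; check the image's documentation for the port it actually listens on:

# Run detached in SSE mode and publish the (assumed) SSE port to the host.
docker run -d --rm \
  -e SLACK_MCP_XOXC_TOKEN \
  -e SLACK_MCP_XOXD_TOKEN \
  -p 13080:13080 \
  ghcr.io/korotovsky/slack-mcp-server:latest --transport sse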
Alternatively, you can use nginx or ngrok to expose the server over HTTPS if your setup requires it.
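For example, with the server running locally in SSE mode, ngrok can tunnel it to a public HTTPS URL (a minimal sketch; 13080 is the same illustrative port used above and should match whatever port your server actually listens on):

# Expose the local SSE endpoint over a public HTTPS URL.
ngrok http 13080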
A chatbot integrated with the Slack MCP Server can provide quick and accurate responses by processing user queries through an AI model. This interaction is facilitated via the MCP protocol, ensuring seamless data flow between the bot and the user interface within Slack.
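As an illustration of how such a chatbot backend could talk to the server, here is a minimal TypeScript sketch using the MCP TypeScript SDK (@modelcontextprotocol/sdk). It launches the server over stdio the same way a desktop client would; the tool name conversations_history and its argument shape are assumptions for illustration, so list the server's tools first and adjust to what it actually exposes.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the Slack MCP Server as a child process, as a desktop MCP client would.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "slack-mcp-server@latest", "--transport", "stdio"],
    env: {
      SLACK_MCP_XOXC_TOKEN: process.env.SLACK_MCP_XOXC_TOKEN ?? "",
      SLACK_MCP_XOXD_TOKEN: process.env.SLACK_MCP_XOXD_TOKEN ?? "",
    },
  });

  const client = new Client({ name: "slack-chatbot", version: "1.0.0" });
  await client.connect(transport);

  // Discover which tools the server exposes before calling any of them.
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  // Hypothetical call: fetch recent channel messages so the bot can answer a user query.
  // Replace the tool name and arguments with the ones reported by listTools().
  const result = await client.callTool({
    name: "conversations_history",
    arguments: { channel_id: "#general", limit: "10" },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);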
An automated ticketing system that leverages the Slack MCP Server can handle incoming requests from users directly in a Slack channel. The server processes these requests and delegates tasks to an appropriate backend service for resolution, enhancing efficiency and reducing manual intervention.
The Slack MCP Server is compatible with various AI applications that support the Model Context Protocol:
| MCP Client | Resources | Tools | Prompts |
| --- | --- | --- | --- |
| Claude Desktop | ✅ | ✅ | ✅ |
| Continue | ✅ | ✅ | ✅ |
| Cursor | ❌ | ✅ | ❌ |
The following table outlines the configuration and performance considerations for different environments:
| Configuration | Stdio | SSE |
| --- | --- | --- |
| Transport Protocols | Full Support | Limited Support |
| Customization Options | Yes | Some Settings Allowed |
You can configure the Slack MCP Server using command line arguments:
--transport <Required> Select transport for the MCP Server, possible values are: 'stdio', 'sse'
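For example, using the npx launch shown earlier:

# Run over stdio (typical when an MCP client spawns the server as a subprocess).
npx -y slack-mcp-server@latest --transport stdio

# Run over SSE (for clients that connect to the server over HTTP).
npx -y slack-mcp-server@latest --transport sse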
Key environment variables to set include:

SLACK_MCP_XOXC_TOKEN: authentication data token (xoxc-...) taken from the token field of Slack's POST data field-set
SLACK_MCP_XOXD_TOKEN: workspace session cookie token (xoxd-...) used alongside the XOXC token for API calls
API_KEY: your application's own API key, if additional authorization is required

Does the Slack MCP Server work with AI clients other than Claude Desktop? Yes, it supports a wide range of AI clients that adhere to the Model Context Protocol.
How is data kept secure? Data is encrypted in transit using secure transport protocols such as SSL/TLS. Additionally, you should follow best practices for securing API keys and other sensitive information.
How much load can the server handle? The exact capacity depends on the server's hardware configuration and resource allocation; you may need to scale horizontally or vertically based on your usage patterns.
Can I integrate my own tools? Yes, you can integrate any tool that supports standard protocols, or tools with customized configurations to fit your needs.
Which platforms are supported? The server is platform-agnostic and works across any platform that supports the Model Context Protocol.
If you are interested in contributing to the Slack MCP Server, please review our contribution guidelines. We welcome contributions from both developers and users to improve this essential tool for integrating AI applications with Slack.
Explore the broader ecosystem of products that support the Model Context Protocol to find other integrations and tools that can complement your AI workflows. The community around these protocols is active, providing extensive documentation, tutorials, and best practices.
By leveraging the Slack MCP Server, you equip your organization with a powerful framework for integrating AI applications into daily operations, enhancing productivity, and driving innovation through seamless data flow and efficient task automation.