Containerized Manim environment with API and AI support for creating mathematical animations
Manim MCP Server is a Docker-based environment for creating mathematical animations in an isolated, reproducible setting, offering both command-line interface (CLI) access and a web API. It combines the Manim library for rendering animations with a FastAPI RESTful API that adheres to the Model Context Protocol (MCP), enabling seamless integration with AI applications such as Claude Desktop, Continue, and Cursor.
Manim MCP Server provides several key capabilities, described in the sections that follow.
The following Mermaid diagram illustrates how the Manim MCP Server interacts with MCP clients such as Claude Desktop:
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
The table below outlines the current state of integration with various MCP clients:
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
This matrix highlights which features each MCP client supports, including resource management, tool integration, and prompt generation.
The architecture of the Manim MCP Server is built around a modular design, where different components can plug into the main server. The protocol itself is a layer that sits between the AI application (client) and the underlying infrastructure, enabling smooth communication without needing to understand low-level details.
A more detailed flow diagram showcases how data travels within this architecture:
```mermaid
graph TD
    A[AI Application] --> B{API Request}
    B --> C[MCP Server]
    C --> D[MCP Handler]
    D --> E[Leveraging Manim Library]
    E --> F{Rendering Animation}
    F --> G[Save to Output Directory]
    C --> H[MCP Client Response Construction]
    H --> I[Acknowledgement/Result to Client]
```
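As a rough illustration of this flow, the sketch below shows a minimal FastAPI endpoint that accepts a render request and shells out to the Manim CLI. This is not the server's actual implementation; the endpoint path and parameter names simply mirror the `/run-manim` example later in this document, and output handling is simplified.

```python
# Illustrative sketch only -- not the real server code. The endpoint path
# and query parameters mirror the /run-manim example in this document.
import subprocess
from fastapi import FastAPI

app = FastAPI()

@app.post("/run-manim")
def run_manim(filepath: str, scene_name: str, quality: str = "1080p"):
    # Render the requested scene with the Manim CLI inside the container.
    # The quality parameter is assumed to be translated to a CLI flag
    # (see the quality settings section below); -qh is used here as a default.
    result = subprocess.run(
        ["manim", "-qh", filepath, scene_name],
        capture_output=True,
        text=True,
    )
    # Construct a response for the MCP client: an acknowledgement plus the
    # tail of Manim's log output for debugging.
    return {"returncode": result.returncode, "log": result.stderr[-500:]}
```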
To install and deploy the Manim MCP Server, either pull the prebuilt image or build it from source:

```bash
# Option 1: pull the prebuilt image and start the services
docker pull wstcpyt/manim-docker-mcp:latest
docker compose up -d

# Option 2: build the image from source
git clone https://github.com/YOUR_USERNAME/manim-docker-mcp.git
cd manim-docker-mcp
docker compose build
```
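Once the containers are up, a quick way to confirm the API is reachable is to request FastAPI's interactive documentation page, which FastAPI serves at `/docs` by default (assuming that route has not been disabled in this image):

```python
# Quick reachability check against the running container. FastAPI exposes
# interactive API docs at /docs by default; this assumes the default route
# is enabled in the image.
import urllib.request

with urllib.request.urlopen("http://localhost:8000/docs") as resp:
    print("API reachable, HTTP status:", resp.status)
```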
Imagine an educational tool where users describe scenes through natural-language queries. Each description is turned into a visual animation with Manim, and the rendering is managed via the Manim MCP Server's REST API.
```python
from manim import *

class HelloScene(Scene):
    def construct(self):
        text = Text("Hello from AI")
        self.play(Write(text))
```
A user might input something like "Create a scene showing 'Hello from AI'", and the assistant would generate a scene such as the one above and submit it to the server for rendering.
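A sketch of the resulting API call follows; the container-side file path and file name are hypothetical, while the endpoint and query parameters mirror the curl example later in this document:

```python
# Hypothetical client call: submit the generated scene for rendering.
# The endpoint and query parameters follow the curl example in this
# document; the container-side file path is an assumption.
import requests

response = requests.post(
    "http://localhost:8000/run-manim",
    params={
        "filepath": "/manim/temp/hello.py",  # where the generated scene was saved
        "scene_name": "HelloScene",
        "quality": "1080p",
    },
)
print(response.status_code, response.text)
```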
For a financial analysis tool, an AI assistant could describe a chart displaying historical stock prices. The server would then process this request and provide a rendered video output.
```python
from manim import *

class StockPrices(Scene):
    def construct(self):
        # Table entries are strings, since Manim renders them as Text mobjects
        table = Table([["AAPL", "147"], ["GOOGL", "295"]])
        self.play(FadeIn(table))
```
A user might input: "Create an animated chart showing Apple and Google stock prices over a year."
```bash
curl -X POST "http://localhost:8000/run-manim?filepath=/manim/temp/example.py&scene_name=StockPrices&quality=1080p"
```
This request triggers the server to render the StockPrices scene from the specified script and return the resulting animation.
The Manim MCP Server supports interoperability with various AI applications by adhering to the Model Context Protocol. This protocol ensures that any compatible client can interact with the server without deep knowledge of how specific tools (like Manim) are implemented internally. Here’s a conceptual representation:
```json
{
  "mcpServers": {
    "manim-mcp-server": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-manim"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
Rendering performance can be tuned through Manim's quality flags, while the server remains compatible with multiple AI clients:

Quality Settings (the sketch after this list shows how these flags might map onto an API request):
- `-ql`: Quick, 480p at 15 fps
- `-qm`: Medium, 720p at 30 fps
- `-qh`: High, 1080p at 60 fps
- `-qk`: Ultra, 1440p at 60 fps

Client Compatibility: see the integration matrix above for the features each MCP client supports.
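As a purely hypothetical illustration, the sketch below maps the API's quality parameter (as used in the curl example above) onto these CLI flags; the mapping and helper function are assumptions, not the server's actual code.

```python
# Hypothetical mapping from the API's quality parameter to Manim CLI flags.
# The values follow the quality settings listed above; the real server may
# resolve quality differently.
QUALITY_FLAGS = {
    "480p": "-ql",   # quick preview, 15 fps
    "720p": "-qm",   # medium, 30 fps
    "1080p": "-qh",  # high, 60 fps
    "1440p": "-qk",  # ultra, 60 fps
}

def manim_command(filepath: str, scene_name: str, quality: str = "1080p") -> list[str]:
    """Build a Manim CLI invocation for the given scene and quality."""
    return ["manim", QUALITY_FLAGS[quality], filepath, scene_name]

print(manim_command("/manim/temp/example.py", "StockPrices", quality="1080p"))
# ['manim', '-qh', '/manim/temp/example.py', 'StockPrices']
```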
Manim MCP Server supports advanced configuration through the Docker Compose file:
```yaml
version: "3.8"
services:
  manim-api:
    build: .
    ports:
      - "8000:8000"  # Exposes the API on port 8000
    volumes:
      - ./animations:/manim/animations  # Mounts the local script directory
```
Security measures include API key authentication, configured through the `API_KEY` environment variable shown in the MCP configuration above, and the process isolation that comes from running Manim inside a container.
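As a client-side sketch, the snippet below reads the API key from an environment variable instead of hardcoding it; the header name is an assumption, so consult the server's configuration for the actual authentication scheme.

```python
# Hypothetical client-side handling of the API key: read it from the
# environment rather than hardcoding it. The header name used here is an
# assumption; consult the server configuration for the actual scheme.
import os
import requests

api_key = os.environ["API_KEY"]  # matches the env entry in the MCP config above

response = requests.post(
    "http://localhost:8000/run-manim",
    params={"filepath": "/manim/temp/example.py", "scene_name": "StockPrices", "quality": "1080p"},
    headers={"X-API-Key": api_key},  # hypothetical header name
)
print(response.status_code)
```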
Q: What if I encounter an issue while running animations?
Q: How can I securely manage API keys in production?
Q: Can multiple AI clients connect simultaneously?
Q: Is there a minimum setup cost for deploying this solution on cloud servers?
Q: What happens when API requests are too frequent or large in size?
In summary, Manim MCP Server is a powerful tool for integrating AI applications with animation creation, combining interoperability with flexibility.