Learn about MemGPT MCP server for seamless LLM chatting with memory management and multi-provider support
MemGPT is an MCP server written in TypeScript that provides a robust memory system for interactions between large language models (LLMs) and their users. By implementing the Model Context Protocol (MCP), it integrates with AI applications to maintain conversation history and lets users switch models and providers without disrupting the conversation.
The MemGPT MCP Server supports multiple LLM providers, letting users switch between platforms such as OpenAI, Anthropic, OpenRouter, and Ollama. It provides the following tools:
- `chat`: sends a message to the currently selected LLM provider (OpenAI, Anthropic, OpenRouter, or Ollama).
- `get_memory`: retrieves conversation history from stored memories. An optional `limit` parameter controls how many memories are returned; passing `{ "limit": null }` retrieves all stored memories.
- `clear_memory`: completely clears the conversation history, removing all stored memories.
- `use_provider`: switches between LLM providers; the provider selection is persisted.
- `use_model`: switches to a different model within the current provider. Supported models span several providers:
Anthropic Claude models:

- `claude-3-haiku`: fast response times, suitable for customer support and content moderation.
- `claude-3-sonnet`: balanced performance for general-purpose use.
- `claude-3-opus`: advanced model ideal for complex reasoning and high-performance tasks.
- `claude-3.5-haiku`: enhanced speed and cost-effectiveness.
- `claude-3.5-sonnet`: superior performance, including computer-interaction capabilities.

OpenAI: models available through the OpenAI API.

OpenRouter: any model specified as `provider/model` (e.g., `openai/gpt-4`, `anthropic/claude-2`).

Ollama: any locally available model, such as `llama2` or `codellama`.

All of these selections persist across sessions.
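To illustrate how a client might drive these tools, the sketch below uses the MCP TypeScript SDK to connect to the server over stdio and walk through a typical sequence (retrieve memory, switch models, chat). The server path and the argument names `message` and `model` are illustrative assumptions; only `limit` is documented above, and the exact result shapes returned by the server may differ.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the MemGPT server over stdio (path is illustrative).
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/path/to/memgpt-server/build/index.js"],
  });

  const client = new Client({ name: "memgpt-example", version: "1.0.0" });
  await client.connect(transport);

  // Pull back stored conversation history; `{ "limit": null }` returns everything.
  const memories = await client.callTool({
    name: "get_memory",
    arguments: { limit: null },
  });
  console.log(memories);

  // Switch to a more capable model before the next exchange.
  await client.callTool({
    name: "use_model",
    arguments: { model: "claude-3-opus" },
  });

  // Send a message through the currently selected provider and model.
  const reply = await client.callTool({
    name: "chat",
    arguments: { message: "Summarize our previous conversation." },
  });
  console.log(reply);
}

main().catch(console.error);
```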
MemGPT is built in TypeScript and implements the Model Context Protocol (MCP) to ensure seamless communication between the server and AI applications. The core architecture follows the standard MCP client-server pattern.
The following diagram illustrates the flow of requests and responses between an AI application, the MemGPT server, and backend tools or data sources:
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
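To make the pattern in the diagram concrete, here is a minimal sketch of how an MCP server like MemGPT can expose its tools using the MCP TypeScript SDK. It illustrates the architecture only and is not the project's actual source: the tool names come from this article, while the schemas, in-memory store, and handler logic are simplified assumptions.

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

// In-memory conversation store standing in for MemGPT's persistent memory.
const memory: { role: string; content: string }[] = [];

const server = new Server(
  { name: "memgpt-sketch", version: "0.1.0" },
  { capabilities: { tools: {} } }
);

// Advertise the tools a client can call.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "chat",
      description: "Send a message to the current LLM provider",
      inputSchema: { type: "object", properties: { message: { type: "string" } }, required: ["message"] },
    },
    {
      name: "get_memory",
      description: "Retrieve conversation history",
      inputSchema: { type: "object", properties: { limit: { type: ["number", "null"] } } },
    },
    {
      name: "clear_memory",
      description: "Clear all stored memories",
      inputSchema: { type: "object", properties: {} },
    },
  ],
}));

// Dispatch tool calls against the memory store (provider calls omitted for brevity).
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const { name, arguments: args } = request.params;
  if (name === "get_memory") {
    const limit = (args as { limit?: number | null })?.limit;
    const slice = limit == null ? memory : memory.slice(-limit);
    return { content: [{ type: "text", text: JSON.stringify(slice) }] };
  }
  if (name === "clear_memory") {
    memory.length = 0;
    return { content: [{ type: "text", text: "Memory cleared" }] };
  }
  if (name === "chat") {
    const message = (args as { message: string }).message;
    memory.push({ role: "user", content: message });
    // A real implementation would forward `message` to the selected provider here.
    return { content: [{ type: "text", text: `Echo: ${message}` }] };
  }
  throw new Error(`Unknown tool: ${name}`);
});

// Communicate with the MCP client over stdio, as in the diagram above.
await server.connect(new StdioServerTransport());
```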
An integration is established between the MemGPT MCP Server and a support chatbot. The flow involves initiating a conversation, retrieving previous exchanges, and switching to a more advanced model for better resolution:

- `get_memory`: pulls up recent issue details.
- `use_model`: upgrades the model to `claude-3-opus`.
- `chat`: generates a detailed solution.

In another workflow, a writer uses Claude Desktop with the MemGPT MCP Server for a smoother experience:

- `get_memory`: retrieves recent drafts and ideas from the writing session.
- `use_model`: switches the model to `claude-3.5-sonnet`.
- `chat`: writes new content seamlessly.

To install the MemGPT MCP Server, follow these steps:
1. Install dependencies: run `npm install` in your terminal.
2. Build the server: `npm run build`.
3. For development with auto-rebuild, use `npm run watch`.
4. Configure the server for use with Claude Desktop by editing the Claude Desktop configuration file:
   - macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
   - Windows: `%APPDATA%/Claude/claude_desktop_config.json`
5. Add the following configuration:
```json
{
  "mcpServers": {
    "letta-memgpt": {
      "command": "/path/to/memgpt-server/build/index.js",
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "ANTHROPIC_API_KEY": "your-anthropic-key",
        "OPENROUTER_API_KEY": "your-openrouter-key"
      }
    }
  }
}
```
Environment variables:

- `OPENAI_API_KEY`: your OpenAI API key.
- `ANTHROPIC_API_KEY`: your Anthropic API key.
- `OPENROUTER_API_KEY`: your OpenRouter API key.
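One plausible way the server can use these variables is to decide at startup which providers are available. The snippet below is a simplified sketch under that assumption and does not mirror the project's actual implementation.

```typescript
// Simplified sketch: detect which providers are usable from the environment.
// Provider names mirror those listed in this article; the logic is illustrative.
type Provider = "openai" | "anthropic" | "openrouter" | "ollama";

function availableProviders(env: NodeJS.ProcessEnv): Provider[] {
  const providers: Provider[] = [];
  if (env.OPENAI_API_KEY) providers.push("openai");
  if (env.ANTHROPIC_API_KEY) providers.push("anthropic");
  if (env.OPENROUTER_API_KEY) providers.push("openrouter");
  // Ollama runs locally and needs no API key, so it is always a candidate.
  providers.push("ollama");
  return providers;
}

console.log(availableProviders(process.env));
```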
The MemGPT MCP Server is ideal for scenarios that require persistent conversation history and flexible model switching, such as the support-chatbot and writing-assistant workflows described above.
These workflows leverage the server’s robust capabilities to provide a seamless AI interaction experience.
MemGPT is compatible with several MCP clients, enhancing their functionality by integrating multiple LLM providers and models:
| MCP Client     | Resources | Tools | Prompts | Status       |
|----------------|-----------|-------|---------|--------------|
| Claude Desktop | ✅        | ✅    | ✅      | Full Support |
| Continue       | ✅        | ✅    | ✅      | Full Support |
| Cursor         | ❌        | ✅    | ❌      | Tools Only   |
The performance and compatibility of the MemGPT MCP Server are optimized for the clients listed above. Advanced configuration and security measures include supplying API keys through environment variables rather than hard-coding them.
Q: What is the difference between MemGPT and other MCP servers?
A: MemGPT focuses on persistent conversation memory combined with the ability to switch providers and models mid-conversation without losing context.

Q: How does MemGPT handle security with API keys?
A: API keys are supplied as environment variables in the Claude Desktop configuration rather than being hard-coded into the server.

Q: Can I use MemGPT with Claude Desktop on Windows?
A: Yes. Add the server to `%APPDATA%/Claude/claude_desktop_config.json` as shown in the installation section.

Q: What tools does MemGPT support besides LLMs?
A: In addition to the `chat` tool, the server exposes memory-management tools (`get_memory`, `clear_memory`) and configuration tools (`use_provider`, `use_model`).

Q: Can I use MemGPT with models not listed in the README?
A: Yes. OpenRouter accepts any model specified as `provider/model`, and Ollama can use any locally available model.
Contributing to the MemGPT MCP Server involves understanding its architecture and following standard TypeScript development practices.
For further information on MCP, visit the official Model Context Protocol documentation at modelcontextprotocol.io.
By providing robust integration capabilities and advanced features such as memory management and model switching, the MemGPT MCP Server is a valuable addition to AI application development, ensuring a seamless user experience across diverse platforms.