Manage multiple LLMs with memory, model switching, conversation-history retrieval, and support for the latest Claude models
MemGPT is a TypeScript-based MCP (Model Context Protocol) server that acts as a persistent memory system for Large Language Models (LLMs). It integrates with a range of AI applications through the Model Context Protocol and supports multiple LLM providers, including OpenAI, Anthropic, OpenRouter, and Ollama, enhancing them with tools for continuous conversation history, model switching, and provider management.
MemGPT MCP Server is tailored to memory retention across AI applications such as Claude Desktop, Continue, and Cursor. Because it speaks the Model Context Protocol, these applications can interact with LLMs through a standardized interface, enabling cross-application compatibility and richer functionality.
MemGPT offers several key features that make it a versatile tool for developers and AI enthusiasts:
The chat tool sends messages to the currently selected LLM provider and works with OpenAI, Anthropic, OpenRouter, and Ollama.
The use_provider tool switches between LLM providers, so workflows can adapt as needed without disrupting an ongoing conversation.
By exposing these features through the Model Context Protocol, MemGPT stays compatible across different AI applications, making it a robust choice for developers integrating LLMs into their projects.
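As a hedged sketch of how these tools might be called from client code, the example below uses the official TypeScript SDK (@modelcontextprotocol/sdk) to launch the built server over stdio and invoke use_provider, chat, and get_memory. The argument shapes (provider, message) and the client metadata are assumptions for illustration, not a verified contract of MemGPT's API.

```typescript
// Minimal sketch: connect to the MemGPT MCP server over stdio and call its tools.
// Tool argument keys are assumed for illustration.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the built server as a child process and talk to it over stdio.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/path/to/memgpt-server/build/index.js"],
  });

  const client = new Client({ name: "memgpt-demo", version: "0.1.0" });
  await client.connect(transport);

  // Switch the active LLM provider (assumed argument shape).
  await client.callTool({
    name: "use_provider",
    arguments: { provider: "anthropic" },
  });

  // Send a message through the currently selected provider.
  const reply = await client.callTool({
    name: "chat",
    arguments: { message: "Summarize our conversation so far." },
  });
  console.log(reply);

  // Retrieve the stored conversation history.
  const history = await client.callTool({ name: "get_memory", arguments: {} });
  console.log(history);

  await client.close();
}

main().catch(console.error);
```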
MemGPT is architected around the Model Context Protocol (MCP), which defines how the components of an AI ecosystem communicate and interact. The protocol ensures that different AI applications can talk to MemGPT through a consistent interface: a host application (the MCP client) launches the server, exchanges messages with it over a transport such as stdio, and discovers and invokes the tools it exposes.
The implementation of MCP in MemGPT ensures that it can be easily integrated into existing AI workflows, making it a valuable addition for anyone looking to enhance LLM performance through standardized communication.
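To make the architecture concrete, the snippet below is a simplified, hedged sketch of what an MCP tool server looks like with the TypeScript SDK: it advertises a chat tool and answers tool calls over stdio. It is not MemGPT's actual implementation; the tool schema and the echo handler are placeholders.

```typescript
// Simplified MCP tool server over stdio (illustrative, not the real MemGPT code).
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

const server = new Server(
  { name: "letta-memgpt", version: "0.1.0" },
  { capabilities: { tools: {} } }
);

// Advertise the tools this server exposes to any MCP client.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "chat",
      description: "Send a message to the current LLM provider",
      inputSchema: {
        type: "object",
        properties: { message: { type: "string" } },
        required: ["message"],
      },
    },
  ],
}));

// Handle tool invocations; a real server would call the provider and store history.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name === "chat") {
    const message = String(request.params.arguments?.message ?? "");
    return { content: [{ type: "text", text: `Echo: ${message}` }] };
  }
  throw new Error(`Unknown tool: ${request.params.name}`);
});

async function main() {
  // Serve requests over standard input/output, as MCP clients expect.
  await server.connect(new StdioServerTransport());
}

main().catch(console.error);
```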
To use MemGPT as an MCP server, install its dependencies, build the project, and optionally run the watcher for automatic rebuilds during development:
npm install
npm run build
npm run watch
Additionally, you will need to configure MemGPT within AI applications such as Claude Desktop or Continue by adding a server configuration file:
For usage with Claude Desktop on macOS:
{
  "mcpServers": {
    "letta-memgpt": {
      "command": "/path/to/memgpt-server/build/index.js",
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "ANTHROPIC_API_KEY": "your-anthropic-key",
        "OPENROUTER_API_KEY": "your-openrouter-key"
      }
    }
  }
}
On Windows:
{
  "mcpServers": {
    "letta-memgpt": {
      "command": "\\path\\to\\memgpt-server\\build\\index.js",
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "ANTHROPIC_API_KEY": "your-anthropic-key",
        "OPENROUTER_API_KEY": "your-openrouter-key"
      }
    }
  }
}
MemGPT can be leveraged in several real-world scenarios to enhance AI application workflows: maintaining long-running customer support conversations, informing content creation with past interactions, and switching providers or models mid-conversation without losing context.
These use cases demonstrate the versatility and real-world applicability of MemGPT across AI-driven applications, making it a valuable addition to any developer's toolkit.
MemGPT is designed to be compatible with several popular MCP clients, including Claude Desktop, Continue, and Cursor.
While other applications may offer similar features, the standardized Model Context Protocol ensures seamless communication and integration between MemGPT and these clients. This compatibility matrix highlights its broad applicability across various AI use cases:
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
MemGPT is optimized for performance and works with multiple LLM providers and their models:
Anthropic: claude-3-haiku, claude-3-sonnet, claude-3-opus, claude-3.5-haiku, claude-3.5-sonnet
OpenAI: gpt-4o, gpt-4o-mini, gpt-4-turbo
OpenRouter: any model in provider/model format, e.g., openai/gpt-4
Ollama: locally hosted models such as llama2 and codellama
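If you want to sanity-check provider and model choices in your own client code, a small lookup like the hedged sketch below can help; the Provider union and SUPPORTED_MODELS map simply mirror the list above and are not part of MemGPT's API.

```typescript
// Illustrative lookup mirroring the provider/model list above (not MemGPT's API).
type Provider = "anthropic" | "openai" | "openrouter" | "ollama";

const SUPPORTED_MODELS: Record<Provider, readonly string[]> = {
  anthropic: [
    "claude-3-haiku",
    "claude-3-sonnet",
    "claude-3-opus",
    "claude-3.5-haiku",
    "claude-3.5-sonnet",
  ],
  openai: ["gpt-4o", "gpt-4o-mini", "gpt-4-turbo"],
  // OpenRouter accepts any model in provider/model format, e.g. "openai/gpt-4".
  openrouter: ["openai/gpt-4"],
  ollama: ["llama2", "codellama"],
};

// Check a provider/model pair before asking the server to switch to it.
function isSupported(provider: Provider, model: string): boolean {
  return provider === "openrouter"
    ? model.includes("/")
    : SUPPORTED_MODELS[provider].includes(model);
}

console.log(isSupported("anthropic", "claude-3.5-sonnet")); // true
```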
To ensure that MemGPT is secure and flexible, you can configure it using environment variables:
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
OPENROUTER_API_KEY=your-openrouter-key
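As a hedged sketch of how a server process might read and validate these variables at startup, the snippet below checks each key and warns if one is missing; the variable names come from the list above, while the validation logic is illustrative rather than MemGPT's actual behavior.

```typescript
// Illustrative startup check for the API keys listed above (not MemGPT's actual code).
const REQUIRED_KEYS = [
  "OPENAI_API_KEY",
  "ANTHROPIC_API_KEY",
  "OPENROUTER_API_KEY",
] as const;

for (const key of REQUIRED_KEYS) {
  if (!process.env[key]) {
    // A missing key only matters for the provider you intend to use,
    // so warn instead of failing hard.
    console.warn(`${key} is not set; the matching provider will be unavailable.`);
  }
}
```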
Additionally, the server can be debugged using MCP Inspector to monitor communication over stdio and ensure optimal performance.
How does MemGPT manage conversation memory?
MemGPT offers tools like get_memory to retrieve the conversation history and clear_memory to erase stored data, giving users control over what is retained during conversations.
Does MemGPT support multiple LLM providers?
Yes, MemGPT supports multiple LLM providers such as OpenAI, Anthropic, OpenRouter, and Ollama through its configuration options and provider-switching features.
How does the Model Context Protocol benefit MemGPT?
The MCP protocol provides a standardized interface, allowing seamless communication and data exchange between AI applications and MemGPT and facilitating cross-application compatibility.
What are typical use cases for MemGPT?
MemGPT can be used for customer support, where conversation history must be maintained, and for content creation, where past interactions inform current responses, demonstrating its versatility across various AI workflows.
How does MemGPT protect sensitive data?
MemGPT maintains strict data handling protocols and keeps API keys in environment variables to protect sensitive information during interactions with LLMs.
If you are interested in contributing to the development of MemGPT, start by forking the repository and cloning your fork:
git clone https://github.com/your-username/memgpt.git
For more information and resources, visit the official Model Context Protocol documentation and community forums.
MemGPT is part of an expanding ecosystem that aims to standardize interactions between AI applications and memory systems, making it easier for developers to build powerful and flexible AI solutions.
By leveraging MemGPT as your MCP server, you can enhance the integration and functionality of LLMs in various AI workflows, providing a robust foundation for developing cutting-edge applications.