Learn about xgmem MCP server for structured memory management in LLM projects and cross-project knowledge sharing
xgmem is a TypeScript-based Model Context Protocol (MCP) server designed to enable project-specific and knowledge graph-based memory for AI applications such as Claude Desktop, Continue, Cursor, and other tools. This server supports storing, retrieving, and managing entities, relations, and observations per project, with a strong focus on flexibility and cross-project knowledge sharing.
xgmem provides robust capabilities built on the Model Context Protocol (MCP). It offers comprehensive CRUD operations through MCP tools, ensuring seamless integration with various AI applications. The server is built to be both scalable and persistent, storing all project memory in a disk-based JSON file (`memory.json`). This design ensures that data remains intact even when the system restarts. Additionally, xgmem leverages Docker for deployment, making it easy to containerize and run in any environment.
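Because all state lives in a single JSON file, the on-disk layout is easy to inspect and back up. The exact schema is an assumption here (field names are inferred from the entity/relation/observation model this document describes, not from the xgmem source), but a stored project might look roughly like:

```json
{
  "demo-project": {
    "entities": [
      { "name": "Alice", "entityType": "person" }
    ],
    "relations": [
      { "entity1": "Alice", "relType": "works with", "entity2": "Bob" }
    ],
    "observations": {
      "Alice": ["Joined Acme Corp in 2021."]
    }
  }
}
```

Keeping one top-level key per project is what allows cross-project isolation while still sharing a single persistent file.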
The architecture of xgmem is fully compliant with the Model Context Protocol (MCP). It exposes a set of tools that allow AI applications to interact with project-specific data, including `save_project_observations`, `get_project_observations`, and more, enabling operations such as adding entities, relations, and observations.
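As a sketch of the calling convention, a retrieval request mirrors the save examples shown later in this document. Note that the `entityName` filter parameter below is an assumption for illustration, not a documented field:

```json
{
  "name": "get_project_observations",
  "args": {
    "projectId": "demo-project",
    "entityName": "Alice"
  }
}
```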
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[xgmem Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
To get xgmem up and running, follow these steps:

1. Run `npm install` to set up the necessary dependencies.
2. Run `npm run build` to compile the TypeScript files into JavaScript.
3. Start the server in development with `npx ts-node index.ts`, or run `npm start` to launch the compiled build.

xgmem can also be deployed via Docker for ease of deployment and management:
```shell
docker build -t xgmem-mcp-server .
docker run -v $(pwd)/memories:/app/memories xgmem-mcp-server
```
Suppose you are working on an AI tool that needs to understand the relationships between different entities. You can use xgmem to store these relations as knowledge graphs, which can be easily queried and modified.
Technical Implementation: To add a relationship, call the `create_relations` tool with appropriate parameters. For example, the following adds a relation between "Alice" and "Bob" in a project named "demo-project":
```json
{
  "name": "create_relations",
  "args": {
    "projectId": "demo-project",
    "relations": [
      { "entity1": "Alice", "relType": "works with", "entity2": "Bob" }
    ]
  }
}
```
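On the wire, MCP tool invocations travel as JSON-RPC 2.0 `tools/call` requests. As a minimal sketch of that framing (independent of any client SDK, which would normally handle this for you), the envelope for the call above can be built like this:

```typescript
// Build a JSON-RPC 2.0 "tools/call" request envelope, the transport-level
// shape MCP uses for tool invocations. The tool name and argument fields
// below come from the create_relations example in this document.
function buildToolCall(id: number, name: string, args: object): string {
  return JSON.stringify({
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  });
}

const req = buildToolCall(1, "create_relations", {
  projectId: "demo-project",
  relations: [{ entity1: "Alice", relType: "works with", entity2: "Bob" }],
});
console.log(req);
```

An MCP client library sends exactly this kind of payload over stdio or HTTP; seeing the raw shape makes it easier to debug a misbehaving integration with a protocol trace.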
Consider a scenario where multiple AI applications need to collaborate on a project. xgmem can serve as a unified memory store, allowing each tool to access and update the same knowledge base.
Technical Implementation: To save project-specific observations, use the `save_project_observations` tool. Here's an example of saving some text-based observations for "Alice" in the project:
```json
{
  "name": "save_project_observations",
  "args": {
    "projectId": "demo-project",
    "observations": [
      {
        "entityName": "Alice",
        "contents": ["Joined Acme Corp in 2021.", "Is a software engineer."]
      }
    ]
  }
}
```
xgmem is compatible with the MCP clients listed in the support matrix above: Claude Desktop and Continue with full support, and Cursor with tools-only support.
By integrating xgmem into your project, you can ensure that all AI applications have access to a shared knowledge base. This seamless integration enhances collaboration and efficiency across different development environments.
xgmem is designed for high performance and compatibility with various MCP clients. The server leverages persistent storage mechanisms to ensure data integrity even during system restarts.
Here’s how to configure xgmem in your MCP settings:
```json
{
  "mcpServers": {
    "xgmem": {
      "command": "npx",
      "args": ["-y", "xgmem@latest"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
This configuration ensures that xgmem is properly integrated into your project, ready to handle MCP requests.
For advanced users, you can customize the environment settings and security configurations. For example, the storage location can be changed via the `MEMORY_DIR_PATH` environment variable, and access can be restricted by setting an `API_KEY`.

Frequently asked questions:

- How do I ensure data persistence?
- Can multiple AI applications access the same knowledge base concurrently?
- Is xgmem compatible with all MCP tools?
- How can I secure my xgmem instance against unauthorized access?
- What happens if an observation is accidentally modified or deleted?
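Combining these options with the MCP settings shown earlier, a hardened configuration might look like the following (the `MEMORY_DIR_PATH` value is illustrative; it reuses the directory from the Docker example above):

```json
{
  "mcpServers": {
    "xgmem": {
      "command": "npx",
      "args": ["-y", "xgmem@latest"],
      "env": {
        "API_KEY": "your-api-key",
        "MEMORY_DIR_PATH": "/app/memories"
      }
    }
  }
}
```

Pointing `MEMORY_DIR_PATH` at a mounted volume keeps the knowledge base outside the container lifecycle, which is how data persistence across restarts is achieved.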
Contributors are welcome to enhance and improve xgmem; see the project repository for contribution guidelines.
For more information on the Model Context Protocol and its integration with AI applications, visit the official Model Context Protocol (MCP) documentation. Additionally, explore resources and tools available for developers building advanced AI workflows.
By leveraging xgmem as your MCP server, you can significantly enhance the functionality and collaboration of AI applications across various projects and teams. This comprehensive setup ensures a seamless experience for developers integrating complex knowledge management systems into their work.