Learn about the xgmem MCP server for structured memory management in LLM projects and cross-project knowledge sharing
xgmem is a TypeScript-based Model Context Protocol (MCP) server designed to enable project-specific and knowledge graph-based memory for AI applications such as Claude Desktop, Continue, Cursor, and other tools. This server supports storing, retrieving, and managing entities, relations, and observations per project, with a strong focus on flexibility and cross-project knowledge sharing.
xgmem exposes its capabilities through the Model Context Protocol (MCP), offering comprehensive CRUD operations as MCP tools so it integrates cleanly with a variety of AI applications. The server is designed to be scalable and persistent: all project memory is stored in a disk-based JSON file (memory.json), so data remains intact even when the system restarts. xgmem can also be deployed with Docker, making it easy to containerize and run in any environment.
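As a rough, hypothetical sketch of the per-project storage just described (the real layout of memory.json is not documented here and may differ), the file can be pictured as entities, relations, and observations grouped under a project ID:

```json
{
  "demo-project": {
    "entities": [{ "name": "Alice" }, { "name": "Bob" }],
    "relations": [{ "entity1": "Alice", "relType": "works with", "entity2": "Bob" }],
    "observations": [{ "entityName": "Alice", "contents": ["Is a software engineer."] }]
  }
}
```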
The architecture of xgmem is designed to be fully compliant with the Model Context Protocol (MCP). It consists of a series of tools that allow AI applications to interact with project-specific data. These tools include save_project_observations, get_project_observations, and many more, enabling a wide range of operations such as adding entities, relations, and observations.
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[xgmem Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
To get xgmem up and running, follow these steps:
1. Run `npm install` to set up the necessary dependencies.
2. Run `npm run build` to compile the TypeScript files into JavaScript.
3. Run `npm start` to launch the server, or run it directly during development with `npx ts-node index.ts`.

xgmem can also be deployed via Docker for ease of deployment and management:

```bash
docker build -t xgmem-mcp-server .
docker run -v $(pwd)/memories:/app/memories xgmem-mcp-server
```

Suppose you are working on an AI tool that needs to understand the relationships between different entities. You can use xgmem to store these relations as knowledge graphs, which can be easily queried and modified.
Technical Implementation: To add a relationship, call the create_relations tool with the appropriate parameters. For example, the following request adds a relation between "Alice" and "Bob" in a project named "demo-project":
```json
{
  "name": "create_relations",
  "args": {
    "projectId": "demo-project",
    "relations": [
      { "entity1": "Alice", "relType": "works with", "entity2": "Bob" }
    ]
  }
}
```
Consider a scenario where multiple AI applications need to collaborate on a project. xgmem can serve as a unified memory store, allowing each tool to access and update the same knowledge base.
Technical Implementation: To save project-specific observations, use the save_project_observations tool. Here’s an example of saving some text-based observations for "Alice" in the project:
```json
{
  "name": "save_project_observations",
  "args": {
    "projectId": "demo-project",
    "observations": [
      { "entityName": "Alice", "contents": ["Joined Acme Corp in 2021.", "Is a software engineer."] }
    ]
  }
}
```
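To read these observations back later, for example from a different MCP client, the get_project_observations tool listed earlier can be used. The argument shape below simply mirrors the save example and is illustrative rather than authoritative:

```json
{
  "name": "get_project_observations",
  "args": {
    "projectId": "demo-project",
    "entityName": "Alice"
  }
}
```

If xgmem's actual parameter names differ (for instance, a filter other than entityName), adjust the request accordingly.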
xgmem is compatible with MCP clients including Claude Desktop, Continue, and Cursor, as summarized in the compatibility table above.
By integrating xgmem into your project, you can ensure that all AI applications have access to a shared knowledge base. This seamless integration enhances collaboration and efficiency across different development environments.
xgmem is designed for high performance and compatibility with various MCP clients. The server leverages persistent storage mechanisms to ensure data integrity even during system restarts.
Here’s how to configure xgmem in your MCP settings:
```json
{
  "mcpServers": {
    "xgmem": {
      "command": "npx",
      "args": ["-y", "xgmem@latest"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
This configuration ensures that xgmem is properly integrated into your project, ready to handle MCP requests.
For advanced users, environment settings and security configurations can be customized; for example, the location where project memories are stored can be changed via the MEMORY_DIR_PATH environment variable. A configuration sketch is shown after the questions below.

Frequently asked questions include:

- How do I ensure data persistence?
- Can multiple AI applications access the same knowledge base concurrently?
- Is xgmem compatible with all MCP tools?
- How can I secure my xgmem instance against unauthorized access? (Access can be restricted with the API_KEY environment variable.)
- What happens if an observation is accidentally modified or deleted?
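Building on the notes above, here is a minimal configuration sketch that passes both variables through the env block; whether xgmem honors MEMORY_DIR_PATH in exactly this way, and the ./memories path itself, are assumptions for illustration:

```json
{
  "mcpServers": {
    "xgmem": {
      "command": "npx",
      "args": ["-y", "xgmem@latest"],
      "env": {
        "API_KEY": "your-api-key",
        "MEMORY_DIR_PATH": "./memories"
      }
    }
  }
}
```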
Contributions to enhance and improve xgmem are welcome.
For more information on the Model Context Protocol and its integration with AI applications, visit the official Model Context Protocol (MCP) documentation. Additionally, explore resources and tools available for developers building advanced AI workflows.
By leveraging xgmem as your MCP server, you can significantly enhance the functionality and collaboration of AI applications across various projects and teams. This comprehensive setup ensures a seamless experience for developers integrating complex knowledge management systems into their work.