Discover a flexible TypeScript-based MCP server for storing and managing project-specific knowledge graphs and memory
zmem is a TypeScript-based Model Context Protocol (MCP) server that provides project-specific, knowledge-graph-based memory for Claude Desktop, Continue, Cursor, and other AI-driven tools. It integrates with these applications, enabling them to store, retrieve, and manage entities, relations, and observations on a per-project basis. Data is persisted to disk in memory.json files, making zmem well suited to scalable, queryable memory in agent ecosystems.
zmem offers a robust set of core features that are essential for managing structured memory within AI applications:
Knowledge Graph Storage: Stores detailed information about entities, their relations, and associated observations. This feature is crucial for maintaining a comprehensive knowledge base.
CRUD Operations via MCP Tools: Provides comprehensive create, read, update, delete (CRUD) operations for efficient data management using the Model Context Protocol.
Persistence to Disk: Data is saved in memory.json files on disk, ensuring that vital information is not lost. This feature enhances the reliability and scalability of memory management.
Docker and TypeScript Support: Facilitates easy deployment through Docker integrations and uses TypeScript for robust development practices, making it suitable for both development and production environments.
These capabilities make zmem a versatile solution for integrating structured memory into AI applications, ensuring seamless connectivity with various MCP clients and tools.
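The entity/relation/observation model described above can be sketched in TypeScript. The interfaces and class below are illustrative assumptions for this article, not zmem's actual internal types:

```typescript
// Illustrative sketch of a knowledge-graph store with CRUD operations
// (hypothetical types, not zmem's real internals).
interface Entity {
  name: string;
  entityType: string;
  observations: string[]; // free-text facts attached to the entity
}

interface Relation {
  from: string; // source entity name
  to: string;   // target entity name
  relationType: string;
}

class KnowledgeGraph {
  private entities = new Map<string, Entity>();
  private relations: Relation[] = [];

  createEntity(name: string, entityType: string): Entity {
    const entity: Entity = { name, entityType, observations: [] };
    this.entities.set(name, entity);
    return entity;
  }

  addObservation(entityName: string, observation: string): void {
    const entity = this.entities.get(entityName);
    if (!entity) throw new Error(`Unknown entity: ${entityName}`);
    entity.observations.push(observation);
  }

  createRelation(from: string, to: string, relationType: string): void {
    this.relations.push({ from, to, relationType });
  }

  readEntity(name: string): Entity | undefined {
    return this.entities.get(name);
  }

  deleteEntity(name: string): boolean {
    // Also drop relations that reference the deleted entity.
    this.relations = this.relations.filter(
      (r) => r.from !== name && r.to !== name
    );
    return this.entities.delete(name);
  }
}

// Usage:
const graph = new KnowledgeGraph();
graph.createEntity("zmem", "project");
graph.addObservation("zmem", "Persists memory to memory.json");
graph.createEntity("Claude Desktop", "client");
graph.createRelation("Claude Desktop", "zmem", "uses");
console.log(graph.readEntity("zmem")?.observations.length); // 1
```

Keeping entities in a map keyed by name makes reads and deletes cheap, while relations live in a flat list that can be filtered when an endpoint entity is removed.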
zmem implements the Model Context Protocol (MCP) to enable seamless communication between AI applications and data sources. The protocol defines a standardized way of interacting with memory servers, facilitating compatibility across different ecosystems. Through MCP, zmem ensures that AI applications like Claude Desktop can connect to specific data sources and tools through well-defined APIs.
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
```mermaid
graph TD;
    A[Project Memory] --> B(Entity);
    B --> C[Relation];
    C --> D[Observation];
    D --> E[Project Metadata];
    style A fill:#e1f5fe;
    style B fill:#80cbc4;
    style C fill:#ef5350;
    style D fill:#26a69a;
    style E fill:#ffe082;
```
These diagrams illustrate the flow of data between AI applications, MCP servers, and underlying data sources. The first shows how an AI application uses an MCP client to interact with a zmem server, which in turn communicates with the appropriate data source or tool. The second shows the hierarchical structure of knowledge-graph storage within zmem: entities, relations, observations, and project metadata.
To set up zmem for development or production, begin by installing the necessary dependencies:
```shell
npm install
```
This command fetches all required packages listed in package.json.
Once dependencies are installed, build the project using the following command:
```shell
npm run build
```
Building the project ensures that your code is properly compiled and ready for execution.
For development purposes, you can run the server with ts-node as follows:

```shell
npx ts-node index.ts
```
This command compiles the TypeScript at runtime, letting you quickly test and debug changes without a full build cycle.
To start the production-ready zmem server, use the following command:
```shell
npm start
```
This will launch the server in its optimized configuration for real-world applications.
For seamless deployment, you can use Docker. First, build a Docker image:
```shell
docker build -t zmem-mcp-server .
```
Then run the server with a mounted memory directory on your host machine:
```shell
docker run -v $(pwd)/memories:/app/memories zmem-mcp-server
```
This setup ensures that all project memory files are persisted in a designated host directory, making it easy to back up and migrate data.
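Persistence to memory.json files can be sketched as a simple save/load round trip. The on-disk schema below is an assumption for illustration; zmem's actual memory.json layout may differ:

```typescript
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Hypothetical on-disk shape; zmem's real memory.json schema may differ.
interface MemoryFile {
  entities: { name: string; entityType: string; observations: string[] }[];
  relations: { from: string; to: string; relationType: string }[];
}

// Write a project's graph to <baseDir>/<project>/memory.json.
function saveMemory(baseDir: string, project: string, memory: MemoryFile): string {
  const file = path.join(baseDir, project, "memory.json");
  fs.mkdirSync(path.dirname(file), { recursive: true });
  fs.writeFileSync(file, JSON.stringify(memory, null, 2), "utf8");
  return file;
}

// Read it back, returning an empty graph if the file does not exist yet.
function loadMemory(baseDir: string, project: string): MemoryFile {
  const file = path.join(baseDir, project, "memory.json");
  if (!fs.existsSync(file)) return { entities: [], relations: [] };
  return JSON.parse(fs.readFileSync(file, "utf8")) as MemoryFile;
}

// Usage: round-trip one project's memory through a temp directory.
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "zmem-"));
saveMemory(dir, "demo", {
  entities: [{ name: "zmem", entityType: "project", observations: ["persists to disk"] }],
  relations: [],
});
console.log(loadMemory(dir, "demo").entities[0].name); // "zmem"
```

Because each project gets its own subdirectory, mounting the base directory as a Docker volume (as above) is all that is needed to make the memories survive container restarts.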
Imagine integrating zmem with Claude Desktop to enable users to manage detailed histories of conversations, tasks, and projects. By storing this information within the knowledge graph provided by zmem, Claude can offer more context-aware responses that are grounded in real-world data.
Teams using Continue or Cursor can leverage zmem's ability to store and manage shared observations across multiple projects. This feature allows team members to build a collective library of insights and facts, enhancing collaboration and reducing duplication of effort.
zmem is compatible with several popular MCP clients:
Claude Desktop: Fully supports knowledge graph storage and CRUD operations.
Continue: Supports persistence to disk through memory.json.
Cursor: Primarily focuses on managing relations, with limited support for observations.
The following table provides a quick reference for compatibility across various MCP-enabled tools.
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Limited |
These integrations ensure that zmem can be seamlessly utilized in a variety of AI workflows, enhancing the overall functionality of these applications.
zmem is optimized for performance and broad compatibility. The following table outlines its key features and their impact on AI application performance:
| Feature | Performance Impact | Compatibility |
|---|---|---|
| CRUD Operations via MCP | High speed | Claude Desktop, Continue, Cursor |
| Persistence to Disk (memory.json) | Low overhead | All compatible clients |
| Knowledge Graph Storage | Rich data model | Advanced AI applications |
This matrix highlights zmem’s strengths in terms of performance and compatibility with various AI tools.
To adjust the default behavior, you can set environment variables. For instance, to change the memory storage directory:
```shell
export MEMORY_DIR_PATH=/path/to/new/directory
```
This customization allows for greater flexibility in deployment scenarios.
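Inside the server, an environment variable like this is typically resolved with a sensible fallback. This is a sketch; the fallback directory here is an assumption, not zmem's documented default:

```typescript
import * as path from "path";

// Resolve the memory directory from the environment, falling back to
// ./memories (an assumed default for illustration).
const memoryDir: string =
  process.env.MEMORY_DIR_PATH ?? path.join(process.cwd(), "memories");

console.log(`Storing project memories under ${memoryDir}`);
```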
You can call get_help through the MCP API to get detailed documentation on available tools and their usage. For example:
```json
{
  "name": "get_help",
  "args": {
    "toolName": "save_project_observations"
  }
}
```
This approach ensures that users have access to comprehensive documentation right from within their applications.
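One way a server can answer such a request is to keep per-tool documentation in a registry. The sketch below is illustrative: the registry shape and the tool descriptions are assumptions, not zmem's actual implementation (only the tool names appear in this article):

```typescript
// Hypothetical tool-documentation registry backing a get_help tool.
// Descriptions are placeholders, not zmem's real documentation strings.
const toolDocs: Record<string, string> = {
  save_project_observations:
    "Stores a batch of observations against a project's entities.",
  read_graph: "Returns the full knowledge graph for a project.",
};

interface GetHelpArgs {
  toolName: string;
}

function getHelp(args: GetHelpArgs): string {
  const doc = toolDocs[args.toolName];
  if (!doc) {
    return (
      `No documentation found for tool "${args.toolName}". ` +
      `Known tools: ${Object.keys(toolDocs).join(", ")}`
    );
  }
  return doc;
}

// Usage: mirrors the JSON request shown above.
console.log(getHelp({ toolName: "save_project_observations" }));
```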
zmem uses standard encryption practices and secure storage techniques. It is recommended to set up access controls via environment variables for enhanced security.
Yes, while this repository focuses on compatibility with Claude Desktop, Continue, and Cursor, zmem's API can be adapted to work with any compatible client or tool that adheres to the Model Context Protocol.
zmem is designed with scalability in mind. It employs efficient data storage practices and optimization techniques to manage even extensive knowledge graphs without performance degradation.
Advanced users can explore additional tools like add_graph_observations and read_graph, which offer more granular control over memory management and retrieval operations.
Yes, zmem supports Windows, macOS, and other Unix-based systems. Docker compatibility ensures cross-platform deployment without significant modifications.
Contributing to zmem is straightforward with clear guidelines:
```shell
git clone https://github.com/your-repo/zmem.git
npm install
```
Ensure you have Docker installed and follow the instructions in the README for running the server.
```shell
npm test
```
Once your changes are implemented, submit a pull request detailing your contributions and addressing any concerns from maintainers.
For developers looking to build integrations with zmem or other MCP servers, the following resources can provide additional insights:
MCP Documentation: Comprehensive documentation on the Model Context Protocol.
Community Forums: Join discussion boards or forums dedicated to MCP and its applications.
GitHub Repository: Active development and contribution opportunities.
By leveraging zmem as an MCP server, developers and AI application integrators can create more advanced and context-aware solutions. The capabilities of zmem make it a powerful tool for managing structured memory in a variety of AI workflows, ensuring seamless data interaction and sharing across different environments.