Knowledge graph server with semantic search using Qdrant and OpenAI embeddings
The MCP Memory Server, integrated with Qdrant for vector database persistence, is a knowledge graph implementation for AI applications. It stores entities and relations, persists them both to a local file and to a Qdrant collection, and retrieves related information through semantic search over OpenAI embeddings. The server exposes this functionality through a Model Context Protocol (MCP) interface, so it integrates cleanly with MCP-capable AI clients.
Its core features are entity and relation management, semantic search, and Docker-based deployment, giving AI workflows a flexible data layer that can be tailored to specific use cases. The server exposes the following MCP tools:
- create_entities: Creates new entities and their attributes in the graph.
- create_relations: Creates relationships between existing entities, increasing the connectivity of the graph.
- search_similar: Runs a semantic search over the stored graph using OpenAI embeddings, providing a robust way to find relevant information within large datasets.

Additional tools (add_observations, delete_entities, delete_observations, delete_relations, and read_graph) round out the API; they appear in the alwaysAllow list of the configuration below.
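To make the tool surface concrete, here is a sketch of the argument shapes these tools might accept. The field names (name, entityType, observations, from, to, relationType, query, limit) follow the reference MCP memory server and are assumptions rather than a verified contract; check the input schemas the server actually advertises.

```typescript
// Hypothetical argument shapes for the memory tools (field names are assumptions).
const createEntitiesArgs = {
  entities: [
    {
      name: "Qdrant",
      entityType: "technology",
      observations: ["Vector database used for semantic search"],
    },
  ],
};

const createRelationsArgs = {
  relations: [
    { from: "MCP Memory Server", to: "Qdrant", relationType: "stores vectors in" },
  ],
};

const searchSimilarArgs = {
  query: "Which component handles vector persistence?",
  limit: 5, // assumed option; the server may use a different parameter name
};
```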
The architecture is built around the Model Context Protocol (MCP), which keeps the server interoperable with any MCP-capable AI application. Persistence is handled in two layers: changes to the graph are first written to memory.json and then reflected in Qdrant, ensuring that both data stores remain consistent.
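The following sketch illustrates that write-through pattern under stated assumptions: it uses the openai and @qdrant/js-client-rest packages, the text-embedding-3-small model, and a simplified entity shape. It is an illustration of the data flow, not the server's actual source code.

```typescript
import { writeFile } from "node:fs/promises";
import OpenAI from "openai";
import { QdrantClient } from "@qdrant/js-client-rest";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment
const qdrant = new QdrantClient({
  url: process.env.QDRANT_URL ?? "http://localhost:6333",
  apiKey: process.env.QDRANT_API_KEY,
});
const COLLECTION = process.env.QDRANT_COLLECTION_NAME ?? "memory";

interface Entity {
  id: number; // numeric point id for Qdrant
  name: string;
  observations: string[];
}

// Write-through: persist the graph to memory.json first, then mirror it to Qdrant.
async function saveEntity(graph: Entity[], entity: Entity): Promise<void> {
  graph.push(entity);
  await writeFile("memory.json", JSON.stringify(graph, null, 2)); // 1. file persistence

  const text = `${entity.name}: ${entity.observations.join(" ")}`;
  const { data } = await openai.embeddings.create({
    model: "text-embedding-3-small", // assumed embedding model
    input: text,
  });

  await qdrant.upsert(COLLECTION, {
    points: [
      {
        id: entity.id,
        vector: data[0].embedding,
        payload: { name: entity.name, observations: entity.observations },
      },
    ],
  }); // 2. vector persistence
}
```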
To set up the MCP Memory Server, follow these detailed steps:
Install Dependencies:
```bash
npm install
```
Build the Server:
```bash
npm run build
```
Docker Setup: For ease of deployment, you can use Docker.
```bash
docker build -t mcp-qdrant-memory .

docker run -d \
  -e OPENAI_API_KEY=your-openai-api-key \
  -e QDRANT_URL=http://your-qdrant-server:6333 \
  -e QDRANT_COLLECTION_NAME=your-collection-name \
  -e QDRANT_API_KEY=your-qdrant-api-key \
  --name mcp-qdrant-memory \
  mcp-qdrant-memory
```
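Before wiring the server into a client, it can help to confirm that these environment variables actually reach Qdrant. A minimal connectivity check, assuming the @qdrant/js-client-rest package, might look like this:

```typescript
import { QdrantClient } from "@qdrant/js-client-rest";

// Uses the same environment variables passed to the container above.
const qdrant = new QdrantClient({
  url: process.env.QDRANT_URL ?? "http://localhost:6333",
  apiKey: process.env.QDRANT_API_KEY,
});

async function checkConnection(): Promise<void> {
  const { collections } = await qdrant.getCollections();
  const name = process.env.QDRANT_COLLECTION_NAME ?? "your-collection-name";
  const exists = collections.some((c) => c.name === name);
  console.log(exists ? `Collection "${name}" found` : `Collection "${name}" not found yet`);
}

checkConnection().catch((err) => {
  console.error("Qdrant is not reachable:", err);
  process.exit(1);
});
```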
Add to MCP Settings:
```json
{
  "mcpServers": {
    "memory": {
      "command": "/bin/zsh",
      "args": ["-c", "cd /path/to/server && node dist/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "QDRANT_API_KEY": "your-qdrant-api-key",
        "QDRANT_URL": "http://your-qdrant-server:6333",
        "QDRANT_COLLECTION_NAME": "your-collection-name"
      },
      "alwaysAllow": [
        "create_entities",
        "create_relations",
        "add_observations",
        "delete_entities",
        "delete_observations",
        "delete_relations",
        "read_graph",
        "search_similar"
      ]
    }
  }
}
```
HTTPS and Reverse Proxy Configuration:
```nginx
server {
    listen 443 ssl;
    server_name qdrant.yourdomain.com;

    ssl_certificate     /path/to/cert.pem;
    ssl_certificate_key /path/to/key.pem;

    location / {
        proxy_pass http://localhost:6333;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```
Then point the server at the proxied endpoint:

```bash
QDRANT_URL=https://qdrant.yourdomain.com
```
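Once the proxy is in place, you can verify end-to-end HTTPS from Node with a plain fetch against Qdrant's REST API. The /collections endpoint and the api-key header used below are standard Qdrant REST conventions; adjust them if your deployment differs.

```typescript
// Quick end-to-end check of the HTTPS endpoint behind the reverse proxy.
const url = process.env.QDRANT_URL ?? "https://qdrant.yourdomain.com";

const res = await fetch(`${url}/collections`, {
  headers: { "api-key": process.env.QDRANT_API_KEY ?? "" },
});

if (!res.ok) {
  throw new Error(`Qdrant returned ${res.status} ${res.statusText}`);
}
console.log(await res.json()); // lists collections if TLS and auth are set up correctly
```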
This MCP Memory Server can be leveraged to enhance a variety of AI applications, particularly those that require structured knowledge and efficient data retrieval. Here are two realistic scenarios:
- Corporate Knowledge Base: model internal knowledge as entities and relations so it can be queried with semantic search.
- Product Recommendation System: surface related products through the search_similar tool (see the client sketch after the list of compatible clients below).

The MCP Memory Server is designed to be compatible with popular AI clients:
- Claude Desktop
- Continue
- Cursor
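As an illustration of how one of these clients, or any MCP-capable program, could drive the search_similar tool over stdio, here is a sketch using the @modelcontextprotocol/sdk TypeScript client. The server path mirrors the MCP settings shown earlier, and the tool-argument field names are assumptions; consult the input schema the tool declares.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the built server over stdio, mirroring the MCP settings shown earlier.
const transport = new StdioClientTransport({
  command: "node",
  args: ["/path/to/server/dist/index.js"],
  env: {
    OPENAI_API_KEY: process.env.OPENAI_API_KEY ?? "",
    QDRANT_URL: process.env.QDRANT_URL ?? "",
    QDRANT_API_KEY: process.env.QDRANT_API_KEY ?? "",
    QDRANT_COLLECTION_NAME: process.env.QDRANT_COLLECTION_NAME ?? "",
  },
});

const client = new Client({ name: "example-client", version: "0.1.0" });
await client.connect(transport);

// Semantic search over the stored knowledge graph.
const result = await client.callTool({
  name: "search_similar",
  arguments: { query: "products similar to wireless headphones", limit: 5 },
});
console.log(result);
```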
The performance and compatibility matrix for the MCP Memory Server highlights its robustness:
| Feature | Status |
| --- | --- |
| Entity Management | ✅ |
| Relation Management | ✅ |
| Semantic Search | ✅ |
| File-based Persistence | ✅ |
| Qdrant Integration | ✅ |
| HTTPS Support | ✅ |
| Reverse Proxy Compatibility | ✅ |
An alternative minimal configuration launches the @modelcontextprotocol/server-memory package directly through npx:

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"],
      "env": {
        "OPENAI_API_KEY": "your-api-key"
      }
    }
  }
}
```
The server supports robust HTTPS handling with custom SSL/TLS configurations. Proper certificate verification, connection pooling, timeouts, and retry mechanisms ensure secure and reliable operations.
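The server's exact pooling and retry settings are not documented here. As a general illustration, a caller can add a per-request timeout and simple retries over HTTPS with a small wrapper like the one below (fetch plus AbortSignal.timeout, available in Node 18+; the attempt counts and delays are arbitrary):

```typescript
// Generic fetch wrapper with a per-request timeout and exponential backoff retries.
async function fetchWithRetry(
  url: string,
  init: RequestInit = {},
  attempts = 3,
  timeoutMs = 5_000,
): Promise<Response> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      const res = await fetch(url, { ...init, signal: AbortSignal.timeout(timeoutMs) });
      if (res.ok) return res;
      lastError = new Error(`HTTP ${res.status}`);
    } catch (err) {
      lastError = err; // network error or timeout
    }
    if (i < attempts - 1) {
      await new Promise((resolve) => setTimeout(resolve, 2 ** i * 500)); // 0.5s, 1s, ...
    }
  }
  throw lastError;
}

// Example: list collections through the HTTPS reverse proxy with retries.
const info = await fetchWithRetry(`${process.env.QDRANT_URL}/collections`, {
  headers: { "api-key": process.env.QDRANT_API_KEY ?? "" },
});
console.log(await info.json());
```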
For troubleshooting the HTTPS setup, openssl can be used for debugging. Keep in mind the persistence order described above: changes are written to memory.json first, then embeddings are generated and stored in Qdrant, so both systems remain consistent.

The MCP Memory Server with Qdrant is a strong fit for AI applications that need robust data management and semantic search. With integrations for popular clients such as Claude Desktop, Continue, and Cursor, it works equally well for building a corporate knowledge base or enhancing a product recommendation system.
In short, the MCP Memory Server pairs Model Context Protocol integration with persistent, searchable memory, making it a practical building block for AI applications that need advanced data management.