Discover RagDocs MCP Server for semantic document management with vector search using Qdrant and AI embeddings
The RagDocs MCP Server provides Model Context Protocol (MCP) compatibility, enabling seamless integration of RAG (Retrieval-Augmented Generation) capabilities into various AI applications. This server supports semantic search through documents using Qdrant vector database and embeddings from Ollama or OpenAI, making it a robust tool for managing and querying large volumes of textual data.
The core features of the RagDocs MCP Server revolve around semantic search and document management. Key capabilities include adding documents with metadata, searching stored documents by vector similarity, listing and organizing documents, deleting selected items, and support for both free (Ollama) and paid (OpenAI) embedding services.
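As a sketch, an MCP tool call for adding a document might carry a payload shaped like the following. The tool name `add_document` and the argument fields are assumptions for illustration, not confirmed names from the RagDocs documentation:

```typescript
// Hypothetical payload for an MCP tool call that adds a document.
// The tool name and argument fields are illustrative assumptions.
const addDocumentCall = {
  name: "add_document",
  arguments: {
    url: "https://example.com/async-guide",
    content: "Async functions let you write promise-based code as if it were synchronous...",
    metadata: { title: "Async Guide", contentType: "text/markdown" },
  },
};
```

An MCP client would serialize a request like this over the protocol; the server then chunks and embeds the content before storing it in Qdrant.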
The server ensures that AI applications like Claude Desktop, Continue, Cursor, etc., can integrate seamlessly by leveraging the standardized MCP protocol. This integration allows these applications to query the document repository as if it were a native component of the application itself—enabling richer user interactions and enhanced productivity in workflows involving complex document management.
The RagDocs MCP server is built on top of Qdrant's vector database, which stores embeddings generated by Ollama or OpenAI. The document content undergoes automatic chunking before embedding to ensure precision and relevance in search results. When an AI application like Claude Desktop sends a request through the MCP client, it triggers the RagDocs server to perform semantic searches based on the query provided.
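The chunking step can be pictured with a minimal sketch. The actual chunk size and overlap RagDocs uses are not documented here, so the values below are illustrative assumptions:

```typescript
// Illustrative fixed-size chunker with overlap. RagDocs' real chunking
// parameters are not documented here; chunkSize/overlap are assumptions.
function chunkText(text: string, chunkSize = 500, overlap = 50): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
    start += chunkSize - overlap; // step forward, keeping some shared context
  }
  return chunks;
}
```

Overlapping chunks keep sentences that straddle a boundary searchable from either side, which is why chunked indexes usually retrieve more precisely than whole-document embeddings.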
graph TD
A[AI Application] -->|MCP Client| B[MCP Protocol]
B --> C[RagDocs Server]
C --> D[Qdrant Vector DB]
style A fill:#e1f5fe
style D fill:#f3e5f5
The table below shows compatibility of the RagDocs MCP server with various AI applications.
| MCP Client | Resources | Tools | Prompts | Status |
| --- | --- | --- | --- | --- |
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
To get started, you need Node.js 16 or higher and one of the following Qdrant setups: a local instance run with Docker (free; for example, `docker run -p 6333:6333 qdrant/qdrant`) or a Qdrant Cloud account with an API key.
npm install -g @mcpservers/ragdocs
The configuration file specifies details like where embeddings are stored and which embedding provider to use. Below is the default setup for local usage:
{
"mcpServers": {
"ragdocs": {
"command": "node",
"args": ["@mcpservers/ragdocs"],
"env": {
"QDRANT_URL": "http://127.0.0.1:6333",
"EMBEDDING_PROVIDER": "ollama"
}
}
}
}
For integration with Qdrant Cloud, you would include the API key:
{
"mcpServers": {
"ragdocs": {
"command": "node",
"args": ["@mcpservers/ragdocs"],
"env": {
"QDRANT_URL": "https://your-cluster-url.qdrant.tech",
"QDRANT_API_KEY": "your-qdrant-api-key",
"EMBEDDING_PROVIDER": "ollama"
}
}
}
}
To use OpenAI, you would specify the API key in addition to embedding model selection:
{
"mcpServers": {
"ragdocs": {
"command": "node",
"args": ["@mcpservers/ragdocs"],
"env": {
"QDRANT_URL": "http://127.0.0.1:6333",
"EMBEDDING_PROVIDER": "openai",
"OPENAI_API_KEY": "your-api-key"
}
}
}
}
Developers frequently need to refer to documentation while coding. By integrating RagDocs with their development environment, they can quickly retrieve relevant documents through natural language queries. For example, when working on a piece of code related to asynchronous programming in Node.js, the developer might type "async functions and callbacks" into the search bar, and the system would return documentation specific to this topic.
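Under the hood, this kind of lookup ranks stored chunks by vector similarity, typically cosine similarity between the query embedding and each chunk embedding. Qdrant performs this search server-side with approximate nearest neighbors; the sketch below only illustrates the idea:

```typescript
// Minimal cosine-similarity ranking sketch. Qdrant does this server-side
// with approximate nearest-neighbor search; this is only illustrative.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function topK(query: number[], docs: { id: string; vec: number[] }[], k = 2) {
  return docs
    .map((d) => ({ id: d.id, score: cosine(query, d.vec) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```

A query like "async functions and callbacks" is first embedded into a vector, and the highest-scoring chunks are returned as the search results.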
Customer support teams often deal with frequently asked questions that could be answered using internal knowledge bases. By embedding customer service tools like FAQ systems or knowledge management software with RagDocs, the backend can provide relevant solutions based on user queries, improving response time and accuracy. For instance, a query "how to configure DNS settings" could lead to an instantly generated list of articles containing step-by-step instructions.
The RagDocs server is designed to work seamlessly with a range of MCP clients, such as Claude Desktop, Continue, and Cursor. When a user or developer interacts in their chosen environment, the request gets routed via an MCP client to the RagDocs backend for processing.
This integration ensures that documents stored within RagDocs are accessible through any eligible AI application, enriching end-user experiences by providing on-demand access to relevant information directly from these apps. This enhances the utility of applications, making them more powerful and context-aware.
| Functionality | RAG System (Qdrant + Embeddings) |
| --- | --- |
| Indexing Speed | Sub-second indexing for documents |
| Query Response Time | Typically under 2 seconds |
| Document Count | Up to millions of documents supported |
This server is compatible with both the Ollama and OpenAI embedding services, offering varying levels of scalability and cost based on the chosen provider. Qdrant Cloud adds a layer of managed service, simplifying deployment and maintenance.
You can configure additional settings for higher security or performance via environment variables. For example:
{
"mcpServers": {
"ragdocs": {
"command": "node",
"args": ["@mcpservers/ragdocs"],
"env": {
"QDRANT_URL": "<url>",
"QDRANT_API_KEY": "<api-key>",
"EMBEDDING_PROVIDER": "<provider>",
"OPENAI_API_KEY": "<openai-api-key>",
"EMBEDDING_MODEL": "<model>"
}
}
}
}
Q: How does RagDocs handle large document sets?
Q: Can I integrate other data sources with this MCP server?
Q: What happens if I need to scale up the system later?
Q: Are OpenAI and Ollama embeddings compatible with all AI applications?
Q: How do I manage metadata efficiently for documents?
Metadata fields such as `title` and `contentType` can be added during document submission and used to enhance search queries and filtering operations, ensuring that relevant content is surfaced appropriately across different use cases.

Contributing to the RagDocs project involves creating pull requests for improvements or new features aligned with community guidelines. Feel free to explore issues marked as "help wanted" on GitHub and contribute solutions for ongoing needs. Set up a development environment and run the tests thoroughly before submitting any contributions.
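A metadata filter of this kind can be sketched as follows. The field names `title` and `contentType` come from the text above; the filtering API itself is illustrative (Qdrant exposes richer server-side payload filtering):

```typescript
// Illustrative client-side metadata filter. Qdrant supports richer
// server-side payload filtering; this only shows the idea.
interface DocMeta { title: string; contentType: string; }
interface StoredDoc { id: string; meta: DocMeta; }

function filterByMeta(docs: StoredDoc[], pred: Partial<DocMeta>): StoredDoc[] {
  return docs.filter((d) =>
    Object.entries(pred).every(([k, v]) => (d.meta as any)[k] === v)
  );
}
```

Combining a metadata filter with vector search narrows results to, say, only markdown articles before ranking them by similarity.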
To contribute, run `npm test` locally to ensure all tests pass.

The RagDocs server is part of the broader MCP ecosystem, designed to facilitate intelligent applications that handle complex queries seamlessly. Explore additional resources such as documentation, forums, and community support available on GitHub and other relevant platforms to extend the utility and reach of this technology.
By participating in this vibrant ecosystem, you contribute not only to your own projects but also enrich the collective knowledge base available for developers and organizations worldwide.