Search your Cursor chat history with vector search using a Dockerized FastAPI server and LanceDB
The Cursor Chat History Vectorizer & Dockerized Search MCP Server is a specialized tool that makes your Cursor chat history searchable for advanced AI workflows. It extracts user prompts from the local `state.vscdb` files in Cursor's workspace storage, converts those prompts into text embeddings via a locally running Ollama instance, and stores them in a LanceDB vector database. The centerpiece is an API server (referred to as an "MCP server") that exposes a simple search endpoint for retrieving similar chat history for a given query.
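The extraction step can be sketched in Python: `state.vscdb` is a SQLite database, so the standard-library `sqlite3` module is enough to read it. The `ItemTable` table and the `aiService.prompts` key below are assumptions about where Cursor stores prompts and may differ between Cursor versions — adapt them as needed:

```python
import json
import sqlite3

def extract_prompts(db_path: str, key: str = "aiService.prompts") -> list[str]:
    """Read stored prompts from a Cursor state.vscdb SQLite file.

    Assumes prompts live in the key/value `ItemTable` as a JSON array of
    objects with a `text` field -- adjust for your Cursor version.
    """
    conn = sqlite3.connect(db_path)
    try:
        row = conn.execute(
            "SELECT value FROM ItemTable WHERE key = ?", (key,)
        ).fetchone()
    finally:
        conn.close()
    if row is None:
        return []
    return [item["text"] for item in json.loads(row[0]) if "text" in item]
```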
This project offers several core features essential for integrating with AI applications via Model Context Protocol (MCP):

- Extraction of user prompts from Cursor's `state.vscdb` files.
- Embedding generation with a locally running Ollama model (`nomic-embed-text:latest`) to convert text data into vector representations.
- Vector storage and similarity search backed by LanceDB.
- A Dockerized API server that exposes the search endpoint to MCP clients.

The MCP server supports compatibility with popular AI applications such as:
While Claude Desktop and Continue are fully supported, Cursor has a noted limitation: it does not currently support MCP resources, only tools (see the compatibility table below).
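The embedding step, which sends each extracted prompt to the local Ollama instance, can be sketched as follows. The request shape follows Ollama's `/api/embeddings` endpoint; the helper names and default port are illustrative assumptions:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/embeddings"  # Ollama's default port

def build_embed_request(text: str, model: str = "nomic-embed-text:latest") -> dict:
    """Request body for Ollama's /api/embeddings endpoint."""
    return {"model": model, "prompt": text}

def embed(text: str) -> list[float]:
    """POST a prompt to the local Ollama instance and return its embedding.

    Requires Ollama to be running with the model already pulled.
    """
    body = json.dumps(build_embed_request(text)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]
```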
The implementation leverages Model Context Protocol (MCP) to enable seamless integration between the Cursor Chat History Vectorizer and various AI applications. The core components include the extraction pipeline that reads prompts from `state.vscdb` files and the MCP search server that serves the resulting vector index. The following Mermaid diagram illustrates how the MCP client interacts with the server:
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
This diagram represents the communication flow: the AI application's MCP client sends requests over the MCP protocol to the server, which retrieves relevant data from the connected tools or storage components.
To run the vectorizer locally, install the dependencies and execute the script:

```shell
pip install -r requirements.txt
python vectorizer.py
```

To run the search server, adjust `docker-compose.yml` as needed and start the stack:

```shell
docker-compose up --build
```
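Once the server is up, you can query it over HTTP. The `/search` path, the `q`/`limit` query parameters, and port 8000 below are assumptions about the server's interface — check your `docker-compose.yml` and the API routes for the actual values:

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "http://localhost:8000"  # adjust to your docker-compose port mapping

def search_url(query: str, limit: int = 5) -> str:
    """Build the search URL; the /search path and parameters are assumptions."""
    params = urllib.parse.urlencode({"q": query, "limit": limit})
    return f"{BASE_URL}/search?{params}"

def search(query: str, limit: int = 5) -> list[dict]:
    """Call the running search server and return matching chat-history entries."""
    with urllib.request.urlopen(search_url(query, limit)) as resp:
        return json.load(resp)
```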
This setup enhances AI workflows by enabling more contextual and relevant prompts. For instance, an AI bot can query its own past interactions to generate customized responses based on historical patterns.
Implementation: During the interaction, the bot queries the MCP server with a partial or context-based prompt. Matching results are filtered from stored conversations, ensuring consistency and relevance in the generated outputs.
The vectorized chat history can serve as a knowledge base for AI assistants, where they can retrieve information based on user inputs similar to those found in past interactions.
Implementation: When a new query comes in, the MCP server searches through stored responses to find relevant documents. This ensures that the AI response is directly aligned with historical user concerns and queries.
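Under the hood, this kind of retrieval ranks stored embeddings by similarity to the query embedding. A minimal sketch of that ranking in plain Python (the real server delegates this to LanceDB; the function names here are illustrative):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query: list[float], corpus: dict[str, list[float]], k: int = 3) -> list[str]:
    """Return the k stored prompts whose embeddings are closest to the query."""
    ranked = sorted(corpus, key=lambda text: cosine(query, corpus[text]), reverse=True)
    return ranked[:k]
```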
The Cursor Chat History Vectorizer & Dockerized Search MCP Server integrates seamlessly into several popular AI platforms via MCP support:
Integration involves ensuring compatibility with the protocol specifications. Customization may be required depending on specific application needs.
The following table outlines the MCP client status for integration in this server setup.
| MCP Client | Resources | Tools |
|---|---|---|
| Claude Desktop | ✅ | ✅ |
| Continue | ✅ | ✅ |
| Cursor | ❌ | ✅ |
Customize the configuration via a `.env` file or Docker environment variables:
```mermaid
graph TD;
    A[Environment Configuration] --> B[MCP Server Initialization]
    style A fill:#d7e2f0
```
Example:

```shell
API_KEY=your_api_key_here
VECTOR_DATABASE_PATH=/path/to/your/lancedb/database
```
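A minimal sketch of how the server might read these variables at startup, assuming the variable names from the example above; the defaults and the `load_config` helper are illustrative, not the project's actual code:

```python
import os

def load_config() -> dict:
    """Read server settings from the environment, falling back to defaults.

    Variable names mirror the example .env above; defaults are illustrative.
    """
    return {
        "api_key": os.environ.get("API_KEY", ""),
        "db_path": os.environ.get("VECTOR_DATABASE_PATH", "./lancedb"),
    }
```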
Ensure secure operation by encrypting data at rest and in transit (TLS/SSL), restricting access to the database and API, and keeping credentials such as `API_KEY` out of version control.
Q1: What should I do if the vectorizer cannot reach Ollama?
A1: Verify that your Ollama instance is running locally and accessible from where you run the script or container. Check network configuration and firewall settings, and ensure there are no port conflicts.

Q2: Can applications that do not implement MCP use this server?
A2: Not directly; non-compliant applications would require significant modifications to the client or server. However, custom scripts can bridge certain gaps temporarily.

Q3: How is stored chat data protected?
A3: The setup supports secure storage and transmission of data through encryption and limited access controls. Ensure all communication is encrypted using TLS/SSL.

Q4: How do I keep search fast as the database grows?
A4: Optimize database performance with indexing, partitioning techniques, and regular maintenance tasks to manage growing datasets efficiently.

Q5: Can I integrate additional MCP clients beyond those listed?
A5: Yes, but custom modifications may be required. The MCP protocol supports extensibility through client-specific adapters or plugins.
Contributions are welcome! Developers interested in contributing should:

```shell
git checkout -b feature-branch
git commit -am 'Your description'
git push origin feature-branch
```

For more information on Model Context Protocol and related resources, see the official MCP documentation.
By participating in this project, you'll be part of a growing ecosystem dedicated to enhancing AI application compatibility through standardized protocols like MCP.