API for searching and retrieving vectorized chat history using FastAPI and LanceDB
The Cursor History MCP Server is an API service that gives AI applications fast, efficient access to vectorized chat history from the Cursor IDE. Built on FastAPI, Docker, and LanceDB, it uses embeddings to power semantic search over past conversations. The server follows the Model Context Protocol (MCP), a standardized protocol that acts as a universal adapter connecting AI applications to specific data sources and tools.
The server acts as an intermediary between the Cursor IDE's chat history and the external applications or tools that need to access it. FastAPI provides high performance along with automatic request validation and generated OpenAPI documentation, which makes it straightforward to build and integrate the robust, high-speed APIs that AI workflows demand.
One of the standout features is its vectorized search capability. This feature utilizes embeddings to transform text into numerical vectors, which are then used for efficient querying and searching within the chat history. The use of vector embeddings allows for precise and relevant results even when working with unstructured data, such as natural language queries.
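The core idea can be sketched in a few lines of pure Python. The toy three-dimensional vectors below stand in for real embeddings (which typically have hundreds of dimensions), and the brute-force loop stands in for LanceDB, which handles storage and indexed nearest-neighbor search in the actual server:

```python
import math

def cosine_similarity(a, b):
    # Similarity between two embedding vectors: dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_vec, history):
    # Rank stored chat messages by similarity to the query embedding.
    ranked = sorted(
        history,
        key=lambda item: cosine_similarity(query_vec, item["vector"]),
        reverse=True,
    )
    return [item["text"] for item in ranked]

# Toy "embeddings" -- a real deployment would produce these with an embedding model.
history = [
    {"text": "How do I write a Dockerfile?", "vector": [0.9, 0.1, 0.0]},
    {"text": "Explain Python decorators",    "vector": [0.1, 0.9, 0.1]},
    {"text": "Docker compose networking",    "vector": [0.8, 0.2, 0.1]},
]

results = search([1.0, 0.0, 0.0], history)
print(results[0])  # the Docker-related messages rank highest
```

Because similarity is computed in embedding space rather than by keyword match, a query like "container setup" would still surface the Docker messages even though it shares no words with them.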
The server is designed to be self-hosted, providing full control over your data and service. Users can run this server locally or on their own server, ensuring privacy and compliance with local regulations, making it ideal for both development and production environments.
Docker support simplifies the deployment process, allowing users to quickly set up and configure the environment required for the server. This makes it easy to integrate into existing workflows without extensive setup procedures.
The Cursor History MCP Server is compatible with local language models provided by Ollama. This integration enables advanced processing within the context of chat history, enhancing both accuracy and relevance of responses generated by AI applications. By leveraging these models for processing and generation, users can achieve more sophisticated interactions without relying on external services.
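As a sketch of how such a server might request an embedding from a local Ollama instance (stdlib only; the URL follows Ollama's default port and `/api/embeddings` endpoint, `nomic-embed-text` is just an example model, and the live call is left commented out so the snippet runs without a running Ollama):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/embeddings"  # Ollama's default local port

def build_embedding_request(text, model="nomic-embed-text"):
    # Construct the HTTP request Ollama expects for embedding generation.
    payload = json.dumps({"model": model, "prompt": text}).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_embedding_request("How do I write a Dockerfile?")
print(req.full_url)  # http://localhost:11434/api/embeddings
# with urllib.request.urlopen(req) as resp:      # requires a running Ollama instance
#     vector = json.loads(resp.read())["embedding"]
```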
The Cursor History MCP Server is designed to follow the Model Context Protocol (MCP), a standard framework that ensures interoperability between different components in AI workflows. The protocol flow involves interactions between an MCP Client, the server, and various data sources or tools.
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
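On the wire, MCP messages are JSON-RPC 2.0. As an illustrative sketch, a client asking an MCP server to invoke a hypothetical `search_history` tool might send something like the following (the tool name and arguments are made-up examples, not part of this server's documented API):

```python
import json

# A JSON-RPC 2.0 request in the shape MCP uses for tool invocation.
# "search_history" and its arguments are hypothetical examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_history",
        "arguments": {"query": "docker networking", "limit": 5},
    },
}

wire_message = json.dumps(request)
print(wire_message)
```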
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
To get started with the Cursor History MCP Server, you need to have Docker and Python 3.8 or higher installed on your system, along with FastAPI and LanceDB dependencies.
```shell
git clone https://github.com/Nossim/Cursor-history-MCP.git
cd Cursor-history-MCP
docker build -t cursor-history-mcp .
docker run -p 8000:8000 cursor-history-mcp
```
After running the container, you can access the API documentation at http://localhost:8000/docs.
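A client can then hit the API directly. The sketch below builds a query URL with the standard library; note that `/search` and its `q`/`limit` parameters are hypothetical route names for illustration (check the generated `/docs` page for the server's real endpoints), and the live request is commented out so the snippet runs without the container:

```python
import json
import urllib.request
from urllib.parse import urlencode

BASE_URL = "http://localhost:8000"

def build_search_url(query, limit=5):
    # /search and its parameters are hypothetical -- see /docs for real routes.
    return f"{BASE_URL}/search?{urlencode({'q': query, 'limit': limit})}"

url = build_search_url("docker networking")
print(url)  # http://localhost:8000/search?q=docker+networking&limit=5
# with urllib.request.urlopen(url) as resp:    # requires the container to be running
#     results = json.loads(resp.read())
```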
By integrating with the Cursor History MCP Server, an AI application like Claude Desktop can instantly query chat logs for insights and recommendations. For example, a finance AI tool could analyze historical conversations to identify relevant trends or potential risk factors based on specific keywords.
```mermaid
graph TD
    A["AI Application (Claude Desktop)"] -->|Query Chat Logs| B[Cursor History MCP]
    B --> C[MCP Server]
    C --> D[Chat Logs Database]
    style A fill:#e1f5fe
    style D fill:#e8f5e8
```
The server can be used to enhance content-based recommendation systems, where the application needs to understand user preferences and behavior based on past interactions. For instance, a customer service bot could use historical chat data to personalize responses and offer relevant services.
```mermaid
graph TD
    A[Customer Service Bot] -->|Query Chat History| B[Cursor History MCP]
    B --> C[MCP Server]
    C --> D[Chat Logs Database]
    style A fill:#e1f5fe
    style D fill:#e8f5e8
```
The Cursor History MCP Client is compatible with various AI applications that support the Model Context Protocol (MCP). These clients can utilize the server to extend their functionality by integrating with rich data sources and tools. Users need only configure their MCP client to connect to this server to unlock comprehensive historical context.
The performance of the Cursor History MCP Server is optimized for scalability and efficiency, ensuring fast response times even under heavy loads. This makes it suitable for both small-scale pilot projects and large enterprise deployments.
| AI Applications | Cursor Desktop | Continue | GPT-4 |
|---|---|---|---|
| Functionality | ✅ | ✅ | ❌ |
Users can configure the Cursor History MCP Server to meet specific needs. This includes setting up authentication, customizing responses, and managing connection settings.
```json
{
  "mcpServers": {
    "[server-name]": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-[name]"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
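A client can sanity-check such a configuration before launching anything. The snippet below parses an example config with the standard library and flags any server entry missing its launch command; the `cursor-history` server name is illustrative, and where your MCP client actually reads its config file is client-specific:

```python
import json

# Example config in the shape shown above; "cursor-history" is a made-up name.
CONFIG = """
{
  "mcpServers": {
    "cursor-history": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-example"],
      "env": {"API_KEY": "your-api-key"}
    }
  }
}
"""

def validate(config_text):
    # Ensure every configured server declares a launch command.
    config = json.loads(config_text)
    servers = config.get("mcpServers", {})
    missing = [name for name, spec in servers.items() if "command" not in spec]
    return servers, missing

servers, missing = validate(CONFIG)
print(sorted(servers))  # ['cursor-history']
print(missing)          # []
```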
Q: What is the Model Context Protocol (MCP)?
A: The Model Context Protocol (MCP) is a standard framework that facilitates communication between AI applications and external data sources, ensuring interoperability.
Q: How does the server integrate with Ollama?
A: It integrates with local models provided by Ollama to enhance processing capabilities within chat history. This integration supports both querying and generation tasks, providing more natural interactions.
Q: Is Docker required to run the server?
A: While Docker simplifies deployment, it is not a strict requirement. You can run the server on bare metal or with other containerization tools if preferred.
Q: What is vectorized search?
A: Vectorized search uses embeddings to transform text into numerical vectors, enabling efficient and precise querying of large datasets without manual indexing.
Q: Can the server run in the cloud as well as locally?
A: Yes, the server is fully compatible with both local and cloud-based environments. Docker makes deployment straightforward in any setting.
Contributions to the Cursor History MCP Server are encouraged from developers around the world. Please submit patches or suggestions via the repository, and ensure your code adheres to the project's coding standards and includes tests where applicable.
Explore more about the Model Context Protocol (MCP) ecosystem, including related tools and resources:
Feel free to explore the repository and make use of the API service. Your feedback is always welcome!
This document has outlined the Cursor History MCP Server's key features, integration capabilities, and usage scenarios so that developers building AI applications can put it to use effectively.