Semantic search and memory management with txtai MCP server for AI assistants
TxtAI Assistant MCP Server is an implementation of the Model Context Protocol (MCP) that provides semantic search and memory management for AI assistants. Built on top of txtai, an open-source semantic search engine, it lets AI applications such as Claude Desktop store and query contextual memories efficiently while adhering to the MCP protocol standard.

The server exposes its functionality through a set of MCP tools:
- `store_memory`
- `retrieve_memory`
- `search_by_tag`
- `delete_memory`
- `get_stats`
- `check_health`
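The semantics of these tools can be sketched as a tag-indexed memory store. The sketch below is purely illustrative: the class and method bodies are assumptions for exposition, not the server's actual implementation (which uses txtai embeddings for semantic retrieval).

```python
from dataclasses import dataclass, field

@dataclass
class Memory:
    text: str
    tags: set = field(default_factory=set)

class MemoryStore:
    """Toy in-memory model of the store/search/delete tool semantics."""

    def __init__(self):
        self._memories = {}
        self._next_id = 0

    def store_memory(self, text, tags=()):
        # Assign a fresh id and keep the memory with its tags.
        mid = self._next_id
        self._next_id += 1
        self._memories[mid] = Memory(text, set(tags))
        return mid

    def search_by_tag(self, tag):
        return [m.text for m in self._memories.values() if tag in m.tags]

    def delete_memory(self, mid):
        # True if a memory with this id existed and was removed.
        return self._memories.pop(mid, None) is not None

    def get_stats(self):
        return {"count": len(self._memories)}

store = MemoryStore()
store.store_memory("User prefers dark mode", tags={"preferences"})
mid = store.store_memory("Meeting at 3pm", tags={"schedule"})
print(store.search_by_tag("preferences"))  # ['User prefers dark mode']
store.delete_memory(mid)
print(store.get_stats())  # {'count': 1}
```

In the real server, `retrieve_memory` additionally performs semantic (embedding-based) search rather than exact tag matching.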
To get started, follow these steps:
1. Clone the repository:

```bash
git clone https://github.com/yourusername/txtai-assistant-mcp.git
cd txtai-assistant-mcp
```

2. Run the start script:

```bash
./scripts/start.sh
```

This script sets up the environment and starts the server.
Typical use cases include:

- Contextual chatbot enhancement: give a chatbot persistent memory of prior conversations.
- Personalized knowledge base management: store and semantically search user-specific notes and facts.
The TxtAI Assistant MCP Server supports integration with popular AI clients such as:
Claude Desktop

Add the server to your Claude Desktop MCP configuration:

```json
{
  "mcpServers": {
    "txtai-assistant": {
      "command": "path/to/txtai-assistant-mcp/scripts/start.sh",
      "env": {}
    }
  }
}
```
Continue
This table outlines the compatibility and support status of various MCP clients:
| MCP Client | Resources | Tools | Prompts |
|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ |
| Continue | ✅ | ✅ | ✅ |
| Cursor | ❌ | ✅ | ❌ |
Configuring the server is straightforward using environment variables:
```bash
# Server Configuration
HOST=0.0.0.0
PORT=8000

# CORS Configuration
CORS_ORIGINS=*

# Logging Configuration
LOG_LEVEL=DEBUG

# Memory Configuration
MAX_MEMORIES=0
```
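A minimal sketch of how a server might read these variables follows. The variable names come from the sample configuration above; the defaults and the comma-splitting of `CORS_ORIGINS` are assumptions for illustration, not the server's documented parsing rules.

```python
import os

def load_config(env=os.environ):
    """Read server settings from environment variables, with defaults
    mirroring the sample configuration above (assumed, not authoritative)."""
    return {
        "host": env.get("HOST", "0.0.0.0"),
        "port": int(env.get("PORT", "8000")),
        "cors_origins": env.get("CORS_ORIGINS", "*").split(","),
        "log_level": env.get("LOG_LEVEL", "DEBUG"),
        "max_memories": int(env.get("MAX_MEMORIES", "0")),
    }

cfg = load_config({"PORT": "9000", "CORS_ORIGINS": "http://localhost:3000"})
print(cfg["port"])          # 9000
print(cfg["cors_origins"])  # ['http://localhost:3000']
```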
These settings can be tailored to the specific requirements of your deployment.
The server persists all stored memories in a `memories.json` file, which can be copied for backup purposes.

Contributions to the project are encouraged.
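A timestamped backup of the memory file might look like this. The `memories.json` filename comes from the docs above; the backup location, naming scheme, and file contents are illustrative assumptions.

```python
import json
import shutil
import time
from pathlib import Path

def backup_memories(data_dir=".", backup_dir="backups"):
    """Copy memories.json into a timestamped backup file and return its path."""
    src = Path(data_dir) / "memories.json"
    dest_dir = Path(backup_dir)
    dest_dir.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = dest_dir / f"memories-{stamp}.json"
    shutil.copy2(src, dest)  # copy2 preserves file metadata
    return dest

# Example: write a dummy memory file, back it up, and verify the copy.
Path("memories.json").write_text(json.dumps([{"text": "hello", "tags": ["demo"]}]))
copy = backup_memories()
print(json.loads(copy.read_text())[0]["text"])  # hello
```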
For more information, explore these resources and related projects:
The overall MCP data flow:

```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Server]
    B --> C[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
```
```mermaid
graph LR
    subgraph DataSource
        MemoryStorage(memories.json)
        TagIndex(tags.json)
    end
    subgraph ServerComponents
        MemoryStore
        TagManager
        SearchEngine
    end
    APIEndpoints -->|Queries & Updates| MemoryStore
    TagManager -->|Tags and Metadata Management| TagIndex
    style MemoryStorage fill:#e8f5e8
    style TagIndex fill:#e8f5e8
```
By integrating the TxtAI Assistant MCP Server into your AI applications, you can significantly enhance their functionality through advanced memory management capabilities. Whether it's improving chatbot responses or maintaining a personalized knowledge base, this server offers robust solutions to meet diverse needs.
Note: This documentation is designed to provide comprehensive insights and guidance for developers looking to leverage the power of the Model Context Protocol (MCP) in their AI applications.