Web search and vector database tools with ChromaDB and LangChain for efficient document retrieval
The Web Search MCP Server provides web search and vector database functionality, built on LangChain and ChromaDB. As a Model Context Protocol (MCP) server, it integrates with a wide range of AI applications, giving them standardized access to web content and document storage.
The Web Search MCP Server offers a range of advanced features aligned with the Model Context Protocol:
Web Search: Users can query popular documentation libraries such as LangChain, LlamaIndex, and OpenAI to extract relevant content from web pages. The server supports both simple queries and sophisticated search operations.
Vector Database (ChromaDB): This feature allows for efficient storage and retrieval of documents using vector embeddings. It supports semantic similarity searches, metadata filtering, and batch operations. By integrating with ChromaDB, the server handles large volumes of data efficiently.
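For orientation, the snippet below is a minimal sketch of the ChromaDB layer these features build on: persistent storage, batch inserts with metadata, and a semantic similarity query with a metadata filter. The collection name and metadata fields are illustrative assumptions, not the server's actual schema.

```python
# Minimal sketch of the ChromaDB layer: persistent storage, batch insert,
# metadata filtering, and semantic similarity search.
import chromadb

client = chromadb.PersistentClient(path="./chroma_db")
collection = client.get_or_create_collection("web_documents")  # illustrative name

# Batch insert: documents are embedded and stored together with their metadata.
collection.add(
    ids=["doc-1", "doc-2"],
    documents=[
        "LangChain provides document loaders and text splitters.",
        "ChromaDB stores embeddings for semantic similarity search.",
    ],
    metadatas=[{"source": "langchain"}, {"source": "chromadb"}],
)

# Semantic similarity search restricted by a metadata filter.
results = collection.query(
    query_texts=["How do I split documents?"],
    n_results=2,
    where={"source": "langchain"},
)
print(results["documents"])
```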
The architecture of the Web Search MCP Server is designed to ensure compatibility with a wide range of AI applications via Model Context Protocol (MCP). The protocol flow diagram illustrates how an application communicates with the server to retrieve context-rich information and manipulate vector databases:
graph TD
A[AI Application] -->|MCP Client| B[MCP Protocol]
B --> C[MCP Server]
C --> D[Data Source/Tool]
style A fill:#e1f5fe
style C fill:#f3e5f5
style D fill:#e8f5e8
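To make the flow above concrete, here is a hedged sketch of an MCP client driving the server over stdio with the official Python MCP SDK. The tool name web_search and its arguments are assumptions for illustration; list_tools() reveals the names the server actually exposes.

```python
# Sketch of the protocol flow: a client launches the server over stdio,
# performs the MCP handshake, discovers tools, and calls one.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    server = StdioServerParameters(command="python", args=["main.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()             # MCP handshake
            tools = await session.list_tools()     # discover the server's tools
            print([tool.name for tool in tools.tools])
            # "web_search" and its argument schema are assumed for illustration.
            result = await session.call_tool("web_search", {"query": "LangChain retrievers"})
            print(result.content)

asyncio.run(main())
```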
The table below provides a compatibility matrix for the MCP server with various MCP clients:
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
This matrix helps developers integrate the MCP server with their chosen AI applications, enhancing their capabilities through robust MCP protocol support.
To get started with setting up the Web Search MCP Server, follow these steps:
Install dependencies using pip:
pip install -e .
# Alternatively, use uv: uv pip install -e .
Create a .env file to configure environment variables for the Serper API and ChromaDB:
# Serper API for web search
USER_AGENT=Mozilla/5.0
SERPER_API_URL=https://google.serper.dev/search
SERPER_API_KEY=your_serper_api_key
# ChromaDB configuration
CHROMA_PERSIST_DIRECTORY=./chroma_db
EMBEDDING_MODEL_NAME=sentence-transformers/all-MiniLM-L6-v2
# Transport mode (stdio or sse)
TRANSPORT=stdio
Run the server:
python main.py
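As a rough illustration of how the Serper variables above are used for web search (not the server's actual implementation), the sketch below sends a query to the configured Serper endpoint; it assumes the requests and python-dotenv packages are installed.

```python
# Illustrative use of SERPER_API_URL and SERPER_API_KEY from the .env file.
# The "q"/"num" payload and "organic" response field follow Serper's public API.
import os
import requests
from dotenv import load_dotenv

load_dotenv()  # read SERPER_API_URL, SERPER_API_KEY, etc. from .env

def serper_search(query: str, num_results: int = 5) -> list[dict]:
    """Send a search query to the Serper API and return the organic results."""
    response = requests.post(
        os.environ["SERPER_API_URL"],
        headers={
            "X-API-KEY": os.environ["SERPER_API_KEY"],
            "Content-Type": "application/json",
        },
        json={"q": query, "num": num_results},
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("organic", [])

if __name__ == "__main__":
    for hit in serper_search("LangChain document loaders"):
        print(hit["title"], hit["link"])
```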
In this use case, an industry analyst uses the Web Search MCP Server to automatically classify and retrieve relevant documents. By integrating with ChromaDB, they can store large volumes of documents, manage metadata efficiently, and perform semantic searches.
For content creators, this server acts as a powerful tool for generating content based on web-sourced information. They can query the server to find relevant documentation, extract key points, and incorporate these insights into their articles or blog posts.
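A minimal sketch of such a workflow, assuming the langchain-chroma and langchain-huggingface integration packages: web-sourced snippets are embedded with the model from the .env configuration, stored with metadata, and queried with a metadata filter. The texts and metadata fields are made up for illustration.

```python
# Store web-sourced snippets with metadata, then run a filtered semantic search.
from langchain_chroma import Chroma
from langchain_huggingface import HuggingFaceEmbeddings

embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")

# Embed and persist snippets so they can be filtered by source later.
store = Chroma.from_texts(
    texts=[
        "LlamaIndex builds indices over private data for retrieval.",
        "OpenAI's embeddings API converts text into vectors.",
    ],
    embedding=embeddings,
    metadatas=[{"library": "llamaindex"}, {"library": "openai"}],
    persist_directory="./chroma_db",
)

# Semantic search scoped to a single documentation library.
for doc in store.similarity_search("How are embeddings created?", k=2,
                                   filter={"library": "openai"}):
    print(doc.page_content, doc.metadata)
```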
The Web Search MCP Server is designed to be seamlessly integrated with various AI applications that support Model Context Protocol (MCP). This allows for enhanced functionality in areas such as data retrieval, content manipulation, and vector database management. Developers can utilize the server's APIs to streamline their application workflows.
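For example, when the server runs with TRANSPORT=sse, an MCP-aware application can connect over HTTP using the SDK's SSE client. The endpoint URL and port below are assumptions; check the server's startup output for the actual address.

```python
# Sketch of connecting to the server over the SSE transport (TRANSPORT=sse).
import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    # "http://localhost:8000/sse" is an assumed endpoint for illustration.
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```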
To ensure optimal performance and compatibility, the following table outlines the technical requirements and operational limits:
| Feature | Requirement |
|---|---|
| Language Support | English |
| Vector Database Size | Up to 1GB per document |
| Search Query Timeouts | ≤5 seconds |
These requirements help ensure consistent performance across different environments and integration scenarios.
graph TD
A[AI Application] -->|MCP Client| B[MCP Server]
B --> C[Vector DB / Data Source]
style A fill:#e1f5fe
style B fill:#f3e5f5
style C fill:#e8f5e8
For development and contributions, the Web Search MCP Server project relies on standard Python code-quality tooling:
Code Formatting: Use black and isort for code formatting:
black .
isort .
Linting and Type Checking: Ensure code quality by running linters and type checks:
ruff check .
mypy .
How does the Web Search MCP Server enhance AI applications?
Which AI applications are compatible with this MCPServer?
What document formats does the server support in ChromaDB?
How can I secure the data stored in ChromaDB?
Can I customize the vector embeddings used by the server?
Yes. Set the EMBEDDING_MODEL_NAME environment variable to point at a different embedding model.

Contributions to the Web Search MCP Server are welcome and can be made directly via GitHub; please follow the project's contribution guidelines.
Additional resources are available for developers building AI applications and integrating MCP into their workflows; these offer deeper insights and support for integrating the Web Search MCP Server into your applications.
This comprehensive guide provides a thorough understanding of the Web Search MCP Server's capabilities, integration, and deployment. Developers are encouraged to explore its potential in various AI workflows and contribute to its ongoing development.