Explore a flexible AI-powered personal assistant with Wikipedia and GitHub search designed for extensibility
LLM-MCP Personal Assistant is a personal assistant application that uses the Model Context Protocol (MCP) to mediate between AI models and external tools and resources. Built on the Anthropic API, it currently supports Wikipedia search and GitHub search, and it is designed for extensibility so that further tools can be added.
The MCP server at the core of the application handles communication between AI models and external resources, managing context across interactions. The assistant combines a React front end, an Express.js API backend, and the MCP protocol for interoperability with MCP clients such as Claude Desktop, Continue, and Cursor.
The LLM-MCP Personal Assistant MCP Server offers a set of features facilitated by the Model Context Protocol:
- **Tools** – Wikipedia search and GitHub search, callable by any connected MCP client
- **Resources and prompts** – available to clients that implement them
- **Session management** – retains conversational context across interactions
The architectural design of the LLM-MCP Personal Assistant revolves around several key components:
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
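The flow above (AI application → MCP protocol → MCP server → data source/tool) can be sketched as a minimal dispatcher. The handler names and shapes below are illustrative, not the project's actual API:

```typescript
// Minimal sketch of the MCP server's role: route a tool call from a
// client to the matching handler. Names and types are illustrative.
type ToolHandler = (query: string) => string;

const tools: Record<string, ToolHandler> = {
  // Stand-ins for the real Wikipedia/GitHub integrations.
  search_wikipedia: (query) => `wikipedia results for "${query}"`,
  search_github: (query) => `github results for "${query}"`,
};

function handleToolCall(command: string, query: string): string {
  const handler = tools[command];
  if (!handler) {
    throw new Error(`Unknown tool: ${command}`);
  }
  return handler(query);
}

console.log(handleToolCall("search_wikipedia", "Artificial Intelligence"));
```

In a real MCP server the dispatch is performed by the protocol layer; the point here is only that the server owns the mapping from tool names to implementations.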
The LLM-MCP Personal Assistant ensures compatibility with leading MCP clients, as shown below:
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
To set up and run the LLM-MCP Personal Assistant MCP Server, follow these steps:

1. Clone the repository and install dependencies:

```shell
git clone https://github.com/mikefey/LLM-MCP-personal-assistant.git
cd LLM-MCP-personal-assistant
pnpm install
```

2. Create a `.env` file in the root directory with your API key and other configuration settings:

```shell
ANTHROPIC_API_KEY=your_anthropic_api_key_here
VITE_API_PORT=3001
VITE_API_HOST=localhost
```

3. Start the development environment:

```shell
pnpm dev
```

This command builds and runs all components: it configures the MCP server, starts the API server, and launches the Vite development server for the client.
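With the settings shown above, the client reaches the API at `http://localhost:3001`. A small helper (illustrative, not taken from the repository) shows how the host and port variables might be combined:

```typescript
// Build the API base URL from the VITE_API_HOST / VITE_API_PORT values
// defined in .env. In the real Vite client these would come from
// import.meta.env; here they are passed in directly for illustration.
function apiBaseUrl(host: string, port: string): string {
  return `http://${host}:${port}`;
}

console.log(apiBaseUrl("localhost", "3001")); // http://localhost:3001
```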
AI applications can leverage the LLM-MCP Personal Assistant to search Wikipedia or GitHub efficiently. An example use case includes conducting research where an Anthropic AI model requires context from multiple sources:
```json
{
  "command": "search_wikipedia",
  "query": "Artificial Intelligence"
}
```
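One way a `search_wikipedia` handler could serve such a request is through the public MediaWiki search API. The sketch below only builds the request URL; the endpoint and parameters are MediaWiki's real search module, but the handler itself is hypothetical:

```typescript
// Construct a MediaWiki full-text search request for the given query.
// action=query with list=search is MediaWiki's standard search module.
function wikipediaSearchUrl(query: string): string {
  const params = new URLSearchParams({
    action: "query",
    list: "search",
    srsearch: query,
    format: "json",
  });
  return `https://en.wikipedia.org/w/api.php?${params.toString()}`;
}

console.log(wikipediaSearchUrl("Artificial Intelligence"));
```

The server would fetch this URL and return the result pages to the model as tool output.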
During the process of code development, accessing GitHub repositories can be streamlined. For instance, a user might need to reference existing projects or libraries:
```json
{
  "command": "search_github",
  "query": "React JS library for charting"
}
```
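A corresponding `search_github` handler could query GitHub's public repository search endpoint (`GET /search/repositories`). Again, only the URL construction is sketched, and the handler name is an assumption:

```typescript
// Build a GitHub repository search request. The q parameter accepts
// free-text terms plus qualifiers such as language: or stars:.
function githubSearchUrl(query: string): string {
  const params = new URLSearchParams({ q: query });
  return `https://api.github.com/search/repositories?${params.toString()}`;
}

console.log(githubSearchUrl("React JS library for charting"));
```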
The LLM-MCP Personal Assistant MCP Server ensures compatibility and optimization across various environments, as detailed below:
| Environment | Performance | Compatibility |
|---|---|---|
| macOS | High | Full |
| Windows | Moderate | Partial |
| Linux | Low | Minimal |
In addition to the basic setup, advanced configuration options and security measures are available:
**How is user data protected?** The LLM-MCP Personal Assistant uses secure protocols for all data exchanges to maintain user privacy and confidentiality.
**Is it compatible with existing MCP clients?** Yes, the LLM-MCP Personal Assistant is compatible with Claude Desktop, Continue, Cursor, and other leading MCP clients.
**Which tools are supported?** Currently, it supports Wikipedia search and GitHub search. Additional tools can be integrated via the configuration.
**How is conversational context handled?** Context is managed through session management features that retain state information for smoother interactions.
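A minimal in-memory version of such session handling might look like this; the `SessionStore` class is illustrative, and the project's actual mechanism may differ:

```typescript
// Keep per-session message history so earlier exchanges can inform
// later ones. Illustrative only; not the project's real implementation.
class SessionStore {
  private sessions = new Map<string, string[]>();

  append(sessionId: string, message: string): void {
    const history = this.sessions.get(sessionId) ?? [];
    history.push(message);
    this.sessions.set(sessionId, history);
  }

  context(sessionId: string): string[] {
    return this.sessions.get(sessionId) ?? [];
  }
}

const store = new SessionStore();
store.append("s1", "Search Wikipedia for AI");
store.append("s1", "Now search GitHub for charting libraries");
console.log(store.context("s1").length); // 2
```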
**Does it require specialized hardware?** No, LLM-MCP Personal Assistant runs on standard hardware configurations, making it accessible across a wide range of devices and operating systems.
Contributions are welcome! If you want to contribute or need further assistance, refer to the contribution guidelines in the repository. We value community contributions and are always open to improving our project together.
For more information about the Model Context Protocol and its ecosystem, see the official MCP documentation.
By leveraging the LLM-MCP Personal Assistant MCP Server, developers can unlock new possibilities by integrating AI models with a wide array of tools and resources, fostering innovation and productivity across various domains.