Simple MCP CLI tool for running LLM prompts with support for various models and tools.
The mcp-client-cli is a simple command-line interface (CLI) program designed to run large language model (LLM) prompts and integrate seamlessly with the Model Context Protocol (MCP). It acts as an alternative client to applications like Claude Desktop, while supporting multiple LLM providers and local models. By leveraging MCP, it provides consistent, unified communication between AI applications, data sources, and tools.
The mcp-client-cli offers several core features that make it a powerful component in the broader MCP ecosystem: running LLM prompts directly from the command line, support for multiple LLM providers (OpenAI, Groq, and local models via llama.cpp), integration with MCP servers and their tools, predefined prompt templates via the `p` prefix, and per-tool confirmation for sensitive operations.
The mcp-client-cli architecture ensures robust and efficient integration with MCP servers. It employs a modular design, allowing developers to add new tools and LLM providers easily. The protocol implementation adheres to the Model Context Protocol standards, ensuring seamless communication between AI applications, servers, and data sources.
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
To get started with the mcp-client-cli MCP server, follow these steps:
Install via pip:

```bash
pip install mcp-client-cli
```
Configure `~/.llm/config.json`:

Create and edit a `~/.llm/config.json` file to specify your LLM and MCP servers.
```json
{
  "systemPrompt": "You are an AI assistant helping a software engineer...",
  "llm": {
    "provider": "openai",
    "model": "gpt-4",
    "api_key": "your-openai-api-key",
    "temperature": 0.7,
    "base_url": "https://api.openai.com/v1" // Optional, for OpenRouter or other providers
  },
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"],
      "requires_confirmation": ["fetch"],
      "enabled": true, // Optional, defaults to true
      "exclude_tools": [] // Optional, list of tool names to exclude
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        "BRAVE_API_KEY": "your-brave-api-key"
      },
      "requires_confirmation": ["brave_web_search"]
    },
    "youtube": {
      "command": "uvx",
      "args": ["--from", "git+https://github.com/adhikasp/mcp-youtube", "mcp-youtube"]
    }
  }
}
```
Run the CLI:

```bash
llm "What is the capital city of North Sumatra?"
```
AI applications often require real-time data to inform their decisions or provide accurate responses. By integrating MCP servers like `fetch` and `brave-search`, developers can dynamically retrieve data from diverse sources (e.g., APIs, search engines) during the conversation.
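For example, with the `fetch` server from the configuration above enabled, a URL mentioned in the prompt can be retrieved live. The URL here is a placeholder, and because `fetch` is listed under `requires_confirmation`, the CLI asks before executing the tool call:

```bash
# The model can invoke the configured "fetch" tool to retrieve the page;
# the CLI prompts for confirmation before the call runs.
llm "Fetch https://example.com and summarize the page"
```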
Custom prompts allow AI applications to generate content tailored to specific scenarios or requirements. The `p` prefix for predefined templates supports these use cases by enabling users to quickly invoke commonly used language generation tasks.
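As a sketch, invoking a predefined template might look like the following; the template name `review` is illustrative and depends on which templates your installation defines:

```bash
# Invoke a predefined prompt template using the "p" prefix
llm p review
```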
The mcp-client-cli integrates seamlessly with various MCP clients, enhancing AI application functionality. Notable MCP clients include Claude Desktop, Continue, and Cursor, as summarized in the compatibility table above.
By supporting these clients, the mcp-client-cli ensures compatibility across multiple platforms and use cases.
The performance of the mcp-client-cli can be optimized through careful configuration. The following matrix summarizes its support across LLM providers:

| LLM Provider | Tool Support |
|---|---|
| OpenAI | ✅ |
| Groq | ✅ |
| Local Models | ✅ (via llama.cpp) |
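For local models, one plausible setup is to run llama.cpp's OpenAI-compatible server and point the CLI at it. This is a sketch, not the project's official recipe: the model file, port, and the idea of reusing the `openai` provider with a custom `base_url` are assumptions:

```bash
# Start llama.cpp's OpenAI-compatible HTTP server (model path is an example)
llama-server -m ./models/your-model.gguf --port 8080
```

Then set the `llm.base_url` field in `~/.llm/config.json` to `http://localhost:8080/v1`, keeping the rest of your configuration (system prompt, MCP servers) unchanged.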
To enhance security and performance, the mcp-client-cli offers advanced configuration options:
- Configuration is read from `~/.llm/config.json` or `$PWD/.llm/config.json`.
- Use the `LLM_API_KEY` or `OPENAI_API_KEY` environment variable to set API keys (see the example below).
- The configuration file uses the `.json` format.
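For example, a key can be supplied through the environment instead of being stored in the configuration file (the placeholder value below is illustrative):

```bash
# Provide the API key via environment variable rather than config.json
export OPENAI_API_KEY="your-openai-api-key"
llm "What is the capital city of North Sumatra?"
```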
Q: How do I integrate new tools with mcp-client-cli? A: You can add support for new tools by specifying their integration details, such as the command and environment variables, in your configuration file.
Q: Can the mcp-client-cli handle multiple LLM providers? A: Yes, it supports multiple LLM providers including OpenAI, Groq, and local models via llama.cpp.
Q: How do I secure sensitive information like API keys? A: Use environment variables or secure configuration files to store sensitive information.
Q: What are the potential challenges in integrating with MCP servers? A: Challenges include ensuring compatibility across different clients and managing data flow efficiently.
Q: How does mcp-client-cli improve AI application performance? A: It improves performance by standardizing communication protocols, reducing fragmentation, and enhancing tool integration.
To contribute to the mcp-client-cli project, consult the project repository for contribution guidelines.
The mcp-client-cli is part of a larger ecosystem of tools and resources for developers working with MCP.
By leveraging the capabilities of MCP servers like the mcp-client-cli, developers can build more robust, scalable AI applications that integrate seamlessly with diverse data sources and tools.