Discover MyAIServ: a high-performance AI FastAPI server with MCP, Elasticsearch, Redis, monitoring, and seamless LLM integration
High-performance FastAPI server implementing Model Context Protocol (MCP) for seamless integration with Large Language Models (LLMs). Built with a modern stack, including FastAPI, Elasticsearch, Redis, Prometheus, and Grafana.
MyAIServ MCP server is an innovative AI-powered FastAPI application that enables developers to integrate various Large Language Models (LLMs) into their applications through the Model Context Protocol. The protocol standardizes interactions between AI applications and external data sources, tools, and prompts, ensuring seamless connectivity and enhanced functionality.
MyAIServ MCP server is built around a comprehensive set of features that make it an indispensable tool for developers working with AI applications. Key features include:
FastAPI-powered REST, GraphQL, and WebSocket APIs: This integration allows for efficient data exchange and real-time communication between the MyAIServ server and external clients.
Full MCP Support (Tools, Resources, Prompts, Sampling): By fully supporting these components of MCP, the server ensures a robust connection between AI applications and their required resources.
Vector Search with Elasticsearch: Utilizing Elasticsearch for vector search capabilities optimizes retrieval speed and accuracy in handling large datasets relevant to AI applications.
Real-time Monitoring (Prometheus + Grafana): With Prometheus and Grafana integrated into the monitoring stack, developers can track server performance metrics in real time, ensuring optimal application performance.
Docker-ready Deployment: The server is designed with Docker in mind, making it easy for developers to containerize their applications and deploy them across different environments.
Comprehensive Test Coverage: Extensive test coverage ensures that the MyAIServ MCP server operates reliably and efficiently under various conditions.
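Since the features above center on MCP, it helps to see what an MCP message actually looks like on the wire. MCP messages are JSON-RPC 2.0; the sketch below builds a `tools/call` request of that shape. The tool name and arguments are hypothetical examples, not tools MyAIServ necessarily exposes.

```python
import json

# A minimal sketch of an MCP request (JSON-RPC 2.0). The tool name and
# arguments below are made-up examples, not MyAIServ's actual tools.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "vector_search",  # hypothetical tool name
        "arguments": {"query": "marketing slogans", "top_k": 5},
    },
}

payload = json.dumps(request)
print(payload)
```

An MCP server answers with a JSON-RPC response carrying the same `id`, so clients can match replies to requests over any transport.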
The implementation of MCP at the core of MyAIServ ensures seamless integration with a wide range of AI applications. The server's architecture is designed to facilitate communication between the client and the backend using the Model Context Protocol.
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
```mermaid
graph LR
    S[Server] -->|MCP Message| E[Elasticsearch]
    E --> R[Redis Cache]
    S --> P[Prometheus Metrics]
    P --> G[Grafana Dashboard]
    style S fill:#e5f5dc
    style E fill:#c9daff
    style R fill:#ffe6cc
    style P fill:#b2f5de
    style G fill:#d8e0ee
```
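The data-flow diagram above pairs Elasticsearch with a Redis cache. As a rough illustration of that read-through pattern, the sketch below uses a plain dict in place of Redis and a stub function in place of the Elasticsearch query; both stand-ins are assumptions, and the real server would use actual Redis and Elasticsearch clients.

```python
# Read-through cache sketch: check the cache first, fall back to the
# search backend on a miss, then store the result. A dict stands in for
# Redis and a stub function for the Elasticsearch query.
cache: dict[str, list[str]] = {}

def search_backend(query: str) -> list[str]:
    """Stub for an Elasticsearch vector search."""
    return [f"result for {query!r}"]

def cached_search(query: str) -> list[str]:
    if query in cache:               # cache hit: skip the backend
        return cache[query]
    results = search_backend(query)  # cache miss: query the backend
    cache[query] = results           # populate the cache for next time
    return results

print(cached_search("campaign slogan"))  # miss: goes to the backend
print(cached_search("campaign slogan"))  # hit: served from the cache
```

In production the cache entries would also carry a TTL so stale search results eventually expire.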
To get started quickly, follow these steps:
Clone and Setup:

```bash
git clone https://github.com/eagurin/myaiserv.git
cd myaiserv
python -m venv venv
source venv/bin/activate  # Linux/macOS
pip install -r requirements.txt
```
Configure and Run:

```bash
cp .env.example .env
uvicorn app.main:app --reload
```
After running, the API docs are available at http://localhost:8000/docs and the GraphQL endpoint at http://localhost:8000/graphql.
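To verify the server came up, you can fetch the OpenAPI schema that FastAPI serves at `/openapi.json` by default. The sketch below uses only the standard library and returns `None` when the server is not reachable, so it is safe to run either way.

```python
import json
import urllib.request

def fetch_openapi_schema(base_url: str = "http://localhost:8000"):
    """Fetch the OpenAPI schema FastAPI serves at /openapi.json by default.

    Returns None when the server is not reachable.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/openapi.json", timeout=5) as resp:
            return json.load(resp)
    except OSError:  # covers connection refused, DNS failure, timeouts
        return None

schema = fetch_openapi_schema()
if schema is None:
    print("server not reachable; start it with: uvicorn app.main:app --reload")
else:
    print(schema["info"]["title"])
```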
A customer service team could integrate MyAIServ MCP server into their chatbot application. The real-time monitoring provided by Prometheus and Grafana would enable quick detection of any issues, ensuring the chatbot operates smoothly without interruptions.
```python
# Example Python snippet for integrating with the MyAIServ MCP server.
# The `client` module and MCPClient class are illustrative placeholders
# for whichever MCP client library you use.
from client import MCPClient

client = MCPClient(server="http://localhost:8000")
response = client.request(prompt="How can I assist you today?")
print(response.text)  # processed response from the LLM
```
A marketing team could leverage MyAIServ to generate creative content based on specific prompts. By integrating Elasticsearch’s vector search capabilities, they can quickly retrieve and refine suitable data sources, ensuring high-quality content with minimal effort.
```python
# Example Python snippet for the content-generation use case. The module
# and class names are illustrative placeholders for your Elasticsearch
# and MCP client wrappers.
from elasticsearch_client import ElasticSearchClient
from client import MCPClient

es = ElasticSearchClient(index="content_sources")
client = MCPClient(server="http://localhost:8000")

prompts = es.search_vectors(prompt="Marketing campaign slogan")
generated_content = client.request(prompts[0])
print(generated_content.text)  # suggested marketing slogan
```
MyAIServ is compatible with a variety of MCP clients, supporting the tools and resources that Large Language Models (LLMs) require. The current support matrix:

| MCP Client | Resources | Tools | Prompts | Status |
| --- | --- | --- | --- | --- |
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
The server's performance and compatibility are crucial for ensuring optimal usage in diverse AI workflows. The following table summarizes key metrics:
| Feature | Performance Metrics | Compatibility (Clients) |
| --- | --- | --- |
| API Latency | < 100 ms | Claude Desktop, Continue, Cursor |
| Data Integration | Elasticsearch | Claude Desktop, Continue |
| Monitoring | Prometheus + Grafana | Full Support |
MyAIServ offers a wide range of configuration options to customize and secure the MCP server. Environment variables support API key management and other security-related configurations.
```json
{
  "mcpServers": {
    "[server-name]": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-[name]"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
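A quick way to catch mistakes in such a config is to parse and sanity-check it before use. The sketch below validates the JSON shape shown above using only the standard library; the server name and package in `SAMPLE` are hypothetical, and the required keys are taken from the sample rather than a formal schema.

```python
import json

# Hypothetical config following the sample shape above; the server name
# and npm package are placeholders, not real values.
SAMPLE = """
{
  "mcpServers": {
    "my-server": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-example"],
      "env": {"API_KEY": "your-api-key"}
    }
  }
}
"""

def validate_config(text: str) -> dict:
    """Parse the config and check each server entry carries the keys
    used in the sample (command, args, env)."""
    config = json.loads(text)
    for name, entry in config.get("mcpServers", {}).items():
        for key in ("command", "args", "env"):
            if key not in entry:
                raise ValueError(f"server {name!r} is missing {key!r}")
    return config

config = validate_config(SAMPLE)
print(sorted(config["mcpServers"]))
```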
Q: How do I configure MyAIServ to work with my specific LLM? A: You can use the provided JSON configuration sample to set up your environment variables. Be sure to replace the placeholders with actual values.
Q: Are there any limitations on the number of MCP clients that can be supported simultaneously? A: MyAIServ is designed to handle multiple MCP clients concurrently, but specific limits might depend on server resources and configurations.
Q: What level of support does MyAIServ offer for Cursor, which lacks full Prompt support? A: Cursor currently supports Tools only (no Resources or Prompts), but the server's tools integration still makes it a viable option for tool-driven use cases.
Q: How can I troubleshoot performance issues in MyAIServ? A: Real-time monitoring via Prometheus and Grafana can help identify performance bottlenecks. Review logs and metrics to pinpoint areas that need optimization.
Q: Can I customize the API endpoints for my use case? A: Yes, the FastAPI framework allows extensive customization of API endpoints to fit specific requirements.
Contributions that enhance the capabilities of MyAIServ are encouraged; see the repository for how to get involved.
MyAIServ is part of an expanding ecosystem of MCP-compliant tools and services, fostering interoperability between various AI applications. For more information on the latest updates and community-driven projects, visit our official documentation or join developer forums for ongoing discussions.
By integrating MyAIServ with your AI workflows, you can unlock a host of benefits including enhanced performance, flexibility, and ease of use in working with Large Language Models through MCP.