Backend for the Quirox Telegram Bot, integrating Google Sheets, web search, and Google Gemini AI.
The Quirox Telegram Bot Backend project provides a robust backend infrastructure for AI applications, leveraging Google Gemini (Flash model) to interact with various tools and services. This documentation focuses on the core components and functionality of the MCP server, designed for seamless integration with emerging AI platforms.
The MCP Server is a central hub that facilitates communication between an AI model (Google Gemini) and external tools such as Google Sheets, web search engines, and content fetchers. Key capabilities include:
- `mcp_server.py` runs the MCP server, exposing tools such as Google Sheets and web search.
- `mcp_client.py` handles communication with these services.
- A `/chat` endpoint provides AI interaction through HTTP requests.

The architecture of the Quirox Telegram Bot Backend centers on the Model Context Protocol (MCP) to standardize interactions between the AI model and the various tools. The protocol ensures that AI applications can efficiently communicate with and utilize external services, enhancing their functionality. Key components include:
- MCP Server (`mcp_server.py`): exposes services via the MCP framework.
- MCP Client (`mcp_client.py`): facilitates communication between the AI model and the MCP server.

```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
Clone Repository:

```bash
git clone <your-repository-url>
cd <repository-directory>
```

Install Dependencies:

```bash
pip install -r requirements.txt
```
Environment Variables Setup:

Create a `.env` file in the root directory and add the following variables:

```env
GEMINI_API_KEY=your_gemini_api_key

# Method 1: Service Account (Recommended for servers)
SERVICE_ACCOUNT_PATH=path/to/your/service_account.json
DRIVE_FOLDER_ID=your_google_drive_folder_id

# Method 2: OAuth 2.0 (Requires user interaction for the first run)
CREDENTIALS_PATH=path/to/your/credentials.json
TOKEN_PATH=token.json

# Method 3: Base64 Encoded Service Account Config (Alternative)
CREDENTIALS_CONFIG=your_base64_encoded_service_account_json_string
```
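As a hedged illustration of how the three credential methods above could be resolved in order of preference (the helper name `resolve_google_credentials` is hypothetical; the repository's actual loading logic may differ):

```python
import base64
import json
import os


def resolve_google_credentials():
    """Return (method, info) for whichever credential method is configured.

    Checks the three .env options in order: service-account file,
    OAuth client file, then base64-encoded service-account JSON.
    """
    sa_path = os.getenv("SERVICE_ACCOUNT_PATH")
    if sa_path and os.path.exists(sa_path):
        with open(sa_path, encoding="utf-8") as f:
            return "service_account", json.load(f)

    cred_path = os.getenv("CREDENTIALS_PATH")
    if cred_path and os.path.exists(cred_path):
        return "oauth", {
            "credentials_path": cred_path,
            "token_path": os.getenv("TOKEN_PATH", "token.json"),
        }

    encoded = os.getenv("CREDENTIALS_CONFIG")
    if encoded:
        # Decode the base64 string back into the service-account JSON dict.
        return "service_account", json.loads(base64.b64decode(encoded))

    raise RuntimeError("No Google credentials configured in .env")
```

In a real deployment you would load `.env` first (e.g. with `python-dotenv`'s `load_dotenv()`) and then pass the returned info to the Google auth library of your choice.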
Run MCP Server:

```bash
python mcp_server.py
```

Start FastAPI Application:

```bash
python main.py
# or using uvicorn directly for more options:
# uvicorn main:app --reload --host 0.0.0.0 --port 8000
```
Connect AI Application to MCP Server:

Interact with the AI through the `/chat` endpoint via HTTP POST requests.
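As a minimal sketch of such a request (the `{"message": ...}` payload shape is an assumption, not taken from the project's actual schema; check `main.py` for the real field names), built with the standard library:

```python
import json
import urllib.request

CHAT_URL = "http://localhost:8000/chat"  # default host/port from the uvicorn example


def build_chat_request(message: str) -> urllib.request.Request:
    """Build a POST request to the /chat endpoint.

    The payload shape here is illustrative only.
    """
    body = json.dumps({"message": message}).encode("utf-8")
    return urllib.request.Request(
        CHAT_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_chat_request("List the rows in my sheet")
# With the server running, urllib.request.urlopen(req) returns the AI reply.
```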
AI applications can create, update, or fetch data from Google Sheets using the `GoogleSheetsService`.
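Such updates are typically driven through the Google Sheets API v4. A hedged sketch of the payload side of an append operation follows; the helper `rows_to_value_range` is illustrative and not the project's actual code:

```python
from typing import Any


def rows_to_value_range(rows: list[list[Any]]) -> dict:
    """Build the request body for a Sheets API values.append call.

    Every cell is stringified so tool output (numbers, bools)
    serializes predictably.
    """
    return {"values": [[str(cell) for cell in row] for row in rows]}


# Illustrative usage with google-api-python-client (not runnable without
# credentials; see the .env setup above):
#
#   from googleapiclient.discovery import build
#   service = build("sheets", "v4", credentials=creds)
#   sheet = service.spreadsheets().create(
#       body={"properties": {"title": "Quirox Export"}}).execute()
#   service.spreadsheets().values().append(
#       spreadsheetId=sheet["spreadsheetId"],
#       range="Sheet1!A1",
#       valueInputOption="RAW",
#       body=rows_to_value_range([["name", "score"], ["alice", 10]]),
#   ).execute()
```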
Technical Implementation: `ghseet.py` handles creating a new spreadsheet and adding rows.

For web-based tasks such as content fetching and search queries, the backend provides integrated web services using a DuckDuckGo searcher.
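A hedged sketch of the kind of parsing such a searcher performs (the `result__a` class matches DuckDuckGo's HTML results endpoint at the time of writing, but the project's implementation may work differently):

```python
from html.parser import HTMLParser


class DuckDuckGoResultParser(HTMLParser):
    """Collect {title, url} dicts from DuckDuckGo HTML search results.

    Result links carry the CSS class "result__a" on the HTML endpoint.
    """

    def __init__(self):
        super().__init__()
        self.results = []
        self._in_result_link = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "result__a" in attrs.get("class", ""):
            self._in_result_link = True
            self.results.append({"title": "", "url": attrs.get("href", "")})

    def handle_data(self, data):
        if self._in_result_link:
            self.results[-1]["title"] += data

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_result_link = False


sample = '<div><a class="result__a" href="https://example.com">Example Site</a></div>'
parser = DuckDuckGoResultParser()
parser.feed(sample)
# parser.results → [{"title": "Example Site", "url": "https://example.com"}]
```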
Technical Implementation: `search_tools.py` fetches and parses the results from DuckDuckGo.

| MCP Client | Resources | Tools | Prompts |
| --- | --- | --- | --- |
| Claude Desktop | ✅ | ✅ | ✅ |
| Continue | ✅ | ✅ | ✅ |
| Cursor | ❌ | ✅ | ❌ |
```json
{
  "mcpServers": {
    "[server-name]": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-[name]"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
The Performance and Compatibility Matrix above outlines the interoperability of the MCP server across different AI clients.
Update the `.env` file to include specific details for authentication and service configuration, and ensure secure handling of API keys and credentials. Customize `search_tools.py` for web searches.

Q: Can other AI applications besides Google Gemini be integrated? A: Yes, the MCP server supports multiple AI applications through custom MCP clients.
Q: How do I enable OAuth 2.0 authentication? A: Follow the Google Cloud Platform documentation for detailed steps on setting up OAuth credentials.
Q: What is the maximum number of queries per minute? A: Use rate limiting within `search_tools.py` to control the frequency of requests.
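The rate-limiting advice above could be implemented with a simple token bucket, sketched here (this helper is illustrative and not part of the repository):

```python
import time


class TokenBucket:
    """Allow at most `rate` requests per `per` seconds (token bucket)."""

    def __init__(self, rate: int, per: float = 60.0):
        self.capacity = rate
        self.tokens = float(rate)
        self.fill_rate = rate / per      # tokens refilled per second
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Consume one token if available; return False when throttled."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.fill_rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


bucket = TokenBucket(rate=30, per=60)   # e.g. 30 search queries per minute
# if bucket.allow(): run the search; else sleep or return a throttle error
```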
Q: How do I contribute to the project? A: Review the contribution guidelines for submitting issues, enhancements, and features.
Q: Are there any known limitations with the current version? A: The server currently supports tools without full AI prompt integration, which is being addressed in future updates.
To contribute to the project, review the contribution guidelines for submitting issues, enhancements, and features.
Explore additional resources within the Model Context Protocol ecosystem:
This document aims to be a comprehensive resource for developers integrating MCP servers into AI workflows.
- Security MCP: tools for threat hunting, malware analysis, and enhancing cybersecurity practices.
- Puppeteer: browser automation for web navigation, screenshots, and DOM analysis.
- Taobao MCP Service: cross-platform e-commerce link conversion and product promotion, supporting Taobao, JD, and Pinduoduo integrations.
- mcp_stdio2sse: integration methods to enhance data streaming and system performance.
- PocketBase: a customizable MCP server for record, collection, and migration management.
- NOAA Tides & Currents: API tools via a FastMCP server for real-time and historical marine data.