Implement MCP server for TOS data retrieval with tools for listing buckets, objects, and getting objects efficiently
The TOS Model Context Protocol (MCP) Server is a specialized implementation designed to facilitate seamless data retrieval and manipulation for AI applications via MCP. This server acts as a bridge between AI applications such as Claude Desktop, Continue, and Cursor and VolcEngine's Object Storage (TOS), enabling them to leverage vast storage resources with minimal integration effort.
The TOS MCP Server offers a robust suite of tools that enable fine-grained control over data management. Key features include:

- Listing the buckets available to your credentials
- Listing the objects stored within a bucket
- Retrieving object contents efficiently

These capabilities form the backbone of the MCP integration, making it easier for AI applications to interact with storage solutions as if they were local filesystems.
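To make the filesystem analogy concrete, the three retrieval operations above map naturally onto familiar shell commands. The mapping below is purely illustrative and is not part of the server's API:

```python
# Illustrative only: how the TOS MCP tools correspond to filesystem commands.
TOOL_ANALOGY = {
    "list buckets": "ls /             (enumerate top-level 'drives')",
    "list objects": "ls /bucket/dir   (enumerate objects under a prefix)",
    "get object":   "cat /bucket/key  (read an object's contents)",
}

for operation, shell_equivalent in TOOL_ANALOGY.items():
    print(f"{operation:12s} ~ {shell_equivalent}")
```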
The TOS MCP Server is architected to comply closely with the Model Context Protocol (MCP) standard. Configuration is driven by four environment variables: `VOLC_ACCESSKEY`, `VOLC_SECRETKEY`, `REGION`, and `TOS_ENDPOINT`. Optionally, users can specify a security token or restrict access to specific buckets.

To deploy the TOS MCP Server, first export your credentials:
```shell
export VOLC_ACCESSKEY=your-access-key
export VOLC_SECRETKEY=your-secret-key
export REGION=tos-region
export TOS_ENDPOINT=tos-endpoint-url
```
Then register the server in your MCP client configuration:

```json
{
  "mcpServers": {
    "tos-mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "/ABSOLUTE/PATH/TO/PARENT/FOLDER/src/mcp_server_tos",
        "run",
        "main.py"
      ]
    }
  }
}
```
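Because a malformed client configuration is a common source of silent failures, it can help to build and validate the entry programmatically before saving it. A minimal sketch (the directory placeholder must still be replaced with your real checkout path):

```python
import json

# Rebuild the mcpServers entry shown above
config = {
    "mcpServers": {
        "tos-mcp-server": {
            "command": "uv",
            "args": [
                "--directory",
                "/ABSOLUTE/PATH/TO/PARENT/FOLDER/src/mcp_server_tos",
                "run",
                "main.py",
            ],
        }
    }
}

# Round-trip through the JSON encoder to catch structural mistakes early
text = json.dumps(config, indent=2)
parsed = json.loads(text)
assert parsed["mcpServers"]["tos-mcp-server"]["command"] == "uv"
```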
Alternatively, you can use a more concise setup:
```json
{
  "mcpServers": {
    "tos-mcp-server": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/volcengine/ai-app-lab#subdirectory=mcp/server/mcp_server_tos",
        "mcp-server-tos"
      ],
      "env": {
        "VOLC_ACCESSKEY": "your ak",
        "VOLC_SECRETKEY": "your sk",
        "REGION": "tos region",
        "TOS_ENDPOINT": "tos endpoint",
        "SECURITY_TOKEN": "your security token",
        "TOS_BUCKET": "your specific bucket"
      }
    }
  }
}
```
The TOS MCP Server addresses several critical use cases within the AI ecosystem:
Suppose you have a dataset distributed across multiple buckets in TOS. By integrating the TOS MCP Server with your AI application via MCP, you can streamline data loading processes and improve training performance.
```python
import os
import subprocess

# Provide credentials to the server process via environment variables
os.environ['VOLC_ACCESSKEY'] = 'your-access-key'
os.environ['VOLC_SECRETKEY'] = 'your-secret-key'

# Launch the TOS MCP Server (installed as mcp-server-tos)
subprocess.run(['mcp-server-tos'], env=os.environ)
```
Integrate the TOS MCP Server into a model hosting pipeline to ensure models are easily accessible from any node in your infrastructure.
```python
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def fetch_model():
    # Spawn the TOS MCP Server over stdio and open a client session
    params = StdioServerParameters(command="mcp-server-tos")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Argument names are illustrative; check the tool's schema
            return await session.call_tool("get_object", {"bucket": "ai-models", "key": "path/to/model"})
```
The TOS MCP Server supports compatibility with several popular MCP clients:
| Client | Resources | Tools | Prompts | Status |
| --- | --- | --- | --- | --- |
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
While most major clients are fully supported, some features like prompt generation remain unsupported in certain instances.
The TOS MCP Server is designed to ensure high performance and compatibility across a wide range of use cases.
To fine-tune the server behavior, additional configuration options are available. Key settings include security tokens and restricted bucket access:
```json
"env": {
  "SECURITY_TOKEN": "your-security-token",
  "TOS_BUCKET": "your-restricted-bucket"
}
```
These settings can be tailored to meet specific security requirements or to control data access in multi-user environments.
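For illustration, restricted bucket access could be enforced with a check like the one below. This is a sketch of the idea, not the server's actual implementation, and it assumes `TOS_BUCKET` holds a single bucket name or a comma-separated list:

```python
import os

def allowed_buckets():
    # Assumed format: one name, or a comma-separated list; empty = unrestricted
    raw = os.environ.get("TOS_BUCKET", "")
    return {name.strip() for name in raw.split(",") if name.strip()}

def is_allowed(bucket):
    allowed = allowed_buckets()
    return not allowed or bucket in allowed

os.environ["TOS_BUCKET"] = "bucket1,bucket2"
print(is_allowed("bucket1"))  # True
print(is_allowed("private"))  # False
```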
**What is the difference between the `uv` and `uvx` commands?**
Use `uv` to run the server from a local checkout of the repository, while `uvx` fetches and runs it directly from the Git repository (including a subdirectory) without requiring a local clone.
**Can the server handle large datasets?**
Yes, it supports efficient data retrieval even for large datasets by optimizing cache and I/O operations.
**How are security tokens renewed?**
The server can be configured to renew tokens automatically or to require periodic manual refreshes, depending on your setup.
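The automatic-renewal option can be sketched as a small wrapper that refreshes the token shortly before it expires. Here `fetch_token` is a hypothetical callable standing in for your STS call; the real renewal mechanism depends on your credential provider:

```python
import time

class TokenManager:
    def __init__(self, fetch_token, margin=60):
        self.fetch_token = fetch_token  # hypothetical: returns (token, ttl_seconds)
        self.margin = margin            # renew this many seconds before expiry
        self.token = None
        self.expires_at = 0.0

    def get(self):
        # Refresh lazily when the token is missing or about to expire
        if self.token is None or time.time() >= self.expires_at - self.margin:
            self.token, ttl = self.fetch_token()
            self.expires_at = time.time() + ttl
        return self.token

manager = TokenManager(lambda: ("token-1", 3600))
print(manager.get())  # "token-1"
```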
**How is access secured?**
By default, all connections are encrypted, and access is controlled via environment variables, ensuring secure operation.
**Can the server run on cloud platforms?**
Yes, it is compatible with cloud platforms that support containerized deployments, such as Docker or Kubernetes.
Interested developers can contribute by submitting issues or pull requests on the GitHub repository (volcengine/ai-app-lab).
For more information on MCP and related resources, visit the official MCP documentation page.
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
```mermaid
graph TD
    T[TOS Storage] -->|MCP Request| I[Inventory Management]
    U[User Interface] -->|Data Fetch| I
    style T fill:#cde1ff
    style I fill:#dafaeb
```
This documentation positions the TOS MCP Server as a crucial component in modern AI workflows, providing comprehensive guidance for its implementation and integration with various MCP clients.