Efficiently query AWS resources using a containerized MCP server with Python and boto3 integration
The AWS Resources Model Context Protocol (MCP) Server is a specialized implementation that leverages Python and boto3 for secure, flexible access to AWS resources through a standardized protocol. This server provides developers with a robust toolset to manage AWS resources in a controlled environment while ensuring security and compliance.
This server offers several core features designed to enhance the integration between AI applications and AWS services:
Dynamic Resource Management: The aws://query_resources resource allows dynamic access to various AWS services via boto3 queries, providing a flexible interface for querying and modifying resources.
Python-Based Execution: It runs directly from a Docker image, eliminating the need for local setup and ensuring consistent execution across different environments.
Secure Code Execution: The server includes strict safety features such as AST-based code analysis to validate imports and code structure, limited built-in functions, and proper error handling.
Containerization and Clean Setup: Everything is containerized and clean, making it easy for Python developers to contribute back with minimal setup overhead.
Cross-Platform Support: The server supports multiple platforms (Linux/amd64, Linux/arm64, Linux/arm/v7), ensuring broad compatibility across different systems.
The AWS Resources MCP Server implements the Model Context Protocol to enable seamless integration between AI applications and AWS resources. Key aspects of its architectural design include:
Dynamic Resource Interface: The server exposes a dynamic resource named aws://query_resources that allows generating Python code for querying any AWS service through boto3.
Code Execution Environment: It provides a sandboxed execution environment with restricted imports and operations, ensuring secure and controlled code execution.
MCP Protocol Compliance: The implementation adheres to MCP standards, allowing seamless communication between the server and various AI applications like Claude Desktop, Continue, and Cursor.
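The AST-based import validation mentioned above can be sketched with Python's `ast` module. This is a minimal illustration of the technique, not the server's actual implementation; the module allowlist below is hypothetical:

```python
import ast

# Hypothetical allowlist of modules a sandbox might permit.
ALLOWED_MODULES = {"boto3", "operator", "json", "datetime"}

def validate_code(source: str) -> bool:
    """Parse code and reject any import outside the allowlist."""
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            # `import x.y` counts as importing top-level module `x`
            if any(alias.name.split(".")[0] not in ALLOWED_MODULES
                   for alias in node.names):
                return False
        elif isinstance(node, ast.ImportFrom):
            if (node.module or "").split(".")[0] not in ALLOWED_MODULES:
                return False
    return True

print(validate_code("import boto3\ns3 = boto3.client('s3')"))  # True
print(validate_code("import os\nos.system('ls')"))             # False
```

Walking the full AST (rather than just top-level statements) catches imports nested inside functions or conditionals as well.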
To install and run the AWS Resources MCP Server for Claude Desktop using Smithery, follow these steps:
Install via Smithery:

```bash
npx -y @smithery/cli install mcp-server-aws-resources-python --client claude
```
Docker Installation: You can either pull the Docker image from Docker Hub or build it locally. The server is compatible with Linux/amd64, Linux/arm64, and Linux/arm/v7.
Pull from Docker Hub:

```bash
docker pull buryhuang/mcp-server-aws-resources:latest
```

Build locally:

```bash
docker build -t mcp-server-aws-resources .
```
Run the container with necessary environment variables:
```bash
docker run \
  -e AWS_ACCESS_KEY_ID=your_access_key_id_here \
  -e AWS_SECRET_ACCESS_KEY=your_secret_access_key_here \
  -e AWS_DEFAULT_REGION=us-east-1 \
  buryhuang/mcp-server-aws-resources:latest
```
Alternatively, using stored credentials and a profile:
```bash
docker run \
  -e AWS_PROFILE=[AWS_PROFILE_NAME] \
  -v ~/.aws:/root/.aws \
  buryhuang/mcp-server-aws-resources:latest
```
Cross-Platform Publishing:
To publish the Docker image for multiple platforms, use docker buildx as follows:
```bash
docker buildx create --use
docker buildx build --platform linux/amd64,linux/arm64,linux/arm/v7 -t buryhuang/mcp-server-aws-resources:latest --push .
docker buildx imagetools inspect buryhuang/mcp-server-aws-resources:latest
```
Real-world use cases for the AWS Resources MCP Server include:
Automated S3 Bucket Management:
```python
import boto3

# A default session picks up credentials from the environment or ~/.aws
session = boto3.Session()
s3 = session.client('s3')
result = s3.list_buckets()
```
Deployment Tracking in CodePipeline:

```python
import boto3
from operator import itemgetter

session = boto3.Session()

def get_latest_deployment(pipeline_name):
    """Return details of the most recent successful pipeline execution."""
    codepipeline = session.client('codepipeline')
    result = codepipeline.list_pipeline_executions(
        pipelineName=pipeline_name,
        maxResults=5
    )
    if result['pipelineExecutionSummaries']:
        # Pick the newest execution among those that succeeded
        latest_execution = max(
            [e for e in result['pipelineExecutionSummaries']
             if e['status'] == 'Succeeded'],
            key=itemgetter('startTime'),
            default=None
        )
        if latest_execution:
            result = codepipeline.get_pipeline_execution(
                pipelineName=pipeline_name,
                pipelineExecutionId=latest_execution['pipelineExecutionId']
            )
        else:
            result = None
    else:
        result = None
    return result

result = get_latest_deployment("your-pipeline-name")
```
The AWS Resources MCP Server is compatible with multiple AI applications, including Claude Desktop, Continue, and Cursor.
This ensures that developers can integrate the server seamlessly into their AI workflows without worrying about client-specific limitations.
Below is a compatibility matrix highlighting MCP client support:
| MCP Client | Resources | Tools | Prompts |
|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ |
| Continue | ✅ | ✅ | ✅ |
| Cursor | ❌ | ✅ | ❌ |
Advanced users can configure the server using environment variables and custom MCP configurations. An example configuration is as follows:
```json
{
  "mcpServers": {
    "[server-name]": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-[name]"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
This configuration ensures that the server is securely set up with appropriate API keys and other necessary settings.
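For this Docker-based server specifically, the entry would invoke `docker` rather than `npx`. The sketch below is an assumption adapted from the `docker run` flags shown earlier; the `aws-resources` server name is illustrative:

```json
{
  "mcpServers": {
    "aws-resources": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "AWS_ACCESS_KEY_ID",
        "-e", "AWS_SECRET_ACCESS_KEY",
        "-e", "AWS_DEFAULT_REGION",
        "buryhuang/mcp-server-aws-resources:latest"
      ],
      "env": {
        "AWS_ACCESS_KEY_ID": "your_access_key_id_here",
        "AWS_SECRET_ACCESS_KEY": "your_secret_access_key_here",
        "AWS_DEFAULT_REGION": "us-east-1"
      }
    }
  }
}
```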
The server uses AST-based code analysis to validate imports and code structure, limiting accessible operations through a sandboxed environment. This ensures that only permissible code can run without compromising security.
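The limited-builtins idea can be illustrated with a minimal Python sketch. This is a hypothetical allowlist for demonstration only, not the server's actual sandbox:

```python
# Hypothetical sandbox: expose only a small set of builtins to executed code.
SAFE_BUILTINS = {"len": len, "max": max, "min": min, "sorted": sorted}

def run_sandboxed(source: str) -> dict:
    """Execute code with restricted builtins and return its namespace."""
    namespace = {"__builtins__": SAFE_BUILTINS}
    exec(source, namespace)
    namespace.pop("__builtins__")
    return namespace

ns = run_sandboxed("result = max([3, 1, 2])")
print(ns["result"])  # 3
```

Because `__import__` is absent from the restricted builtins, any `import` statement inside the executed code raises an ImportError, complementing the static AST check.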
This server is fully compatible with Claude Desktop and Continue for resources, tools, and prompts. Cursor supports tools but not full resource management.
You can use Smithery to install the server or pull it from Docker Hub. Follow the installation instructions provided in the documentation to set up a seamless integration with your workflow.
The GitHub repository hosts detailed contribution guidelines that walk developers through setting up the project and contributing new features or bug fixes.
While the server is highly reliable, some AI applications may not support all resource management functions. Always refer to the compatibility matrix for specific client support details.
If you are interested in contributing to this project, please follow the contribution guidelines in the GitHub repository.
Explore more about the broader Model Context Protocol ecosystem, including other open-source projects, tutorials, and community support through official repositories and discussion forums.
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
By following this comprehensive document, developers can effectively use the AWS Resources MCP Server to enhance their AI applications with secure and efficient access to AWS resources.