Learn how to set up and run remote MCP servers with Azure Container Apps using Node.js and TypeScript
This document provides detailed instructions for setting up and using an MCP (Model Context Protocol) server in a Node.js and TypeScript environment, hosted on Azure Container Apps. The guide aims to enable seamless integration and collaboration among AI tools and services through the standardized Model Context Protocol.
The remote MCP server, running in an Azure Container App, is the component that lets different AI models and tools communicate by implementing the Model Context Protocol (MCP). The protocol standardizes interoperability and data exchange between diverse AI applications so they can collaborate effectively. In that role, the server acts as a bridge: it runs tools, manages context, and keeps interactions secure.
This section outlines the core features provided by our remote MCP server implementation and aligns with key capabilities of the Model Context Protocol:
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
The architecture of the remote MCP server is designed to handle interactions between various AI tools and services via MCP. The following diagram illustrates the flow of information within this architecture:
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Postgres DB]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
The following Mermaid diagram outlines the flow of the Model Context Protocol within our remote server:
```mermaid
flowchart TD
    user(("User or Agent"))
    client[Client]
    azureOpenAI[Azure OpenAI API]
    GitHubAPI[GitHub API]
    openAITexts["OpenAI API (Text)"]
    service["Remote MCP Service"]
    database[(Postgres DB)]
    user --> client
    client -->|SSE| service
    service -- handleRequest --> azureOpenAI & GitHubAPI & openAITexts
    service -->|StoreResults| database
    style service fill:#f9e79f,stroke:#333
```
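In this flow, the client reaches the remote MCP service over a long-lived SSE connection and posts follow-up messages back to it. A minimal sketch of how such an SSE endpoint might be wired up with Express and the `@modelcontextprotocol/sdk` SSE transport (the `/sse` and `/messages` paths, the port, and the service name are assumptions for illustration, not taken from the project):

```typescript
import express from "express";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { SSEServerTransport } from "@modelcontextprotocol/sdk/server/sse.js";

const app = express();
const server = new McpServer({ name: "remote-mcp-service", version: "0.1.0" });

// One SSE transport per connected client session.
const transports: Record<string, SSEServerTransport> = {};

// Clients open a long-lived SSE stream here.
app.get("/sse", async (_req, res) => {
  const transport = new SSEServerTransport("/messages", res);
  transports[transport.sessionId] = transport;
  res.on("close", () => {
    delete transports[transport.sessionId];
  });
  await server.connect(transport);
});

// Clients post JSON-RPC messages back, keyed by session ID.
app.post("/messages", async (req, res) => {
  const transport = transports[req.query.sessionId as string];
  if (!transport) {
    res.status(400).send("Unknown session");
    return;
  }
  await transport.handlePostMessage(req, res);
});

app.listen(3000, () =>
  console.log("MCP SSE endpoint on http://localhost:3000/sse")
);
```

In an Azure Container App, the container simply exposes this HTTP port; the client configuration shown later in this guide points at the `/sse` path.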
To set up and run the remote MCP server on your local environment or deploy it using Azure Container Apps, follow these steps:
1. **Prerequisites**: Node.js (version 20+), npm, Docker (optional, for running Postgres in a container), and an existing or newly created project.
2. **Prepare the environment** by starting the Postgres container (an illustrative compose file is shown after these steps):
   ```bash
   docker compose up -d --build postgres
   ```
3. **Clone the repository and install dependencies**:
   ```bash
   npm install
   ```
4. **Start the server**:
   ```bash
   npm start
   ```
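The `docker compose` command in step 2 assumes a compose file that defines a `postgres` service. The project's actual file may differ; the following is only an illustrative sketch (image tag, credentials, database name, and volume are assumptions):

```yaml
services:
  postgres:
    image: postgres:16          # the project may build its own image instead
    environment:
      POSTGRES_USER: mcp
      POSTGRES_PASSWORD: mcp    # use a secret in real deployments
      POSTGRES_DB: todos
    ports:
      - "5432:5432"
    volumes:
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:
```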
This use case involves managing TODO items using a Postgres database and integrating with various AI tools. Users can add, list, complete, or delete tasks within the application environment.
Technical Implementation: The client sends requests to the MCP server endpoint, which processes these requests through well-defined functions that interact with the Postgres database managing the task lists.
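As an illustration, a tool for adding a task might be registered roughly as follows. This is a sketch only, assuming the `@modelcontextprotocol/sdk` server API, the `pg` client, and an illustrative `todos` table; the actual project may name things differently:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { Pool } from "pg";
import { z } from "zod";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });
const server = new McpServer({ name: "todo-mcp-server", version: "0.1.0" });

// Tool: add a TODO item to the Postgres-backed task list.
server.tool(
  "add_todo",
  { title: z.string().describe("Short description of the task") },
  async ({ title }) => {
    const { rows } = await pool.query(
      "INSERT INTO todos (title, completed) VALUES ($1, false) RETURNING id",
      [title]
    );
    return {
      content: [{ type: "text", text: `Added TODO #${rows[0].id}: ${title}` }],
    };
  }
);

// Tool: list open TODO items.
server.tool("list_todos", async () => {
  const { rows } = await pool.query(
    "SELECT id, title FROM todos WHERE completed = false ORDER BY id"
  );
  return {
    content: [{ type: "text", text: JSON.stringify(rows, null, 2) }],
  };
});
```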
Another use case showcases how different AI applications can communicate and collaborate using the remote MCP server. For example, integrating multiple tools such as text generation APIs, knowledge base queries, and scheduling services ensures a cohesive user experience across all applications.
To connect the MCP server to your local client environment for development purposes, follow these steps:
```json
{
  "servers": {
    "mcp-server-sse": {
      "type": "sse",
      "url": "http://localhost:3000/sse"
    }
  }
}
```
Add this `mcp.json` configuration file to your VS Code workspace; the client will connect to the server's SSE endpoint at `http://localhost:3000/sse`. Install and start the MCP Inspector using:
```bash
npx -y @modelcontextprotocol/inspector@latest node build/index.js
```
Open the web app from the URL provided by the Inspector.
Tool Operations: Test communication between the client and server by running tools.
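For a quick scripted check outside the Inspector, a client can connect over SSE, list the server's tools, and invoke one. A minimal sketch using the SDK's client classes (the `add_todo` tool name refers to the hypothetical example earlier in this guide; the URL matches the local configuration above):

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

async function main() {
  const client = new Client({ name: "smoke-test-client", version: "0.1.0" });
  const transport = new SSEClientTransport(new URL("http://localhost:3000/sse"));
  await client.connect(transport);

  // Discover what the server exposes.
  const { tools } = await client.listTools();
  console.log("Available tools:", tools.map((t) => t.name));

  // Call the (hypothetical) add_todo tool defined earlier.
  const result = await client.callTool({
    name: "add_todo",
    arguments: { title: "Verify the SSE connection" },
  });
  console.log("Tool result:", JSON.stringify(result, null, 2));

  await client.close();
}

main().catch(console.error);
```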
While the remote MCP server provides full support for Claude Desktop, Continue, and other clients, it offers only limited compatibility with Cursor, which currently supports tools but not resources or prompts.
When configuring the remote MCP server, ensure secure connections and implement proper security measures such as API key validation, role-based access control (RBAC), and encrypted data storage.
```json
{
  "mcpServers": {
    "[server-name]": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-[name]"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
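On the server side, one way to realize the API key validation mentioned above is a small Express middleware placed in front of the MCP endpoints. This is only a sketch; the header name, environment variable, and protected routes are assumptions:

```typescript
import type { NextFunction, Request, Response } from "express";

// Reject requests whose x-api-key header does not match the configured key.
export function requireApiKey(req: Request, res: Response, next: NextFunction) {
  const expected = process.env.API_KEY;
  if (!expected || req.header("x-api-key") !== expected) {
    res.status(401).json({ error: "Invalid or missing API key" });
    return;
  }
  next();
}

// Example wiring (hypothetical): protect the SSE and message routes.
// app.use(["/sse", "/messages"], requireApiKey);
```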
Q1: Is the remote MCP server compatible with other AI tools and services?
A1: Yes, the remote MCP server is compatible with a wide range of AI tools and services. Refer to the integration matrix above for specific details.
Q2: How can additional clients be integrated?
A2: Additional clients can be integrated by configuring them as needed in `mcp.json`. Ensure the client supports SSE and follows the MCP specification.
Q3: Are there security considerations to keep in mind?
A3: Yes. Implement robust security measures such as secure connections, proper authentication, and encrypted data handling to protect your system.
Q4: How should larger datasets or more complex operations be handled?
A4: For larger datasets or more complex operations, optimize API endpoints and use batch operations when interacting with the Postgres database.
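As one example of the batching suggested in A4, multiple TODO items can be inserted in a single round trip instead of one query per item. A sketch, again assuming the illustrative `todos` table and the `pg` client:

```typescript
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Insert many titles with a single multi-row INSERT statement.
export async function addTodosBatch(titles: string[]): Promise<void> {
  if (titles.length === 0) return;
  const placeholders = titles.map((_, i) => `($${i + 1}, false)`).join(", ");
  await pool.query(
    `INSERT INTO todos (title, completed) VALUES ${placeholders}`,
    titles
  );
}
```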
This guide positions the remote MCP server as a reliable foundation for Model Context Protocol integrations across diverse AI applications, providing dependable communication and seamless collaboration.