Build a modular MCP server with NestJS to interact with LLMs and tools via HTTP for scalable applications
The MCP (Model Context Protocol) server built with NestJS is an extensible, modular framework designed to let AI models (such as Claude or GPT) and external clients interact with specialized tools through HTTP requests. With dynamic tool registration and robust validation, it integrates cleanly into the AI workflow ecosystem.
The MCP server acts as a mediator and is built on NestJS, a progressive Node.js framework for efficient and scalable server-side applications. This project shows how AI applications can use MCP to connect with backend tools and services through a standardized protocol, and it serves as a base for developers integrating LLMs (such as Claude or GPT) into their workflows with consistent interaction and data handling.
The server is built around the core concept of dynamic tool registration and execution via HTTP endpoints. Each tool (calculator, temperature, filesystem, code-analyzer-local) is registered with a name, a Zod validation schema, and a handler function. This modular architecture allows for easy expansion and customization, making it highly adaptable to different AI application needs.
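To make the pattern concrete, here is a minimal sketch of such a registry. The ToolRegistryService name, its register/execute methods, and the exact calculator operation strings are illustrative assumptions rather than the project's actual API; only the general shape (name + Zod schema + handler) comes from the description above.

import { Injectable } from "@nestjs/common";
import { z, ZodSchema } from "zod";

// A tool bundles a name, a Zod validation schema, and a handler function
interface ToolDefinition<T> {
  name: string;
  schema: ZodSchema<T>;
  handler: (params: T) => Promise<unknown>;
}

@Injectable()
export class ToolRegistryService {
  private readonly tools = new Map<string, ToolDefinition<any>>();

  register<T>(tool: ToolDefinition<T>): void {
    this.tools.set(tool.name, tool);
  }

  async execute(toolName: string, params: unknown): Promise<unknown> {
    const tool = this.tools.get(toolName);
    if (!tool) throw new Error(`Unknown tool: ${toolName}`);
    // Validate incoming params against the tool's schema before running the handler
    return tool.handler(tool.schema.parse(params));
  }
}

// Registering the calculator tool could then look like this
// (instantiated directly here for illustration; in NestJS it would be injected).
// Only "multiply" appears in the examples below; the other operation names are assumed.
const registry = new ToolRegistryService();
registry.register({
  name: "calculator",
  schema: z.object({
    a: z.number(),
    b: z.number(),
    operation: z.enum(["add", "subtract", "multiply", "divide"]),
  }),
  handler: async ({ a, b, operation }) =>
    operation === "add" ? a + b
      : operation === "subtract" ? a - b
      : operation === "multiply" ? a * b
      : a / b,
});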
calculator: Supports basic mathematical operations (addition, subtraction, multiplication, division).
{
"toolName": "calculator",
"params": {
"a": 10,
"b": 2,
"operation": "multiply"
}
}
temperature: Provides a fictional temperature based on a prompt.
{
"toolName": "temperature",
"params": {
"prompt": "Temperatura en Madrid"
}
}
filesystem: Handles file operations like reading, writing, and deletion.
{
"toolName": "filesystem",
"params": {
"path": "src/main.ts",
"action": "read"
}
}
code-analyzer-local: Generates descriptive .md files by analyzing code directories.
{
"toolName": "code-analyzer-local",
"params": {
"directory": "mcp-server/src"
}
}
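Each of the payloads above is sent to the server over HTTP. As a rough sketch of a client call (the /tools/execute route and port 3000 are assumptions; substitute the endpoint the server's controller actually exposes):

// Hypothetical call to the tool-execution endpoint from a TypeScript client
async function runCalculator() {
  const response = await fetch("http://localhost:3000/tools/execute", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      toolName: "calculator",
      params: { a: 10, b: 2, operation: "multiply" },
    }),
  });
  // The response shape depends on the server; it should carry the result, e.g. 20
  console.log(await response.json());
}
runCalculator();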
The architecture of the MCP server is modular, so tools can be registered and invoked dynamically. Zod plays a crucial role in validating both user inputs and AI model outputs to keep the data flow consistent, for example:
import { z } from "zod";
// Define expected schema
const TaskSchema = z.object({
title: z.string(),
completed: z.boolean(),
});
// Validate LLM output (llmOutput stands in for a JSON string returned by the model)
const llmOutput = '{"title": "Write the report", "completed": false}';
try {
const parsed = TaskSchema.parse(JSON.parse(llmOutput));
console.log("✅ Valid data:", parsed);
} catch (err) {
console.error("❌ Invalid data:", err);
}
The MCP protocol flow can be visualized as:
graph TD
A[AI Application] --> B[MCP Client]
B --> C[MCP Server]
C --> D[Data Source/Tool]
style A fill:#e1f5fe
style C fill:#f3e5f5
style D fill:#e8f5e8
To install the necessary dependencies, navigate to your project directory and run:
npm install zod
Once installed, you can configure the server using a .env file:
PORT=3000
ANALYSIS_DIR=./src/docs
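These variables can then be picked up when the application boots. A minimal sketch, assuming the standard @nestjs/config package is installed and AppModule is the project's root module (ANALYSIS_DIR would similarly be read by the code-analyzer-local tool):

// app.module.ts (sketch): ConfigModule.forRoot() loads variables from .env
import { Module } from "@nestjs/common";
import { ConfigModule } from "@nestjs/config";

@Module({
  imports: [ConfigModule.forRoot()],
})
export class AppModule {}

// main.ts (sketch): PORT controls where the HTTP server listens
import { NestFactory } from "@nestjs/core";
import { AppModule } from "./app.module";

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  await app.listen(process.env.PORT ?? 3000);
}
bootstrap();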
Real-world use cases include:
Financial analysis tool integration
Code optimization pipelines
The server supports a range of MCP clients; the compatibility matrix below summarizes their support for MCP features:
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
On the client side, connecting to an MCP server involves specifying the command that launches it along with any environment variables it needs. Here's a sample client configuration:
{
"mcpServers": {
"[server-name]": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-[name]"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
Contributions can be made by cloning the repository, setting up the environment, and submitting pull requests for new features or fixes.
For more resources, developers are encouraged to explore official documentation, courses, video tutorials, and a job board dedicated to AI applications and MCP integrations.
This comprehensive documentation positions the MCP server as a robust solution for integrating AI applications with backend tools, enhancing their capabilities through standardized protocols.