Learn how to serve tRPC routes via MCP with easy setup and implementation steps
tRPC <-> MCP

tRPC <-> MCP serves as a bridge between the tRPC framework and the Model Context Protocol (MCP), enabling developers to integrate AI applications with specific data sources and tools through a standardized interface. The server acts as an adapter, facilitating communication between tRPC routers and AI tools such as Claude Desktop, Continue, and Cursor.
The tRPC <-> MCP server provides core features that align with the Model Context Protocol (MCP): exposing tRPC procedures as MCP tools, maintaining compatibility across a range of clients, and moving data through a standardized protocol. Developers can reuse existing tRPC schema definitions to create robust APIs that meet MCP requirements.
The tRPC <-> MCP server supports a range of compatible MCP clients, including Claude Desktop, Continue, and Cursor. MCP ensures that these applications can efficiently interact with external services and data sources, so developers can integrate different tools and resources into their applications, enhancing functionality and improving user experience.
```mermaid
graph TD
    A[AI Application] -->|MCP Client| B[MCP Protocol]
    B --> C[MCP Server]
    C --> D[Data Source/Tool]
    style A fill:#e1f5fe
    style C fill:#f3e5f5
    style D fill:#e8f5e8
```
```mermaid
graph TD
    A[API Endpoint] --> B[tRPC Router]
    B --> C[MCP Metadata]
    C --> D[MCP Server]
    D --> E[Data Source/Tool]
    style A fill:#d1f3ff
    style C fill:#fff5e8
    style D fill:#e8f5e8
```
To get started, follow these steps:
1. Add MCP metadata to the tRPC schema:
```typescript
import { initTRPC } from '@trpc/server';
import { type McpMeta } from 'trpc-mcp';

const t = initTRPC.meta<McpMeta>().create();
```
2. Enable MCP for specific routes:
```typescript
import { z } from 'zod';

export const appRouter = t.router({
  sayHello: t.procedure
    .meta({ mcp: { enabled: true, description: 'Greet the user' } })
    .input(z.object({ name: z.string() }))
    .output(z.object({ greeting: z.string() }))
    .query(({ input }) => {
      return { greeting: `Hello ${input.name}!` };
    }),
});
```
3. Create and serve the MCP server:
```typescript
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
import { createMcpServer } from 'trpc-mcp';

const mcpServer = createMcpServer(
  { name: 'trpc-mcp-example', version: '0.0.1' },
  appRouter,
);

const transport = new StdioServerTransport();
await mcpServer.connect(transport);
```
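With the server speaking stdio, an MCP client can launch it directly. As a sketch, a Claude Desktop entry for this example could look like the following; the `tsx` runner and the `server.ts` filename are assumptions for illustration, not something the project prescribes:

```json
{
  "mcpServers": {
    "trpc-mcp-example": {
      "command": "npx",
      "args": ["-y", "tsx", "server.ts"]
    }
  }
}
```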
Consider an example where your application needs to fetch user data from a database. Using tRPC <-> MCP, this process is streamlined:
Define the Data Fetch Procedure:
```typescript
export const appRouter = t.router({
  fetchData: t.procedure
    .meta({ mcp: { enabled: true, description: 'Fetch user data' } })
    .input(z.object({ userId: z.string() }))
    .output(z.object({ user: z.object({ id: z.string(), name: z.string() }) }))
    .query(({ input }) => {
      // Logic to fetch user data from the database, keyed on input.userId
      return { user: { id: input.userId, name: 'John Doe' } };
    }),
});
```
Serve the Procedure via MCP:
```typescript
const mcpServer = createMcpServer(
  { name: 'trpc-mcp-example', version: '0.0.1' },
  appRouter,
);

const transport = new StdioServerTransport();
await mcpServer.connect(transport);
```
Suppose your application needs to interact with an external tool, such as a language model API:
Define the API Endpoints:
```typescript
export const appRouter = t.router({
  getResponse: t.procedure
    .meta({ mcp: { enabled: true, description: 'Get response from external tool' } })
    .input(z.object({ prompt: z.string() }))
    .output(z.object({ response: z.string() }))
    .query(({ input }) => {
      // Logic to send the prompt to the external tool and return its reply
      return { response: 'Here is the generated text.' };
    }),
});
```
Serve the API via MCP:
```typescript
const mcpServer = createMcpServer(
  { name: 'trpc-mcp-example', version: '0.0.1' },
  appRouter,
);

const transport = new StdioServerTransport();
await mcpServer.connect(transport);
```
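Once served, MCP clients discover the procedure through the protocol's standard `tools/list` request. The resulting tool entry could look roughly like this; the exact tool name and schema details depend on how trpc-mcp maps procedures, so treat this as an illustrative sketch:

```json
{
  "tools": [
    {
      "name": "getResponse",
      "description": "Get response from external tool",
      "inputSchema": {
        "type": "object",
        "properties": { "prompt": { "type": "string" } },
        "required": ["prompt"]
      }
    }
  ]
}
```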
The tRPC <-> MCP server ensures compatibility with a range of MCP clients, supporting full integration for applications like Claude Desktop and Continue. It provides a seamless way to connect AI tools through the MCP protocol, allowing for efficient data exchange and robust API capabilities.
```mermaid
graph LR
    A[AI Application] -->|MCP Client| B{Compatible Clients}
    B --> C[Claude Desktop]
    B --> D[Continue]
    B --> E[Cursor]
```
| MCP Client | Resources | Tools | Prompts | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
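The matrix above can also be encoded programmatically, which is handy when an application needs to adjust its behavior per client. This is a small self-contained TypeScript sketch; the `supports` helper is illustrative and not part of trpc-mcp:

```typescript
// Feature-support matrix for the MCP clients listed above,
// with values taken from the compatibility table in this document.
type Feature = "resources" | "tools" | "prompts";

const clientSupport: Record<string, Record<Feature, boolean>> = {
  "Claude Desktop": { resources: true, tools: true, prompts: true },
  "Continue":       { resources: true, tools: true, prompts: true },
  "Cursor":         { resources: false, tools: true, prompts: false },
};

// Returns true when a known client supports a feature; unknown clients map to false.
function supports(client: string, feature: Feature): boolean {
  return clientSupport[client]?.[feature] ?? false;
}

console.log(supports("Cursor", "tools"));   // true
console.log(supports("Cursor", "prompts")); // false
```

An application could use such a lookup to skip registering prompts for a tools-only client like Cursor.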
The tRPC <-> MCP server can be configured to enhance security and performance. The following configuration sample provides an example of how to set up the environment:
```json
{
  "mcpServers": {
    "[server-name]": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-[name]"],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```
Supplying secrets such as API keys through the env block keeps them out of the command line and out of source code, which helps keep the server properly secured in production environments.
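Inside the server process, values from the env block arrive via `process.env`. A small guard, sketched here (the `requireEnv` helper is illustrative, not part of trpc-mcp), fails fast at startup instead of failing later mid-request:

```typescript
// Illustrative helper: read a required secret from the environment,
// throwing immediately when the MCP client did not supply it.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// In practice this value is set by the MCP client's "env" block;
// it is set inline here only to make the sketch runnable.
process.env.API_KEY = "demo-key";
console.log(requireEnv("API_KEY")); // "demo-key"
```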
Q: How does tRPC <-> MCP ensure compatibility with different AI applications?
A: The server supports a wide range of clients, including Claude Desktop, Continue, and Cursor. Each client has specific capabilities; for instance, Claude Desktop and Continue offer full support across all features, while Cursor supports tools only.
Q: Can I use this server to integrate external tools with my application?
A: Yes, the tRPC <-> MCP server allows you to define API endpoints that interact with external tools seamlessly through the MCP protocol.
Q: How does the data flow work in the tRPC <-> MCP configuration?
A: The data flow starts from an AI application or client, which sends requests via the MCP Protocol. These requests are then processed by the server and may involve fetching data from a database or interacting with an external tool before returning results to the client.
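As a concrete sketch of that flow, calling the `sayHello` tool from the setup example is a JSON-RPC exchange over the transport. The request from the client might look like this (the exact tool name mapping is an assumption about trpc-mcp's conventions):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "sayHello",
    "arguments": { "name": "Ada" }
  }
}
```

and the server's reply carries the procedure's output in an MCP tool result (how the output object is serialized into the text content is likewise an assumption):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [{ "type": "text", "text": "{\"greeting\":\"Hello Ada!\"}" }]
  }
}
```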
Q: Is there any performance overhead when using this integration?
A: The implementation is designed to minimize performance overhead, particularly in high-traffic scenarios. Proper configuration can significantly enhance performance and reduce latency.
Q: Can I customize the MCP metadata for specific procedures?
A: Absolutely. You can attach detailed metadata to each procedure via the .meta() call, providing a description and other fields that MCP clients rely on; the input and output types come from the procedure's .input() and .output() schemas.
Contributions to the tRPC <-> MCP project are encouraged and greatly appreciated. If you wish to contribute, please follow the project's contribution guidelines.
Stay up to date with the latest developments in the Model Context Protocol ecosystem through the official documentation and community resources. These resources offer deeper insight into MCP and its applications, helping ensure your integration efforts are as effective as possible.
This documentation positions the tRPC <-> MCP server as a practical tool for integrating tRPC routes with Model Context Protocol clients, enhancing AI workflows and application capabilities.