Simplify MCP server integration with LangChain using a TypeScript utility for seamless external tool access
This document provides comprehensive technical documentation for integrating an MCP server with LangChain using TypeScript, including setup instructions, usage examples, and advanced configurations. The utility described simplifies the process of leveraging 2000+ functional components available as MCP servers to enhance LangChain applications.
Model Context Protocol (MCP) is an open-source technology announced by Anthropic that dramatically expands a language model's scope by enabling integration with external tools and resources. Over 2000 functional components are available as MCP servers, including but not limited to Google Drive, Slack, Notion, Spotify, Docker, PostgreSQL, and more. This utility streamlines the use of these services in LangChain applications.
The core feature of this utility is the convertMcpToLangchainTools() function, which initializes the specified MCP servers in parallel, converts the tools they expose into LangChain-compatible tools, and returns both the tools and a cleanup function for shutting the servers down when you are finished.
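To make the shapes involved concrete, here is a minimal sketch. The type names below (McpServerConfig, ConvertResult) are illustrative assumptions for documentation purposes, not the library's actual exported types:

```typescript
// Illustrative sketch only: McpServerConfig and ConvertResult are
// assumed names, not the library's actual type definitions.

// Each entry describes how to launch (or reach) one MCP server.
interface McpServerConfig {
  command?: string;  // executable for a local server
  args?: string[];   // arguments passed to the command
  url?: string;      // alternatively, a remote SSE/WebSocket endpoint
  cwd?: string;      // optional working directory for local servers
}

// The converter takes a map of named servers and yields LangChain-style
// tools plus a cleanup callback that shuts the servers down.
type ConvertResult = {
  tools: unknown[];
  cleanup: () => Promise<void>;
};

const exampleConfig: Record<string, McpServerConfig> = {
  filesystem: {
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
  },
};
```

The keys of the config object ("filesystem" above) are arbitrary names you choose; each value tells the utility how to start or reach that server.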
This utility supports a wide range of MCP clients, making it highly versatile for development and experimentation.
The architecture leverages the Model Context Protocol (MCP) to ensure seamless integration between AI applications such as Claude Desktop, Continue, Cursor, and server-side components. The protocol flow diagram illustrates the interaction:
graph TD
A[AI Application] -->|MCP Client| B[MCP Protocol]
B --> C[MCP Server]
C --> D[Data Source/Tool]
style A fill:#e1f5fe
style C fill:#f3e5f5
style D fill:#e8f5e8
The compatibility matrix shows which MCP clients are supported:
MCP Client | Resources | Tools | Prompts |
---|---|---|---|
Claude Desktop | ✅ | ✅ | ✅ |
Continue | ✅ | ✅ | ✅ |
Cursor | ❌ | ✅ | ❌ |
To use this utility in your project:
npm i @h1deya/langchain-mcp-tools
Imagine a scenario where your language model needs to read and write files on the local filesystem. Here's how you can set it up with the filesystem MCP server:
import { convertMcpToLangchainTools } from "@h1deya/langchain-mcp-tools";

const mcpServers = {
  filesystem: {
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-filesystem", "."]
  }
};

const { tools, cleanup } = await convertMcpToLangchainTools(mcpServers);
// Use tools with LangChain, then call cleanup() when finished
A more complex scenario involves fetching data from a Notion database and incorporating it into your prompts or responses. This can be done using:
import { convertMcpToLangchainTools } from "@h1deya/langchain-mcp-tools";

const mcpServers = {
  notion: {
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-notion"]
  }
};

const { tools, cleanup } = await convertMcpToLangchainTools(mcpServers);
// Use tools with LangChain, then call cleanup() when finished
As seen in the compatibility matrix, this utility works with the same MCP servers used by clients such as Claude Desktop. For instance, initializing multiple servers at once is as simple as:
import { convertMcpToLangchainTools } from "@h1deya/langchain-mcp-tools";

const mcpServers = {
  filesystem: {
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-filesystem", "."]
  },
  notion: {
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-notion"]
  }
};

const { tools, cleanup } = await convertMcpToLangchainTools(mcpServers);
This utility is designed to work with a wide range of MCP servers; client-side support is summarized in the compatibility matrix above.
For advanced users, the utility supports various configurations:
Remote Server Support:
"sse-server-name": {
  url: `http://${sse_server_host}:${sse_server_port}/...`
},
"ws-server-name": {
  url: `ws://${ws_server_host}:${ws_server_port}/...`
}
Working Directory Configuration:
"local-server-name": {
  command: "...",
  args: [...],
  cwd: "/working/directory"  // the working directory to be used by the server
}
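The effect of the cwd option can be demonstrated with Node's standard library alone; in this sketch a spawned node process stands in for a local MCP server:

```typescript
// Demonstrates what the cwd option does, using Node's child_process
// directly; the spawned node process stands in for a local MCP server.
import { spawnSync } from "node:child_process";

// The child reports its working directory: the cwd we pass in,
// not the directory this script was launched from.
const result = spawnSync(
  process.execPath,
  ["-e", "console.log(process.cwd())"],
  { cwd: "/", encoding: "utf8" },
);
const childCwd = result.stdout.trim();
```

This matters for servers like the filesystem server, whose relative paths (such as the "." in the earlier examples) are resolved against the working directory.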
MCP Server stderr Redirection:
import * as fs from "fs";

const logPath = `mcp-server-${serverName}.log`;
const logFd = fs.openSync(logPath, "w");
mcpServers[serverName].stderr = logFd;
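The redirection mechanism itself can be seen end to end with Node's standard library; here a short-lived child process stands in for an MCP server writing diagnostics to stderr:

```typescript
// Self-contained sketch of stderr-to-file redirection using Node's
// standard library; the child process stands in for an MCP server.
import * as fs from "node:fs";
import { spawnSync } from "node:child_process";

const serverName = "demo";
const logPath = `mcp-server-${serverName}.log`;
const logFd = fs.openSync(logPath, "w");

// Anything the child writes to stderr lands in the log file,
// keeping the parent process's console clean.
spawnSync(process.execPath, ["-e", "console.error('server diagnostics')"], {
  stdio: ["ignore", "ignore", logFd],
});
fs.closeSync(logFd);

const logged = fs.readFileSync(logPath, "utf8");
```

Passing a file descriptor as the third `stdio` slot is standard Node behavior, which is why a descriptor from fs.openSync can be assigned to the server's stderr.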
Q: Which MCP clients are supported? A: Supported MCP clients include Claude Desktop, Continue, and Cursor (tools only); see the compatibility matrix above.
Q: Can this utility be used with servers hosted remotely? A: Yes, it supports remote servers via SSE or WebSocket.
Q: How can I configure the working directory for local servers? A: Specify the cwd field in your configuration object to set a custom working directory.
Q: What types of data can be integrated using this utility? A: The utility currently supports text-based results from tool calls and is expanding to include more data types.
Q: Are there any performance considerations while initializing multiple servers? A: Initializing multiple servers in parallel ensures faster setup, but resource usage should be monitored to avoid bottlenecks.
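The parallel-initialization idea behind that answer can be sketched with plain promises; initServer below is a hypothetical stand-in for a server handshake, not part of the library's API:

```typescript
// Sketch of parallel initialization with stand-in async tasks;
// initServer is a hypothetical placeholder, not a library function.
async function initServer(name: string): Promise<string> {
  // Simulate an MCP server handshake taking some time.
  await new Promise((resolve) => setTimeout(resolve, 50));
  return `${name}: ready`;
}

const serverNames = ["filesystem", "notion"];

// Promise.all starts every initialization at once, so total startup
// time approaches the slowest single server rather than the sum.
const results = await Promise.all(serverNames.map(initServer));
```

With sequential awaits the startup cost would be the sum of all handshakes; with Promise.all it is roughly the maximum, at the price of all servers consuming resources simultaneously.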
Contributions are welcome! To get started:
npm install
Explore more about MCP on its official site, or visit the MCP.so platform to discover additional servers and clients.
This comprehensive guide positions the utility not only as a tool but also as an essential part of building robust and versatile AI applications through MCP.