Implement server-side tools and resources for LLM integration with VectorMCP in Ruby
VectorMCP provides server-side tools for implementing the Model Context Protocol (MCP) in Ruby applications. MCP is a specification that enables Large Language Models (LLMs) to discover and interact with external tools, resources, and prompt templates provided by separate applications—known as MCP Servers. This library allows developers to easily create custom MCP servers, exposing their application's capabilities such as functions, data sources, or predefined prompt templates to compatible LLM clients like Claude Desktop App.
VectorMCP implements the core server-side aspects of the MCP specification. It enables you to define and register custom tools (functions) that LLMs can invoke. You can also expose data sources (files, database results, API outputs) for the LLM to read or query. Additionally, VectorMCP supports structured prompt templates that LLMs can use during their workflows.
With VectorMCP, you can create and register custom tools. These tools are functions your server exposes to clients. For example, you might define a tool called `calculate_sum` that adds two numbers together. The tool's input schema describes its expected parameters, and its return value is automatically formatted according to MCP standards.
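A hedged sketch of such a tool follows. The registration call mirrors the echo example shown later in this guide; the `require` is guarded so the handler itself can be exercised even without the gem installed.

```ruby
# The handler is plain Ruby: it receives the parsed arguments hash
# (and a session object) and returns the result.
sum_handler = lambda do |args, _session|
  args['a'] + args['b']
end

begin
  require 'vector_mcp'

  server = VectorMCP.new('Calculator')
  server.register_tool(
    name: 'calculate_sum',
    description: 'Adds two numbers together.',
    input_schema: {
      type: 'object',
      properties: { a: { type: 'number' }, b: { type: 'number' } },
      required: %w[a b]
    },
    &sum_handler
  )
rescue LoadError
  # vector_mcp is not installed; the handler above still works standalone.
end
```

Given arguments `{"a": 2, "b": 3}`, the handler returns `5`, which the library formats as an MCP result.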
Resources are data sources that LLMs can read from. You register a resource with VectorMCP by defining a unique URI, providing a description, specifying the MIME type (if applicable), and implementing the underlying logic for retrieving or generating the data. For instance, you could create a resource called `server_status` that provides current server information in JSON format.
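A minimal sketch of that resource, assuming a `register_resource` method with `uri`/`name`/`description`/`mime_type` keywords analogous to the tool-registration pattern in this guide (check the gem's documentation for the exact signature). The handler itself is plain Ruby:

```ruby
require 'json'

# Builds the JSON payload the resource would serve.
status_handler = lambda do |_session = nil|
  {
    status: 'ok',
    ruby_version: RUBY_VERSION,
    pid: Process.pid
  }.to_json
end

begin
  require 'vector_mcp'

  server = VectorMCP.new('Status Server')
  server.register_resource(
    uri: 'app://server_status',
    name: 'server_status',
    description: 'Current server information as JSON.',
    mime_type: 'application/json',
    &status_handler
  )
rescue LoadError
  # vector_mcp is not installed; the handler above still works standalone.
end
```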
Prompts are templates or workflows clients can request. VectorMCP allows you to define prompts with specific arguments and generate structured responses according to MCP standards. These prompts help guide LLMs through complex tasks by providing detailed instructions and parameter requirements.
VectorMCP is built on the Model Context Protocol (MCP), which defines a standardized way for AI applications to interact with external resources using JSON-RPC requests over STDIN/STDOUT or Server-Sent Events (SSE). The protocol ensures interoperability between different LLM clients and your custom MCP servers.
VectorMCP supports two main transport mechanisms:

- **stdio**: JSON-RPC messages exchanged over the server process's standard input and output, suited to clients that launch the server as a subprocess.
- **Server-Sent Events (SSE)**: JSON-RPC over HTTP, with server-to-client messages streamed as SSE, suited to networked deployments.
To get started with VectorMCP, you have two primary installation methods. Add it to your Gemfile:

```ruby
# In your Gemfile
gem 'vector_mcp'
```

Or install it directly with RubyGems:

```shell
gem install vector_mcp
```
Here’s a basic example of how you can create an MCP server and start listening on STDIN/STDOUT:

```ruby
require 'vector_mcp'

# Create a server named "Echo Server"
server = VectorMCP.new('Echo Server')

# Register a tool called `echo`
server.register_tool(
  name: 'echo',
  description: 'Returns whatever message you send.',
  input_schema: {
    type: 'object',
    properties: { message: { type: 'string' } },
    required: ['message']
  }
) do |args, _session|
  args['message'] # This will be returned as the result
end

# Start listening on STDIN/STDOUT
server.run
```
To test your MCP server over stdio, run it and paste JSON-RPC requests, one per line:

```shell
$ ruby my_server.rb
{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"CLI","version":"0.1"}}}
```
Or use a script to send multiple requests:

```shell
{
  printf '%s\n' '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"CLI","version":"0.1"}}}';
  printf '%s\n' '{"jsonrpc":"2.0","method":"initialized"}';
  printf '%s\n' '{"jsonrpc":"2.0","id":2,"method":"tools/list","params":{}}';
  printf '%s\n' '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"echo","arguments":{"message":"Hello VectorMCP!"}}}';
} | ruby my_server.rb | jq
```
Imagine an application that integrates financial data from multiple sources. You can use VectorMCP to create a server that fetches stock prices, news articles related to finance, and other relevant information. The LLM client can then request specific data points or entire datasets for analysis.
With VectorMCP, you could build an AI tool that generates summaries of lengthy documents based on user prompts. For instance, a researcher might input the title of a paper and key topics they want highlighted. The LLM would use this prompt to generate a concise summary suitable for quick reference.
VectorMCP is compatible with various MCP clients, including:

| MCP Client | Resources | Tools | Prompts |
|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ |
| Continue | ✅ | ✅ | ❌ |
| Cursor | ❌ | ✅ | ❌ |
VectorMCP is designed to perform well in a variety of environments. It has been tested with different LLM clients and data sources, ensuring compatibility across various use cases.
A financial analyst uses an MCP server built with VectorMCP as part of their workflow. The application fetches stock prices from multiple sources, sector-relevant news articles, and economic reports. An LLM client can then generate insights based on these integrations.
```ruby
server.register_tool(
  name: 'fetch_stock_prices',
  description: 'Fetch real-time stock quotes.',
  input_schema: {
    type: 'object',
    properties: { symbol: { type: 'string' } },
    required: ['symbol']
  }
) do |args, _session|
  # Look up the quote for args['symbol'] from your data source and
  # return it; the result is formatted according to MCP standards.
end
```
A researcher needs to quickly summarize a long scientific paper. They use VectorMCP to define prompts that guide the LLM through requesting specific sections and key points.
```ruby
server.register_prompt(
  name: 'generate_summary',
  description: 'Summarize long documents based on provided prompts.',
  input_schema: {
    type: 'object',
    properties: { title: { type: 'string' }, topics: { type: 'array', items: { type: 'string' } } },
    required: ['title', 'topics']
  }
)
```
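As a hedged sketch, here is the kind of instruction text a `generate_summary` handler might assemble from those arguments. How VectorMCP expects prompt handlers to structure their return value is gem-specific; this only builds the instruction string:

```ruby
# Turns the prompt arguments into the instruction sent to the LLM.
summary_prompt = lambda do |args|
  "Summarize the paper titled '#{args['title']}', " \
    "highlighting these topics: #{args['topics'].join(', ')}."
end
```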
VectorMCP offers advanced configuration options to customize your server’s behavior. You can set environment variables, adjust logging levels, and ensure secure communication between clients and servers.
For example, you might want to configure the server to use HTTPS instead of plain HTTP:
```ruby
server = VectorMCP.new('Secure Server', env: { SSL_KEY: 'path/to/ssl_key.pem', SSL_CERT: 'path/to/cert_chain.pem' })
```
You can integrate an MCP server with Continue by defining resources and tools. Continue currently supports resources and tools, but not prompts.
**What is the difference between tools and prompts?** Tools are executable functions that return results based on input parameters; prompts are structured templates that guide the LLM toward a specific output based on user-defined criteria.
**Can I restrict access to resources or tools?** Yes, you can implement custom logic to control access based on client identity or authentication status.
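One hedged way to do this inside a tool handler: gate execution on a client-supplied token. The token-in-arguments scheme and the `MCP_API_TOKEN` variable are illustrative assumptions; VectorMCP itself does not mandate any particular auth mechanism.

```ruby
# Expected token comes from the server's environment; the fallback value
# is only for this sketch.
ALLOWED_TOKEN = ENV.fetch('MCP_API_TOKEN', 'change-me')

# Rejects the call unless the client supplied the right token.
guarded_echo = lambda do |args, _session|
  raise 'unauthorized' unless args['token'] == ALLOWED_TOKEN

  args['message']
end
```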
**How are errors handled?** VectorMCP includes mechanisms for handling errors and exceptions. Make sure your handlers rescue the failures they can anticipate so clients receive useful error responses instead of crashes.
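For example, a handler can rescue its own failures and return a readable message. How VectorMCP maps a raised exception to a JSON-RPC error response is gem-specific, so this sketch keeps the failure inside the handler:

```ruby
# Divides two numeric strings; bad input or division by zero yields an
# error string rather than an unhandled exception.
safe_divide = lambda do |args, _session|
  Integer(args['a']) / Integer(args['b'])
rescue ZeroDivisionError, ArgumentError, TypeError => e
  "error: #{e.class}: #{e.message}"
end
```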
**Which transports are supported?** By default, VectorMCP uses JSON-RPC over STDIO streams, and it also supports Server-Sent Events (SSE) with additional configuration.
To contribute to VectorMCP, follow the project’s guidelines for setting up a local development environment, submitting pull requests, and adhering to its coding standards:

```shell
git clone https://github.com/yourusername/vector-mcp.git
bundle install
rake spec
```
The broader Model Context Protocol (MCP) ecosystem includes various tools, frameworks, and services that support integration between AI applications and external resources. Explore the official MCP documentation for more information and examples.
For further assistance, join the community forums or chat channels dedicated to MCP and its implementations.
This comprehensive guide positions VectorMCP as a valuable tool for developers building AI integrations and working with MCP clients like Claude Desktop App. By understanding the core features, architecture, and real-world use cases, you can harness the power of MCP to enhance your application’s capabilities significantly.