A Ruby client for the Model Context Protocol (MCP) that seamlessly integrates with RubyLLM. This gem enables Ruby applications to connect to MCP servers and use their tools, resources, and prompts as part of LLM conversations.
Currently supports MCP protocol versions up to 2025-06-18.
## Key Features
- 🔌 Multiple Transport Types: Streamable HTTP, STDIO, and legacy SSE transports
- 🛠️ Tool Integration: Automatically converts MCP tools into RubyLLM-compatible tools
- 📄 Resource Management: Access and include MCP resources (files, data) and resource templates in conversations
- 🎯 Prompt Integration: Use predefined MCP prompts with arguments for consistent interactions
- 🎛️ Client Features: Support for sampling and roots
- 🎨 Enhanced Chat Interface: Extended RubyLLM chat methods for seamless MCP integration
- 🔄 Multiple Client Management: Create and manage multiple MCP clients simultaneously
- 📚 Simple API: Easy-to-use interface that integrates seamlessly with RubyLLM
- 🚀 Rails Integration: Built-in Rails support with generators and configuration
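For example, tools from several clients can be combined in a single chat. This sketch uses only the `RubyLLM::MCP.client`, `client.tools`, and `chat.with_tools` calls shown elsewhere in this README; the server names, commands, and URL are placeholders:

```ruby
require 'ruby_llm/mcp'

# Connect to two independent MCP servers (placeholder configs)
files = RubyLLM::MCP.client(
  name: "filesystem",
  transport_type: :stdio,
  config: { command: "bunx", args: ["@modelcontextprotocol/server-filesystem", "."] }
)
search = RubyLLM::MCP.client(
  name: "search",
  transport_type: :streamable,
  config: { url: "https://example.com/mcp" }
)

# Expose every tool from both clients to one chat
chat = RubyLLM.chat(model: "gpt-4")
chat.with_tools(*files.tools, *search.tools)
```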
## Installation

```bash
bundle add ruby_llm-mcp
```

Or add to your Gemfile:

```ruby
gem 'ruby_llm-mcp'
```
## Quick Start

```ruby
require 'ruby_llm/mcp'

# Configure RubyLLM
RubyLLM.configure do |config|
  config.openai_api_key = "your-api-key"
end

# Connect to an MCP server
client = RubyLLM::MCP.client(
  name: "filesystem",
  transport_type: :stdio,
  config: {
    command: "bunx",
    args: [
      "@modelcontextprotocol/server-filesystem",
      File.expand_path("..", __dir__)
    ]
  }
)

# Use MCP tools in a chat
chat = RubyLLM.chat(model: "gpt-4")
chat.with_tools(*client.tools)

response = chat.ask("Can you help me search for files in my project?")
puts response
```
## Transport Types

### STDIO Transport

Best for local MCP servers or command-line tools:

```ruby
client = RubyLLM::MCP.client(
  name: "local-server",
  transport_type: :stdio,
  config: {
    command: "python",
    args: ["-m", "my_mcp_server"],
    env: { "DEBUG" => "1" }
  }
)
```
### Streamable HTTP Transport

Best for HTTP-based MCP servers that support streaming:

```ruby
client = RubyLLM::MCP.client(
  name: "streaming-server",
  transport_type: :streamable,
  config: {
    url: "https://your-mcp-server.com/mcp",
    headers: { "Authorization" => "Bearer your-token" }
  }
)
```
### SSE Transport

Best for web-based MCP servers using the legacy SSE protocol:

```ruby
client = RubyLLM::MCP.client(
  name: "web-server",
  transport_type: :sse,
  config: {
    url: "https://your-mcp-server.com/mcp/sse",
    headers: { "Authorization" => "Bearer your-token" }
  }
)
```
## Core Concepts

### Tools

MCP tools are automatically converted into RubyLLM-compatible tools, enabling LLMs to execute server-side operations.
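Beyond handing tools to a chat with `chat.with_tools(*client.tools)`, a client's tools can also be inspected or invoked directly. The tool name, lookup helper, and `execute` signature below are illustrative assumptions, not guaranteed API:

```ruby
# List the tools the server advertises
client.tools.each { |tool| puts "#{tool.name}: #{tool.description}" }

# Hypothetical direct invocation of a single tool by name
tool = client.tool("search_files")       # assumed lookup helper
result = tool.execute(pattern: "*.rb")   # assumed execute signature
puts result
```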
### Resources

Static or dynamic data that can be included in conversations, from files to API responses.
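A sketch of pulling a resource into a conversation; `client.resource` and `chat.with_resource` are assumed method names here, shown only to illustrate the flow:

```ruby
# Look up a resource exposed by the server (assumed API)
readme = client.resource("project_readme")

# Attach it so the model can read it alongside the question (assumed API)
chat = RubyLLM.chat(model: "gpt-4")
chat.with_resource(readme)
chat.ask("Summarize this project")
```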
### Prompts

Pre-defined prompts with arguments for consistent interactions across your application.
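A hedged sketch of using a server-defined prompt; the prompt name, `client.prompt` lookup, and `ask_prompt` call are assumptions about the API, not confirmed signatures:

```ruby
# Fetch a predefined prompt from the server (assumed API)
prompt = client.prompt("daily_summary")

# Ask the chat with the prompt's arguments filled in (assumed API)
chat = RubyLLM.chat(model: "gpt-4")
chat.ask_prompt(prompt, arguments: { date: "2025-06-18" })
```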
### Notifications

Real-time updates from MCP servers including logging, progress, and resource changes.
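Notification handling might look like the following; the `on_logging` callback name and the fields on the notification object are assumptions used for illustration:

```ruby
# Subscribe to server-side log messages (assumed callback API)
client.on_logging do |notification|
  puts "[#{notification.level}] #{notification.message}"
end
```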
## Getting Started
- Getting Started - Get up and running quickly
- Configuration - Configure clients and transports
- Rails Integration - Use with Rails applications
- Transports - Build custom transport implementations
## Server Interactions
- Working with Tools - Execute server-side operations
- Using Resources - Include data in conversations
- Prompts - Use predefined prompts with arguments
- Notifications - Handle real-time updates
## Client Interactions
- Sampling - Allow servers to use your LLM
- Roots - Provide filesystem access to servers
- Elicitation - Handle user input during conversations
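Enabling a client feature such as sampling might be a matter of global configuration; the `sampling` config keys below are assumed names, sketched only to show the shape of the feature:

```ruby
# Allow connected MCP servers to run completions through your LLM
# (config key names are assumptions, not confirmed API)
RubyLLM::MCP.configure do |config|
  config.sampling.enabled = true
  config.sampling.preferred_model = "gpt-4"
end
```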
## Examples

Complete examples are available in the `examples/` directory:
- Local MCP Server: Complete stdio transport example
- SSE with GPT: Server-sent events with OpenAI
- Resource Management: List and use resources
- Prompt Integration: Use prompts with streamable transport
## Contributing

Bug reports and pull requests are welcome on GitHub.
## License

Released under the MIT License.