# ruby_llm-mcp

MCP made simple for RubyLLM. `ruby_llm-mcp` connects Ruby applications to Model Context Protocol (MCP) servers and integrates them directly with RubyLLM.
## Simple Configuration
```ruby
require 'ruby_llm/mcp'

RubyLLM.configure do |config|
  config.openai_api_key = ENV.fetch('OPENAI_API_KEY')
end

RubyLLM::MCP.configure do |config|
  config.request_timeout = 8_000
end

client = RubyLLM::MCP.client(
  name: 'filesystem',
  transport_type: :stdio,
  config: {
    command: 'npx',
    args: ['@modelcontextprotocol/server-filesystem', Dir.pwd]
  }
)
```
## Core Use Cases
```ruby
# Use MCP tools in a chat
chat = RubyLLM.chat(model: 'gpt-4o-mini')
chat.with_tools(*client.tools)

puts chat.ask('List the Ruby files in this project and summarize what you find.')
```
```ruby
# Add a server resource to chat context
resource = client.resource('project_readme')
chat = RubyLLM.chat(model: 'gpt-4o-mini')
chat.with_resource(resource)

puts chat.ask('Summarize this project in 5 bullets.')
```
```ruby
# Execute a predefined MCP prompt with arguments
prompt = client.prompt('code_review')
chat = RubyLLM.chat(model: 'gpt-4o-mini')

response = chat.ask_prompt(prompt, arguments: {
  language: 'ruby',
  focus: 'security'
})
puts response
```
```ruby
# Authenticate to a protected MCP server with browser OAuth
client = RubyLLM::MCP.client(
  name: 'oauth-server',
  transport_type: :streamable,
  start: false,
  config: {
    url: 'https://mcp.example.com/mcp',
    oauth: { scope: 'mcp:read mcp:write' }
  }
)

client.oauth(type: :browser).authenticate
client.start
```
```ruby
# Poll a long-running MCP task and fetch its final result
task = client.task_get('task-123')

until task.completed? || task.failed? || task.cancelled?
  sleep((task.poll_interval || 250) / 1000.0)
  task = task.refresh
end

if task.completed?
  payload = client.task_result(task.task_id)
  puts payload.dig('content', 0, 'text')
else
  puts "Task ended with status: #{task.status}"
end
```
## Support At A Glance
- Native MCP client implementation (`:ruby_llm`) with full protocol support through 2025-11-25
- Official MCP SDK adapter support (`:mcp_sdk`) via the `mcp` gem for teams that prefer SDK-backed integration
- OAuth implementation for authenticated streamable HTTP MCP servers
- Transports: `stdio`, `sse`, `streamable`/`streamable_http`
- Core server features: tools, resources, resource templates, prompts, notifications
- Advanced client features: sampling, roots, progress tracking, human-in-the-loop, elicitation
- Task lifecycle APIs (`tasks/list`, `tasks/get`, `tasks/result`, `tasks/cancel`) are experimental
MCP task support is experimental and subject to change in both the MCP spec and this gem’s implementation.
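The stdio example above targets local servers. For the HTTP-based transports in this list, the client takes a `url` instead of a `command`. A minimal sketch, assuming the same `RubyLLM::MCP.client` interface and a hypothetical endpoint:

```ruby
# Hypothetical SSE endpoint; swap in your server's URL.
sse_client = RubyLLM::MCP.client(
  name: 'remote-server',
  transport_type: :sse,
  config: {
    url: 'https://mcp.example.com/mcp/sse'
  }
)
```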
## Install
Add to your Gemfile:
```ruby
gem 'ruby_llm-mcp'
```
Optional, for the `:mcp_sdk` adapter:

```ruby
gem 'mcp', '~> 0.7'
```
Then run:

```shell
bundle install
```
## Setup
- Set your RubyLLM provider credentials (for example, `OPENAI_API_KEY`).
- Start or access an MCP server.
- Create a `RubyLLM::MCP.client` and attach its tools, resources, and prompts to chat flows.
## Documentation
- Getting Started - Get up and running quickly
- Configuration - Configure clients and transports
- Adapters & Transports - Choose adapters and configure transports
- Server: Tools - Execute server-side operations
- Server: Resources - Include data in conversations
- Server: Prompts - Use predefined prompts with arguments
- Server: Tasks - Poll and manage long-running background work (experimental)
- Server: Notifications - Handle real-time updates
- Client: Sampling - Allow servers to use your LLM
- Client: Roots - Provide filesystem access to servers
- Client: Elicitation - Handle user input during conversations
## Contributing
Bug reports and pull requests are welcome on GitHub.
## License
Released under the MIT License.