Overview
RubyLLM MCP connects RubyLLM chats to Model Context Protocol (MCP) servers so your application can use external tools, resources, prompts, and notifications through a consistent API.
Table of contents
- What RubyLLM MCP Adds
- Simplicity with RubyLLM
- High-Level Architecture
- Core Interaction Model
- Response Patterns
- Native vs MCP SDK Adapter
- Minimal Sync Flow
- Minimal Task Flow
- Next Steps
What RubyLLM MCP Adds
- Connect to MCP servers over stdio, streamable/http, and sse transports
- Map MCP tools into RubyLLM-compatible tools
- Read MCP resources and resource templates into chat context
- Retrieve and execute MCP prompts with arguments
- Handle real-time server notifications and progress events
Simplicity with RubyLLM
RubyLLM MCP is designed so the basic integration path stays short:
require "ruby_llm/mcp"
client = RubyLLM::MCP.client(
name: "filesystem",
transport_type: :stdio,
config: {
command: "npx",
args: ["@modelcontextprotocol/server-filesystem", "."]
}
)
chat = RubyLLM.chat(model: "gpt-4.1")
chat.with_tools(*client.tools)
puts chat.ask("List the top-level files and summarize this project")
The same client API can then be extended with resources, prompts, notifications, and tasks.
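For example, the same client can feed resources and prompts into the chat. This is a hedged sketch: the resource and prompt names are hypothetical, and the helper methods (resource, with_resource, prompt, with_prompt) are assumptions to confirm against the Server guide.

# Hedged sketch: the names and helpers below are assumptions; see the Server guide.
readme = client.resource("project_readme")     # hypothetical resource name
chat.with_resource(readme)                     # attach resource content to the chat
greeting = client.prompt("daily_greeting")     # hypothetical prompt name
chat.with_prompt(greeting, arguments: { name: "Alice" })
puts chat.ask("Summarize the README and greet me")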
High-Level Architecture
- RubyLLM Chat - Your application chat/session logic
- RubyLLM MCP Client - Connection and protocol wrapper
- Adapter - :ruby_llm (full) or :mcp_sdk (core/passive extensions)
- Transport - stdio, streamable, or sse
- MCP Server - External capability provider
Core Interaction Model
RubyLLM MCP is split into server and client capability surfaces:
- Server: tools, resources, prompts, notifications
- Client: sampling, roots, elicitation
- Extensions: optional capability negotiation (including MCP Apps/UI)
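As an illustration of the client side, these capabilities are typically enabled through global configuration rather than per request. The option names below (roots, sampling) are assumptions modeled on the capability list above; the Client guide has the authoritative settings.

# Hedged sketch: option names are assumptions; confirm them in the Client guide.
RubyLLM::MCP.configure do |config|
  config.roots = ["."]                          # directories servers may treat as roots
  config.sampling.enabled = true                # let servers request completions
  config.sampling.preferred_model = "gpt-4.1"
end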
Response Patterns
RubyLLM MCP commonly uses three response patterns:
- Synchronous response: Immediate return value from a request, such as tool.execute.
- Notification-driven async updates: Server-sent updates during execution (logging, progress, resource updates).
- Task lifecycle response: Pollable background work via tasks/get, tasks/result, and optional cancellation.
Use synchronous responses for short operations, notifications for real-time status updates, and tasks for long-running workflows.
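To make the first two patterns concrete, here is a hedged sketch of a synchronous tool call next to a logging notification handler. The tool name is hypothetical, and execute_tool and on_logging are assumptions to verify against the Server and Notifications guides.

# Hedged sketch: the tool name, execute_tool, and on_logging are assumptions.
# Notification-driven updates: react to server log messages as they arrive.
client.on_logging do |logging|
  puts "[#{logging.level}] #{logging.message}"
end

# Synchronous response: the call returns once the tool finishes.
result = client.execute_tool(
  name: "list_directory",                       # hypothetical tool name
  parameters: { path: "." }
)
puts result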
Native vs MCP SDK Adapter
RubyLLM MCP supports two adapters:
- Native (:ruby_llm) - Full-featured implementation with advanced MCP capabilities (sampling, roots, notifications, progress, tasks, elicitation).
- MCP SDK (:mcp_sdk) - Official SDK-backed adapter focused on core surfaces (tools, resources, prompts, templates, logging).
Choose :ruby_llm when you need full protocol coverage and advanced interactions. Choose :mcp_sdk when you only need the core MCP surfaces and want alignment with the official SDK.
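Adapter selection itself is covered in Adapters & Transports; the sketch below assumes it is a keyword on the client constructor, which you should confirm there.

# Hedged sketch: the adapter: keyword is an assumption; confirm the exact
# option in Adapters & Transports.
client = RubyLLM::MCP.client(
  name: "filesystem",
  adapter: :mcp_sdk,                            # or :ruby_llm for the full native adapter
  transport_type: :stdio,
  config: {
    command: "npx",
    args: ["@modelcontextprotocol/server-filesystem", "."]
  }
)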
For the full feature matrix, see Adapters & Transports.
Minimal Sync Flow
require "ruby_llm/mcp"
client = RubyLLM::MCP.client(
name: "filesystem",
transport_type: :stdio,
config: {
command: "npx",
args: ["@modelcontextprotocol/server-filesystem", "."]
}
)
chat = RubyLLM.chat(model: "gpt-4")
chat.with_tools(*client.tools)
response = chat.ask("List files in this project")
puts response
Minimal Task Flow
# Fetch the current state of a known task (the ID here is illustrative)
task = client.task_get("task-123")

# Poll until the task reaches a terminal state
until task.completed? || task.failed? || task.cancelled?
  sleep((task.poll_interval || 250) / 1000.0)   # poll_interval is in milliseconds
  task = task.refresh
end

puts(task.completed? ? task.result : task.status_message)
Next Steps
- Getting Started - Build your first integration
- Configuration - Configure adapters, transports, and behavior
- Adapters & Transports - Choose native vs MCP SDK
- Notifications - Handle async server updates
- Tasks - Manage long-running operations
- Server - Use tools/resources/prompts
- Client - Enable client-side capabilities