RubyLLM::MCP

RubyLLM::MCP gives you a direct way to use Model Context Protocol (MCP) servers from RubyLLM.

Highlights: tools, resources, prompts, MCP OAuth 2.1 authentication, notification and response handlers, Rails OAuth setup, and browser-based OAuth for CLI tools.

Why RubyLLM::MCP?

MCP integration in Ruby apps should be easy to reason about.

RubyLLM::MCP focuses on:

  • Ruby-first APIs for using MCP tools, resources, and prompts in RubyLLM chat workflows
  • Stable protocol defaults (2025-06-18) with explicit draft opt-in (2026-01-26)
  • Built-in notification and response handlers for real-time and interactive workflows
  • MCP OAuth 2.1 authentication support (PKCE, dynamic registration, discovery, and automatic token refresh)
  • OAuth setup paths for Rails apps (per-user connections) and browser-based CLI flows
  • Straightforward integration for Ruby apps, background jobs, and Rails projects

Show me the code

# Basic setup
require "ruby_llm/mcp"

RubyLLM.configure do |config|
  config.openai_api_key = ENV.fetch("OPENAI_API_KEY")
end

client = RubyLLM::MCP.client(
  name: "filesystem",
  transport_type: :stdio,
  config: {
    command: "bunx",
    args: ["@modelcontextprotocol/server-filesystem", Dir.pwd]
  }
)

chat = RubyLLM.chat(model: "gpt-4.1-mini")
chat.with_tools(*client.tools)
puts chat.ask("Find test files with pending TODOs")
# Resources (simple)
resource = client.resource("release_notes")
chat = RubyLLM.chat(model: "gpt-4.1-mini")
chat.with_resource(resource)

puts chat.ask("Summarize release notes for the team in 5 bullet points.")

# More complex: use client.resource_template(...) with chat.with_resource_template(...)

# Prompts (simple)
prompt = client.prompt("code_review")
chat = RubyLLM.chat(model: "gpt-4.1-mini")

response = chat.ask_prompt(
  prompt,
  arguments: {
    language: "ruby",
    focus: "security"
  }
)

puts response

# More complex: combine prompts + resources/templates in the same chat workflow

# Handlers (response + notifications)
client.on_progress do |progress|
  puts "Progress: #{progress.progress}% - #{progress.message}"
end

client.on_logging do |logging|
  puts "[#{logging.level}] #{logging.message}"
end

chat = RubyLLM.chat(model: "gpt-4.1-mini")
chat.with_tools(*client.tools)

chat.ask("Run a repository scan and summarize risks.") do |chunk|
  print chunk.content
end

# OAuth setup (Rails and CLI)
# Rails: per-user OAuth client (after running rails generate ruby_llm:mcp:oauth:install User)
client = current_user.mcp_client
chat = RubyLLM.chat(model: "gpt-4.1-mini")
chat.with_tools(*client.tools)
puts chat.ask("What changed in my connected repos this week?")

# CLI: browser-based OAuth flow
cli_client = RubyLLM::MCP.client(
  name: "oauth-server",
  transport_type: :streamable,
  start: false,
  config: {
    url: ENV.fetch("MCP_SERVER_URL"),
    oauth: { scope: "mcp:read mcp:write" }
  }
)

cli_client.oauth(type: :browser).authenticate
cli_client.start
puts RubyLLM.chat(model: "gpt-4.1-mini").with_tools(*cli_client.tools).ask("List my open tasks.")
cli_client.stop

Features

  • Tools: Convert MCP tools into RubyLLM-compatible tools
  • Resources: Work with resources and resource templates in chat context
  • Prompts: Execute server prompts with typed arguments
  • Transports: :stdio, :streamable, and :sse
  • Client capabilities: Sampling, roots, progress tracking, and elicitation
  • Handlers: Built-in notification and response handlers for real-time and interactive workflows
  • MCP Authentication: OAuth 2.1 support with PKCE, dynamic registration, discovery, and automatic token refresh
  • OAuth setup paths: Rails per-user OAuth setup and browser-based OAuth for CLI tools
  • Extensions: Global/per-client extension negotiation, including MCP Apps
  • Multi-client support: Manage multiple MCP servers in one workflow (see the sketch after this list)
  • Protocol control: Stable default with explicit draft opt-in
  • Adapters: Native :ruby_llm adapter (full feature set) and optional :mcp_sdk
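
The multi-client and transport bullets above use the same client and chat calls shown in "Show me the code". Here is a minimal sketch that combines a local :stdio server with a remote :sse server in one chat; the server names, command, and URL are placeholders, and the :sse config shape mirrors the :streamable example above rather than being confirmed here:

# Combine tools from two MCP servers in one chat
files = RubyLLM::MCP.client(
  name: "filesystem",
  transport_type: :stdio,
  config: {
    command: "bunx",
    args: ["@modelcontextprotocol/server-filesystem", Dir.pwd]
  }
)

search = RubyLLM::MCP.client(
  name: "search",
  transport_type: :sse,
  config: { url: ENV.fetch("SEARCH_MCP_URL") }
)

chat = RubyLLM.chat(model: "gpt-4.1-mini")
chat.with_tools(*files.tools, *search.tools)
puts chat.ask("Search the docs and cross-check them against local files.")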

Installation

bundle add ruby_llm-mcp

or add it to your Gemfile:

gem "ruby_llm-mcp"

To use the optional official MCP SDK adapter, also add:

gem "mcp", "~> 0.7"

Rails

rails generate ruby_llm:mcp:install

For per-user OAuth flows:

rails generate ruby_llm:mcp:oauth:install User

OAuth quick example:

client = RubyLLM::MCP.client(
  name: "oauth-server",
  transport_type: :streamable,
  start: false,
  config: {
    url: ENV.fetch("MCP_SERVER_URL"),
    oauth: { scope: "mcp:read mcp:write" }
  }
)

client.oauth(type: :browser).authenticate
client.start

chat = RubyLLM.chat(model: "gpt-4.1-mini")
chat.with_tools(*client.tools)
puts chat.ask("What should I prioritize today?")

client.stop

Use connection blocks in jobs/services/controllers for clean startup and cleanup.
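
As a minimal sketch of that startup-and-cleanup pattern in a background job, using only the start: false, #start, and #stop calls shown above (the job class, server name, and URL are illustrative):

class RepoScanJob < ApplicationJob
  # Build the client without connecting, connect for the duration of the job,
  # and always stop the connection afterwards.
  def perform
    client = RubyLLM::MCP.client(
      name: "reports",
      transport_type: :streamable,
      start: false,
      config: { url: ENV.fetch("MCP_SERVER_URL") }
    )

    client.start
    chat = RubyLLM.chat(model: "gpt-4.1-mini")
    chat.with_tools(*client.tools)
    Rails.logger.info chat.ask("Summarize repository changes since yesterday.")
  ensure
    client&.stop # cleanup runs even if the chat raises
  end
end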

See Rails Integration for end-to-end patterns.

Documentation

Contributing

Contributions are welcome on GitHub.

License

Released under the MIT License.