No complex protocols, no integration headaches, no compatibility issues – just beautiful, expressive Ruby code.
AI models are powerful, but they need to interact with your applications to be truly useful. Traditional approaches mean wrestling with:
- 🔄 Complex communication protocols and custom JSON formats
 - 🔌 Integration challenges with different model providers
 - 🧩 Compatibility issues between your app and AI tools
 - 🧠 Managing the state between AI interactions and your data
 
Fast MCP solves all these problems by providing a clean, Ruby-focused implementation of the Model Context Protocol, making AI integration a joy, not a chore.
- 🛠️ Tools API - Let AI models call your Ruby functions securely, with in-depth argument validation through Dry-Schema.
 - 📚 Resources API - Share data between your app and AI models
 - 🔄 Multiple Transports - Choose from STDIO, HTTP, or SSE based on your needs
 - 🧩 Framework Integration - Works seamlessly with Rails, Sinatra or any Rack app.
 - 🔒 Authentication Support - Secure your AI-powered endpoints with ease
 - 🚀 Real-time Updates - Subscribe to changes for interactive applications
 
```ruby
# Define tools for AI models to use
server = FastMcp::Server.new(name: 'recipe-ai', version: '1.0.0')
# Define a tool by inheriting from FastMcp::Tool
class CreateUserTool < FastMcp::Tool
  description "Find recipes based on ingredients"
  
  # These arguments generate the JSON schema presented to the MCP client,
  # and they are validated at runtime.
  # The validation is based on Dry-Schema, with the addition of the description.
  arguments do
    required(:first_name).filled(:string).description("First name of the user")
    optional(:age).filled(:integer).description("Age of the user")
    required(:address).hash do
      optional(:street).filled(:string)
      optional(:city).filled(:string)
      optional(:zipcode).filled(:string)
    end
  end
  
  def call(first_name:, age: nil, address: {})
    User.create!(first_name:, age:, address:)
  end
end
# Register the tool with the server
server.register_tool(CreateUserTool)
# Share data resources with AI models by inheriting from FastMcp::Resource
class PopularUsers < FastMcp::Resource
  uri "file://popular_users.json"
  resource_name "Popular Users"
  mime_type "application/json"
  
  def content
    JSON.generate(User.popular.limit(5).as_json)
  end
end
# Register the resource with the server
server.register_resource(PopularUsers)
# Accessing the resource through the server
server.read_resource(PopularUsers.uri)
# Notify clients that the resource content has been updated
server.notify_resource_updated(PopularUsers.uri)
```

To integrate Fast MCP with a Rails application, add the gem and run the install generator:

```bash
bundle add fast-mcp
bin/rails generate fast_mcp:install
```

This will add a configurable fast_mcp.rb initializer:

```ruby
require 'fast_mcp'
FastMcp.mount_in_rails(
  Rails.application,
  name: Rails.application.class.module_parent_name.underscore.dasherize,
  version: '1.0.0',
  path: '/mcp' # This is the default path
  # authenticate: true,       # Uncomment to enable authentication
  # auth_token: 'your-token', # Required if authenticate: true
) do |server|
  Rails.application.config.after_initialize do
    # FastMcp will automatically discover and register:
    # - All classes that inherit from ApplicationTool
    # - All classes that inherit from ApplicationResource
    server.register_tools(*ApplicationTool.descendants)
    server.register_resources(*ApplicationResource.descendants)
    # alternatively, you can register tools and resources manually:
    # server.register_tool(MyTool)
    # server.register_resource(MyResource)
  end
end
```

The install script will also:

- add the app/resources folder
- add the app/tools folder
- add app/tools/sample_tool.rb
- add app/resources/sample_resource.rb
- add an ApplicationTool base class for your tools to inherit from (see the sketch below)
- add an ApplicationResource base class for your resources to inherit from
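
For instance, a tool dropped into app/tools only needs to inherit from ApplicationTool to be picked up by the auto-registration in the initializer above. The sketch below is hypothetical: the tool name, the Article model, and its body attribute are assumptions for illustration, while the description/arguments/call DSL follows the examples in this README.

```ruby
# app/tools/summarize_article_tool.rb (hypothetical example)
# Inherits from the generated ApplicationTool base class, so it is
# auto-registered via ApplicationTool.descendants in the initializer.
class SummarizeArticleTool < ApplicationTool
  description "Summarize an article stored in the database"

  arguments do
    required(:article_id).filled(:integer).description("ID of the article to summarize")
    optional(:max_length).filled(:integer).description("Maximum length of the summary")
  end

  def call(article_id:, max_length: 100)
    article = Article.find(article_id)  # assumes an Article model exists
    article.body.truncate(max_length)   # naive summary: truncate the body
  end
end
```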
 
For Sinatra, check out the dedicated Sinatra integration docs.

You can also run Fast MCP as a standalone server with its own tools and resources:

```ruby
require 'fast_mcp'
# Create an MCP server
server = FastMcp::Server.new(name: 'my-ai-server', version: '1.0.0')
# Define a tool by inheriting from FastMcp::Tool
class SummarizeTool < FastMcp::Tool
  description "Summarize a given text"
  
  arguments do
    required(:text).filled(:string).description("Text to summarize")
    optional(:max_length).filled(:integer).description("Maximum length of summary")
  end
  
  def call(text:, max_length: 100)
    # Your summarization logic here
    text.split('.').first(3).join('.') + '...'
  end
end
# Register the tool with the server
server.register_tool(SummarizeTool)
# Create a resource by inheriting from FastMcp::Resource
class StatisticsResource < FastMcp::Resource
  uri "data/statistics"
  resource_name "Usage Statistics"
  description "Current system statistics"
  mime_type "application/json"
  
  def content
    JSON.generate({
      users_online: 120,
      queries_per_minute: 250,
      popular_topics: ["Ruby", "AI", "WebDev"]
    })
  end
end
# Register the resource with the server
server.register_resource(StatisticsResource)
# Start the server
server.start
```

The MCP project has developed a very useful inspector that you can use to validate your implementation. I suggest using the examples provided with this project as an easy boilerplate: clone this project, then give it a go!

```bash
npx @modelcontextprotocol/inspector examples/server_with_stdio_transport.rb
```

Or to test with an SSE transport using a Rack middleware:

```bash
npx @modelcontextprotocol/inspector examples/rack_middleware.rb
```

Or to test over SSE with an authenticated Rack middleware:

```bash
npx @modelcontextprotocol/inspector examples/authenticated_rack_middleware.rb
```

You can test your custom implementation with the official MCP inspector by using:

```bash
# Test with a stdio transport:
npx @modelcontextprotocol/inspector path/to/your_ruby_file.rb
# Test with an HTTP / SSE server. In the UI select SSE and input your address.
npx @modelcontextprotocol/inspector
```

To use Fast MCP in Sinatra or any other Rack application, mount the Rack middleware in your app:

```ruby
# app.rb
require 'sinatra'
require 'fast_mcp'
use FastMcp::RackMiddleware.new(name: 'my-ai-server', version: '1.0.0') do |server|
  # Register tools and resources here
  server.register_tool(SummarizeTool)
end
get '/' do
  'Hello World!'
end
```

Add your server to your Claude Desktop configuration at:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`

```json
{
  "mcpServers": {
    "my-great-server": {
      "command": "ruby",
      "args": [
        "/Users/path/to/your/awesome/fast-mcp/server.rb"
      ]
    }
  }
}
```

Please refer to the configuring_mcp_clients guide for more details.

| Feature | Status | Description |
|---|---|---|
| JSON-RPC 2.0 | ✅ | Full implementation for communication |
| Tool Definition & Calling | ✅ | Define and call tools with rich argument types |
| Resource Management | ✅ | Create, read, update, and subscribe to resources |
| Transport Options | ✅ | STDIO, HTTP, and SSE for flexible integration |
| Framework Integration | ✅ | Rails, Sinatra, Hanami, and any Rack-compatible framework |
| Authentication | ✅ | Secure your AI endpoints with token authentication (see the example below) |
| Schema Support | ✅ | Full JSON Schema for tool arguments with validation |
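
As an example of the token authentication listed above, it can be enabled through the same mount_in_rails options shown commented out in the Rails initializer earlier. This is a sketch, not the only way to do it; the credentials key used for the token is an assumption, and any string works as auth_token.

```ruby
# config/initializers/fast_mcp.rb (excerpt) - enabling token authentication.
# The credentials key :fast_mcp/:auth_token is illustrative only.
FastMcp.mount_in_rails(
  Rails.application,
  name: Rails.application.class.module_parent_name.underscore.dasherize,
  version: '1.0.0',
  path: '/mcp',
  authenticate: true,                                                   # require a token on /mcp
  auth_token: Rails.application.credentials.dig(:fast_mcp, :auth_token) # token clients must present
) do |server|
  Rails.application.config.after_initialize do
    server.register_tools(*ApplicationTool.descendants)
    server.register_resources(*ApplicationResource.descendants)
  end
end
```

Clients connecting to the endpoint must then supply the matching token.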
- 🤖 AI-powered Applications: Connect LLMs to your Ruby app's functionality
 - 📊 Real-time Dashboards: Build dashboards with live AI-generated insights
 - 🔗 Microservice Communication: Use MCP as a clean protocol between services
 - 📚 Interactive Documentation: Create AI-enhanced API documentation
 - 💬 Chatbots and Assistants: Build AI assistants with access to your app's data
 
- 🚀 Getting Started Guide
 - 🧩 Integration Guide
 - 🛤️ Rails Integration
 - 🌐 Sinatra Integration
 - 📚 Resources
 - 🛠️ Tools
 
Check out the examples directory for more detailed examples:

- 🔨 Basic Examples
- 🌐 Web Integration

- Ruby 3.2+
 
We welcome contributions to Fast MCP! Here's how you can help:
- Fork the repository
- Create your feature branch (`git checkout -b my-new-feature`)
- Commit your changes (`git commit -am 'Add some feature'`)
- Push to the branch (`git push origin my-new-feature`)
- Create a new Pull Request
 
Please read our Contributing Guide for more details.
This project is available as open source under the terms of the MIT License.
- The Model Context Protocol team for creating the specification
 - The Dry-Schema team for the argument validation.
 - All contributors to this project