RubyLLM 1.14 ships a full chat UI generator. Two commands and you have a working AI chat app with Turbo streaming, model selection, and tool call display, in under two minutes. The demo above shows the whole thing: new Rails app to working chat in 1:46, including trying it out.
Why This Matters
RubyLLM turned one last week. 1.0 shipped on March 11, 2025 with Rails integration from day one: ActiveRecord models, acts_as_chat, Turbo streaming, persistence out of the box. 1.4 added the install generator. 1.7 brought the first scaffold chat UI with Turbo Streams. 1.12 introduced agents with prompt conventions. Each release got closer to the same thing: AI that works the way Rails works.
1.14 fully realizes that goal. A beautiful Tailwind chat UI (with automatic fallback to scaffold if you’re not using Tailwind). Generators for agents and tools. Conventional directories for everything. All of it extracted from Chat with Work, where it’s been running in production for months.
What You Get
Two generators. That’s it.
bin/rails generate ruby_llm:install
bin/rails generate ruby_llm:chat_ui
Your app now has this structure:
app/
├── agents/
├── controllers/
│   ├── chats_controller.rb
│   └── messages_controller.rb
├── helpers/
│   └── messages_helper.rb
├── jobs/
│   └── chat_response_job.rb
├── models/
│   ├── chat.rb
│   ├── message.rb
│   ├── model.rb
│   └── tool_call.rb
├── prompts/
├── schemas/
├── tools/
└── views/
    ├── chats/
    │   ├── index.html.erb
    │   ├── show.html.erb
    │   └── _chat.html.erb
    └── messages/
        ├── _assistant.html.erb
        ├── _user.html.erb
        ├── _tool.html.erb
        ├── _error.html.erb
        ├── create.turbo_stream.erb
        ├── tool_calls/
        │   └── _default.html.erb
        └── tool_results/
            └── _default.html.erb
Separate partials for each message role. Turbo Stream templates for real-time updates via broadcasts_to. A background job that handles the AI response. Tool calls and tool results each get their own rendering pipeline. A complete Tailwind chat interface, not a scaffold you need to fight with.
Full Tutorial: New App from Scratch
If you want to start from zero, this is what the demo shows. The whole thing takes under two minutes.
rails new chat_app --css tailwind
cd chat_app
bundle add ruby_llm
bin/rails generate ruby_llm:install
bin/rails generate ruby_llm:chat_ui
bin/rails db:migrate
bin/rails ruby_llm:load_models
bin/dev
That’s a new Rails app with Tailwind, RubyLLM installed, the chat UI generated, the database set up, models loaded, and the server running. Open localhost:3000/chats and start talking to an AI.
Generators for Agents, Tools, and Schemas
Now the fun part. You scaffold agents, tools, and schemas the same way you’d scaffold anything else in Rails:
bin/rails generate ruby_llm:agent SupportAgent
app/
├── agents/
│   └── support_agent.rb
└── prompts/
    └── support_agent/
        └── instructions.txt.erb
The agent class comes with the 1.12 DSL ready to go. The instructions file is an ERB template for your system prompt, so you can version it, review it in PRs, and template it with runtime context.
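Because the instructions file is plain ERB, templating it with runtime context is just standard library Ruby. A minimal stdlib-only sketch; the template text and the company and date variables are made up for illustration, not what the generator emits:

```ruby
require "erb"

# A hypothetical instructions.txt.erb body with two runtime values.
template = <<~TEMPLATE
  You are a support agent for <%= company %>.
  Today's date is <%= date %>.
TEMPLATE

company = "Acme"
date    = "2025-03-11"

# ERB#result(binding) interpolates the local variables above.
puts ERB.new(template).result(binding)
```

Since the template is a file in your repo, changes to the system prompt show up in diffs and code review like any other change.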
bin/rails generate ruby_llm:tool WeatherTool
app/
├── tools/
│   └── weather_tool.rb
└── views/
    └── messages/
        ├── tool_calls/
        │   └── _weather.html.erb
        └── tool_results/
            └── _weather.html.erb
Each tool gets its own partials for rendering calls and results. Show a weather widget for the weather tool, a search results list for a search tool, all through Rails partials.
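The mapping from WeatherTool to _weather.html.erb follows the usual Rails underscore convention. As a rough stand-in for how that lookup could work (the partial_name helper below is hypothetical, not RubyLLM's actual implementation):

```ruby
# Hypothetical helper: derive the partial name from a tool class name,
# e.g. WeatherTool -> "weather", SearchResultsTool -> "search_results".
class WeatherTool
  def self.partial_name
    name
      .sub(/Tool\z/, "")               # drop the Tool suffix
      .gsub(/([a-z])([A-Z])/, '\1_\2') # insert underscores at camelCase boundaries
      .downcase
  end
end

puts WeatherTool.partial_name
```

If no tool-specific partial exists, the generated _default.html.erb partials from the chat UI act as the fallback rendering.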
bin/rails generate ruby_llm:schema Product
app/
└── schemas/
    └── product_schema.rb
This creates a schema for structured output validation.
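A schema pins down the shape of the JSON you want the model to return. As a rough illustration of what a product schema constrains, here is the equivalent idea expressed as a plain JSON Schema hash (the name and price fields are made-up examples; the generated file uses RubyLLM's schema class, not a raw hash):

```ruby
require "json"

# A hand-rolled JSON Schema describing a structured-output shape:
# an object with a required string name and numeric price.
product_schema = {
  type: "object",
  properties: {
    name:  { type: "string" },
    price: { type: "number" }
  },
  required: %w[name price]
}

puts JSON.pretty_generate(product_schema)
```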
More on all of this in the Rails integration docs, and the dedicated guides for agents and tools.
Self-Registering Provider Config
For people building provider gems: providers now register their own configuration options instead of patching a monolithic Configuration class.
class DeepSeek < RubyLLM::Provider
  class << self
    def configuration_options
      %i[deepseek_api_key deepseek_api_base]
    end
  end
end
When the provider is registered, its options become attr_accessors on RubyLLM::Configuration automatically. Third-party gems can add their config keys without touching the core.
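To make the mechanism concrete, here is a minimal, self-contained stand-in for the pattern. The Configuration and Registry classes below are stubs for illustration, not RubyLLM's actual internals:

```ruby
# Stand-in for RubyLLM::Configuration.
class Configuration; end

# Stand-in registry: registering a provider turns each of its
# configuration_options into an attr_accessor on Configuration.
module Registry
  def self.register(provider)
    provider.configuration_options.each do |opt|
      Configuration.class_eval { attr_accessor(opt) }
    end
  end
end

class DeepSeek
  def self.configuration_options
    %i[deepseek_api_key deepseek_api_base]
  end
end

Registry.register(DeepSeek)

config = Configuration.new
config.deepseek_api_key = "sk-..."
puts config.deepseek_api_key
```

The core never needs to know the provider's keys in advance; the accessors appear when the provider gem loads and registers itself.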
Bug Fixes
- Faraday logging memory bloat: logging no longer serializes large payloads (like base64-encoded PDFs) when the log level is above DEBUG.
- Agent assume_model_exists propagation: setting this on the agent class now actually works.
- Renamed model associations: foreign key references with acts_as helpers are fixed.
- MySQL/MariaDB compatibility: JSON column defaults work correctly now.
- Error.new with a string argument: no longer raises a NoMethodError.
Full list in the release notes.
gem 'ruby_llm', '~> 1.14'