RubyLLM 1.14: From Zero to AI Chat App in Under Two Minutes
RubyLLM 1.14 ships a Tailwind chat UI, Rails generators for agents and tools, and a simplified config DSL. Watch the full setup in 1:46.
A pragmatic, code-first argument for Ruby as the best language to ship AI products in 2026.
Agents aren't magic. They're LLMs that can call your code. RubyLLM 1.12 adds a clean DSL to define and reuse them.
Stop typing every prompt. Speak it instead, with a fast Rust stack and a clean Omarchy setup.
Nano Banana hides behind Google's chat endpoint. Here's the straight line to ship it with RubyLLM.
Structured output that works, Rails generators that didn't exist, and why we shipped Wednesday, Friday, and Friday again.
How Ruby's async ecosystem transforms resource-intensive LLM applications into efficient, scalable systems - without rewriting your codebase.
Attachments figure themselves out, contexts isolate configuration per tenant, and model data stays current automatically.
The standard API for LLM model information I announced last month is now live and already integrated into RubyLLM 1.3.0.