Carmine Paolino

I build AI tools at Chat with Work and maintain RubyLLM. Co-founded Freshflow. Outside tech, I make music, run Floppy Disco, and take photos.

RubyLLM 1.14: From Zero to AI Chat App in Under Two Minutes

RubyLLM 1.14 ships a Tailwind chat UI, Rails generators for agents and tools, and a simplified config DSL. Watch the full setup in 1:46.

Ruby Is the Best Language for Building AI Apps

A pragmatic, code-first argument for Ruby as the best language to ship AI products in 2026.

RubyLLM 1.12: Agents Are Just LLMs with Tools

Agents aren't magic. They're LLMs that can call your code. RubyLLM 1.12 adds a clean DSL to define and reuse them.

Dictation Is the New Prompt (Voxtype on Omarchy)

Stop typing every prompt. Speak it instead, with a fast Rust stack and a clean Omarchy setup.

Nano Banana with RubyLLM

Nano Banana hides behind Google's chat endpoint. Here's the straight line to ship it with RubyLLM.

RubyLLM 1.4-1.5.1: Three Releases in Three Days

Structured output that works, Rails generators that didn't exist, and why we shipped Wednesday, Friday, and Friday again.

Async Ruby is the Future of AI Apps (And It's Already Here)

How Ruby's async ecosystem transforms resource-intensive LLM applications into efficient, scalable systems, without rewriting your codebase.

RubyLLM 1.3.0: Smarter Attachments, Multi-Tenancy, and No More Manual Model Tracking

Attachments figure themselves out, contexts isolate configuration per tenant, and model data stays current automatically.

The LLM Capabilities and Pricing API is Live

The standard API for LLM model information I announced last month is now live and already integrated into RubyLLM 1.3.0.