
The Happy Path to Multi-Tenant AI
RubyLLM 1.3.0 is here (as a release candidate), and it's packed with awesome improvements. This is a big release, so let's go through everything we have in store for you.
Configuration Contexts
The global configuration pattern works beautifully for simple applications. Set your API keys once, use them everywhere. But the moment you need different configurations for different customers, environments, or features, that simplicity becomes a liability.
We could have forced everyone to pass configuration objects around. We could have built some complex dependency injection system. Instead, we built contexts:
# Each tenant gets their own isolated configuration
tenant_context = RubyLLM.context do |config|
  config.openai_api_key = tenant.openai_key
  config.anthropic_api_key = tenant.anthropic_key
  config.request_timeout = 180 # This tenant needs more time
end
# Use it without polluting the global namespace
response = tenant_context.chat.ask("Process this customer request...")
# The default configuration remains untouched
RubyLLM.chat.ask("This still uses your global settings")
Simple, elegant, Ruby-like. Your multi-tenant application doesn’t need architectural gymnastics. Each context is isolated, thread-safe, and garbage-collected when you’re done with it.
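To make the isolation concrete, here's a minimal sketch of per-tenant requests running concurrently. The tenants collection and its openai_key attribute are illustrative assumptions, not part of RubyLLM:
# Each thread builds its own context from tenant-specific keys,
# so concurrent tenants never share configuration.
# (`tenants` and `openai_key` are illustrative stand-ins.)
threads = tenants.map do |tenant|
  Thread.new do
    context = RubyLLM.context do |config|
      config.openai_api_key = tenant.openai_key
    end
    context.chat.ask("Summarize this tenant's latest ticket")
  end
end
threads.each(&:join)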
Ollama support
We’ve added Ollama support because your development machine shouldn’t need to phone home to OpenAI every time you want to test something:
RubyLLM.configure do |config|
  config.ollama_api_base = 'http://localhost:11434/api'
end
# Same API, different model
chat = RubyLLM.chat(model: 'gemma3:4b')
response = chat.ask("Explain Ruby's eigenclass")
This matters for development, for testing, for compliance, for costs. Sometimes the best model is the one running on your own hardware.
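Ollama also pairs nicely with the new contexts: you can point a single context at your local instance while the global configuration keeps its cloud keys. A sketch, assuming contexts accept the same chat options as the top-level API:
# Route one context to local Ollama; the global config stays untouched.
local = RubyLLM.context do |config|
  config.ollama_api_base = 'http://localhost:11434/api'
end

local.chat(model: 'gemma3:4b').ask("Explain Ruby's eigenclass")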
OpenRouter support
OpenRouter gives you access to hundreds of models through a single API. We now support it natively:
chat = RubyLLM.chat(
  model: 'claude-3.5-sonnet',
  provider: 'openrouter'
)
The End of Manual Model Tracking: Parsera
Here’s where things get interesting. Remember that LLM capabilities API I wrote about? It’s live, and it’s transforming how RubyLLM handles model information.
We’ve partnered with Parsera to create a single source of truth for LLM capabilities and pricing. No more manually updating capabilities.rb files every time OpenAI changes their pricing. No more hunting through documentation to find context windows.
When you run RubyLLM.models.refresh!, you’re now pulling from the Parsera API - a continuously updated registry that scrapes model information directly from provider documentation. Context windows, pricing, capabilities, supported modalities - it’s all there, always current. They also built a nice model browser UI, neat!
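Here's a sketch of what that looks like in practice; the find lookup and the context_window attribute are assumptions for illustration, so check the model guide for the exact API:
# Refresh the registry from Parsera, then inspect a model's metadata.
# Attribute names below are assumed for illustration.
RubyLLM.models.refresh!

model = RubyLLM.models.find('gpt-4o')
puts model.context_window # assumed attribute name, e.g. 128_000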
However, providers don’t always document everything perfectly. We discovered plenty of older models still available through their APIs but missing from official docs. That’s why we kept our capabilities.rb files in the provider code - they fill in the gaps for models the Parsera API doesn’t cover yet. Between the two sources, we support virtually every model worth using. Check our model guide to see the complete list.
Dear providers, if you’re reading this: take inspiration from our API and deliver all the info we need directly from your list-models endpoints.
Read more about the Parsera LLM Capabilities and Pricing API here.
Attachments That Just Work
We’ve made attachments smarter without breaking what already worked. Previously, you had to explicitly specify file types:
# The old way (still works!)
chat.ask "What's in this image?", with: { image: "ruby_conf.jpg" }
chat.ask "Describe this meeting", with: { audio: "meeting.wav" }
chat.ask "Summarize this document", with: { pdf: "contract.pdf" }
Now RubyLLM figures out the type automatically:
# The new way - just pass the files
chat.ask("What's in this file?", with: "quarterly_report.pdf")
# Mix and match without categorizing
chat.ask("Analyze these", with: [
"revenue_chart.png",
"board_meeting.mp3",
"financial_summary.pdf"
])
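Remote files should work the same way. URLs were already supported with the explicit image: form, so the bare form presumably detects them too; the URL below is a placeholder:
# Hedged example: a remote file passed directly, type inferred from the URL.
chat.ask("What's in this chart?", with: "https://example.com/chart.png")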
ActiveStorage integration
For Rails developers, this release is particularly sweet. We’ve deeply integrated attachment handling with ActiveStorage:
# In your Message model
class Message < ApplicationRecord
  acts_as_message
  has_many_attached :attachments # Enable attachment support
end
# Handle file uploads directly from forms
chat_record.ask("Analyze this document", with: params[:uploaded_file])
# Work with existing ActiveStorage attachments
chat_record.ask("Summarize my resume", with: current_user.resume)
# Process multiple uploads at once
chat_record.ask("Compare these proposals", with: params[:proposal_files])
The type detection works seamlessly with Rails’ upload handling. Your users can upload any supported file type, and RubyLLM handles the rest.
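For the full round trip, here's a hedged controller sketch. The ChatsController, the Chat model, and the param names are illustrative stand-ins; only ask(..., with:) comes from RubyLLM itself:
# Illustrative Rails controller tying form uploads to a chat record.
class ChatsController < ApplicationController
  def ask
    chat_record = Chat.find(params[:id])
    response = chat_record.ask(
      params[:prompt],
      with: params[:files] # uploaded files straight from the form
    )
    render json: { answer: response.content }
  end
end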
Ship It
This release candidate is available now:
gem 'ruby_llm', '1.3.0.rc1'
We’ll release the final 1.3.0 next week after the community has had a chance to test. The features are solid, the tests are comprehensive (197+ and growing), and the API is stable.
This Is Just The Beginning
Want to shape RubyLLM’s future? Join us on GitHub.