Releases: OlympiaAI/raix
Release 2.0.1
Release 2.0.0
Breaking Changes
- Migrated from OpenRouter/OpenAI gems to RubyLLM - Raix now uses RubyLLM as its unified backend for all LLM providers. This provides better multi-provider support and a more consistent API.
- Configuration changes - API keys are now configured through RubyLLM's configuration system instead of separate client instances.
- Removed direct client dependencies - The `openrouter` and `ruby-openai` gems are no longer direct dependencies; RubyLLM handles provider connections.
Added
- `before_completion` hook - New hook system for intercepting and modifying chat completion requests before they're sent to the AI provider.
  - Configure at global, class, or instance levels
  - Hooks receive a `CompletionContext` with access to messages, params, and the chat completion instance
  - Messages are mutable for content filtering, PII redaction, adding system prompts, etc.
  - Params can be modified for dynamic model selection, A/B testing, and more
  - Supports any callable object (Proc, Lambda, or object responding to `#call`)
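The hook contract can be illustrated with a plain-Ruby sketch. Note this is an illustration of the callable-plus-mutable-context idea, not Raix's internals: the `CompletionContext` below is a stand-in `Struct`, and `redact_emails` is a hypothetical hook.

```ruby
# Stand-in for the CompletionContext a real hook receives: an object
# exposing mutable messages and params.
CompletionContext = Struct.new(:messages, :params)

# Any callable works as a hook; this one redacts email addresses
# from message content before the request would be sent.
redact_emails = lambda do |context|
  context.messages.each do |message|
    message[:content] = message[:content].gsub(/\S+@\S+\.\S+/, "[REDACTED]")
  end
end

context = CompletionContext.new(
  [{ role: "user", content: "Contact me at jane@example.com" }],
  { model: "gpt-4o" }
)

redact_emails.call(context)
context.messages.first[:content]
# => "Contact me at [REDACTED]"
```

Because hooks are just callables, the same object could be registered globally, per class, or per instance.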
- `FunctionToolAdapter` - New adapter for converting Raix function declarations to RubyLLM tool format
- `TranscriptAdapter` - New adapter for bridging Raix's abbreviated message format with standard OpenAI format
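The kind of bridging the transcript adapter performs can be sketched in plain Ruby. This assumes the abbreviated `{ user: "..." }` / `{ assistant: "..." }` shorthand and is an illustration, not the adapter's actual implementation:

```ruby
# Illustrative conversion from abbreviated message shorthand
# ({ user: "..." }, { system: "..." }) to the standard OpenAI
# { role:, content: } shape. Messages already in standard form
# pass through unchanged.
def expand_abbreviated(transcript)
  transcript.map do |message|
    if message.key?(:role)
      message
    else
      role, content = message.first
      { role: role.to_s, content: content }
    end
  end
end

transcript = [
  { system: "You are terse." },
  { user: "Hello" },
  { role: "assistant", content: "Hi." }
]

expand_abbreviated(transcript)
# => [{ role: "system", content: "You are terse." },
#     { role: "user", content: "Hello" },
#     { role: "assistant", content: "Hi." }]
```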
Changed
- Chat completions now use RubyLLM's unified API for all providers (OpenAI, Anthropic, Google, etc.)
- Improved provider detection based on model name patterns
- Streamlined internal architecture with dedicated adapters
Migration Guide
Update your configuration from:
```ruby
Raix.configure do |config|
  config.openrouter_client = OpenRouter::Client.new(access_token: "...")
  config.openai_client = OpenAI::Client.new(access_token: "...")
end
```

To:

```ruby
RubyLLM.configure do |config|
  config.openrouter_api_key = ENV["OPENROUTER_API_KEY"]
  config.openai_api_key = ENV["OPENAI_API_KEY"]
  # Also supports: anthropic_api_key, gemini_api_key
end
```

Release 1.0.3
Added support for GPT-5
v1.0.2
What's Changed
Added
- Added method to check for API client availability in Configuration
Changed
- Updated ruby-openai dependency to ~> 8.1
Fixed
- Fixed gemspec file reference
Full Changelog: v1.0.1...v1.0.2
v1.0.1
What's Changed
Fixed
- Fixed PromptDeclarations module namespace - now properly namespaced under Raix (#8)
- Removed Rails.logger dependencies from PromptDeclarations for non-Rails environments
- Fixed documentation example showing incorrect `openai: true` usage (should be a model string) (#9)
- Added comprehensive tests for PromptDeclarations module
Changed
- Improved error handling in PromptDeclarations to catch StandardError instead of generic rescue
Issues Resolved
- Closes #8 - Prompt declarations in README.md do not match the code
- Closes #9 - Chat Completion Fails Due to Invalid JSON Payload (Status 400)
Full Changelog: v1.0.0...v1.0.1
v1.0.0
Major Release: Automatic Tool Call Continuation
This major release introduces automatic continuation after tool calls, eliminating the need for the loop parameter entirely. The system now automatically handles tool execution and continues the conversation until the AI provides a final text response.
Breaking Changes
- Deprecated `loop` parameter - The system now automatically continues conversations after tool calls. The `loop` parameter shows a deprecation warning but still works for backwards compatibility.
- Tool-based completions now return strings instead of arrays - When functions are called, the final response is a string containing the AI's text response, not an array of function results.
- `stop_looping!` renamed to `stop_tool_calls_and_respond!` - Better reflects the new automatic continuation behavior.
New Features
- Automatic conversation continuation - Chat completions automatically continue after tool execution without needing the `loop` parameter.
- `max_tool_calls` parameter - Controls the maximum number of tool invocations to prevent infinite loops (default: 25).
- Configuration for `max_tool_calls` - Added `max_tool_calls` to the Configuration class with sensible defaults.
Migration Guide
```ruby
# Before
response = ai.chat_completion(loop: true)

# After (automatic)
response = ai.chat_completion

# To limit tool calls
response = ai.chat_completion(max_tool_calls: 5)
```

Other Changes
- Improved CI/CD workflow to use `bundle exec rake ci` for consistent testing
- Fixed conflict between `loop` attribute and Ruby's `Kernel.loop` method (fixes #11)
- Fixed various RuboCop warnings using keyword argument forwarding
- Improved error handling with proper warning messages
See the CHANGELOG for full details.
v0.9.2
What's Changed
Fixed
- Fixed OpenAI chat completion compatibility
- Fixed SHA256 hexdigest generation for MCP tool names
- Added ostruct as explicit dependency to prevent warnings
- Fixed rubocop lint error for alphabetized gemspec dependencies
- Updated default OpenRouter model
Full Changelog: v0.9.1...v0.9.2
v0.8
Adds experimental support for declaring MCP servers as tool functions
Full Changelog: v0.7.3...v0.8
v0.7.3
0.7.1
Significantly improved PromptDeclarations module with many additional features.
Smaller changes:
- Make automatic JSON parsing available to non-OpenAI providers that don't support the `response_format` parameter by scanning responses for JSON wrapped in XML tags
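The scanning approach can be sketched as follows. The `<json>` tag name and the `extract_json` helper here are assumptions for illustration; the exact tag Raix looks for may differ:

```ruby
require "json"

# Pull a JSON payload out of XML-style tags in a raw model response,
# for providers that can't enforce JSON output via response_format.
# The <json> tag name is an assumption, not necessarily Raix's exact tag.
def extract_json(response_text)
  match = response_text.match(%r{<json>(.*?)</json>}m)
  match ? JSON.parse(match[1]) : nil
end

raw = 'Here is the result: <json>{"status": "ok", "count": 2}</json>'
extract_json(raw)
# => {"status"=>"ok", "count"=>2}
```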
Full Changelog: v0.6...0.7