Releases: OlympiaAI/raix

Release 2.0.1

20 Mar 19:21

Changed

  • Replaced require_relative with Zeitwerk autoloading (thanks @seuros, PR #47)
  • Fixed RuboCop documentation offense

Release 2.0.0

20 Mar 18:37

Breaking Changes

  • Migrated from OpenRouter/OpenAI gems to RubyLLM - Raix now uses RubyLLM as its unified backend for all LLM providers. This provides better multi-provider support and a more consistent API.
  • Configuration changes - API keys are now configured through RubyLLM's configuration system instead of separate client instances.
  • Removed direct client dependencies - openrouter and ruby-openai gems are no longer direct dependencies; RubyLLM handles provider connections.

Added

  • before_completion hook - New hook system for intercepting and modifying chat completion requests before they're sent to the AI provider.
    • Configure at global, class, or instance levels
    • Hooks receive a CompletionContext with access to messages, params, and the chat completion instance
    • Messages are mutable for content filtering, PII redaction, adding system prompts, etc.
    • Params can be modified for dynamic model selection, A/B testing, and more
    • Supports any callable object (Proc, Lambda, or object responding to #call)
  • FunctionToolAdapter - New adapter for converting Raix function declarations to RubyLLM tool format
  • TranscriptAdapter - New adapter for bridging Raix's abbreviated message format with standard OpenAI format
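
The before_completion hook described above can be pictured with a small, self-contained sketch. This is not Raix's actual implementation: `CompletionContext` here is a simplified stand-in struct, and `register_hook`/`run_hooks` are hypothetical names used only to illustrate the callable-hook pattern (mutable messages, mutable params, any object responding to #call).

```ruby
# Simplified stand-in for Raix's CompletionContext (illustrative only)
CompletionContext = Struct.new(:messages, :params)

HOOKS = []

# Register any callable: Proc, Lambda, or object responding to #call
def register_hook(callable)
  HOOKS << callable
end

# Run every registered hook against the context before the request is sent
def run_hooks(context)
  HOOKS.each { |hook| hook.call(context) }
  context
end

# Example hook: redact email addresses and pin a model dynamically
register_hook(lambda do |ctx|
  ctx.messages.each do |m|
    m[:content] = m[:content].gsub(/\S+@\S+/, "[REDACTED]")
  end
  ctx.params[:model] = "gpt-4o"
end)
```

Because hooks receive the context object rather than copies, a hook can filter content, inject system prompts, or swap params without any special API beyond #call.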

Changed

  • Chat completions now use RubyLLM's unified API for all providers (OpenAI, Anthropic, Google, etc.)
  • Improved provider detection based on model name patterns
  • Streamlined internal architecture with dedicated adapters
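
"Provider detection based on model name patterns" might look something like the following sketch. The prefixes and method name are assumptions for illustration, not Raix's actual detection logic.

```ruby
# Illustrative provider detection keyed off model-name conventions
def provider_for(model)
  case model
  when /\Aclaude/     then :anthropic
  when /\Agemini/     then :gemini
  when /\Agpt|\Ao\d/  then :openai
  when %r{/}          then :openrouter # "vendor/model" style names
  else :unknown
  end
end
```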

Migration Guide

Update your configuration from:

Raix.configure do |config|
  config.openrouter_client = OpenRouter::Client.new(access_token: "...")
  config.openai_client = OpenAI::Client.new(access_token: "...")
end

To:

RubyLLM.configure do |config|
  config.openrouter_api_key = ENV["OPENROUTER_API_KEY"]
  config.openai_api_key = ENV["OPENAI_API_KEY"]
  # Also supports: anthropic_api_key, gemini_api_key
end

Release 1.0.3

08 Aug 23:46

Added support for GPT-5

v1.0.2

16 Jul 01:55
f18ce41

What's Changed

Added

  • Added method to check for API client availability in Configuration

Changed

  • Updated ruby-openai dependency to ~> 8.1

Fixed

  • Fixed gemspec file reference

Full Changelog: v1.0.1...v1.0.2

v1.0.1

05 Jun 01:58
bbbf3fe

What's Changed

Fixed

  • Fixed PromptDeclarations module namespace - now properly namespaced under Raix (#8)
  • Removed Rails.logger dependencies from PromptDeclarations for non-Rails environments
  • Fixed documentation example showing incorrect openai: true usage (should be model string) (#9)
  • Added comprehensive tests for PromptDeclarations module

Changed

  • Improved error handling in PromptDeclarations to catch StandardError instead of generic rescue

Issues Resolved

  • Closes #8 - Prompt declarations in README.md do not match the code
  • Closes #9 - Chat Completion Fails Due to Invalid JSON Payload (Status 400)

Full Changelog: v1.0.0...v1.0.1

v1.0.0

04 Jun 22:23
7d290d6

Major Release: Automatic Tool Call Continuation

This major release introduces automatic continuation after tool calls, eliminating the need for the loop parameter entirely. The system now automatically handles tool execution and continues the conversation until the AI provides a final text response.

Breaking Changes

  • Deprecated loop parameter - The system now automatically continues conversations after tool calls. The loop parameter shows a deprecation warning but still works for backwards compatibility.
  • Tool-based completions now return strings instead of arrays - When functions are called, the final response is a string containing the AI's text response, not an array of function results.
  • stop_looping! renamed to stop_tool_calls_and_respond! - Better reflects the new automatic continuation behavior.

New Features

  • Automatic conversation continuation - Chat completions automatically continue after tool execution without needing the loop parameter.
  • max_tool_calls parameter - Controls the maximum number of tool invocations to prevent infinite loops (default: 25).
  • Configuration for max_tool_calls - Added max_tool_calls to the Configuration class with sensible defaults.
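
The continuation loop these features describe can be sketched in a few lines. This is an illustration of the behavior (dispatch tools until the model returns plain text, capped by max_tool_calls), not Raix's actual internals; the `client` interface and response shape are assumptions.

```ruby
# Illustrative sketch of automatic tool-call continuation.
# `client` is any callable that, given the transcript so far, returns
# either { text: "..." } (final answer) or a tool-call request.
def run_completion(client, max_tool_calls: 25)
  transcript = []
  max_tool_calls.times do
    response = client.call(transcript)
    return response[:text] if response[:text] # final text response: done

    # Otherwise execute the requested tool and continue the conversation
    result = response[:tool].call(*response[:args])
    transcript << { tool_call: response[:tool_name], result: result }
  end
  raise "max_tool_calls exceeded"
end
```

The cap exists purely to prevent runaway loops when a model keeps requesting tools without ever producing a final answer.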

Migration Guide

# Before
response = ai.chat_completion(loop: true)

# After (automatic)
response = ai.chat_completion

# To limit tool calls
response = ai.chat_completion(max_tool_calls: 5)

Other Changes

  • Improved CI/CD workflow to use bundle exec rake ci for consistent testing
  • Fixed conflict between loop attribute and Ruby's Kernel.loop method (fixes #11)
  • Fixed various RuboCop warnings using keyword argument forwarding
  • Improved error handling with proper warning messages

See the CHANGELOG for full details.

v0.9.2

03 Jun 18:56
e976d07

What's Changed

Fixed

  • Fixed OpenAI chat completion compatibility
  • Fixed SHA256 hexdigest generation for MCP tool names
  • Added ostruct as explicit dependency to prevent warnings
  • Fixed rubocop lint error for alphabetized gemspec dependencies
  • Updated default OpenRouter model
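
A SHA256 hexdigest for MCP tool names, as mentioned in the fix above, is typically used to derive a stable, collision-resistant identifier. The helper below is a hypothetical sketch of that idea, not the code that was fixed.

```ruby
require "digest"

# Illustrative: derive a deterministic, length-limited tool name for an
# MCP server tool by suffixing a truncated SHA256 hexdigest.
def mcp_tool_name(server, tool)
  digest = Digest::SHA256.hexdigest("#{server}/#{tool}")[0, 8]
  "#{tool}_#{digest}"
end
```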

Full Changelog: v0.9.1...v0.9.2

v0.8

23 Apr 18:36
de45377

Adds experimental support for declaring MCP servers as tool functions

Full Changelog: v0.7.3...v0.8

v0.7.3

23 Apr 15:16

Small change to function call handling: commits both the tool call and its result to the transcript in one operation for thread safety.
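
The thread-safety idea is that the call and its result land in the transcript atomically, so another thread can never observe (or interleave with) a call that has no paired result. A minimal sketch, with hypothetical class and method names:

```ruby
# Illustrative transcript that appends a tool call and its result
# under a single lock, so the pair is committed atomically.
class Transcript
  def initialize
    @entries = []
    @mutex = Mutex.new
  end

  def commit_tool_exchange(call, result)
    @mutex.synchronize do
      @entries << call << result # both entries land in one critical section
    end
  end

  def entries
    @mutex.synchronize { @entries.dup }
  end
end
```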

0.7.1

13 Apr 23:58

Significantly improved PromptDeclarations module with many additional features.

Smaller changes:

  • Make automatic JSON parsing available to non-OpenAI providers that don't support the response_format parameter by scanning for `<json>` XML tags
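
The technique is straightforward: if the prompt asks the model to wrap its answer in json tags, the response can be scanned for that span and parsed regardless of provider. A minimal sketch (the method name and exact tag handling are assumptions, not Raix's code):

```ruby
require "json"

# Illustrative: extract and parse the first <json>...</json> span
# from a model response; returns nil when no tagged JSON is present.
def extract_json(text)
  match = text.match(%r{<json>(.*?)</json>}m)
  match ? JSON.parse(match[1]) : nil
end
```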

Full Changelog: v0.6...0.7