
Add model parameter to InvokeLLM for per-call model selection#137

Merged
faygade merged 2 commits into main from support-model-in-invoke-llm on Mar 8, 2026
Conversation

@faygade
Contributor

faygade commented Mar 3, 2026

No description provided.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@claude

claude bot commented Mar 3, 2026

Claude finished @faygade's task


Pull Request Review

Review Tasks

  • Read the changed file to understand the modifications
  • Check for repo-specific guidelines (CLAUDE.md)
  • Analyze code quality and best practices
  • Check for potential bugs or issues
  • Evaluate performance considerations
  • Review security concerns
  • Assess test coverage for the changes
  • Check for related files that might need updates

Summary

This PR adds an optional model parameter to the InvokeLLMParams interface, allowing users to override the app-level model setting for individual API calls. The change adds 5 lines to define this new optional parameter with proper TypeScript typing and documentation.
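Based on the description above, the new field likely looks something like the following sketch. The interface name and the list of allowed model values come from this review; the surrounding fields and exact JSDoc wording are assumptions, not the PR's actual code:

```typescript
// Hypothetical sketch of the change under review. The six model names are the
// ones the review lists; everything else here is illustrative.
const ALLOWED_MODELS = [
  "gpt_5_mini",
  "gemini_3_flash",
  "gpt_5",
  "gemini_3_pro",
  "claude_sonnet_4_6",
  "claude_opus_4_6",
] as const;

type ModelName = (typeof ALLOWED_MODELS)[number]; // union of the six names

interface InvokeLLMParams {
  /** The prompt to send to the LLM. */
  prompt: string;
  /** Optional per-call override of the app-level model setting. */
  model?: ModelName;
}

// Valid: model is one of the allowed names. model: "gpt-4" would not compile.
const params: InvokeLLMParams = { prompt: "Summarize this text", model: "gpt_5" };
console.log(params.model); // → gpt_5
```

Deriving the union from an `as const` array is one common way to get both a runtime list and a compile-time type; the PR may simply declare the union inline instead.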

Code Quality and Best Practices ✅

  • Well-documented: The parameter includes clear JSDoc comments explaining its purpose and listing available options
  • Type-safe: Uses a union type to restrict model values to valid options only
  • Follows conventions: The code style and documentation format match the existing codebase patterns
  • Optional parameter: Correctly marked as optional with ?, maintaining backward compatibility

Potential Bugs or Issues ⚠️

  1. Inconsistent model names: The PR uses gpt-4 in existing examples (lines 26, 401, 432) but the new type definition doesn't include gpt-4. Only these models are allowed:

    • gpt_5_mini, gemini_3_flash, gpt_5, gemini_3_pro, claude_sonnet_4_6, claude_opus_4_6

    This inconsistency will cause TypeScript errors when users try to use the examples.

  2. Documentation examples need updates: The existing code examples in the same file show model: 'gpt-4' which won't compile with the new type definition.

Performance Considerations ✅

  • No performance impact: This is a type definition change only, adding an optional parameter
  • The actual implementation handling this parameter would need to be reviewed separately

Security Concerns ✅

  • No security issues identified: The change is limited to type definitions
  • Model selection is properly constrained to a predefined list, preventing arbitrary model injection

Test Coverage ❌

  • Missing tests: No test files found for InvokeLLM functionality
  • Recommendation: Add unit tests to verify:
    • The model parameter is properly passed through to the API
    • Invalid model values are rejected at compile time
    • Omitting the model parameter maintains current behavior
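A unit test along the lines recommended above could look like this sketch. The repo's actual suite (per the follow-up review) uses Vitest and nock; to keep this example self-contained it injects a stub transport instead, and `makeInvokeLLM` and the request-body shape are illustrative assumptions:

```typescript
// Sketch of a pass-through test for the model parameter, using a
// captured-request stub in place of nock. Hypothetical client shape.
type ModelName =
  | "gpt_5_mini" | "gemini_3_flash" | "gpt_5"
  | "gemini_3_pro" | "claude_sonnet_4_6" | "claude_opus_4_6";

interface InvokeLLMParams {
  prompt: string;
  model?: ModelName;
}

// The transport is injected so a test can capture the outgoing request body.
function makeInvokeLLM(post: (body: InvokeLLMParams) => Promise<{ output: string }>) {
  return (params: InvokeLLMParams) => post(params);
}

const sent: InvokeLLMParams[] = [];
const invokeLLM = makeInvokeLLM(async (body) => {
  sent.push(body); // record what would be sent to the API
  return { output: "ok" };
});

invokeLLM({ prompt: "hi", model: "gpt_5" });
invokeLLM({ prompt: "hi" }); // omitting model must still work (backward compat)

console.log(sent.map((b) => b.model ?? "app-default").join(",")); // → gpt_5,app-default
```

The same two assertions — model present when given, absent when omitted — map directly onto the first two recommended test cases.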

Related Files That Need Updates 📝

  1. Implementation file: The actual implementation that handles this parameter needs to be updated (not included in this PR)
  2. Documentation examples: Update all examples using model: 'gpt-4' to use valid model values from the new type union
  3. Migration guide: Consider documenting this change for users who might be using string literals for model names

Recommendations

  1. Either add gpt-4 to the allowed model types OR update all examples to use a valid model from the union type
  2. Include the implementation changes that actually use this new parameter
  3. Add tests for the new functionality
  4. Consider adding a comment about which model is used by default when this parameter is omitted
  5. Document any breaking changes if the model naming convention has changed

@github-actions

github-actions bot commented Mar 3, 2026

🚀 Package Preview Available!


Install this PR's preview build with npm:

npm i @base44-preview/sdk@0.8.20-pr.137.2242f71

Prefer not to change any import paths? Install using npm alias so your code still imports @base44/sdk:

npm i "@base44/sdk@npm:@base44-preview/sdk@0.8.20-pr.137.2242f71"

Or add it to your package.json dependencies:

{
  "dependencies": {
    "@base44/sdk": "npm:@base44-preview/sdk@0.8.20-pr.137.2242f71"
  }
}

Preview published to npm registry — try new features instantly!

github-actions bot added the docs-draft label (PR has auto-drafted documentation suggestions) on Mar 3, 2026
- Update JSDoc examples to use valid model value ('gpt_5') instead of 'gpt-4'
- Add unit tests for InvokeLLM model parameter pass-through

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@claude

claude bot commented Mar 4, 2026

Claude finished @faygade's task


Pull Request Review

Review Tasks

  • Check for repo-specific guidelines (CLAUDE.md)
  • Review changes to integrations.types.ts
  • Review the new test file (integrations.test.ts)
  • Check if previous review issues were addressed
  • Analyze code quality and best practices
  • Check for potential bugs or issues
  • Evaluate performance considerations
  • Review security concerns
  • Assess test coverage for the changes

Summary

This PR successfully adds an optional model parameter to the InvokeLLM function and includes comprehensive test coverage. The previous review's main concern about inconsistent model names has been fully addressed.

Previous Issues Resolved ✅

  1. Model naming inconsistency: All examples now use valid models (gpt_5 instead of gpt-4)
  2. Missing tests: Comprehensive test suite added with 3 test cases covering different scenarios

Code Quality and Best Practices ✅

  • Well-documented: Clear JSDoc comments explaining the parameter purpose and listing all available options
  • Type-safe: Uses TypeScript union type to restrict model values to valid options only
  • Follows conventions: Code style matches the existing codebase patterns
  • Optional parameter: Correctly marked as optional with ?, maintaining backward compatibility
  • Test structure: Tests follow standard patterns using Vitest and proper mocking with nock

Potential Bugs or Issues ✅

No issues found. The implementation correctly:

  • Uses underscores in model names (gpt_5 not gpt-5) consistently
  • All examples in the file now use valid model values
  • Type definition properly constrains the allowed models

Performance Considerations ✅

  • No performance impact: This is a type definition change and test addition only
  • Tests are well-structured and won't impact runtime performance

Security Concerns ✅

  • No security issues identified
  • Model selection is properly constrained to a predefined list, preventing arbitrary model injection
  • Tests don't expose any sensitive information

Test Coverage ✅

Excellent test coverage added:

  1. Basic usage test: Verifies model parameter is passed to API correctly
  2. Backward compatibility test: Ensures the function works without the model parameter
  3. Complex scenario test: Tests model parameter alongside other optional parameters like response_json_schema

The tests properly mock the API calls and verify the expected behavior.
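The third scenario above, combining a model override with `response_json_schema`, might look like the following sketch. The parameter names come from the review; the call shape and schema contents are illustrative assumptions, not the test file's actual code:

```typescript
// Illustrative params for the "complex scenario": a per-call model override
// together with a structured-output schema. Schema contents are made up.
const params = {
  prompt: "Extract the title and author from this citation.",
  model: "claude_opus_4_6" as const,
  response_json_schema: {
    type: "object",
    properties: {
      title: { type: "string" },
      author: { type: "string" },
    },
    required: ["title", "author"],
  },
};

// An SDK call would then pass these params, e.g. await InvokeLLM(params).
console.log(Object.keys(params).join(",")); // → prompt,model,response_json_schema
```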

Implementation Note ⚠️

While the type definitions and tests are complete, the actual implementation that handles this parameter in the API call logic should be verified to be working correctly. The tests mock the API responses, so they assume the backend properly handles the model parameter.

Recommendations

  1. Consider adding a test case that verifies TypeScript compilation fails when an invalid model is provided (type-level test)
  2. Document in the README or migration guide if users need to update their model references from gpt-4 to the new naming convention

Overall Assessment

This PR successfully addresses all previous concerns and adds proper test coverage. The implementation is clean, type-safe, and maintains backward compatibility. Ready for merge once the backend implementation is confirmed to handle the new parameter correctly.

faygade merged commit e8df716 into main on Mar 8, 2026
6 checks passed