
Conversation

@fede-kamel (Contributor)

Summary

Implements response_format parameter to enable JSON mode and JSON schema output formatting for both Meta Llama and Cohere models. This allows users to leverage structured output capabilities via with_structured_output() and direct response_format configuration.
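
For illustration, a minimal usage sketch of JSON mode via the new field; the import path, model id, and the exact shape of the response_format value are assumptions for the example, not copied from this PR:

```python
# Hedged usage sketch: the import path, model id, and the response_format
# value are assumptions; adjust to this package's actual layout.
from langchain_oci import ChatOCIGenAI

llm = ChatOCIGenAI(
    model_id="meta.llama-3.1-70b-instruct",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="<compartment_ocid>",
    response_format={"type": "JSON_OBJECT"},  # JSON mode via the new field
)

result = llm.invoke("Return a JSON object with keys 'name' and 'age' for Ada Lovelace.")
print(result.content)  # expected to be a JSON string
```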

Changes

  • Add response_format field to OCIGenAIBase with comprehensive documentation
  • Implement response_format handling in ChatOCIGenAI._prepare_request method
  • Support response_format via class initialization, bind(), and model_kwargs
  • Add comprehensive unit tests covering all configuration methods
  • Add integration tests validating end-to-end functionality with real API calls

Implementation Details

The implementation ensures response_format is properly passed to both GenericChatRequest (Meta Llama) and CohereChatRequest (Cohere) models by adding the parameter to the model_kwargs in the _prepare_request method.
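
In rough outline, the base class gains an optional response_format field and _prepare_request folds it into the keyword arguments handed to the provider-specific chat request. The sketch below illustrates that merging logic under assumptions; apart from response_format and model_kwargs, the names are stand-ins rather than the actual code:

```python
from typing import Any, Dict, Optional

class ResponseFormatSketch:
    """Illustrative stand-in for the response_format plumbing on OCIGenAIBase."""

    response_format: Optional[Any] = None          # new optional field added by this PR
    model_kwargs: Optional[Dict[str, Any]] = None

    def merge_response_format(self, call_kwargs: Dict[str, Any]) -> Dict[str, Any]:
        """Mirror what _prepare_request is described as doing.

        A per-call value (e.g. supplied via bind()) takes precedence, then
        model_kwargs, then the field set on the class; the merged dict is
        what GenericChatRequest / CohereChatRequest ultimately receive.
        """
        merged = dict(self.model_kwargs or {})
        merged.update(call_kwargs)
        if self.response_format is not None:
            merged.setdefault("response_format", self.response_format)
        return merged
```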

Testing

  • Unit Tests: 9/9 tests passing

    • Class-level response_format
    • response_format via bind()
    • response_format via model_kwargs
    • Pass-through to the API for Generic and Cohere models
    • with_structured_output integration (json_mode and json_schema)
  • Integration Tests: 10/10 tests passing with real OCI API calls

    • JSON mode with Meta Llama and Cohere
    • Structured output with Pydantic models (see the sketch after this list)
    • JSON schema mode with Meta Llama
    • Complex nested structures
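
To make the Pydantic case concrete, a hedged sketch of the pattern the structured-output tests exercise; the model id, endpoint, schema, and method argument are illustrative assumptions, not copied from the test suite:

```python
# Hedged sketch of structured output with a Pydantic model; model id,
# endpoint, schema, and the method value are assumptions.
from pydantic import BaseModel, Field
from langchain_oci import ChatOCIGenAI  # import path assumed

class Person(BaseModel):
    name: str = Field(description="Full name")
    age: int = Field(description="Age in years")

llm = ChatOCIGenAI(
    model_id="cohere.command-r-plus",
    service_endpoint="<service_endpoint>",
    compartment_id="<compartment_ocid>",
)

structured_llm = llm.with_structured_output(Person, method="json_mode")
person = structured_llm.invoke("Extract the person: Grace Hopper, 85 years old.")
print(person)  # expected: Person(name='Grace Hopper', age=85)
```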

Fixes

Fixes #33

Breaking Changes

None - this is a new feature that is fully backward compatible.

@oracle-contributor-agreement (bot) added the OCA Verified label (all contributors have signed the Oracle Contributor Agreement) on Oct 31, 2025.
@fede-kamel (Contributor, Author)

Hi @YouNeedCryDear! 👋

I hope you're doing well! I wanted to gently follow up on this PR that adds response_format parameter support for structured output.

Why this matters:
This implements a feature requested in issue #33 and enables users to leverage JSON mode and JSON schema output formatting for both Meta Llama and Cohere models through LangChain's standard with_structured_output() method.

What's been validated:

  • ✅ All 9 unit tests passing
  • ✅ All 10 integration tests passing with live OCI API calls
  • ✅ Fully backward compatible (no breaking changes)
  • ✅ Comprehensive documentation and examples included

This follows the same parameter pattern as other recent enhancements and should be a straightforward addition to the SDK's capabilities.

Would you have a chance to take a look when you get a moment? I'm happy to address any feedback or questions you might have!

Thanks so much for maintaining this project! 🙏

@YouNeedCryDear (Member)

@fede-kamel Why do we need response_format in the class? Can it be inside the model_kwargs? I don't see other vendors doing this.
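
For readers following the thread, the two spellings being weighed look roughly like this (a sketch with illustrative values; the PR body describes both paths as supported, and the import path is assumed):

```python
# Sketch of the two configurations under discussion; ids and the
# response_format value are illustrative, import path assumed.
from langchain_oci import ChatOCIGenAI

common = dict(
    model_id="meta.llama-3.1-70b-instruct",
    service_endpoint="<service_endpoint>",
    compartment_id="<compartment_ocid>",
)

# Dedicated class-level field added by this PR:
llm_field = ChatOCIGenAI(**common, response_format={"type": "JSON_OBJECT"})

# Plain model_kwargs, with no new field on the class (the reviewer's suggestion):
llm_kwargs = ChatOCIGenAI(**common, model_kwargs={"response_format": {"type": "JSON_OBJECT"}})
```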

Development

Merging this pull request may close issue #33: Unrecognized keyword arguments: response_format.