
Conversation

@furqan-shaikh-dev

Summary

Adds support for the OpenAI Responses API by introducing a new class, OCIChatOpenAI. This allows users to invoke OCI Generative AI through the OpenAI Responses API.
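
For orientation, a hypothetical usage sketch of the new class. The constructor parameters shown (model, region, compartment_id, auth_type) and the model identifier are assumptions for illustration only; the PR's examples folder shows the real usage.

```python
from langchain_core.messages import HumanMessage

# OCIChatOpenAI is imported from the module added in this PR
# (oci_generative_ai_responses_api); the exact import path is defined there.
chat = OCIChatOpenAI(
    model="openai.gpt-4o",                            # assumed OCI model identifier
    region="us-chicago-1",                            # used to derive the service endpoint
    compartment_id="ocid1.compartment.oc1..example",  # placeholder OCID
    auth_type="API_KEY",                              # e.g. ~/.oci/config based auth
)

# Because the class subclasses langchain_openai.ChatOpenAI, the standard
# LangChain chat-model interface applies.
reply = chat.invoke([HumanMessage(content="Say hello in one sentence.")])
print(reply.content)
```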

Changes

  • Add a new file, oci_generative_ai_responses_api.py, which implements the OCIChatOpenAI class and OCI auth.
  • Add comprehensive unit tests covering the most common LangChain and LangGraph scenarios.
  • Add an examples folder with sample usages.
  • Add documentation for the change to README.md.

Testing

  • Unit Tests: 5/5 tests passing
  • Integration Tests: To be raised as a separate PR

Breaking Changes

None - this is a new feature that is fully backward compatible.

@oracle-contributor-agreement

Thank you for your pull request and welcome to our community! To contribute, please sign the Oracle Contributor Agreement (OCA).
The following contributors of this PR have not signed the OCA:

To sign the OCA, please create an Oracle account and sign the OCA in Oracle's Contributor Agreement Application.

When signing the OCA, please provide your GitHub username. After signing the OCA and getting an OCA approval from Oracle, this PR will be automatically updated.

If you are an Oracle employee, please make sure that you are a member of the main Oracle GitHub organization, and your membership in this organization is public.

@oracle-contributor-agreement bot added the OCA Required label (At least one contributor does not have an approved Oracle Contributor Agreement) on Nov 4, 2025
@furqan-shaikh-dev marked this pull request as ready for review on November 10, 2025 03:47
OUTPUT_VERSION = "responses/v1"


class OCIHttpxAuth(httpx.Auth):

Member: I think oci_openai has been published on GitHub. Can we use that?

Author: Sorry, do you mean the internal SDK (the OCI OpenAI Client SDK)? For LA it is not being published to PyPI, only to the internal Artifactory. @anbhaduri

Member: We can only start taking a dependency on oci_openai post-GA, when we start publishing it to PyPI.

Member: Why can't we leverage this? https://github.com/oracle-samples/oci-openai/tree/v0.2.1
I don't think it is a good idea to have a separate Auth defined.

Member: I just tested it; oci-openai is already available via pip install:
https://docs.oracle.com/en-us/iaas/Content/generative-ai/oci-openai.htm

Author: We can take a dependency on it, but that repo needs to be modified to include things specific to the OCI OpenAI Responses API, e.g. the conversation store ID. The header names (e.g. for compartment ID) are also different.

Member: If users are expected to use the oci-openai repo to interact with our service, then I think we should use it.
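
For context on this thread, a minimal sketch (not the PR's actual implementation) of what adapting an OCI request signer to httpx.Auth involves, i.e. the work a hand-rolled OCIHttpxAuth has to do and that the reviewers suggest delegating to oci-openai:

```python
from typing import Generator

import httpx
import oci
import requests


class OCIHttpxAuth(httpx.Auth):
    """Sign outgoing httpx requests with an OCI signer (illustrative only)."""

    requires_request_body = True  # the signer needs the body to compute the signature

    def __init__(self, signer: oci.signer.Signer) -> None:
        self.signer = signer

    def auth_flow(
        self, request: httpx.Request
    ) -> Generator[httpx.Request, httpx.Response, None]:
        # Build an equivalent requests.PreparedRequest, since oci.signer.Signer
        # implements requests' AuthBase interface rather than httpx's.
        prepared = requests.Request(
            method=request.method,
            url=str(request.url),
            headers=dict(request.headers),
            data=request.content,
        ).prepare()
        signed = self.signer(prepared)
        # Copy the signing headers (authorization, date, x-content-sha256, ...)
        # back onto the httpx request before it is sent.
        request.headers.update(signed.headers)
        yield request
```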

super().__init__(signer=signer)


def get_base_url(region: str, override_url: str = "") -> str:

Member: Should those go into utils?

Author: Do you mean the existing utils under llms? We kept it in a single place since it is only used by the oci_generative_ai_responses_api module. What do you think?

Member: I think it is better to put it into utils for extensibility, even though for now it is only used by the Responses API.

Author: Done.
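
For reference, a hypothetical shape for the helper after the move to utils. The host format shown is the public OCI Generative AI inference endpoint; the real helper in the PR may append a service-specific path for the Responses API.

```python
def get_base_url(region: str, override_url: str = "") -> str:
    """Return the service base URL, preferring an explicit override."""
    if override_url:
        return override_url
    if not region:
        raise ValueError("Either a region or an override URL must be provided.")
    # Public OCI Generative AI inference endpoint; the Responses API may be
    # exposed under an additional path segment on top of this host.
    return f"https://inference.generativeai.{region}.oci.oraclecloud.com"
```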

)


class OCIChatOpenAI(ChatOpenAI):

Member: Why not follow the naming standard? If the GenAI client is called ChatOCIGenAI, then I think this should be ChatOCIOpenAI.

Author: Renamed to ChatOCIOpenAI.

import requests
from langchain_openai import ChatOpenAI
from oci.config import DEFAULT_LOCATION, DEFAULT_PROFILE
from openai import (

Member: Is openai not required in the dependencies?

Author: Based on the way the current packages are managed, we raise an error message when a specific required package is not installed (for example oracle-ads or langchain-openai), so we followed the same approach. When langchain-openai is not installed an error message is thrown, and langchain-openai brings in openai as a transitive dependency. What do you think?

Member: It is going to fail because you are importing at the top level. Could you check what ads is doing?

Author: Done.
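
For reference, a minimal sketch of the deferred-import pattern described above (the helper name is hypothetical):

```python
def _import_chat_openai():
    """Import ChatOpenAI lazily so a missing optional dependency fails with a clear message."""
    try:
        from langchain_openai import ChatOpenAI
    except ImportError as err:
        raise ImportError(
            "Could not import langchain_openai. "
            "Install it with `pip install langchain-openai` "
            "(this also pulls in openai as a transitive dependency)."
        ) from err
    return ChatOpenAI
```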

@streamnsight (Member)

@YouNeedCryDear, wouldn't it be a lot easier to switch to the oci_openai package when making calls to external models (Llama, Grok, GPT, ...)? Then all of that support would be propagated from the original OpenAI SDK, and the only thing we would need to implement ourselves is support for the Cohere models.
https://github.com/oracle-samples/oci-openai

@YouNeedCryDear (Member)


Sounds like a good idea, but that would be a huge change to the code base, and I don't think the Responses API team is able to take it on.
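
To make the proposal concrete, a hedged sketch of pointing the stock openai SDK at the OCI endpoint with OCI request signing. The endpoint, model name, and the OCIHttpxAuth adapter from the earlier sketch are all assumptions; service-specific headers (e.g. compartment ID) are omitted because, as noted above, their names differ.

```python
import httpx
import oci
from openai import OpenAI

# Build an OCI signer from the default ~/.oci/config profile.
config = oci.config.from_file()
signer = oci.signer.Signer(
    tenancy=config["tenancy"],
    user=config["user"],
    fingerprint=config["fingerprint"],
    private_key_file_location=config["key_file"],
    pass_phrase=config.get("pass_phrase"),
)

client = OpenAI(
    base_url="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",  # assumed endpoint
    api_key="unused",  # request signing replaces API-key auth, but the SDK requires a value
    http_client=httpx.Client(auth=OCIHttpxAuth(signer)),  # adapter sketched earlier in this thread
)

resp = client.responses.create(model="openai.gpt-4o", input="Say hello in one sentence.")
print(resp.output_text)
```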
