
Conversation

@Ratish1 Ratish1 commented Sep 29, 2025

Description

This pull request adds support for non-streaming responses to the OpenAIModel. Users can now set streaming=False during model initialization to receive a single, complete response object instead of an event stream.
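
For example, a minimal usage sketch (the client_args/model_id keywords follow the existing OpenAIModel config style; only the streaming key is what this PR adds):

from strands.models.openai import OpenAIModel

# Hypothetical construction; exact client_args depend on your OpenAI setup.
model = OpenAIModel(
    client_args={"api_key": "<OPENAI_API_KEY>"},
    model_id="gpt-4o",
    streaming=False,  # receive one complete response instead of an event stream
)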

Key Changes

src/strands/models/openai.py:

  • Added optional streaming: Optional[bool] config key.

  • format_request now respects streaming (defaults to True to preserve existing behavior).

  • Stream method supports both streaming and non-streaming flows (converts non-streaming provider response into streaming-style events).

  • Added a small helper, _convert_non_streaming_to_streaming, to normalize non-streaming responses into the same event format (a rough sketch follows the change list below).

src/strands/models/litellm.py:

  • Updated to include a streaming option in its config and to preserve compatibility with OpenAIModel.

tests/strands/models/test_openai.py:

  • Updated/added tests to cover non-streaming behavior.
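
As referenced in the openai.py changes above, here is a rough sketch of the idea behind _convert_non_streaming_to_streaming. The event names and shapes below are assumptions for illustration and may differ from the helper actually added in this PR:

from typing import Any, Iterable

def _convert_non_streaming_to_streaming(response: Any) -> Iterable[dict[str, Any]]:
    # Walk the complete chat completion once and emit streaming-style events.
    choice = response.choices[0]

    yield {"chunk_type": "message_start"}
    yield {"chunk_type": "content_start", "data_type": "text"}

    if choice.message.content:
        # The whole completion arrives at once, so emit it as a single text delta.
        yield {"chunk_type": "content_delta", "data_type": "text", "data": choice.message.content}

    yield {"chunk_type": "content_stop", "data_type": "text"}
    yield {"chunk_type": "message_stop", "data": choice.finish_reason}

    if getattr(response, "usage", None):
        yield {"chunk_type": "metadata", "data": response.usage}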

Related Issues

Closes #778

Documentation PR

N/A

Type of Change

Bug fix

Testing

How have you tested the change? Verify that the changes do not break functionality or introduce warnings in consuming repositories: agents-docs, agents-tools, agents-cli

  • I ran hatch run prepare.
  • I ran hatch fmt --formatter locally.
  • I ran hatch fmt --linter and pre-commit run --all-files.
  • I ran unit tests locally (a rough sketch of the streaming-flag test follows below):
    • tests/strands/models/test_openai.py::test_stream → Passed
    • tests/strands/models/test_openai.py::test_stream_respects_streaming_flag → Passed
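
A rough sketch of what the streaming-flag test verifies (not the verbatim test from this PR; the patch target and the format_request signature are assumptions):

import unittest.mock

from strands.models.openai import OpenAIModel

def test_stream_respects_streaming_flag():
    # Patch the provider client module so no real API key or network access is needed.
    with unittest.mock.patch("strands.models.openai.openai"):
        model = OpenAIModel(model_id="gpt-4o", streaming=False)
        request = model.format_request(messages=[], tool_specs=None, system_prompt=None)
    assert request["stream"] is False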

Checklist

  • I have read the CONTRIBUTING document
  • I have added any necessary tests that prove my fix is effective or my feature works
  • I have updated the documentation accordingly
  • I have added an appropriate example to the documentation to outline the feature, or no new docs are needed
  • My changes generate no new warnings
  • Any dependent changes have been merged and published

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

@Ratish1 Ratish1 changed the title fix(models): Add non-streaming suppor to OpenAIModel fix(models): Add non-streaming support to OpenAIModel Sep 29, 2025

Ratish1 commented Oct 23, 2025

Wanted to follow up on this; could you take a look, @zastrowm? Thanks.

@Unshure Unshure (Member) left a comment

Can you add an integration test for this code as well?


model_id: str
params: Optional[dict[str, Any]]
streaming: Optional[bool]
Unshure (Member):

Suggested change:
- streaming: Optional[bool]
+ streaming: bool | None

"model": self.config["model_id"],
"stream": True,
# Use configured streaming flag; default True to preserve previous behavior.
"stream": bool(self.get_config().get("streaming", True)),
Unshure (Member):

Suggested change:
- "stream": bool(self.get_config().get("streaming", True)),
+ "stream": self.config.get("streaming"),

Can you update this throughout the PR as well?

Ratish1 (Contributor Author):

@Unshure I've simplified the code to use self.config.get("streaming") as suggested. However, I kept the True default, because without it the default behavior becomes non-streaming; wouldn't that be a breaking change for existing users?
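
To make the concern concrete (purely illustrative, not code from the PR):

config = {"model_id": "gpt-4o"}  # existing user config with no "streaming" key

with_default = config.get("streaming", True)  # True -> streaming preserved for existing users
without_default = config.get("streaming")     # None -> falsy, streaming silently disabled

print(with_default, without_default)  # True None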

Comment on lines 38 to 39
streaming: Optional flag to indicate whether provider streaming should be used.
If omitted, defaults to True (preserves existing behaviour).
Unshure (Member):

Do we need this in this PR? Can we punt LiteLLM non-streaming to another PR?

Ratish1 (Contributor Author):

Yes, I will remove it since this PR is about OpenAI. Should I create another PR for LiteLLM non-streaming?

@Ratish1 Ratish1 (Contributor Author) Oct 28, 2025:

Also, the pre-commit check was failing with a mypy error on the get_config method in LiteLLMModel. Removing the streaming field from LiteLLMConfig (as you requested) made its signature incompatible with the method it overrides in OpenAIModel. To resolve it, I told mypy to ignore the error.
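
For context, a simplified sketch of the incompatibility (class contents are reduced for illustration; this is not the PR's exact code):

from typing import Any, Optional, TypedDict

class OpenAIConfig(TypedDict, total=False):
    model_id: str
    params: Optional[dict[str, Any]]
    streaming: Optional[bool]

class LiteLLMConfig(TypedDict, total=False):
    model_id: str
    params: Optional[dict[str, Any]]
    # No "streaming" key: LiteLLMConfig is no longer a structural subtype of OpenAIConfig.

class OpenAIModel:
    def get_config(self) -> OpenAIConfig:
        return OpenAIConfig()

class LiteLLMModel(OpenAIModel):
    # mypy rejects the narrowed return type unless the configs stay consistent
    # or the override error is suppressed.
    def get_config(self) -> LiteLLMConfig:  # type: ignore[override]
        return LiteLLMConfig()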


Ratish1 commented Oct 28, 2025

Can you add an integration test for this code as well?

Will do

@Ratish1 Ratish1 force-pushed the feat/openai-non-streaming branch from 2f670b4 to 6a3f111 Compare October 28, 2025 19:46
@Ratish1 Ratish1 requested a review from Unshure October 28, 2025 19:46

Ratish1 commented Nov 7, 2025

Hello @Unshure, I wanted to follow up on this. Let me know if you need more changes, thanks.

This commit introduces a streaming parameter to OpenAIModel to allow for non-streaming responses.

The initial implementation revealed a type incompatibility in the inheriting LiteLLMModel. This has been resolved by updating LiteLLMConfig to be consistent with the parent OpenAIConfig, ensuring all pre-commit checks pass.

The associated unit tests for OpenAIModel have also been improved to verify the non-streaming behavior.
@Ratish1 Ratish1 force-pushed the feat/openai-non-streaming branch from 6a3f111 to 0db7641 Compare November 17, 2025 16:04
@github-actions github-actions bot removed the size/m label Nov 17, 2025

Ratish1 commented Dec 9, 2025

Hello @Unshure, a gentle reminder about this. Thanks.


Successfully merging this pull request may close these issues.

[BUG] Stream can't be disabled in OpenAIModel
