
feat: add OpenAI OAuth provider option #65

Merged
warren618 merged 2 commits into HKUDS:main from ykykj:fix-openai-oauth-provider
Apr 30, 2026

Conversation

Collaborator

@ykykj ykykj commented Apr 30, 2026

Summary

  • Add an openai-codex provider backed by ChatGPT/Codex OAuth via oauth-cli-kit
  • Add vibe-trading provider login openai-codex for interactive OAuth login and token storage
  • Route this provider through a dedicated Codex adapter instead of mapping OAuth credentials into OPENAI_API_KEY
  • Restrict OAuth requests to https://chatgpt.com/backend-api/codex/responses and send chatgpt-account-id
  • Expose OAuth provider metadata in Settings and init flows
  • Add focused regression coverage for the Codex adapter and provider defaults

Closes #49.
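The routing change above can be sketched in miniature. The names build_llm and OpenAICodexLLM appear elsewhere in this PR discussion, but the bodies below are illustrative assumptions, not the PR's actual implementation:

```python
class ChatOpenAILLM:
    """Stand-in for the existing API-key-based provider path."""
    def __init__(self, model: str):
        self.model = model

class OpenAICodexLLM:
    """Stand-in for the dedicated Codex adapter that uses the stored
    ChatGPT/Codex OAuth token instead of OPENAI_API_KEY."""
    def __init__(self, model: str):
        self.model = model

def build_llm(provider: str, model: str):
    # openai-codex is routed to the dedicated Codex adapter; every
    # other provider keeps the existing ChatOpenAI-style path.
    if provider == "openai-codex":
        return OpenAICodexLLM(model)
    return ChatOpenAILLM(model)
```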

Tests

  • python -m py_compile agent/src/providers/llm.py agent/src/providers/openai_codex.py agent/cli.py agent/api_server.py agent/src/preflight.py
  • pytest agent/tests/test_openai_codex.py agent/tests/test_llm.py agent/tests/test_settings_api.py agent/tests/test_cli_init.py --tb=short -q
  • Live smoke: openai-codex/gpt-5.3-codex returned OK from the ChatGPT Codex OAuth endpoint
  • GitHub Actions CI

Collaborator

@warren618 warren618 left a comment


Thanks for taking this on. I do not think we should merge this in its current form.

The issue asks for an OpenAI account/OAuth option, but this PR only aliases OPENAI_OAUTH_TOKEN into the existing OPENAI_API_KEY path. That does not implement an OAuth flow, account linking, token refresh/expiration handling, secure local storage, or user-facing setup/error behavior. It also risks presenting a bearer token variable as OAuth support even though the runtime still treats it exactly like an API key.

Because this touches the protected provider layer, we should keep the scope stricter here. Before changing agent/src/providers, please first define the actual supported auth flow and how it integrates with the existing ChatOpenAI path. A mergeable version should include documented token source, refresh/expiry semantics, storage/setup guidance, compatibility behavior, and tests that prove it is more than an environment-variable alias.
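For illustration, the refresh/expiry semantics being requested might look like the following minimal sketch; the OAuthToken shape and its field names are assumptions for discussion, not anything in the PR:

```python
import time
from dataclasses import dataclass

@dataclass
class OAuthToken:
    access_token: str
    refresh_token: str
    expires_at: float  # Unix timestamp when access_token stops working

    def is_expired(self, skew: float = 60.0) -> bool:
        # Treat tokens as expired slightly early so an in-flight
        # request does not race the real expiry.
        return time.time() >= self.expires_at - skew
```

A mergeable design would document where this record lives on disk and when the refresh_token is exchanged for a new access_token.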

Keeping #49 open for discussion; this PR should not close it as-is.

@ykykj ykykj force-pushed the fix-openai-oauth-provider branch from de913aa to 1cf2e9c on April 30, 2026 at 11:22
@ykykj ykykj force-pushed the fix-openai-oauth-provider branch from 1cf2e9c to 772bae2 on April 30, 2026 at 11:41
@ykykj ykykj closed this Apr 30, 2026
Collaborator Author

ykykj commented Apr 30, 2026

Updated.

This no longer maps OAuth tokens to OPENAI_API_KEY. The old openai-oauth / OPENAI_OAUTH_TOKEN path has been removed.

The PR now adds a separate openai-codex provider using ChatGPT/Codex OAuth via oauth-cli-kit:

  • login command: vibe-trading provider login openai-codex
  • OAuth token is stored by oauth-cli-kit, not in .env
  • requests include chatgpt-account-id
  • supported endpoint is restricted to https://chatgpt.com/backend-api/codex/responses
  • Settings UI marks it as OAuth and disables arbitrary base URL editing
  • preflight shows a clear login-required message
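The endpoint restriction and required headers above can be sketched as follows; the helper names validate_base_url and build_codex_headers are hypothetical, chosen only to illustrate the behavior described:

```python
ALLOWED_CODEX_URL = "https://chatgpt.com/backend-api/codex/responses"

def validate_base_url(url: str) -> str:
    # Reject anything other than the single supported Codex endpoint,
    # so OAuth tokens are never sent to an arbitrary base URL.
    if url.rstrip("/") != ALLOWED_CODEX_URL:
        raise ValueError(f"unsupported base URL for openai-codex: {url}")
    return url

def build_codex_headers(access_token: str, account_id: str) -> dict:
    # Each request carries both the OAuth bearer token and the
    # chatgpt-account-id header expected by the ChatGPT Codex backend.
    return {
        "Authorization": f"Bearer {access_token}",
        "chatgpt-account-id": account_id,
        "Content-Type": "application/json",
    }
```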

Verification:

  • pytest agent/tests/test_llm.py agent/tests/test_settings_api.py agent/tests/test_cli_init.py: 39 passed
  • Full agent/tests: 688 passed, 1 skipped, 3 failed on pre-existing Windows POSIX path expectations in test_loop_helpers.py, unrelated to this PR.

@ykykj ykykj reopened this Apr 30, 2026
@ykykj ykykj closed this Apr 30, 2026
@ykykj ykykj reopened this Apr 30, 2026
Collaborator

@warren618 warren618 left a comment


Thanks for updating this beyond the original env-alias approach. The separate provider, OAuth CLI path, and restricted endpoint are the right direction, but I do not think this is merge-ready yet.

Blocking issue: the default model configured by this PR does not work with the new ChatGPT Codex account endpoint. I installed the new dependencies and ran a live smoke through OpenAICodexLLM with the PR defaults (LANGCHAIN_PROVIDER=openai-codex, LANGCHAIN_MODEL_NAME=openai-codex/gpt-5.1-codex, OPENAI_CODEX_BASE_URL=https://chatgpt.com/backend-api/codex/responses). The endpoint returned HTTP 400: "The 'gpt-5.1-codex' model is not supported when using Codex with a ChatGPT account." Since this model is the default in llm_providers.json, .env.example, and vibe-trading init, a user following the documented setup hits a broken provider immediately. Please switch the defaults to a model that this OAuth path actually accepts, and add a regression/smoke test around request-body model normalization so the defaults do not drift again.

Also, this PR adds ~400 lines of protected provider/auth behavior but currently only adds one env-mapping test. Before merge, please add focused tests for the new adapter pieces that can be covered without live credentials: base URL validation, build_llm() returning the Codex adapter, token-missing error path with an isolated token path, message/tool conversion, SSE parsing for text and function-call events, and non-200 HTTP error handling. The current CI would not catch the default-model failure or most adapter regressions.
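One credential-free test along the lines requested, covering SSE parsing for text deltas, might look like this sketch; the "response.output_text.delta" event type is an assumption about a Responses-style stream, not the adapter's verified wire format:

```python
import json

def parse_sse_text(lines):
    # Accumulate text deltas from "data: {...}" lines, stopping at the
    # [DONE] sentinel; non-data lines (comments, blank keep-alives) are
    # skipped.
    out = []
    for line in lines:
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        event = json.loads(payload)
        if event.get("type") == "response.output_text.delta":
            out.append(event["delta"])
    return "".join(out)

def test_parse_sse_text():
    stream = [
        'data: {"type": "response.output_text.delta", "delta": "Hel"}',
        ": keep-alive comment",
        'data: {"type": "response.output_text.delta", "delta": "lo"}',
        "data: [DONE]",
    ]
    assert parse_sse_text(stream) == "Hello"
```

Tests like this run in CI without live credentials, which is exactly what would have caught the default-model regression class of failure at the parsing layer.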

I approved the pending GitHub Actions run so CI can execute on this PR, but please do not merge this until the default model is fixed and the adapter coverage is added. Also update the PR body/test plan; it still describes the old OPENAI_OAUTH_TOKEN -> OPENAI_API_KEY implementation and says Closes #49, which is misleading while the implementation is still being corrected.

Collaborator

@warren618 warren618 left a comment


Default model fixed to the live-verified ChatGPT Codex OAuth model, adapter coverage added, PR body updated, and CI is green.

@warren618 warren618 merged commit dea99ec into HKUDS:main Apr 30, 2026
1 check passed
Collaborator

warren618 commented Apr 30, 2026

Merged, thanks for the contribution. I added a maintainer follow-up commit before merge to switch the default ChatGPT Codex OAuth model to the live-verified openai-codex/gpt-5.3-codex path and add focused adapter regression coverage. This also closes #49.



Development

Successfully merging this pull request may close these issues.

[Feature] Add Oauth option for OpenAI Account instead of API key
