feat: add OpenAI OAuth provider option #65
Conversation
warren618
left a comment
Thanks for taking this on. I do not think we should merge this in its current form.
The issue asks for an OpenAI account/OAuth option, but this PR only aliases OPENAI_OAUTH_TOKEN into the existing OPENAI_API_KEY path. That does not implement an OAuth flow, account linking, token refresh/expiration handling, secure local storage, or user-facing setup/error behavior. It also risks presenting a bearer token variable as OAuth support even though the runtime still treats it exactly like an API key.
Because this touches the protected provider layer, we should keep the scope stricter here. Before changing agent/src/providers, please first define the actual supported auth flow and how it integrates with the existing ChatOpenAI path. A mergeable version should include documented token source, refresh/expiry semantics, storage/setup guidance, compatibility behavior, and tests that prove it is more than an environment-variable alias.
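For concreteness, "documented token source with refresh/expiry semantics" could look something like the sketch below. The file layout, path handling, and error wording are illustrative assumptions for this review, not the project's actual design:

```python
import json
import time
from pathlib import Path

# Hypothetical token-store layout: {"access_token": "...", "expires_at": <unix ts>}.
# The schema and the error text are assumptions, not the PR's real code.

def load_oauth_token(path: Path) -> str:
    """Return a valid access token, refusing expired ones instead of
    silently passing them through like a static API key."""
    data = json.loads(path.read_text())
    if time.time() >= data["expires_at"]:
        raise RuntimeError("OAuth token expired; re-run the provider login flow")
    return data["access_token"]
```

The point is that an OAuth-backed provider must make expiry an explicit, testable failure mode rather than forwarding whatever string happens to be in an environment variable.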
Keeping #49 open for discussion; this PR should not close it as-is.
Force-pushed de913aa to 1cf2e9c
Force-pushed 1cf2e9c to 772bae2
Updated. This no longer maps OAuth tokens to OPENAI_API_KEY. The PR now adds a separate openai-codex provider.
Verification:
warren618
left a comment
Thanks for updating this beyond the original env-alias approach. The separate provider, OAuth CLI path, and restricted endpoint are the right direction, but I do not think this is merge-ready yet.
Blocking issue: the default model configured by this PR does not work with the new ChatGPT Codex account endpoint. I installed the new dependencies and ran a live smoke through OpenAICodexLLM with the PR defaults (LANGCHAIN_PROVIDER=openai-codex, LANGCHAIN_MODEL_NAME=openai-codex/gpt-5.1-codex, OPENAI_CODEX_BASE_URL=https://chatgpt.com/backend-api/codex/responses). The endpoint returned HTTP 400: "The 'gpt-5.1-codex' model is not supported when using Codex with a ChatGPT account." Since this model is the default in llm_providers.json, .env.example, and vibe-trading init, a user following the documented setup hits a broken provider immediately. Please switch the defaults to a model that this OAuth path actually accepts, and add a regression/smoke test around request-body model normalization so this does not drift again.
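A regression test along these lines could pin the default against what the endpoint accepts. The function name, the accepted-model set, and the prefix-stripping rule are assumptions sketched for this review, not the adapter's actual symbols (the accepted model shown is the one later live-verified in this thread):

```python
# Hypothetical: the adapter is assumed to strip the "openai-codex/" provider
# prefix before putting the model name in the request body.

CODEX_ACCEPTED_MODELS = {"gpt-5.3-codex"}  # assumed live-verified set

def normalize_model_name(configured: str) -> str:
    """Strip the provider prefix ('openai-codex/...') for the request body."""
    return configured.split("/", 1)[-1]

def test_default_model_is_accepted():
    default = "openai-codex/gpt-5.3-codex"  # assumed default from llm_providers.json
    assert normalize_model_name(default) in CODEX_ACCEPTED_MODELS
```

A test like this fails loudly in CI the moment the shipped default drifts away from what the Codex account endpoint supports, without needing live credentials.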
Also, this PR adds ~400 lines of protected provider/auth behavior but currently only adds one env-mapping test. Before merge, please add focused tests for the new adapter pieces that can be covered without live credentials: base URL validation, build_llm() returning the Codex adapter, token-missing error path with an isolated token path, message/tool conversion, SSE parsing for text and function-call events, and non-200 HTTP error handling. The current CI would not catch the default-model failure or most adapter regressions.
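For the SSE-parsing item specifically, a credential-free test can feed a canned stream into the parser. The event-type names below follow the OpenAI Responses streaming format; whether the adapter uses exactly these names is an assumption:

```python
import json

def parse_sse_text_deltas(raw: str) -> str:
    """Accumulate text deltas from a Responses-style SSE stream (sketch)."""
    out = []
    for line in raw.splitlines():
        if not line.startswith("data: "):
            continue  # skip event:/id:/blank framing lines
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        event = json.loads(payload)
        if event.get("type") == "response.output_text.delta":
            out.append(event.get("delta", ""))
    return "".join(out)

def test_parse_sse_text_deltas():
    raw = (
        'data: {"type": "response.output_text.delta", "delta": "Hel"}\n'
        'data: {"type": "response.output_text.delta", "delta": "lo"}\n'
        "data: [DONE]\n"
    )
    assert parse_sse_text_deltas(raw) == "Hello"
```

The same fixture style extends to function-call events and to asserting that a non-200 response raises a typed error instead of being parsed as a stream.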
I approved the pending GitHub Actions run so CI can execute on this PR, but please do not merge this until the default model is fixed and the adapter coverage is added. Also update the PR body/test plan; it still describes the old OPENAI_OAUTH_TOKEN -> OPENAI_API_KEY implementation and says Closes #49, which is misleading while the implementation is still being corrected.
warren618
left a comment
Default model fixed to the live-verified ChatGPT Codex OAuth model, adapter coverage added, PR body updated, and CI is green.
Merged, thanks for the contribution. I added a maintainer follow-up commit before merge to switch the default ChatGPT Codex OAuth model to the live-verified openai-codex/gpt-5.3-codex path and add focused adapter regression coverage. This also closes #49.
Summary
- `openai-codex` provider backed by ChatGPT/Codex OAuth via `oauth-cli-kit`
- `vibe-trading provider login openai-codex` for interactive OAuth login and token storage
- Does not use `OPENAI_API_KEY`; requests go to `https://chatgpt.com/backend-api/codex/responses` and send `chatgpt-account-id`

Closes #49.
Tests
- `python -m py_compile agent/src/providers/llm.py agent/src/providers/openai_codex.py agent/cli.py agent/api_server.py agent/src/preflight.py`
- `pytest agent/tests/test_openai_codex.py agent/tests/test_llm.py agent/tests/test_settings_api.py agent/tests/test_cli_init.py --tb=short -q`
- Live smoke: `openai-codex/gpt-5.3-codex` returned `OK` from the ChatGPT Codex OAuth endpoint