feat: support Codex ChatGPT OAuth login #176
BakedSoups wants to merge 8 commits into papercomputeco:main from
Conversation
Where in Tapes do Claude/OpenCode responses get turned into a ‘turn’, and can we plug Codex websocket events into that same layer?
@jpmcb the OAuth auth side of this PR looks good — but there's a blocker on session recording that's worth flagging. Codex uses WebSocket transport instead of HTTP/SSE for its response streaming. Tapes can proxy and route the WS connection fine, but we can't capture or store the WebSocket event stream, so turns from Codex sessions never get recorded.

What's blocking: the proxy (

Proposed approach — use the ingest server:
This keeps the blast radius small.

Related: PCC-326 tracks the OAuth login side but doesn't cover WS capture. There's no ticket yet for the WebSocket recording piece — we should create one that blocks PCC-326's last acceptance criterion ("Proxy still captures and stores conversation turns").

Worth noting: getting Codex working end-to-end through Tapes gives us a solid fallback when Claude is down or blocked. Having both Codex and OpenCode as alternatives means we're not single-provider dependent.
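The "small blast radius" idea above could be sketched roughly like this: the proxy relays every WS frame unconditionally, and treats recording as best-effort so an ingest failure never breaks the live session. This is a hypothetical sketch — `record_turn` and the frame shape are assumptions, not the actual Tapes ingest API:

```python
import json


def relay_frame(frame: bytes, upstream_frames: list, record_turn) -> None:
    """Relay a WS frame to the upstream buffer, then record it best-effort.

    Relaying always happens first and never depends on the recorder:
    if the ingest server is down, the session keeps streaming.
    """
    upstream_frames.append(frame)  # the actual relay: unconditional
    try:
        record_turn(json.loads(frame))  # best-effort capture for Tapes
    except Exception:
        pass  # recording failures must not break the live session


# Usage: even a recorder that always raises leaves the relay intact.
sent = []


def broken_recorder(event):
    raise RuntimeError("ingest server unreachable")


relay_frame(b'{"type": "response.completed"}', sent, broken_recorder)
```

The design point is the ordering: relay first, record second, and swallow recorder errors so capture stays strictly observational.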
Got it working with a Python proxy for now:
How the recording works:
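A minimal sketch of the recording idea, assuming Codex streams Responses-API-style events (`response.output_text.delta` carrying text chunks, `response.completed` closing the turn) — the event shapes here are simplified assumptions, not the exact wire format:

```python
def assemble_turn(events):
    """Fold a stream of Responses-style events into one recorded turn.

    Returns None if the stream ends without a `response.completed` event.
    """
    text_parts = []
    for event in events:
        if event.get("type") == "response.output_text.delta":
            text_parts.append(event.get("delta", ""))
        elif event.get("type") == "response.completed":
            return {"role": "assistant", "content": "".join(text_parts)}
    return None  # stream ended mid-turn; nothing to store


turn = assemble_turn([
    {"type": "response.output_text.delta", "delta": "Hello"},
    {"type": "response.output_text.delta", "delta": ", world"},
    {"type": "response.completed"},
])
# → {"role": "assistant", "content": "Hello, world"}
```

Once a turn is assembled like this, storing it can reuse whatever path Tapes already uses for Claude/OpenCode turns.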
Abandoned the old generic OpenAI/base-url path because Codex uses
We should not attempt to handle a WebSocket upgrade, since we don't (and won't) have the RPC client/server in OpenAI's backend to handle the turns from Codex. We could in theory just stream the socket through to their backend, but that completely defeats the purpose of attempting to read/store message turns.

Codex does support HTTPS fallback when the WebSocket fails: this is exactly what OpenRouter does — https://openrouter.ai/docs/guides/coding-agents/codex-cli#step-3-configure-codex-for-openrouter

I wonder if the problem you ran into is the auth header getting stripped after the WebSocket falls back to HTTPS:
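The suggestion above could be sketched like this: refuse the WS upgrade so Codex falls back to HTTPS, and make sure the `Authorization` header survives on the fallback path. This is a hypothetical helper under those assumptions, not the actual Tapes proxy code:

```python
def handle_request(headers: dict) -> tuple:
    """Decide how the local proxy treats an incoming Codex request.

    We deliberately refuse the WebSocket upgrade (we have no RPC
    backend to speak it), which pushes the Codex CLI onto its HTTPS
    fallback; on that path the Authorization header must be forwarded
    untouched or auth breaks after the fallback.
    """
    if headers.get("Upgrade", "").lower() == "websocket":
        return 501, {}  # refuse upgrade: client retries over HTTPS
    # Forward everything except hop-specific headers like Host.
    forwarded = {k: v for k, v in headers.items() if k != "Host"}
    return 200, forwarded  # Authorization is preserved as-is


status, fwd = handle_request(
    {"Upgrade": "websocket", "Authorization": "Bearer tok"}
)
# → 501, {}
status2, fwd2 = handle_request(
    {"Host": "localhost", "Authorization": "Bearer tok"}
)
```

If the bug is the stripped auth header, the fix lives in the fallback branch: whatever rebuilds the HTTPS request has to copy `Authorization` through verbatim.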
Thanks for the tip, I'll look into implementing it that way in a new branch.
This PR makes it so we can use the native ChatGPT login session for Codex, so we can do:

instead of only having the API key as an option.
Implementing Codex needed a different flow because its ChatGPT login path is built around a local ChatGPT-style proxy surface, not the API flow used by OpenCode and Claude. So we added a Codex-specific local proxy path and a raw Responses relay using `github.com/gofiber/contrib/v3/websocket` to capture and store turns.

Tapes now:

- relay
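One way to picture the Codex-specific path described above: requests targeting the ChatGPT-style Codex surface go through the dedicated Responses relay, while everything else keeps the existing generic HTTP proxy path. The route prefixes and backend names here are illustrative assumptions, not the actual Tapes routes:

```python
def pick_backend(path: str) -> str:
    """Route a proxied request by path.

    Codex's ChatGPT-style surface gets the dedicated Responses relay
    (which captures turns); OpenCode/Claude keep the old HTTP path.
    The "/backend-api/codex" prefix is an illustrative assumption.
    """
    if path.startswith("/backend-api/codex"):
        return "codex-responses-relay"
    return "generic-http-proxy"


backend = pick_backend("/backend-api/codex/responses")
```

Keeping the split at the routing layer means the old OpenAI/base-url path is untouched for existing providers.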
Test note: preserve OAuth tokens and add `OPENAI_API_KEY`.