feat: support Codex ChatGPT OAuth login #176

Draft

BakedSoups wants to merge 8 commits into papercomputeco:main from BakedSoups:feat/codex-oauth-login

Conversation

@BakedSoups

@BakedSoups BakedSoups commented Apr 6, 2026

This PR makes it so we can use the native ChatGPT login session for Codex, so we can do:

  codex login
  tapes start codex

instead of only having the API key as an option.

Implementing Codex needed a different flow: its ChatGPT login path is built around a local ChatGPT-style proxy surface, not the API flow used by OpenCode and Claude. So we added a Codex-specific local proxy path and a raw Responses relay, using github.com/gofiber/contrib/v3/websocket to capture and store turns.

Tapes now:

  • reads ~/.codex/auth.json
  • uses the existing OAuth session
  • routes Codex traffic to chatgpt.com
  • handles the local proxy surface Codex expects
  • captures and stores turns through a Codex-specific raw Responses
    relay
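For illustration, reading the stored session might look like this minimal sketch. The auth.json layout here (a top-level OPENAI_API_KEY plus a `tokens` object holding the OAuth tokens) is an assumption for illustration, not a verified schema — check what `codex login` actually writes:

```python
import json
from pathlib import Path


def load_codex_credentials(path=None):
    """Load the Codex CLI's stored login session from ~/.codex/auth.json.

    NOTE: the file layout (top-level OPENAI_API_KEY plus a "tokens"
    object with the OAuth tokens) is an assumption for illustration.
    """
    path = Path(path) if path else Path.home() / ".codex" / "auth.json"
    data = json.loads(path.read_text())
    tokens = data.get("tokens") or {}
    return {
        "api_key": data.get("OPENAI_API_KEY"),       # set for API-key logins
        "access_token": tokens.get("access_token"),  # set for ChatGPT OAuth logins
        "refresh_token": tokens.get("refresh_token"),
    }
```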

Test note:

  • updated pkg/credentials/codex_test.go to match current behavior:
    preserve OAuth tokens and add OPENAI_API_KEY

@bdougie bdougie marked this pull request as draft April 6, 2026 20:41
@BakedSoups
Author

Where in Tapes do Claude/OpenCode responses get turned into a ‘turn’, and can we plug Codex websocket events into that same layer?

@bdougie
Contributor

bdougie commented Apr 6, 2026

@jpmcb the OAuth auth side of this PR looks good — but there's a blocker on session recording that's worth flagging.

Codex uses WebSocket transport instead of HTTP/SSE for its response streaming. Tapes can proxy and route the WS connection fine, but we can't capture or store the WebSocket event stream, so tapes deck won't show any Codex sessions.

What's blocking: The proxy (proxy.go:handleProxy) is built entirely around HTTP request/response pairs. A WebSocket Upgrade request has no body to parse, and the bidirectional frame stream doesn't fit the ParseRequest/ParseResponse flow.

Proposed approach — use the ingest server:

  1. Add a WebSocket relay handler in the proxy that detects Upgrade: websocket, tunnels the connection to upstream, and tees frames to a recorder
  2. The recorder accumulates WS events per turn (response.created → response.done)
  3. On response.done, reconstruct equivalent ChatRequest/ChatResponse JSON and POST /v1/ingest
  4. From there, the existing ingest → worker pool → merkle DAG → storage pipeline handles everything — no changes needed downstream
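The tee in step 1 could be sketched transport-agnostically like this. The actual WebSocket read/write loop is elided; `send_upstream` and `record_event` are hypothetical callbacks standing in for the upstream tunnel and the recorder:

```python
import json


def relay_and_tee(frames, send_upstream, record_event):
    """Forward each frame upstream unchanged, and tee a parsed copy of
    every JSON event to the recorder.

    Sketch only: the surrounding WebSocket accept/dial loop is elided,
    and both callbacks are hypothetical stand-ins.
    """
    for frame in frames:
        send_upstream(frame)  # tunnel the frame to upstream untouched
        try:
            record_event(json.loads(frame))  # tee a decoded copy to the recorder
        except json.JSONDecodeError:
            pass  # non-JSON frames (pings, keepalives) are not recorded
```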

This keeps the blast radius small: merkle.Bucket stays untouched, storage is unchanged, and the new code is isolated to the transport layer.

Related: PCC-326 tracks the OAuth login side but doesn't cover WS capture. There's no ticket yet for the WebSocket recording piece — we should create one that blocks PCC-326's last acceptance criterion ("Proxy still captures and stores conversation turns").

Worth noting: getting Codex working end-to-end through tapes gives us a solid fallback when Claude is down or blocked. Having both Codex and OpenCode as alternatives means we're not single-provider dependent.

@BakedSoups
Author

BakedSoups commented Apr 6, 2026

Got it working with a Python proxy for now.

What it does:

  • listens locally on 127.0.0.1:8765
  • accepts Codex POST /v1/responses
  • rewrites that to https://chatgpt.com/backend-api/codex/responses
  • forwards the original body with minimal header changes
  • reads the streamed response events
  • reconstructs one final turn from those events
  • optionally posts that reconstructed turn to Tapes ingest

How the recording works:

  • when it sees response.created, it opens a turn
  • while the stream is active, it collects text from events like response.output_text.delta
  • when it sees response.done, response.completed, or response.failed, it closes the turn
  • it builds:
    • a minimal request: model + user message text
    • a minimal response: model + assistant text
  • then it logs turn_reconstructed
  • if --ingest-url is set, it POSTs that to /v1/ingest
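The recording logic above amounts to a small state machine, sketched here. Event payload fields such as `delta` and `response.model` are assumptions for illustration; the user text comes from the original POST body rather than from the stream:

```python
class TurnRecorder:
    """Rebuild one minimal request/response turn from streamed events.

    Event names follow the description above; exact payload field
    names are assumptions for illustration.
    """

    CLOSING = {"response.done", "response.completed", "response.failed"}

    def __init__(self, user_text):
        self.user_text = user_text  # taken from the original POST body
        self.open = False
        self.model = None
        self.parts = []

    def feed(self, event):
        """Consume one stream event; return the reconstructed turn on close."""
        etype = event.get("type")
        if etype == "response.created":
            self.open = True
            self.model = event.get("response", {}).get("model")
        elif etype == "response.output_text.delta" and self.open:
            self.parts.append(event.get("delta", ""))
        elif etype in self.CLOSING and self.open:
            self.open = False
            return {  # minimal turn, ready to POST to /v1/ingest
                "request": {"model": self.model,
                            "messages": [{"role": "user", "content": self.user_text}]},
                "response": {"model": self.model, "content": "".join(self.parts)},
            }
        return None
```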

@BakedSoups
Author

BakedSoups commented Apr 7, 2026

Abandoned the old generic OpenAI/base-url path because Codex uses
socket-backed transport patterns rather than the straightforward API
flows used by OpenCode or Claude. Switched to the probe-backed
pattern: minimal request mutation, explicit path rewrite, and
streamed turn reconstruction for storage.

@BakedSoups BakedSoups changed the title from "feat: support Codex OAuth login" to "feat: support Codex ChatGPT OAuth login" Apr 7, 2026
@jpmcb
Contributor

jpmcb commented Apr 7, 2026

> Add a WebSocket relay handler in the proxy that detects Upgrade: websocket, tunnels the connection to upstream, and tees frames to a recorder

We should not attempt to handle a websocket upgrade since we don't/won't have the RPC client/server in OpenAI's backend to handle the turns from Codex: we could in theory just stream the socket through to their backend but then that completely defeats the purpose of attempting to read/store message turns.

Codex does support HTTPS fallback when the websocket fails: this is exactly what OpenRouter does - https://openrouter.ai/docs/guides/coding-agents/codex-cli#step-3-configure-codex-for-openrouter
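For reference, the linked guide points Codex at an OpenAI-compatible HTTPS endpoint via a provider entry in ~/.codex/config.toml, roughly like this — treat the exact keys as an approximation of the guide, not a verified config:

```toml
# ~/.codex/config.toml — sketch after the linked OpenRouter guide
model_provider = "openrouter"

[model_providers.openrouter]
name = "OpenRouter"
base_url = "https://openrouter.ai/api/v1"
env_key = "OPENROUTER_API_KEY"
wire_api = "chat"   # plain HTTPS chat-completions transport, no WebSocket
```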

I wonder if the problem you ran into is this where the auth header is getting stripped after websocket falls back to HTTPS:

@BakedSoups
Author

Thanks for the tip, I'll look into implementing it that way in a new branch.
