structured chat data and Codex sessions in the browser.
| Feature                     | Description                                                                                                                 |
| :-------------------------- | :------------------------------------------------------------------------------------------------------------------------- |
| Harmony conversation viewer | Renders Harmony conversations with support for different message types and metadata. |
| Codex session viewer | Detects Codex session JSONL files, converts them into a conversation, and renders them in the same viewer. |
| Codex sessions browser | Lists local Codex sessions with pagination, prompt/response filters, fuzzy search, and full-text search over indexed turns. |
| Flexible loading | Loads data from the clipboard, local `.json` or `.jsonl` files, or public HTTP(S) JSON/JSONL URLs. |
| Markdown and HTML rendering | Renders markdown in message content, including formulas and optional HTML blocks. |
| Translation | Translates non-English text into English in normal mode or frontend-only mode with a user-provided OpenAI API key. |
There are two ways you can use Euphony.
3. If the conversation is stored in some top-level field → Euphony renders all conversations and treats other top-level fields as each conversation’s metadata
4. Else → Euphony renders the data as raw JSON objects
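As an illustration of rule 3, a payload like the following would be rendered as one conversation with metadata. The field names here are hypothetical examples, not a schema Euphony requires:

```bash
# Hypothetical example of rule 3: a top-level field holds the conversation,
# and the sibling fields become that conversation's metadata.
cat > /tmp/example-conversation.json <<'EOF'
{
  "conversation": [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi! How can I help?"}
  ],
  "model": "gpt-example",
  "created_at": "2024-01-01T00:00:00Z"
}
EOF
```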

### Browse Local Codex Sessions

When you run the local backend, Euphony also exposes a dedicated sessions page
at `/sessions.html`.

The sessions page can:

- paginate through local session logs under `~/.codex/sessions` or
`CODEX_HOME/sessions`
- show each session's first prompt, last prompt, last response preview, and
timestamps
- open any session directly in the main Euphony viewer
- filter by:
- first prompt
- last prompt
- last response
- search across sessions with three modes:
- `Exact`: keyword matching on session summary fields such as prompts,
response preview, `cwd`, session id, and file path
- `Fuzzy`: typo-tolerant matching on those same summary fields
- `Full text`: indexed search across multi-turn user and assistant text from
the whole session

The full-text index is built lazily the first time you run a full-text search
and is stored locally at `~/.codex/euphony-codex-sessions.sqlite3` or inside
`CODEX_HOME` when that environment variable is set.
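If you want to check whether the index has been built, you can look for the database file directly. This is a sketch: it only resolves the path described above and lists tables with the `sqlite3` CLI; the table layout inside the database is not documented here.

```bash
# Resolve the index location: CODEX_HOME/euphony-codex-sessions.sqlite3 when
# CODEX_HOME is set, otherwise ~/.codex/euphony-codex-sessions.sqlite3.
DB="${CODEX_HOME:-$HOME/.codex}/euphony-codex-sessions.sqlite3"
if [ -f "$DB" ]; then
  # List the tables in the index; requires the sqlite3 CLI.
  sqlite3 "$DB" '.tables'
else
  echo "full-text index not built yet: $DB"
fi
```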

### Integrate Euphony into My Web App

#### Web Component API
The current backend includes a remote URL fetch path for loading JSON and JSONL
To develop Euphony locally, install Node.js and a package manager such as
[pnpm](https://pnpm.io/).

For the optional FastAPI backend, use Python 3.9 or newer and install the
Python dependencies from `pyproject.toml`. Euphony currently pins
`openai-harmony==0.0.4` because newer releases may require a local Rust
toolchain during installation in some environments.

After installing dependencies, the easiest way to run Euphony locally is with
`./start.sh`.

Start in the default production-style local mode:

```bash
python3.9 -m pip install -e .
pnpm install
pnpm run build
./start.sh
```

This mode starts only the FastAPI server on `http://127.0.0.1:8020` and serves
the prebuilt frontend from `dist/`, including `http://127.0.0.1:8020/sessions.html`.

Use this mode when you mainly want to open the local app and the sessions page
without running a separate Vite dev server.
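A quick smoke test for this mode, assuming `curl` is available and `./start.sh` is already running on the default port:

```bash
# Both requests should report HTTP 200 once the backend is serving dist/.
curl -sI http://127.0.0.1:8020/ | head -n 1
curl -sI http://127.0.0.1:8020/sessions.html | head -n 1
```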

Start in explicit development mode:

```bash
./start.sh dev
```

Development mode starts both:

- the backend with `uvicorn --reload`
- the Vite frontend dev server with hot module reload

Use this mode when you are changing frontend or backend code and want immediate
reloads while developing.

You can override ports in either mode:

```bash
BACKEND_PORT=8025 FRONTEND_PORT=3005 ./start.sh dev
```

`./start.sh` also accepts `MODE=prod` or `MODE=dev` if you prefer environment
variables over positional arguments.

Common local URLs:

- backend-served app: `http://127.0.0.1:8020/`
- backend-served sessions page: `http://127.0.0.1:8020/sessions.html`
- frontend dev server in `./start.sh dev`: `http://127.0.0.1:3000/`

If you prefer to launch the pieces manually, use the commands below.

Start the backend server:

```bash
python3.9 -m pip install -e .
pnpm install
uvicorn fastapi-main:app --app-dir server --host 127.0.0.1 --port 8020 --reload
```

The first backend Harmony render may download `o200k_base.tiktoken` from
`https://openaipublic.blob.core.windows.net/encodings/`. If your environment
cannot reach that URL, set `TIKTOKEN_ENCODINGS_BASE` to a local directory that
already contains the tokenizer file.
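For example, you could pre-seed a local directory like this. The cache path used here is an arbitrary choice, not something Euphony or tiktoken requires:

```bash
# Download the tokenizer file once, then point tiktoken at the directory.
mkdir -p "$HOME/.cache/tiktoken-local"
curl -fsSL -o "$HOME/.cache/tiktoken-local/o200k_base.tiktoken" \
  "https://openaipublic.blob.core.windows.net/encodings/o200k_base.tiktoken" \
  || echo "download failed; copy o200k_base.tiktoken into the directory manually"
export TIKTOKEN_ENCODINGS_BASE="$HOME/.cache/tiktoken-local"
```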

Start the frontend development server:

```bash
pnpm run dev
```