Merged
14 changes: 14 additions & 0 deletions .dockerignore
Original file line number Diff line number Diff line change
@@ -0,0 +1,14 @@
.env
.git
.github
docs
*.md
backend/venv
backend/__pycache__
backend/*.pyc
backend/data/sessions
backend/data/fastf1-cache
backend/data/pit_loss_raw.json
frontend/node_modules
frontend/.next
frontend/out
25 changes: 25 additions & 0 deletions .env.example
@@ -0,0 +1,25 @@
# F1 Replay Timing — Environment Variables
# Copy to .env and fill in your values

PORT=8000
DATA_DIR=/data

# Storage: "local" (default) or "r2"
# Local stores computed session data on disk. R2 reads pre-computed data from Cloudflare R2.
STORAGE_MODE=local

# Cloudflare R2 (only needed when STORAGE_MODE=r2)
R2_ACCOUNT_ID=
R2_ACCESS_KEY_ID=
R2_SECRET_ACCESS_KEY=
R2_BUCKET_NAME=f1timingdata

# Optional — authentication
# AUTH_ENABLED=true
# AUTH_PASSPHRASE=your-passphrase

# Optional — photo sync feature (requires OpenRouter API key)
# OPENROUTER_API_KEY=your-key-here

# Only needed for pre-compute script (not required at runtime)
# FASTF1_CACHE_DIR=/data/fastf1-cache
11 changes: 0 additions & 11 deletions .github/workflows/pr-checks.yml
@@ -30,14 +30,3 @@ jobs:
- name: npm audit
run: npm audit --audit-level=high

osv-scan:
name: Google OSV Scan
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4

- uses: google/osv-scanner-action/osv-scanner-action@v2
with:
scan-args: |-
--recursive
.
23 changes: 6 additions & 17 deletions .github/workflows/publish-docker.yml
@@ -1,14 +1,12 @@
name: Publish Docker Images
name: Publish Docker Image

on:
push:
tags:
- "v*"
branches:
- dev # TEMPORARY: remove before release
release:
types: [published]

env:
REGISTRY: ghcr.io
IMAGE_NAME: f1replaytiming

jobs:
build-and-push:
@@ -17,14 +15,6 @@ jobs:
contents: read
packages: write

strategy:
matrix:
include:
- image: f1replaytiming-backend
context: ./backend
- image: f1replaytiming-frontend
context: ./frontend

steps:
- name: Checkout
uses: actions/checkout@v4
@@ -40,7 +30,7 @@ jobs:
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ env.REGISTRY }}/${{ github.repository_owner }}/${{ matrix.image }}
images: ${{ env.REGISTRY }}/${{ github.repository_owner }}/${{ env.IMAGE_NAME }}
tags: |
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
@@ -52,11 +42,10 @@ jobs:
- name: Build and push
uses: docker/build-push-action@v6
with:
context: ${{ matrix.context }}
context: .
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
build-args: ${{ matrix.build-args || '' }}
platforms: linux/amd64,linux/arm64
cache-from: type=gha
cache-to: type=gha,mode=max
37 changes: 36 additions & 1 deletion CHANGELOG.md
@@ -2,11 +2,46 @@

All notable changes to F1 Replay Timing will be documented in this file.

## 2.0.0

### Migrating from v1.x

The Docker image has changed. The old `f1replaytiming-backend` and `f1replaytiming-frontend` images will no longer receive updates. The new image is `ghcr.io/adn8naiagent/f1replaytiming:latest` (single unified image).

To migrate:
1. Pull the new image: `docker pull ghcr.io/adn8naiagent/f1replaytiming:latest`
2. Copy `.env.example` to `.env` and configure (most defaults work out of the box)
3. Replace your `docker-compose.yml` with the one from the repo; it now defines a single service on one port
4. `docker compose up`

You no longer need `NEXT_PUBLIC_API_URL`, `FRONTEND_URL`, or any CORS settings. Your session data volume carries over as-is; no reprocessing is needed.
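
The single-service setup described above can look like this. This is a hypothetical minimal compose file, not the repo's actual `docker-compose.yml`; the volume path and tag are assumptions based on the `.env.example` defaults (`PORT=8000`, `DATA_DIR=/data`):

```yaml
# Hypothetical minimal compose file for the unified v2 image.
# Volume path and tag are illustrative; check the repo's actual file.
services:
  f1replaytiming:
    image: ghcr.io/adn8naiagent/f1replaytiming:latest
    ports:
      - "8000:8000"   # single port serves both the frontend and the API
    env_file: .env
    volumes:
      - ./data:/data  # existing session data carries over as-is
```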

### Breaking Changes
- **Single container architecture** — frontend and backend are now merged into a single Docker container serving everything from one port. The separate frontend and backend containers have been removed
- **Simplified configuration** — all config is now in a single `.env` file. `NEXT_PUBLIC_API_URL`, `FRONTEND_URL`, and CORS configuration are no longer needed
- **Static frontend** — Next.js switched from `output: 'standalone'` to `output: 'export'`, producing static HTML/CSS/JS served by FastAPI. No Node.js runtime in the final image
- **URL format change** — dynamic routes (`/replay/2026/5`) replaced with query parameters (`/replay?year=2026&round=5&type=R`). Old URLs redirect automatically
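
The URL translation above can be sketched as a small helper. This is illustrative only: the function name is invented, and the real redirect is handled by the app's routing, not this helper.

```python
# Hypothetical helper illustrating the URL format change:
# old dynamic routes like /replay/2026/5 map to query parameters.
from urllib.parse import urlencode

def legacy_to_query_url(path: str, default_type: str = "R") -> str:
    """Translate /replay/<year>/<round> to /replay?year=..&round=..&type=.."""
    parts = path.strip("/").split("/")
    if len(parts) != 3 or parts[0] != "replay":
        raise ValueError(f"not a legacy replay path: {path}")
    _, year, rnd = parts
    return "/replay?" + urlencode({"year": year, "round": rnd, "type": default_type})

print(legacy_to_query_url("/replay/2026/5"))
# → /replay?year=2026&round=5&type=R
```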

### Improvements
- **No CORS** — frontend and API are the same origin, eliminating all cross-origin issues
- **Reverse proxy friendly** — single port means Traefik, nginx, and Cloudflare tunnels just work with no special configuration
- **WebSocket reliability** — same-origin WebSocket connections no longer break behind TLS termination or mixed protocol proxies
- **Screen wake lock** — prevents screen dimming and device sleep during replay playback and live sessions

---

## 1.3.2.2

### Fixes
- **Replay timing drift** — replaced fixed-duration sleeps with wall-clock-anchored playback to prevent timing drift during sessions (contributed by [@stephenwilley](https://github.com/stephenwilley))

---

## 1.3.2.1

### Fixes
- **Lap analysis lap number** — fixed display of incomplete data for the in-progress lap; only completed laps are now shown
- **Mobile lap analysis scroll** — section is now scrollable on mobile


---

35 changes: 35 additions & 0 deletions Dockerfile
@@ -0,0 +1,35 @@
# ── Stage 1: Build frontend ──
FROM node:20-alpine AS frontend
WORKDIR /app
COPY frontend/package.json frontend/package-lock.json* ./
RUN npm ci
COPY frontend/ ./
RUN npm run build
# Produces /app/out/ with static HTML/CSS/JS

# ── Stage 2: Production ──
FROM python:3.11-slim

WORKDIR /app

# System deps (numpy/pandas, HEIC support)
RUN apt-get update && apt-get install -y --no-install-recommends \
gcc g++ libheif-dev && \
rm -rf /var/lib/apt/lists/*

COPY backend/requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY backend/ .

# Copy frontend build output
COPY --from=frontend /app/out /app/static

# Create data directory
RUN mkdir -p /data/fastf1-cache

EXPOSE 8000

ENV PORT=8000
ENV STATIC_DIR=/app/static
CMD sh -c "cp -n /app/data/pit_loss.json /data/pit_loss.json 2>/dev/null; uvicorn main:app --host 0.0.0.0 --port $PORT"