This repository contains an example implementation of TensorZero applied to a Wordle-playing agent, using bandit algorithms to select among multiple language model variants.
You can run Wordle episodes and see how different model variants perform in the UI.
It is straightforward to configure additional variants or modify the existing ones by editing the TensorZero config file (`config/tensorzero.toml`) and the associated templates.
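As a rough sketch of what that involves, a new variant is a model definition plus a variant entry for the Wordle function that points at it. The key names below are assumptions patterned on the existing entries in `config/tensorzero.toml` and typical TensorZero configs, and the model, provider, and variant names are placeholders, so mirror the real entries in the file when adding your own:

```toml
# Hypothetical extra model (placeholder names; mirror the existing model entries)
[models.my_new_model]
routing = ["openai"]

[models.my_new_model.providers.openai]
type = "openai"
model_name = "gpt-4o-mini"

# Hypothetical variant of the Wordle function that uses the model above
[functions."verifiers_v0::verifiers::wordle".variants.my_variant]
type = "chat_completion"
model = "my_new_model"

# Remember to add "my_variant" to candidate_variants (see the experimentation notes below)
```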
- Install uv (if not already installed):

  ```bash
  # macOS/Linux
  curl -LsSf https://astral.sh/uv/install.sh | sh

  # Windows
  powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
  ```

  See the uv installation docs for more options.
- Install dependencies:

  ```bash
  uv sync
  uv run vf-install wordle --from-repo --branch v0.1.6.post0
  ```
- Set up environment variables:

  ```bash
  cp .env.example .env
  # The database URLs should be correct, but you'll want an OpenRouter API key
  ```

  Required environment variables:

  - `TENSORZERO_CLICKHOUSE_URL` - ClickHouse connection URL (already filled in)
  - `TENSORZERO_POSTGRES_URL` - PostgreSQL connection URL (already filled in)
  - API keys (at least one required):
    - `OPENROUTER_API_KEY` - Get one at openrouter.ai
    - `OPENAI_API_KEY` - Get one at platform.openai.com
    - `ANTHROPIC_API_KEY` - Get one at console.anthropic.com
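  For reference, a filled-in `.env` might look roughly like this (the values below are placeholders, not the real ones; keep the database URLs provided in `.env.example`):

  ```bash
  # Database URLs: keep the values already provided in .env.example
  TENSORZERO_CLICKHOUSE_URL=<value from .env.example>
  TENSORZERO_POSTGRES_URL=<value from .env.example>

  # API keys: only one of these is strictly required
  OPENROUTER_API_KEY=...
  OPENAI_API_KEY=...
  ANTHROPIC_API_KEY=...
  ```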
- Start the services:

  ```bash
  docker compose up -d
  ```

  This starts the gateway on `localhost:3000` and the UI on `localhost:4000`.
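  To confirm the services are up before running episodes, you can check:

  ```bash
  docker compose ps
  ```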
- Run Wordle episodes:

  ```bash
  # Run a single episode
  uv run python rollout.py

  # Run multiple episodes with concurrency
  uv run python rollout.py --num-episodes 10 --concurrency 5
  ```
- View results in the UI at `localhost:4000`.
After editing the config, restart the services:

```bash
docker compose restart gateway ui
```

`rollout.py` accepts the following options:

- `--num-episodes`: Number of episodes to run (default: 1)
- `--concurrency`: Maximum concurrent episodes (default: 5)
- `--config-file`: Path to the TensorZero config (default: `config/tensorzero.toml`)
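For example, to run a larger batch against the default config (the episode and concurrency counts here are arbitrary):

```bash
uv run python rollout.py --num-episodes 50 --concurrency 10 --config-file config/tensorzero.toml
```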
The TensorZero config supports three model variants for experimentation:

- `glm`: GLM-4.5-Air via OpenRouter (free tier)
- `openai`: GPT-5.1-mini via OpenAI
- `anthropic`: Claude 4.5 Sonnet via Anthropic
By default, only the OpenAI variant is enabled. The GLM and Anthropic variants are commented out in the config.
To enable all three variants:

- Set the API keys in your `.env` file
- Uncomment the model and variant sections in `config/tensorzero.toml`:

  ```toml
  # Uncomment these model definitions
  [models.glm_4_5_air]
  routing = ["openrouter"]
  # ...

  [models.claude_4_5_sonnet]
  routing = ["anthropic"]
  # ...

  # Uncomment these variant definitions
  [functions."verifiers_v0::verifiers::wordle".variants.glm]
  # ...

  [functions."verifiers_v0::verifiers::wordle".variants.anthropic]
  # ...
  ```

- Update the `candidate_variants` list to include all active variants:

  ```toml
  candidate_variants = ["glm", "openai", "anthropic"]
  ```
The config uses track-and-stop experimentation to find the best variant among those configured.
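If you are curious what the underlying idea looks like, here is a toy, self-contained sketch of best-arm identification in the spirit of track-and-stop. It is not TensorZero's implementation: the success rates are made up, and the simple round-robin sampling and Hoeffding-style stopping rule stand in for the real algorithm's tracking and stopping criteria.

```python
import math
import random

# Made-up success probabilities for each variant (e.g., chance of solving an episode).
# These are illustrative only and say nothing about the real models.
TRUE_SUCCESS_RATES = {"glm": 0.35, "openai": 0.60, "anthropic": 0.45}


def run_episode(variant: str) -> bool:
    """Stand-in for running a real Wordle episode with the given variant."""
    return random.random() < TRUE_SUCCESS_RATES[variant]


def radius(pulls: int, total: int, delta: float = 0.05) -> float:
    """Hoeffding-style confidence radius for an empirical success rate."""
    return math.sqrt(math.log(2 * total / delta) / (2 * pulls))


counts = {v: 0 for v in TRUE_SUCCESS_RATES}
wins = {v: 0 for v in TRUE_SUCCESS_RATES}

for t in range(1, 20_001):
    # "Tracking" (simplified): keep sample counts balanced across variants.
    variant = min(counts, key=counts.get)
    counts[variant] += 1
    wins[variant] += run_episode(variant)

    if min(counts.values()) < 10:
        continue  # collect a few samples per variant before testing the stopping rule

    means = {v: wins[v] / counts[v] for v in counts}
    best = max(means, key=means.get)

    # "Stopping": halt once the best variant's lower confidence bound clears
    # every other variant's upper confidence bound.
    if all(
        means[best] - radius(counts[best], t) > means[v] + radius(counts[v], t)
        for v in counts
        if v != best
    ):
        print(f"Stopped after {t} episodes; best variant so far: {best}")
        break
else:
    print("Episode budget exhausted without a confident winner")
```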