Texet is a small API service (web endpoints you call) that:
- accepts inbound messages for a user,
- stores them in Postgres (database),
- generates a reply using OpenAI,
- sends the reply to your SMS webhook (a URL you control).
There is no end-user UI. You interact with it via HTTP (web requests) or the console dashboard (admin web page).
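The flow above (store, generate, send) can be pictured as a tiny pipeline. This is an illustrative sketch only; the class and callables below are hypothetical stand-ins for Postgres, OpenAI, and the SMS webhook, not the app's actual modules:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Pipeline:
    """Sketch of Texet's message flow: store -> generate -> send.
    Each callable is a stand-in for a real dependency."""
    store: Callable[[str, str], None]   # persist the inbound message
    generate: Callable[[str], str]      # produce a reply (LLM)
    send: Callable[[str, str], None]    # deliver the reply to the SMS webhook

    def handle(self, user_id: str, text: str) -> str:
        self.store(user_id, text)
        reply = self.generate(text)
        self.send(user_id, reply)
        return reply
```

Swapping in stubs for the three callables makes the flow easy to exercise without any external services.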
If you are monitoring or managing, start at Console. If you are setting things up, start at Quick start and Configuration. For build/test details, see Developer details.
Local vs live: "local" means on your machine (http://localhost:8000). "live" means the deployed server URL.
Links below are local unless they say "live".
Use the console to monitor data, manage API access, and export datasets.
- Console home (local): http://localhost:8000/console
- Admin views (read-only): `/console/admin` (speakers, conversations, utterances)
- API keys: `/console/api-keys` (create keys; shown once)
- Exports: `/console/exports` (download ConvoKit corpora; ConvoKit is a conversation dataset format)
- API docs: `/console/docs` (interactive docs; try it out)

Notes:
- Console access requires `ADMIN_USERNAME`, `ADMIN_PASSWORD`, `ADMIN_SECRET_KEY` (optional env vars that enable the console/admin UI).
- API docs run against the current environment. Use test keys/data when validating changes.
- Exports are logged in the console (status, counts, verification, timestamps).
- `GET /health` - service health.
- `GET /db/health` - database health.
- `POST /response` - accepts `{ "user_id": "...", "input": "...", "mode": "text", "metadata": { ... } }` (mode is `text` today; metadata is optional extra info).

Response: `202 queued` with `{ "id", "object", "status", "conversation_id", "mode" }`.

Example (local):

```bash
curl -H "Authorization: Bearer <API_KEY>" \
  -H "Content-Type: application/json" \
  -X POST http://localhost:8000/response \
  -d '{"user_id":"u1","input":"hello","mode":"text","metadata":{"source":"sms"}}'
```

Statuses:

- `received`: inbound user message stored.
- `queued`: outbound reply persisted, pending send.
- `sent`: outbound reply delivered to the SMS webhook.
- `failed`: outbound reply failed; `error` captures the failure.
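The statuses form a simple lifecycle: an inbound message is `received`, the outbound reply is `queued`, then ends up `sent` or `failed`. A minimal sketch of that reading (our interpretation of the list above, not code from the service):

```python
# Allowed status transitions, inferred from the status descriptions above.
# `sent` and `failed` are terminal.
TRANSITIONS = {
    "received": {"queued"},
    "queued": {"sent", "failed"},
    "sent": set(),
    "failed": set(),
}

def can_transition(current: str, new: str) -> bool:
    """Return True if moving from `current` to `new` is a valid step."""
    return new in TRANSITIONS.get(current, set())
```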
Prereqs:
- Docker Desktop installed and running (runs containers locally).
- A text editor to edit `.env` files.
- An OpenAI API key and an SMS webhook URL (optional for smoke tests).
Command examples below are bash (macOS/Linux). On Windows, use WSL or PowerShell equivalents.
This README uses `make` targets for consistency; see `Makefile` for what each target runs and `docker-compose.yml` for the local services (Docker Compose runs the app stack). For dependency management and local tooling, see Package management (uv) below.
- Create local env files:

  ```bash
  cp .env.db.example .env.db
  cp .env.api.example .env.api
  ```

- Edit `.env.api` and set:

  - `OPENAI_API_KEY` and `OPENAI_MODEL` (example: `gpt-4o-mini`).
  - `SMS_OUTBOUND_URL` to your SMS webhook endpoint (for testing, a request bin works).
  - `SMS_OUTBOUND_AUTHORIZATION` if your SMS webhook expects an Authorization header.
  - If you change DB creds in `.env.db`, update `DATABASE_URL` and `DATABASE_URL_TEST` in `.env.api` to match.

- Start the stack:

  ```bash
  make start
  ```

- Apply migrations (database changes):

  ```bash
  make migrate
  ```

- Create an API key (CLI or console):

  ```bash
  make api-key
  ```

  Or open http://localhost:8000/console, click "API Keys", and create one there. Copy the printed key and keep it somewhere safe; keys are shown once. The server does not read `API_KEY` from the environment; clients send it in the `Authorization` header. For local smoke tests, export it in your shell:

  ```bash
  export API_KEY=texet_...
  ```

- Verify in a browser:

  - http://localhost:8000/health
  - http://localhost:8000/console (admin + API docs; login required)
- Send a message:

  - In `/console/docs`, use `POST /response` and set `Authorization: Bearer <API_KEY>` (bearer token = shared secret).
  - The API returns `202 queued`. The reply is sent to `SMS_OUTBOUND_URL`.

- Optional smoke test:

  ```bash
  make smoke
  ```

  If `API_KEY` is not set, the smoke test creates a temporary key for you.

- To stop everything:

  ```bash
  make down
  ```

  Note: `make reset` removes volumes, which wipes local DB data.
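The same call as the curl example, built with Python's standard library. This is a sketch: the helper name is ours, and the base URL and key are placeholders for your environment:

```python
import json
import urllib.request

def build_response_request(base_url: str, api_key: str,
                           user_id: str, text: str) -> urllib.request.Request:
    """Build the POST /response request shown in the curl example.
    Pass the result to urllib.request.urlopen() to actually send it."""
    payload = {"user_id": user_id, "input": text, "mode": "text",
               "metadata": {"source": "sms"}}
    return urllib.request.Request(
        f"{base_url}/response",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_response_request("http://localhost:8000", "texet_example", "u1", "hello")
# urllib.request.urlopen(req) returns the 202 response when the stack is up
```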
Want to visualize the local setup? Skim Makefile (command entry points) and
docker-compose.yml (services and ports). If these tools are new, the docs help:
https://docs.docker.com/get-started/, https://docs.docker.com/compose/, and
https://makefiletutorial.com/.
Required for server:
- `DATABASE_URL` - database connection string (async SQLAlchemy format).
- `OPENAI_API_KEY` - OpenAI API key.
- `OPENAI_MODEL` - model name.
- `SMS_OUTBOUND_URL` - webhook URL for outbound replies.

Client auth:
- `API_KEY` - create with `make api-key` or `/console/api-keys` (bearer token = shared secret).
- Store it securely and distribute to internal users as needed.

Optional:
- `SMS_TIMEOUT_SECONDS` - outbound HTTP timeout in seconds (default `15`).
- `SMS_OUTBOUND_AUTHORIZATION` - Authorization header value for outbound SMS (e.g., `Bearer <token>`).
- `ADMIN_USERNAME`, `ADMIN_PASSWORD`, `ADMIN_SECRET_KEY` - enable the console (admin UI).
- `ADMIN_SESSION_TTL_SECONDS` - console/admin login session lifetime in seconds (default `28800`).

Testing:
- `DATABASE_URL_TEST` - test database connection string (async SQLAlchemy format).
- Keep this separate from production data; tests wipe this database.

Timezone:
- API and database sessions default to EST (UTC-05:00).
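A quick way to sanity-check the required server variables before starting the stack (an illustrative snippet, not part of the app):

```python
import os

# The four variables listed under "Required for server" above.
REQUIRED = ["DATABASE_URL", "OPENAI_API_KEY", "OPENAI_MODEL", "SMS_OUTBOUND_URL"]

def missing_required(env: dict) -> list:
    """Return the required server variables that are absent or empty."""
    return [name for name in REQUIRED if not env.get(name)]

# Example: check the current process environment
problems = missing_required(dict(os.environ))
if problems:
    print("Missing:", ", ".join(problems))
```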
- `app/main.py`, `app/config.py`, `app/db.py` - core app setup.
- `app/response/` - response endpoint schemas + service + CRUD.
- `app/auth/` - API key auth + key creation.
- `app/console/` - console UI + exports + admin views.
- `app/models/` - models grouped by domain.
We use uv (Python package manager) to manage dependencies and tooling. Day-to-day runtime still
uses Docker Compose; use uv when you need to add/upgrade packages or run local tools.
- Install uv: https://docs.astral.sh/uv/getting-started/installation/

- Create the local environment and lockfile:

  ```bash
  uv sync
  ```

- Add a package:

  ```bash
  uv add <package>
  ```

- Upgrade a package:

  ```bash
  uv lock --upgrade-package <package>
  ```
- Migrations use Alembic (migration tool) and the `DATABASE_URL` from the running Docker Compose stack.
- Migrations are how the database schema is created or updated.
- Define or update models in `app/models/`, then generate a migration.
- Create a new migration:

  ```bash
  make migration name=add_speakers
  ```

- Apply migrations:

  ```bash
  make migrate
  ```

- If running locally (outside Docker), set `DATABASE_URL` before running Alembic.
- Background task pipeline uses Kani (LLM orchestration) with OpenAI (model provider) for reply generation.
- Requires `OPENAI_API_KEY` and `OPENAI_MODEL`.
- Failures mark the reply utterance as `failed` with an error message.
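The failure behavior can be pictured like this. A hedged sketch only: the real pipeline lives in the app's background task code, and `utterance` here is a plain dict standing in for the DB row:

```python
def generate_and_mark(utterance: dict, generate) -> dict:
    """Run reply generation; on success mark the reply `queued`
    (persisted, pending send), on any exception mark it `failed`
    and record the error message."""
    try:
        utterance["text"] = generate()
        utterance["status"] = "queued"
        utterance["error"] = None
    except Exception as exc:
        utterance["status"] = "failed"
        utterance["error"] = str(exc)
    return utterance
```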
- Run the full test suite with coverage (Docker):

  ```bash
  make test
  ```

- Requires the DB running via `make start`.
- Uses the `texet_test` database and applies Alembic migrations before running.
- See `Makefile` for the exact commands.
- Start the stack and apply migrations first:

  ```bash
  make start
  make migrate
  ```

- Run the end-to-end smoke test:

  ```bash
  make smoke
  ```

- If `SMS_OUTBOUND_URL` is empty, the smoke test expects SMS delivery to fail and asserts failed status counts instead of sent counts.

- Run lint/format/typecheck/audit:

  ```bash
  make fix
  ```

- See `Makefile` for the exact commands.
- `make start` builds and starts the stack.
- `make down` stops the stack (keeps volumes).
- `make reset` stops the stack and removes volumes.
- `make test` runs the full test suite with coverage (requires the DB running via `make start`).
- `make lint` runs lint checks (no fixes).
- `make fix` applies lint fixes and formatting.
- `make type` runs type checks.
- `make audit` runs a dependency vulnerability audit.
- `make migration name=...` creates a new Alembic revision (requires the DB running).
- `make migrate` applies Alembic migrations (requires the DB running).
- Do not commit `.env` files or real API keys. Rotate any keys that have been shared.
- Model schema informed by ConvoKit: https://convokit.cornell.edu/
- Kani docs: https://kani.readthedocs.io/