```shell
cd docker
docker-compose up --build
```

Access the API at http://localhost:8000.
Feature summaries are generated by a local model after a feature is saved. Generation runs during `kernel.terminate`, so it does not block the save response.
Steps:
- Start the local model service (`ollama` is included in `docker-compose.override.yml`).
- Pull a model inside the container (example: `ollama pull llama3.2:3b`).
- Ensure your `.env.local` points to the model server (e.g. `FEATURE_SUMMARY_BASE_URL=http://ollama:11434`).
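The steps above can be sketched as a short command sequence. This is an ops sketch, not a verified script: it assumes the Compose service is named `ollama` (as in `docker-compose.override.yml`) and that port 11434 is published to the host.

```shell
# Start only the model service in the background
# (assumes the service is named "ollama" in docker-compose.override.yml).
docker-compose up -d ollama

# Pull the example model inside the running container.
docker-compose exec ollama ollama pull llama3.2:3b

# Sanity check from the host: list the models the server knows about
# (assumes port 11434 is mapped to localhost).
curl http://localhost:11434/api/tags
```

Inside the Docker network, the app reaches the same server via the service name, which is why `.env.local` uses `http://ollama:11434` rather than `localhost`.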