An AI-powered task management REST API that automatically prioritizes tasks using a locally-running LLM (Ollama + llama3) — no API keys, no external services needed.
- You create a task with a title and description
- The API saves it instantly and returns a response with `priority: pending`
- In the background, the task is sent to llama3 running locally via Ollama
- The AI assigns a `low`/`medium`/`high` priority plus a one-line reason
- The task is updated automatically
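The create-then-prioritize flow above can be sketched in plain `asyncio` (names like `TASKS` and `ask_llm` are illustrative stand-ins; the real service wires this through FastAPI's background tasks, PostgreSQL, and an actual Ollama call):

```python
import asyncio

TASKS: dict[int, dict] = {}  # stand-in for the PostgreSQL table

async def ask_llm(title: str, description: str) -> tuple[str, str]:
    # Placeholder for the Ollama/llama3 call.
    await asyncio.sleep(0)  # simulate network latency
    return "high", "Login failures block all users."

async def prioritize(task_id: int) -> None:
    task = TASKS[task_id]
    priority, reason = await ask_llm(task["title"], task["description"])
    task["priority"] = priority
    task["ai_reasoning"] = reason

async def create_task(title: str, description: str) -> dict:
    task_id = len(TASKS) + 1
    TASKS[task_id] = {"id": task_id, "title": title,
                      "description": description,
                      "priority": "pending", "ai_reasoning": None}
    # The response goes out immediately; prioritization runs afterwards.
    asyncio.create_task(prioritize(task_id))
    return TASKS[task_id]

async def main():
    task = await create_task("Fix login bug", "Users cannot log in.")
    print(task["priority"])                  # -> pending
    await asyncio.sleep(0.01)                # let the background task run
    print(TASKS[task["id"]]["priority"])     # -> high

asyncio.run(main())
```

The key property is that the client never waits on the model: the task row exists (with `priority: pending`) before the LLM is ever contacted.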
| Layer | Technology |
|---|---|
| API Framework | FastAPI 0.115 (async) |
| Database | PostgreSQL 16 + SQLAlchemy 2 |
| AI Model | Ollama + llama3 (local) |
| Containerization | Docker + Docker Compose |
| Testing | pytest + pytest-asyncio |
| Linting | Ruff |
| CI | GitHub Actions |
```bash
git clone https://github.com/JeffiN11/smart-task-api.git
cd smart-task-api
docker compose up --build
```

API: http://localhost:8000
Docs: http://localhost:8000/docs
| Method | Endpoint | Description |
|---|---|---|
| GET | / | Health check |
| POST | /tasks/ | Create task (AI prioritizes in background) |
| GET | /tasks/ | List tasks (filter by status/priority) |
| GET | /tasks/{id} | Get single task |
| PATCH | /tasks/{id} | Update task |
| DELETE | /tasks/{id} | Delete task |
| POST | /tasks/{id}/reprioritize | Force AI re-prioritization |
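As an illustration, the status/priority filtering behind `GET /tasks/` amounts to two optional query parameters narrowing the result set (a simplified in-memory sketch; the real endpoint would filter at the SQL level via SQLAlchemy):

```python
# Simplified, in-memory version of the GET /tasks/ filters.
# Each filter is optional; omitting both returns every task.
def list_tasks(tasks, status=None, priority=None):
    return [
        t for t in tasks
        if (status is None or t["status"] == status)
        and (priority is None or t["priority"] == priority)
    ]

tasks = [
    {"id": 1, "status": "open", "priority": "high"},
    {"id": 2, "status": "done", "priority": "low"},
    {"id": 3, "status": "open", "priority": "low"},
]
print(list_tasks(tasks, status="open", priority="low"))  # only task 3 matches
```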
Create a task:

```bash
curl -X POST http://localhost:8000/tasks/ \
  -H "Content-Type: application/json" \
  -d '{"title": "Fix login bug", "description": "Users cannot log in on mobile when 2FA is enabled."}'
```

Instant response:

```json
{
  "id": 1,
  "title": "Fix login bug",
  "priority": "pending",
  "ai_reasoning": null
}
```

After the AI responds:

```json
{
  "id": 1,
  "title": "Fix login bug",
  "priority": "high",
  "ai_reasoning": "Login failures block all users from accessing the product."
}
```

Run the test suite locally:

```bash
pip install -r requirements.txt -r requirements-dev.txt
pytest -v
```

- high — Urgent, blocking others, production issues, deadlines within 24h
- medium — Important but not urgent, should be done this week
- low — Nice-to-have, minor improvements, no immediate impact
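The prompt-and-parse step that maps these rules onto the model's free-text reply can be sketched like this (the prompt wording and `parse_reply` are illustrative, not the project's exact implementation):

```python
VALID_PRIORITIES = {"low", "medium", "high"}

def build_prompt(title: str, description: str) -> str:
    return (
        "Classify this task's priority as low, medium, or high, "
        "then give a one-line reason.\n"
        f"Title: {title}\nDescription: {description}\n"
        "Answer as: <priority>: <reason>"
    )

def parse_reply(reply: str) -> tuple[str, str]:
    # LLM output is free text, so parse defensively and
    # fall back to medium on anything unexpected.
    priority, _, reason = reply.partition(":")
    priority = priority.strip().lower()
    if priority not in VALID_PRIORITIES:
        return "medium", "Could not parse model output."
    return priority, reason.strip()

print(parse_reply("high: Login failures block all users."))
```

Constraining the model to a `<priority>: <reason>` shape keeps parsing trivial while still capturing the one-line justification stored in `ai_reasoning`.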
If Ollama is unavailable, the API gracefully defaults to medium priority and never fails.
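That fallback can be sketched as a simple try/except around the model call (the function names here are hypothetical; `ask_llm` stands in for whatever contacts Ollama):

```python
def safe_prioritize(task: dict, ask_llm) -> dict:
    # ask_llm is whatever calls Ollama; any failure (connection
    # refused, timeout, malformed output) degrades to a safe default
    # instead of surfacing an error to the client.
    try:
        priority, reason = ask_llm(task["title"], task["description"])
    except Exception:
        priority, reason = "medium", "AI unavailable; defaulted to medium."
    task["priority"] = priority
    task["ai_reasoning"] = reason
    return task

def unreachable_llm(title, description):
    raise ConnectionError("Ollama not reachable")

task = {"title": "Fix bug", "description": "demo", "ai_reasoning": None}
print(safe_prioritize(task, unreachable_llm)["priority"])  # -> medium
```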