mayinx/docker-fastapi-tests


πŸ‹ Docker Exam Project: FastAPI Sentiment Test Pipeline

Tested sentiment analysis API • Python-based tests • Reproducible • CI/CD-style • One container per test suite • Shared aggregated log

🎯 What this project demonstrates

This repository implements a Docker Compose test pipeline for the sentiment analysis API image datascientest/fastapi:1.0.0.

✅ API container exposed on host port 8000 (endpoints: /status, /permissions, /v1/sentiment, /v2/sentiment)
✅ 3 separate Python test containers (one per suite) that validate:

  • Authentication (/permissions)
  • Authorization (/v1/sentiment vs /v2/sentiment)
  • Content (positive/negative score checks for given sentences)

✅ Automatic sequential execution via Compose depends_on conditions: API → Authentication → Authorization → Content
✅ LOG=1 support: all suites append into a single shared api_test.log (kept in ./shared/)
✅ setup.sh runs the whole pipeline reproducibly and produces log.txt (submission artifact)


🎭 Tech Stack

πŸ‹ Docker / Docker Compose | 🐍 Python 3.12 | 🌐 requests | βš™οΈ Makefile orchestration


🧠 Engineering Notes (Beyond Requirements): Shared, reusable test framework

While the exam only requires "3 test containers + test scripts + a shared log", I deliberately invested extra effort to keep the solution abstract, reusable, and maintainable: each suite only defines its test cases in a (more or less) DSL-like manner, while the execution (including assertions) and logging pipeline stays consistent across all suites.

What's abstracted (and why it matters)

  • Central config loading (tests/_shared/config.py)
    All suites use the same env contract (API_ADDRESS, API_PORT, LOG, LOG_PATH, HTTP_TIMEOUT) so behavior is consistent across containers and host runs.

  • One generic request runner (tests/_shared/runner.py)
    A single function executes HTTP requests, validates status codes, and (only when required) validates sentiment score direction.
    → Suites don't duplicate request/validation logic.

  • Unified, deterministic logging (tests/_shared/logging.py)
    Consistent suite headers/footers + per-test formatting for stdout and (when LOG=1) a shared append-only log file.
    → The aggregated api_test.log stays readable and stable across runs.

  • Generic params handling (tests/_shared/params.py)
    iter_params(...) normalizes suite-specific param objects (dicts, dataclasses, NamedTuples, etc.) into (key, value) pairs for logging and request execution.
    → Each suite can model its test parameters however it wants without changing the logger/runner.

  • Shared types for clarity (tests/_shared/types.py)
    Common TestCase + TestResult structures keep the contract between suite definitions and the shared engine explicit.
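To make the `iter_params` idea concrete, here is a minimal sketch of such a normalizer; the actual implementation in tests/_shared/params.py may differ in names and edge cases, and the `SentimentParams` dataclass below is purely illustrative:

```python
from dataclasses import dataclass, fields, is_dataclass
from typing import Any, Iterator, Tuple


def iter_params(params: Any) -> Iterator[Tuple[str, Any]]:
    """Normalize dicts, dataclasses, and NamedTuples into (key, value) pairs
    so the shared logger/runner never needs to know the suite's param type."""
    if params is None:
        return
    if isinstance(params, dict):
        yield from params.items()
    elif is_dataclass(params) and not isinstance(params, type):
        for f in fields(params):
            yield f.name, getattr(params, f.name)
    elif hasattr(params, "_asdict"):  # NamedTuple instances
        yield from params._asdict().items()
    else:
        raise TypeError(f"Unsupported params type: {type(params)!r}")


# Hypothetical suite-specific param object (not the repo's actual type):
@dataclass
class SentimentParams:
    username: str
    password: str
    sentence: str
```

With this in place, `iter_params({"username": "alice"})` and `iter_params(SentimentParams(...))` yield the same kind of `(key, value)` stream, so logging and request construction stay suite-agnostic.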

Result

Each suite module focuses only on:

  • defining test cases (endpoint + params + expected outcomes)
  • invoking the shared runner/logger
  • returning an exit code suitable for CI/CD

Everything else (config, readiness waiting, request execution, output format, file logging) is handled once in tests/_shared/.
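Under these conventions, a suite module reduces to roughly the following skeleton. Names and the tuple-based case format are illustrative, not the repo's actual API; a real suite would pass a requests-based `send` callable instead of the injected test double:

```python
import sys
from typing import Callable, Iterable, Tuple

# Illustrative case format: (endpoint, query params, expected HTTP status).
Case = Tuple[str, dict, int]

CASES = [
    ("/permissions", {"username": "alice", "password": "wonderland"}, 200),
    ("/permissions", {"username": "clementine", "password": "mandarine"}, 403),
]


def run_cases(cases: Iterable[Case], send: Callable[[str, dict], int]) -> int:
    """Minimal stand-in for the shared runner: `send` performs the HTTP call and
    returns a status code; the real runner also logs and validates score direction."""
    failures = 0
    for endpoint, params, expected in cases:
        if send(endpoint, params) != expected:
            failures += 1
    return failures


def main(send: Callable[[str, dict], int]) -> int:
    failures = run_cases(CASES, send)
    return 0 if failures == 0 else 1  # CI/CD-friendly exit code


# Entry point sketch:
# if __name__ == "__main__":
#     sys.exit(main(requests_based_send))
```

Compose's `service_completed_successfully` condition keys off exactly this exit code, which is what makes the sequential pipeline work.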


πŸ—οΈ Architecture (Pipeline + Shared Log)

                                (host)
                       ./shared/  +  ./log.txt
                          ▲               ▲
                          │               │  snapshot copy
         bind mount       │               └─ setup.sh / make snapshot-log
      ./shared:/shared    │
                          │
+-------------------------------------------------------------------------------+
|                          docker compose project                               |
|                                                                               |
|   +--------------------------+            +------------------------------+    |
|   |        API service        |<---------->|   internal network           |   |
|   |  datascientest/fastapi    |  HTTP      |   sentiment_net (DNS: api)   |   |
|   |  host 8000 -> :8000       |            +------------------------------+   |
|   +-------------+------------+                                                |
|                 ^                                                             |
|                 |  (all test suites call http://api:8000/...)                 |
|                 |                                                             |
|   +-------------+--------------------------------------------------------+    |
|   |                                                                      |    |
|   |   +-------------------+     +-------------------+     +----------------+  |
|   |   | auth_test (suite) | --> | authz_test (suite)| --> | content_test    | |
|   |   | /permissions      |     | /v1 + /v2 access  |     | /v1 + /v2 score | |
|   |   +---------+---------+     +---------+---------+     +--------+--------+ |
|   |             |                       |                        |            |
|   |             | append                | append                 | append     |
|   |             v                       v                        v            |
|   |       +--------------------------------------------------------------+    |
|   |       |          shared bind mount: ./shared : /shared               |    |
|   |       |          aggregated log:    /shared/api_test.log             |    |
|   |       +--------------------------------------------------------------+    |
|   |                                                                      |    |
|   +----------------------------------------------------------------------+    |
|                                                                               |
+-------------------------------------------------------------------------------+

Sequential order is enforced by docker-compose `depends_on` conditions:
- `auth_test` waits for `api` to start (service_started) + polls /status until ready
- `authz_test` starts only after `auth_test` finished successfully (service_completed_successfully)
- `content_test` starts only after `authz_test` finished successfully (service_completed_successfully)
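In docker-compose.yml terms, this ordering boils down to conditions of roughly the following shape (service names taken from the diagram; the repo's actual file carries more detail such as build contexts, environment, and volumes):

```yaml
services:
  api:
    image: datascientest/fastapi:1.0.0
    ports:
      - "8000:8000"

  auth_test:
    depends_on:
      api:
        condition: service_started   # readiness is then confirmed by polling /status

  authz_test:
    depends_on:
      auth_test:
        condition: service_completed_successfully

  content_test:
    depends_on:
      authz_test:
        condition: service_completed_successfully
```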

All suites append into the same shared file: /shared/api_test.log
At the end, setup.sh snapshots it to ./log.txt (exam artifact).
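The "polls /status until ready" step exists because `service_started` only guarantees the container is up, not that the API is accepting requests. A dependency-free sketch of such a readiness wait (the repo's actual helper in tests/_shared/ may look different and uses `requests`):

```python
import time
import urllib.request  # stdlib here to keep the sketch dependency-free


def wait_for_api(url: str, timeout: float = 30.0, interval: float = 0.5) -> bool:
    """Poll a status URL until it answers 200, or give up after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except OSError:  # connection refused, DNS not resolvable yet, timeouts, ...
            pass
        time.sleep(interval)
    return False


# Inside the compose network the API is reachable via its service DNS name:
# wait_for_api("http://api:8000/status")
```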

πŸ“ Project Structure (high level)

.
├── docker-compose.yml
├── Makefile
├── setup.sh
├── README.md
├── log.txt                  # exam artifact (snapshotted from ./shared/api_test.log)
├── docs/
│   └── IMPLEMENTATION.md
├── shared/
│   └── api_test.log         # aggregated suite logs (written by test containers when LOG=1)
└── tests/
    ├── _shared/             # common helpers (config, logging, readiness, runner, types)
    ├── authentication/
    │   ├── Dockerfile
    │   └── test_authentication.py
    ├── authorization/
    │   ├── Dockerfile
    │   └── test_authorization.py
    └── content/
        ├── Dockerfile
        └── test_content.py

🚀 Quick Start (Exam Runner)

1) Run the full pipeline (build → start → test → snapshot log → cleanup)

./setup.sh

This will:

  • reset to a clean state (containers/ports/logs)
  • start the API + test containers
  • run suites in order: AUTHENTICATION → AUTHORIZATION → CONTENT
  • write the aggregated log to ./shared/api_test.log (exam requirement via LOG=1)
  • copy it to ./log.txt (submission artifact)
  • stop everything (rerun-safe)

🔎 Manual API sanity checks (optional)

curl -s "http://localhost:8000/status"; echo
curl -s -o /dev/null -w "%{http_code}\n" "http://localhost:8000/docs"

✅ Most useful Make targets

  • make start-project — start stack (detached) and build images
  • make stop-project — stop stack (normal down)
  • make stop-all — stop stack + remove orphans (quiet + idempotent)
  • make reset — guaranteed clean state (stop-all + kill-api + free-port-8000 + reset-logs)
  • make logs — follow logs for the whole stack
  • make logs-auth / make logs-authz / make logs-content — print suite logs (tail)
  • make snapshot-log — copy ./shared/api_test.log → ./log.txt

🧾 Implementation log

Instead of maintaining a separate README_student.md, this project keeps a single detailed build diary:

➡️ See docs/IMPLEMENTATION.md for step-by-step implementation notes, decisions, and commands.


βš–οΈ Notes on portability (UID/GID + bind mounts)

The test containers write into a bind-mounted folder (./shared:/shared).
To avoid root-owned files on the host, the test services run as the host user:

  • setup.sh exports HOST_UID and HOST_GID
  • docker-compose.yml uses user: "${HOST_UID}:${HOST_GID}" for each test service

This keeps ./shared/api_test.log writable and removable without sudo, and makes reruns deterministic.
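Sketched in compose terms (the mechanism is exactly as described above; the actual docker-compose.yml and setup.sh may differ in detail):

```yaml
# docker-compose.yml (per test service)
services:
  auth_test:
    user: "${HOST_UID}:${HOST_GID}"   # write /shared files as the host user, not root
    volumes:
      - ./shared:/shared

# setup.sh exports the variables before `docker compose up`, e.g.:
#   export HOST_UID="$(id -u)"
#   export HOST_GID="$(id -g)"
```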


APPENDIX: Original Exam Brief (excerpt)

Goal: Build a small CI/CD-style Docker Compose pipeline that automatically tests a provided sentiment analysis FastAPI container image.

  • API image: datascientest/fastapi:1.0.0
  • Endpoints: /status, /permissions, /v1/sentiment, /v2/sentiment
  • Pipeline requirement: Docker Compose must launch 4 containers total:
    • 1× API container
    • 3× separate test containers (Authentication, Authorization, Content), one Python test suite per container
  • Logging requirement: When LOG=1, each suite must append its report into api_test.log (single aggregated file)
  • Expected test coverage:
    • Authentication: /permissions returns 200 for alice:wonderland and bob:builder, and 403 for clementine:mandarine
    • Authorization: bob can use v1 only, alice can use v1 and v2
    • Content: using alice, sentences "life is beautiful" (positive score) and "that sucks" (negative score) must be validated for both v1 and v2
  • Final deliverables include: docker-compose.yml, Python test scripts, Dockerfiles, setup.sh, and a submission log.txt containing the aggregated results.
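The content checks reduce to validating only the sign of the returned score. A minimal sketch of that logic plus the shape of the request, assuming the API takes username/password/sentence as query parameters and returns a JSON body with a "score" key (both assumptions; verify against the actual image):

```python
def score_direction_ok(score: float, expected: str) -> bool:
    """Per the brief: 'life is beautiful' must score positive, 'that sucks' negative."""
    if expected == "positive":
        return score > 0
    if expected == "negative":
        return score < 0
    raise ValueError(f"unknown expectation: {expected!r}")


def fetch_score(version: str, sentence: str) -> float:
    """Shape of the request only; parameter names and response schema are assumptions."""
    import requests  # project dependency

    resp = requests.get(
        f"http://api:8000/{version}/sentiment",
        params={"username": "alice", "password": "wonderland", "sentence": sentence},
        timeout=5,
    )
    resp.raise_for_status()
    return float(resp.json()["score"])  # "score" key assumed; check the real payload
```

Each sentence is then checked against both /v1/sentiment and /v2/sentiment, so four content assertions in total.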

✅ Deliverables checklist (Exam Requirements)

  • ✅ docker-compose.yml contains the sequence of tests (API + 3 suites)
  • ✅ Python test files for Authentication / Authorization / Content
  • ✅ Dockerfiles to build each test image
  • ✅ setup.sh to build + launch the compose pipeline
  • ✅ log.txt containing the aggregated logs (snapshotted from ./shared/api_test.log)
  • ✅ Optional remarks file: docs/IMPLEMENTATION.md

About

Containerized CI/CD-style test pipeline for a FastAPI sentiment analysis API: Docker Compose spins up the API plus 3 dedicated test containers (authentication, authorization, content) executed sequentially, with optional aggregated logging and a reproducible setup.sh runner.
