Add pytest foundation with Testcontainers and reusable fixtures #37
saudzahirr wants to merge 6 commits into main from
Conversation
* Add conftest.py with shared fixtures: postgres_container, app, client, db_session (transactional rollback for full test isolation)
* Use Testcontainers to provision a throwaway PostgreSQL instance per session
* Define unit and integration markers with pytest filtering support
* Add smoke tests validating DB isolation, HTTP client, and marker registration
* Add pytest-asyncio, httpx, and testcontainers[postgres] as dev dependencies
* Configure asyncio_mode=auto and suppress pytest-asyncio deprecation warnings
* Fix Pydantic deprecated class Config to model_config in ErrorResponse
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the review settings.
📝 Walkthrough

Adds Testcontainers-based PostgreSQL test infrastructure, pytest config and dev deps, unit and integration smoke tests, a migration runner and per-test transactional DB sessions, and converts the deprecated Pydantic `class Config` in `ErrorResponse` to `model_config`.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Tester as Test Runner
    participant TC as Postgres Testcontainer
    participant Env as Test Env
    participant Migrator as Migration Runner
    participant App as FastAPI App
    participant Pool as psycopg ConnectionPool
    participant DB as Postgres DB
    participant Client as httpx AsyncClient
    Tester->>TC: start Postgres container
    TC-->>Tester: provide connection info
    Tester->>Env: set env vars for app using container info
    Tester->>Migrator: run migrations (migrations/*.sql) against DB
    Tester->>App: import/create app (env applied)
    App->>Pool: create ConnectionPool and attach to app.state.db_pool
    Tester->>Client: create httpx.AsyncClient bound to App
    Tester->>Pool: acquire connection for db_session
    Tester->>DB: execute test SQL within transaction (via cursor)
    DB-->>Tester: return query results
    Tester->>Pool: rollback transaction and release connection
    Client->>App: request /api/v1/health
    App->>Pool: optionally query DB or check health
    App-->>Client: return 200 OK JSON
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes

🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)

✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Resolves #8
Actionable comments posted: 4
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@backend/pyproject.toml`:
- Around line 21-23: Remove the redundant dev dependency entry
"httpx<1.0.0,>=0.25.1" from the dev-dependencies list (the duplicate shown in
the diff) since it's already declared in the main dependencies; edit the
pyproject.toml dev-dependencies block (look for the entry
"httpx<1.0.0,>=0.25.1") and delete that line so only the single main dependency
remains.
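As a sketch, the deduplicated layout could look like the following; the `[dependency-groups]` table name and the surrounding dev entries are assumptions, since the actual file may organize its dev dependencies differently:

```toml
[project]
dependencies = [
    "httpx<1.0.0,>=0.25.1",  # single source of truth for the pin
]

[dependency-groups]
dev = [
    "ruff<1.0.0,>=0.2.2",
    "coverage<8.0.0,>=7.4.3",
    # "httpx<1.0.0,>=0.25.1",  # removed: already declared in [project].dependencies
]
```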
In `@backend/tests/conftest.py`:
- Around line 45-65: Refactor the postgres_container fixture to use pytest's
yield-style teardown: replace the current setup that calls
request.addfinalizer(_stop) with a yield-based pattern inside the
postgres_container function (start the PostgresContainer, yield it to tests, and
stop it in a finally block); do the same refactor for the app fixture (the
fixture defined around lines 125-156) so both fixtures keep setup and teardown
together using try/finally around a yield instead of request.addfinalizer.
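The yield-style refactor the comment asks for can be sketched as follows; `FakeContainer` is a hypothetical stand-in for `PostgresContainer` so the sketch runs without Docker:

```python
class FakeContainer:
    """Hypothetical stand-in for PostgresContainer so no Docker is needed."""

    def __init__(self) -> None:
        self.running = False

    def start(self) -> "FakeContainer":
        self.running = True
        return self

    def stop(self) -> None:
        self.running = False


def postgres_container():
    """Fixture body; in conftest.py this would carry @pytest.fixture(scope="session")."""
    container = FakeContainer().start()
    try:
        yield container  # tests run while the fixture is suspended here
    finally:
        container.stop()  # teardown sits next to setup and always runs


# Simulate pytest driving the fixture's setup and teardown phases:
gen = postgres_container()
c = next(gen)  # setup: container started
assert c.running
try:
    next(gen)  # teardown: generator finishes, the finally block executes
except StopIteration:
    pass
assert not c.running
```

The point of the pattern is that teardown lives in the same lexical scope as setup, so it cannot drift out of sync the way a separately registered `request.addfinalizer` callback can.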
- Line 162: Remove the empty parentheses from the pytest fixture decorators so
they use the bare decorator: replace occurrences of `@pytest.fixture()` with
`@pytest.fixture` (e.g., on the test fixtures decorated with pytest.fixture in
this file) to comply with pytest/Ruff PT001 style; ensure you update every
instance (both places flagged) so no fixture decorator has empty parentheses.
In `@backend/tests/integration/test_smoke.py`:
- Around line 37-53: The two tests rely on ordering; merge the behavior of
test_db_session_rolls_back and test_db_session_is_isolated into a single test
(e.g., keep name test_db_session_rolls_back_and_isolation) that uses the same
db_session fixture to create the _test_isolation table, assert it exists inside
the transaction, then end/rollback that transaction and open a fresh db_session
(or reuse fixture behavior) to query information_schema.tables for
_test_isolation and assert it does not exist; update or remove the standalone
test_db_session_is_isolated to avoid order dependency.
ℹ️ Review info
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
⛔ Files ignored due to path filters (1)
* `uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (8)
* backend/app/schemas/response.py
* backend/pyproject.toml
* backend/tests/__init__.py
* backend/tests/conftest.py
* backend/tests/integration/__init__.py
* backend/tests/integration/test_smoke.py
* backend/tests/unit/__init__.py
* backend/tests/unit/test_smoke.py
…s for improved isolation and clarity
Actionable comments posted: 3
♻️ Duplicate comments (1)
backend/tests/integration/test_smoke.py (1)
22-50: ⚠️ Potential issue | 🟠 Major

This test bypasses the fixture rollback it claims to validate.
Line 40 manually rolls back, so the assertion checks explicit rollback behavior rather than `db_session` fixture teardown isolation. Remove the manual rollback and assert isolation at test start instead.

Proposed fix:

```diff
 @pytest.mark.integration
 def test_db_session_rolls_back_and_isolation(db_session: psycopg.Cursor) -> None:
@@
-    # Write inside the transactional db_session
+    # Verify clean state at start (proves prior test isolation)
+    db_session.execute(
+        "SELECT EXISTS ("
+        "    SELECT FROM information_schema.tables "
+        "    WHERE table_name = '_test_isolation'"
+        ")"
+    )
+    assert not db_session.fetchone()[0]
+
+    # Write inside the transactional db_session
     db_session.execute(
         "CREATE TABLE IF NOT EXISTS _test_isolation (id serial PRIMARY KEY, val text)"
     )
     db_session.execute("INSERT INTO _test_isolation (val) VALUES ('should_be_gone')")
     db_session.execute("SELECT count(*) FROM _test_isolation")
     assert db_session.fetchone()[0] == 1  # visible inside this txn
-
-    # Rollback the transaction to undo DDL + DML
-    db_session.connection.rollback()
-
-    # After rollback the table should no longer exist
-    db_session.execute(
-        "SELECT EXISTS ("
-        "    SELECT FROM information_schema.tables "
-        "    WHERE table_name = '_test_isolation'"
-        ")"
-    )
-    exists = db_session.fetchone()[0]
-    assert not exists
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@backend/tests/integration/test_smoke.py` around lines 22 - 50, The test currently calls db_session.connection.rollback(), which bypasses the fixture's teardown isolation; remove that manual rollback (the db_session.connection.rollback() call) and instead assert isolation at the start of test_db_session_rolls_back_and_isolation by verifying the _test_isolation table does not exist before creating it (use db_session.execute(...) + db_session.fetchone() to check information_schema.tables), then perform the CREATE/INSERT and assert the row is visible inside the transaction; this ensures the fixture provides isolation without manually rolling back.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@backend/tests/conftest.py`:
- Around line 80-81: Replace the use of os.environ.setdefault for test-only
environment variables so test behavior isn't influenced by the host environment:
in conftest.py change the calls that reference SECRET_KEY and PROJECT_NAME (the
lines using os.environ.setdefault("SECRET_KEY", ...) and
os.environ.setdefault("PROJECT_NAME", ...)) to unconditionally assign
deterministic test values via os.environ["SECRET_KEY"] = "<test-value>" and
os.environ["PROJECT_NAME"] = "<test-value>" (use the existing test strings),
ensuring tests always run with the same environment regardless of CI/local
overrides.
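The difference the comment is pointing at can be shown in isolation with plain `os.environ`:

```python
import os

# Simulate a host/CI environment that already defines SECRET_KEY.
os.environ["SECRET_KEY"] = "leaked-from-host"

# setdefault silently keeps the pre-existing host value...
os.environ.setdefault("SECRET_KEY", "test-secret-key-for-testing-only")
assert os.environ["SECRET_KEY"] == "leaked-from-host"

# ...while unconditional assignment makes the test env deterministic.
os.environ["SECRET_KEY"] = "test-secret-key-for-testing-only"
assert os.environ["SECRET_KEY"] == "test-secret-key-for-testing-only"
```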
- Around line 106-107: The early return on the migrations_dir existence check
should fail loudly instead of silently skipping schema setup; replace the `if
not migrations_dir.exists(): return` behavior with a hard failure (e.g., raise
FileNotFoundError or call pytest.fail) in the fixture/function that references
migrations_dir so tests stop and report the missing migrations path, and include
the migrations_dir path in the error message for debugging.
- Around line 192-201: The teardown can leak the pooled connection if
conn.rollback() (or cursor.close()) raises; adjust the fixture so
pool.putconn(conn) is guaranteed to run: after the yield wrap conn.rollback()
and cursor.close() each in their own try/except (or a combined try that catches
and logs exceptions) and then call pool.putconn(conn) in an outer finally block
so pool.putconn(conn) is executed regardless of rollback/close failures; look
for the variables/functions pool.getconn, conn.rollback, cursor.close, and
pool.putconn to apply this change.
---
Duplicate comments:
In `@backend/tests/integration/test_smoke.py`:
- Around line 22-50: The test currently calls db_session.connection.rollback(),
which bypasses the fixture's teardown isolation; remove that manual rollback
(the db_session.connection.rollback() call) and instead assert isolation at the
start of test_db_session_rolls_back_and_isolation by verifying the
_test_isolation table does not exist before creating it (use
db_session.execute(...) + db_session.fetchone() to check
information_schema.tables), then perform the CREATE/INSERT and assert the row is
visible inside the transaction; this ensures the fixture provides isolation
without manually rolling back.
ℹ️ Review info
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
⛔ Files ignored due to path filters (1)
* `uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (3)
* backend/pyproject.toml
* backend/tests/conftest.py
* backend/tests/integration/test_smoke.py
…integration tests
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@backend/tests/conftest.py`:
- Around line 194-210: Move the cursor creation into the try block so a thrown
exception from conn.cursor() won't skip the finally that returns the connection;
specifically, create cursor inside the try that currently surrounds yield (i.e.,
ensure conn.cursor() is called after entering the try), and guarantee
pool.putconn(conn) is executed in the outer finally regardless of where errors
occur. Also keep the cleanup except blocks around conn.rollback() and
cursor.close() but add debug-level logging of those exceptions (use the module
logger) instead of completely silent pass to aid debugging; reference
conn.cursor(), conn.rollback(), cursor.close(), and pool.putconn(conn) when
applying the changes.
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@backend/tests/conftest.py`:
- Around line 197-217: The finally block can raise NameError if conn.cursor()
fails because cursor is never assigned; before calling conn.cursor() set cursor
= None, and in the finally guard the close call (e.g., if cursor is not None:
cursor.close()) so cursor.close() is only invoked when assigned, ensuring
pool.putconn(conn) always runs and avoids leaking the connection; reference the
pool, conn, cursor, conn.cursor(), cursor.close(), conn.rollback(), and
pool.putconn(conn) symbols when making the change.
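Folding both rounds of teardown feedback together, a hardened `db_session` body might look like this sketch; `FakePool` and `FakeConn` are hypothetical stand-ins so the failure path can be exercised without a real database:

```python
def db_session(pool):
    """Fixture body; in conftest.py this would carry @pytest.fixture."""
    conn = pool.getconn()
    cursor = None  # pre-assign so the finally block can never hit NameError
    try:
        cursor = conn.cursor()  # inside try: a failure here still releases conn
        yield cursor
    finally:
        try:
            conn.rollback()
        except Exception:
            pass  # real code would log this at debug level, not swallow it
        if cursor is not None:
            try:
                cursor.close()
            except Exception:
                pass
        pool.putconn(conn)  # guaranteed: connection always returns to the pool


# Exercise the failure path with stand-ins:
class FakeConn:
    def cursor(self):
        raise RuntimeError("cursor creation failed")

    def rollback(self):
        pass


class FakePool:
    def __init__(self):
        self.returned = False

    def getconn(self):
        return FakeConn()

    def putconn(self, conn):
        self.returned = True


pool = FakePool()
gen = db_session(pool)
try:
    next(gen)  # conn.cursor() raises, but the finally still runs
except RuntimeError:
    pass
assert pool.returned  # the connection was not leaked
```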
we should be overriding the get_db dependency in fastapi.testclient in the fixtures, since the endpoints use dependency injection. From the perspective of an endpoint there is only a cursor object, and there need be no pool, since that is managed by the get_db dependency. Following is an example snippet which might simplify this entire file.
```python
import pytest
from testcontainers.postgres import PostgresContainer
from fastapi.testclient import TestClient
from psycopg import connect

from app.main import app
from app.api.deps import get_db


@pytest.fixture(scope="module")
def override_get_db():
    def _get_db():
        # use appropriate exception handling as necessary
        with PostgresContainer(
            "postgres:18.2-alpine3.23"
        ).with_volume_mapping(
            "path/to/migration/dir",
            "/docker-entrypoint-initdb.d"
        ) as container:
            with connect(container.get_connection_url()) as conn:
                with conn.cursor() as cursor:
                    yield cursor

    app.dependency_overrides[get_db] = _get_db
    yield
    app.dependency_overrides.pop(get_db, None)
```

…mproved isolation
Actionable comments posted: 6
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@backend/pyproject.toml`:
- Line 26: The TODO comment next to the dependency "pytest-xdist>=3.5.0,<4.0.0"
("TODO: Remove when meaningful tests are added") should be tracked: create a new
issue in the project tracker referencing that TODO and link the issue id or URL
in the pyproject.toml comment (or remove the TODO if the dependency is intended
to be permanent); update the line containing "pytest-xdist>=3.5.0,<4.0.0" to
either remove the TODO comment and leave the dependency, or replace the TODO
with "tracked in ISSUE-<id>" so the intent is not forgotten (search for the
exact string "TODO: Remove when meaningful tests are added" to locate the spot).
In `@backend/tests/conftest.py`:
- Around line 125-144: The override_get_db fixture currently creates its own
PostgresContainer per module, duplicating the session-scoped postgres_container
and conflicting with _run_migrations and _set_test_env; change override_get_db
to reuse the session-scoped postgres_container (do not create a new container or
volume mapping), build the DSN from the session container (or environment
variables set by _set_test_env), open a connection and yield a cursor from that
DSN, and keep app.dependency_overrides[get_db] pointing to this new helper that
does not manage container lifecycle so migrations from _run_migrations and the
session container remain authoritative.
- Around line 147-150: Remove the redundant scope argument from the pytest
fixture declaration for client in conftest.py: update the `@pytest.fixture`
decorator on the client(override_get_db) fixture to omit scope="function" (the
default), leaving just `@pytest.fixture` to satisfy Ruff PT003 and keep the
fixture definition in the client function unchanged.
- Around line 131-140: Add an explicit return type annotation to the inner
function _get_db: change its signature to return a typing.Generator or
typing.Iterator of the DB cursor (e.g., -> Generator[psycopg2.extensions.cursor,
None, None] or -> Iterator[psycopg2.extensions.cursor]) and add the matching
imports (typing.Generator/Iterator and the cursor type from psycopg2/psycopg).
This ensures the generator yield type is declared for _get_db and improves
IDE/type-checker support.
In `@backend/tests/integration/test_smoke.py`:
- Around line 55-65: Add a type annotation to the test_client_health function's
client parameter (change def test_client_health(client) to def
test_client_health(client: TestClient)) to match other tests; import the
TestClient symbol used by the test-suite (e.g., from starlette.testclient import
TestClient or the project's fixture type) if not already imported so the
annotation resolves, and keep the function name test_client_health unchanged.
- Around line 7-22: Add a pytest fixture named db_session (e.g., in conftest.py)
that returns a psycopg Cursor per test: annotate with `@pytest.fixture` (function
scope), create a psycopg connection to the test Postgres, start a transaction
(or ensure autocommit=False), open a cursor (cursor = conn.cursor()), yield that
cursor to tests (db_session), and after yield roll back the transaction and
close cursor/connection to ensure isolation and cleanup; ensure the fixture
signature/type hints indicate psycopg.Cursor.
ℹ️ Review info
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
⛔ Files ignored due to path filters (1)
* `uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (4)
* backend/pyproject.toml
* backend/tests/conftest.py
* backend/tests/integration/test_smoke.py
* pyproject.toml
…date pytest-xdist comment for tracking
```python
@pytest.fixture(scope="session", autouse=True)
def _set_test_env(postgres_container: PostgresContainer) -> None:
    """Override environment variables for the test session.

    Sets connection details so that ``app.core.config.Settings`` picks up
    the Testcontainers Postgres instance instead of a real database.

    Args:
        postgres_container: The running Testcontainers Postgres instance.
    """
    os.environ["POSTGRES_SERVER"] = postgres_container.get_container_host_ip()
    os.environ["POSTGRES_PORT"] = str(postgres_container.get_exposed_port(5432))
    os.environ["POSTGRES_USER"] = postgres_container.username
    os.environ["POSTGRES_PASSWORD"] = postgres_container.password
    os.environ["POSTGRES_DB"] = postgres_container.dbname
    os.environ["SECRET_KEY"] = "test-secret-key-for-testing-only"
    os.environ["PROJECT_NAME"] = "permit-test"
```
we should not bother with setting env vars since the dev environment should already have those. With that being said, the testcontainer should be providing the database connection, so these env vars should never be directly used by any code.
```python
@pytest.fixture(scope="session", autouse=True)
def _run_migrations(
    postgres_container: PostgresContainer,
    _set_test_env: None,  # ensure env is set first
) -> None:
    """Apply SQL migration files from the ``migrations/`` directory.

    Reads every ``.sql`` file in alphabetical order and executes it
    against the Testcontainers Postgres instance.

    Args:
        postgres_container: The running Testcontainers Postgres instance.
        _set_test_env: Ensures environment variables are configured first.
    """
    import pathlib

    migrations_dir = (
        pathlib.Path(__file__).resolve().parent.parent.parent / "migrations"
    )
    if not migrations_dir.exists():
        pytest.fail(
            f"Migrations directory not found: {migrations_dir}. "
            "Tests require migration SQL files to set up the database schema."
        )

    dsn = _make_dsn(postgres_container)
    with psycopg.connect(dsn) as conn:
        conn.autocommit = True
        for sql_file in sorted(migrations_dir.glob("*.sql")):
            conn.execute(sql_file.read_text())
```
this fixture is redundant since we are already mounting the migrations to /docker-entrypoint-initdb.d in the postgres container, so migrations should be applied automatically.
```toml
"ruff<1.0.0,>=0.2.2",
"prek>=0.2.24,<1.0.0",
"coverage<8.0.0,>=7.4.3",
"pytest-xdist>=3.5.0,<4.0.0",  # tracked in ISSUE-1 (see backend/ISSUE-1.md)
```
there is no such issue. Plus, we don't need a TODO to schedule its removal, as it should always be there to run tests in parallel.
```python
def override_get_db(postgres_container):
    dsn = _make_dsn(postgres_container)

    def _get_db() -> typing.Generator[Cursor, None, None]:
        with connect(dsn) as conn:
            with conn.cursor() as cursor:
                yield cursor

    app.dependency_overrides[get_db] = _get_db
    yield
    app.dependency_overrides.pop(get_db, None)
```
we should not be using a single container for the entire testing session since it might have unintended side effects when tests are run in parallel. this override should start the testcontainer for each override at the module level.
```python
    Yields:
        A running PostgresContainer instance.
    """
    container = PostgresContainer("postgres:16-alpine")
```
this image is different than the one we are using for production in docker-compose.yml. we should keep testing and production the same for consistency.
```python
def _make_dsn(container: PostgresContainer) -> str:
    dsn = container.get_connection_url()
    # Patch SQLAlchemy/Testcontainers DSN to psycopg3-compatible
    if dsn.startswith("postgresql+psycopg2://"):
        dsn = dsn.replace("postgresql+psycopg2://", "postgresql://", 1)
    return dsn
```
this functionality can be achieved by passing the optional driver as None in get_connection_url, or when creating the testcontainer, so this should be removed completely.
```python
# Load .env before importing app.main to ensure env vars are set
dotenv_path = os.path.join(os.path.dirname(__file__), "..", ".env")
load_dotenv(dotenv_path)
```
we should not load the full env here using dotenv, for obvious reasons. This might work in the local env, but .env files are not present in CI, so this would not work there.
```python
@pytest.fixture
def client(override_get_db):
    with TestClient(app) as c:
        yield c
```
this simple wrapper does the very basic thing of returning the fastapi.testclient, so instead of importing this one, one should import the testclient directly imho
```python
db_session.execute("CREATE TABLE _test_isolation (id serial PRIMARY KEY, val text)")
db_session.execute("INSERT INTO _test_isolation (val) VALUES ('should_be_gone')")
db_session.execute("SELECT count(*) FROM _test_isolation")
assert db_session.fetchone()[0] == 1  # visible inside this txn
```
this should not work in theory. the db_session is not running anything in any kind of transaction; just because autocommit is off with an always-rollback teardown does not make it a transaction.
another thing that is wrong here is that we are creating the table without specifying the schema, which makes this dependent on the schema in search_path and hence flaky.
Summary by CodeRabbit
Tests
Chores