A full‑stack take‑home implementation for managing books, users, and lending operations.
- REST API is acceptable (gRPC optional).
- Each loan represents one book copy checked out by one user.
- `copies_available` is managed by the service and cannot fall below active loans.
- Books or users with active loans cannot be deleted.
- Users have a simple `role` field (`member`, `staff`, `admin`) for future access control.
- Book records can include optional `subject` and `rack_number` metadata for better in-library lookup.
- Circulation policy is configurable (max active loans per user, max loan days, overdue fine/day); see the sketch after this list.
- Onboarding supports bulk import via CSV/XLSX for books, users, and loans.
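
The borrow flow has to respect the availability and circulation limits above. A minimal sketch of that check, under assumptions (the `Book`/`Policy` shapes and default limits are simplified stand-ins, not the project's actual SQLAlchemy models or settings):

```python
from dataclasses import dataclass

# Simplified stand-ins for illustration; not the project's actual models.
@dataclass
class Book:
    copies_total: int
    copies_available: int

@dataclass
class Policy:
    max_active_loans_per_user: int = 3   # assumed default
    max_loan_days: int = 14              # assumed default

def can_borrow(book: Book, user_active_loans: int, days: int, policy: Policy) -> bool:
    """Return True only if the circulation rules above allow a new loan."""
    if book.copies_available < 1:
        return False  # availability is service-managed and never drops below zero
    if user_active_loans >= policy.max_active_loans_per_user:
        return False  # per-user active-loan cap
    if days > policy.max_loan_days:
        return False  # requested loan period exceeds the configured maximum
    return True

# Borrowing decrements copies_available; returning increments it back.
```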
- Backend: Python, FastAPI, SQLAlchemy, PostgreSQL, uv (package/runtime manager)
- Frontend: Next.js (React)
- Node.js `>=20.19` (or `>=22.12`) and npm
- Python `>=3.10,<3.12`
- PostgreSQL on `localhost:5432`
- `uv` package manager
- `backend/`: Python API server
- `frontend/`: Next.js minimal UI
From repo root:
```bash
make install
make migrate
make up
```

For a fully clean DB reset:

```bash
PGPASSWORD=<postgres_password> make reset-db CONFIRM=YES
```

Alternatively, from the repo root, use the start script:

```bash
./start.sh
```

With Makefile shortcuts:

```bash
make help
make up
```

Common variants:

```bash
# run all + playwright
./start.sh --with-e2e

# run all + headed visual demo tests
./start.sh --with-e2e-visual

# run only backend
./start.sh --backend-only

# skip dependency install and migrations
./start.sh --skip-install --skip-migrate
```

Make equivalents:

```bash
make up-e2e
make up-e2e-visual
make backend
make up-fast
make test-backend
make test-frontend-unit
make test-frontend-coverage
make test-e2e
make precommit
make clean
make reset-db CONFIRM=YES
```

Script logs are written to:

- `.run-logs/backend.log`
- `.run-logs/frontend.log`
Ensure PostgreSQL is running on localhost:5432 and create the DB/user:
```bash
psql -U postgres -h localhost -c "CREATE USER nls_user WITH PASSWORD '<your_local_password>';"
psql -U postgres -h localhost -c "CREATE DATABASE neighborhood_library OWNER nls_user;"
```

Install uv:

```bash
# macOS
brew install uv
```

Install backend dependencies:

```bash
cd backend
uv sync --group dev
```

Copy the environment template:

```bash
cp backend/.env.example backend/.env
```

Apply database migrations:

```bash
cd backend
uv run alembic -c alembic.ini upgrade head
```

Recreate the database from scratch:

```bash
PGPASSWORD=<postgres_password> dropdb -h localhost -U postgres neighborhood_library
PGPASSWORD=<postgres_password> createdb -h localhost -U postgres neighborhood_library
cd backend
uv run alembic -c alembic.ini upgrade head
```

or with Make:

```bash
PGPASSWORD=<postgres_password> make reset-db CONFIRM=YES
```

If your local Postgres trusts local connections, `PGPASSWORD` may not be required.

Run the API server:

```bash
export DATABASE_URL=postgresql+asyncpg://nls_user:<your_local_password>@localhost:5432/neighborhood_library
cd backend
uv run uvicorn app.main:app --reload
```

API docs: http://localhost:8000/docs
- `POST /auth/login`
- `GET /auth/me`
- `GET /users/me/loans`
- `GET /users/me/fine-payments`
- `POST /books`
- `GET /books`
- `POST /users`
- `GET /users`
- `POST /loans/borrow`
- `POST /loans/{loan_id}/return`
- `GET /loans`
- `GET /loans/{loan_id}/fine-summary`
- `GET /loans/{loan_id}/fine-payments`
- `POST /loans/{loan_id}/fine-payments`
- `GET /books?subject=<value>&published_year=<year>`
- `GET /loans?overdue_only=true`
- `POST /imports/books` (CSV/XLSX upload)
- `POST /imports/users` (CSV/XLSX upload)
- `POST /imports/loans` (CSV/XLSX upload)
- Imports are idempotent: existing rows are skipped and only new rows are inserted (as sketched below).
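
The idempotency of `/imports/*` can be pictured as a skip-existing pass over the uploaded file. A minimal sketch under assumptions (the natural key `isbn` and the column names are illustrative, not the actual import contract):

```python
import csv

def import_books(csv_path: str, existing_isbns: set[str]) -> list[dict]:
    """Insert only rows whose key is not already present, so re-running is a no-op."""
    new_rows: list[dict] = []
    with open(csv_path, newline="", encoding="utf-8") as handle:
        for row in csv.DictReader(handle):
            isbn = (row.get("isbn") or "").strip()  # assumed natural key column
            if not isbn or isbn in existing_isbns:
                continue  # already imported (or unkeyed): skip instead of duplicating
            existing_isbns.add(isbn)
            new_rows.append(row)
    return new_rows  # rows the service would actually insert
```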
Most endpoints now require a Bearer JWT. Roles:
- `admin`: full access (admin settings, catalog/users management, imports, circulation)
- `staff`: circulation workflow (borrow/return/fines) + read-only data needed for desk operations
- `member`: self-service view for own borrowing history, returns, and fines
  - includes fine payment history with payment mode and references
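
Role checks like these are commonly expressed as a FastAPI dependency. A minimal sketch under assumptions (the `get_current_user` stub and the `require_role` name are illustrative; the project's actual auth module may be structured differently):

```python
from fastapi import Depends, HTTPException, status

async def get_current_user() -> dict:
    # Stub: the real dependency would decode the Bearer JWT and load the user.
    return {"email": "staff@example.com", "role": "staff"}

def require_role(*allowed: str):
    """Build a dependency that rejects callers whose role is not in `allowed`."""
    async def checker(user: dict = Depends(get_current_user)) -> dict:
        if user["role"] not in allowed:
            raise HTTPException(status_code=status.HTTP_403_FORBIDDEN,
                                detail="Insufficient role")
        return user
    return checker

# Usage sketch on a route (commented out; endpoint wiring lives in the app):
# @app.post("/loans/borrow", dependencies=[Depends(require_role("admin", "staff"))])
```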
Login example:
```bash
EMAIL="<your_admin_email>"
PASSWORD="<your_admin_password>"
curl -X POST http://localhost:8000/auth/login \
  -H "Content-Type: application/json" \
  -d "{\"email\":\"$EMAIL\",\"password\":\"$PASSWORD\"}"
```

Example borrow request:

```bash
TOKEN="<paste_access_token_here>"
curl -X POST http://localhost:8000/loans/borrow \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"book_id":1,"user_id":1,"days":14}'
```

Bulk import examples:

```bash
TOKEN="<admin_token>"
curl -X POST http://localhost:8000/imports/books \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@backend/data/seed_india/books.csv"

curl -X POST http://localhost:8000/imports/users \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@backend/data/seed_india/users.csv"

curl -X POST http://localhost:8000/imports/loans \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@backend/data/seed_india/loans.csv"
```
```bash
cd frontend
npm install
```

Create `frontend/.env.local`:

```
NEXT_PUBLIC_API_BASE=http://localhost:8000
```

Run the dev server:

```bash
npm run dev
```

- Open http://localhost:3000/login
- On a fresh DB with zero users, enter Name/Email/Password and click Bootstrap Admin (First Run).
- After bootstrap, use the same credentials to sign in.
- Staff sees circulation workflow only (borrow/return/fines). Admin additionally sees Admin Settings.
- Members can sign in and are routed to `/member` for self-service loan/fine tracking.
Ensure the backend API is running and the frontend dev server is up.
```bash
cd frontend
npm install
npm run test:e2e:install
npm run test:e2e
```

If your frontend runs on a different URL:

```bash
E2E_BASE_URL=http://localhost:3000 npm run test:e2e
```

If the API is not on :8000:

```bash
E2E_API_BASE=http://localhost:8000 npm run test:e2e
```

Bootstrap-aware admin creds for first run or existing setup:

```bash
E2E_ADMIN_NAME="E2E Admin" \
E2E_ADMIN_EMAIL="admin@example.com" \
E2E_ADMIN_PASSWORD="Admin@12345" \
E2E_EXISTING_ADMIN_EMAIL="existing-admin@example.com" \
E2E_EXISTING_ADMIN_PASSWORD="Existing@12345" \
npm run test:e2e
```

Visual demo mode (headed + video + screenshots + trace):

```bash
npm run test:e2e:visual
```

Frontend unit tests:

```bash
cd frontend
npm run test:unit
npm run test:coverage
```

Coverage output is generated at `frontend/coverage/index.html`.
- Error handling covers common edge cases (borrowing unavailable books, returning twice, deleting with active loans).
- CORS is configurable via `CORS_ORIGINS` in `backend/.env` (comma-separated list).
- Login endpoint rate limiting is configurable via `AUTH_LOGIN_RATE_LIMIT_PER_WINDOW` and `AUTH_LOGIN_RATE_LIMIT_WINDOW_SECONDS`.
- Audit logs are emitted for mutating API calls and can be toggled with `AUDIT_LOG_ENABLED`.
- API response caching currently uses in-memory, process-local storage (good for local/demo runs). For multi-instance production deployments, use a shared cache like Redis to avoid stale/uneven cache behavior across nodes; a minimal sketch of the in-memory pattern follows this list.
- Frontend currently uses explicit fetch hooks/state instead of TanStack Query to keep take-home complexity controlled. If this were extended to production scale, migrating API data flows to TanStack Query would improve cache invalidation, refetch, and loading/error consistency.
- Circulation policy knobs: `CIRCULATION_MAX_ACTIVE_LOANS_PER_USER`, `CIRCULATION_MAX_LOAN_DAYS`, `OVERDUE_FINE_PER_DAY` (see the fine arithmetic sketch after this list).
- Curated onboarding CSV files are included in `backend/data/seed_india/` with Indian books/users and historical loan transactions.
- Fine payment modes supported: `cash`, `upi`, `card`, `net_banking`, `wallet`, `waiver`, `adjustment`.
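
As noted above for API response caching, a process-local store is the simplest option but is per-instance. A minimal sketch of that pattern (illustrative only; the project's actual cache keys, TTLs, and helpers are not shown here):

```python
import time

class TTLCache:
    """Process-local cache with per-entry expiry; each API process has its own copy."""

    def __init__(self, ttl_seconds: float = 30.0):
        self.ttl = ttl_seconds
        self._store: dict[str, tuple[float, object]] = {}

    def get(self, key: str):
        entry = self._store.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if time.monotonic() > expires_at:
            del self._store[key]  # expired: drop and report a miss
            return None
        return value

    def set(self, key: str, value: object) -> None:
        self._store[key] = (time.monotonic() + self.ttl, value)

# Because the dict lives inside one process, two API instances can serve different
# cached responses; a shared store such as Redis avoids that in multi-instance setups.
```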
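
For the overdue-fine knob, the arithmetic reduces to days overdue times `OVERDUE_FINE_PER_DAY`, net of recorded payments. A rough sketch under assumptions (the per-day value, rounding behavior, and whether every payment mode reduces the balance are assumptions, not confirmed behavior):

```python
from datetime import date

OVERDUE_FINE_PER_DAY = 5.0  # assumed value of the env knob

def fine_summary(due_date: date, returned_on: date | None,
                 payments: list[float], today: date) -> dict:
    """Accrue a fine for each day past the due date and subtract recorded payments."""
    end = returned_on or today
    days_overdue = max((end - due_date).days, 0)
    accrued = days_overdue * OVERDUE_FINE_PER_DAY
    paid = sum(payments)
    return {"days_overdue": days_overdue, "accrued": accrued,
            "paid": paid, "outstanding": max(accrued - paid, 0.0)}

# Example: due 2024-06-01, returned 2024-06-05, one payment of 10.0 already recorded
print(fine_summary(date(2024, 6, 1), date(2024, 6, 5), [10.0], date(2024, 6, 30)))
# -> {'days_overdue': 4, 'accrued': 20.0, 'paid': 10.0, 'outstanding': 10.0}
```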
Run backend tests:

```bash
cd backend
uv run pytest
```

Install local hooks:

```bash
uvx pre-commit install
```

Run all hooks manually:

```bash
uvx pre-commit run --all-files
```

Makefile shortcuts:

```bash
make precommit-install
make precommit
```

Coverage HTML report will be generated in `backend/htmlcov`.
- GitHub Actions workflow (`.github/workflows/ci.yml`) runs:
  - pre-commit hooks on all files
  - backend tests + coverage gate
  - frontend build
- Optional Playwright smoke test can be triggered manually via Actions → CI → Run workflow and enabling `run_e2e_smoke`.
- GitHub Actions workflow (`.github/workflows/secret-scan.yml`) runs Gitleaks on push/PR and daily schedule.
- Create a few books and users in the UI.
- Borrow a book and see availability decrease.
- Return the loan and see availability increase.