Two Python microservices (Orders + Analytics) that communicate via REST, backed by a PostgreSQL relational database. Each service is Dockerized, unit-tested, and has a CI pipeline that runs tests and security scans.
This repository contains two containerized RESTful backend microservices—an Orders Service and an Analytics Service—built using FastAPI and Docker with PostgreSQL integration. The project also includes a CI/CD pipeline implemented with GitHub Actions that automatically runs tests, builds Docker images, and performs security analysis.
Project Components:
- Orders Microservice
- Analytics Microservice
- PostgreSQL Database
- Alembic Migrations
- Multi-stage Docker Builds
- Docker Compose Orchestration
- GitHub Actions for CI
The Orders Microservice implements a basic RESTful API (FastAPI) that performs CRUD operations against an orders database. (Future work includes adding user authentication for additional security.)
The Analytics Microservice communicates with the Orders Service via API calls to retrieve order data and provide calculated statistical metrics, including total orders, revenue statistics, average order value, and order distribution. (Future work includes creating a basic GUI to display these results in an interactive environment.)
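The aggregation step could look roughly like the function below. The order-dict shape and field names (`item`, `quantity`, `price`) are assumptions for illustration; in the running system the list would come from an HTTP call to the Orders Service:

```python
from collections import Counter
from statistics import mean

def compute_metrics(orders: list[dict]) -> dict:
    """Aggregate basic statistics over orders fetched from the Orders Service.

    Each order is assumed to look like {"item": str, "quantity": int, "price": float}.
    """
    if not orders:
        return {"total_orders": 0, "total_revenue": 0.0,
                "average_order_value": 0.0, "order_distribution": {}}
    # Monetary value of each order.
    values = [o["price"] * o["quantity"] for o in orders]
    return {
        "total_orders": len(orders),
        "total_revenue": sum(values),
        "average_order_value": mean(values),
        # How many orders each item appears in.
        "order_distribution": dict(Counter(o["item"] for o in orders)),
    }
```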
The services and database are coordinated using Docker Compose, allowing the entire system to be started with a single command.
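A Compose file for this layout might look roughly like the sketch below. Service names, build paths, ports, and credentials are illustrative assumptions, not the repository's actual configuration (the ports match the local URLs listed later in this README):

```yaml
# docker-compose.yml (illustrative sketch, not the repo's actual file)
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: app
      POSTGRES_PASSWORD: app
      POSTGRES_DB: orders
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U app"]
      interval: 5s
      retries: 5

  orders:
    build: ./orders_service
    ports:
      - "8000:8000"
    depends_on:
      db:
        condition: service_healthy

  analytics:
    build: ./analytics_service
    ports:
      - "8001:8001"
    depends_on:
      - orders
```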
The repository includes a CI pipeline built with GitHub Actions that runs automatically to enforce test validation, Docker image builds, and code quality checks.
The pipeline performs the following:
- Tests - runs unit and integration tests using Pytest on every push and PR.
- Pylint - lints the code using Pylint on every push and PR.
- Docker Build - builds and uploads the PROD Docker image to DockerHub on every PR to main.
- CodeQL - uses GitHub's built-in CodeQL for static security analysis.
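A trimmed-down workflow covering the test and lint jobs might look like this; the workflow file name, job name, and lint target are assumptions, not the repository's actual workflow:

```yaml
# .github/workflows/ci.yml (illustrative sketch)
name: CI

on:
  push:
  pull_request:

jobs:
  test-and-lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install dependencies
        run: pip install -r requirements-dev.txt
      - name: Lint with Pylint
        run: pylint app
      - name: Run tests
        run: pytest
```

The real pipeline adds separate jobs for the Docker image build/push and CodeQL analysis.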
Technology Stack:
- FastAPI
- SQLAlchemy
- Pydantic
- Alembic
- PostgreSQL
- Docker
- GitHub Actions
- Pytest
This project was built to gain hands-on experience with constructing a CI/CD pipeline from the ground up and with connecting microservices into a scalable architecture.
LinkedIn: Willow Connelly
GitHub: WiIliu
{name}_service/                         # the service folder
├── alembic/                            # holds all database migrations
├── app/                                # code for the API and supporting elements
│   ├── main.py
│   ├── api/
│   │   └── v1/
│   │       ├── {service_routes}.py
│   │       └── health.py               # healthcheck routes
│   ├── core/                           # configuration and database settings
│   ├── db_logic/
│   │   └── crud.py                     # service table CRUD logic
│   ├── dependencies/
│   ├── models/
│   └── schemas/
├── tests/
│   ├── factories/
│   ├── integration_tests/
│   └── unit_tests/
│       ├── crud_tests/
│       └── schema_tests/
# misc
├── alembic.ini
├── Dockerfile
├── docker-entrypoint.sh
└── requirements.txt / requirements-dev.txt
- Create a new environment (example name: myproject):
  python -m venv myproject
- Activate it:
  Windows: myproject\Scripts\activate
  MacOS / Linux: source myproject/bin/activate

Within your activated environment:
- Update pip:
  python -m pip install --upgrade pip
- Install dependencies via a requirements file:
  Prod requirements file: pip install -r requirements.txt
  Dev requirements file: pip install -r requirements-dev.txt

(Make sure the new environment is activated so that packages are installed there.)
Typical Requirements (already in requirements.txt):
- Python 3.11 (or similar)
- FastAPI 0.128
- SQLAlchemy 2.0
- Pydantic 2.12
- Alembic 1.18
In the project root, run:
docker compose up --build
This starts:
- Orders Service
- Analytics Service
- PostgreSQL database
If developing locally, the test profile can be used to run the test containers (for example: docker compose --profile test up).
Once running, each service can be accessed locally via the following links:
Order API: http://localhost:8000
Analytics API: http://localhost:8001
Interactive Swagger API docs:
Order API Docs: http://localhost:8000/docs
Analytics API Docs: http://localhost:8001/docs
- Fork this repository.
- Create a new branch for your feature/fix:
git checkout -b feature-my-improvement
- Commit your changes and push to your fork:
  git commit -m "Add my new feature"
  git push origin feature-my-improvement
- Open a Pull Request into the dev branch.
NOTE: The main branch is protected and is only updated through approved pull requests from the dev branch.
After review and testing, changes will be merged into main during the next release.
We welcome suggestions, bug reports, and community contributions!
This project is licensed under the MIT License. You’re free to use, modify, and distribute the code as allowed by that license.
Thank you for visiting python-CICD-pipeline! If you have any questions or issues, feel free to open an issue or reach out.