This is the backend of the Capstone Project Management System. It is a RESTful API that allows for the creation, modification, and deletion of projects and users.
We use the following tools and technologies to build and run the application:
- Python - Python 3.8 is used for this project
- Flask - Flask is a micro web framework written in Python that simplifies building RESTful HTTP applications
- gevent - gevent is a coroutine-based Python networking library that provides support for concurrent API calls
- Kubernetes - Kubernetes is an open-source orchestration framework for automating the deployment, scaling, and management of containerized applications
- Docker - Docker is a PaaS product that offers OS-level virtualization through containers
- MongoDB - MongoDB is a document-based NoSQL database
- Google Kubernetes Engine - Google Kubernetes Engine (GKE) is a managed Kubernetes service for running Docker containers on Google Cloud
- Artifact Registry (GCP) - Artifact Registry is a registry for storing built Docker images
- Secret Manager (GCP) - Secret Manager stores secrets that the application can access securely at runtime
- Cloud Build (GCP) - Cloud Build is a serverless CI/CD platform that can build containers and deploy them to GKE
This application uses GKE Autopilot, which removes the overhead of provisioning nodes for the Kubernetes cluster. It automatically manages and optimizes the nodes and node pools in the cluster for both development and production workloads, which significantly reduces the DevOps effort needed to maintain the cluster. Customers pay only for the CPU and memory they use.
This application uses MongoDB as its database storage layer. Since most of the data is written and queried in a document-based format, MongoDB is a natural fit. MongoDB is deployed as a SaaS offering in GCP so that the REST API has very low latency.
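Document-based storage means each record can be kept as a single JSON-like document, with no relational mapping layer in between. A minimal sketch of the idea (the field names below are illustrative only; the real schema lives in the src.utils package):

```python
import json

# Hypothetical shape of a project document as it might be stored in MongoDB.
# The field names here are examples, not the application's actual schema.
project = {
    "name": "capstone-backend",
    "owner": "alice",
    "members": ["alice", "bob"],
    "status": "active",
}

# MongoDB stores and queries data in this JSON-like form directly,
# so the document round-trips without any schema translation.
serialized = json.dumps(project)
restored = json.loads(serialized)
print(restored["status"])  # -> active
```

With a real client such as pymongo, the same dict would be written with `collection.insert_one(project)` and read back with a filter document like `collection.find_one({"owner": "alice"})`.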
This application uses Secret Manager (GCP) to store secrets such as the API key for the song data provider Last.fm and the MongoDB credentials. This provides a secure way to access the keys at runtime and avoids hard-coding them in the source.
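A sketch of what that runtime access can look like. The `fetch_mongo_creds` helper below is hypothetical (the actual lookup lives in the application code); it uses the real `google-cloud-secret-manager` client, which needs GCP credentials to run, so the JSON parsing is split out into a function that works standalone:

```python
import json


def parse_creds(payload: str) -> dict:
    """Parse the JSON secret payload into a credentials dict."""
    creds = json.loads(payload)
    for key in ("username", "password", "cluster_id"):
        if key not in creds:
            raise KeyError(f"missing credential field: {key}")
    return creds


def fetch_mongo_creds(project_id: str, version: str = "latest") -> dict:
    """Hypothetical helper: read the mongodb_credentials secret at runtime."""
    # Imported lazily so parse_creds stays usable without GCP installed.
    from google.cloud import secretmanager

    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/mongodb_credentials/versions/{version}"
    response = client.access_secret_version(request={"name": name})
    return parse_creds(response.payload.data.decode("UTF-8"))
```

Keeping the secret name and version out of the code and in trigger substitutions (as this project does) means rotating a key never requires a code change.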
This application uses Cloud Build (GCP) to automatically build and deploy the code to the GKE Autopilot cluster whenever new commits are pushed to the master branch of the GitHub repo. We have adopted a GitOps style of CI/CD so that changes are deployed automatically once they are developed and tested. A trigger in Cloud Build watches the GitHub repo's master branch, reads the cloudbuild file, and runs the following steps: it builds the image and pushes it to Artifact Registry (GCP), then deploys the Kubernetes config files defined in tools/k8s/gke to the GKE Autopilot cluster.
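Those pipeline steps can be sketched as a Cloud Build config like the following. This is illustrative only; the repo's actual cloudbuild.yaml is authoritative, and the substitution variables (`_LOCATION`, `_REPOSITORY`, `_IMAGE`, `_GKE_CLUSTER_ID`) are supplied by the Cloud Build trigger:

```yaml
# Illustrative sketch of the build/push/deploy pipeline described above.
steps:
  # Build the container image from the Dockerfile in tools/docker
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t',
           '${_LOCATION}-docker.pkg.dev/$PROJECT_ID/${_REPOSITORY}/${_IMAGE}',
           '-f', 'tools/docker/Dockerfile', '.']
  # Push the built image to Artifact Registry
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push',
           '${_LOCATION}-docker.pkg.dev/$PROJECT_ID/${_REPOSITORY}/${_IMAGE}']
  # Apply the Kubernetes manifests to the GKE Autopilot cluster
  - name: 'gcr.io/cloud-builders/kubectl'
    args: ['apply', '-f', 'tools/k8s/gke']
    env:
      - 'CLOUDSDK_COMPUTE_REGION=${_LOCATION}'
      - 'CLOUDSDK_CONTAINER_CLUSTER=${_GKE_CLUSTER_ID}'
```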
- All external Python libraries are listed in requirements.txt
- The entry point is main.py, which loads the environment variables and starts the REST API application server
- app.py contains all the routing and logging configuration for the REST API
- The src.utils package contains utility functions for calling MongoDB
- The cloudbuild.yaml file defines the CI/CD steps as code
- The tools/docker directory contains the Dockerfile that packages the application as a container image
- The tools/k8s directory contains the YAML files to run the built container image in Kubernetes
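To illustrate the routing style app.py uses, here is a minimal Flask sketch. The `/health` route and its response are hypothetical, not the project's actual endpoints:

```python
from flask import Flask, jsonify

app = Flask(__name__)


@app.route("/health")
def health():
    # Hypothetical endpoint; the real routes live in app.py.
    return jsonify({"status": "ok"})


# Exercise the route without starting a server, using Flask's test client.
with app.test_client() as client:
    print(client.get("/health").get_json())  # -> {'status': 'ok'}
```

In the actual application, main.py is what starts the server so that environment variables are loaded before any route handles a request.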
- Google Cloud - Create a Google Cloud account and set up billing
- MongoDB Cloud - Create a MongoDB account and launch an instance in Google Cloud (free or standard tier)
This runs the application directly on your personal machine on port 5000. Below are the steps:
- Store the MongoDB credentials in JSON format as an environment variable named `MONGO_CREDS` with the format `{"username": "<update the value here>", "password": "<update the value here>", "cluster_id": "<update the value here>"}`
- Run `python main.py` to start the application
- The application can be accessed at `localhost:5000`
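A sketch of how that environment variable can be consumed on the Python side. The exact connection logic in main.py and src.utils may differ; the credential values below are placeholders set inline just for the demo:

```python
import json
import os

# Placeholder credentials, set here only so the sketch is self-contained.
os.environ["MONGO_CREDS"] = (
    '{"username": "demo-user", "password": "demo-pass",'
    ' "cluster_id": "cluster0.abcde"}'
)

# Read and parse the JSON credentials from the environment.
creds = json.loads(os.environ["MONGO_CREDS"])

# A typical MongoDB Atlas SRV connection string built from those fields.
uri = (
    f"mongodb+srv://{creds['username']}:{creds['password']}"
    f"@{creds['cluster_id']}.mongodb.net/"
)
print(uri)
```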
This runs the application using Kubernetes on your personal machine on port 5000. Below are the steps:
- Update `MONGO_CREDS` in the K8s config in JSON format: `{"username": "<update the value here>", "password": "<update the value here>", "cluster_id": "<update the value here>"}`
- Build the Docker image using `docker build -t capstone-backend:latest -f tools/docker/Dockerfile .`
- Deploy the image using `kubectl apply -f .\tools\k8s\local\`
- The application can be accessed at `localhost:5000`
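The manifests in tools/k8s/local roughly correspond to a Deployment plus a Service along these lines. This is an illustrative sketch, not the repo's actual files; the Service name mirrors the `capstone-backend-service` used on GKE, and the `MONGO_CREDS` value must hold the JSON shown above:

```yaml
# Illustrative sketch of a local Deployment and Service for the backend.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: capstone-backend
spec:
  replicas: 1
  selector:
    matchLabels:
      app: capstone-backend
  template:
    metadata:
      labels:
        app: capstone-backend
    spec:
      containers:
        - name: capstone-backend
          image: capstone-backend:latest
          imagePullPolicy: Never   # use the locally built image
          ports:
            - containerPort: 5000
          env:
            - name: MONGO_CREDS
              value: '{"username": "<update the value here>", "password": "<update the value here>", "cluster_id": "<update the value here>"}'
---
apiVersion: v1
kind: Service
metadata:
  name: capstone-backend-service
spec:
  selector:
    app: capstone-backend
  ports:
    - port: 5000
      targetPort: 5000
```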
This runs the application using GKE Autopilot on Google Cloud. Below are the steps:
- Create a GKE Autopilot cluster
- Create a secret in Secret Manager named `mongodb_credentials` in JSON format: `{"username": "<update the value here>", "password": "<update the value here>", "cluster_id": "<update the value here>"}`
- Create a trigger in Cloud Build that triggers a build when a new commit is pushed to the master branch in GitHub
- The trigger should have the following properties:
  - Event should be `Push to a branch`
  - Connect the GitHub repo and set the branch to `^master$`
  - Configuration should use a Cloud Build configuration file
  - Location should be Repository, with the configuration file set to `cloudbuild.yaml`
  - Set the following substitution variables:
    - `_API_SECRET_VERSION` -> <Set the API Secret Version>
    - `_GKE_CLUSTER_ID` -> <Set the GKE cluster id>
    - `_IMAGE` -> capstone-backend
    - `_LOCATION` -> <Set the location as per your preference>
    - `_MONGO_DB_SECRET` -> mongodb_credentials
    - `_MONGO_DB_SECRET_VERSION` -> <Set the MongoDB Secret Version>
    - `_REPOSITORY` -> <Set the Repository Name>
- Enable access to `GKE` and `Secret Manager` in Cloud Build settings
- Deploy and start the service either by pushing a new commit or by running the Cloud Build trigger manually
- Access the app by opening the endpoint of `capstone-backend-service` on the GKE Services page
