```
$ git clone https://github.com/chpladmin/chpl-data-model.git
$ cd chpl-data-model
```

Since the CHPL is running in production, any change to the data model must be made in a way that does not negatively impact the live data. For development purposes, DML and DDL scripts must be runnable multiple times, leaving the database in the correct "target" state at the end of any execution. The files in the /dev subdirectory must also describe the live data model; this means that any change applied to the live database must be reflected in the "model" files as well.
To update the soft delete triggers, edit the dev/openchpl_soft-delete.sql script.

To update the views in the CHPL database, edit the dev/openchpl_views.sql script.

These scripts are run each time the database is updated. Each script drops all database objects it is responsible for and recreates them.
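As a rough sketch of the drop-and-recreate pattern these scripts follow (the view name and columns below are hypothetical, not taken from the actual CHPL schema):

```sql
-- Pattern used by scripts such as dev/openchpl_views.sql:
-- drop the objects the script owns, then recreate them,
-- so the script is safe to run any number of times.
DROP VIEW IF EXISTS openchpl.widget_details CASCADE;

CREATE VIEW openchpl.widget_details AS
SELECT w.widget_id,
       w.name,
       w.description
FROM openchpl.widget w
WHERE w.deleted = false;
```

Because the `DROP ... IF EXISTS` runs first, the script produces the same end state whether or not the view already existed.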
To update or add a new table to the CHPL database:

- Create a new file, or append changes to an existing file, `changes/ocd-XXXX.sql`, which corresponds to the ticket the change is associated with
- Add the necessary `ALTER` statements, ensuring that the script can be run multiple times; `IF EXISTS` / `IF NOT EXISTS` clauses should be used to determine whether a statement will be executed
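A change script of this form might look like the following sketch; the table and column names are hypothetical and only illustrate the idempotent `IF EXISTS` / `IF NOT EXISTS` pattern:

```sql
-- changes/ocd-XXXX.sql (hypothetical example)
-- Safe to run repeatedly: each statement checks the current
-- state of the schema before acting.
CREATE TABLE IF NOT EXISTS openchpl.widget (
    widget_id bigserial PRIMARY KEY,
    name varchar(250) NOT NULL
);

ALTER TABLE openchpl.widget
    ADD COLUMN IF NOT EXISTS description varchar(1000);

ALTER TABLE openchpl.widget
    DROP COLUMN IF EXISTS obsolete_flag;
```

On a second run, each statement is a no-op because its guard clause finds the target state already in place.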
To update data in tables:

- Create a new file, or append changes to an existing file, `changes/ocd-XXXX.sql`, which corresponds to the ticket the change is associated with
- Add the necessary `INSERT` and `UPDATE` statements, ensuring the script can be run multiple times; a `WHERE NOT EXISTS` clause can often help determine whether a statement should be executed
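A sketch of an idempotent data change, again with hypothetical table and column names:

```sql
-- changes/ocd-XXXX.sql (hypothetical example)
-- The WHERE NOT EXISTS guard keeps the INSERT from
-- duplicating the row when the script is re-run.
INSERT INTO openchpl.widget (name, description)
SELECT 'Sample widget', 'Seeded by ocd-XXXX'
WHERE NOT EXISTS (
    SELECT 1 FROM openchpl.widget WHERE name = 'Sample widget'
);

-- UPDATE statements are naturally re-runnable when the WHERE
-- clause targets only rows not yet in the desired end state.
UPDATE openchpl.widget
SET description = 'Corrected description'
WHERE name = 'Sample widget'
  AND description IS DISTINCT FROM 'Corrected description';
```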
Sometimes a database change may require scripting not described here. Those situations should be handled on a case-by-case basis.
To update your local database to match a particular environment, pull the code associated with that environment, e.g. `git pull upstream qa`.
Run the load-pending-changes.sh script. This will execute:

- All `ocd-XXXX.sql` scripts in the `/changes` directory
- The `dev/openchpl_soft-delete.sql` script
- The `dev/openchpl_views.sql` script
- The `dev/openchpl_grant-all.sql` script (sets permissions for all database objects)
CHPL currently recommends using Postgres version 15 running on the standard port of 5432.
- Create the necessary roles in the database
  - Create a new file `dev/openchpl_role.sql` based on `dev/openchpl_role-template.sql` and set the password for the `openchpl` and `openchpl_dev` roles in the new file. The password is recorded as "change this password" in the template file.
  - `psql -Upostgres -f dev/openchpl_role.sql`
- Create the openchpl database
  - `psql -Upostgres -f dev/openchpl_database.sql`
- Download the latest backup file from Bamboo and load it
  - The artifact name in Bamboo is openchpl.final.backup. It should be downloaded and copied into the `maint` directory as `openchpl.backup`.
  - `maint/load.sh`
- Run scripts to add tables for all other schemas

```
psql -Uopenchpl_dev -f dev/openchpl_ff4j.sql openchpl
psql -Uopenchpl_dev -f dev/openchpl_ff4j_audit.sql openchpl
psql -Uopenchpl_dev -f dev/openchpl_quartz.sql openchpl
psql -Uopenchpl_dev -f dev/openchpl_quartz_audit.sql openchpl
# The shared store schema does have its single table loaded during the load of the prod backup.
# I would have expected the above schemas to also have their tables loaded and am not sure why they do not.
# psql -Upostgres -f openchpl_shared_store.sql
```
- Load the latest data model code from the staging branch or relevant feature branch
  - `./load-pending-changes.sh`
- Start Tomcat
- The FF4j data does not come with the production data dump by default. You will need to add the current set of feature flags via the FF4j UI.