From 56a1f8e89e5a9d8eb1e3f63117c21df6791cde46 Mon Sep 17 00:00:00 2001 From: lindenmckenzie Date: Wed, 29 Apr 2026 18:17:42 +0100 Subject: [PATCH 1/5] Add new infra processes --- guides/CONTRIBUTING.md | 44 +++---- guides/INSTALLING.md | 2 +- guides/RELEASES.md | 253 +++++++++++++++++++++++++++-------------- guides/TAGS.md | 25 ++-- guides/VERSIONING.md | 27 +++-- 5 files changed, 226 insertions(+), 125 deletions(-) diff --git a/guides/CONTRIBUTING.md b/guides/CONTRIBUTING.md index f634149..68095e1 100644 --- a/guides/CONTRIBUTING.md +++ b/guides/CONTRIBUTING.md @@ -2,34 +2,40 @@ ## Development work -* As a first step, GPG key setup is required in order to contribute: +We currently have two different processes in place as we are transitioning between them. + +The below is true for both approaches, see branching strategy below for the differences. + +- As a first step, GPG key setup is required in order to contribute: instructions on setup can be found in the [GPG Guide](https://github.com/ONSdigital/dp-operations/blob/main/guides/gpg.md) -* We use [git-flow](https://www.atlassian.com/git/tutorials/comparing-workflows/gitflow-workflow) - - create a feature or fix branch from `develop`, e.g. `feature/`, `fix/`. Alternatively, create a hotfix branch from master e.g. `hot-fix/`. -* Pull requests must use the pull request template found in each repo. Each section must contain the following: - * 'What' - a succinct, clear summary of what the user-need is that is driving this feature change, - * 'How to review' - a step-by-step guide or set of guides that detail how a reviewer can test and verify the changes made - * 'Who can review' - a list of the most appropriate people to review the pull request, +- Pull requests must use the pull request template found in each repo. 
Each section must contain the following: + - 'What' - a succinct, clear summary of what the user-need is that is driving this feature change, + - 'How to review' - a step-by-step guide or set of guides that detail how a reviewer can test and verify the changes made + - 'Who can review' - a list of the most appropriate people to review the pull request, e.g. if on a frontend application, a member of the frontend team should review; likewise for backend and platform. If the changes are critical or are likely to have severe side-effects then the Tech Lead should review. -* Ensure your branch contains logical atomic commits before sending a pull request - follow the [GDS Way styleguide](https://gds-way.digital.cabinet-office.gov.uk/standards/source-code/working-with-git.html#commits) -* You may rebase your branch after feedback if it's to include relevant updates from the `develop` branch. We prefer a rebase here to a merge commit as we - prefer a clean and straight history on the `develop` branch, with discrete merge commits for features -* It is advised to squash commits before pushing to a remote repository +- Ensure your branch contains logical atomic commits before sending a pull request - follow the [GDS Way styleguide](https://gds-way.digital.cabinet-office.gov.uk/standards/source-code/working-with-git.html#commits) +- You may rebase your branch after feedback if it's to include relevant updates from the upstream branch. We prefer a rebase here to a merge commit as we + prefer a clean and straight history on protected branches, with discrete merge commits for features +- It is advised to squash commits before pushing to a remote repository ----- +### New branching strategy -## Releasing +We use trunk based development - create a feature branch from `main`, e.g. 
`feature/new-feature` -* For instructions on how to release work, read the [Releases guidelines](RELEASES.md) +### Old branching strategy + +We use [git-flow](https://www.atlassian.com/git/tutorials/comparing-workflows/gitflow-workflow) - create a feature or fix branch from `develop`, e.g. `feature/`, `fix/`. Alternatively, create a hotfix branch from master e.g. `hot-fix/`. + +## Releasing ----- +For instructions on how to release work, read the [Releases guidelines](RELEASES.md) ## Access to Environments ### Note: if you get a 404 error for any of the below links, then you need to be added to the `ONSdigital` organisation in GitHub -* For direct access to the environments - * Setup your [AWS credentials](AWS_CREDENTIALS.md) - * Configure your [SSH access](https://github.com/ONSdigital/dp-operations/blob/main/guides/ssh-access.md) - * [Ansible](https://github.com/ONSdigital/dp-ci/tree/master/ansible#prerequisites) is required for provisioning to environments +- For direct access to the environments + - Setup your [AWS credentials](AWS_CREDENTIALS.md) + - Configure your [SSH access](https://github.com/ONSdigital/dp-operations/blob/main/guides/ssh-access.md) + - [Ansible](https://github.com/ONSdigital/dp-ci/tree/master/ansible#prerequisites) is required for provisioning to environments diff --git a/guides/INSTALLING.md b/guides/INSTALLING.md index 7d15f79..575c33f 100644 --- a/guides/INSTALLING.md +++ b/guides/INSTALLING.md @@ -211,7 +211,7 @@ Variable name | note --- | --- `zebedee_root` | path to your zebedee content, typically the directory the [dp-zebedee-content](https://github.com/ONSdigital/dp-zebedee-content) generation script points to when run `ENABLE_PRIVATE_ENDPOINTS` | set `true` when running services in publishing, unset for web mode -`ENABLE_PERMISSIONS_AUTH`| set `true` to ensure that calls to APIs are from registered services or users +`ENABLE_PERMISSIONS_AUTH` | set `true` to ensure that calls to APIs are from registered services or users 
`ENCRYPTION_DISABLED` | set `true` to disable encryption, making data readable for any debugging purposes `DATASET_ROUTES_ENABLED` | `true` will enable the filterable dataset routes (the CMD journey) in some services `FORMAT_LOGGING` | if `true` then `zebedee` will format its logs diff --git a/guides/RELEASES.md b/guides/RELEASES.md index b376c42..42840e5 100644 --- a/guides/RELEASES.md +++ b/guides/RELEASES.md @@ -3,29 +3,32 @@ This is an overview of the sequence of steps that our tickets (sometimes called *tasks* or *stories*) take from initial idea to completion. -## Notes - -* **PR** refers to a [Github Pull Request](../training/culture-and-process/PULL_REQUEST_GUIDANCE.md) - * not to be confused with the related *Peer Review* (when someone assesses a Pull Request) -* **Story** refers to the Jira ticket describing the work. -* Some **ops work** does not need to go through `PO sign off` (though a tech lead may sign it off) or `Ready for release` - * for example, terraform configuration which has already been applied to the *sandbox* or *production* environments. -* **Columns** are where stories move within Jira - from left-to-right - towards the 'Done' column. -* The current environments and their base branches (from which deployments are made) are: - - | | Sandbox env | Staging env | Prod env - | - | - | - | - - | env used for | initial integration | final stable testing | live production env - | applications (base branch) | `develop` | `master` | `master` - | infrastructure `dp-setup` branch | `awsb` | `main` | `main` - | `dp-configs` (secrets, manifests) | `main` branch | `main` | `main` - -* In some repos (apps, largely), the `main` branch may still have the **deprecated branch name** of `master` (replace in the below, as necessary) -* Some **ops/infrastructure work** (e.g. 
in `dp-setup`) does not need to go through `PO sign off` (though your Tech Lead should sign it off instead of the Product Owner) - * for example, terraform configuration which has already been applied to the current environments - * if in doubt, ask your Tech Lead, or the Platform Team - -## 'Backlog' column +We currently have two different processes in place as we are transitioning between them. + +## Old process + +### Notes - old + +- **PR** refers to a [Github Pull Request](../training/culture-and-process/PULL_REQUEST_GUIDANCE.md) + not to be confused with the related *Peer Review* (when someone assesses a Pull Request) +- **Story** refers to the Jira ticket describing the work. +- Some **ops work** does not need to go through `PO sign off` (though a tech lead may sign it off) or `Ready for release` + For example, terraform configuration which has already been applied to the *sandbox* or *production* environments. +- **Columns** are where stories move within Jira - from left-to-right - towards the 'Done' column. +- Some **ops/infrastructure work** (e.g. 
in `dp-setup`) does not need to go through `PO sign off` (though your Tech Lead should sign it off instead of the Product Owner) + - for example, terraform configuration which has already been applied to the current environments + - if in doubt, ask your Tech Lead, or the Platform Team +- In some repos (apps, largely), the `main` branch may still have the **deprecated branch name** of `master` (replace in the below, as necessary) +- The current environments and their base branches (from which deployments are made) are: + +| | Sandbox env | Staging env | Prod env | +|-----------------------------------|---------------------|----------------------|---------------------| +| env used for | initial integration | final stable testing | live production env | +| applications (base branch) | `develop` | `master` | `master` | +| infrastructure `dp-setup` branch | `awsb` | `main` | `main` | +| `dp-configs` (secrets, manifests) | `main` branch | `main` | `main` | + +### 'Backlog' column - old Tickets often start in this column, but may not be fully-formed and need clarification/prioritisation. @@ -34,7 +37,7 @@ More information may be required, but it is sufficiently detailed for most. This move usually happens in planning sessions. -## 'Ready' column +### 'Ready' column - old This is the list of stories that are aimed to be completed in the current sprint. The column is often sorted by priority (those at the top have higher priority). @@ -44,65 +47,149 @@ Scrum teams may break this column into sections for the current (higher section) During the sprint, team members take tickets from this column, add themselves to that ticket, move the ticket to the 'In progress' column and begin the work therein. 
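Under the old git-flow process, picking a ticket up typically starts by cutting a feature branch from an up-to-date `develop`. As a runnable sketch (branch and commit names are hypothetical, and a throwaway repo with a demo identity stands in for a real clone; real commits should be GPG-signed as described in the contributing guide):

```sh
# Demo scaffolding only: a throwaway repo standing in for a real clone
tmp=$(mktemp -d) && cd "$tmp"
git init -q -b develop
git config user.email "dev@example.com"   # demo identity, not a real account
git config user.name "Demo Dev"
git commit -q --allow-empty -m "initial commit"

# The git-flow part: branch off develop and make logical atomic commits
git checkout -q -b feature/my-feature     # hypothetical ticket branch
git commit -q --allow-empty -m "Add my feature"
git branch --show-current
```

Fix branches (`fix/...`) follow the same pattern; hotfix branches (`hot-fix/...`) are cut from `master` instead.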
-## 'In progress' column +### 'In progress' column - old Once work/coding is complete: -* **Create a PR** - to have your feature branch merged into the `develop` branch -* Ensure the PR has passed in CI -* Request a peer-review on Slack (`#dev-code-review`) -* Move the ticket to the 'PR' column - -## 'PR' column - -* **Get approval:** get the PR peer-reviewed - * if any code or other issues are identified, apply fixes to the feature branch -* **Merge** the approved feature branch into the `develop` branch - * do **not** merge using Github :warning: -* **Ship it** - the updated `develop` branch should be auto-deployed to the *sandbox* environment - * ensure that shipping was successful: - * in [CI](https://concourse.dp-ci.aws.onsdigital.uk/) (`sandbox-ship-it` job was successful for the right commit) - * in [*consul*](https://consul.dp.aws.onsdigital.uk/ui/eu/services) (for the expected commit/version and health) -* Developer moves the story to `Testing` column - -## 'Testing' column - -* Developer **tests** the feature in the *sandbox* environment - * similar to approval, above: any fixes go into your feature branch for re-approval, re-merge, etc - * if your feature branch was deleted (manually or automated by github settings), you can always reopen it via the github UI -* Developer moves the story to `PO sign off` (if required) or `Ready for release` column - -## 'PO sign off' column - -* Product owner **reviews changes** in the *sandbox* environment - * bug fixes applied to feature branch (and re-approved, etc into the `develop` branch) -* **Sign off:** the product owner moves the story to `Ready for release` column - -## 'Ready for release' column - -* **Create Release branch** (e.g. 
`release/1.7.0` see [version control](VERSIONING.md) ) taken from `develop` branch -* **Create a PR** - to merge the release branch into the `main` branch - * the release PR must include the release notes - * a list of the features and bug fixes added since the last release - * please ask the other contributors (commit authors) to add to the release notes for their commits/features/fixes - * any issues (e.g. in CI) are applied to release branch - * any release branch updates (from PR comments) should be regularly merged back to `develop` (or `awsb` for `dp-setup` repo) -* **Merge** into `main` branch but don't push it to Github (origin) yet, wait until the tag has been created in the next step and push both at the same time. -* [Create Release tag](TAGS.md) from `main` branch - * tag must be published as a release, with bullet point list of changes in the release notes. These should be obtained from the release pull request description - * release branch can be deleted - -## Deployment - -* The `main` branch requires **manual deployment** to the *production* and *staging* environments - * for apps, this can be done in CI - * to access `production-ship-it` or `staging-ship-it`, in CI, ask a member of the dev team - * ensure the app has been shipped as expected - * in CI (`production-ship-it` and `staging-ship-it` jobs were successful for the expected release) - * in *consul* (for the expected version and health) - * for dp-setup (infrastructure) changes, please ask each commit author (in the release) to apply their commits/changes to the appropriate environments -* Any issues arising: - * major issues should prompt you to rollback to the previous version and re-work the original (or a new) feature branch; see [dp operation docs on how to rollback an application to previous version](https://github.com/ONSdigital/dp-operations/blob/main/guides/rollback-app.md#rolling-back-an-application-to-a-previous-deployment) - * minor issues are fixed in `hotfix/my_fix` branches 
(which are PR'd back into the `main` branch) - * merge any hotfixes back into the `develop` branch, too -* Developer moves the story to the `Done` column :tada: +- **Create a PR** - to have your feature branch merged into the `develop` branch +- Ensure the PR has passed in CI +- Request a peer-review on Slack (`#dev-code-review`) +- Move the ticket to the 'PR' column + +### 'PR' column - old + +- **Get approval:** get the PR peer-reviewed + - if any code or other issues are identified, apply fixes to the feature branch +- **Merge** the approved feature branch into the `develop` branch + - do **not** merge using Github :warning: +- **Ship it** - the updated `develop` branch should be auto-deployed to the *sandbox* environment + - ensure that shipping was successful: + - in [CI](https://concourse.dp-ci.aws.onsdigital.uk/) (`sandbox-ship-it` job was successful for the right commit) + - in [*consul*](https://consul.dp.aws.onsdigital.uk/ui/eu/services) (for the expected commit/version and health) +- Developer moves the story to `Testing` column + +### 'Testing' column - old + +- Developer **tests** the feature in the *sandbox* environment + - similar to approval, above: any fixes go into your feature branch for re-approval, re-merge, etc + - if your feature branch was deleted (manually or automated by github settings), you can always reopen it via the github UI +- Developer moves the story to `PO sign off` (if required) or `Ready for release` column + +### 'PO sign off' column - old + +- Product owner **reviews changes** in the *sandbox* environment + - bug fixes applied to feature branch (and re-approved, etc into the `develop` branch) +- **Sign off:** the product owner moves the story to `Ready for release` column + +### 'Ready for release' column - old + +- **Create Release branch** (e.g. 
`release/1.7.0` see [version control](VERSIONING.md) ) taken from `develop` branch +- **Create a PR** - to merge the release branch into the `main` branch + - the release PR must include the release notes + - a list of the features and bug fixes added since the last release + - please ask the other contributors (commit authors) to add to the release notes for their commits/features/fixes + - any issues (e.g. in CI) are applied to release branch + - any release branch updates (from PR comments) should be regularly merged back to `develop` (or `awsb` for `dp-setup` repo) +- **Merge** into `main` branch but don't push it to Github (origin) yet, wait until the tag has been created in the next step and push both at the same time. +- [Create Release tag](TAGS.md) from `main` branch + - tag must be published as a release, with bullet point list of changes in the release notes. These should be obtained from the release pull request description + - release branch can be deleted + +### Deployment - old + +- The `main` branch requires **manual deployment** to the *production* environment + - for apps, this can be done in CI + - to access `production-ship-it`, in CI, ask a member of the dev team + - ensure the app has been shipped as expected + - in CI (`production-ship-it` or `staging-ship-it` jobs were successful for the expected release) + - in *consul* (for the expected version and health) + - for dp-setup (infrastructure) changes, please ask each commit author (in the release) to apply their commits/changes to the appropriate environments +- Any issues arising: + - major issues should prompt you to rollback to the previous version and re-work the original (or a new) feature branch; see [dp operation docs on how to rollback an application to previous version](https://github.com/ONSdigital/dp-operations/blob/main/guides/rollback-app.md#rolling-back-an-application-to-a-previous-deployment) + - minor issues are fixed in `hotfix/my_fix` branches (which are PR'd back into the 
`main` branch) + - merge any hotfixes back into the `develop` branch, too +- Developer moves the story to the `Done` column :tada: + +## New process + +### Notes + +- **PR** refers to a [Github Pull Request](../training/culture-and-process/PULL_REQUEST_GUIDANCE.md) + not to be confused with the related *Peer Review* (when someone assesses a Pull Request) +- **Story** refers to the Jira ticket describing the work. +- **Columns** are where stories move within Jira - from left-to-right - towards the 'Done' column. +- All environments are deployed to from the `main` branch and are deployed based on tags: + +| Environment | Tag style | Automated deploy | +|-------------|------------|------------------| +| Sandbox | None | Yes | +| Staging | 1.2.1-rc.1 | Yes | +| Production | 1.2.1 | No | + +### 'Backlog' column + +Tickets often start in this column, but may not be fully-formed and need clarification/prioritisation. + +Once a ticket has been given enough detail for someone to be able to start working on it, it can move to the 'Ready' column. +More information may be required, but it is sufficiently detailed for most. + +This move usually happens in planning sessions. + +### 'Ready' column + +This is the list of stories that are aimed to be completed in the current sprint. +The column is often sorted by priority (those at the top have higher priority). + +Scrum teams may break this column into sections for the current (higher section) and future (lower sections) tickets. + +During the sprint, team members take tickets from this column, add themselves to that ticket, +move the ticket to the 'In progress' column and begin the work therein. 
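Under the new process, everything beyond the sandbox is driven by tags on `main` (as described under 'Ready for release' below). A runnable sketch of the tagging mechanics (version numbers are hypothetical; a throwaway repo stands in for a real clone, and plain annotated `-a` tags stand in for the signed `-s` tags you would really create):

```sh
# Demo scaffolding only: throwaway repo with a demo identity for tagging
tmp=$(mktemp -d) && cd "$tmp"
git init -q -b main
git config user.email "dev@example.com"   # demo identity, not a real account
git config user.name "Demo Dev"
git commit -q --allow-empty -m "feature merged to main"

# Release candidate tag: once pushed, this ships the build to staging
git tag -a 1.2.1-rc.1 -m "Release candidate 1 for 1.2.1"
# After verifying in staging, the production release tag:
git tag -a 1.2.1 -m "Release 1.2.1"
git tag --list '1.2.1*'
```

In a real repo each tag would be followed by `git push --follow-tags` (see the tagging guide), which is what triggers the automated deployments.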
+ +### 'In progress' column + +Once work/coding is complete: + +- **Create a PR** - to have your feature branch merged into the `main` branch +- Ensure the PR has passed in CI +- Request a peer-review on Slack (`#dev-code-review`) +- Move the ticket to the 'PR' column + +### 'PR' column + +- **Get approval:** get the PR peer-reviewed + - if any code or other issues are identified, apply fixes to the feature branch +- **Merge** the approved feature branch into the `main` branch + - do **not** merge using Github :warning: +- **Ship it** - the updated `main` branch should be auto-deployed to the *sandbox* environment + - ensure that shipping was successful: + - in [CI](https://dis-concourse.dp-ci.aws.onsdigital.uk/) (`helm_chart_sandbox_values_update` job was successful for the right commit) + - in [*argo*](https://argo-cd.dp-ci.aws.onsdigital.uk/) (for the expected commit/version and health) +- Developer moves the story to `Testing` column + +### 'Testing' column + +- Developer **tests** the feature in the *sandbox* environment + - similar to approval, above: any fixes go into your feature branch for re-approval, re-merge, etc + - if your feature branch was deleted (manually or automated by github settings), you can always reopen it via the github UI +- Developer moves the story to `PO sign off` (if required) or `Ready for release` column + +### 'PO sign off' column + +- Product owner **reviews changes** in the *sandbox* environment + - bug fixes applied to feature branch (and re-approved, etc into the `main` branch) +- **Sign off:** the product owner moves the story to `Ready for release` column + +### 'Ready for release' column + +- [Create Release candidate tag](TAGS.md) from `main` branch (e.g. `1.2.1-rc.1` see [version control](VERSIONING.md) ) +- **Push Release tag** - this will ship to the `staging` environment. +- If appropriate, re-test in the staging environment +- [Create Release tag](TAGS.md) from `main` branch (e.g. 
`1.2.1`, see [version control](VERSIONING.md))
+- **Push Release tag** - this will create a PR to your service helm chart
+  - the tag must be published as a release, with a bullet-point list of changes in the release notes. These should be obtained from the original pull request description or auto-generated from the commits
+
+### Deployment
+
+- Releases require **manual deployment** to the *production* environment
+  - for apps this can be done via Github by locating the automated PRs from Concourse for the relevant updates
+- Developer moves the story to the `Done` column :tada:
diff --git a/guides/TAGS.md b/guides/TAGS.md
index e2ca3d3..8c7973f 100644
--- a/guides/TAGS.md
+++ b/guides/TAGS.md
@@ -8,19 +8,18 @@

In order to tag a release you first need to checkout the branch you aim to tag (usually `main` or `master`)

-* Run `git tag -s <tag> -m <message>`
-  * `<tag>` is the semver value in the release branch name (e.g. `v1.7.0`, see [version control](VERSIONING.md))
-  * `-s` ensures the tag is signed
-  * provide a message to describe the changes at a high level
-* Run `git push --follow-tags`
-  * `--follow-tags` flag will only push tags that match your commits to the shared remote repository, this will prevent any local tags being pushed to remote that are not referencing a commit which has been pushed. This flag can be set as default by adding to `push.followTags` to your git config locally, [see documentation here](https://git-scm.com/docs/git-push#Documentation/git-push.txt---follow-tags)
-
+- Run `git tag -s <tag> -m <message>`
+  - `<tag>` is the semver value in the release branch name (e.g. `v1.7.0`, see [version control](VERSIONING.md))
+  - `-s` ensures the tag is signed
+  - provide a message to describe the changes at a high level
+- Run `git push --follow-tags`
+  - the `--follow-tags` flag will only push tags that match your commits to the shared remote repository; this will prevent any local tags being pushed to remote that are not referencing a commit which has been pushed.
This flag can be set as the default by setting `push.followTags` in your git config locally, [see documentation here](https://git-scm.com/docs/git-push#Documentation/git-push.txt---follow-tags)

Once pushed, go to the Github **releases** page for the repository in question, which can be found by clicking on the `<> Code` tab and clicking on `Releases` on the right hand side:

-* Select `Tags`
-* Click on the `v1.7.0` tag that you just created
-* Choose `Create release from tag`
-* Copy the release name and make human-friendly (capitalise, remove `/`) to create a release title (e.g. `release/1.7.0 -> Release 1.7.0`)
-* Add relevant release notes
-* Hit **Publish release**
+- Select `Tags`
+- Click on the `v1.7.0` tag that you just created
+- Choose `Create release from tag`
+- Copy the release name and make it human-friendly (capitalise, remove `/`) to create a release title (e.g. `release/1.7.0 -> Release 1.7.0`)
+- Add relevant release notes
+- Hit **Publish release**
diff --git a/guides/VERSIONING.md b/guides/VERSIONING.md
index d32dac5..715c91e 100644
--- a/guides/VERSIONING.md
+++ b/guides/VERSIONING.md
@@ -1,25 +1,34 @@
# Version control of Repositories

Following [Semantic Versioning 2.0.0](https://semver.org/).
-* The first stable release for repositories, either libraries or services, is `1.0.0`
+
+- The first stable release for repositories, either libraries or services, is `1.0.0`

The versioning process is different for applications and libraries (see the [release process](RELEASES.md) for additional information):
-* **Applications**
-  * Releases: only major and minor increments are allowed, the patch is always 0. The release branch is branched off `develop` and merged into `master`. For example:
-    ```
+
+- **Applications**
+  - Releases: only major and minor increments are allowed, the patch is always 0.
+  - For the [old branching strategy](./CONTRIBUTING.md#old-branching-strategy) the release branch is branched off `develop` and merged into `master`.
For example: + - For the [new branching strategy](./CONTRIBUTING.md#new-branching-strategy) all branches are off `main` + For example: + + ```txt Current version is 1.1.0 * Minor release updates version to 1.2.0 * Major release updates version to 2.0.0 ``` - * Hotfixes: only patch increments are allowed. The hotfix branch is branched off `master`. For example: - ``` + + - Hotfixes: only patch increments are allowed. The hotfix branch is branched off `master` or `main`. For example: + + ```txt Current version is 1.1.0 * Patch release updates version to 1.1.1 ``` -* **Libraries** - * Releases and hotfixes: any type of increment (major, minor or patch) is allowed. The branch is branched off `main`, as libraries don't have a `develop` branch. + +- **Libraries** + - Releases and hotfixes: any type of increment (major, minor or patch) is allowed. The branch is branched off `main`, as libraries don't have a `develop` branch. ## Maintaining Old Versions @@ -40,7 +49,7 @@ Any new tag releases should follow the convention to add `.` with the nu See below example of order of increasing versions: -``` +```txt 1.3.2 < 2.0.0-alpha < 2.0.0-alpha.1 < 2.0.0-beta < 2.0.0-beta.2 < 2.0.0-beta.11 < 2.0.0-rc.1 < 2.0.0 ``` From af8ac993b0ca532876ddbd31bf994b072741d0f1 Mon Sep 17 00:00:00 2001 From: lindenmckenzie Date: Wed, 29 Apr 2026 18:20:16 +0100 Subject: [PATCH 2/5] Add markdown linting --- .github/workflows/ci.yml | 29 ++++++++++++++++++++++++ .markdownlint-cli2.yaml | 7 ++++++ .pre-commit-config.yaml | 35 ++++++++++++++++++++++++++++ CONTRIBUTING.md | 49 ++++++++++++++++++++++++++++++++++++++++ 4 files changed, 120 insertions(+) create mode 100644 .github/workflows/ci.yml create mode 100644 .markdownlint-cli2.yaml create mode 100644 .pre-commit-config.yaml create mode 100644 CONTRIBUTING.md diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml new file mode 100644 index 0000000..509de5c --- /dev/null +++ b/.github/workflows/ci.yml @@ -0,0 +1,29 @@ +name: CI + +on: + 
push: + branches: ["main"] + pull_request: + branches: ["main"] + +concurrency: + group: ${{ github.workflow }}-${{ github.ref }} + cancel-in-progress: false + +jobs: + dp-operations-lint: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v5 + with: + fetch-depth: 0 + - uses: tj-actions/changed-files@v46 + id: changed-files + with: + files: '**/*.md' + separator: "," + - uses: DavidAnson/markdownlint-cli2-action@v20 + if: steps.changed-files.outputs.any_changed == 'true' + with: + globs: ${{ steps.changed-files.outputs.all_changed_files }} + separator: "," diff --git a/.markdownlint-cli2.yaml b/.markdownlint-cli2.yaml new file mode 100644 index 0000000..f7910ba --- /dev/null +++ b/.markdownlint-cli2.yaml @@ -0,0 +1,7 @@ +--- +config: + line-length: false + ul-style: + style: dash +ignores: + - "LICENSE.md" diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml new file mode 100644 index 0000000..88cf541 --- /dev/null +++ b/.pre-commit-config.yaml @@ -0,0 +1,35 @@ +# Keeping the hooks versions up to date can be done by running `pre-commit autoupdate` validate that the examples still pass correctly before pushing the updated version changes +repos: +- repo: https://github.com/pre-commit/pre-commit-hooks + rev: v5.0.0 + hooks: # Available hooks https://github.com/pre-commit/pre-commit-hooks/blob/main/README.md + # Git style + - id: check-added-large-files # prevents giant files from being committed. + - id: check-merge-conflict # checks for files that contain merge conflict strings. + - id: check-vcs-permalinks # ensures that links to vcs websites are permalinks. + - id: no-commit-to-branch # Protect specific branches from direct commits. + args: [--branch, main] # Protected in the repo config. but this will prevent you even getting that far + # Common errors + - id: end-of-file-fixer # Ensures that a file is either empty, or ends with one newline. + - id: trailing-whitespace # Trims trailing whitespace. 
+ args: [--markdown-linebreak-ext=md] # Preserve hard line breaks in MD files + - id: check-yaml # Attempts to load all yaml files to verify syntax. + - id: check-executables-have-shebangs # Checks that non-binary executables have a proper shebang. + # Cross platform + - id: check-case-conflict # Check for files with names that would conflict on a case-insensitive filesystem like MacOS HFS+ or Windows FAT. + - id: mixed-line-ending # Replaces or checks mixed line ending. + args: [--fix=lf] # Forces to replace line ending with LF + # Security + - id: detect-aws-credentials # Checks for the existence of AWS secrets that you have set up with the AWS CLI. + args: ['--allow-missing-credentials'] # Allow hook to pass when no credentials are detected. + - id: detect-private-key # Checks for the existence of private keys. + +- repo: https://github.com/zricethezav/gitleaks + rev: v8.23.3 + hooks: + - id: gitleaks + +- repo: https://github.com/DavidAnson/markdownlint-cli2 + rev: v0.18.1 + hooks: + - id: markdownlint-cli2 diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md new file mode 100644 index 0000000..fa00f58 --- /dev/null +++ b/CONTRIBUTING.md @@ -0,0 +1,49 @@ +# Contributing guidelines + +## Git workflow + +- We use trunk based development - create a feature branch from `main`, e.g. `feature/new-feature` +- Pull requests must contain a succinct, clear summary of what the user need is driving this document change +- Ensure your branch contains logical atomic commits following our [commit standards](https://github.com/ONSdigital/dp-standards/blob/main/COMMIT_STANDARDS.md) before submitting a pull request +- You may rebase your branch after feedback if it's to include relevant updates from the main branch. 
We prefer a rebase here to a merge commit, as we prefer a clean and straight history on `main` with discrete merge commits for changes
+
+## Pre-commit
+
+We encourage the use of [pre-commit](https://pre-commit.com/) locally; it reduces the number of common mistakes, so engineers can focus on reviewing the actual changes.
+
+```sh
+# Install pre-commit
+brew install pre-commit
+# This will enable pre-commit to run every time you git commit
+pre-commit install
+```
+
+If you want to do an ad hoc run of pre-commit:
+
+⚠️ Note that `pre-commit run` on its own only checks files that are staged in git; pass `--all-files` to check everything
+
+```sh
+pre-commit run --all-files
+```
+
+If you want to commit and skip the checks:
+
+```sh
+git commit --no-verify -m "some message"
+```
+
+## Markdown linting
+
+This repository uses [markdown linting](https://github.com/DavidAnson/markdownlint-cli2) for all PRs.
+
+To install:
+
+```sh
+brew install markdownlint-cli2
+```
+
+To run:
+
+```sh
+markdownlint-cli2 '**/*.md'
+```

From 850de4c61ba102146fd4bb1a0ebb91aa9f816fb9 Mon Sep 17 00:00:00 2001
From: lindenmckenzie
Date: Wed, 29 Apr 2026 18:32:45 +0100
Subject: [PATCH 3/5] Update formatting

---
 guides/INSTALLING.md | 230 +++++++++++++++++++++----------------------
 1 file changed, 114 insertions(+), 116 deletions(-)

diff --git a/guides/INSTALLING.md b/guides/INSTALLING.md
index 575c33f..7965e14 100644
--- a/guides/INSTALLING.md
+++ b/guides/INSTALLING.md
@@ -8,41 +8,39 @@ As listed step-by-step in the [Getting Started](https://github.com/ONSdigital/dp

In the below, the installation of each app is typically one of:

-* use the `brew` command where provided, or
-* use the link to the website to follow the installation instructions, or
-* follow the link to the Github repo, where you should clone the repo and follow the instructions in the `README.md` file to install/run (within the repo directory)
+- use the `brew` command where provided, or
+- use the link to the website to follow the installation instructions, or
+- follow the link to the
GitHub repo, where you should clone the repo and follow the instructions in the `README.md` file to install/run (within the repo directory)

**Note:** when indicating a command that should be run in your terminal, we use the `$` prefix to indicate your shell prompt.

--------------

-Software | Install | Notes
--------- | ------- | ----- |
-[Java 8 JDK (OpenJDK)](https://openjdk.java.net/install/) | `$ brew install openjdk@8` | Append `export PATH="/usr/local/opt/openjdk@8/bin:$PATH"` to your shell profile (eg. `.zshrc`) and restart your terminal.
-[Maven](https://maven.apache.org/) | `$ brew install maven`
-[Docker](https://www.docker.com/get-started) | `$ brew install --cask docker`
- Docker Compose | `$ brew install docker-compose`
-[Cypher Shell](https://neo4j.com/docs/operations-manual/current/tools/cypher-shell/) | `$ brew install cypher-shell` | deprecated (not needed if using Neptune over Neo4j)
-[nvm](https://github.com/nvm-sh/nvm) | Follow the [git install instructions](https://github.com/nvm-sh/nvm?tab=readme-ov-file#git-install) | Required to allow easy switching between node/npm versions depending on usage within app
-[Go](https://go.dev/dl/) | `$ brew install go` | The Go installation is processor architecture specific. For the newer Apple M1 processor the ARM installation is required. This is managed by Homebrew. However, if installing manually this is something to be aware of. [Go direct Download](https://go.dev/dl/) |
-[GoConvey](https://github.com/smartystreets/goconvey#installation) |
-[GhostScript](https://www.ghostscript.com/download.html) | `$ brew install ghostscript` | Required for [Babbage](https://github.com/onsdigital/babbage)
-[Vault](https://www.vaultproject.io/intro/getting-started/install.html) | `$ brew install hashicorp/tap/vault` | Required for running Florence.
-[jq](https://stedolan.github.io/jq/) | `$ brew install jq` | A handy JSON tool (for debugging website content and much more)
-[yq](https://github.com/mikefarah/yq) | `$ brew install yq` | A handy YAML tool
-[dp-compose](https://github.com/ONSdigital/dp-compose) | `$ git clone git@github.com:ONSdigital/dp-compose` | See [`dp-compose` README](https://github.com/ONSdigital/dp-compose#dp-compose) for configuration of Docker Desktop resources
-
-[dp-compose](https://github.com/ONSdigital/dp-compose) runs the following services:
-
-* Services for the Website
-  * Elasticsearch 7 (on non-standard port)
-  * Highcharts
-  * Postgres
-  * MongoDB
-  * Kafka (plus required Zookeeper dependency)
-* Services for CMD
-  * Elasticsearch 6 (on non-standard port)
-  * [Neptune](https://github.com/ONSdigital/dp/blob/main/guides/NEPTUNE.md#migrating-from-neo4j-to-aws-neptune))
+| Software                                                            | Install                                                                                               | Notes                                                                                                                          |
+|--------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------|
+| [Java 8 JDK (OpenJDK)](https://openjdk.java.net/install/)           | `$ brew install openjdk@8`                                                                            | Append `export PATH="/usr/local/opt/openjdk@8/bin:$PATH"` to your shell profile (e.g. `.zshrc`) and restart your terminal.
| +| [Maven](https://maven.apache.org/) | `$ brew install maven` | | +| [Docker](https://www.docker.com/get-started) | `$ brew install --cask docker` | | +| Docker Compose | `$ brew install docker-compose` | | +| [nvm](https://github.com/nvm-sh/nvm) | Follow the [git install instructions](https://github.com/nvm-sh/nvm?tab=readme-ov-file#git-install) | | +| [Go](https://go.dev/dl/) | `$ brew install go` | The Go installation is processor architecture specific.[^1] | +| [GoConvey](https://github.com/smartystreets/goconvey#installation) | | | +| [jq](https://stedolan.github.io/jq/) | `$ brew install jq` | A handy JSON tool (for debugging website content and much more) | +| [yq](https://github.com/mikefarah/yq) | `$ brew install yq` | A handy YAML tool | +| [dp-compose](https://github.com/ONSdigital/dp-compose) | `$ git clone git@github.com:ONSdigital/dp-compose` | See [`dp-compose` README](https://github.com/ONSdigital/dp-compose#dp-compose) for configuration of Docker Desktop resources | + +[^1]: For the newer Apple M1 processor the ARM installation is required. This is managed by Homebrew. However, if installing manually this is something to be aware of. [Go direct Download](https://go.dev/dl/) + +[dp-compose](https://github.com/ONSdigital/dp-compose/tree/main/v2/stacks/v1-compat) runs the following services: + +- Services for the Website + - Elasticsearch 7 (on non-standard port) + - Highcharts + - MongoDB + - Kafka (plus required Zookeeper dependency) + - Mathjax +- Services for CMD + - Elasticsearch 6 (on non-standard port) Return to the [Getting Started](https://github.com/ONSdigital/dp/blob/main/guides/GETTING_STARTED.md) guide for next steps. @@ -52,22 +50,22 @@ Return to the [Getting Started](https://github.com/ONSdigital/dp/blob/main/guide Clone the GitHub repos for [web](#web-journey), [publishing](#publishing-journey) and/or [CMD](#cmd-journeys) (Customise My Data). 
-* [Web](#web-journey) - These apps make up the public-facing website providing **read-only access** to published content, and will be enough strictly to work on website content types other than filterable datasets (e.g. bulletins, articles, timeseries, datasets).
+- [Web](#web-journey) - These apps make up the public-facing website providing **read-only access** to published content, and will be enough to work strictly on website content types other than filterable datasets (e.g. bulletins, articles, timeseries, datasets).

-* [Publishing](#publishing-journey) - The "publishing journey" gives you all the features of web together with an internal interface to update, preview and publish content. All content is encrypted and requires authentication.
+- [Publishing](#publishing-journey) - The "publishing journey" gives you all the features of web together with an internal interface to update, preview and publish content. All content is encrypted and requires authentication.

-* [CMD](#cmd-journeys) - apps will support the filterable dataset journey, and would mean you have every possible service running.
+- [CMD](#cmd-journeys) - These apps support the filterable dataset journey; running them means you have every possible service running.
### Web Journey -* [babbage](https://github.com/ONSdigital/babbage) -* [zebedee](https://github.com/ONSdigital/zebedee) -* [sixteens](https://github.com/ONSdigital/sixteens) -* [dp-frontend-router](https://github.com/ONSdigital/dp-frontend-router) -* [dp-frontend-homepage-controller](https://github.com/ONSdigital/dp-frontend-homepage-controller) -* [dp-frontend-cookie-controller](https://github.com/ONSdigital/dp-frontend-cookie-controller) -* [dp-frontend-dataset-controller](https://github.com/ONSdigital/dp-frontend-dataset-controller) -* [dp-frontend-feedback-controller](https://github.com/ONSdigital/dp-frontend-feedback-controller) +- [babbage](https://github.com/ONSdigital/babbage) +- [zebedee](https://github.com/ONSdigital/zebedee) +- [sixteens](https://github.com/ONSdigital/sixteens) +- [dp-frontend-router](https://github.com/ONSdigital/dp-frontend-router) +- [dp-frontend-homepage-controller](https://github.com/ONSdigital/dp-frontend-homepage-controller) +- [dp-frontend-cookie-controller](https://github.com/ONSdigital/dp-frontend-cookie-controller) +- [dp-frontend-dataset-controller](https://github.com/ONSdigital/dp-frontend-dataset-controller) +- [dp-frontend-feedback-controller](https://github.com/ONSdigital/dp-frontend-feedback-controller) ```bash git clone git@github.com:ONSdigital/babbage @@ -87,13 +85,13 @@ Clone the GitHub repos for [web](#web-journey), [publishing](#publishing-journey All services listed in the [web journey](#web-journey) are required for the publishing journey. They are used for the preview functionality. 
-* [florence](https://github.com/ONSdigital/florence) -* [The-Train](https://github.com/ONSdigital/The-Train) -* [dp-api-router](https://github.com/ONSdigital/dp-api-router) -* [dp-image-api](https://github.com/ONSdigital/dp-image-api) -* [dp-image-importer](https://github.com/ONSdigital/dp-image-importer) -* [dp-upload-service](https://github.com/ONSdigital/dp-upload-service) -* [dp-download-service](https://github.com/ONSdigital/dp-download-service) +- [florence](https://github.com/ONSdigital/florence) +- [The-Train](https://github.com/ONSdigital/The-Train) +- [dp-api-router](https://github.com/ONSdigital/dp-api-router) +- [dp-image-api](https://github.com/ONSdigital/dp-image-api) +- [dp-image-importer](https://github.com/ONSdigital/dp-image-importer) +- [dp-upload-service](https://github.com/ONSdigital/dp-upload-service) +- [dp-download-service](https://github.com/ONSdigital/dp-download-service) ```bash git clone git@github.com:ONSdigital/florence @@ -109,30 +107,30 @@ All services listed in the [web journey](#web-journey) are required for the publ All the services in the [web] and [publishing] journeys, as well as: -#### Dataset journey: +#### Dataset journey -* [dp-dataset-api](https://github.com/ONSdigital/dp-dataset-api) -* [dp-frontend-filter-dataset-controller](https://github.com/ONSdigital/dp-frontend-filter-dataset-controller) +- [dp-dataset-api](https://github.com/ONSdigital/dp-dataset-api) +- [dp-frontend-filter-dataset-controller](https://github.com/ONSdigital/dp-frontend-filter-dataset-controller) ```bash git clone git@github.com:ONSdigital/dp-dataset-api git clone git@github.com:ONSdigital/dp-frontend-filter-dataset-controller ``` -#### Import services: - -* [dp-recipe-api](https://github.com/ONSdigital/dp-recipe-api) -* [dp-import-api](https://github.com/ONSdigital/dp-import-api) -* [dp-upload-service](https://github.com/ONSdigital/dp-upload-service) -* [dp-import-tracker](https://github.com/ONSdigital/dp-import-tracker) -* 
[dp-dimension-extractor](https://github.com/ONSdigital/dp-dimension-extractor)
-* [dp-dimension-importer](https://github.com/ONSdigital/dp-dimension-importer)
-* [dp-observation-extractor](https://github.com/ONSdigital/dp-observation-extractor)
-* [dp-observation-importer](https://github.com/ONSdigital/dp-observation-importer)
-* [dp-hierarchy-builder](https://github.com/ONSdigital/dp-hierarchy-builder)
-* [dp-hierarchy-api](https://github.com/ONSdigital/dp-hierarchy-api)
-* [dp-dimension-search-builder](https://github.com/ONSdigital/dp-dimension-search-builder)
-* [dp-publishing-dataset-controller](https://github.com/ONSdigital/dp-publishing-dataset-controller)
+#### Import services
+
+- [dp-recipe-api](https://github.com/ONSdigital/dp-recipe-api)
+- [dp-import-api](https://github.com/ONSdigital/dp-import-api)
+- [dp-upload-service](https://github.com/ONSdigital/dp-upload-service)
+- [dp-import-tracker](https://github.com/ONSdigital/dp-import-tracker)
+- [dp-dimension-extractor](https://github.com/ONSdigital/dp-dimension-extractor)
+- [dp-dimension-importer](https://github.com/ONSdigital/dp-dimension-importer)
+- [dp-observation-extractor](https://github.com/ONSdigital/dp-observation-extractor)
+- [dp-observation-importer](https://github.com/ONSdigital/dp-observation-importer)
+- [dp-hierarchy-builder](https://github.com/ONSdigital/dp-hierarchy-builder)
+- [dp-hierarchy-api](https://github.com/ONSdigital/dp-hierarchy-api)
+- [dp-dimension-search-builder](https://github.com/ONSdigital/dp-dimension-search-builder)
+- [dp-publishing-dataset-controller](https://github.com/ONSdigital/dp-publishing-dataset-controller)

```bash
git clone git@github.com:ONSdigital/dp-recipe-api
@@ -153,15 +151,16 @@ All the services in the [web] and [publishing] journeys, as well as:

[Sequence diagram of cmd import process](https://github.com/ONSdigital/dp-import/tree/main/docs/import-sequence#readme)

-#### Filter journey:
+#### Filter journey
+
If you have already set up the import
journey, you will have the Hierarchy API already. It's still fine to copy the command set below; just be aware that an error saying `destination path already exists` is expected.

-* [dp-dimension-search-api](https://github.com/ONSdigital/dp-dimension-search-api)
-* [dp-code-list-api](https://github.com/ONSdigital/dp-code-list-api)
-* [dp-hierarchy-api](https://github.com/ONSdigital/dp-hierarchy-api)
-* [dp-filter-api](https://github.com/ONSdigital/dp-filter-api)
-* [dp-dataset-exporter](https://github.com/ONSdigital/dp-dataset-exporter)
-* [dp-dataset-exporter-xlsx](https://github.com/ONSdigital/dp-dataset-exporter-xlsx)
+- [dp-dimension-search-api](https://github.com/ONSdigital/dp-dimension-search-api)
+- [dp-code-list-api](https://github.com/ONSdigital/dp-code-list-api)
+- [dp-hierarchy-api](https://github.com/ONSdigital/dp-hierarchy-api)
+- [dp-filter-api](https://github.com/ONSdigital/dp-filter-api)
+- [dp-dataset-exporter](https://github.com/ONSdigital/dp-dataset-exporter)
+- [dp-dataset-exporter-xlsx](https://github.com/ONSdigital/dp-dataset-exporter-xlsx)

```bash
git clone git@github.com:ONSdigital/dp-dimension-search-api
@@ -190,32 +189,32 @@ Both of these stacks rely on variations of an `scs.sh` script, which provides su

Some commands require changes to be made to your shell - e.g.
-* to your `PATH` or -* to add environment variables - these commands take the form `export VAR_NAME=value` +- to your `PATH` or +- to add environment variables - these commands take the form `export VAR_NAME=value` and should be appended to the startup file for your shell: -* for the shell `zsh`, the startup file is `~/.zshrc` -* for the `bash` shell, the startup file is `~/.bashrc` +- for the shell `zsh`, the startup file is `~/.zshrc` +- for the `bash` shell, the startup file is `~/.bashrc` When the startup files are updated, to load the new changes into your shell, either: -* open a new terminal window, or -* `$ exec $SHELL -l` +- open a new terminal window, or +- `$ exec $SHELL -l` ### Environment variables You should put the below _env vars_ in your [startup file](#startup-file). -Variable name | note ---- | --- -`zebedee_root` | path to your zebedee content, typically the directory the [dp-zebedee-content](https://github.com/ONSdigital/dp-zebedee-content) generation script points to when run -`ENABLE_PRIVATE_ENDPOINTS` | set `true` when running services in publishing, unset for web mode -`ENABLE_PERMISSIONS_AUTH` | set `true` to ensure that calls to APIs are from registered services or users -`ENCRYPTION_DISABLED` | set `true` to disable encryption, making data readable for any debugging purposes -`DATASET_ROUTES_ENABLED` | `true` will enable the filterable dataset routes (the CMD journey) in some services -`FORMAT_LOGGING` | if `true` then `zebedee` will format its logs -`SERVICE_AUTH_TOKEN` | a value for `zebedee` to work +| Variable name | note | +|----------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| `zebedee_root` | path to your zebedee content, typically the directory the [dp-zebedee-content](https://github.com/ONSdigital/dp-zebedee-content) generation script points to when run | +| 
`ENABLE_PRIVATE_ENDPOINTS` | set `true` when running services in publishing, unset for web mode | +| `ENABLE_PERMISSIONS_AUTH` | set `true` to ensure that calls to APIs are from registered services or users | +| `ENCRYPTION_DISABLED` | set `true` to disable encryption, making data readable for any debugging purposes | +| `DATASET_ROUTES_ENABLED` | `true` will enable the filterable dataset routes (the CMD journey) in some services | +| `FORMAT_LOGGING` | if `true` then `zebedee` will format its logs | +| `SERVICE_AUTH_TOKEN` | a value for `zebedee` to work | After all the various steps, here's an example set of exports and their values that you might now have in your [startup file](#startup-file): @@ -237,7 +236,6 @@ export PUBLISHING_THREAD_POOL_SIZE=10 export GRAPH_DRIVER_TYPE=neptune export GRAPH_ADDR=wss://localhost:8182/gremlin export NEPTUNE_TLS_SKIP_VERIFY=true - ``` Return to the [Getting Started](https://github.com/ONSdigital/dp/blob/main/guides/GETTING_STARTED.md) guide for next steps. @@ -246,7 +244,7 @@ Return to the [Getting Started](https://github.com/ONSdigital/dp/blob/main/guide ## Running the apps -Run [dp-compose](https://github.com/ONSdigital/dp-compose) using the `$ ./run.sh` command (in the dp-compose repo) to run the supporting services. As well as Vault, e.g. `$ vault server -dev`. +Run [dp-compose v1-compat stack](https://github.com/ONSdigital/dp-compose/blob/main/v2/stacks/v1-compat) using the `$ make up` command (in the dp-compose repo) to run the supporting services. 
Most applications can be run using the `$ make debug` command, but deviations are all documented below:

@@ -254,36 +252,36 @@ Most applications can be run using the `$ make debug` command, but deviations ar

Run all the services in the [web journey](#web-journey)

-* [babbage](https://github.com/ONSdigital/babbage) - use: `$ ./run.sh`
-* [zebedee](https://github.com/ONSdigital/zebedee) - use: `$ ./run-reader.sh`
-* [sixteens](https://github.com/ONSdigital/sixteens) - use: `$ ./run.sh`
-* [dp-frontend-router](https://github.com/ONSdigital/dp-frontend-router)
-* [dp-frontend-homepage-controller](https://github.com/ONSdigital/dp-frontend-homepage-controller)
-* [dp-frontend-cookie-controller](https://github.com/ONSdigital/dp-frontend-cookie-controller)
-* [dp-frontend-dataset-controller](https://github.com/ONSdigital/dp-frontend-dataset-controller)
-* [dp-frontend-feedback-controller](https://github.com/ONSdigital/dp-frontend-feedback-controller)
+- [babbage](https://github.com/ONSdigital/babbage) - use: `$ ./run.sh`
+- [zebedee](https://github.com/ONSdigital/zebedee) - use: `$ ./run-reader.sh`
+- [sixteens](https://github.com/ONSdigital/sixteens) - use: `$ ./run.sh`
+- [dp-frontend-router](https://github.com/ONSdigital/dp-frontend-router)
+- [dp-frontend-homepage-controller](https://github.com/ONSdigital/dp-frontend-homepage-controller)
+- [dp-frontend-cookie-controller](https://github.com/ONSdigital/dp-frontend-cookie-controller)
+- [dp-frontend-dataset-controller](https://github.com/ONSdigital/dp-frontend-dataset-controller)
+- [dp-frontend-feedback-controller](https://github.com/ONSdigital/dp-frontend-feedback-controller)

-The website will be available at http://localhost:20000
+The website will be available at <http://localhost:20000>

### Publishing

Run all of the services in the [web journey](#web-journey), **but** change the commands used to run babbage and zebedee to:

-* [babbage](https://github.com/ONSdigital/babbage) - use: `$ ./run-publishing.sh`
-* 
[zebedee](https://github.com/ONSdigital/zebedee) - use: `$ ./run.sh` +- [babbage](https://github.com/ONSdigital/babbage) - use: `$ ./run-publishing.sh` +- [zebedee](https://github.com/ONSdigital/zebedee) - use: `$ ./run.sh` and also run the following: -* [florence](https://github.com/ONSdigital/florence) - use: `$ make debug ENCRYPTION_DISABLED=true` -* [The-Train](https://github.com/ONSdigital/The-Train) - use: `$ ./run.sh` -* [dp-api-router](https://github.com/ONSdigital/dp-api-router) +- [florence](https://github.com/ONSdigital/florence) - use: `$ make debug ENCRYPTION_DISABLED=true` +- [The-Train](https://github.com/ONSdigital/The-Train) - use: `$ ./run.sh` +- [dp-api-router](https://github.com/ONSdigital/dp-api-router) If you also want to run Florence with the ability to edit images on the homepage (for the Featured Content section), you will need to additionally run: -* [dp-image-api](https://github.com/ONSdigital/dp-image-api) -* [dp-image-importer](https://github.com/ONSdigital/dp-image-importer) - use: `make debug ENCRYPTION_DISABLED=true` -* [dp-upload-service](https://github.com/ONSdigital/dp-upload-service) - use `make debug ENCRYPTION_DISABLED=true` -* [dp-download-service](https://github.com/ONSdigital/dp-download-service) - use: `make debug ENCRYPTION_DISABLED=true` +- [dp-image-api](https://github.com/ONSdigital/dp-image-api) +- [dp-image-importer](https://github.com/ONSdigital/dp-image-importer) - use: `make debug ENCRYPTION_DISABLED=true` +- [dp-upload-service](https://github.com/ONSdigital/dp-upload-service) - use `make debug ENCRYPTION_DISABLED=true` +- [dp-download-service](https://github.com/ONSdigital/dp-download-service) - use: `make debug ENCRYPTION_DISABLED=true` Florence will be available at [http://localhost:8081/florence/login](http://localhost:8081/florence/login). 
@@ -293,23 +291,23 @@ The website will be available at [http://localhost:8081](http://localhost:8081)

All of the services in the [web](#web-journey), [publishing](#publishing-journey) and [CMD](#cmd-journeys) journeys need to be run for the full CMD journey to work. This journey includes importing data, publishing it and testing the public journey.

-> You will want to make sure you have access to the Neptune test instance as well, if you want the entire CMD journey to be accessible. Details on how to set this up can be found [here](https://github.com/ONSdigital/dp/blob/main/guides/NEPTUNE.md).
+> You will want to make sure you have access to the Neptune test instance as well, if you want the entire CMD journey to be accessible. Details on how to set this up can be found in [our Neptune guide](https://github.com/ONSdigital/dp/blob/main/guides/NEPTUNE.md).

Use the following alternative commands:

-* [florence](https://github.com/ONSdigital/florence) - use: `$ make debug ENCRYPTION_DISABLED=true`
-* [dp-frontend-router](https://github.com/ONSdigital/dp-frontend-router) - use: `$ make debug DATASET_ROUTES_ENABLED=true`
-* For every service in [dataset](#dataset-journey) and [filter](#filter-journey)- use: `make debug ENABLE_PRIVATE_ENDPOINTS=true`
-* [dp-dimension-extractor](https://github.com/ONSdigital/dp-dimension-extractor) - use: `$ make debug ENCRYPTION_DISABLED=true`
-* [dp-observation-extractor](https://github.com/ONSdigital/dp-observation-extractor) - use `$ make debug ENCRYPTION_DISABLED=true`
+- [florence](https://github.com/ONSdigital/florence) - use: `$ make debug ENCRYPTION_DISABLED=true`
+- [dp-frontend-router](https://github.com/ONSdigital/dp-frontend-router) - use: `$ make debug DATASET_ROUTES_ENABLED=true`
+- For every service in [dataset](#dataset-journey) and [filter](#filter-journey) - use: `make debug ENABLE_PRIVATE_ENDPOINTS=true`
+- [dp-dimension-extractor](https://github.com/ONSdigital/dp-dimension-extractor) - use: `$ make debug 
ENCRYPTION_DISABLED=true`
+- [dp-observation-extractor](https://github.com/ONSdigital/dp-observation-extractor) - use: `$ make debug ENCRYPTION_DISABLED=true`

#### CMD Web

If you already have content, and you just want to run the web journey, you'll need the [dataset](#dataset-journey), [filter](#filter-journey) and [web](#web-journey) services. Again, use the commands:

-* [florence](https://github.com/ONSdigital/florence) - use: `$ make debug ENCRYPTION_DISABLED=true`
-* [dp-frontend-router](https://github.com/ONSdigital/dp-frontend-router) - use: `$ make debug`
-* unset `ENABLE_PRIVATE_ENDPOINTS`
+- [florence](https://github.com/ONSdigital/florence) - use: `$ make debug ENCRYPTION_DISABLED=true`
+- [dp-frontend-router](https://github.com/ONSdigital/dp-frontend-router) - use: `$ make debug`
+- unset `ENABLE_PRIVATE_ENDPOINTS`

Return to the [Getting Started](https://github.com/ONSdigital/dp/blob/main/guides/GETTING_STARTED.md) guide for next steps.
From cfe304f67d6cd6a75684a2a2242d5b85da73987f Mon Sep 17 00:00:00 2001
From: lindenmckenzie
Date: Wed, 29 Apr 2026 18:33:23 +0100
Subject: [PATCH 4/5] Remove trailing whitespace

---
 guides/RELEASES.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/guides/RELEASES.md b/guides/RELEASES.md
index 42840e5..52b0afc 100644
--- a/guides/RELEASES.md
+++ b/guides/RELEASES.md
@@ -3,7 +3,7 @@

This is an overview of the sequence of steps that our tickets (sometimes called
*tasks* or *stories*) take from initial idea to completion.

-We currently have two different processes in place as we are transitioning between them. 
+We currently have two different processes in place as we are transitioning between them.
## Old process

From f1eda2b2d7a1ab1b2aa832fb5d88f1e346e0655f Mon Sep 17 00:00:00 2001
From: lindenmckenzie
Date: Wed, 29 Apr 2026 18:39:52 +0100
Subject: [PATCH 5/5] Reformat OT instrumentation guide

---
 guides/OT_INSTRUMENTATION.md | 113 +++++++++++++++++++----------------
 1 file changed, 61 insertions(+), 52 deletions(-)

diff --git a/guides/OT_INSTRUMENTATION.md b/guides/OT_INSTRUMENTATION.md
index 97449e1..b932318 100644
--- a/guides/OT_INSTRUMENTATION.md
+++ b/guides/OT_INSTRUMENTATION.md
@@ -1,7 +1,7 @@
-Instrumenting ONS services for Open Telemetry
-===============
+# Instrumenting ONS services for Open Telemetry
+
+## Instrumenting Java services for OT

-# Instrumenting Java services for OT
These lines are to be added in the Dockerfile(s), and to any run.sh scripts that are used locally:

`-javaagent:target/dependency/aws-opentelemetry-agent-1.31.0.jar \`

`-Dotel.propagators=tracecontext,baggage \`

For instance:
+
+```sh
ENTRYPOINT java $JAVA_OPTS \
 -Drestolino.realm=$REALM \
 -Drestolino.files=$RESTOLINO_STATIC \
 -javaagent:target/dependency/aws-opentelemetry-agent-1.31.0.jar \
 -Dotel.propagators=tracecontext,baggage \
 ...
 com.github.davidcarboni.restolino.Main
```

The following entry needs to be added to the pom:
+
+```xml
<dependency>
    <groupId>software.amazon.opentelemetry</groupId>
    <artifactId>aws-opentelemetry-agent</artifactId>
    <version>1.31.0</version>
</dependency>
@@ -32,7 +33,6 @@
```

The following environment variables need to be set on the instance:

`OTEL_SERVICE_NAME=`
@@ -43,9 +43,9 @@
The URL specified has an identifier of http. In fact the protocol in use is GRPC (OTLP
on GRPC). However, the Java URL parsing lib doesn't recognise 'grpc://' as a valid protocol, so the configuration requires it to be specified as above.
- Spans can be created around individual calls within the code as follows: -``` + +```java import io.opentelemetry.api.GlobalOpenTelemetry; import io.opentelemetry.api.trace.Span; import io.opentelemetry.context.Scope; @@ -67,41 +67,43 @@ try (Scope scope = span.makeCurrent()) { } ``` - (NB - using the tracer pulled from the global scope this way works, but it's preferable to instantiate the tracer in the init code for the service then inject it into your method. See here for full guidance: [Open Telemetry Docs](https://opentelemetry.io/docs/instrumentation/java/manual/#:~:text=To%20create%20Spans%2C%20you%20only,set%20by%20the%20OpenTelemetry%20SDK.&text=It's%20required%20to%20call%20end,you%20want%20it%20to%20end)). - -## Logging Implementation +### Logging Implementation The existing logging library has been modified to extract the traceId from the traceparent header (if it exists) and add it to the TraceId section of the log message. This will enable log entries to be correlated with trace ids which will allow engineers to zero in on problems quickly and accurately. - The original logging library was modified in order to manage the change centrally and avoid the need for code changes across multiple applications. 
+## Instrumenting Go services for OT

The following environment variables need to be created:
+
+```go
OTServiceName          string        `envconfig:"OTEL_SERVICE_NAME"`
OTExporterOTLPEndpoint string        `envconfig:"OTEL_EXPORTER_OTLP_ENDPOINT"`
OTBatchTimeout         time.Duration `envconfig:"OTEL_BATCH_TIMEOUT"`
```
+
These can then be set in the config:
+
+```go
cfg = &Config{
	OTExporterOTLPEndpoint: "localhost:4317",
	OTServiceName:          "service-name",
	OTBatchTimeout:         5 * time.Second,
}
```
+
Note that the exporter endpoint is `<host>:<port>`; unlike the Java configuration, there is no protocol identifier.

Import the shared init library for Go, dp-otel-go:
`import "github.com/ONSdigital/dp-otel-go"`

From the init code of the service, initialise the otel services:
+
+```go
//Set up OpenTelemetry

cfg, err := config.Get()

...

defer func() {
	err = errors.Join(err, otelShutdown(context.Background()))
}()
```
+
NB: if this isn't done, any calls to the otel service will fail silently. If you find that traces are not coming through, ensure this code is getting called.

### Instrumenting http handlers

There is a wide range of different facilities for instrumenting http calls. The simplest (taken here from dp-search-api) simply creates a new opentelemetry handler to pass to the server and attaches otelmux middleware to the router:

+```go
import "go.opentelemetry.io/contrib/instrumentation/github.com/gorilla/mux/otelmux"
import "go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp"
...
@@ -139,10 +143,9 @@ router.Use(otelmux.Middleware(cfg.OTServiceName)) server := serviceList.GetHTTPServer(cfg.BindAddr, otelHandler) ``` - Where gorillamux or Chi is not being used for the router, it may be necessary to instrument individual routes as follows: -``` +```go func routes(router *mux.Router, hc *healthcheck.HealthCheck) *RendererAPI { api := RendererAPI{router: router} @@ -160,28 +163,31 @@ func routes(router *mux.Router, hc *healthcheck.HealthCheck) *RendererAPI { ``` Where the server is configured with an api field: -``` + +```go return &Service{ - api: searchAPI, + api: searchAPI, } ``` -Handlers can be wrapped as below: -``` + +Handlers can be wrapped as below: + +```go func (a *SearchAPI) RegisterGetSearch(...) *SearchAPI { - a.Router.Handle( - "/search", - otelhttp.NewHandler( - SearchHandlerFunc( + a.Router.Handle( + "/search", + otelhttp.NewHandler( + SearchHandlerFunc( ... - ), "/search"), - ).Methods(http.MethodGet) - return a + ), "/search"), + ).Methods(http.MethodGet) + return a } ``` - The following shows an alternative way to instrument: -``` + +```go func CreateRendererAPI(ctx context.Context, bindAddr string, allowedOrigins string, errorChan chan error, hc *healthcheck.HealthCheck) { router := mux.NewRouter() routes(router, hc) @@ -201,23 +207,25 @@ func CreateRendererAPI(ctx context.Context, bindAddr string, allowedOrigins stri } ``` - A purely middleware approach can also be taken where Alice is already in place chaining middleware. Both otelmux and otelhttp are used here to capture all requests with sufficient detail. Here you can see an example instrumentation: -``` + +```go func New(cfg Config) http.Handler { router := mux.NewRouter() - router.Use(otelmux.Middleware(cfg.OTServiceName)) - middleware := []alice.Constructor{ - otelhttp.NewMiddleware(cfg.OTServiceName), + router.Use(otelmux.Middleware(cfg.OTServiceName)) + middleware := []alice.Constructor{ + otelhttp.NewMiddleware(cfg.OTServiceName), ... 
- } - newAlice := alice.New(middleware...).Then(router) + } + newAlice := alice.New(middleware...).Then(router) } ``` -## Instrumenting http calls +### Instrumenting http calls + Outgoing service calls need to be instrumented to include the traceparent header when the handler itself is not instrumented. This can be done as follows: -``` + +```go import ("go.opentelemetry.io/otel" "go.opentelemetry.io/otel/propagation") ... @@ -225,11 +233,11 @@ import ("go.opentelemetry.io/otel" otel.GetTextMapPropagator().Inject(req.Context(), propagation.HeaderCarrier(req.Header)) ``` - -## Manually adding spans: +### Manually adding spans Similarly to the Java approach, you can create a span manually as follows: -``` + +```go import ("go.opentelemetry.io/otel") ... ... @@ -239,18 +247,19 @@ ctx, span := tracer.Start(r.Context(), "table render span") defer span.End() ``` -## Mongo Instrumentation: +### Mongo Instrumentation + The `dp-mongodb` package has been instrumented centrally as of version 3.7.0. This means that there is no need for additional instrumentation in services that import this package version or above. Make sure this version or above is imported! +### Kafka Instrumentation -## Kafka Instrumentation: The `dp-kafka` package has also be instrumented centerally as of version 4, this requires the context to be passed in order to work: -``` + +```go kafkaProducer.Channels().Output <- kafka.BytesMessage{Value: bytes, Context: ctx} ``` - -## Logging Implementation +### Logging Implementation - Go Go http request middleware was created to extract the traceId from the traceparent header and insert into the expected place in the request context (as controlled by the RequestIdKey in the github.com/ONSdigital/dp-net/v2/request package)