Slice 1: Add shared pipeline functions to qa-jenkins-library #589

@floatingman

Description

Parent PRD

#585

What to build

Add all shared pipeline functions to the qa-jenkins-library Jenkins shared library repository. This is the foundational slice that all other slices depend on. Everything goes into a single combined PR.

The functions fall into three categories:

1. Parameter and utility functions:

  • resolvePipelineParams() — Parses job name, resolves BRANCH/REPO/TIMEOUT with standard defaults. Eliminates the duplicated 10-line parameter resolution block across all pipelines.
  • standardDockerCleanup(containerNames, imageNames, volumeNames) — Runs docker stop/rm/rmi/volume rm. Eliminates the 15-line cleanup sequence duplicated in every pipeline.
  • standardCredentialLoader(targetEnv) — Loads the appropriate credential set (AWS, Azure, GCP, vSphere, Harvester, registry) based on target environment. Currently each pipeline has a 30-50 line withCredentials block.
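As a rough sketch, the job-parameter helper could live in vars/ like this (the key names, defaults, and fallback values shown here are illustrative assumptions, not confirmed by the PRD):

```groovy
// vars/resolvePipelineParams.groovy (hypothetical sketch)
/**
 * Parses the job name and resolves BRANCH/REPO/TIMEOUT, falling back to
 * standard defaults when a parameter is unset or blank.
 *
 * @return map with branch, repo, and timeout keys
 */
def call() {
    return [
        branch : params.BRANCH ?: 'main',          // default branch is an assumption
        repo   : params.REPO ?: env.DEFAULT_REPO,  // assumed env-provided default
        timeout: params.TIMEOUT ?: '6h',           // assumed default timeout
    ]
}
```

A consuming Jenkinsfile would then call `def p = resolvePipelineParams()` once at the top instead of repeating the 10-line resolution block.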

2. Infrastructure primitive functions:

  • airgap.standardCheckout(params) — Clones both tests and qa-infra-automation repos with parameterized branches. Eliminates the duplicated dual-repo checkout block.
  • airgap.teardownInfrastructure(params) — Tofu select workspace + destroy + delete workspace as a single unit. Currently this 3-step teardown is copy-pasted in 3 different airgap files with subtle variations.
  • airgap.configureAnsible(params) — Handles SSH key path configuration, inventory rendering, and variable substitution for Ansible playbooks.
  • airgap.deployRKE2(params) — Runs the RKE2 tarball deployment playbook with standard retry handling.
  • airgap.deployRancher(params) — Runs the Rancher helm deploy playbook conditionally.
  • s3.uploadArtifact(workspaceName, localPath, s3Key) — Uploads a file to S3 using the standard Docker+awscli pattern with path env:/${workspaceName}/${s3Key}.
  • s3.downloadArtifact(workspaceName, s3Key, localPath) — Downloads a file from S3 using the same pattern.
  • s3.deleteArtifact(workspaceName, s3Key) — Deletes a file from S3.
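The teardown primitive, for example, might look like the following sketch (parameter names are assumptions; the switch to the `default` workspace is needed because tofu cannot delete the currently selected workspace):

```groovy
// vars/airgap.groovy (hypothetical sketch; param keys are assumptions)
/**
 * Tears down a tofu-managed environment as a single unit:
 * select workspace -> destroy -> delete workspace.
 *
 * @param params map with workspace (workspace name) and configDir (tofu config path)
 */
def teardownInfrastructure(Map params) {
    dir(params.configDir) {
        sh "tofu workspace select ${params.workspace}"
        sh "tofu destroy -auto-approve"
        // must leave the workspace before it can be deleted
        sh "tofu workspace select default && tofu workspace delete ${params.workspace}"
    }
}
```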

3. Pipeline orchestration templates:

  • airgapInfraPipeline — Composes the primitives into a complete airgap infrastructure pipeline pattern (checkout, build image, configure SSH, tofu lifecycle, S3 upload, Ansible deploy).
  • airgapTestPipeline — Extends infra with Go test execution (gotestsum invocation, Qase reporting, cattle-config generation).
  • simpleTestPipeline — Shared parameters and stage flow for simple test runners (checkout, configure.sh, build.sh, gotestsum, report).
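With the templates in place, a consuming Jenkinsfile could collapse to something like this sketch (the `@Library` name matches the repo; the template's parameter names are assumptions):

```groovy
// Jenkinsfile of a consuming job (hypothetical sketch)
@Library('qa-jenkins-library') _

// simpleTestPipeline supplies the shared parameters and the
// checkout -> configure.sh -> build.sh -> gotestsum -> report stage flow.
simpleTestPipeline(
    repo   : params.REPO,    // parameter names here are illustrative
    branch : params.BRANCH,
    timeout: params.TIMEOUT,
)
```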

All functions should be written in Groovy following the Jenkins Shared Library vars/ convention, and each function should carry @param documentation.
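For the S3 helper, the vars/ layout and @param style might look like this sketch (the bucket env var, awscli image, and credential env vars are assumptions; the `env:/` prefix is the standard path pattern from above):

```groovy
// vars/s3.groovy (hypothetical sketch; bucket env var and image are assumptions)
/**
 * Uploads a file to S3 under the standard env:/<workspaceName>/<s3Key> path
 * using the Docker + awscli pattern.
 *
 * @param workspaceName tofu workspace the artifact belongs to
 * @param localPath path of the file inside the Jenkins workspace
 * @param s3Key key suffix under the workspace prefix
 */
def uploadArtifact(String workspaceName, String localPath, String s3Key) {
    sh """
        docker run --rm \\
            -e AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY -e AWS_DEFAULT_REGION \\
            -v "\$(pwd):/workspace" \\
            amazon/aws-cli s3 cp "/workspace/${localPath}" \\
            "s3://\${QA_BUCKET}/env:/${workspaceName}/${s3Key}"
    """
}
```

`downloadArtifact` and `deleteArtifact` would reuse the same wrapper with `s3 cp` reversed and `s3 rm`, respectively.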

Acceptance criteria

  • resolvePipelineParams parses job name and returns resolved BRANCH, REPO, TIMEOUT with correct defaults
  • standardDockerCleanup accepts container/image/volume name lists and runs appropriate docker cleanup commands
  • standardCredentialLoader returns the correct credential set for each supported target environment
  • airgap.standardCheckout clones both repos with parameterized branches into the correct directories
  • airgap.teardownInfrastructure performs the complete tofu select→destroy→deleteWorkspace sequence
  • airgap.configureAnsible handles SSH key paths and inventory rendering
  • airgap.deployRKE2 runs the tarball playbook with retry handling
  • airgap.deployRancher runs the helm deploy playbook
  • s3.uploadArtifact / s3.downloadArtifact / s3.deleteArtifact handle S3 operations with the standard path pattern
  • airgapInfraPipeline composes primitives into a complete infra pipeline pattern
  • airgapTestPipeline extends infra with Go test execution stages
  • simpleTestPipeline provides the standard test runner stage flow
  • All functions have @param documentation
  • PR submitted to qa-jenkins-library for review

Blocked by

None — can start immediately.

User stories addressed

  • User story 4 (S3 upload/download via shared function)
  • User story 5 (tofu lifecycle shared function)
  • User story 6 (Ansible variable configuration shared function)
  • User story 7 (checkout block as shared function)
  • User story 17 (resolvePipelineParams function)
  • User story 18 (standardDockerCleanup function)
  • User story 19 (standardCheckout function in qa-jenkins-library)
  • User story 20 (S3 artifact upload/download shared functions)

Metadata


Labels

enhancement (New feature or request), team/pit-crew (slack notifier for pit crew)
