Slice 1: Add shared pipeline functions to qa-jenkins-library #589
Labels: enhancement (New feature or request), team/pit-crew (slack notifier for pit crew)
Description
Parent PRD
What to build
Add all shared pipeline functions to the qa-jenkins-library Jenkins shared library repository. This is the foundational slice that all other slices depend on. Everything goes into a single combined PR.
The functions fall into three categories:
1. Parameter and utility functions:
- `resolvePipelineParams()` — parses the job name and resolves BRANCH/REPO/TIMEOUT with standard defaults. Eliminates the duplicated 10-line parameter-resolution block across all pipelines.
- `standardDockerCleanup(containerNames, imageNames, volumeNames)` — runs `docker stop`/`docker rm`/`docker rmi`/`docker volume rm`. Eliminates the 15-line cleanup sequence duplicated in every pipeline.
- `standardCredentialLoader(targetEnv)` — loads the appropriate credential set (AWS, Azure, GCP, vSphere, Harvester, registry) based on the target environment. Currently each pipeline has a 30-50 line `withCredentials` block.
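As a rough illustration, a `vars/resolvePipelineParams.groovy` entry might look like the sketch below. The job-name convention, default values, and override map are assumptions for illustration, not the final implementation:

```groovy
// vars/resolvePipelineParams.groovy -- illustrative sketch only.

/**
 * Resolves common pipeline parameters from the job name and build parameters.
 *
 * @param config optional map of overrides (branch, repo, timeout)
 * @return map with resolved BRANCH, REPO, and TIMEOUT values
 */
def call(Map config = [:]) {
    // Hypothetical convention: job names look like "<repo>-<branch>-airgap".
    def jobParts = env.JOB_NAME.tokenize('-')
    return [
        BRANCH : config.branch  ?: params.BRANCH  ?: 'main',
        REPO   : config.repo    ?: params.REPO    ?: jobParts[0],
        TIMEOUT: config.timeout ?: params.TIMEOUT ?: '120',
    ]
}
```

A consuming pipeline would then call `def p = resolvePipelineParams()` once at the top of the Jenkinsfile instead of repeating the 10-line resolution block.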
2. Infrastructure primitive functions:
- `airgap.standardCheckout(params)` — clones both the `tests` and `qa-infra-automation` repos with parameterized branches. Eliminates the duplicated dual-repo checkout block.
- `airgap.teardownInfrastructure(params)` — tofu select workspace + destroy + delete workspace as a single unit. Currently this 3-step teardown is copy-pasted in 3 different airgap files with subtle variations.
- `airgap.configureAnsible(params)` — handles SSH key path configuration, inventory rendering, and variable substitution for Ansible playbooks.
- `airgap.deployRKE2(params)` — runs the RKE2 tarball deployment playbook with standard retry handling.
- `airgap.deployRancher(params)` — runs the Rancher helm deploy playbook conditionally.
- `s3.uploadArtifact(workspaceName, localPath, s3Key)` — uploads a file to S3 using the standard Docker+awscli pattern with the path `env:/${workspaceName}/${s3Key}`.
- `s3.downloadArtifact(workspaceName, s3Key, localPath)` — downloads a file from S3 using the same pattern.
- `s3.deleteArtifact(workspaceName, s3Key)` — deletes a file from S3.
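For example, the Docker+awscli upload primitive could be sketched as below. The `amazon/aws-cli` image, the `S3_BUCKET` environment variable, and the credential plumbing are assumptions; the real pattern lives in the existing pipelines being deduplicated:

```groovy
// Sketch of s3.uploadArtifact -- image name and bucket variable are assumed.

/**
 * Uploads a local file to S3 under the standard workspace-scoped path.
 *
 * @param workspaceName tofu workspace name, used as the S3 path prefix
 * @param localPath     path to the file inside the Jenkins workspace
 * @param s3Key         key (relative path) under the workspace prefix
 */
def uploadArtifact(String workspaceName, String localPath, String s3Key) {
    sh """
        docker run --rm \
            -e AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY -e AWS_DEFAULT_REGION \
            -v "\$(pwd):/workspace" \
            amazon/aws-cli s3 cp "/workspace/${localPath}" \
            "s3://\${S3_BUCKET}/${workspaceName}/${s3Key}"
    """
}
```

`downloadArtifact` and `deleteArtifact` would follow the same shape with `s3 cp` reversed and `s3 rm` respectively, so all three share one path convention.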
3. Pipeline orchestration templates:
- `airgapInfraPipeline` — composes the primitives into a complete airgap infrastructure pipeline pattern (checkout, build image, configure SSH, tofu lifecycle, S3 upload, Ansible deploy).
- `airgapTestPipeline` — extends the infra pipeline with Go test execution (gotestsum invocation, Qase reporting, cattle-config generation).
- `simpleTestPipeline` — shared parameters and stage flow for simple test runners (checkout, configure.sh, build.sh, gotestsum, report).
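From a consuming pipeline's point of view, a template like `airgapInfraPipeline` would reduce a Jenkinsfile to a single parameterized call. The parameter names below are hypothetical placeholders, not the final API:

```groovy
// Jenkinsfile in a consuming pipeline repo -- sketch only; parameter
// names (testsBranch, infraBranch, targetEnv, timeoutMinutes) are assumed.
@Library('qa-jenkins-library') _

airgapInfraPipeline(
    testsBranch    : params.BRANCH ?: 'main',
    infraBranch    : params.INFRA_BRANCH ?: 'main',
    targetEnv      : 'aws',   // selects the credential set to load
    timeoutMinutes : 120,
)
```

This is the payoff of the slice: pipelines declare intent and parameters, while the stage flow lives once in the shared library.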
All functions should be written in Groovy following the Jenkins shared library `vars/` convention. Each function should have `@param` documentation.
Acceptance criteria
- `resolvePipelineParams` parses the job name and returns resolved BRANCH, REPO, TIMEOUT with correct defaults
- `standardDockerCleanup` accepts container/image/volume name lists and runs the appropriate docker cleanup commands
- `standardCredentialLoader` returns the correct credential set for each supported target environment
- `airgap.standardCheckout` clones both repos with parameterized branches into the correct directories
- `airgap.teardownInfrastructure` performs the complete tofu select → destroy → delete-workspace sequence
- `airgap.configureAnsible` handles SSH key paths and inventory rendering
- `airgap.deployRKE2` runs the tarball playbook with retry handling
- `airgap.deployRancher` runs the helm deploy playbook
- `s3.uploadArtifact` / `s3.downloadArtifact` / `s3.deleteArtifact` handle S3 operations with the standard path pattern
- `airgapInfraPipeline` composes primitives into a complete infra pipeline pattern
- `airgapTestPipeline` extends the infra pipeline with Go test execution stages
- `simpleTestPipeline` provides the standard test runner stage flow
- All functions have `@param` documentation
- PR submitted to qa-jenkins-library for review
Blocked by
None — can start immediately.
User stories addressed
- User story 4 (S3 upload/download via shared function)
- User story 5 (tofu lifecycle shared function)
- User story 6 (Ansible variable configuration shared function)
- User story 7 (checkout block as shared function)
- User story 17 (resolvePipelineParams function)
- User story 18 (standardDockerCleanup function)
- User story 19 (standardCheckout function in qa-jenkins-library)
- User story 20 (S3 artifact upload/download shared functions)