This repository provides an automated CI/CD solution for deploying IBM App Connect Enterprise (ACE) integration servers using Tekton pipelines on OpenShift/Kubernetes with Cloud Pak for Integration.
- Overview
- Prerequisites
- Repository Structure
- Workflow
- Installation
- AppConnect Build Pipeline
- AppConnect Deploy Pipeline
- Automation with Webhooks
- Running the Complete Pipeline
- Troubleshooting
- Useful Commands
This project demonstrates automated deployment of ACE integration servers using:
- Tekton Pipelines for CI/CD orchestration
- AppConnect Operator for deployment management
- OpenShift Container Registry for image storage
- Git webhooks for automated triggering
- Automated build and deployment of ACE integration servers
- Support for two build approaches:
- Build from ACE toolkit project (generates BAR file from source)
- Build from pre-existing ACE BAR file
- Minimal ACE image for efficient testing and building
- Unit testing integration with Maven
- Git webhook-based pipeline triggering
- OpenShift/CP4I integration with operational dashboard support
The pipeline supports two types of builds:
- Toolkit Project Build (default): Generates a BAR file from ACE project sources
- BAR File Build: Uses pre-existing BAR files from the `bars/` directory
The pipeline is triggered by Git events and executes the following:
- Clone the Git repository
- Build custom ACE image using Dockerfile
- Push image to OpenShift registry
- Deploy IntegrationServer CR using AppConnect Operator
To minimize image size for building and testing, a minimal ACE base image is built using a separate Tekton pipeline based on ace-docker.
Before setting up the pipeline, ensure you have:
- OpenShift/Kubernetes Cluster (v4.8+)
- AppConnect Operator - Creates and manages IntegrationServer custom resources
- Tekton Operator (OpenShift Pipelines Operator)
- Cloud Pak for Integration (CP4I) - For license management, common services, and operational dashboard
- For non-CP4I Kubernetes environments, modify the IntegrationServer CR in the deploy task
- Git Repository - For source code and webhook integration
The following images must be built before running the main pipeline:
- ACE Minimal Image - Base image for efficient building and testing
- ACE Maven Image - Built on top of minimal image, used for Java-based unit testing
- Cluster admin access or appropriate RBAC permissions
- OpenShift CLI (`oc`) installed and configured
- Access to the OpenShift internal registry
pipeline-ace/
├── assets/ # Tekton pipeline resources and configurations
│ ├── tekton/ # Main pipeline, task, and resource definitions
│ ├── sample/ # Sample configurations and examples
│ └── docker/ # Additional Dockerfiles
├── bars/ # Directory for pre-built BAR files
├── source/ # ACE application source code (toolkit projects)
├── Dockerfile # Main Dockerfile for building ACE server image
├── DockerFileMvn # Dockerfile for Maven-based unit testing
├── DockerFile_aceminimal # Dockerfile for minimal ACE image
└── imgcfg # Configuration file for image name, tag, and app settings
- assets/tekton: Contains all Tekton pipeline, task, and resource YAML files
- bars/: Place pre-built BAR files here for BAR-based deployments
- source/: Contains ACE application source code from toolkit workspace. Built by pipeline and packaged as BAR file for deployment
- Git Push: Developer pushes code changes to the Git repository
- Webhook Trigger: Git webhook triggers the Tekton pipeline
- Build Phase:
- Clone Git repository (main branch)
- Build ACE image using Dockerfile
- Image name and tag are read from the `imgcfg` file
- Push image to OpenShift registry
- Deploy Phase:
- Create IntegrationServer custom resource (CR)
- AppConnect Operator deploys the integration server
- Server is registered with operational dashboard (if enabled)
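For orientation, the IntegrationServer CR created in the deploy phase looks roughly like the sketch below. The name, namespace, image URL, and version shown are illustrative; the authoritative template lives in acecicd-task-deploy.yaml.

```yaml
apiVersion: appconnect.ibm.com/v1beta1
kind: IntegrationServer
metadata:
  name: ace-helloworld        # illustrative name
  namespace: ace              # illustrative namespace
spec:
  license:
    accept: true
    license: L-APEH-C49KZH    # CP4I non-production license (see Deploy Pipeline section)
    use: CloudPakForIntegrationNonProduction
  pod:
    containers:
      runtime:
        image: image-registry.openshift-image-registry.svc:5000/ace/ace-helloworld:1.1
  replicas: 1
  version: 12.0.2.0-r1        # must be fully qualified
```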
The Tekton pipeline is configured using Kubernetes resources in the assets/tekton folder.
Resources are installed on the OpenShift cluster using:
```
oc create -f <resource.yaml>
```
Or apply all resources at once:
```
oc apply -f assets/tekton/
```
The pipeline requires privileged access to create directories in the Git workspace:
```
oc adm policy add-scc-to-user privileged -n <yourNamespace> -z pipeline
```
The minimal image is built using the repository ace-docker. The Dockerfile is located at experimental/ace-minimal/Dockerfile.alpine in that repository.
A copy of the Dockerfile is provided in this repo as DockerFile_aceminimal.
- Create the build task:
  ```
  oc create -f assets/tekton/acecicd-task-build-aceminimal.yaml
  ```
- Create the pipeline:
  ```
  oc create -f assets/tekton/acecicd-pipeline-build-img-acemin.yaml
  ```
- Create the image resource (defines the OpenShift registry location):
  ```
  oc create -f assets/tekton/acecicd-res-image.yaml
  ```
- Run the pipeline using the OpenShift UI or CLI
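If you prefer the CLI over the UI, the Tekton CLI (`tkn`) can start a PipelineRun directly. The pipeline and resource names below are assumed to match the ones defined in the YAML files above; the resource binding key must match the name declared in the pipeline spec:

```
tkn pipeline start acecicd-pipeline-build-img-acemin \
  -r image-resource=acecicd-res-image \
  -s pipeline \
  --showlog
```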
Note: The pipeline uses an external Git repository. The ACE developer image URL is configured in the task. Ensure version synchronization with the external repo, as scripts may include ACE version in their paths.
Unit test projects are Java-based and require Maven for compilation. The image is built using DockerFileMvn and depends on the ACE minimal image built in the previous step.
The Maven binaries are downloaded from the Apache archive during the build.
Future Enhancement: Newer ACE versions support building unit test projects using ibmint command line (not yet implemented).
- Create the build task:
  ```
  oc create -f assets/tekton/acecicd-task-buildacemvn.yaml
  ```
- Create the pipeline:
  ```
  oc create -f assets/tekton/acecicd-pipeline-buildacemvn.yaml
  ```
- Create the Git resource:
  ```
  oc create -f assets/tekton/acecicd-res-git.yaml
  ```
- Create the image resource (if not already created):
  ```
  oc create -f assets/tekton/acecicd-res-image.yaml
  ```
- Run the pipeline using the OpenShift UI
This pipeline creates a custom ACE image containing an IntegrationServer configured with resources from the source folder. The image can then be deployed directly to the cluster.
The pipeline builds App Connect source from the source folder in the repository.
```
(pipeline) acecicd-pack-test-build-acesrv [acecicd-pipeline-ptb-acesrv.yaml]
  |-> (task) pack-test-build-aceserver [acecicd-task-ptb-acesrv.yaml]
        |-> (step) prepare-bar
        |-> (step) run-unittest
        |-> (step) build-push-acesrv
```
The pipeline requires Git and image resources to retrieve source code and push the built image.
Defined in acecicd-res-git.yaml - specifies the Git repository location.
Important: Clone this repository and add your own source code in the source directory. Use project folders from your toolkit workspace.
Action Required: Update the $(params.url) in the Git resource to point to your repository URL.
Defined in acecicd-res-image.yaml - specifies the registry location for pushing and pulling images.
Default value: image-registry.openshift-image-registry.svc:5000 (standard OpenShift registry URL)
Prepares the BAR file for unit testing and image creation. Uses the minimal ACE image.
Process:
- Clones the Git repository
- Uses `mqsipackagebar` to create the BAR file
- Applies property overrides using `mqsiapplybaroverride` (if a property file exists)
Property File: Place a properties file named <application_name>.properties in the application folder (e.g., HelloWorld/HelloWorld.properties). An example is provided in the HelloWorld app.
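As an illustration, an override file might look like the fragment below. The flow and node names are hypothetical; valid entries are whatever `mqsiapplybaroverride` accepts for your flows, in its `flow#node.property=value` format:

```
# HelloWorld/HelloWorld.properties (example overrides)
HelloWorld#HTTP Input.URLSpecifier=/hello
```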
Runs unit tests. The unit test project name is defined in the unittestprj variable in the imgcfg file.
Note: Unit test libraries have dependencies defined in the POM file. Some libraries are from the App Connect installation, which includes the AppConnect version in the path. If you change versions, you may need to update the POM file.
Builds the custom ACE image using the default Dockerfile.
Note: Issues have been encountered with Kaniko. Refer to the Troubleshooting section.
- Create the resources:
  ```
  oc create -f assets/tekton/acecicd-res-git.yaml
  oc create -f assets/tekton/acecicd-res-image.yaml
  ```
- Create the pipeline:
  ```
  oc create -f assets/tekton/acecicd-pipeline-ptb-acesrv.yaml
  ```
- Create the task:
  ```
  oc apply -f assets/tekton/acecicd-task-ptb-acesrv.yaml
  ```
- Run the pipeline from the OpenShift UI
This pipeline deploys the custom image built by the build pipeline.
Configuration is provided in acecicd-task-deploy.yaml. Required properties can be configured at the pipeline level.
You may need to change these according to your environment:
- Tracing: enabled by default, with the operational dashboard namespace set to `tracing`
- License: `L-APEH-C49KZH` (CP4I non-production for ACE 12.0.1-12.0.2); see the License annotations for details
The namespace, image name, and tag are configured from the imgcfg file.
- The IntegrationServer version must be fully qualified
- The base image is `ibmcom/ace-server:latest`, designed for use with the AppConnect Operator
- Create the deploy task:
  ```
  oc apply -f assets/tekton/acecicd-task-deploy.yaml
  ```
- Create the deploy pipeline:
  ```
  oc apply -f assets/tekton/acecicd-pipeline-deploy.yaml
  ```
- Create the image resource (if not already created):
  ```
  oc create -f assets/tekton/acecicd-res-image.yaml
  ```
- Start the pipeline using the OpenShift UI
Verify the IntegrationServer deployment:
```
oc get integrationserver
```
Expected output:
```
NAME               RESOLVEDVERSION   REPLICAS   AVAILABLEREPLICAS   CUSTOMIMAGES   STATUS   AGE
ace-helloword-10   12.0.2.0-r1       1          1                   true           Ready    10m
```
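In scripts, you can block until the server is ready with `oc wait`, assuming the operator surfaces a Ready condition on the CR (the name is a placeholder):

```
oc wait integrationserver/<name> --for=condition=Ready --timeout=300s -n <yourNamespace>
```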
The pipeline can be triggered automatically using a GitHub webhook. The trigger configuration is provided in trigger-template.yaml.
The following Kubernetes objects automate the deployment:
- EventListener (`acecicd-listener`): pod that receives GitHub webhooks and launches the pipeline
- TriggerTemplate (`acecicd-pipeline-trigger`): defines which pipeline to run by creating a PipelineRun with the name prefix `acecicd-pipeline-run-`
- Route (`acecicd-webhook`): exposes the EventListener via a public URL for GitHub webhook configuration
- Apply the trigger resources:
  ```
  oc apply -f assets/tekton/trigger-template.yaml
  ```
  Important: Update the namespace in the TriggerTemplate to match your environment.
- Get the webhook URL:
  ```
  oc get route acecicd-webhook
  ```
  The route exposes an HTTP URL that can be used to configure the GitHub webhook.
- Install the process-git pipeline:
  ```
  oc apply -f assets/tekton/acecicd-pipeline-process-git.yaml
  ```
- Install the get-config task:
  ```
  oc apply -f assets/tekton/acecicd-task-get-config.yaml
  ```
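To verify the listener before wiring up GitHub, you can POST a minimal push payload to the route yourself. The header names follow GitHub's webhook conventions; the payload below is a stripped-down sketch, and the clone URL is a placeholder:

```
WEBHOOK_HOST=$(oc get route acecicd-webhook -o jsonpath='{.spec.host}')
curl -X POST "http://${WEBHOOK_HOST}" \
  -H 'Content-Type: application/json' \
  -H 'X-GitHub-Event: push' \
  -d '{"ref":"refs/heads/main","repository":{"clone_url":"https://github.com/<your-org>/pipeline-ace.git"}}'
```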
The pipeline clones the Git repository and uses the get-config task to set environment variables from the imgcfg file.
- Clone the repository:
  ```
  git clone <your-repo-url>
  cd pipeline-ace
  ```
- Create the GitHub webhook:
  - Get the webhook URL: `oc get route acecicd-webhook`
  - Configure the webhook in your GitHub repository settings
- Add your integration server project:
  - Copy your integration server project directory (from the toolkit workspace) to the `source/` folder
  - Or use the provided example (`PingService`)
- Update the Dockerfile:
  - Replace `PingService` with your project directory name
- Configure the imgcfg file:
  - Set the desired image name and tag
  - Set the ACE application name
  - Set the unit test project name (if applicable)
- Push changes to Git:
  ```
  git add .
  git commit -m "Configure pipeline for my project"
  git push
  ```
- Pipeline executes automatically:
  - The webhook triggers the pipeline
  - Build and deploy processes run automatically
  - The ACE server registers with the dashboard (if enabled)
- Test the deployment:
  - Navigate to the REST API endpoint
  - Perform an HTTP GET request
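For example, the check might look like the following. The route name and API path are assumptions; substitute the route the operator created for your integration server and your flow's actual HTTP endpoint:

```
SERVER_HOST=$(oc get route <integration-server-route> -n <yourNamespace> -o jsonpath='{.spec.host}')
curl "http://${SERVER_HOST}/<your-api-path>"
```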
An alternative example using Buildah for image building is provided in assets/sample/tekton/.
- Uses `registry.redhat.io/rhel8/buildah` as the builder image
- Default Dockerfile is `DockerFile` (can be overridden with the DOCKERFILE parameter)
- Image URL format: `<image-pipeline-resource>/<namespace>/<imgcfg-name>:<imgcfg-version>`
- Image name and version are computed from the `imgcfg` file in the repository root
The task generates two results from the imgcfg file:
- `image_tag`: tag for the image
- `image_name`: name of the image
These results are used by subsequent tasks to compose the full image URL.
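The composition can be sketched in plain shell. The registry and namespace values below are assumptions matching the defaults elsewhere in this README; a task would source the repository's `imgcfg` the same way:

```shell
# Sketch: source an imgcfg-style file and compose the full image URL
# the way downstream tasks consume the image_name/image_tag results.
cat > /tmp/imgcfg <<'EOF'
export imgtag="1.1"
export imgname="ace-helloworld"
EOF

. /tmp/imgcfg   # sets imgtag and imgname in the current shell

REGISTRY="image-registry.openshift-image-registry.svc:5000"  # default OpenShift registry
NAMESPACE="ace"                                              # assumed project namespace
IMAGE_URL="${REGISTRY}/${NAMESPACE}/${imgname}:${imgtag}"
echo "${IMAGE_URL}"
# image-registry.openshift-image-registry.svc:5000/ace/ace-helloworld:1.1
```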
Tekton changed how certificates are injected. The certificate volume mount path changed from /etc/config-registry-cert/ to /etc/ssl/certs. Since Tekton mounts a read-only volume, this can cause build failures.
Error Message:
```
Unpacking rootfs as cmd COPY bars /home/aceuser/initial-config/bars requires it.
error building image: error building stage: failed to get filesystem from image:
error removing ./etc/ssl/certs to make way for new symlink:
unlinkat /etc/ssl/certs/service-ca.crt: device or resource busy
```
Solution:
Set the SSL_CERT_DIR environment variable in your build step:
```yaml
- name: build-push-acesrv
  image: gcr.io/kaniko-project/executor:v0.16.0
  env:
    - name: "DOCKER_CONFIG"
      value: "/tekton/home/.docker/"
    - name: "SSL_CERT_DIR"
      value: "/tmp/other-ssl-dir"
```
If you encounter errors related to missing files or incorrect paths, check that ACE versions are consistent across:
- Base images
- POM file dependencies
- Pipeline task configurations
Ensure the pipeline service account has privileged access:
```
oc adm policy add-scc-to-user privileged -n <yourNamespace> -z pipeline
```
Verify that:
- The image resource URL is correct
- The OpenShift registry is accessible
- Service accounts have pull/push permissions
```
oc delete pipelinerun --all -n ace
oc logs -f <pod-name> -n ace
oc get integrationserver -n ace
oc get pipeline,pipelinerun,task,taskrun -n ace
oc describe pod <pod-name> -n ace
oc logs <pod-name> -n ace
oc get route acecicd-webhook -n ace -o jsonpath='{.spec.host}'
```
The imgcfg file contains key configuration variables:
```
export imgtag="1.1"                   # Image tag version
export imgname="ace-helloworld"       # Image name
export aceappname="HelloWorld"        # ACE application name
export unittestprj="HelloWorld_Test"  # Unit test project name
```
Update these values according to your project requirements.
Refer to the IBM App Connect Enterprise licensing documentation for proper license configuration in your environment.
Contributions are welcome! Please ensure:
- YAML files are properly formatted
- Documentation is updated for new features
- Testing is performed in a non-production environment first
For issues and questions:
- Check the Troubleshooting section
- Review IBM App Connect Enterprise documentation
- Consult Tekton Pipelines documentation