I am building Jenkins with a Dockerfile, and during the Docker build I would like to have Jenkins pre-configured with a set of jobs. This works well with Job DSL, where jobs are seeded, but I have not yet managed to pre-configure jobs written with the "Pipeline" DSL. Given the direction of Jenkins and the use of Jenkinsfile, Pipeline, etc., I assume there must be some way to have Jenkins start up automatically with a set of jobs that were built using the Pipeline approach.
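For illustration, this is roughly the kind of seed I imagine, if Job DSL can also define Pipeline jobs (a rough sketch only; the job name and the Jenkinsfile path are placeholders I made up):

// Job DSL seed sketch (run from a seed job): defines a Pipeline job from a script file.
// 'cft-stack-pipeline' and 'pipelines/cft-stack.groovy' are placeholders for illustration.
pipelineJob('cft-stack-pipeline') {
    definition {
        cps {
            script(readFileFromWorkspace('pipelines/cft-stack.groovy'))
            sandbox()
        }
    }
}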
Example Pipeline:
pipeline {
    agent { label 'cft' }
    parameters {
        string(name: 'StackName', defaultValue: 'cft-stack', description: 'The name to give the CFT stack.')
        string(name: 'KeyName', defaultValue: 'ACCOUNT', description: 'The account key to use for encryption.')
        string(name: 'VpcId', defaultValue: 'vpc-1234', description: 'The VPC to assign to the cluster resources.')
        string(name: 'SubnetID', defaultValue: 'subnet-1234, subnet-6789', description: 'The subnet(s) to assign to the cluster resources.')
    }
    stages {
        stage('Build') {
            steps {
                s3Download(file: 'cft.yaml', bucket: 'cft-resources', path: 'cft.yaml', force: true)
                cfnUpdate(stack: "${params.StackName}",
                    file: 'cft.yaml',
                    params: [
                        "SnapshotId=${params.SnapshotId}",
                        "KeyName=${params.KeyName}",
                        "VpcId=${params.VpcId}"
                    ],
                    timeoutInMinutes: 20)
            }
        }
    }
    post {
        failure {
            echo 'FAILURE'
            cfnDelete(stack: "${params.StackName}")
        }
    }
}

Dockerfile:
COPY ./groovy/*.groovy /usr/share/jenkins/ref/init.groovy.d/
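If it matters, the kind of init script I imagine those Groovy files would contain is something like the following (a rough, unverified sketch; it assumes the workflow-job and workflow-cps plugins are installed, and the job name and the path to the pipeline script are placeholders):

// init.groovy.d sketch: create a Pipeline (WorkflowJob) whose definition is an inline script.
// Assumes the workflow-job and workflow-cps plugins are installed.
// 'cft-stack-pipeline' and '/usr/share/jenkins/ref/pipelines/cft-stack.groovy' are placeholders.
import jenkins.model.Jenkins
import org.jenkinsci.plugins.workflow.job.WorkflowJob
import org.jenkinsci.plugins.workflow.cps.CpsFlowDefinition

def jenkins = Jenkins.instance

// Init scripts run on every startup, so only create the job if it does not exist yet.
if (jenkins.getItem('cft-stack-pipeline') == null) {
    def pipelineScript = new File('/usr/share/jenkins/ref/pipelines/cft-stack.groovy').text
    def job = jenkins.createProject(WorkflowJob, 'cft-stack-pipeline')
    job.definition = new CpsFlowDefinition(pipelineScript, true)  // true = run in Groovy sandbox
    job.save()
}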