Introduction:
Day 26 was all about Declarative Pipelines; now it's time to level things up. Let's integrate Docker into your Jenkins Declarative Pipeline.
On Day 27, we're embracing the containerization revolution by integrating Docker into our Jenkins Declarative Pipelines. This is an exciting step that brings a new dimension of flexibility and consistency to our workflows. Here's what we'll be exploring:
Use your Docker build and run knowledge:
docker build: you can use
sh 'docker build . -t <tag>'
in your pipeline stage block to run the docker build command (make sure Docker is installed and the Jenkins user has the required permissions).
docker run: you can use
sh 'docker run -d <image>'
in your pipeline stage block to run the container.
What will the Docker stages look like?

```groovy
stages {
    stage('Build') {
        steps {
            // Note the '.' build context at the end of the command
            sh 'docker build . -t trainwithshubham/django-app:latest'
        }
    }
}
```
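The stages block above only works once it is wrapped in a full pipeline. A minimal complete Jenkinsfile sketch for Task 1 (the container name `django-app` and port mapping are assumptions for illustration):

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // Build the image from the Dockerfile in the workspace root ('.')
                sh 'docker build . -t trainwithshubham/django-app:latest'
            }
        }
        stage('Run') {
            steps {
                // Start a detached container; a second build of this job will
                // fail here because a container named 'django-app' already exists
                sh 'docker run -d --name django-app -p 8000:8000 trainwithshubham/django-app:latest'
            }
        }
    }
}
```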
Task 1:
Create a docker-integrated Jenkins declarative pipeline
Use the above syntax with
sh
inside the stage block. You will face errors if you run the job twice, because the Docker container will already exist; Task 2 addresses this.
Step 1: Create a new job and select pipeline.
Step 2: Then add a description and paste the URL of the GitHub project.
Step 3: Write a pipeline script using docker commands.
Step 4: Save and click on Build now.
Now run the job twice: the second run will fail, because Jenkins tries to start a new container while a container with the same name is already running.
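The clash comes from the container name: `docker run -d --name …` errors out if a container with that name already exists (and even without a name, a fixed host port mapping would conflict). A quick workaround, before moving to docker-compose in Task 2, is to force-remove any leftover container first; a sketch, assuming the container was named `django-app`:

```groovy
stage('Run') {
    steps {
        // '|| true' keeps the step from failing when no old container exists yet
        sh 'docker rm -f django-app || true'
        sh 'docker run -d --name django-app trainwithshubham/django-app:latest'
    }
}
```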
Task 2:
Create a docker-integrated Jenkins declarative pipeline using the
docker
Groovy syntax inside the stage block (hint: use docker-compose). Complete your previous projects using this Declarative Pipeline approach.
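A compose-based pipeline needs a docker-compose.yml in the repository root. A minimal sketch (the service name, image tag, and port mapping are assumptions for illustration):

```yaml
version: "3"
services:
  web:
    build: .                                   # build from the repo's Dockerfile
    image: trainwithshubham/django-app:latest  # tag the built image
    ports:
      - "8000:8000"                            # host:container port mapping
```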
Step 1: To resolve this problem, change the pipeline script to use docker-compose.
Step 2: Save it and build again. This time you won't face any errors, and you can build multiple containers.
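Putting it together, a hedged sketch of the compose-based pipeline: docker-compose down removes the containers from the previous run, so repeated builds no longer clash on names or ports.

```groovy
pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                // Tear down containers left over from the previous build,
                // then rebuild and start fresh ones from docker-compose.yml
                sh 'docker-compose down'
                sh 'docker-compose build'
                sh 'docker-compose up -d'
            }
        }
    }
}
```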
Conclusion:
As we conclude Day 27 of our Jenkins Declarative Pipelines, you're now equipped with the knowledge to merge Jenkins and Docker seamlessly. By containerizing your pipeline steps, you're ensuring consistency, portability, and a smoother CI/CD process.