Showing results for tags 'jenkins'.

Found 23 results

  1. Jenkins is an open-source continuous integration tool that automates technical tasks such as software testing, building, and deployment. It is a Java-based tool, and as a DevOps engineer, knowing how to install and use Jenkins will save you time and resources. Jenkins supports numerous platforms, and this post focuses on installing it on Ubuntu 24.04. We will guide you through a step-by-step process to ensure you don't get stuck. Let's begin!

     Step-by-Step Installation of Jenkins on Ubuntu 24.04

     The Jenkins repository is not included in Ubuntu 24.04, so we must fetch it and add it to our system. As mentioned, Jenkins is a Java-based tool, so you must also have Java installed; in this case, we will work with OpenJDK 11. Once you have these two prerequisites in place, installing Jenkins is an easy task. Proceed with the steps below.

     Step 1: Install Java

     We must have a Java Runtime Environment before we can install and use Jenkins. However, not all Java versions are supported; to be safe, consider installing OpenJDK 11. Verify whether a suitable Java version is already installed.

       $ java -version

     If not, use the following command to install OpenJDK 11.

       $ sudo apt install openjdk-11-jdk

     Step 2: Fetch and Add the Jenkins Repository

     Jenkins is available as a stable or weekly release. In this step, we download the Jenkins GPG key and then add the Jenkins software repository to our sources list. First, execute the following command to import the Jenkins GPG key.

       $ sudo wget -O /usr/share/keyrings/jenkins-keyring.asc https://pkg.jenkins.io/debian-stable/jenkins.io-2023.key

     Next, add the Jenkins repository by executing the following command.

       $ echo deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc] https://pkg.jenkins.io/debian-stable binary/ | sudo tee /etc/apt/sources.list.d/jenkins.list > /dev/null

     Step 3: Install Jenkins

     After adding the stable Jenkins release to our sources list, we can proceed with installing it. First, update the Ubuntu 24.04 package index to refresh the sources list.

       $ sudo apt update

     Next, install Jenkins and ensure the installation completes without interruptions.

       $ sudo apt install jenkins -y

     Once installed, check the version to confirm the installation succeeded.

       $ jenkins --version

     Step 4: Configure the Firewall

     We must add a firewall rule allowing Jenkins to communicate via port 8080. First, start the Jenkins service and check its status.

       $ sudo systemctl start jenkins
       $ sudo systemctl status jenkins

     Next, add a new UFW rule and check that your firewall is active. If the firewall is inactive, enable it.

       $ sudo ufw allow 8080
       $ sudo ufw status

     Step 5: Configure Jenkins

     We will access Jenkins via a browser to set it up. In a browser tab, open the URL below, substituting the correct IP address or domain name of your server and port number 8080.

       http://ip_address:8080

     You will get a window displaying the "Getting Started" information. On the page, find the path to the file containing the administrator password. Go back to your terminal and open the file using a text editor or a command such as "cat".

       $ sudo cat /var/lib/jenkins/secrets/initialAdminPassword

     The administrator password will be displayed in your terminal. Copy the generated password and paste it into the "Administrator password" input box in your browser, then click the Continue button at the bottom of the window. A new window will open; click the option to "Install suggested plugins". Jenkins will initiate the setup. Once the process is complete, you will be prompted to create your administrator credentials. Type the admin username and password, then click the "Save and Continue" button. On the next window, note the Jenkins URL and click the "Save and Finish" button. That's it. Jenkins is now installed and configured on your Ubuntu 24.04 system. Click the "Start using Jenkins" button to start using Jenkins.

     Conclusion

     Jenkins has numerous applications, especially for developers. If you use Ubuntu Noble Numbat, this post has shared a step-by-step guide on how to install Jenkins. Hopefully, this post has been insightful and you are now able to install Jenkins. View the full article
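     The steps above can be collected into a single reviewable script. This is a hedged sketch, not part of the original article: the run() wrapper and the DRY_RUN convention are additions, so each command is printed rather than executed until you opt in with DRY_RUN=0.

     ```shell
     #!/bin/sh
     # Hedged sketch: the article's Ubuntu 24.04 install steps in one script.
     # With DRY_RUN=1 (the default here) each command is echoed, not executed,
     # so the sequence can be reviewed before a real run with DRY_RUN=0.
     run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi; }

     run sudo apt install -y openjdk-11-jdk
     run sudo wget -O /usr/share/keyrings/jenkins-keyring.asc \
         https://pkg.jenkins.io/debian-stable/jenkins.io-2023.key
     run sh -c "echo 'deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc] https://pkg.jenkins.io/debian-stable binary/' | sudo tee /etc/apt/sources.list.d/jenkins.list > /dev/null"
     run sudo apt update
     run sudo apt install -y jenkins
     run sudo systemctl start jenkins
     run sudo ufw allow 8080
     ```

     Running the script as-is only prints the plan; the firewall and service steps mirror Steps 4 and 5 of the article.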
  2. In the fast-paced world of software development, efficiency is paramount. Automating repetitive tasks is key to achieving faster delivery cycles and improved quality. This is where Jenkins comes in — a free and open-source automation server that has become synonymous with continuous integration (CI) and continuous delivery (CD). Jenkins, the open-source automation powerhouse, plays a pivotal role in the DevOps world. But have you ever wondered how it all works under the hood? This blog delves into the intricate architecture of Jenkins, breaking down its core components and how they orchestrate the automation magic. View the full article
  3. Releasing software often and with confidence relies on a strong continuous integration and continuous delivery (CI/CD) process that includes the ability to automate tests. Jenkins offers an open source automation server that facilitates such releases of software projects. In this article, we will explore how you can run tests based on the open source Testcontainers framework in a Jenkins pipeline using Docker and Testcontainers Cloud. Jenkins, which streamlines the development process by automating the building, testing, and deployment of code changes, is widely adopted in the DevOps ecosystem. It supports a vast array of plugins, enabling integration with various tools and technologies, making it highly customizable to meet specific project requirements. Testcontainers is an open source framework for provisioning throwaway, on-demand containers for development and testing use cases. Testcontainers makes it easy to work with databases, message brokers, web browsers, or just about anything that can run in a Docker container. Testcontainers also provides support for many popular programming languages, including Java, Go, .NET, Node.js, Python, and more. This article will show how to test a Java Spring Boot application (testcontainers-showcase) using Testcontainers in a Jenkins pipeline. Please fork the repository into your GitHub account. To run Testcontainers-based tests, a Testcontainers-supported container runtime, like Docker, needs to be available to the agents. Note: As Jenkins CI servers are mostly run on Linux machines, the following configurations have been tested on a Linux machine only.

     Docker containers as Jenkins agents

     Let's see how to use dynamic Docker container-based agents. To be able to use Docker containers as agents, install the Docker Pipeline plugin.
     Now, let's create a file named Jenkinsfile in the root of the project with the following content:

       pipeline {
           agent {
               docker {
                   image 'eclipse-temurin:17.0.9_9-jdk-jammy'
                   args '--network host -u root -v /var/run/docker.sock:/var/run/docker.sock'
               }
           }
           triggers { pollSCM 'H/2 * * * *' } // poll every 2 mins
           stages {
               stage('Build and Test') {
                   steps {
                       sh './mvnw verify'
                   }
               }
           }
       }

     We are using the eclipse-temurin:17.0.9_9-jdk-jammy Docker container as an agent to run the builds for this pipeline. Note that we are mapping the host's Unix Docker socket as a volume with root user permissions to make it accessible to the agent, but this can potentially be a security risk. Add the Jenkinsfile and push the changes to the Git repository. Now, go to the Jenkins Dashboard and select New Item to create the pipeline. Follow these steps:

       1. Enter testcontainers-showcase as the pipeline name.
       2. Select Pipeline as the job type.
       3. Select OK.
       4. Under the Pipeline section, select Definition: Pipeline script from SCM.
       5. SCM: Git.
       6. Repository URL: https://github.com/YOUR_GITHUB_USERNAME/testcontainers-showcase.git. Replace YOUR_GITHUB_USERNAME with your actual GitHub username.
       7. Branches to build: Branch Specifier (blank for 'any'): */main.
       8. Script Path: Jenkinsfile.
       9. Select Save.

     Choose Build Now to trigger the pipeline for the first time. The pipeline should run the Testcontainers-based tests successfully in a container-based agent using the remote Docker-in-Docker based configuration.

     Kubernetes pods as Jenkins agents

     While running Testcontainers-based tests on Kubernetes pods, you can run a Docker-in-Docker (DinD) container as a sidecar. To use Kubernetes pods as Jenkins agents, install the Kubernetes plugin.
     Now you can create the Jenkins pipeline using Kubernetes pods as agents as follows:

       def pod = """
       apiVersion: v1
       kind: Pod
       metadata:
         labels:
           name: worker
       spec:
         serviceAccountName: jenkins
         containers:
           - name: java17
             image: eclipse-temurin:17.0.9_9-jdk-jammy
             resources:
               requests:
                 cpu: "1000m"
                 memory: "2048Mi"
             imagePullPolicy: Always
             tty: true
             command: ["cat"]
           - name: dind
             image: docker:dind
             imagePullPolicy: Always
             tty: true
             env:
               - name: DOCKER_TLS_CERTDIR
                 value: ""
             securityContext:
               privileged: true
       """

       pipeline {
           agent {
               kubernetes {
                   yaml pod
               }
           }
           environment {
               DOCKER_HOST = 'tcp://localhost:2375'
               DOCKER_TLS_VERIFY = 0
           }
           stages {
               stage('Build and Test') {
                   steps {
                       container('java17') {
                           script {
                               sh "./mvnw verify"
                           }
                       }
                   }
               }
           }
       }

     Although we can use a Docker-in-Docker based configuration to make the Docker environment available to the agent, this setup brings configuration complexities and security risks. By volume-mounting the host's Docker Unix socket (Docker-out-of-Docker, or DooD), the agents have direct access to the host Docker engine. With the DooD approach, file sharing using bind mounts doesn't work, because the containerized app and the Docker engine work in different contexts. The Docker-in-Docker (DinD) approach requires the use of insecure privileged containers. You can watch the Docker-in-Docker: Containerized CI Workflows presentation to learn more about the challenges of a Docker-in-Docker based CI setup. This is where Testcontainers Cloud comes into the picture, making it easy to run Testcontainers-based tests more simply and reliably. With Testcontainers Cloud, you don't even need a Docker daemon running on the agent. Containers are run in on-demand cloud environments, so you don't need powerful CI agents with high CPU/memory for your builds. Let's see how to use Testcontainers Cloud with minimal setup and run Testcontainers-based tests.
     Testcontainers Cloud-based setup

     Testcontainers Cloud helps you run Testcontainers-based tests at scale by spinning up the dependent services as Docker containers in the cloud and having your tests connect to those services. If you don't have a Testcontainers Cloud account already, you can create an account and get a Service Account Token as follows:

       1. Sign up for a Testcontainers Cloud account.
       2. Once logged in, create an organization.
       3. Navigate to the Testcontainers Cloud dashboard and generate a Service account (Figure 1).

     Figure 1: Create a new Testcontainers Cloud service account.

     To use Testcontainers Cloud, we need to start a lightweight testcontainers-cloud agent by passing TC_CLOUD_TOKEN as an environment variable. You can store the TC_CLOUD_TOKEN value as a secret in Jenkins as follows:

       1. From the Dashboard, select Manage Jenkins.
       2. Under Security, choose Credentials.
       3. You can create a new domain or use the System domain.
       4. Under Global credentials, select Add credentials.
       5. Select Kind: Secret text.
       6. Enter the TC_CLOUD_TOKEN value in Secret.
       7. Enter tc-cloud-token-secret-id as the ID.
       8. Select Create.

     Next, update the Jenkinsfile as follows:

       pipeline {
           agent {
               docker {
                   image 'eclipse-temurin:17.0.9_9-jdk-jammy'
               }
           }
           triggers { pollSCM 'H/2 * * * *' }
           stages {
               stage('TCC SetUp') {
                   environment {
                       TC_CLOUD_TOKEN = credentials('tc-cloud-token-secret-id')
                   }
                   steps {
                       sh "curl -fsSL https://get.testcontainers.cloud/bash | sh"
                   }
               }
               stage('Build and Test') {
                   steps {
                       sh './mvnw verify'
                   }
               }
           }
       }

     We have set the TC_CLOUD_TOKEN environment variable using the value from the tc-cloud-token-secret-id credential we created, and we start a Testcontainers Cloud agent before running our tests. Now, if you commit and push the updated Jenkinsfile, the pipeline will run the tests using Testcontainers Cloud. You should see log statements similar to the following, indicating that the Testcontainers-based tests are using Testcontainers Cloud instead of the default Docker daemon.
     14:45:25.748 [testcontainers-lifecycle-0] INFO org.testcontainers.DockerClientFactory - Connected to docker:
       Server Version: 78+testcontainerscloud (via Testcontainers Desktop 1.5.5)
       API Version: 1.43
       Operating System: Ubuntu 20.04 LTS
       Total Memory: 7407 MB

     You can also leverage Testcontainers Cloud's Turbo mode in conjunction with build tools that feature parallel run capabilities to run tests even faster. In the case of Maven, you can use the -DforkCount=N system property to specify the degree of parallelization. For Gradle, you can specify the degree of parallelization using the maxParallelForks property. We can enable parallel execution of our tests using four forks in the Jenkinsfile as follows:

       stage('Build and Test') {
           steps {
               sh './mvnw verify -DforkCount=4'
           }
       }

     For more information, check out the article on parallelizing your tests with Turbo mode.

     Conclusion

     In this article, we explored how to run Testcontainers-based tests on Jenkins CI using dynamic containers and Kubernetes pods as agents with Docker-out-of-Docker and Docker-in-Docker based configurations. Then we learned how to create a Testcontainers Cloud account and configure the pipeline to run tests using Testcontainers Cloud. We also explored leveraging Testcontainers Cloud Turbo mode combined with your build tool's parallel execution capabilities. Although we demonstrated this setup using a Java project as an example, Testcontainers libraries exist for other popular languages too, and you can follow the same configuration pattern to run your Testcontainers-based tests on Jenkins CI in Go, .NET, Python, Node.js, etc. Get started with Testcontainers Cloud by creating a free account at the website.

     Learn more:
       • Sign up for a Testcontainers Cloud account.
       • Watch the Docker-in-Docker: Containerized CI Workflows session from DockerCon 2023.
       • Subscribe to the Docker Newsletter.
       • Get the latest release of Docker Desktop.
       • Vote on what's next! Check out our public roadmap.
       • Have questions? The Docker community is here to help.
       • New to Docker? Get started.

     View the full article
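     As a small illustration of the Turbo-mode tip above, the fork count can be derived from the agent's CPU count instead of hard-coding 4. This is a hedged sketch (the fallback value of 4 when nproc is unavailable is an arbitrary choice, not from the article):

     ```shell
     # Hedged sketch: derive the Maven forkCount from available CPUs rather
     # than hard-coding it; falls back to 4 where nproc is unavailable.
     FORKS=$( (nproc || echo 4) 2>/dev/null )
     echo "./mvnw verify -DforkCount=${FORKS}"
     ```

     Maven Surefire also accepts a per-core multiplier syntax (for example -DforkCount=1C), which achieves the same effect without shell arithmetic.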
  4. Jenkins is an open-source automation server widely used for building, testing, and deploying software projects. It provides a platform for continuous integration and continuous delivery (CI/CD), allowing development teams to automate various tasks in the software development lifecycle. View the full article
  5. Tools and platforms form the backbone of seamless software delivery in the ever-evolving world of Continuous Integration and Continuous Deployment (CI/CD). For years, Jenkins has been the stalwart, powering countless deployment pipelines and standing as the go-to solution for many DevOps professionals. But as the tech landscape shifts towards cloud-native solutions, AWS CodePipeline emerges as a formidable contender. Offering deep integration with the expansive AWS ecosystem and the agility of a cloud-based platform, CodePipeline is redefining the standards of modern deployment processes. This article dives into the transformative power of AWS CodePipeline, exploring its advantages over Jenkins and showing why many are switching to this cloud-native tool. Brief Background About CodePipeline and Jenkins At its core, AWS CodePipeline is Amazon Web Services' cloud-native continuous integration and continuous delivery service, allowing users to automate the build, test, and deployment phases of their release process. Tailored to the vast AWS ecosystem, CodePipeline leverages other AWS services, making it a seamless choice for teams already integrated with AWS cloud infrastructure. It promises scalability, maintenance ease, and enhanced security, characteristics inherent to many managed AWS services. On the other side of the spectrum is Jenkins – an open-source automation server with a storied history. Known for its flexibility, Jenkins has garnered immense popularity thanks to its extensive plugin system. It's a tool that has grown with the CI/CD movement, evolving from a humble continuous integration tool to a comprehensive automation platform that can handle everything from build to deployment and more. Together, these two tools represent two distinct eras and philosophies in the CI/CD domain. View the full article
  6. Jenkins is a popular open-source CI/CD tool that helps automate various aspects of software development, including building, testing, and deploying applications. Jenkins is highly extensible, with over 1,000 available plugins, which help it integrate with various third-party tools and technologies. Consider a scenario where you're working on a large software project with multiple developers. Testing every change manually can be time-consuming and prone to human error. This is where Jenkins test cases can come in handy. View the full article
  7. Are you confident that with a CI/CD tool like Jenkins, your software delivery solutions can quickly integrate, easily manage, and rapidly scale to deliver a secure and quality application? Join this webinar to understand the challenges of using Jenkins and how they can be addressed by migrating to CircleCI on AWS to accelerate your software […] View the full article
  8. Jenkins is a server utilized to build and test software projects, integrating changes to the project efficiently so that they can be merged into the original code used by the application. Amazon's cloud platform lets its users set up a Jenkins build server using its services. This guide contains the following sections:

       • How to Install and Set up Jenkins on AWS EC2?
       • How to Set up a Build Server Using Jenkins?

     How to Install and Set up Jenkins on AWS EC2?

     To set up a Jenkins build server, create and connect to an EC2 instance. To look at the creation and connection process, click here.

     Update the yum packages:

       sudo yum update -y

     Get extra packages from the Jenkins repository:

       sudo wget -O /etc/yum.repos.d/jenkins.repo https://pkg.jenkins.io/redhat-stable/jenkins.repo

     Import packages from the link:

       sudo rpm --import https://pkg.jenkins.io/redhat-stable/jenkins.io.key

     Upgrade the yum packages:

       sudo yum upgrade

     Install Java JDK 11 on the Amazon Linux instance:

       sudo amazon-linux-extras install java-openjdk11 -y

     Install Jenkins using the following command:

       sudo yum install jenkins -y

     Enable the Jenkins service:

       sudo systemctl enable jenkins

     Start the Jenkins service:

       sudo systemctl start jenkins

     Verify that Jenkins has started:

       sudo systemctl status jenkins

     Head into the EC2 dashboard and copy the IP address of the instance, then paste the IP address with port 8080 into the web browser. Get the password to log in to the Jenkins server:

       sudo cat /var/lib/jenkins/secrets/initialAdminPassword

     Copy the password provided upon execution of the above command, paste it, and click on the "Continue" button. Select the plugins to be installed on the Jenkins server: type GitHub in the search bar of Jenkins and click on the "Install" button. It will take a few moments to install the plugins on Jenkins. Create a user on Jenkins by providing credentials and then clicking on the "Save and Continue" button. Verify the address and click on the "Save and Finish" button. The Jenkins server is ready to be used by clicking on the "Start using Jenkins" button.

     How to Set up a Build Server Using Jenkins?

     To build a server, click on the "Configure a cloud" tab and install the cloud plugins by clicking on the link. Search for the EC2 instance and install its plugins by clicking on the "Install without restart" button. After installing the plugins, locate "Manage Jenkins" in the left menu and click on it, then click on the "Manage Nodes and Clouds" button followed by the "Configure Clouds" button. Add the Amazon EC2 service to be used in the Jenkins server. Click on the "Add" button under the EC2 credentials tab and provide the IAM credentials to the server: add the Access and Secret keys and then click on the "Add" button. After that, provide the Region and click on the "Add" button for the EC2 private key pair section. Select "SSH Username with private key" and enter "ec2-user" as the Username. Select the "Enter directly" option and paste the contents of the private key pair. Click on the "Test Connection" button and press the "Save" button. The built-in node has been created successfully. This was all about setting up a Jenkins build server with AWS services.

     Conclusion

     To set up a Jenkins server, install Jenkins on an EC2 instance and then access it using the IP address with port 8080 in a web browser. After that, install the plugins from the cloud configuration and configure the EC2 settings. After the configuration, test the connection to get the success message, which indicates that the setup has been created. This guide has demonstrated the process of installing Jenkins on EC2 and then setting up a build server on it. View the full article
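     One practical wrinkle in the step that reads the initial admin password: the secrets file only appears once Jenkins has finished its first start, so reading it immediately after `systemctl start jenkins` can fail. A hedged helper sketch (the retry count, sleep interval, and placeholder path are arbitrary additions; only the /var/lib/jenkins path is from the guide):

     ```shell
     # Hedged sketch: poll for the initialAdminPassword file before reading it.
     # The default path is the one from the guide; attempts and interval are
     # arbitrary choices for illustration.
     wait_for_secret() {
       f="${1:-/var/lib/jenkins/secrets/initialAdminPassword}"
       tries=0
       while [ ! -f "$f" ] && [ "$tries" -lt 3 ]; do
         tries=$((tries + 1))
         sleep 0   # in a real run: sleep 5
       done
       if [ -f "$f" ]; then
         echo "admin password: $(cat "$f")"
       else
         echo "not found: $f"
       fi
     }
     wait_for_secret "${SECRET_FILE:-/tmp/initialAdminPassword.example}"
     ```

     Calling wait_for_secret with no argument targets the real Jenkins path; the placeholder default keeps the sketch runnable on a machine without Jenkins.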
  9. Grateful for any tips, guys. I am learning DevOps from a YouTube course. The steps from the course using Jenkins are: create a job to download a GitHub repo, then build with the Maven package command. In the course, the file is built with a .war extension, but for me it's always a .jar file. How can I make the build produce a .war file?
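     A likely cause (hedged, since the course project's POM is not shown here): Maven's artifact type comes from the POM's <packaging> element, and it defaults to jar when the element is absent. The throwaway pom.xml below is purely illustrative:

     ```shell
     # Hedged sketch answering the question above: Maven builds a .war only
     # when the POM declares <packaging>war</packaging>; with no <packaging>
     # element the default is jar. This demo POM is illustrative, not the
     # course project's.
     cd "$(mktemp -d)"
     cat > pom.xml <<'EOF'
     <project>
       <modelVersion>4.0.0</modelVersion>
       <groupId>demo</groupId>
       <artifactId>app</artifactId>
       <version>1.0</version>
       <packaging>war</packaging>
     </project>
     EOF
     grep -o "<packaging>.*</packaging>" pom.xml
     ```

     Note that a WAR build typically also needs the maven-war-plugin and a src/main/webapp directory (or the plugin's failOnMissingWebXml setting), so changing the packaging element alone may surface a follow-up error.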
  10. The continuous integration and continuous delivery (CI/CD) pipeline is a fundamental component of the software delivery process for DevOps teams. The pipeline leverages automation and continuous monitoring to enable seamless delivery of software. With continuous automation, it’s important to ensure security for every step of the CI/CD pipeline. Sensitive information like access credentials is often […] View the full article
  11. Listed below are free tutorial videos for the best CI/CD tools: ArgoCD, AWS (CodeBuild, CodeDeploy, CodePipeline), Azure DevOps, CircleCI, GitLab, and Jenkins.

       • ArgoCD Tutorials
       • AWS Tutorials
       • Azure DevOps Tutorials
       • CircleCI Tutorials
       • GitLab Tutorials
       • Jenkins Tutorials
  12. Course to help SREs, software developers, software architects and other DevOps professionals use Jenkins to improve their CI/CD practices SAN FRANCISCO, October 6, 2020 – The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced the availability of a new training course, LFS267 – Jenkins Essentials. LFS267, developed in conjunction with the Continuous Delivery Foundation, is […] The post New Training Course Helps Gain Expertise with Jenkins CI/CD appeared first on DevOps.com. View the full article
  13. Jenkins is a self-contained, open source automation server which can be used to automate all sorts of tasks related to building, testing, and delivering or deploying software. Jenkins can be installed through native system packages, Docker, or even run standalone by any machine with a Java Runtime Environment (JRE) installed.
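     A minimal sketch of the Docker route mentioned above, assuming the official jenkins/jenkins:lts image; the command is printed rather than executed here, so no Docker daemon is needed to review it:

     ```shell
     # Hedged sketch: the common way to run Jenkins LTS in Docker (official
     # image jenkins/jenkins:lts). Ports 8080 (web UI) and 50000 (inbound
     # agents) are published; a named volume persists Jenkins home.
     JENKINS_IMAGE="jenkins/jenkins:lts"
     echo "docker run -d -p 8080:8080 -p 50000:50000 \
       -v jenkins_home:/var/jenkins_home ${JENKINS_IMAGE}"
     ```

     Dropping the echo runs the container for real, after which the setup wizard is reachable on port 8080 as in the installation guides above.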
  14. The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced the availability of a new training course, LFS267 – Jenkins Essentials. LFS267, developed in conjunction with the Continuous Delivery Foundation, is designed for DevOps engineers, Quality Assurance personnel, SREs as well as software developers and architects who want to gain expertise with Jenkins for their continuous integration (CI) and continuous delivery (CD) activities. Source: Linux Foundation Training The post New Training Course from Continuous Delivery Foundation Helps Gain Expertise with Jenkins CI/CD appeared first on Linux.com. View the full article
  15. ServiceNow today announced it has added four new integrations with DevOps platforms to its IT service management (ITSM) platform delivered via the cloud. The four integrations add native support for Microsoft Azure Pipelines, a Jenkins Plugin, GitHub Actions and GitLab pipelines. A continuous integration/continuous delivery (CI/CD) Spoke capability on the ServiceNow platform makes it possible […] The post ServiceNow Extends DevOps Integration Reach appeared first on DevOps.com. View the full article
  16. Hosting Jenkins on a Kubernetes cluster is beneficial for Kubernetes-based deployments and dynamic container-based scalable Jenkins agents. In this guide, I have explained the step-by-step process for setting up Jenkins on a Kubernetes cluster... View the full article
  17. The resource utilization of the Jenkins slaves is very low if you do not have builds happening continuously. In this scenario, it is better to use ephemeral Docker containers as Jenkins build slaves for better resource utilization. As you know, spinning up a new container takes less than a minute; every build spins up a new container, builds the project, and is destroyed. This way, you can reduce the number of static Jenkins build VMs.

     Docker Containers as Build Slaves

     In this guide, I will walk you through the steps for configuring Docker containers as build slaves. I assume that you have a Jenkins server up and running. If you do not have one, follow this tutorial: How to Setup Jenkins 2. If you want a Docker-based Jenkins setup, you can follow this tutorial: Setup Jenkins On a Docker Container. Let's implement it.

     Configure a Docker Host With Remote API [Important]

     The first thing we should do is set up a Docker host. The Jenkins server will connect to this host to spin up the slave containers. I am going to use a CentOS server as my Docker host; you can use any OS which supports Docker. The Jenkins master connects to the Docker host using REST APIs, so we need to enable the remote API for our Docker host. Make sure the following ports are enabled in your server firewall to accept connections from the Jenkins master:

       Docker Remote API port: 4243
       Docker host port range: 32768 to 60999

     The 32768 to 60999 range is used by Docker to assign a host port for Jenkins to connect to the container. Without this connection, the build slave would go into a pending state.

     Let's get started.

     Step 1: Spin up a VM and install Docker on it. You can follow the official documentation for installing Docker, based on the Linux distribution you use. Make sure the Docker service is up and running.

     Step 2: Log in to the server and open the Docker service file /lib/systemd/system/docker.service. Search for ExecStart and replace that line with the following.
       ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:4243 -H unix:///var/run/docker.sock

     Step 3: Reload and restart the Docker service.

       sudo systemctl daemon-reload
       sudo service docker restart

     Step 4: Validate the API by executing the following curl commands. Replace 54.221.134.7 with your host IP.

       curl http://localhost:4243/version
       curl http://54.221.134.7:4243/version

     Check the Docker remote API article for a detailed explanation of the Docker API. Once you have enabled and tested the API, you can start building the Docker slave image.

     Create a Jenkins Agent Docker Image

     I have created a Jenkins Docker image for Maven. You can use this image or use its Dockerfile as a reference for creating your own. If you are creating the image on your own, the image should contain the following minimum configuration to act as a slave:

       • sshd service running on port 22.
       • A Jenkins user with a password.
       • All the required application dependencies for the build. For example, for a Java Maven project, you need to have git, java, and maven installed on the image.

     Make sure the sshd service is running and that the container can be logged into using a username and password; otherwise, Jenkins will not be able to start the build process. Note: The default ssh username is jenkins and the password is also jenkins, as per the given Dockerfile. You will have to use these credentials in the configuration below.

     Configure Jenkins Server

     Step 1: Head over to Jenkins Dashboard –> Manage Jenkins –> Manage Plugins.

     Step 2: Under the Available tab, search for "Docker" and install the Docker cloud plugin, then restart Jenkins. Here is the official plugin site. Make sure you install the right plugin.

     Step 3: Once installed, head over to Jenkins Dashboard –> Manage Jenkins –> Configure System.

     Step 4: Under "Configure System", if you scroll down, there will be a section named "Cloud" at the end. There you can fill out the Docker host parameters for spinning up the slaves.
     Note: In Jenkins versions 2.200 or later, you will find the dedicated cloud configuration under Manage Jenkins –> Manage Nodes and Clouds.

     Step 5: Under Docker, fill out the details. Note: Replace "Docker URI" with your Docker host IP, for example, tcp://10.128.0.3:4243. You can use "Test connection" to check whether Jenkins is able to connect to the Docker host.

     Step 6: Now, from the "Docker Agent Template" dropdown, click "Add Docker template", fill in the details based on the explanation below, and save the configuration.

       • Labels: Identification for the Docker host. It will be used in the job configuration. Here we use java-docker-slave.
       • Name: Name of the Docker template. Here we use the same name as the label, i.e., java-docker-slave.
       • Docker Image: bibinwilson/jenkins-slave:latest, or the image that you created for the slave.
       • Remote File System Root: Home folder for the user you have created. In our case, it's /home/jenkins.
       • Credentials: Click add and enter the SSH username and password that you created for the Docker image. Leave the rest of the configuration as-is and click save. If you are using my Docker image, the user is jenkins and the password is also jenkins.

     Note: There are additional configurations, like registry authentication and container settings, that you might have to use when configuring this setup in a corporate network.

     You can also use JNLP-based slave agents. For this, the configuration needs a little change, primarily the Docker image name and the connect method. Note: For JNLP to work, you need to enable the JNLP connection port (50000) in Jenkins's global security configuration (TCP port for inbound agents). Also, the Jenkins master firewall should be able to accept this connection from the Docker host. By default, the workspace will not be persisted on the host.
     However, if you want the workspace to be persistent, add a host volume path under container settings. For example, if you want the workspace to be available at /home/ubuntu, you can add the volume path as shown below; /home/jenkins is the path inside the container.

       /home/ubuntu:/home/jenkins

     To the right of the Volumes option, if you click the question mark, it will show you additional volume options. If you are planning to run Docker in Docker for your CI process, you can mount the host docker.sock as a volume to execute Docker commands. Check out my article on running Docker in Docker to know more about it.

     Test Docker Slaves Using a Freestyle Job

     Now that you have the slave configuration ready, create a freestyle job:

       • Select the "Restrict where this project can be run" option and select the Docker host as a slave using the label.
       • Add a shell build step which echoes a simple "Hello World".

     If you have done all the configuration right, Jenkins will spin up a container, build the project, and destroy the container once the build is done. First you will see a pending notification as Jenkins tries to deploy a container at run time and establish an SSH connection. After a few seconds, your job will start building. You can check the build logs in your job's console output. Also, you can check out the video explaining the whole process.

     Possible Errors:

       • Jenkins is not able to deploy containers on the host: Make sure you have proper connectivity to the Docker host on the API port.
       • Jenkins builds stay in the pending state forever: Make sure the Jenkins master can reach the Docker host ports (32768 to 60999).
       • JNLP slaves go into the pending state: Make sure you have enabled the JNLP port in the Jenkins global security configuration.

     Conclusion

     In this article, I walked you through the process of setting up dynamic slaves using Docker. This setup can be further customized to fit your specific use cases.
Please let me know your thoughts in the comment section. Also, don't forget to share this article.
  18. In Jenkins's declarative pipeline, you can add parameters as part of the Jenkinsfile. There are many supported parameter types that you can use with a declarative pipeline. In this blog, you have answers to the following:

How to use parameters in the declarative pipeline?
How to use dynamic parameters or active choice parameters in the declarative pipeline?

Generating Pipeline Code for Parameters

You can generate the parameter pipeline code block easily using the Jenkins pipeline generator. You will find the Pipeline syntax generator link under all the pipeline jobs, as shown in the image below. Navigate to the pipeline generator in Jenkins and, under steps, search for properties, as shown below.

Using Parameters in Jenkinsfile

Here is a ready-to-use Jenkins declarative pipeline example with parameters. This script has the following parameter types:

Choice parameter
Boolean parameter
Multi-line string parameter
String parameter

Here is the Github link for this code.

pipeline {
    agent any
    stages {
        stage('Setup parameters') {
            steps {
                script {
                    properties([
                        parameters([
                            choice(
                                choices: ['ONE', 'TWO'],
                                name: 'PARAMETER_01'
                            ),
                            booleanParam(
                                defaultValue: true,
                                description: '',
                                name: 'BOOLEAN'
                            ),
                            text(
                                defaultValue: '''
                                this is a multi-line
                                string parameter example
                                ''',
                                name: 'MULTI-LINE-STRING'
                            ),
                            string(
                                defaultValue: 'scriptcrunch',
                                name: 'STRING-PARAMETER',
                                trim: true
                            )
                        ])
                    ])
                }
            }
        }
    }
}

Note: The parameters specified in the Jenkinsfile will appear in the job only after the first run. Your first job run will fail, as you will not be able to provide the parameter values through the job.

Access Parameters Inside Pipeline Stages

You can access a parameter in any stage of a pipeline. Accessing parameters in stages is pretty straightforward. You just have to use params.[NAME] in places where you need to substitute the parameter. Here is an example of a stage that will be executed based on the condition that we get from the choice parameter.
The parameter name is ENVIRONMENT, and we access it in the stage as params.ENVIRONMENT. So when the choice parameter matches PROD, it will execute the steps mentioned in the stage.

stage('Deploy to Production') {
    when {
        expression {
            return params.ENVIRONMENT == 'PROD'
        }
    }
    steps {
        sh """
        echo "deploy to production"
        """
    }
}

Using Active Choice Parameter in Declarative Pipeline for Dynamic Parameters

Unlike the default parameter types, the Active Choices parameter type gives you more control over the parameters using a groovy script. You can have dynamic parameters based on user parameter selection. To use the active choice parameter, you need to have the Active Choices plugin installed in Jenkins.

Here is a small use case for an active choice parameter:

A job should have three parameters:
Environment (dev, stage & prod)
AMI List (should list the AMIs based on the environment)
AMI Information (shows information about the AMIs related to a specific environment)
If the user selects dev, the AMI list should dynamically change to the values related to dev and show information related to those AMIs.

Here is the image which shows the above use case. It shows how the AMI list and AMI information change when you select different environments.

There are three types of active choice parameters.

Active Choices Parameter

This parameter type returns a set of values from a groovy script. For example, an environment parameter that lists dev, stage, and prod values:

return ['dev', 'stage', 'prod']

You can also return values from third-party APIs as parameters. One such example is dynamically showing folders from a Github repo in the Jenkins parameters. To make this work, you just need to write a groovy script that calls the Github APIs and queries the folders of the specific repository.

Active Choices Reactive Parameter

Returns parameters based on conditions on another referenced parameter. You can refer to an active choice parameter and return a parameter based on a condition.
For example, if the environment parameter is selected as dev, the reactive parameter will return AMI ids for dev based on groovy conditions. In the following example, Env is the referenced parameter.

if (Env.equals("dev")) {
    return ["ami-sd2345sd", "ami-asdf245sdf", "ami-asdf3245sd"]
} else if (Env.equals("stage")) {
    return ["ami-sd34sdf", "ami-sdf345sdc", "ami-sdf34sdf"]
} else if (Env.equals("prod")) {
    return ["ami-sdf34", "ami-sdf34ds", "ami-sdf3sf3"]
}

Active Choices Reactive Reference Parameter

The reactive reference parameter is similar to a reactive parameter, except that it mostly will not be used in the build environment. Meaning, it is often used to display information to the user dynamically, so they can select the correct values from the other parameter input fields, as shown in the above use case image.

Using Active Choice Parameters With Declarative Pipeline

If you are wondering how to use active choice parameters in a declarative pipeline, here is the Jenkinsfile with all Active Choices parameter types. If you execute this, you will get parameters like the demo I have shown with the use case.

Note: Sometimes, after the execution of the pipeline, the parameters won't show up correctly. If that happens, open the job configuration and save it once without changing anything. The values will show up.
If you have trouble copying the code, use this Github link.

pipeline {
    agent any
    stages {
        stage('Parameters') {
            steps {
                script {
                    properties([
                        parameters([
                            [$class: 'ChoiceParameter',
                                choiceType: 'PT_SINGLE_SELECT',
                                description: 'Select the Environment from the Dropdown List',
                                filterLength: 1,
                                filterable: false,
                                name: 'Env',
                                script: [
                                    $class: 'GroovyScript',
                                    fallbackScript: [
                                        classpath: [],
                                        sandbox: false,
                                        script: "return['Could not get the environments']"
                                    ],
                                    script: [
                                        classpath: [],
                                        sandbox: false,
                                        script: "return['dev','stage','prod']"
                                    ]
                                ]
                            ],
                            [$class: 'CascadeChoiceParameter',
                                choiceType: 'PT_SINGLE_SELECT',
                                description: 'Select the AMI from the Dropdown List',
                                name: 'AMI List',
                                referencedParameters: 'Env',
                                script: [
                                    $class: 'GroovyScript',
                                    fallbackScript: [
                                        classpath: [],
                                        sandbox: false,
                                        script: "return['Could not get Environment from Env Param']"
                                    ],
                                    script: [
                                        classpath: [],
                                        sandbox: false,
                                        script: '''
                                        if (Env.equals("dev")){
                                            return["ami-sd2345sd", "ami-asdf245sdf", "ami-asdf3245sd"]
                                        }
                                        else if(Env.equals("stage")){
                                            return["ami-sd34sdf", "ami-sdf345sdc", "ami-sdf34sdf"]
                                        }
                                        else if(Env.equals("prod")){
                                            return["ami-sdf34sdf", "ami-sdf34ds", "ami-sdf3sf3"]
                                        }
                                        '''
                                    ]
                                ]
                            ],
                            [$class: 'DynamicReferenceParameter',
                                choiceType: 'ET_ORDERED_LIST',
                                description: 'Select the AMI based on the following information',
                                name: 'Image Information',
                                referencedParameters: 'Env',
                                script: [
                                    $class: 'GroovyScript',
                                    fallbackScript: [
                                        classpath: [],
                                        sandbox: false,
                                        script: "return['Could not get AMI Information']"
                                    ],
                                    script: [
                                        classpath: [],
                                        sandbox: false,
                                        script: '''
                                        if (Env.equals("dev")){
                                            return["ami-sd2345sd: AMI with Java", "ami-asdf245sdf: AMI with Python", "ami-asdf3245sd: AMI with Groovy"]
                                        }
                                        else if(Env.equals("stage")){
                                            return["ami-sd34sdf: AMI with Java", "ami-sdf345sdc: AMI with Python", "ami-sdf34sdf: AMI with Groovy"]
                                        }
                                        else if(Env.equals("prod")){
                                            return["ami-sdf34sdf: AMI with Java", "ami-sdf34ds: AMI with Python", "ami-sdf3sf3: AMI with Groovy"]
                                        }
                                        '''
                                    ]
                                ]
                            ]
                        ])
                    ])
                }
            }
        }
    }
}

Jenkinsfile Parameter Best Practices

The following are some of the best practices you can
follow while using parameters in a Jenkinsfile.

Never pass passwords in the string or multi-line parameter blocks. Instead, use the password parameter, or access Jenkins credentials with a credential id as the parameter.
Try to use parameters only if required. Alternatively, you can use a config management tool to read configs or parameters at runtime.
Handle wrong parameter values in the stages with proper exception handling. It avoids unwanted step execution when a wrong parameter is provided. This typically happens with multi-line and string parameters.

Jenkinsfile Parameter FAQs

How to dynamically populate the choice parameter in the declarative pipeline?

Dynamic parameters can be achieved by using an active choice parameter. It uses a groovy script to dynamically populate the choice parameter values.

How are parameters used in the declarative pipeline?

In the declarative pipeline, parameters can be incorporated using the properties block. It supports all types of Jenkins parameters.

How to generate pipeline code for parameters?

You can use the native Jenkins pipeline syntax generator to generate the code block for any type of pipeline parameter.
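The exception-handling best practice above can be sketched as follows (a hedged example; the parameter name DEPLOY_ENV and its allowed values are assumptions for illustration, not from the original article):

```groovy
// Validate a string parameter early and fail fast with a clear message,
// instead of letting a bad value reach the deployment steps.
pipeline {
    agent any
    parameters {
        string(name: 'DEPLOY_ENV', defaultValue: 'dev', trim: true)
    }
    stages {
        stage('Validate parameters') {
            steps {
                script {
                    def allowed = ['dev', 'stage', 'prod']
                    if (!allowed.contains(params.DEPLOY_ENV)) {
                        // error() aborts the build before any later stage runs
                        error "Invalid DEPLOY_ENV '${params.DEPLOY_ENV}'; expected one of ${allowed}"
                    }
                }
            }
        }
    }
}
```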
  19. Jenkins is the widely adopted open source continuous integration tool. A lot has changed in Jenkins 2.x when compared to the older version. In this Jenkins tutorial series, we will try to cover all the essential topics for a beginner to get started with Jenkins.

Jenkins is not just a continuous integration tool anymore. It is a continuous integration and continuous delivery tool. You can orchestrate any application deployment using Jenkins, with a wide range of plugins and native Jenkins workflows.

Jenkins Tutorials For Beginners

In this collection of Jenkins tutorial posts, we will be covering various Jenkins tutorials, which will help beginners to get started with many of the Jenkins core functionalities. Following is the list of Jenkins beginner tutorials. It is a growing list of Jenkins step-by-step guides.

Jenkins Administration

Jenkins Architecture Explained
Installing and Configuring Jenkins 2.0
Setting up Jenkins on a Kubernetes Cluster
Configure SSL on Jenkins Server
Setting up a Distributed Jenkins Architecture (Master and Slaves)
Backing up Jenkins Data and Configurations
Setting up Custom UI for Jenkins
Running Jenkins on Port 80

Jenkins Pipeline Development

Jenkins Pipeline as Code Tutorial for Beginners
Beginner Guide to Parameters in Declarative Pipeline
Jenkins Shared Library Explained
Creating Jenkins Shared Library
Jenkins Multi-branch Pipeline Detailed Guide for Beginners

Scaling Jenkins

Configuring Docker Containers as Build Slaves
Configuring ECS as Build Slave for Jenkins

CI/CD With Jenkins

Java Continuous Integration with Jenkins
Jenkins PR-based Builds with Github Pull Request Builder Plugin

Jenkins Core Features

Let's have a look at the overview of key Jenkins 2.x features that you should know:

Pipeline as Code
Shared Libraries
Better UI and UX
Improvements in security and plugins

Pipeline as Code

Jenkins introduced a DSL by which you can version your build, test, and deploy pipelines as code.
Pipeline code is wrapped in a groovy script that is easy to write and manage. An example pipeline code is shown below.

node('linux') {
    git url: 'https://github.com/devopscube/simple-maven-pet-clinic-app.git'
    def mvnHome = tool 'M2'
    env.PATH = "${mvnHome}/bin:${env.PATH}"
    sh 'mvn -B clean verify'
}

Using pipeline as code, you can run parallel builds on a single job on different slaves. Also, you have good programmatic control over how and what each Jenkins job should do. A Jenkinsfile is the best way to implement pipeline as code.

There are two types of pipeline as code:

Scripted pipeline
Declarative pipeline

Our recommendation is to use only the declarative pipeline for all your Jenkins-based CI/CD workflows, as you will have more control and customization over your pipelines.

Jenkins Shared Libraries

A Jenkins shared library is a great way to reuse pipeline code. You can create libraries of your CI/CD code which can be referenced in your pipeline script. Extended shared libraries allow you to write custom groovy code for more flexibility.

Jenkins X

Jenkins X is a project from Jenkins for CI/CD on Kubernetes. This project is entirely different from normal Jenkins.

Better UI and UX

Jenkins 2.0 has a better user interface. The pipeline design is also great, in which the whole flow is visualized. Now you can configure the user, password, and plugins right from the moment you start the Jenkins instance through the new UI. Also, Jenkins Blue Ocean is a great plugin that gives a clean view of pipeline jobs. You can even create a pipeline using the Blue Ocean visual pipeline editor. Blue Ocean looks like the following.
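Since the recommendation above is to prefer declarative pipelines, here is a hedged declarative equivalent of the scripted Maven example shown earlier (the 'M2' tool name and repository URL come from the article; treating 'M2' as a Maven installation is my assumption):

```groovy
// Declarative version of the scripted Maven build above.
pipeline {
    agent { label 'linux' }
    tools {
        // Assumes 'M2' is a Maven installation configured under
        // Manage Jenkins -> Global Tool Configuration.
        maven 'M2'
    }
    stages {
        stage('Build') {
            steps {
                git url: 'https://github.com/devopscube/simple-maven-pet-clinic-app.git'
                sh 'mvn -B clean verify'
            }
        }
    }
}
```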
  20. Jenkins 2.x has lots of great functionalities that make the CI pipeline smooth, using pipeline as code, and reusable with shared libraries. In this guide, we will walk you through the steps for installing and configuring Jenkins on an Ubuntu server in 10 easy steps. Also, we have added the steps to install Jenkins using Docker on an Ubuntu server.

Install and Configure Jenkins on Ubuntu

Follow the steps given below to install and configure Jenkins 2 on an Ubuntu server.

Note: CentOS/RedHat users, follow this tutorial: Install Jenkins on CentOS/RedHat.

Step 1: Log in to the server and update it.

sudo apt-get -y update

Step 2: Install OpenJDK 11.

sudo apt install openjdk-11-jdk -y

Step 3: Add the Jenkins Debian repo.

wget -q -O - https://pkg.jenkins.io/debian/jenkins.io.key | sudo apt-key add -
sudo sh -c 'echo deb https://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'

Step 4: Update the packages.

sudo apt-get update -y

Step 5: Install the latest LTS Jenkins.

sudo apt-get install jenkins -y

Step 6: Start the Jenkins service and enable it to start during bootup.

sudo systemctl start jenkins
sudo systemctl enable jenkins

You can check the status of the Jenkins service using the following command.

sudo systemctl status jenkins

Step 7: Now you will be able to access the Jenkins server on port 8080, from localhost or using the IP address, as shown below.

Step 8: As you can see in the above image, you need to provide the administrative password. You can get the password using the following command.

sudo cat /var/lib/jenkins/secrets/initialAdminPassword

Copy the password and click continue.

Step 9: Next, you will be asked to configure plugins as shown below. Select the "Install Suggested Plugins" option. This will install all the required plugins for building your projects. It will take a few minutes to install the plugins.
Step 10: Once installed, you need to create a user with a password and click "Save and Finish". Click "Start Using Jenkins" and it will take you to the Jenkins dashboard. Log in using the username and password that you just created.

That's it! Now you have a fully functional Jenkins server up and running. Consider setting up a Jenkins backup using the backup plugin.

Here are some key configurations and file locations in Jenkins that you should know.

Note: For a production setup, the recommended approach is to mount the Jenkins data folder to an additional data disk. This way, you don't lose any Jenkins data if the server crashes.

Jenkins data location: /var/lib/jenkins
Jenkins main configuration file: /var/lib/jenkins/config.xml
Jobs folder: /var/lib/jenkins/jobs

Next would be the configuration of a distributed master-slave setup, wherein you will have an active master and slave agents for building the projects. Check the Jenkins SSL setup guide if you want to set up SSL for your Jenkins instance.

Setting up Jenkins Using Docker on Ubuntu

If you are a Docker user, you can run Jenkins in a Docker container. Refer to the Docker installation document to install the latest edition of Docker.

Execute the following command to deploy Jenkins on Docker:

docker run -p 8080:8080 -p 50000:50000 --name jenkins jenkinsci/jenkins:latest

The above command won't persist any changes if the container crashes. So it is better to mount a host volume to the container to hold all the Jenkins configurations. Here is the command to deploy the Jenkins container with a host volume mount.

docker run -p 8080:8080 -p 50000:50000 -v /home/ubuntu:/var/jenkins_home jenkinsci/jenkins:latest
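One small refinement to the bind-mount command above (my suggestion, not from the original article): a Docker named volume avoids host-path permission issues, since the jenkins user inside the container may not own /home/ubuntu.

```shell
# Create a named volume managed by Docker and mount it as the Jenkins home.
docker volume create jenkins_home

# Run Jenkins detached, with the named volume holding all configuration.
docker run -d -p 8080:8080 -p 50000:50000 \
  --name jenkins \
  -v jenkins_home:/var/jenkins_home \
  jenkinsci/jenkins:latest
```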
  21. If you are looking for well-automated pull request based or branch-based Jenkins continuous integration & delivery (CI/CD) pipelines, this guide will help you get the overall picture of how to achieve it using the Jenkins multibranch pipeline.

Jenkins's multi-branch pipeline is one of the best ways to design CI/CD workflows, as it is entirely a git-based (source control) pipeline as code. In this guide, I will talk about all the key concepts involved in a Jenkins multi-branch pipeline setup.

Jenkins Multibranch Pipeline Fundamentals

Let's start with the multi-branch pipeline basics. Specifically, in this section, I'm going to cover what a multi-branch pipeline is and why it is essential to use it for all Jenkins CI/CD pipelines. I'll also show you how a multi-branch pipeline works with a detailed workflow diagram.

What is a Multi-branch Pipeline?

A multi-branch pipeline is a concept of automatically creating Jenkins pipelines based on Git branches. Meaning, it can automatically discover new branches in the source control (Github) and automatically create a pipeline for each branch. When the pipeline build starts, Jenkins uses the Jenkinsfile in that branch for the build stages. The SCM (source control) can be a Github, Bitbucket, or Gitlab repo.

You can choose to exclude selected branches, if you don't want them in the automated pipeline, with Java regular expressions.

The multi-branch pipeline supports PR-based branch discovery. Meaning, branches get discovered automatically in the pipeline if someone raises a PR (pull request) from a branch. If you have this configuration enabled, builds will get triggered only if a PR is raised. So if you are looking for a PR-based Jenkins build workflow, this is a great option.

You can add conditional logic to the Jenkinsfile to build jobs based on the branch requirement.
For example, if you want the feature branch to run only unit testing and sonar analysis, you can have a condition to skip the deployment stage with a when condition, as shown below. So whenever a developer raises a PR from the feature branch to some other branch, the pipeline will run the unit testing and sonar analysis stages, skipping the deployment stage.

Also, multi-branch pipelines are not limited to the continuous delivery of applications. You can use them to manage your infrastructure code as well. One such example is having a continuous delivery pipeline for a Docker image or a VM image patching, building, and upgrade process.

How Does a Multi-Branch Pipeline Work?

I will walk you through a basic build and deployment workflow to understand how a multi-branch pipeline works. Let's say I want a Jenkins pipeline to build and deploy an application with the following conditions:

Development starts with a feature branch, by developers committing code to the feature branch.
Whenever a developer raises a PR from the feature branch to the develop branch, a Jenkins pipeline should trigger to run a unit test and static code analysis.
After testing the code successfully in the feature branch, the developer merges the PR to the develop branch.
When the code is ready for release, developers raise a PR from the develop branch to master. It should trigger a build pipeline that will run the unit test cases and code analysis, and deploy to the dev/QA environments.

From the above conditions, you can see that there is no manual trigger of Jenkins jobs; whenever there is a pull request for a branch, the pipeline needs to be triggered automatically and run the required steps for that branch. This workflow builds a great feedback loop for engineers and avoids having a dependency on the DevOps team to build and deploy in non-prod environments. Developers can check the build status on Github and decide what to do next.
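The skip condition described above can be sketched like this (a hedged fragment; the stage name is illustrative, while the develop branch name mirrors the article's example workflow):

```groovy
// Deploy only on the develop branch; feature-branch PR builds run the
// earlier test/analysis stages and skip this stage entirely.
stage('Deploy') {
    when {
        branch 'develop'
    }
    steps {
        sh 'echo "Deploying to dev/QA"'
    }
}
```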
This workflow can be achieved easily through a Jenkins multi-branch pipeline. The following image shows what a multi-branch pipeline workflow would look like for the above example build process.

Here is how the multi-branch pipeline works:

When a developer creates a PR from a feature branch to the develop branch, Github sends a webhook with the PR information to Jenkins.
Jenkins receives the PR, finds the relevant multibranch pipeline, and creates a feature branch pipeline automatically. It then runs the jobs with the steps mentioned in the Jenkinsfile from the feature branch. During checkout, the source and target branches in the PR get merged.
The PR merge will be blocked on Github until a build status from Jenkins is returned. Once the build finishes, Jenkins will update the status on the Github PR. Now you will be able to merge the code.

Also, if you want to check the Jenkins build logs, you can find the Jenkins build log link in the PR status.

Multibranch Pipeline Jenkinsfile

Before jumping into the implementation, let's have a look at an example Jenkinsfile that can be used in a multibranch pipeline. For the multibranch pipeline to work, you need to have the Jenkinsfile in the SCM repo. If you are learning/testing, you can use the multibranch pipeline Jenkinsfile given below. It has a checkout stage and other dummy stages which echo messages. Also, you can clone and use this Github repo, which has this Jenkinsfile.

Note: Replace the agent label "master" with your Jenkins agent name. master will also work, but I wouldn't advise using it in actual project environments.
pipeline {
    agent {
        node {
            label 'master'
        }
    }
    options {
        buildDiscarder logRotator(
            daysToKeepStr: '16',
            numToKeepStr: '10'
        )
    }
    stages {
        stage('Cleanup Workspace') {
            steps {
                cleanWs()
                sh """
                echo "Cleaned Up Workspace For Project"
                """
            }
        }
        stage('Code Checkout') {
            steps {
                checkout([
                    $class: 'GitSCM',
                    branches: [[name: '*/main']],
                    userRemoteConfigs: [[url: 'https://github.com/spring-projects/spring-petclinic.git']]
                ])
            }
        }
        stage('Unit Testing') {
            steps {
                sh """
                echo "Running Unit Tests"
                """
            }
        }
        stage('Code Analysis') {
            steps {
                sh """
                echo "Running Code Analysis"
                """
            }
        }
        stage('Build Deploy Code') {
            when {
                branch 'develop'
            }
            steps {
                sh """
                echo "Building Artifact"
                """
                sh """
                echo "Deploying Code"
                """
            }
        }
    }
}

Setup Jenkins Multi-branch Pipeline

Here I will walk you through the step-by-step process of setting up a multi-branch pipeline on Jenkins. This setup is based on Github and the latest Jenkins 2.x version. You can also use Bitbucket or Gitlab as the SCM source for a multi-branch pipeline.

Create Multibranch Pipeline on Jenkins (Step by Step Guide)

Step 1: From the Jenkins home page, create a "new item".

Step 2: Select "Multibranch pipeline" from the options and click OK.

Step 3: Click "Add a Source" and select Github.

Step 4: Under the credentials field, select Jenkins, and create a credential with your Github username and password.

Step 5: Select the created credentials and provide your Github repo to validate the credentials, as shown below. If you are testing the multi-branch pipeline, you can clone the demo Github repo and use it: https://github.com/devopscube/multibranch-pipeline-demo.

Step 6: Under "Behaviours", select the option that matches your requirement. You can either choose to discover all the branches in the repo or only branches with a pull request. The pipeline can discover branches with a PR from a forked repo as well. Choosing these options depends on your required workflow. There are additional behaviors you can choose from the "add" button.
For example, if you choose not to discover all the branches from the repo, you can opt for the regular expression or wildcard method to discover branches, as shown below. Here is a regex and wildcard example.

Step 7: If you choose to have a different name for the Jenkinsfile, you can do so by specifying it in the build configuration. In the "Script Path" option, you can provide the required name. Make sure the Jenkinsfile is present in the repo with the same name you provide in the pipeline configuration. Also, enable "Discard old builds" to keep only the required build logs, as shown below.

Step 8: Save all the job configurations. Jenkins scans the configured Github repo for all the branches that have a PR raised. The following image shows the job scanning the three branches; since I haven't raised any pull requests, Jenkins won't create any branch-based pipelines. I will show how to test the automatic pipeline creation after the webhook setup.

Till now, we have done the configurations on the Jenkins side to scan branches based on PRs. To have a complete workflow, we need a webhook configured in Github to send all the repo events (commits, PRs, etc.) to Jenkins, so that the pipelines can be triggered automatically.

Configure Webhook For Multibranch Pipeline

Follow the steps given below to set up the Jenkins webhook on the repo.

Step 1: Head over to the Github repo and click on the settings.

Step 2: Select the webhook option on the left and click the "Add Webhook" button.

Step 3: Add your Jenkins URL followed by "/github-webhook/" under payload URL. Select the content type as "application/json" and click "Add Webhook".

Note: You can choose what type of events you want Jenkins to receive. For example, if you want to trigger the pipeline only on PRs, you can select just the PR event from the "Let me select individual events" option.

You should see a green tick mark on a successful webhook configuration, as shown below.
If you don't see a green tick or see a warning sign, click on the webhook link, scroll down to "Recent Deliveries", and click on the last webhook. You should be able to view why the webhook delivery failed, along with the status code.

Now we are done with all the required configurations for the multi-branch pipeline. The next step is to test the multi-branch pipeline workflow triggers.

Test Multi-branch Pipeline

For demo purposes, I have chosen the option to discover only branches that are filed as a PR. With this option, only the branches with a PR raised get discovered. To play around with a multi-branch pipeline, you can use this repo with a sample Jenkinsfile –> Multibranch Pipeline Demo Repo. This repo has three branches: master, develop, and feature.

Update some content in the readme file in the feature branch and raise a PR to develop. It will send a webhook to Jenkins, Jenkins will send back the Jenkins job details, and the PR will go into a check state as shown below. If you click "Details", it will take you to the Jenkins build log. You can write custom checks in your Jenkinsfile that can be used for the build reviews.

Now, if you check Jenkins, you will find a pipeline for the feature branch, as shown below. If the build fails, you can commit changes to the feature branch, and as long as the PR is open, it will trigger the feature pipeline.

In the Jenkinsfile, I have added a condition to skip the deploy stage if the branch is not develop. You can check that in the Jenkins build log. Also, if you check the build flow in the Blue Ocean dashboard, you can clearly see the skipped deployment stage, as shown below.

Now merge the feature branch PR and raise a new PR from develop to the master branch. Jenkins will receive the webhook from Github for the new PR, and the develop pipeline gets created, as shown below. For the develop branch, the deploy stage is enabled, and if you check the Blue Ocean build flow, you can see all the stages successfully triggered.
Troubleshooting Multibranch Pipelines

I will talk about a few possible errors in a multibranch pipeline that you might encounter and how to troubleshoot them.

Branch Discovery Issue

Sometimes, even after creating new branches in the SCM, they might not be reflected in the Jenkins pipeline. You can try running the "Scan Repository Now" option to scan the repo again. Also, check the repository scan configurations in the pipeline.

PR Webhooks Not Triggering the Pipelines

When a webhook is not triggering the pipeline, check the webhook delivery in Github for the status code and error. Also, check that the Jenkins URL is correct. Additionally, check the Jenkins logs from Manage Jenkins --> System Logs --> All Jenkins logs. If Jenkins is able to receive the webhook, the log should show the reason why the jobs are not getting triggered.

Commits Not Triggering Pipeline

If you want each commit to trigger the branch pipeline, then you should select the "Discover All Branches" option in the branch discovery configuration. So whenever you commit a change to the discoverable branches or raise a PR, the pipeline will automatically get triggered.

Multibranch Pipeline Best Practices

Let's have a look at some of the best practices for a multibranch pipeline.

Repo Branching – Have a Standard Structure

It is essential to have a standard branching structure for your repositories. Whether it is your application code or infra code, having standard branching will reduce inconsistent configurations across different pipelines.

Shared Libraries – Reusable Pipeline Code

Make use of shared libraries for all your multi-branch pipelines. Reusable libraries make it easy to manage all the pipeline stages in a single place.

Pull Request Vs Commit Triggers

Try to use a PR-based pipeline rather than a commit-based one. If a code repo gets continuous commits, it might overwhelm Jenkins with many builds. Commit-based triggers are supported in PR-based discovery as well.
Here the commit trigger happens only while the PR is still open.

Jenkins Pipeline Vs. Multibranch Pipeline

A normal pipeline job is meant for building a single branch from the SCM and deploying to a single environment. A multibranch pipeline is meant for building multiple branches from a repository and deploying to multiple environments, if required. A pipeline job supports pipeline steps added both in the Jenkins configuration and from SCM. Use a pipeline job for ad hoc jobs, parameterized job executions, and to debug pipeline as code. Do not use a multibranch pipeline if you do not have a standard branching and CI/CD strategy.

Multibranch Pipeline Vs. Github Organization Job

Like a multi-branch pipeline, the Github organization folder is one of the Jenkins project types.

Note: The organization folder is not just limited to Github. It can be used for Gitlab, Bitbucket teams, or a Gitea organization.

A multibranch pipeline can only configure pipelines for a single Git repository, whereas a Jenkins Github organization project can automatically configure multi-branch pipelines for all the repos in a Github organization. It can discover all the repositories in the configured Github organization that have a Jenkinsfile. Also, you can configure a generic webhook at the organization level to avoid having webhooks in each repo. The only difference between a multi-branch and an organization project is that organizations can configure multi-branch pipelines for multiple repos.

So which one should I use? It totally depends on the workflow you need. If you have a standard pipeline and process for deploying applications or infra code, a Github organization project is great. Otherwise, configuring multibranch pipelines separately will be a good option.

I'd like to hear from you

That's all for my guide to multi-branch pipelines. What type of Jenkins pipelines are you using? Do you think a multi-branch pipeline will add value to your workflows?
Or maybe you want to stick to regular Jenkinsfile pipelines. Either way, let me know by leaving a comment.
  22. Building projects based on pull requests is something you cannot avoid in CI/CD pipelines. Nowadays, every team does several deployments/operations per day, and lots of builds have to happen in this process. Also, teams working on the same repo, collaborating on code, require faster code integrations. So it is better to have an automated build process that kicks off the CI/CD pipeline on a pull request, rather than manually triggering the jobs.

Trigger Builds Automatically On Github Pull Request

In this tutorial, we will explain how to configure a pull request based build trigger on Jenkins using Github webhooks and the Github Pull Request Builder plugin.

Note: A multibranch pipeline is the best way to achieve a Jenkins pull request based workflow, as it is natively available in Jenkins. Check out this article on the multibranch pipeline for setup and configuration.

Install Github Pull Request Builder Plugin

Go to Manage Jenkins --> Manage Plugins. Click on the available tab at the top and search for Github Pull Request Builder. Select the plugin using the checkbox and click Install without restart, as shown in the image below. Once the plugin is installed, select the restart checkbox, as shown in the image below.

Github Pull Request Builder Configuration

Once Jenkins is restarted, follow the steps given below to configure the plugin with your GitHub account.

Head over to Manage Jenkins --> Configure System. Find the "GitHub Pull Request Builder" section and click add credentials. Enter your Github username and password and add it. You can test the Github API connection using the test credentials button. It should show "connected", as shown below. Save the configuration after testing the API connection.

Github Repo Webhook Configuration

For Jenkins to receive PR events through the pull request plugin, you need to add the Jenkins pull request builder payload URL in the Github repository settings.
Go to the Github repository settings, and under webhooks, add the Jenkins pull request builder payload URL. It has the following format:

http://<Jenkins-IP>:<port>/ghprbhook/

If you need just the PR triggers, you can select the "Let me select individual events" option and select just the "Pull requests" option. Save the webhook after selecting the required events. Once saved, go back to the webhook option and see if there is a green tick. It means Github is able to successfully deliver the events to the Jenkins webhook URL.

Job Configuration for Automated Pull Request Builds

Let's get started with the build job configuration for the PR plugin.

Under the General tab, select the Github project option and enter the Github repo URL for which you want the PR builds, without the .git extension, as shown below. Click the advanced option, enable the automatic PR build trigger, and add the target branches you would raise the PR for. Add your pipeline build steps and save the configuration.

Now raise a PR against the whitelisted branch you have given in the Jenkins PR trigger settings. You should see the job getting triggered on Jenkins.

Other Jenkins PR based Build Workflows

The Github Pull Request Builder plugin is not actively developed, as the same functionality is provided by multi-branch pipelines and the Github organization project. There is also a Generic Webhook Plugin that can be used to trigger Jenkins jobs on a pull request. Also, you can write custom API endpoints that accept Github webhooks and process PR requests to trigger Jenkins jobs remotely. Custom APIs help only when the native Jenkins functionalities do not provide the workflow you are looking for.