Efficiency and Quality Unleashed: Exploring CI/CD Pipelines and Creating Your Very First CI/CD Pipeline Using Jenkins
Introduction
A CI/CD (Continuous Integration/Continuous Delivery) pipeline is a software development approach that automates the building, testing, and deployment of applications. CI/CD is a practice focused on improving software delivery throughout the software development life cycle via automation. Although development, testing, deployment, and monitoring can all be done manually, CI/CD automates the process and helps reduce errors.
Continuous Integration
Continuous integration (CI) is a software development practice in which code changes from multiple developers are frequently and automatically integrated into a shared repository, so that integration issues are detected and addressed early in the development cycle.
Developers push their code to a Source Code Management (SCM) tool. CI/CD tools like Jenkins and CircleCI then automatically build the software, run various tests, and check for potential issues. If the code contains an error or shows other problems, the developer is informed within a short time and can fix the issue immediately, which keeps the error rate low and saves both time and money.
CI is not strictly a prerequisite for creating a stable software product. However, it plays an important role when developing software products or components that require frequent changes, and it also ensures that all the components of an application are integrated properly.
In the SDLC, CI mainly covers the Source and Build phases. A CI pipeline typically involves these steps:
Detect changes in the code
Analyze the quality of the source code
Build
Execute all unit tests
Execute all integration tests
Generate deployable artifacts
Report status
If any of the above steps fails, the integration process stops immediately and the developer is informed.
CI tools like Jenkins can integrate with various other tools and plugins to build and test the code, for example:
Maven: Jenkins can integrate with Maven to build, test, and package Java projects automatically.
Gradle: It provides a flexible and efficient way to build, test, and deploy software projects, not limited to Java.
SonarQube: It is a popular code quality management platform that analyzes code for bugs, vulnerabilities, and code smells.
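To make this concrete, here is a hedged sketch of a Jenkinsfile stage that drives Maven and SonarQube together. It assumes the SonarQube Scanner plugin is installed, and the server name 'sonar-server' is a hypothetical identifier you would configure under Manage Jenkins; adjust both to your setup.

```groovy
// Sketch: one pipeline stage that builds with Maven and analyzes with SonarQube.
// 'sonar-server' is a hypothetical name configured in Jenkins' SonarQube settings.
pipeline {
    agent any
    stages {
        stage('Build & Analyze') {
            steps {
                sh 'mvn clean verify'               // compile, run unit tests, package
                withSonarQubeEnv('sonar-server') {  // injects the SonarQube URL and token
                    sh 'mvn sonar:sonar'            // send analysis results to SonarQube
                }
            }
        }
    }
}
```

The `withSonarQubeEnv` step comes from the SonarQube Scanner plugin; without it, you would have to pass the server URL and token to Maven by hand.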
Continuous Delivery
If the application passes all tests in the pipeline, it moves to the Continuous Delivery stage. In this stage, the application is prepared for release, including additional validation, security checks, and documentation generation. The application can be deployed to a production-like environment for final testing and validation by stakeholders.
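In Jenkins, the gap between "ready for release" and "released" is often modeled with a manual approval gate using the built-in input step. A minimal sketch, assuming a hypothetical deploy.sh script for illustration:

```groovy
// Sketch: Continuous Delivery with a manual promotion gate.
// './deploy.sh' is a hypothetical deployment script, not a real Jenkins feature.
pipeline {
    agent any
    stages {
        stage('Deploy to Staging') {
            steps {
                sh './deploy.sh staging'   // deploy to a production-like environment
            }
        }
        stage('Release') {
            steps {
                // Pause the pipeline until a human approves the promotion.
                input message: 'Promote this build to production?'
                sh './deploy.sh production'
            }
        }
    }
}
```

Removing the input step is effectively what turns Continuous Delivery into Continuous Deployment.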
Continuous Deployment
Continuous Deployment goes one step further than Continuous Delivery: every change that passes all stages of the pipeline is released to production automatically, without a manual approval step. The CD stages start where CI ends; the pipeline pushes the code through a testing environment where different tests, such as system testing, unit testing, and integration testing, are performed before the release.
A typical CI/CD pipeline works in 4 phases:
Phase 1: Commit - This is the actual phase where developers commit changes to the code.
Phase 2: Build - In this phase, the source code is integrated into the build from the repository.
Phase 3: Test Automation - This step is an integral part of any CI/CD pipeline. The source code previously integrated into the build is subjected to a systematic cycle of testing.
Phase 4: Deploy - The tested version is finally sent for deployment in this phase.
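The last three phases map naturally onto stages in a declarative Jenkinsfile (the commit phase happens in the SCM tool before the pipeline starts). A minimal skeleton, where the stage names and shell commands are illustrative and assume a Node.js project, might look like:

```groovy
// Sketch: the Build, Test Automation, and Deploy phases as Jenkins stages.
// The shell commands and './deploy.sh' script are illustrative assumptions.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh 'npm ci' }              // pull dependencies, produce the build
        }
        stage('Test') {
            steps { sh 'npm test' }            // systematic automated test cycle
        }
        stage('Deploy') {
            steps { sh './deploy.sh staging' } // ship the tested version
        }
    }
}
```

If any stage fails, Jenkins stops the pipeline and marks the build as failed, which is exactly the "stop and inform the developer" behavior described above.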
Writing Our Very First Pipeline Using Jenkins
We will use Docker as the agent while building our application, since spinning up containers on demand is more efficient than running builds directly on the Jenkins master.
AWS EC2 Instance
First, create an AWS EC2 instance. You can use AWS's own browser-based terminal, or connect to the EC2 instance from your own terminal over SSH.
Installing Jenkins
After creating the AWS EC2 instance, we install Jenkins on that server using the commands below. Jenkins requires Java, so we install Java first.
Install Java
sudo apt update
sudo apt install openjdk-11-jre
Verify Java is Installed
java -version
Installing Jenkins
curl -fsSL https://pkg.jenkins.io/debian/jenkins.io-2023.key | sudo tee \
  /usr/share/keyrings/jenkins-keyring.asc > /dev/null
echo deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc] \
  https://pkg.jenkins.io/debian binary/ | sudo tee \
  /etc/apt/sources.list.d/jenkins.list > /dev/null
sudo apt-get update
sudo apt-get install jenkins
By default, Jenkins will not be accessible from the outside world because of AWS's inbound traffic restrictions. Edit the inbound rules of the instance's security group to grant access to port 8080.
Login to Jenkins
Now we log in to Jenkins at http://ec2-public-ip-address:8080, unlock it with the initial admin password stored in /var/lib/jenkins/secrets/initialAdminPassword, configure Jenkins, and install the recommended plugins. We also install the Docker Pipeline plugin, since we use Docker as an agent.
We create a pipeline project in Jenkins and write our very first Jenkinsfile in Groovy. It contains only one stage, test, which checks the Node version inside the Node Docker image we use as the agent. This is just a basic pipeline created while learning about the CI/CD pipeline.
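A minimal version of that Jenkinsfile could look like the sketch below. It assumes the Docker Pipeline plugin is installed and that Docker is running on the Jenkins host; node:16-alpine is an illustrative image choice, not a requirement.

```groovy
// Sketch: a single-stage pipeline that runs inside a Node Docker container.
// The image tag 'node:16-alpine' is an illustrative assumption.
pipeline {
    agent {
        docker { image 'node:16-alpine' }  // every step runs inside this container
    }
    stages {
        stage('test') {
            steps {
                sh 'node --version'        // print the Node version inside the agent
            }
        }
    }
}
```

The docker agent directive pulls the image, runs all pipeline steps inside a container created from it, and discards the container when the build finishes, so the Jenkins host itself stays clean.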
We use Docker as the agent when building the Jenkins pipeline. When many teams work on the same Jenkins installation, the master is used only for scheduling: everything cannot run on the master itself because the different teams would install conflicting packages, so builds are sent to dedicated worker nodes instead.
With the advancement of microservices and Kubernetes, and the increase in load, this model becomes inefficient. If an EC2 instance is allocated to each application and service, cost grows as services keep increasing, and if a service is not used, its allocated EC2 instance sits idle. Using Docker containers as the agents for the Jenkins pipeline avoids this and has many advantages over static workers: Docker's resource efficiency reduces the footprint of build agents, minimizing resource consumption such as memory and CPU.
Conclusion
The CI phase ensures that code changes are continuously integrated and tested, promoting collaboration and early bug detection. The CD phase takes this a step further by automating the deployment and release process, allowing for frequent and reliable deployments to production environments.
Here we looked at CI/CD and the various tools used to create pipelines, such as Jenkins and CircleCI. We also created our very first pipeline using Jenkins on an AWS EC2 instance, with Docker as our agent, although it contains only one stage. That's all for now; in the next article we will create a detailed CI/CD pipeline using various tools for testing and building, plus CD tools for deployment to Kubernetes.