This is Part 2 of the Jenkins guide. If you haven’t read it yet, start with Part 1: Build Automation & CI/CD Fundamentals.
Jenkins Shared Libraries
When you have multiple microservices or projects, you’ll notice repetitive code in Jenkinsfiles. Shared Libraries solve this.
Problem:
- 10 Java microservices, each with similar Jenkinsfile
- All need to: build JAR, build Docker image, push to registry
- Any change requires updating 10 Jenkinsfiles
Solution: Shared Library
- Create a separate Git repository for common pipeline code
- Reference it in Jenkinsfiles
- Make changes once, all pipelines benefit
Shared Library Structure
jenkins-shared-library/
├── vars/
│ ├── buildJar.groovy
│ ├── buildImage.groovy
│ ├── dockerLogin.groovy
│ └── dockerPush.groovy
├── src/
│ └── com/
│ └── example/
│ └── Docker.groovy
└── resources/
└── config.json
vars/: Global functions callable directly from Jenkinsfile
src/: Helper classes and complex logic
resources/: Static resources, configuration files
Creating a Shared Library
vars/buildJar.groovy:
#!/usr/bin/env groovy
def call() {
    echo "Building from branch: ${GIT_BRANCH}"
    sh "mvn clean package"
}
vars/buildImage.groovy:
#!/usr/bin/env groovy
def call(String imageName) {
    echo "Building Docker image: ${imageName}"
    sh "docker build -t ${imageName} ."
}
vars/dockerLogin.groovy:
#!/usr/bin/env groovy
def call() {
    withCredentials([usernamePassword(
        credentialsId: 'docker-hub',
        passwordVariable: 'PASS',
        usernameVariable: 'USER'
    )]) {
        // Single quotes: the shell expands the secrets, so Groovy never
        // interpolates them into the command (avoids leaking them in logs)
        sh 'echo $PASS | docker login -u $USER --password-stdin'
    }
}
vars/dockerPush.groovy:
#!/usr/bin/env groovy
def call(String imageName) {
    echo "Pushing image: ${imageName}"
    sh "docker push ${imageName}"
}
Making Library Globally Available
Manage Jenkins → System → Global Pipeline Libraries:
- Name: jenkins-shared (you’ll use this name to import)
- Default version: main (branch name)
- Retrieval method: Modern SCM → Git
- Project repository: Your shared library Git URL
- Credentials: If private repository
Using Shared Library in Jenkinsfile
Option 1: Global Library (configured in Jenkins):
@Library('jenkins-shared')_ // Underscore if no import statement follows
pipeline {
    agent any
    tools {
        maven "maven-3.9.11"
    }
    stages {
        stage('Build Jar') {
            steps {
                script {
                    buildJar() // Directly call shared library function
                }
            }
        }
        stage('Build & Push Image') {
            steps {
                script {
                    def imageName = "myapp:${BUILD_NUMBER}"
                    buildImage(imageName)
                    dockerLogin()
                    dockerPush(imageName)
                }
            }
        }
    }
}
Option 2: Library Scoped to Jenkinsfile:
library identifier: 'jenkins-shared@main', retriever: modernSCM(
    [$class: 'GitSCMSource',
     remote: 'https://gitlab.com/yourorg/jenkins-shared.git',
     credentialsId: 'gitlab-credentials']
)

pipeline {
    // Rest of pipeline
}
Advanced: Using src/ Directory for Reusability
When Docker functions are needed across multiple shared library functions, move them to src/:
src/com/example/Docker.groovy:
package com.example
class Docker implements Serializable {
    def script // Reference to Jenkins pipeline script

    Docker(script) {
        this.script = script
    }

    def buildImage(String imageName) {
        script.echo "Building Docker image: ${imageName}"
        script.sh "docker build -t ${imageName} ."
    }

    def dockerLogin() {
        script.withCredentials([
            script.usernamePassword(
                credentialsId: 'docker-hub',
                passwordVariable: 'PASS',
                usernameVariable: 'USER'
            )
        ]) {
            // PASS and USER are injected as environment variables,
            // so the shell (not Groovy) expands them
            script.sh 'echo $PASS | docker login -u $USER --password-stdin'
        }
    }

    def dockerPush(String imageName) {
        script.echo "Pushing image: ${imageName}"
        script.sh "docker push ${imageName}"
    }
}
Why implements Serializable? Jenkins pipelines can pause and resume (e.g., waiting for user input). Objects need to be serializable to persist state.
Why pass script? The class needs access to Jenkins DSL methods like sh, echo, withCredentials.
Updated vars/buildImage.groovy:
#!/usr/bin/env groovy
import com.example.Docker
def call(String imageName) {
    return new Docker(this).buildImage(imageName)
}
Updated vars/dockerLogin.groovy:
#!/usr/bin/env groovy
import com.example.Docker
def call() {
    return new Docker(this).dockerLogin()
}
Updated vars/dockerPush.groovy:
#!/usr/bin/env groovy
import com.example.Docker
def call(String imageName) {
    return new Docker(this).dockerPush(imageName)
}
Now the Docker logic is centralized, making it easier to maintain and test.
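The src/ class can also be used directly from a Jenkinsfile, bypassing the vars/ wrappers entirely. This is handy when you want to hold one helper object across several steps. A sketch, assuming the global library name jenkins-shared configured earlier:

```groovy
#!/usr/bin/env groovy
@Library('jenkins-shared')_
import com.example.Docker

pipeline {
    agent any
    stages {
        stage('Build & Push') {
            steps {
                script {
                    // Instantiate the helper once, reuse it across steps
                    def docker = new Docker(this)
                    def imageName = "myapp:${BUILD_NUMBER}"
                    docker.buildImage(imageName)
                    docker.dockerLogin()
                    docker.dockerPush(imageName)
                }
            }
        }
    }
}
```

The vars/ wrappers remain the friendlier interface for most teams; direct class usage mainly pays off when the class carries state between calls.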
Triggering Jenkins Builds
1. Manual Triggers
Click “Build Now” in Jenkins UI. Useful for:
- Production deployments requiring manual approval
- Testing pipelines
- On-demand builds
2. Scheduled Triggers
Run builds at specific times using cron syntax:
pipeline {
    agent any
    triggers {
        cron('H 2 * * *') // Run daily at 2 AM (H for random minute)
    }
    stages {
        // Your stages
    }
}
Use cases:
- Nightly test runs
- Weekly security scans
- Monthly report generation
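Jenkins cron syntax has five fields (MINUTE HOUR DAY-OF-MONTH MONTH DAY-OF-WEEK). The H token hashes the job name into a stable pseudo-random value, spreading load so many jobs don't all fire at the same instant. A few illustrative triggers:

```groovy
triggers {
    // Fields: MINUTE HOUR DOM MONTH DOW
    cron('H 2 * * *')           // daily, at some fixed minute within 2:00-2:59
    // cron('H H(0-5) * * 1-5') // weekdays, sometime between midnight and 6 AM
    // cron('H 3 * * 0')        // weekly, Sunday around 3 AM
    // pollSCM('H/15 * * * *')  // poll Git roughly every 15 minutes, a fallback
                                // when webhooks aren't possible
}
```

pollSCM only builds when it detects a change, while cron builds unconditionally.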
3. Automated Triggers (Webhooks)
Most common in modern CI/CD - automatically trigger builds when code changes are pushed to Git.
Option A: GitLab Plugin (Single Branch Pipelines)
1. Install GitLab Plugin:
- Manage Jenkins → Plugins → Available → Search “GitLab”
- Install and restart
2. Configure GitLab Connection:
- Manage Jenkins → System → GitLab
- Connection name: gitlab-connection
- GitLab host URL: https://gitlab.com
- Credentials: Add GitLab API token
  - In GitLab: Settings → Access Tokens
  - Create token with read_api scope
  - Add as “GitLab API token” in Jenkins
3. Configure Job:
- Job → Configure → Build Triggers
- Check “Build when a change is pushed to GitLab”
- Note the webhook URL shown
4. Configure GitLab Webhook:
- GitLab project → Settings → Webhooks
- URL: From Jenkins (e.g., http://jenkins-url/project/job-name)
- Secret token: From Jenkins configuration
- Trigger: Push events, Merge request events
- SSL verification: Disable if using HTTP
- Add webhook
Limitation: GitLab plugin works only with single-branch pipelines, not multibranch.
Option B: Multibranch Scan Webhook Trigger (Recommended)
Works with any Git provider (GitHub, GitLab, Bitbucket) and multibranch pipelines.
1. Install Plugin:
- Multibranch Scan Webhook Trigger
2. Configure Multibranch Pipeline:
- Job → Configure → Scan Multibranch Pipeline Triggers
- Enable “Scan by webhook”
- Trigger token: myapp-token (choose any secure token)
- Note the full webhook URL: http://jenkins-url/multibranch-webhook-trigger/invoke?token=myapp-token
3. Configure Git Webhook:
- Git repository → Settings → Webhooks
- Payload URL: The Jenkins webhook URL
- Content type: application/json
- Events: Push events, Pull request events
- Add webhook
Testing:
# Trigger manually
curl -X POST http://jenkins-url/multibranch-webhook-trigger/invoke?token=myapp-token
Now every push or merge request triggers Jenkins to scan branches and build automatically.
Automated Version Incrementing
Manually updating version numbers is error-prone. Let’s automate it.
Maven Version Increment
Maven has plugins to parse and update version numbers automatically:
pipeline {
    agent any
    tools {
        maven 'maven-3.9.11'
    }
    stages {
        stage('Increment Version') {
            steps {
                script {
                    echo "Incrementing artifact version..."
                    // Parse current version and increment patch version
                    sh '''
                        mvn build-helper:parse-version versions:set \
                            -DnewVersion=\${parsedVersion.majorVersion}.\${parsedVersion.minorVersion}.\${parsedVersion.nextIncrementalVersion} \
                            versions:commit
                    '''
                    // Extract new version from pom.xml
                    def matcher = readFile('pom.xml') =~ '<version>(.+)</version>'
                    def version = matcher[0][1]
                    env.IMAGE_NAME = "${version}-${BUILD_NUMBER}"
                    echo "New version: ${env.IMAGE_NAME}"
                }
            }
        }
        stage('Build Jar') {
            steps {
                script {
                    echo "Building application..."
                    sh "mvn clean package"
                }
            }
        }
        stage('Build Docker Image') {
            steps {
                script {
                    echo "Building Docker image..."
                    withCredentials([usernamePassword(
                        credentialsId: 'docker-hub',
                        passwordVariable: 'PASS',
                        usernameVariable: 'USER'
                    )]) {
                        sh "docker build -t username/java-app:${IMAGE_NAME} ."
                        sh 'echo $PASS | docker login -u $USER --password-stdin'
                        sh "docker push username/java-app:${IMAGE_NAME}"
                    }
                }
            }
        }
        stage('Deploy') {
            steps {
                script {
                    echo "Deploying version ${IMAGE_NAME}..."
                    // Deployment logic
                }
            }
        }
        stage('Commit Version Update') {
            steps {
                script {
                    withCredentials([usernamePassword(
                        credentialsId: 'gitlab-credentials',
                        passwordVariable: 'PASS',
                        usernameVariable: 'USER'
                    )]) {
                        // Configure Git user
                        sh 'git config --global user.email "jenkins@example.com"'
                        sh 'git config --global user.name "jenkins"'
                        // Update remote URL with credentials for push
                        sh "git remote set-url origin https://${USER}:${PASS}@gitlab.com/yourorg/yourrepo.git"
                        // Stage, commit, and push changes
                        sh 'git add .'
                        sh 'git commit -m "ci: version bump"'
                        sh 'git push origin HEAD:jenkins-jobs'
                    }
                }
            }
        }
    }
}
What’s happening here?
- mvn build-helper:parse-version: Parses the current version from pom.xml and stores its parts in variables like parsedVersion.majorVersion, parsedVersion.minorVersion, parsedVersion.incrementalVersion
- versions:set -DnewVersion=...: Sets a new version in pom.xml. The expression ${parsedVersion.nextIncrementalVersion} automatically increments the patch version (e.g., 1.2.3 → 1.2.4)
- versions:commit: Accepts the change by removing the pom.xml backup that versions:set created
- Version extraction: Uses a Groovy regex to read the updated version from pom.xml and stores it in an environment variable
- Git configuration: Sets up Git user identity for the commit
- Remote URL with credentials: Modifies the Git remote URL to include authentication, allowing Jenkins to push changes back
- Commit and push: Stages all changes, commits with a conventional commit message, and pushes to the repository
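One caveat: the regex grabs the first <version> tag it finds, which in some projects is the parent POM's version rather than the artifact's. If the Pipeline Utility Steps plugin is installed, readMavenPom is a more robust way to extract the version. A sketch:

```groovy
script {
    // readMavenPom comes from the Pipeline Utility Steps plugin
    def pom = readMavenPom file: 'pom.xml'
    env.IMAGE_NAME = "${pom.version}-${BUILD_NUMBER}"
    echo "New version: ${env.IMAGE_NAME}"
}
```

This parses the POM properly instead of pattern-matching raw XML.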
The Infinite Loop Problem
Problem: Jenkins commits a version bump → Git webhook triggers Jenkins → Jenkins commits another version bump → Infinite loop!
Solution: Ignore commits made by Jenkins user.
For GitLab: Ignore Committer Strategy
1. Install Plugin:
- Manage Jenkins → Plugins → “Ignore Committer Strategy”
2. Configure in Multibranch Pipeline:
- Job → Configure → Branch Sources → Git
- Add → Additional Behaviours → “Ignore Committer Strategy”
- Committer name: jenkins (or whatever name you configured)
Now Jenkins ignores commits it made itself, breaking the loop.
For GitHub: Skip CI Markers
Include [skip ci] or [ci skip] in commit messages:
sh 'git commit -m "ci: version bump [skip ci]"'
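If neither mechanism is available, the pipeline itself can inspect the last commit's author and skip the heavy stages. A sketch, assuming the Git user name jenkins configured earlier:

```groovy
stage('Check Committer') {
    steps {
        script {
            // %an prints the author name of the most recent commit
            def author = sh(script: 'git log -1 --pretty=%an', returnStdout: true).trim()
            env.SKIP_BUILD = (author == 'jenkins') ? 'true' : 'false'
        }
    }
}
stage('Build') {
    // Skip this stage when the triggering commit came from Jenkins itself
    when { expression { env.SKIP_BUILD != 'true' } }
    steps {
        sh 'mvn clean package'
    }
}
```

The when directive lets the build complete quickly as a no-op instead of failing, which keeps build history clean.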
npm Version Increment
For Node.js projects, npm has built-in version management:
stage('Increment Version') {
    steps {
        script {
            // Increment patch version
            sh 'npm version patch -m "ci: version bump to %s"'
            // Or for specific version types
            // sh 'npm version minor' // 1.2.3 → 1.3.0
            // sh 'npm version major' // 1.2.3 → 2.0.0

            // Read new version (readJSON comes from the Pipeline Utility Steps plugin)
            def packageJson = readJSON file: 'package.json'
            env.IMAGE_VERSION = packageJson.version
            echo "New version: ${env.IMAGE_VERSION}"
        }
    }
}
Dynamic Dockerfile Tags
If your Dockerfile has hardcoded tags, update it to use wildcards:
Before (hardcoded):
FROM amazoncorretto:8-alpine3.17-jre
EXPOSE 8080
COPY ./target/java-maven-app-1.0.0.jar /usr/app/
WORKDIR /usr/app
CMD java -jar java-maven-app-1.0.0.jar
After (dynamic):
FROM amazoncorretto:8-alpine3.17-jre
EXPOSE 8080
COPY ./target/java-maven-app-*.jar /usr/app/
WORKDIR /usr/app
CMD java -jar java-maven-app-*.jar
The wildcard * matches any version, so the Dockerfile works regardless of version number.
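Another option, if you prefer an explicit name over globbing in CMD, is to copy the jar to a fixed filename via a build argument. A sketch (ARG name JAR_FILE is illustrative, not from the original):

```dockerfile
FROM amazoncorretto:8-alpine3.17-jre
EXPOSE 8080
# Override at build time: docker build --build-arg JAR_FILE=target/java-maven-app-1.2.4.jar .
ARG JAR_FILE=target/java-maven-app-*.jar
# COPY resolves the glob; the fixed destination name makes CMD version-independent
COPY ${JAR_FILE} /usr/app/app.jar
WORKDIR /usr/app
CMD ["java", "-jar", "app.jar"]
```

The exec-form CMD avoids a shell wrapper entirely, which is only possible once the filename is fixed.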
Understanding Software Versioning
Proper versioning is crucial for tracking releases and managing dependencies.
Semantic Versioning (SemVer)
Format: MAJOR.MINOR.PATCH-SUFFIX
Example: 2.4.7-SNAPSHOT
MAJOR (2):
- Breaking changes
- NOT backward-compatible
- API changes that break existing functionality
- Example: Removing a public method, changing function signatures
MINOR (4):
- New features
- Backward-compatible changes
- Adding new API endpoints or methods
- Example: Adding optional parameters, new functionality
PATCH (7):
- Bug fixes
- No API changes
- Backward-compatible fixes
- Example: Fixing a calculation error, security patches
SUFFIX (-SNAPSHOT):
- SNAPSHOT: Unstable, under active development
- alpha: Early testing phase, unstable
- beta: Feature complete, testing phase
- rc (release candidate): Final testing before release
- No suffix: Stable release
Version Lifecycle Example
1.0.0-SNAPSHOT → Development
1.0.0-alpha → Early testing
1.0.0-beta → Feature complete testing
1.0.0-rc1 → Release candidate
1.0.0 → Stable release
1.0.1 → Bug fix
1.1.0 → New feature added
2.0.0 → Breaking changes introduced
Troubleshooting Common Issues
1. Jenkins Showing Reverse Proxy Error
Error:
It appears that your reverse proxy set up is broken.
Your configured root URL does not contain the contextPath ("").
Cause: The IP address of your server changed (common with cloud instances that stop/start).
Solution:
- Go to Manage Jenkins → System
- Find Jenkins URL
- Update to your current public IP, e.g. http://3.88.138.50:8080
- Save
Jenkins tries to verify the configured URL matches the actual access URL. When your IP changes, this check fails.
2. Docker Permission Denied
Error: permission denied while trying to connect to the Docker daemon socket
Solution:
docker exec -it -u 0 <container-id> bash
chmod 666 /var/run/docker.sock
This makes the Docker socket world-writable so the Jenkins user can talk to the Docker daemon. It is a quick fix rather than a production-grade one (adding the jenkins user to the docker group is safer), and you may need to reapply it after restarting containers.
3. Git Push Fails: Authentication Error
Error: Authentication failed when pushing version updates
Solutions:
Option 1: Use Personal Access Token instead of password:
- Create a Personal Access Token in GitLab/GitHub with write_repository scope
- Use token as password in credentials
- Update credential in Jenkins
Option 2: Use SSH keys:
- Generate SSH key pair
- Add public key to Git provider
- Add private key as SSH credential in Jenkins
- Update Git remote URL to SSH format:
git@gitlab.com:yourorg/yourrepo.git
4. Plugin Conflicts
Issue: Jenkins becomes unstable after plugin updates
Prevention:
- Update plugins one at a time
- Test in a non-production Jenkins instance first
- Back up Jenkins before major updates
Recovery:
# Access Jenkins container
docker exec -it <container-id> bash
# Navigate to plugins directory
cd /var/jenkins_home/plugins
# Remove problematic plugin
rm -rf problematic-plugin*
# Restart Jenkins
5. Workspace Cleanup Issues
Issue: Old build artifacts consuming disk space
Solution: Add workspace cleanup to post actions:
post {
    always {
        cleanWs() // Cleans workspace after build
    }
}
Or configure it per job (with the Workspace Cleanup plugin installed):
- Job → Configure → Build Environment
- Enable “Delete workspace before build starts”
Best Practices
1. Pipeline as Code
Always store Jenkinsfiles in your Git repository:
- ✅ Version controlled
- ✅ Code reviewed
- ✅ Auditable history
- ✅ Easy to restore previous versions
- ❌ Avoid writing pipelines directly in Jenkins UI
2. Naming Conventions
Jenkinsfile: Use the default name Jenkinsfile (capital J) at repository root. Jenkins automatically discovers it.
Stages: Use descriptive, action-oriented names:
stage('Build Application') { } // ✅ Good
stage('Test Suite Execution') { } // ✅ Good
stage('Stage 1') { } // ❌ Bad
3. Use Stages for Visualization
Structure pipelines with clear stages for better visibility:
pipeline {
agent any
stages {
stage('Checkout') { }
stage('Build') { }
stage('Test') { }
stage('Package') { }
stage('Deploy') { }
}
}
This creates a visual pipeline in Jenkins UI showing progress through each stage.
4. Groovy Shebang
Always add this at the top of Groovy files for proper syntax highlighting:
#!/usr/bin/env groovy
IDEs and Git diff viewers will correctly identify the language.
5. Input Parameters Outside Node Blocks
Bad:
node {
    input message: 'Deploy?' // ❌ Blocks executor while waiting
}
Good:
stage('Approval') {
    steps {
        timeout(time: 5, unit: 'MINUTES') { // ✅ Times out after 5 minutes
            input message: 'Deploy to production?'
        }
    }
}
User inputs should be outside node blocks to avoid tying up executors while waiting for human response.
6. Use Shared Libraries
Extract common logic to shared libraries:
- ✅ Reduces duplication
- ✅ Centralized maintenance
- ✅ Consistent practices across teams
- ✅ Easier to enforce standards
7. Credential Management
Do:
- ✅ Store all credentials in Jenkins Credential Manager
- ✅ Use descriptive IDs: docker-hub-prod, aws-dev-access
- ✅ Limit credential scope appropriately
- ✅ Rotate credentials regularly
- ✅ Use least-privilege principle
Don’t:
- ❌ Hardcode credentials in Jenkinsfiles
- ❌ Commit credentials to Git
- ❌ Use overly permissive credentials
- ❌ Share credentials between unrelated projects
8. Error Handling
Use try-catch blocks for critical sections:
stage('Deploy') {
    steps {
        script {
            try {
                sh 'kubectl apply -f deployment.yaml'
                echo 'Deployment successful'
            } catch (Exception e) {
                echo "Deployment failed: ${e.getMessage()}"
                // Rollback logic
                sh 'kubectl rollout undo deployment/myapp'
                throw e // Re-throw to fail the build
            }
        }
    }
}
9. Use Timeouts
Prevent builds from hanging indefinitely:
options {
    timeout(time: 1, unit: 'HOURS') // Entire pipeline timeout
}

stage('Long Running Test') {
    steps {
        timeout(time: 30, unit: 'MINUTES') { // Stage-specific timeout
            sh 'npm run test:integration'
        }
    }
}
10. Parallel Execution
Speed up pipelines by running independent stages in parallel:
stage('Tests') {
    parallel {
        stage('Unit Tests') {
            steps {
                sh 'npm run test:unit'
            }
        }
        stage('Integration Tests') {
            steps {
                sh 'npm run test:integration'
            }
        }
        stage('Linting') {
            steps {
                sh 'npm run lint'
            }
        }
    }
}
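Declarative parallel stages also support failFast, which cancels the remaining branches as soon as any branch fails, saving executor time:

```groovy
stage('Tests') {
    failFast true  // abort the other branches when one fails
    parallel {
        stage('Unit Tests') {
            steps { sh 'npm run test:unit' }
        }
        stage('Integration Tests') {
            steps { sh 'npm run test:integration' }
        }
    }
}
```

Use it when the branches are redundant checks of the same change; leave it off when you want the full picture of everything that failed.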
11. Build Notifications
Keep teams informed about build status:
post {
    success {
        slackSend(
            color: 'good',
            message: "Build Successful: ${env.JOB_NAME} #${env.BUILD_NUMBER}"
        )
    }
    failure {
        slackSend(
            color: 'danger',
            message: "Build Failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}\nCheck: ${env.BUILD_URL}"
        )
        // Or send email
        emailext(
            subject: "Build Failed: ${env.JOB_NAME}",
            body: "Build #${env.BUILD_NUMBER} failed. Check console output.",
            to: 'team@example.com'
        )
    }
}
12. Resource Management
Clean up after builds:
post {
    always {
        // Remove Docker images
        sh 'docker image prune -f'
        // Clean workspace
        cleanWs()
        // Archive artifacts if needed
        archiveArtifacts artifacts: 'target/*.jar', allowEmptyArchive: true
    }
}
13. Security Scanning
Integrate security scans into pipelines:
stage('Security Scan') {
    steps {
        // Docker image scanning
        sh 'trivy image myapp:${BUILD_NUMBER}'
        // Dependency scanning
        sh 'npm audit'
        // Code quality
        sh 'sonar-scanner'
    }
}
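As written, these commands only report findings. To make the build fail on serious issues, most scanners expose an exit-code threshold; a sketch using Trivy's flags:

```groovy
stage('Security Gate') {
    steps {
        // --exit-code 1 makes trivy return non-zero (failing this sh step)
        // when vulnerabilities at the listed severities are found
        sh "trivy image --exit-code 1 --severity HIGH,CRITICAL myapp:${BUILD_NUMBER}"
    }
}
```

A failing gate stops the pipeline before the image is pushed or deployed, which is the whole point of scanning in CI.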
Jenkins Architecture and Scaling
Master-Agent Architecture
As your CI/CD needs grow, a single Jenkins instance becomes a bottleneck. Jenkins supports distributed builds through master-agent architecture.
Jenkins Master (Controller):
- Schedules build jobs
- Dispatches builds to agents
- Monitors agents
- Records and presents build results
- Handles UI and API requests
Jenkins Agents (Nodes):
- Execute builds dispatched by master
- Can have different configurations (OS, tools, resources)
- Can be static (always running) or dynamic (launched on demand)
Benefits:
- Run builds on different operating systems
- Isolate builds from each other
- Scale horizontally
- Reduce load on master
Setting Up Agents
1. Static SSH Agent:
Manage Jenkins → Nodes → New Node
- Name: linux-agent-1
- Type: Permanent Agent
- Remote root directory: /home/jenkins
- Launch method: SSH
- Host: agent-server-ip
- Credentials: SSH key
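Once agents exist, pipelines target them by label instead of agent any. Assuming a linux label was assigned to the node above:

```groovy
pipeline {
    // Label expressions support && and ||, e.g. 'linux && docker'
    agent { label 'linux' }
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean package'
            }
        }
    }
}
```

Individual stages can also override the pipeline-level agent with their own agent block when only one step needs special tooling.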
2. Dynamic Docker Agent:
pipeline {
    agent {
        docker {
            image 'maven:3.9-eclipse-temurin-11'
            args '-v $HOME/.m2:/root/.m2' // cache the local Maven repo between builds
        }
    }
    stages {
        // Stages run inside Docker container
    }
}
3. Kubernetes Agent (cloud-native approach):
pipeline {
    agent {
        kubernetes {
            yaml """
apiVersion: v1
kind: Pod
spec:
  containers:
    - name: maven
      image: maven:3.9-eclipse-temurin-11
      command: ['cat']
      tty: true
    - name: docker
      image: docker:latest
      command: ['cat']
      tty: true
"""
        }
    }
    stages {
        stage('Build') {
            steps {
                container('maven') {
                    sh 'mvn clean package'
                }
            }
        }
        stage('Docker Build') {
            steps {
                container('docker') {
                    sh 'docker build -t myapp .'
                }
            }
        }
    }
}
Backup and Disaster Recovery
Jenkins stores everything in /var/jenkins_home. Regular backups are essential.
What to Backup
Critical:
- /var/jenkins_home/jobs/ - Job configurations and build history
- /var/jenkins_home/users/ - User accounts
- /var/jenkins_home/plugins/ - Installed plugins
- /var/jenkins_home/credentials.xml - Stored credentials
- /var/jenkins_home/config.xml - Jenkins configuration
Optional (can be rebuilt):
- /var/jenkins_home/workspace/ - Working directories
- Build artifacts (if not stored elsewhere)
Backup Strategy
1. Volume Backup (if using Docker):
# Stop Jenkins
docker stop jenkins
# Backup volume
docker run --rm \
-v jenkins_home:/data \
-v $(pwd):/backup \
alpine tar czf /backup/jenkins-backup-$(date +%Y%m%d).tar.gz /data
# Start Jenkins
docker start jenkins
2. Plugin-based Backup: Install “ThinBackup” plugin:
- Manage Jenkins → ThinBackup → Settings
- Configure backup directory
- Schedule automatic backups
- Retains only configuration, not build history
3. Configuration as Code: Use “Jenkins Configuration as Code” (JCasC) plugin:
# jenkins.yaml
jenkins:
  systemMessage: "Production Jenkins"
  numExecutors: 2
credentials:
  system:
    domainCredentials:
      - credentials:
          - usernamePassword:
              scope: GLOBAL
              id: docker-hub
              username: myuser
Apply with:
docker run -v $(pwd)/jenkins.yaml:/var/jenkins_home/casc_configs/jenkins.yaml \
-e CASC_JENKINS_CONFIG=/var/jenkins_home/casc_configs jenkins/jenkins:lts
Monitoring and Observability
Built-in Monitoring
Jenkins provides basic monitoring:
- Load Statistics: Manage Jenkins → Load Statistics
- System Log: Manage Jenkins → System Log
- Script Console: For Groovy scripts to inspect state
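The Script Console runs arbitrary Groovy against the live Jenkins instance (admin-only, since it has full access). For example, a quick inventory of all jobs and their last build results:

```groovy
// Paste into Manage Jenkins → Script Console
Jenkins.instance.getAllItems(hudson.model.Job).each { job ->
    def last = job.lastBuild
    println "${job.fullName}: ${last ? last.result : 'never built'}"
}
```

This kind of snippet is useful for one-off audits; anything you run regularly belongs in a pipeline or a monitoring exporter instead.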
Prometheus Integration
Export Jenkins metrics to Prometheus:
1. Install Prometheus Plugin
2. Configure Prometheus:
# prometheus.yml
scrape_configs:
  - job_name: 'jenkins'
    metrics_path: '/prometheus'
    static_configs:
      - targets: ['jenkins-server:8080']
3. Create Grafana Dashboard for visualizing:
- Build duration trends
- Success/failure rates
- Queue length
- Agent utilization
Blue Ocean
Install Blue Ocean plugin for modern, visual pipeline representation:
- Better pipeline visualization
- Easier debugging
- Built-in editor for Jenkinsfiles
- Access at: http://jenkins-url/blue
Summary: CI Part of CI/CD
This module covered Continuous Integration - the practice of automatically building and testing code changes. Key takeaways:
✅ Build automation eliminates manual, error-prone processes
✅ Jenkins provides flexible, extensible CI/CD capabilities
✅ Pipeline as Code enables version-controlled, testable pipelines
✅ Shared Libraries reduce duplication across projects
✅ Automated triggers respond to code changes in real-time
✅ Proper versioning tracks releases systematically
Next Steps: The Continuous Deployment (CD) module will cover:
- Deployment strategies (blue-green, canary, rolling)
- Infrastructure as Code
- Monitoring and rollback procedures
- Production-ready deployment pipelines
Quick Reference: Essential Jenkins Commands
# Docker setup
docker run -p 8080:8080 -p 50000:50000 -d \
-v jenkins_home:/var/jenkins_home \
-v /var/run/docker.sock:/var/run/docker.sock \
jenkins/jenkins:lts
# Access container as root
docker exec -it -u 0 <container-id> bash
# Fix Docker socket permissions
chmod 666 /var/run/docker.sock
# Backup Jenkins
docker run --rm -v jenkins_home:/data -v $(pwd):/backup alpine \
tar czf /backup/jenkins-backup.tar.gz /data
# Get initial admin password
docker exec <container-id> cat /var/jenkins_home/secrets/initialAdminPassword
Essential Jenkinsfile Template:
#!/usr/bin/env groovy
pipeline {
    agent any
    tools {
        maven 'maven-3.9.11'
    }
    environment {
        DOCKER_IMAGE = "myapp"
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean package'
            }
        }
        stage('Test') {
            steps {
                sh 'mvn test'
            }
        }
        stage('Docker Build & Push') {
            steps {
                script {
                    withCredentials([usernamePassword(
                        credentialsId: 'docker-hub',
                        usernameVariable: 'USER',
                        passwordVariable: 'PASS'
                    )]) {
                        // Single quotes: the shell expands USER/PASS,
                        // so Groovy never interpolates the secrets
                        sh '''
                            docker build -t $USER/$DOCKER_IMAGE:$BUILD_NUMBER .
                            echo $PASS | docker login -u $USER --password-stdin
                            docker push $USER/$DOCKER_IMAGE:$BUILD_NUMBER
                        '''
                    }
                }
            }
        }
    }
    post {
        always {
            cleanWs()
        }
        success {
            echo "Build successful!"
        }
        failure {
            echo "Build failed!"
        }
    }
}