How to build a performance testing pipeline



Many people ask themselves a simple question: should I build a performance testing pipeline, and what benefits will it bring to the team?

Of course, performance tests will not run like unit and integration tests, giving you a simple pass-or-fail build result, but it is handy to have a pipeline available so that every member of your team can run performance tests. Another benefit is scalability: in Jenkins you can scale your tests and run them in parallel across multiple Kubernetes pods.

Steps to build a performance testing pipeline:

1. Create a clean Maven project
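A clean project only needs a pom.xml and a directory for the test plans. A minimal layout sketch (the project name performance-tests is illustrative; it matches the directory used in the Jenkinsfile later on):

```shell
# Create the skeleton for the performance tests project
mkdir -p performance-tests/src/test/jmeter
# performance-tests/pom.xml          -> the POM from step 2
# performance-tests/src/test/jmeter  -> the .jmx test plans from step 3
ls performance-tests/src/test
```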

2. Update your pom file as per the example below

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>performance-tests</artifactId>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>11</maven.compiler.source>
        <maven.compiler.target>11</maven.compiler.target>
        <jmeter.plugin.version>3.1.2</jmeter.plugin.version>
        <sonar.skip>true</sonar.skip>
    </properties>

    <build>
        <plugins>
            <plugin>
                <groupId>com.lazerycode.jmeter</groupId>
                <artifactId>jmeter-maven-plugin</artifactId>
                <version>${jmeter.plugin.version}</version>
                <executions>
                    <!-- Generate JMeter configuration -->
                    <execution>
                        <id>configuration</id>
                        <goals>
                            <goal>configure</goal>
                        </goals>
                        <configuration>
                            <propertiesJMeter>
                                <log_level.jmeter>DEBUG</log_level.jmeter>
                            </propertiesJMeter>
                            <testResultsTimestamp>false</testResultsTimestamp>
                            <propertiesUser>
                                <threadCount>${threadCount}</threadCount>
                                <rampupTime>${rampupTime}</rampupTime>
                                <duration>${duration}</duration>
                                <server>${server}</server>
                                <tps>${tps}</tps>
                                <currency>${currency}</currency>
                                <jmeter.save.saveservice.output_format>csv</jmeter.save.saveservice.output_format>
                                <jmeter.save.saveservice.print_field_names>true
                                </jmeter.save.saveservice.print_field_names>
                                <jmeter.save.saveservice.successful>true</jmeter.save.saveservice.successful>
                                <jmeter.save.saveservice.label>true</jmeter.save.saveservice.label>
                                <jmeter.save.saveservice.time>true</jmeter.save.saveservice.time>
                                <jmeter.save.saveservice.bytes>true</jmeter.save.saveservice.bytes>
                                <jmeter.save.saveservice.latency>true</jmeter.save.saveservice.latency>
                                <jmeter.save.saveservice.response_code>true</jmeter.save.saveservice.response_code>
                                <jmeter.save.saveservice.response_message>true
                                </jmeter.save.saveservice.response_message>
                                <jmeter.save.saveservice.autoflush>true</jmeter.save.saveservice.autoflush>
                                <jmeter.save.saveservice.thread_counts>true</jmeter.save.saveservice.thread_counts>
                                <jmeter.save.saveservice.thread_name>true</jmeter.save.saveservice.thread_name>
                                <jmeter.save.saveservice.connect_time>true</jmeter.save.saveservice.connect_time>
                                <jmeter.save.saveservice.assertion_results_failure_message>true
                                </jmeter.save.saveservice.assertion_results_failure_message>
                                <jmeter.reportgenerator.report_title>performance tests report
                                </jmeter.reportgenerator.report_title>
                                <jmeter.reportgenerator.overall_granularity>60000
                                </jmeter.reportgenerator.overall_granularity>
                            </propertiesUser>
                        </configuration>
                    </execution>
                    <!-- Run JMeter tests -->
                    <execution>

                        <id>jmeter-tests</id>
                        <goals>
                            <goal>jmeter</goal>
                        </goals>
                    </execution>
                    <!-- Check test results (set ignoreResultFailures to false below to fail the build on errors) -->
                    <execution>
                        <id>jmeter-check-results</id>
                        <goals>
                            <goal>results</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <ignoreResultFailures>true</ignoreResultFailures>
                    <generateReports>true</generateReports>
                    <jmeterExtensions>
                        <artifact>kg.apc:jmeter-plugins:pom:1.3.1</artifact>
                        <artifact>kg.apc:jmeter-plugins-standard:1.4.0</artifact>
                        <artifact>kg.apc:jmeter-plugins-manager:1.6</artifact>
                        <artifact>kg.apc:jmeter-plugins-graphs-additional:2.0</artifact>
                        <artifact>kg.apc:jmeter-plugins-graphs-composite:2.0</artifact>
                        <artifact>kg.apc:jmeter-plugins-cmn-jmeter:0.6</artifact>
                        <artifact>kg.apc:jmeter-plugins-graphs-vs:2.0</artifact>
                        <artifact>kg.apc:jmeter-plugins-graphs-basic:2.0</artifact>
                        <artifact>kg.apc:jmeter-plugins-graphs-ggl:2.0</artifact>
                    </jmeterExtensions>
                    <downloadExtensionDependencies>false</downloadExtensionDependencies>
                </configuration>
                <dependencies>
                    <dependency>
                        <groupId>kg.apc</groupId>
                        <artifactId>jmeter-plugins-graphs-additional</artifactId>
                        <version>2.0</version>
                    </dependency>
                </dependencies>
            </plugin>
        </plugins>
    </build>
</project>

3. Under src/test/jmeter, add your JMeter test plan (the *.jmx file)
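The values passed through propertiesUser (threadCount, rampupTime, duration, and so on) only take effect if the test plan reads them with JMeter's `__P` property function. A sketch of how a Thread Group in the .jmx file might reference them (the defaults after the comma are assumptions):

```xml
<ThreadGroup guiclass="ThreadGroupGui" testclass="ThreadGroup" testname="Thread Group">
    <stringProp name="ThreadGroup.num_threads">${__P(threadCount,20)}</stringProp>
    <stringProp name="ThreadGroup.ramp_time">${__P(rampupTime,60)}</stringProp>
    <boolProp name="ThreadGroup.scheduler">true</boolProp>
    <stringProp name="ThreadGroup.duration">${__P(duration,180)}</stringProp>
</ThreadGroup>
```

With this in place you can also run the tests outside Jenkins with `mvn clean verify -DthreadCount=20 -DrampupTime=60 -Dserver=example.com -Dduration=180 -Dtps=6000`.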


4. Create your Jenkinsfile, e.g. jenkins/Jenkinsfile

pipeline {
    parameters {
        string(name: 'threadCount', defaultValue: '20', description: 'The number of running threads')
        string(name: 'rampupTime', defaultValue: '60', description: 'How long it takes to ramp up to the full number of threads (in seconds)')
        string(name: 'duration', defaultValue: '180', description: "The duration of each thread group (in seconds)")
        string(name: 'tps', defaultValue: '6000', description: "Transactions per second, expressed in samples per minute (e.g. 10 tps = 10*60 = 600)")
        string(name: 'server', defaultValue: 'example.com', description: 'The url of the service on which performance tests will run')
       
    }

    options {
        buildDiscarder logRotator(artifactDaysToKeepStr: '30', artifactNumToKeepStr: '5', daysToKeepStr: '30', numToKeepStr: '5')
        disableConcurrentBuilds()
    }

    agent {
        kubernetes {
            label 'some-jdk-agent-label-name'
            defaultContainer 'maven-jdk-11'
            yaml libraryResource('agents/k8s/youragentyml.yaml')
        }
    }

    stages {
        stage('Setup') {
            steps {
                script {
                   // your git code
                }
            }
        }

        stage('Warming up...') {
            steps {
                catchError(buildResult: 'SUCCESS', stageResult: 'UNSTABLE') {
                    script {
                        echo 'Changing Directory to performance-tests'
                        dir('performance-tests') {
                            sh "mvn clean verify -DthreadCount=20 -DrampupTime=120 -Dserver=${params.server} -Dduration=120 -Dtps=6000"
                        }
                    }
                }
            }
        }

        stage('Run Performance tests') {
            steps {
                catchError(buildResult: 'SUCCESS', stageResult: 'UNSTABLE') {
                    script {
                        echo 'performance-tests'
                        dir('performance-tests') {
                            echo "Number of tps is : ${params.tps}"
                            sh "mvn clean verify -DthreadCount=${params.threadCount} -DrampupTime=${params.rampupTime} -Dserver=${params.server} -Dduration=${params.duration} -Dtps=${params.tps}"
                            echo 'Tests finished running.'
                            perfReport 'target/jmeter/results/*.csv'
                        }
                    }
                }
            }
        }
    }
    post {
        success {
            // publish html
            publishHTML target: [
                    allowMissing         : false,
                    alwaysLinkToLastBuild: false,
                    keepAll              : true,
                    reportDir            : 'performance-tests/target/jmeter/reports/TestPlan',
                    reportFiles          : 'index.html',
                    reportName: 'Performance Report'
            ]
            archiveArtifacts artifacts: 'performance-tests/target/jmeter/results/*.csv', fingerprint: true
        }
    }
}
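The agent block above loads a pod template via libraryResource('agents/k8s/youragentyml.yaml'). A minimal sketch of what that resource might contain, assuming a stock Maven image (the image and resource values are assumptions; the container name must match the defaultContainer above):

```yaml
apiVersion: v1
kind: Pod
spec:
  containers:
    - name: maven-jdk-11          # must match defaultContainer in the Jenkinsfile
      image: maven:3.8-openjdk-11 # assumed image; use your own registry's build image
      command: ["sleep"]
      args: ["infinity"]
      resources:
        requests:
          memory: "2Gi"
          cpu: "1"
```

Giving the pod enough CPU and memory matters here, because an under-resourced load generator will skew the latency numbers it reports.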

That is about it: your tests will generate the reports and archive them, and you will be able to view them in Jenkins or locally at target/jmeter/reports/TestPlan/index.html.

Please do reach out if this helps or if you have any questions. Until next time, enjoy.

Happy testing!

