Getting started with CyBench Comparator

Introduction to Comparator (Continuous Performance Regression Testing)

The Comparator helps developers with continuous performance regression testing: run and compare code performance benchmarks across tests, builds, and releases. The Comparator supports passing or failing test builds based on a variety of options you choose, making it suitable for (and intended for) CI/CD pipelines. These options include: whether the overall score's delta falls outside a given range, whether the score strays too many standard deviations from the mean, comparison against the average of previous scores, etc. All of these checks are run against your most recent benchmark score. Benchmark scores are categorized a few ways, including benchmark mode (throughput or single-shot) and the version of your build, which ensures all scores, comparisons, and calculations are matched up accurately. While the Comparator is optional, it's an extremely useful and lightweight tool for any developer interested in benchmarking their work. The Comparator can be attached to any Maven or Gradle project that contains CyBench benchmarks, or can be run standalone.

Prerequisites

  • A Maven/Gradle project
    • CyBench/JMH benchmarks within your project
      • If your project has JUnit tests and you'd like to benchmark and compare those, refer to the Test2Benchmark (T2B) Agent.
      • CyBench benchmarks can also be written manually, or generated within Eclipse or IntelliJ; see the specific pages in this wiki for your project's build type (Java, Gradle, Maven)
  • Latest build of necessary dependencies
    • CyBench Comparator: com.gocypher.cybench.client:gocypher-cybench-comparator:1.3.5
    • CyBench Runner: com.gocypher.cybench.client:gocypher-cybench-runner:1.3.5
    • Both of these dependencies can be found on the Maven snapshot repository; build instructions are also included below
  • (Recommended) GitHub account
    • The CyBench website uses GitHub for authentication, and allows you to save private reports to the CyBench database. You're also granted a private workspace where your reports will be uploaded to, with the option of making reports public. Essentially, your GitHub account also becomes your CyBench account.
    • NOTE: You have the ability to upload reports without signing in, but these reports will only post to the public workspace. It is highly recommended to sign in with GitHub to access your (or someone else's) workspace. Reports can also be stored and compared locally.

Preparation

To use the Comparator correctly, a few steps must be taken, described below. Building the dependencies manually is optional; it's useful if you want nightly builds or are unable to access the Maven repositories. Modifying your project's build instructions (pom.xml or build.gradle) is required to use the CyBench Comparator.

Building CyBench Comparator

After locally cloning this repository, navigate to ~/gocypher-cybench-java/gocypher-cybench-client/gocypher-cybench-comparator/, where you will see the Comparator's pom.xml. From this directory, run the following Maven command to install the Comparator library into your local Maven repository. NOTE: While the Comparator is a Maven project itself, the local Maven repository is accessible by both Maven AND Gradle projects, i.e. your Gradle project will be able to resolve the CyBench Comparator dependency from it.

  • Comparator:
    •  mvn clean install
    • This installs the Comparator into your local Maven repository
  • Comparator (Runnable .jar):
    •  mvn clean package assembly:single
    • After a successful build, gocypher-cybench-comparator-1.3.5.jar will be produced in the ~/target/ folder
  • Other CyBench dependencies (Runner, Core, etc.)
    • These dependencies should be available remotely on the Maven snapshot repository, but if they aren't, or you wish to build them yourself, refer to each subproject's README.md for build instructions. The above command should work for them all, but some may have specific instructions.

Modifying your build file

Whether you're using Gradle or Maven to build your project, you will need to add some additional resources to your build file (pom.xml or build.gradle). These include repositories, configurations, dependencies, etc. Attaching the Comparator to its own task/profile is recommended and demonstrated below. Most of these groups (repositories, dependencies, etc.) should already exist in your build file, so you can simply add in the Comparator components.

Gradle Project

An example/template build.gradle is included below. The Sonatype snapshot and Maven Central repositories are added for the additional necessary CyBench libraries. If you're using libraries you built yourself, you must also add the mavenLocal() repository. A dedicated configuration, cybComparator, is added. For dependencies, at minimum you will need the CyBench Runner and CyBench Comparator.

  • Example build.gradle (Groovy)
repositories {
    mavenLocal()
    mavenCentral()
    maven {
        url 'https://s01.oss.sonatype.org/content/repositories/snapshots'
    }
}
// ...
configurations {
    cybench
    cybComparator
}
// ...
dependencies {
    // ...
    cybench 'com.gocypher.cybench.client:gocypher-cybench-runner:1.3.5'
    cybComparator 'com.gocypher.cybench.client:gocypher-cybench-comparator:1.3.5'
}
// ...
task compareBenchmarks(type: JavaExec) {
    group = 'CyBench-Client'
    description = 'Compare Benchmarks'
    classpath = files(
        project.sourceSets.main.runtimeClasspath,
        project.sourceSets.test.runtimeClasspath,
        configurations.cybComparator
    )
    main = 'com.gocypher.cybench.CompareBenchmarks'
    args = [
        '-C config/comparator.yaml'
    ]
}
  • Example build.gradle (Kotlin)
    • Aside from syntax differences between the two languages, this is very similar to the Groovy build
import java.util.Properties
// ...
repositories {
    mavenLocal()
    mavenCentral()
    maven {
        setUrl("https://s01.oss.sonatype.org/content/repositories/snapshots")
    }
}
// ...
val cybench by configurations.creating {
    isCanBeResolved = true
    isCanBeConsumed = false
}
val cybComparator by configurations.creating {
    isCanBeResolved = true
    isCanBeConsumed = false
}

// ...
dependencies {
    // ...
    cybench("com.gocypher.cybench.client:gocypher-cybench-runner:1.3.5")
    cybComparator("com.gocypher.cybench.client:gocypher-cybench-comparator:1.3.5")
}
// ...
val launcher = javaToolchains.launcherFor {
    languageVersion.set(JavaLanguageVersion.of(11))
}

tasks {
    val compareBenchmarks by registering(JavaExec::class) {
        group = "cybench-client"
        description = "Compare Benchmarks"
        javaLauncher.set(launcher)

        classpath(
            sourceSets["main"].runtimeClasspath,
            sourceSets["test"].runtimeClasspath,
            configurations.getByName("cybComparator")
        )

        mainClass.set("com.gocypher.cybench.CompareBenchmarks")
        args("-C config/comparator.yaml")
    }
}

Maven Project

The Comparator can also be added to your Maven project by modifying your pom.xml. Under <repositories>, you must add the Maven snapshot repository; this is used to grab the latest CyBench dependencies. Under <dependencies>, you must add the CyBench Comparator. On their own, the additional repository and dependency look like this:

  • Maven additions
<repositories>
	<repository>
		<id>oss.sonatype.org</id>
		<url>https://s01.oss.sonatype.org/content/repositories/snapshots</url>
		<releases>
			<enabled>false</enabled>
		</releases>
		<snapshots>
			<enabled>true</enabled>
		</snapshots>
	</repository>
</repositories>

<!-- ... -->

<dependency>
	<groupId>com.gocypher.cybench.client</groupId>
	<artifactId>gocypher-cybench-comparator</artifactId>
	<version>1.3.5</version>
	<scope>test</scope>
</dependency>

Similar to Gradle's task, you will need to add a <profile> to use the Comparator in your Maven project. An example/template profile is given below. There are a couple of configurable options, denoted with comments marked like this: <!-- ###. The first block of configurable properties has two important pieces: which Java version you'd like to use, and the location of comparator.yaml. The comparator.yaml file is vital to getting the most out of your comparisons, letting you change comparison methods, thresholds, the range of tests, etc. More information on this important properties file is detailed later on this page.

  • Full profile for Comparator
<project>
	<profiles>
		<profile>
			<id>compareBenchmarks</id>
			<!-- @@@ Maven central snapshots repository to get dependency artifacts snapshot releases @@@ -->
			<repositories>
				<repository>
					<id>oss.sonatype.org</id>
					<url>https://s01.oss.sonatype.org/content/repositories/snapshots</url>
					<releases>
						<enabled>false</enabled>
					</releases>
					<snapshots>
						<enabled>true</enabled>
					</snapshots>
				</repository>
			</repositories>
			<dependencies>
				<!-- @@@ Cybench Comparator app dependency @@@ -->
				<dependency>
					<groupId>com.gocypher.cybench.client</groupId>
					<artifactId>gocypher-cybench-comparator</artifactId>
					<version>1.3.5</version>
					<scope>test</scope>
				</dependency>
			</dependencies>
			<properties>
				<!-- ### Java executable to use ### -->
				<comp.java.home>${java.home}</comp.java.home>
				<comp.java.exec>"${comp.java.home}/bin/java"</comp.java.exec>
				<comp.class>com.gocypher.cybench.CompareBenchmarks</comp.class>
				<comp.class.args>-C config/comparator.yaml</comp.class.args>
			</properties>
			<build>
				<plugins>
					<!-- @@@ Make classpath entries as properties to ease access @@@ -->
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-dependency-plugin</artifactId>
						<version>3.1.2</version>
						<executions>
							<execution>
								<id>get-classpath-filenames</id>
								<goals>
									<goal>properties</goal>
								</goals>
							</execution>
							<execution>
								<phase>generate-sources</phase>
								<goals>
									<goal>build-classpath</goal>
								</goals>
								<configuration>
									<outputProperty>comp.compile.classpath</outputProperty>
									<pathSeparator>${path.separator}</pathSeparator>
								</configuration>
							</execution>
						</executions>
					</plugin>
					<plugin>
						<groupId>org.codehaus.mojo</groupId>
						<artifactId>exec-maven-plugin</artifactId>
						<version>3.0.0</version>
						<executions>
							<!-- @@@ Compare benchmarks @@@ -->
							<execution>
								<id>compare-benchmarks</id>
								<goals>
									<goal>exec</goal>
								</goals>
								<!-- ### Maven phase when to compare benchmarks ### -->
								<phase>integration-test</phase>
								<configuration>
									<executable>${comp.java.exec}</executable>
									<classpathScope>test</classpathScope>
									<commandlineArgs>-cp ${comp.compile.classpath} ${comp.class} ${comp.class.args}</commandlineArgs>
								</configuration>
							</execution>
						</executions>
				</plugin>
			</plugins>
		</build>
		</profile>
	</profiles>
</project>

Running the Comparator

Once the Comparator is added to your build instructions, it is easily executed in both Maven and Gradle projects.

Running the Comparator (Gradle)

Assuming you used the build.gradle given in this guide, your Comparator task is named compareBenchmarks. You can call this task explicitly during your build, for example:

gradle :compareBenchmarks

NOTE: If your project uses a Gradle wrapper, use gradlew.bat instead of gradle

NOTE: In the above example, the command would be run from the root of your project.
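
For example, from the root of a project that uses the wrapper, the equivalent invocations would be (assuming the standard wrapper scripts are checked in):

gradlew.bat :compareBenchmarks

./gradlew :compareBenchmarks

(Windows and Linux/macOS respectively.)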

Running the Comparator (Maven)

If you've been following this guide, your pom.xml has an added profile for the Comparator app, using the <id> of compareBenchmarks. Make sure the configuration file is set correctly within the <properties> section of your <profile>.

<...>
<comp.java.home>${java.home}</comp.java.home>
<comp.java.exec>"${comp.java.home}/bin/java"</comp.java.exec>
<comp.class>com.gocypher.cybench.CompareBenchmarks</comp.class>
<comp.class.args>-C config/comparator.yaml</comp.class.args> <!-- Make sure your configuration file is set correctly! -->

After making sure your comparator.yaml is set correctly, you can run your Maven build with the compareBenchmarks profile:

mvn clean verify -f pom.xml -P compareBenchmarks

Running the Comparator (Java .jar)

Refer to command line args for more specifics on the types of arguments you can pass to the Comparator.
Note: most arguments are only passed if you are using custom JavaScript scripts for comparison configuration. Otherwise, most arguments are supplied within your YAML configuration file (with the exception of the -C flag that points to the file itself). An example is below. See comparator configuration for more information.

java -jar gocypher-cybench-comparator-1.3.5.jar -C config/comparator.yaml

Adding Comparator to Jenkins

The Comparator has the ability to fail a Jenkins build in the case of comparison failures. Just add a Jenkins stage with a Comparator run command appropriate for your operating system.
For different run configurations, refer to running the Comparator.

Windows

stage('Compare Benchmarks') {
    steps {
        bat 'gradle :compareBenchmarks'
    }
}

Linux

stage('Compare Benchmarks') {
    steps {
        sh 'gradle :compareBenchmarks'
    }
}
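
The stages above assume a Gradle build; for a Maven project, a sketch of the equivalent stage would invoke the profile added earlier in this guide:

stage('Compare Benchmarks') {
    steps {
        sh 'mvn clean verify -P compareBenchmarks'
    }
}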

Configuring the Comparator

To tailor the Comparator to your needs, you are encouraged to tweak and modify the configuration. Without setting one up, all your comparisons will be run with default values. It's important to understand what these configurations mean (how to change them, what you can change them to, etc.) to get the most out of the Comparator, and thus your project as a whole. As of now, there are two ways to configure the Comparator:

  • Creating/editing the template comparator.yaml (found here)
    • Valid configurations can also be generated via prompts, using the CyBench Config Generator script (found here)
  • Feeding the Comparator JavaScript code
    • Sample scripts and a template can be found here

Deciding Configuration Method

The main difference between the two configuration approaches boils down to one decision: how in-depth do you want your comparisons to be? Using the comparator.yaml file can be described as 'high-level': all you need to do is plug in predefined values that suit your needs. With this approach, testing multiple packages (with different values per package) is easy and intuitive. This method of configuration is great for getting started with the Comparator, as it's very simple to adjust values and sample what the Comparator is capable of.

The other option is scripting. The Comparator is capable of taking in and executing JavaScript files. Various Comparator methods have been exposed and bound, allowing you to call them from your script. This gives you much more control over the Comparator and lets you build customized comparisons using the exposed methods as building blocks. You can consider the scripting option 'low-level': while not as simple as configuring via comparator.yaml, it opens up opportunities for creativity and depth.

Configuring via comparator.yaml

Configuring comparator.yaml is crucial to get the most out of CyBench Comparator. Adjusting the file itself is simple and intuitive. The Comparator's README contains a lot of information on configuring comparator.yaml, and has been included below.

Configuration Variables

  • Configuration file: comparator.yaml
  • Default location: ~/config
  • The first configuration is a boolean where you can decide if you want the build to fail (CI/CD pipeline failure) if there are benchmark comparisons that failed
    • failBuild: can be set to true or false, where true would mean just one failed benchmark comparison will fail the build in your pipeline. false will allow the build to pass, even if benchmark comparison tests fail.
  • The next two configurations are vital to fetching the previous benchmark scores correctly
    • reportPath: The filepath of the CyBench reports folder for your repository OR filepath to a specific report you wish to run the Comparator on
      • If running the Comparator from the root of your project, reportPath: "reports/" shall suffice as reports are by default generated into ~/reports
    • token: Set this to your CyBench access token, specifically a 'query' one. You can generate an access token for your private workspace on the CyBench website. More details and a guide are provided here.
  • The remaining branches of comparator.yaml configure values exclusive to a certain package, i.e. you can test for the change in delta against the last benchmark in one package, while testing the change in average against ALL previous benchmarks in another. Multiple branches are defined as compare.X, where X is any arbitrary name you choose. While compare.default values should be changed to your liking, the name compare.default itself should NOT be changed.
  • An example has been given below of setting up different packages
compare.default:
    method: "DELTA"
    scope: "BETWEEN"
    threshold: "GREATER"
    percentChangeAllowed: "1"
    compareVersion: "1.0.1"
compare.A:
    method: "SD"
    package: "calctest.ClockTest"
    scope: "WITHIN"
    threshold: "PERCENT_CHANGE"
    percentChangeAllowed: "15"

In the above example, the package calctest.ClockTest and all its benchmarks will test for a percent change no worse than -15%, comparing against all previous benchmarks for this package within the tested version. Other tested packages fall back to compare.default, since they are not explicitly matched by a package: value. This means all other packages in your Comparator run will test the change in score between your most recent score and the previous score. In this case, threshold: is set to "GREATER", which means the most recent score must be greater than the previous one in order to pass. Unlike compare.A, compare.default checks scores from a different version (in this example, it compares scores between the current version and version 1.0.1).

  • Inside these compare.X branches are various configurations you can set, detailed below.

Comparator Methods

  • The first configuration you should decide is the comparison method
  • Comparison method is defined by method:
  • The possible values for method: are listed below
    • DELTA = Tests for the change from the previous score(s) to the most recent one
    • SD = Tests the standard deviation of the new score against the average of the previous X scores
  • NOTE: When comparing via standard deviation (SD), you must specify an additional key, deviationsAllowed:
    • deviationsAllowed: is the number of deviations you will allow your score to be away from the mean of the previous X values, where X is defined via range: "X" (a sketch follows this list)
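
For instance, an SD branch might look like the hedged sketch below (the branch name, package name, and values are illustrative, using only the keys documented in this section):

compare.mySdCheck:
    method: "SD"
    package: "com.my.package"
    scope: "WITHIN"
    range: "10"
    deviationsAllowed: "2"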

Package to Compare

  • The next configuration to decide is which package should be tested
    • Setting this value is crucial to taking advantage of multiple compare.X branches
  • Only used when defining multiple compare.X branches; it is not defined in compare.default
  • Package is defined with package:
  • Must be set to the full package name, e.g. package: "com.calcTest"

Comparison Scope

  • This configuration is used for testing either within or between versions
  • Scope is defined by scope:
  • Possible values are "WITHIN" or "BETWEEN"
    • "WITHIN" = Compare scores within the same version of your project
    • "BETWEEN" = Compare scores between different versions of your project
  • NOTE: When using the "BETWEEN" scope, you must also set compareVersion: to whichever version you wish to compare against

Comparison Range

  • Allows you to specify the number of previous scores you wish to compare against
  • Defined with range:
  • Accepted values are either ALL or any number
    • e.g. range: "ALL" or range: "1"
    • In the latter example, the Comparator would compare your newest score only to the most recent previous score
    • If you set range: "5", your newest score would be compared against the last 5 previous scores

Comparison Threshold

  • This configuration decides what values dictate whether your build/test passes or fails
  • Threshold is defined by threshold:
  • Possible values are either "GREATER" or "PERCENT_CHANGE"
    • "GREATER" = Passes/fails depending on whether your current score was higher than the score being compared against
    • "PERCENT_CHANGE" = More flexible; allows the build/test to pass even if the score was lower, as long as it is within a given percentage
  • NOTE: When using "PERCENT_CHANGE", make sure to define percentChangeAllowed: "X", where X is the percent change allowed (a drop of up to X% will still pass; a sketch combining these options follows this list)
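
Putting scope, range, and threshold together, a hedged sketch of a percent-change branch (again, the branch name, package name, and values are illustrative):

compare.myPercentCheck:
    method: "DELTA"
    package: "com.my.package"
    scope: "BETWEEN"
    compareVersion: "1.0.1"
    range: "5"
    threshold: "PERCENT_CHANGE"
    percentChangeAllowed: "10"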

Example comparator.yaml

A template comparator.yaml can be taken from this repository, and can/should be used for your own tests. If you've added the CyBench Comparator to your project via this README or the CyBench Wiki, the Comparator will look for comparator.yaml in a folder called config/ at the root of your project. All CyBench components that use a properties or configuration file look for those files inside this same folder. The template comparator.yaml also includes comments at the top to help you adjust values on the fly. Once you've set your configuration, you're ready to run the Comparator, detailed in the next section. Below is an example of a more fleshed-out comparator.yaml:

### Property File for Cybench Comparator Tool ###

# failBuild = whether or not you want the build to fail (CI/CD pipeline failure) if there are benchmark comparison failures
## Options {true, false} ##

# reportPath = 
    # location of CyBench reports folder (automatically takes the most recent report) 
    # OR the location of the specific report you want to run Comparator on

# token = CyBench Access Token with View Permissions
### Token can be found on the CyBench site by clicking on your username ###
### Token should be a query token with access to the specific workspace that stores your benchmarks ###

# compare.default = default comparison configurations

#compare.{} = specific comparison configurations 
### {} can be any identifier of your choice ###
### Make sure to include {package} to specify which package you want these comparisons to run on ###


### Comparison Configurations ###

# scope = (within or between project versions)
## Options {within, between} ##
### {within} will compare all benchmarks within the benchmarked version ###
### if {between} is chosen, must specify {compareVersion} (will compare benchmarked version to the specified version) ###
### add {compareVersion} to specify which version to compare to ###

# range = {amount of values to compare against}
## Options {all, (#)} - can specify the word "all" to compare against all values or any number X to compare against previous X recorded scores ##
### to compare against just the previous score within your version or the most recent score in another version, specify range '1' ###
### otherwise the new score will be compared against the mean of the previous X values ###

# method = how benchmarks will be compared
## Options {delta, SD} ##
### if {SD} is chosen, must specify {deviationsAllowed} ###
### {deviationsAllowed} = the amount of deviations you will allow your score to be away from the mean of the previous X values (X specified as {range}) ###
### if {delta} is chosen, must specify {threshold} ###
# {threshold} = how to specify a passing benchmark 
## Options {percent_change, greater} ##
### {greater} will check to see if new score is simply greater than the compared to score ###
### if {percent_change} is chosen, must specify {percentChangeAllowed} ###
### {percentChangeAllowed} = percentage score should be within in order to pass ###
### e.g. 5% means the new score should be within 5% of the score(s) it is compared against ###

failBuild: true
reportPath: "C:/Users/MUSR/eclipse-workspace/myMavenProject/reports/" #<---- Location of '/reports' folder that contains generated CyBench reports
token: "ws_874a4eb4-fzsa-48fb-pr58-g8lwa7820e132_query" #<---- Your CyBench Private Workspace Query Token (dummy token)

compare.default:
    method: "DELTA"
    package: "calctest.CalculatorTest"
    scope: "BETWEEN"
    threshold: "GREATER"
    range: "1"
    compareVersion: "1.0.1"
compare.A:
    method: "SD"
    package: "calctest.ClockTest"
    scope: "WITHIN"
    deviationsAllowed: "1"
    
#compare.B:
  #  package: "com.my.other.package"
  #  threshold: "PERCENT_CHANGE"
  #  percentChangeAllowed: "6"

Configuring via scripting

Going the scripting route can be a bit daunting at first, but proves rewarding once you understand the syntax and what the Comparator expects. By creating your own scripts (or using our pre-built ones), you execute Comparator methods that we have exposed directly, giving you much more control over what the Comparator tests for. When using scripts, you can fetch benchmark scores/data directly and access both the comparison methods and the assertion methods. The Comparator's README has a plethora of information available, and can be viewed here. Much of the information below has been taken or paraphrased from the Comparator README.

Certain variables are required for the Comparator to work. When you use a custom script, these variables are already set for you. Benchmarks are fetched in the background as soon as configuration arguments are passed to the main class. Refer to the exposed methods section to view the accessible methods. Below is a list of variables accessible in scripts:

  • myBenchmarks - a Java ArrayList of all the benchmarks from your report. myBenchmarks is an ArrayList<ComparedBenchmark>. ComparedBenchmark is a custom object that contains information about benchmark scores, modes, etc. Once a comparison is run on a benchmark, comparison statistics are stored in the model (delta, standard deviation, percent change...). You can look at the model here: ComparedBenchmark
  • project - the name of the project you ran your report on
  • currentVersion - the project version of the report you are comparing with
  • latestVersion - the latest version of your project (given you have run a report on it)
  • previousVersion - the previous version of your project (given you have run a report on it)

The configuration arguments you pass via command line or build instructions (see: Script Configuration Args) are also accessible:

  • method - the comparison method to be used
  • scope - comparing between or within versions
  • range - the amount of scores to compare to
  • threshold - specifies what constitutes a pass/fail in your test
  • percentChangeAllowed - used with threshold percent_change, dictates how much percent change is allowed to pass/fail the test
  • deviationsAllowed - used with SD method of comparison, amount of deviations allowed from the mean to pass/fail the test
  • compareVersion - used when scope is BETWEEN, the version to compare to

Example Script

// EXAMPLE ARGS PASSED VIA COMMAND LINE
// -F -S scripts/myScript.js -T ws_0a1evpqm-scv3-g43c-h3x2-f0pqm79f2d39_query -R reports/ -s WITHIN -r ALL -m DELTA -t GREATER 

// loop through my benchmarks
forEach.call(myBenchmarks, function (benchmark) {
    // var benchmark represents a ComparedBenchmark model

    // returns a list of benchmarks previously tested (with the same fingerprint and mode)
    benchmarksToCompareAgainst = getBenchmarksToCompareAgainst(benchmark);

    // represents an ENUM (PASS, FAIL, SKIP) - SKIP means this benchmark was not previously tested
    compareState = runComparison(benchmark, benchmarksToCompareAgainst);

    // after running a comparison, the benchmark object will contain properties that represent comparison statistics
    comparedAgainstMean = benchmark.getCompareMean();
    comparedAgainstStandardDeviation = benchmark.getCompareSD();

    score = benchmark.getScore();
    delta = benchmark.getDelta();
    percentChange = benchmark.getPercentChange();
    deviationsFromMean = benchmark.getDeviationsFromMean();
});

Detailed below is a walkthrough of the script above, explaining what each line of code means, and what is happening in the background as you execute the script.

Template Walkthrough

  • In order to compare the benchmarks, we loop through myBenchmarks, which is retrieved from your report
    • Each of these benchmarks is a ComparedBenchmark object (details of the object can be found here)
  • The method getBenchmarksToCompareAgainst(benchmark) is a required call that fetches the benchmarks to compare against
    • This method also lets you pass a different version to compare to and a different range than specified via command args (read the exposed methods section for more info)
    • It returns a list of ComparedBenchmarks
  • Next we run the comparison using runComparison(benchmark, benchmarksToCompareAgainst). This will return either PASS, FAIL, or SKIP.
  • Finally, after running the comparison, all of the benchmark properties (score, mode, comparison values) are stored within benchmark. The methods you can call are found within the ComparedBenchmark model. You can print any of these for viewing, e.g. print(score).
  • NOTE: As a reminder, a table of passable arguments and exposed methods can be found below in their corresponding sections.

Script Configuration Args

When opting for the scripting approach to configuration, you have the ability to pass configuration arguments directly. This is done in two ways: either through the command line, or by modifying your build instructions file (pom.xml or build.gradle(.kts)). Information for both methods of passing configuration arguments is detailed below the chart.

| Argument Flag | .yaml Equivalent | Valid Options | Description |
| ------------- | ---------------- | ------------- | ----------- |
| -F, -failBuild | failBuild: | N/A | This argument is unique in that you don't need to pass a value with it. The default value is false, meaning your build will not fail even if one or more benchmark comparison tests fail. Passing the -F flag sets this value to true, meaning your build will fail if even just one benchmark comparison test fails. |
| -C, -configPath | N/A | An existing comparator.yaml config file | Allows you to forgo scripting and specify the path of a valid comparator.yaml configuration file |
| -S, -scriptPath | N/A | An existing .js script | Specify the file path/name of the script |
| -T, -token | token: | An existing CyBench query access token | Specify your CyBench workspace's query access token |
| -R, -reportPath | reportPath: | Path to a folder containing CyBench generated reports, or a specific report | Specify a certain .cybench report, or a folder of them |
| -s, -scope | scope: | WITHIN or BETWEEN | Choose between comparing within the current version, or between versions; when using BETWEEN, a specific version can be specified with -v, otherwise it defaults to the previous version |
| -v, -compareVersion | compareVersion: | PREVIOUS or any specific version | Specify the version you'd like to compare to; PREVIOUS is the version immediately prior to the tested version, e.g. a benchmark with version 2.0.2 compared to the PREVIOUS version will compare to 2.0.1 |
| -r, -range | range: | ALL or any integer | Decide how many scores to compare the newest one against; ALL means all values, 1 means only the score prior to the newest |
| -m, -method | method: | DELTA or SD | Decide which comparison method to use. DELTA compares the difference in score and requires an additional flag, threshold (-t). SD performs comparisons using standard deviation and also requires an additional flag, deviations allowed (-d) |
| -d, -deviationsAllowed | deviationsAllowed: | Any Double value | Used with assertions to check that the new score is within the given number of deviations from the mean (the mean being calculated from the scores being compared against) |
| -t, -threshold | threshold: | GREATER or PERCENT_CHANGE | Only used with the DELTA method. GREATER compares raw scores; PERCENT_CHANGE measures the percent change of the score relative to previous scores |
| -p, -percentChangeAllowed | percentChangeAllowed: | Any Double value | Used when running assertions; ensures your new score is within X percent of the previous scores you're comparing against |

Passing arguments through the command line

The Comparator can be run as an executable .jar from the command line. When running the Comparator this way, you can configure it to your needs using the flags above. If you're not comfortable passing all the configurations as flags in one line, you can simply use the -C flag to pass a comparator.yaml file that has the configurations you wish to use. Beyond that, every configurable option for the Comparator can be passed as arguments, including your custom or our pre-built JavaScript scripts. Assuming you built the runnable .jar of the Comparator according to this guide, you will have a gocypher-cybench-comparator-1.3.5.jar. You can rename this .jar file if you need. Below is an example of how to run this file:

java -jar gocypher-cybench-comparator-1.3.5.jar -S myScript.js -T ws_b3a34a76-8kkc-4912-z8q8-f09b3b44fbf1_query -R reports/ -m DELTA -s BETWEEN -v PREVIOUS -t greater -r 5

While it may seem a bit daunting, each part of the command breaks down very easily according to the chart above. The most important/necessary flags are the capitalized ones, i.e. the script (-S), your CyBench workspace's query token (-T), and the location of local CyBench reports (-R). NOTE: while not used in this context, the -C flag pointing to a comparator.yaml is just as important; if your comparator.yaml is filled in correctly, the token and report location (along with all other configurables) will be taken from there. Below is an example of running the Comparator .jar file with just a filled-in comparator.yaml:

java -jar gocypher-cybench-comparator-1.3.5.jar -C config/comparator.yaml

Passing arguments through your build file

These same argument flags can also be passed along through your project's pom.xml or build.gradle file. The general idea for this method of passing arguments is the same as the previous, though you define the arguments in the build file as opposed to through the command line.

Gradle Projects

Assuming you've followed this page and added a compareBenchmarks task (or something similar), you'll already have an args = [] section at the bottom of the task, below where the main class is defined. Inside this args = [] section you can pass along your configuration values. Let's start with the simplest case: a filled-out comparator.yaml, passing only that:

args = [ '-C config/comparator.yaml' ]

Full example of the compareBenchmarks task with the config file argument:

task compareBenchmarks(type: JavaExec) {
    group = 'CyBench-Comparator'
    description = 'Compare Benchmarks'
    classpath = files(
            project.sourceSets.main.runtimeClasspath,
            project.sourceSets.test.runtimeClasspath,
            configurations.cybComparator
    )
    main = 'com.gocypher.cybench.CompareBenchmarks'
    args = [
      '-C config/comparator.yaml'
    ]
}

Setting args like this in your build.gradle is exactly the same as the previous command above (with the runnable .jar). However, you also have the ability to pass ALL the arguments, as in the first example:

args = [
        '-S myScript.js',
        '-T ws_b3a34a76-8kkc-4912-z8q8-f09b3b44fbf1_query',
        '-R reports/',
        '-m DELTA',
        '-s BETWEEN',
        '-v PREVIOUS',
        '-t greater',
        '-r 5'
      ]

Full example of the compareBenchmarks task with multiple argument flags:

task compareBenchmarks(type: JavaExec) {
    group = 'CyBench-Comparator'
    description = 'Compare Benchmarks'
    classpath = files(
            project.sourceSets.main.runtimeClasspath,
            project.sourceSets.test.runtimeClasspath,
            configurations.cybComparator
    )
    main = 'com.gocypher.cybench.CompareBenchmarks'
    args = [
        '-S myScript.js',
        '-T ws_b3a34a76-8kkc-4912-z8q8-f09b3b44fbf1_query',
        '-R reports/',
        '-m DELTA',
        '-s BETWEEN',
        '-v PREVIOUS',
        '-t greater',
        '-r 5'
      ]
}

Maven Projects

The process for Maven projects is very similar to Gradle, though instead of altering the args = [] section of your compareBenchmarks task, you change the <comp.class.args> property of the compareBenchmarks profile in your pom.xml. If you've been following this page or the Comparator README, you can find <comp.class.args> inside the <properties> tag, after the <dependencies> tag and before the <build> tag. Below is an example of <comp.class.args> passing the comparator.yaml configuration file as the argument:

<comp.class.args>-C config/comparator.yaml</comp.class.args>

Very similar to the previous command (and the build.gradle commands), you can pass all the other configuration flags in this style:

<comp.class.args>-S myScript.js -T ws_b3a34a76-8kkc-4912-z8q8-f09b3b44fbf1_query -R reports/ -m DELTA -s BETWEEN -t greater -r 5</comp.class.args>

Exposed methods for scripting

The list of exposed methods lives inside ComparatorScriptBindings.js. This JavaScript file binds methods that you call to internal Comparator methods. The script itself has comments for each method, detailing their purpose. A brief overview is also given below, taken from the Comparator's README:

Accessing benchmarks

These methods are used to access and fetch benchmark reports + their scores.

| Method Name | Parameter Types | Return Type | Description |
| ----------- | --------------- | ----------- | ----------- |
| getBenchmarksToCompareAgainst(benchmarkToCompare, compareVersion, range) | ComparedBenchmark, String (optional), String (optional) | List<ComparedBenchmark> | Returns a list of ComparedBenchmark, a model encapsulating useful properties of a benchmark pre and post comparison. Model information can be found here |
| getBenchmarkScores(benchmarks) | List<ComparedBenchmark> | List<Double> | Returns a list of Double values, in case you want to easily extract a list of scores from your List of ComparedBenchmark |

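As a hedged sketch of these two methods together (the version "1.0.1" and range "5" are illustrative values for the optional arguments):

forEach.call(myBenchmarks, function (benchmark) {
    // fetch prior runs of this benchmark, from version 1.0.1, limited to the last 5 scores
    benchmarksToCompareAgainst = getBenchmarksToCompareAgainst(benchmark, "1.0.1", "5");

    // pull out just the Double score values for custom math or logging
    scores = getBenchmarkScores(benchmarksToCompareAgainst);
    print(scores);
});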

Compare methods

This method is used to run a comparison

| Method Name | Parameter Types | Return Type | Description |
| ----------- | --------------- | ----------- | ----------- |
| runComparison(benchmarkToCompare, benchmarksToCompareAgainst) | ComparedBenchmark, List<ComparedBenchmark> | CompareState | To run a comparison, pass a ComparedBenchmark and a List of ComparedBenchmark to compare against. You can get this list by calling getBenchmarksToCompareAgainst(benchmarkToCompare). Returns a CompareState ENUM, which if printed in the script using print(compareState) will print PASS, FAIL, or SKIP |

Arithmetic methods

The following functions perform basic arithmetic for your comparisons/logging

| Method Name | Parameter Types | Return Type | Description |
| ----------- | --------------- | ----------- | ----------- |
| calculateMean(scores) | List<Double> | Double | Returns the average score, given a list of scores |
| calculateSD(scores) | List<Double> | Double | Returns the standard deviation of a list of scores |
| calculateDeviationsFromMean(score, compareMean, compareStandardDeviation) | Double, Double, Double | Double | Returns the number of deviations the score deviates from compareMean, using compareStandardDeviation |
| calculatePercentChange(newScore, compareScore) | Double, Double | Double | Returns the percent change from one score to another |

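A hedged sketch chaining these helpers inside the forEach loop from the example script (here the new score is measured against the mean of the fetched scores, as an illustration):

forEach.call(myBenchmarks, function (benchmark) {
    scores = getBenchmarkScores(getBenchmarksToCompareAgainst(benchmark));
    mean = calculateMean(scores);
    sd = calculateSD(scores);
    // how many standard deviations the new score sits from the mean
    deviationsFromMean = calculateDeviationsFromMean(benchmark.getScore(), mean, sd);
    // percent change of the new score relative to the mean
    percentChange = calculatePercentChange(benchmark.getScore(), mean);
});
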
Assertion methods

The following assertion methods can be used to pass/fail your tests/build depending on certain factors

| Method Name | Parameter Types | Return Type | Description |
| ----------- | --------------- | ----------- | ----------- |
| passAssertionDeviation(deviationsFromMean, deviationsAllowed) | Double, Double | boolean | Specific assertion; returns true (pass) if your score's deviations from the mean are within the given deviations allowed, otherwise returns false (fail) |
| passAssertionPercentage(percentChange, percentageAllowed) | Double, Double | boolean | Specific assertion; returns true (pass) if your score's change in percentage is within the given percentage allowed, otherwise returns false (fail) |
| passAssertionPositive(val) | Double | boolean | Specific assertion; returns true (pass) if the given value is greater than 0, otherwise returns false (fail) |

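Continuing the arithmetic sketch above, these assertions can then gate the result (the thresholds of 2.0 deviations and 10.0 percent are illustrative):

    // inside the same forEach loop as the arithmetic sketch
    print(passAssertionDeviation(deviationsFromMean, 2.0));   // within 2 deviations of the mean?
    print(passAssertionPercentage(percentChange, 10.0));      // within 10% of the compared scores?
    print(passAssertionPositive(benchmark.getDelta()));       // did the score improve at all?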