ContiPerf 1.x

Overview

ContiPerf is a lightweight testing utility that enables the user to easily leverage JUnit 4 test cases as performance tests, e.g. for continuous performance testing. It is inspired by JUnit 4's easy test configuration with annotations and by JUnitPerf's idea of wrapping unit tests for performance testing, but is more powerful and easier to use:

  • Using Java annotations for defining test execution characteristics and performance requirements
  • You can mark a test to run a certain number of times or to be repeatedly executed for a certain amount of time
  • Performance requirements can be maximum, average, median or any percentile execution time
  • Perfect control and readability of performance requirements
  • You can run tests in two different modes, using them as simple unit tests or performance tests
  • Easy integration with Eclipse and Maven
  • Export of execution summary to a CSV file
  • Small library without external dependencies (only JUnit)
  • Easy extension with custom statistics evaluation and tracking mechanisms

Here is a very simple test:

  import org.junit.*;
  import org.databene.contiperf.*;

  public class SmokeTest {

      @Rule
      public ContiPerfRule i = new ContiPerfRule();

      @Test
      @PerfTest(invocations = 1000, threads = 20)
      @Required(max = 1200, average = 250)

      public void test1() throws Exception {
          Thread.sleep(200);
      }

  }

 

Defining an attribute of type ContiPerfRule with the annotation @Rule activates ContiPerf. You can then choose from different settings for specifying test execution (@PerfTest) and performance requirements (@Required). In the example, the test is configured to be executed 1000 times with 20 concurrent threads, so each thread performs 50 invocations. A maximum execution time of 1.2 seconds and an average of at most 250 milliseconds are tolerated.

A small but important detail is that ContiPerf's behavior differs from what you might expect from JUnit: ContiPerf creates a new test class instance and setup for each test method, but not for each test invocation! ContiPerf's purpose is to measure the performance of your code, not of JUnit and the garbage collector! So the behavior is as follows:

For each test method, a new instance of the test class is created, and all invocations of that method happen on one and the same Java object!

With JUnit 4.7, the @Before method(s) are called first, then the test method is invoked 1,000 times in sequence, and finally the @After method(s) are called.

An example: Assuming you have a test class with two test methods, test1() and test2(), which are executed two times each, a @Before method called before() and an @After method called after(), the invocation sequence is:

constructor()
before()
test1()
test1()
after()

constructor()
before()
test2()
test2()
after()

With JUnit 4.8 and newer versions, the @Before method(s), the test method and the @After method(s) are called in turn for each invocation.

An example: Assuming you have a test class with two test methods, test1() and test2(), which are executed two times each, the invocation sequence is:

constructor()
before()
test1()
after()
before()
test1()
after()

constructor()
before()
test2()
after()
before()
test2()
after()
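
As a sketch, a test class producing the sequences above might look like this (class, method and field names are only illustrative; the imports follow the earlier examples):

  import org.junit.*;
  import org.databene.contiperf.*;

  public class LifecycleTest {

      @Rule
      public ContiPerfRule i = new ContiPerfRule();

      @Before
      public void before() {
          // set up resources used by the test method
      }

      @After
      public void after() {
          // clean up resources
      }

      // each test method is executed two times
      @Test
      @PerfTest(invocations = 2)
      public void test1() throws Exception {
          Thread.sleep(50);
      }

      @Test
      @PerfTest(invocations = 2)
      public void test2() throws Exception {
          Thread.sleep(50);
      }

  }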

 

As of ContiPerf 1.05 you may annotate classes with @PerfTest and @Required. Class annotations provide defaults for all test methods that do not have method annotations:

  import org.junit.*;
  import org.databene.contiperf.*;

  @PerfTest(invocations = 5)
  @Required(max = 1200, average = 250)

  public class SmokeTest {

      @Rule
      public ContiPerfRule i = new ContiPerfRule();

      @Test
      public void test1() throws Exception {
          Thread.sleep(200);
      }

      @Test
      public void test2() throws Exception {
          Thread.sleep(150);
      }

 }

 

Performance Test Suites

Test suites are a powerful mechanism introduced in ContiPerf 1.05. They can be used, for example, for

  • reusing existing unit tests that are not ContiPerf-aware in a manner that configures repeated execution and adds performance requirements
  • defining general performance requirements in a ContiPerf test class and reusing them in different test scenarios that induce different execution characteristics, e.g. long-term load, expected load, peak load, etc.
  • ...

The general suite concept was introduced by JUnit: You take one or more test classes, each containing one or more test methods, and group them into a test suite. When the test suite is run, all tests contained in the related test classes are executed. While most JUnit users use suites only as a grouping mechanism, ContiPerf lets you add performance testing functionality to plain JUnit and ContiPerf tests this way.
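
For comparison, a plain JUnit 4 suite simply groups test classes; the class names in the following sketch are placeholders:

  import org.junit.runner.RunWith;
  import org.junit.runners.Suite;
  import org.junit.runners.Suite.SuiteClasses;

  // A plain JUnit 4 suite that only groups test classes;
  // MyFirstTest and MySecondTest are placeholder names.
  @RunWith(Suite.class)
  @SuiteClasses({ MyFirstTest.class, MySecondTest.class })
  public class AllTests {
  }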

A ContiPerf test suite is defined with two main annotations, @RunWith and @SuiteClasses and can be configured with the already known @PerfTest and @Required annotations:

@RunWith(ContiPerfSuiteRunner.class)
@SuiteClasses(MyApplicationTest.class)

@PerfTest(invocations = 1000, threads = 30)
public static class PeakLoadTest {
}

You always need to use @RunWith(ContiPerfSuiteRunner.class) for executing a suite. The @SuiteClasses annotation must provide a comma-separated list of all classes that contain the 'real' test implementations. If you add @PerfTest and @Required annotations, they apply as defaults for all 'real' tests that do not have a specific annotation, as in the sketch below.
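
For illustration, a suite that wraps two existing test classes and applies defaults to all of their tests might look like this (CustomerServiceTest and OrderServiceTest are hypothetical names; imports are omitted as in the suite example above):

  @RunWith(ContiPerfSuiteRunner.class)
  @SuiteClasses({ CustomerServiceTest.class, OrderServiceTest.class })
  @PerfTest(invocations = 1000, threads = 30)
  @Required(average = 250, max = 1200)
  public class ServiceLoadTest {
      // CustomerServiceTest and OrderServiceTest are placeholder names
      // for existing plain JUnit 4 test classes.
  }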

However, there is a limitation: ContiPerf test suites can only wrap test classes which use the default JUnit 4 test runner. Tests that use another runner, indicated by a @RunWith annotation, do not interoperate correctly with ContiPerf. The technical reason is that two JUnit design issues prevent their use for performance testing: The first is that JUnit does not propagate @Rules from test suites to the wrapped tests (there is an open request in the JUnit GitHub project); the second is that JUnit's default behavior is to repeat test class instantiation, setup, teardown and result tracking for each single test method invocation. Applied to several thousand performance test invocations, this would induce severe additional load and would render the approach useless for most local performance testing purposes.

 

Using ContiPerf in an IDE

For running the tests in your IDE (e.g. Eclipse), simply add the contiperf.jar to the classpath and run the JUnit test. If any requirement is violated, this is reported as a test failure with an appropriate message.

 

Integrating ContiPerf with Maven

Simply add a dependency on contiperf with scope test, and don't forget a dependency on JUnit as the test invoker:

    <dependencies>

        ...

        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.7</version>
            <scope>test</scope>
        </dependency>
       
        <dependency>
            <groupId>org.databene</groupId>
            <artifactId>contiperf</artifactId>
            <version>1.0</version>
            <scope>test</scope>
        </dependency>


    </dependencies>

 

ContiPerf is activated automatically when JUnit encounters a test class with a ContiPerfRule attribute annotated with @Rule:

  mvn test  

 

Dual use with Maven

Using the same test class implementation for unit tests and performance tests is usually not recommended, but if you insist on doing so, you can run the tests in two different modes:

  1. By default, performance testing is activated
  2. You can suppress performance testing, reducing the test to its unit test character, by passing the system property contiperf.active=false to the test JVM:
    mvn test -DargLine="-Dcontiperf.active=false"

License

ContiPerf is Open Source and you can choose among the following licenses:

  • Apache License 2.0
  • Lesser GNU Public License (LGPL) 3.0
  • Eclipse Public License 1.0
  • BSD License

Requirements

You need at least Java 5 and JUnit 4.7 to use ContiPerf.

 

Getting help / getting involved

If you are stuck, found a bug, have ideas for ContiPerf or want to help, visit the forum.

 

Defining performance test execution

@PerfTest(invocations = 300)
Executes the test 300 times, regardless of the number of threads.
If left out, ContiPerf defaults to invocations = 1, ignoring the 'threads' count.

@PerfTest(threads = 30)
Executes the test in 30 concurrent threads.
The total number of 'invocations' is distributed evenly over the threads.
Thus, for invocations = 300 and threads = 30, each thread performs 10 invocations.
If left out, ContiPerf defaults to a single thread.

@PerfTest(duration = 20000)
Executes the test repeatedly for at least 20,000 milliseconds (20 seconds).
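
Combining these settings, a test that is invoked 300 times by 30 concurrent threads (10 invocations per thread) could be declared as follows; the class and method names are only illustrative:

  import org.junit.*;
  import org.databene.contiperf.*;

  public class ExecutionConfigTest {

      @Rule
      public ContiPerfRule i = new ContiPerfRule();

      // 300 invocations distributed over 30 threads: 10 invocations per thread
      @Test
      @PerfTest(invocations = 300, threads = 30)
      public void concurrentTest() throws Exception {
          Thread.sleep(10);
      }

  }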

 

Defining performance requirements

@Required(throughput = 20)
Requires at least 20 test executions per second.

@Required(average = 50)
Requires an average execution time of not more than 50 milliseconds.

@Required(median = 45)
Requires that 50% of all executions do not take longer than 45 milliseconds.

@Required(max = 2000)
Requires that no invocation takes more than 2000 milliseconds (2 seconds).

@Required(totalTime = 5000)
Requires that the sum of all execution times is not more than 5000 milliseconds (5 seconds).

@Required(percentile90 = 3000)
Requires that 90% of all executions do not take longer than 3000 milliseconds.

@Required(percentile95 = 5000)
Requires that 95% of all executions do not take longer than 5000 milliseconds.

@Required(percentile99 = 10000)
Requires that 99% of all executions do not take longer than 10000 milliseconds.

@Required(percentiles = "66:200,96:500")
Requires that 66% of all executions do not take longer than 200 milliseconds
and 96% of all executions do not take longer than 500 milliseconds.
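
Several requirements can be combined in a single @Required annotation, as in the first example of this page. As a further sketch (class and method names are made up):

  import org.junit.*;
  import org.databene.contiperf.*;

  public class RequirementsTest {

      @Rule
      public ContiPerfRule i = new ContiPerfRule();

      // requires a median of at most 45 ms, at most 3000 ms for 90% of the
      // invocations, and no single invocation longer than 2 seconds
      @Test
      @PerfTest(invocations = 100)
      @Required(median = 45, percentile90 = 3000, max = 2000)
      public void requirementTest() throws Exception {
          Thread.sleep(20);
      }

  }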

 

Standard Log File

ContiPerf logs invocation statistics to a log file 'target/contiperf/contiperf.log'. Its columns are

  1. test name
  2. total execution time
  3. number of invocations
  4. start time (as milliseconds since 1970-01-01)

A sample of the generated log file is:

org.databene.contiperf.junit.SmokeTest.simpleTest,101,1,1269870975079
org.databene.contiperf.junit.SmokeTest.unrepeatableTest,101,1,1269870975189
org.databene.contiperf.junit.SmokeTest.detailedTest,1008,5,1269870975295
org.databene.contiperf.junit.SmokeTest.continuousTest,2014,10,1269870976306
org.databene.contiperf.junit.SmokeTest.complexTest,1006,5,1269870978323

 

Custom Performance Loggers

If you want to do special or automatic evaluations of the summary or even of each single invocation, you can easily do so by implementing the interface org.databene.contiperf.ExecutionLogger. Your custom implementation could, for example, write special log file formats or write execution logs to a database for tracking their evolution over the course of a project. A simple implementation that writes all events to the console is the ConsoleExecutionLogger:


 

import org.databene.contiperf.ExecutionLogger;

public class ConsoleExecutionLogger implements ExecutionLogger {

    public void logSummary(String id, long elapsedTime, long invocationCount, long startTime) {
        System.out.println(id + ',' + elapsedTime + ',' + invocationCount + ',' + startTime);
    }

    public void logInvocation(String id, int latency, long startTime) {
        System.out.println(id + ',' + latency + ',' + startTime);
    }

}


A custom ExecutionLogger can be used by providing it as a parameter in the ContiPerfRule initialization:

 

  import org.junit.*;
  import org.databene.contiperf.*;

  public class SmokeTest {

      @Rule
      public ContiPerfRule i = new ContiPerfRule(
            new ConsoleExecutionLogger());

      @Test
      @PerfTest(invocations = 5)
      @Required(max = 1200, average = 250)

      public void test1() throws Exception {
          Thread.sleep(200);
      }

  }

A ContiPerf test suite can make use of an ExecutionLogger by declaring it as a public attribute of type ExecutionLogger:

    @RunWith(ContiPerfSuiteRunner.class)
    @SuiteClasses(UnconfiguredTest.class)
    @PerfTest(invocations = 6)

    public static class ConfiguredSuite {
        public ExecutionLogger el = new MyExecutionLogger(3);
    }
 

 

Release History

v1.0   2010-03-29 First release

v1.01 2010-04-08 Bug fix release

v1.02 2010-04-12 Bug fix release for non-Maven-based projects

v1.03 2010-04-19 Supporting multithreaded test execution

v1.04 2010-04-23 Bug fix release

v1.05 2010-05-24 Supporting test suites

v1.06 2010-05-24 Reduced memory footprint and performance impact

v1.07 2010-11-14 Bug fix release

 

 

Roadmap

v2.0.0 2011-02-?? generating charts with execution times, stopwatch API for non-JUnit clients

later: Supporting ramp-up and random pause time between executions