JUnit test reporting

Some time ago, in 2011, I asked a question on Stack Overflow about test report enrichment for JUnit tests. Because I was asked about this topic several times afterwards, and again today, I am writing down the solution I developed and used at that time. Currently, I do not have a better idea yet.

The issue to solve

For a customer, detailed test reports are needed for integration tests which not only show that everything passes, but also what the test did and to which requirement it is linked. These reports should be created automatically, to avoid repetitive tasks, to save money, and to not frustrate valuable developers and engineers.

For that, I thought about a way to document the more complex integration tests (which is also valuable for component tests). Documentation would be needed for test classes and test methods. For the colleagues responsible for QA, it is a great help to see which requirement, Jira ticket or whatever else the test is linked to, and what the test actually tries to do, so they can double-check it. If the report is good enough, it may also be sent out to the customers to prove quality.

The big question now is: how can the documentation for each method and each test class be put into the JUnit reports? JUnit 4.9 was used at the beginning of this issue with Maven 3.0, and now I am on JUnit 4.11 and Maven 3.1.0.

The solution

I did what was suggested on Stack Overflow, because I did not find a way to enhance the Surefire reports directly. The solution for me was to develop a JUnit run listener which collects some additional information about the tests, which can later be used to aggregate the reports.

The JUnit listener was attached to the maven-failsafe-plugin (the integration-test counterpart of the maven-surefire-plugin; the configuration is the same for both) like this:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-failsafe-plugin</artifactId>
    <version>2.12</version>
    <configuration>
        <properties>
            <property>
                <name>listener</name>
                <value>fully.qualified.class.name.to.your.listener</value>
            </property>
        </properties>
    </configuration>
</plugin>
The listener needs to extend org.junit.runner.notification.RunListener, and there you can override the methods testRunStarted, testRunFinished and so forth. The listener needs to be put into a separate Maven project, because a listener cannot be built and used in the same project.
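A minimal sketch of such a listener, just to show the structure, might look like this (the class name is arbitrary and the method bodies only indicate where the collecting logic would go):

import org.junit.runner.Description;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;
import org.junit.runner.notification.RunListener;

public class TestDocRunListener extends RunListener {

    @Override
    public void testRunStarted(Description description) throws Exception {
        // Called before any test runs; a good place to open the report file.
    }

    @Override
    public void testStarted(Description description) throws Exception {
        // Called for each test method; record the start timestamp here.
    }

    @Override
    public void testFailure(Failure failure) throws Exception {
        // Called on failure; the Failure carries the Description and the exception.
    }

    @Override
    public void testFinished(Description description) throws Exception {
        // Called for each test method after it finishes, passed or not.
    }

    @Override
    public void testRunFinished(Result result) throws Exception {
        // Called after all tests; flush and close the report file here.
    }
}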
For the tests themselves, I also created an annotation into which I put the requirement ID and other needed information:
@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.TYPE, ElementType.METHOD})
@Documented
public @interface TestDoc {
    public String testId();
    public String rationale();
    public String bugId() default "";
    /* Put in what you need... */
}
A test class can then look like this:
@TestDoc(testId="123", rationale="Assure handling of invalid data does not cause a crash.", bugId="#42")
public class InvalidStreamingTests {

    @Test
    @TestDoc(testId="124", rationale="Check for stream which aborts and leads to incomplete content.")
    public void testStreamAbort()
    {
        /* The actual test code */
    }
}
In the run listener one gets a reference to the actual test, which can be scanned for annotations via reflection. The Description class is quite handy; for example, the test class name can be retrieved as easily as:
String testClassName = description.getClassName();
The information found there can be stored away, which I did in a simple CSV file. After the test run, this information can be gathered into a fancy report. The information which can be collected ranges from the start timestamps, over the actual test class calls and the results of the tests (assertions and assumptions), to the finish timestamps. The classes can be inspected via reflection at all times, and all information can be written into a file or even to a database or server.
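As an illustration, the annotation lookup and the CSV writing could be combined in the listener roughly like this; the file name, column layout and error handling are simplified assumptions, not the exact code I used:

import java.io.FileWriter;
import java.io.IOException;

import org.junit.runner.Description;
import org.junit.runner.notification.RunListener;

public class CsvTestDocListener extends RunListener {

    // Assumed output location; this could just as well be a database or a server.
    private static final String CSV_FILE = "target/testdoc-report.csv";

    @Override
    public void testStarted(Description description) throws Exception {
        // Annotation on the test method, if present.
        TestDoc methodDoc = description.getAnnotation(TestDoc.class);
        // Annotation on the test class, looked up via reflection.
        Class<?> testClass = description.getTestClass();
        TestDoc classDoc = testClass != null ? testClass.getAnnotation(TestDoc.class) : null;
        TestDoc doc = methodDoc != null ? methodDoc : classDoc;

        writeRow(description.getClassName(), description.getMethodName(),
                doc != null ? doc.testId() : "", doc != null ? doc.rationale() : "",
                System.currentTimeMillis());
    }

    private void writeRow(String className, String methodName, String testId,
            String rationale, long startTimestamp) throws IOException {
        // Append one CSV row per started test.
        try (FileWriter writer = new FileWriter(CSV_FILE, true)) {
            writer.write(className + ";" + methodName + ";" + testId + ";" + rationale
                    + ";" + startTimestamp + System.lineSeparator());
        }
    }
}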
The reporting was done offline later on when I first used this solution. But if some more work effort is possible, a Maven report plugin (or a normal plugin) can be written which collects the information, for example in the post-integration-test phase. Here the only limitation is the imagination. It depends on whether only a technical report is needed as a TXT file or a fancy marketing campaign needs to be built around it. In daily life it is something in between.
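As a rough idea of the offline step, a small aggregator could read the CSV file and produce a plain text report. The file names and the column order are assumptions and have to match whatever the listener actually wrote:

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

public class TestDocReport {

    public static void main(String[] args) throws IOException {
        List<String> rows = Files.readAllLines(
                Paths.get("target/testdoc-report.csv"), StandardCharsets.UTF_8);
        StringBuilder report = new StringBuilder("Integration test report\n\n");
        for (String row : rows) {
            // Expected columns: class;method;testId;rationale;startTimestamp
            String[] cols = row.split(";");
            report.append(cols[2]).append(" - ").append(cols[0]).append('#')
                    .append(cols[1]).append(": ").append(cols[3]).append('\n');
        }
        Files.write(Paths.get("target/testdoc-report.txt"),
                report.toString().getBytes(StandardCharsets.UTF_8));
    }
}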