Tuesday, 10 April 2012

Writing lightweight REST integration tests with the Jersey Test Framework

Writing REST services with JAX-RS (and its reference implementation Jersey) is easy. A class annotated with @Path and some methods with @GET, @POST, ... annotations is enough for a fully functional REST service. Real-world applications, however, are more complex. There are request filters for authorization and access control, context providers for injecting data-access objects, mappers that convert exceptions to appropriate HTTP responses, MessageBodyReaders and -Writers to convert JSON and XML to and from Java objects, and so on. All these components can (and should) be tested using unit tests. But this is not enough. To be sure that these components work together correctly, integration tests are needed. These can be costly to run: they always need the full environment to be configured and running, and the more complex an application, the more complex it is to set up this environment (web server, database, search engine, message queue, ...).

The Jersey Test Framework offers the possibility to write lightweight integration tests that do not need any external resources to be available. The web container, in which all components (resources, filters, mappers, ...) run, is configured and started on the fly. Moreover, it is possible to provide mocks for data-access objects and thus eliminate the need for external services.

A short introduction to the Jersey Test Framework can be found in the Jersey documentation: http://jersey.java.net/nonav/documentation/latest/test-framework.html. The code for the following example is available on GitHub: https://github.com/mlex/jerseytest.

Example REST service

Let's start with a simple example. The following class implements a simple TODO-service. You can get a list of TODOs, add new TODOs and remove a TODO from the list.

@Path("/todo")
public class TodoResource {

    @Context
    private TodoService todoService;

    @GET
    public String getTodos() {
        return StringUtils.join(todoService.getAllTodos(), ",");
    }

    @POST
    public void addTodo(String newTodo) {
        todoService.addTodo(newTodo);
    }

    @DELETE
    @Path("/{todo}")
    public void removeTodo(@PathParam("todo") String todoToRemove) {
        todoService.removeTodo(todoToRemove);
    }
}

The instance of TodoService is injected into the REST-resource using a simple SingletonTypeInjectableProvider:

public class TodoServiceProvider extends
        SingletonTypeInjectableProvider<Context, TodoService> {

    public TodoServiceProvider() {
        super(TodoService.class, new TodoService());
    }
}

To keep the example as simple as possible, the TodoService simply stores the TODOs in a list. In a real-world application, the service would write the todos into a database, of course.

public class TodoService {

    private final List<String> todos = new ArrayList<String>();

    public List<String> getAllTodos() {
        return new ArrayList<String>(todos);
    }

    public void addTodo(String todo) {
        todos.add(todo);
    }

    public boolean removeTodo(String todo) {
        if (!todos.remove(todo)) {
            throw new TodoNotFoundException();
        }
        return true;
    }
}

The most interesting part of this example (and the part that demonstrates the need for integration tests on top of unit tests) is the exception thrown in the removeTodo method. This exception is not caught in the TodoResource. It will be propagated and finally transformed into a 400 response by the following exception mapper:

public class NotFoundMapper implements ExceptionMapper<TodoNotFoundException> {
    public Response toResponse(TodoNotFoundException e) {
        return Response.status(Response.Status.BAD_REQUEST)
                .entity("TodoNotFoundException").build();
    }
}

With these classes, our todo-service is ready to use. To check if everything is working, we can use curl:

curl -XPOST -H "Content-Type: text/plain" --data "fetch milk" \
    http://localhost:8080/mjl-jersey-server/todo
curl -XPOST -H "Content-Type: text/plain" --data "call steve" \
    http://localhost:8080/mjl-jersey-server/todo
curl -XGET http://localhost:8080/mjl-jersey-server/todo
# fetch milk,call steve

curl -XDELETE http://localhost:8080/mjl-jersey-server/todo/fetch%20milk
curl -XGET http://localhost:8080/mjl-jersey-server/todo
# call steve

curl -v -XDELETE http://localhost:8080/mjl-jersey-server/todo/fetch%20milk
# ...
# < HTTP/1.1 400 Bad Request
# ...
# TodoNotFoundException

Testing the REST-service

Now we want to write tests for the todo service. Testing the get- and add-todo methods with the Jersey Test Framework wouldn't be much different from simple unit tests. The power of the framework becomes clear when testing the remove-todo method. When a user wants to delete a non-existent todo, we expect the service to return a 400 response. Ensuring this with standard unit tests would be hard. The test case using the Jersey Test Framework is quite simple.

public class TodoResourceTest extends JerseyTest {

    public static TodoService todoServiceMock = Mockito.mock(TodoService.class);

    @Override
    protected WebAppDescriptor configure() {
        // scan the package that contains TodoResource, NotFoundMapper
        // and the MockTodoServiceProvider defined below
        return new WebAppDescriptor.Builder(TodoResource.class.getPackage().getName())
                .build();
    }

    @Test
    public void shouldReturn400OnNotFoundException() {
        String todo = "test-todo";
        Mockito.when(todoServiceMock.removeTodo(todo))
                .thenThrow(new TodoNotFoundException());
        ClientResponse response = resource().path("todo/" + todo)
                .delete(ClientResponse.class);
        Assert.assertEquals(ClientResponse.Status.BAD_REQUEST,
                response.getClientResponseStatus());
    }

    public static class MockTodoServiceProvider extends
            SingletonTypeInjectableProvider<Context, TodoService> {
        public MockTodoServiceProvider() {
            super(TodoService.class, todoServiceMock);
        }
    }
}

Some explanations:
Because we do not want to connect to an external database, the TodoService has to be mocked. This is done by defining a provider that injects a mocked TodoService. Because we also want to configure the mock object inside our tests, the MockTodoServiceProvider is defined as an inner class of the test, and the mock object is stored in a class variable of our test class.

The test is configured to use a GrizzlyWebTestContainer. See the last part of this blog post for the advantages and disadvantages of other containers. The configuration of the test container is done in the configure() method.

In the test method itself, the TodoService mock is instructed to throw a TodoNotFoundException when the removeTodo() method is called. A WebResource pointing to our test container is created and a DELETE request is fired. If everything works fine, the result of this request must be a 400 error, and the response body must contain the reason for the error.

In the same way, you can also test other components, like authorization filters, access control and response mappers (Jackson or JAXB), without the need for an external environment. Of course, there is also a downside to this kind of test: it is rather slow. Setting up and tearing down the web container on the fly is very expensive. Another disadvantage is that most test containers use real system ports for their communication (the only exception is the InMemoryTestContainer, which has other shortcomings). These ports may be blocked by other applications, which causes the tests to fail. This is a problem when using helpers like Infinitest, where it can happen that multiple tests run at the same time.
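One common way to reduce such port collisions (a sketch, not part of the example project) is to ask the operating system for a free ephemeral port by binding a ServerSocket to port 0, and then hand that port to the test container; how the port is passed (system property or API) depends on your framework version, so the snippet only shows the port lookup itself:

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortFinder {

    // Bind to port 0 so the OS picks a free ephemeral port, then release it.
    // There is a small race window between closing the socket and the test
    // container re-binding the port, but collisions become very unlikely.
    public static int findFreePort() throws IOException {
        ServerSocket socket = new ServerSocket(0);
        try {
            return socket.getLocalPort();
        } finally {
            socket.close();
        }
    }

    public static void main(String[] args) throws IOException {
        int port = findFreePort();
        System.out.println(port > 0 && port <= 65535); // prints "true"
    }
}
```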

Integrated Client-Server-Tests

If there is also a Java-based client implementation for the REST service, this client can be used in Jersey tests, too. Our example TODO-service comes with such a client implementation:

public class TodoClient {

    public static final String TODO_RESOURCE_PATH = "/todo";

    private final String uri;

    private final Client client = new Client();

    public TodoClient(String uri) {
        this.uri = uri;
    }

    public WebResource resource() {
        return client.resource(uri).path(TODO_RESOURCE_PATH);
    }

    public WebResource resource(String todo) {
        return resource().path("/" + todo);
    }

    public String getAllTodos() {
        String todos = resource().get(String.class);
        return todos;
    }

    public void addTodo(String todoToAdd) {
        resource().type(MediaType.TEXT_PLAIN).post(todoToAdd);
    }

    public void removeTodo(String todoToRemove) {
        try {
            resource(todoToRemove).delete();
        } catch (UniformInterfaceException e) {
            if (e.getResponse().getClientResponseStatus() == ClientResponse.Status.BAD_REQUEST
                    && "TodoNotFoundException".equals(e.getResponse().getEntity(String.class))) {
                throw new TodoNotFoundException();
            } else {
                throw e;
            }
        }
    }
}

The most interesting part of this client is again the removeTodo() method. It not only executes the HTTP request but also checks whether the request failed because the todo to delete did not exist. This is done by checking the response code and the response body. This can be used to simplify the Jersey test:

    private TodoClient todoClient() {
        TodoClient todoClient = new TodoClient(getBaseURI().toString());
        Whitebox.setInternalState(todoClient, "client", client());
        return todoClient;
    }

    @Test(expected = TodoNotFoundException.class)
    public void removeTodoShouldThrowNotFoundException() {
        final String todo = "test-todo";
        Mockito.when(todoServiceMock.removeTodo(todo))
                .thenThrow(new TodoNotFoundException());
        todoClient().removeTodo(todo);
    }

Now this test really cannot be called a unit test anymore. In these few lines, we check that the TodoNotFoundException thrown by the TodoService is correctly converted into an HTTP response that our client understands and converts back into the appropriate TodoNotFoundException. If any of the involved components is changed without adjusting the affected components, the test will fail.

Tips and Tricks

Decide what type of container to use before writing tests

There are two kinds of containers available for the jersey test framework: high-level servlet containers and low-level containers. Both have advantages and disadvantages.

The high-level servlet containers offer the full functionality of a servlet container, automatically injecting instances of HttpServletRequest, ... . If your application relies heavily on servlet-specific classes, these containers will be your first (and probably only) choice. The servlet functionality comes at a price: all implementations need to open system ports, which makes the tests more fragile and also a little bit slower. Another drawback of using real servlet containers in tests is that you don't have direct access to the instances of your resources and (context-)providers. To allow the use of mock objects, you must work around this problem, for example by assigning context objects to static fields, as we did with the mocked TodoService.

Low-level containers, on the other hand, allow you to directly modify the ResourceConfig that is used. This way, you have access to all instances (resources, providers, filters) used by the REST service, which greatly simplifies mocking. So if you don't rely on the servlet API, you'll probably go for a low-level container.

Do not use WebAppDescriptor for low-level containers

Although possible, I do not recommend using WebAppDescriptors for low-level containers. The reason lies in the method LowLevelAppDescriptor.transform(), which is used to transform a WebAppDescriptor into a LowLevelAppDescriptor when a low-level container is used. The method simply ignores all non-boolean init-params. Moreover, there is a bug when using the property com.sun.jersey.config.property.packages with multiple (colon-separated) package names. Even if these shortcomings get fixed, you should not rely on the transform() method. The power of low-level containers lies in the possibility to directly modify the ResourceConfig, which is only possible when using a LowLevelAppDescriptor.

Speedup jersey tests

Because the JerseyTest base class starts a new web container before each test, the tests are rather slow. One possibility to speed them up is to start a web container only once per test suite. An implementation of a base class doing this is included in the example application.
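The idea can be sketched in plain Java (the names here are illustrative, not the actual classes from the example application): keep the started container in a static field, initialize it lazily on first use, and stop it once via a JVM shutdown hook when the test run ends.

```java
// Illustrative sketch of a once-per-suite container holder; the real
// implementation in the example application wraps a Jersey test container.
public class SuiteContainerHolder {

    private static Object container; // stands in for the started test container

    public static synchronized Object getOrStart() {
        if (container == null) {
            container = startContainer();
            // stop the container exactly once, when the JVM (the test run) exits
            Runtime.getRuntime().addShutdownHook(new Thread() {
                @Override
                public void run() {
                    stopContainer();
                }
            });
        }
        return container;
    }

    private static Object startContainer() {
        // here the real code would configure and start the web container
        return new Object();
    }

    private static void stopContainer() {
        // here the real code would shut the container down
    }

    public static void main(String[] args) {
        // every test gets the same container instance
        System.out.println(getOrStart() == getOrStart()); // prints "true"
    }
}
```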

Extended InMemoryTestContainer

The InMemoryTestContainer is the only container that does not open any real ports on the system. Of course, being a low-level container, no servlet-specific functionality is available with it. But if you do not rely on the servlet API too much, this container is the perfect choice for writing really fast and lightweight integration tests.

However, the InMemoryTestContainer that comes with the Jersey Test Framework has another drawback: you cannot declare any request or response filters, because they are overridden by logging filters. To work around this problem, I implemented my own in-memory test container (basically copying the original code and removing the logging filters). The code is also included in the example application.

Thursday, 8 March 2012

Create a list of JAX-RS resources using sed

There are things you can do with sed and things you simply cannot. And there are things you can do, but should not. The following script probably belongs to the third category. But it was fun to write and the final result is quite beautiful:

sed -r 's/\r//g;s/^\s*//g;/@Path.*URI_PATH/d;/final.*URI_PATH/{;/".*"/{;s/.*"(.*)".*/\1/;h;};d;};/^@Path[^P]/{;s/.*@Path\("(.*)"\).*/\1/;H;d;};/public/{;g;/\n@/{;s/^([^\n]*)\n?(.*)?\n(@[^\n]*)\n?(.*)?/\3\t\1\4\n\2/;p;};};/\/\*\*/{;g;s/^([^\n]*)\n.*$/\1/;s/$/\n!/;h;d;};/\*\//{;g;s/\n!$//;h;d;};/^\*/{;/^\*\s*(@.*)?$/{;g;s/\n!$//;h;};x;/\n!$/{;s/\n!$//;x;s/^\*\s*/\t\t/;H;g;s/$/\n!/;};x;d;};/@(GET|PUT|POST|DELETE)/{;H;d;};d' *.java

So what is this piece of cryptic code supposed to do? It takes some Java files, scans them for JAX-RS annotations (@Path, @GET, ...) and prints a small overview of all REST resources defined in these files, together with a short description taken from the Javadoc comments. Take the following example implementation of a REST service:

@Path(BlogPostResource.URI_PATH)
class BlogPostResource {

    public final static String URI_PATH = "/blogposts";

    /**
     * Create a new blog post
     */
    @POST
    public Response postNewPost() {
        return Response.ok().build();
    }

    /**
     * Get blog post with given id
     */
    @GET
    @Path("/{id}")
    public Response getPost(@PathParam("id") String id) {
        return Response.ok().build();
    }

    /**
     * Update the blogpost with given id.
     * Multiline comments are possible, too.
     * @param id
     *     the id of the blog post to update
     */
    @PUT
    @Path("/{id}")
    public Response updatePost(@PathParam("id") String id) {
        return Response.ok().build();
    }
}

This is what the script will return:

@POST /blogposts
          Create a new blog post
@GET  /blogposts/{id}
          Get blog post with given id
@PUT  /blogposts/{id}
          Update the blogpost with given id.
          Multiline comments are possible, too.

There are some assumptions to be made for the script to work:

  1. The base-path of each class implementing a rest-resource must be stored in a static class-variable "URI_PATH". This variable must be defined before any other annotated methods.
  2. The @GET (resp. @POST, @PUT, @DELETE) annotations must always be above the @Path annotation.
  3. Every rest-implementing method must have a javadoc-comment starting with /** (even if it is empty).
  4. The javadoc comment must always come before the annotations.

This said, here is the less cryptic (and commented) code of the sed-script:

sed -r '
# before we start, remove all carriage-returns (i really hate these)
s/\r//g

# also remove whitespace at the beginning of lines
s/^\s*//g

# the @Path annotation of the class itself is always used in
# conjunction with the static class-variable URI_PATH, which we catch
# separately, so we can safely ignore this line
/@Path.*URI_PATH/ d

# the static class-variable holds the global path to this resource. we
# put it in hold-space. this path will stay in hold space (more exactly
# in the first line of the hold space)
/final.*URI_PATH/ {
    /".*"/ {
        s/.*"(.*)".*/\1/
        h
    }
    d
}

# this is a path-annotation of a method receiving rest-requests. the
# path specified is appended to the hold space (the [^P] is to not get
# confused by lines starting with @PathParam)
/^@Path[^P]/ {
    s/.*@Path\("(.*)"\).*/\1/
    H
    d
}

# a new public method (or field) is declared in this line, so we have to
# check if the hold space content is
# describing a rest-resource. this is the case if some line starting
# with a @ is found (we assume, that no comments start with a @)
/public/ {
    g
    /\n@/ {
        # now we have to print a description of the rest-resource.
        # the first line in hold space is the base path (from the
        # @Path annotation of the class itself). the next lines are the
        # description from the comments. the line starting with an @ is the
        # http-method-specifier (for example @GET). and after this
        # line comes the path specified by the @Path-annotation for the
        # method (which is not required for all methods, therefore we put a ?
        # to this part). unfortunately, there exist also methods
        # without a describing comment, so the second part gets a ?, too
        s/^([^\n]*)\n?(.*)?\n(@[^\n]*)\n?(.*)?/\3\t\1\4\n\2/
        p
    }
}

# a new comment starts ...
/\/\*\*/ {
    # in any case, we have to clean up the hold space. only the
    # first line may remain
    g
    s/^([^\n]*)\n.*$/\1/

    # now we add a marker to the hold space, to tell the methods
    # parsing the comments, that we are at the beginning of a comment
    s/$/\n!/

    # put everything in the hold-space and delete the pattern space,
    # so that subsequent checks are not disturbed
    h
    d
}

# when we reach the end of a comment, the marker "!" must be removed
/\*\// {
    g
    s/\n!$//
    h
    d
}

# here we are inside a comment
/^\*/ {
    # if this comment-line is empty or starting with a @ (which means
    # we are inside the parameter-description part), and the "!"
    # marker is still in the hold space, we remove the marker
    /^\*\s*(@.*)?$/ {
        g
        s/\n!$//
        h
    }
    x

    # now we check if the last character in the hold-space is a
    # "!". this tells us, if we are at the beginning of a (perhaps
    # multiline-)comment, or if we are already in the part describing
    # parameters (which is of no interest to us)
    /\n!$/ {
        # first, we remove the "!" character. it will be re-added,
        # when we are finished reading this comment-line
        s/\n!$//

        # next we refetch the current pattern line (which we exchanged
        # with hold-space before)
        x

        # remove the leading stars and whitespaces
        s/^\*\s*/\t\t/

        # then add the line to hold space
        H

        # and finally re-add our marker "!"
        g
        s/$/\n!/
    }

    # before we finish, we have to re-exchange hold- and pattern
    # space, because we swapped them before the \n! check
    x
    d
}

# this is an easy part: if a http-method annotation is present, push
# it to the hold-space
/@(GET|PUT|POST|DELETE)/ {
    H
    d
}

# we finally delete all lines, so that they are not printed by sed in
# default mode (if the script is called with "sed -n", this is not needed)
d' *.java

By the way, this is the command I used to create the cryptic version out of the rather lengthy commented one:

sed -n '/#/ d; s/\s*//g; /^$/ d; 1h; 1!H; ${g;s/\n/;/g;p}'