Building a Critter Stack Application: Integration Testing Harness

Hey, did you know that JasperFx Software is ready to offer formal support plans for Marten and Wolverine? Not only are we trying to make the “Critter Stack” tools viable long term options for your shop, we’re also interested in hearing your opinions about the tools and how they should change.

The older parts of the JasperFx / Critter Stack projects are named after itty bitty small towns in SW Missouri, including Alba.

Let’s build a small web service application using the whole “Critter Stack” and their friends, one small step at a time. For right now, the “finished” code is at CritterStackHelpDesk on GitHub.

The posts in this series are:

  1. Event Storming
  2. Marten as Event Store
  3. Marten Projections
  4. Integrating Marten into Our Application
  5. Wolverine as Mediator
  6. Web Service Query Endpoints with Marten
  7. Dealing with Concurrency
  8. Wolverine’s Aggregate Handler Workflow FTW!
  9. Command Line Diagnostics with Oakton
  10. Integration Testing Harness (this post)
  11. Marten as Document Database
  12. Asynchronous Processing with Wolverine
  13. Durable Outbox Messaging and Why You Care!
  14. Wolverine HTTP Endpoints
  15. Easy Unit Testing with Pure Functions
  16. Vertical Slice Architecture
  17. Messaging with Rabbit MQ
  18. The “Stateful Resource” Model
  19. Resiliency

Before I go on with anything else in this series, I think we should establish some automated testing infrastructure for our incident tracking help desk service. While we’re absolutely going to talk about how to structure code with Wolverine to make isolated unit testing of our domain logic as easy as possible, there are some elements of your system’s behavior that are best tested with automated integration tests that use the system’s real infrastructure.

In this post I’m going to show you how I like to set up an integration testing harness for a “Critter Stack” service. I’m going to use xUnit.Net in this post, and while the mechanics would be a little different, I think the basic concepts should be easily transferable to other testing libraries like NUnit or MSTest. I’m also going to bring in the Alba library that we’ll use for testing HTTP calls through our system in memory, but in this first step, all you need to understand is that Alba is helping to set up the system under test in our testing harness.

Heads up: I’m skipping ahead to the “finished” state of the help desk API code in this post, so there are some Marten and Wolverine concepts sneaking in that haven’t been introduced yet.

First, let’s start our new testing project with:

dotnet new xunit

Then add some additional NuGet references:

dotnet add package Shouldly
dotnet add package Alba

That gives us a skeleton of the testing project. Before going on, we need to add a project reference from our new testing project to the entry point project of our help desk API. As we are worried about integration testing right now, we’re going to want the testing project to be able to start the system under test project up by calling the normal Program.Main() entrypoint so that we’re running the application the way that the system is normally configured — give or take a few overrides.
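
Assuming the two projects are named Helpdesk.Api and Helpdesk.Api.Tests (adjust the paths to fit your actual solution layout), that reference can be added from the command line with:

dotnet add Helpdesk.Api.Tests reference Helpdesk.Api/Helpdesk.Api.csproj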

Let’s stop and talk about this a little bit, because I think this is an important point. I think integration tests are more “valid” (i.e., less prone to false positives or false negatives) the more closely they reflect the actual system. I don’t want completely separate bootstrapping for the test harness that may or may not reflect the application’s production bootstrapping (don’t blow that point off; I’ve seen countless teams do partial IoC configuration for testing that can vary quite a bit from the application’s real configuration).

So if you’ll accept my argument that we should be bootstrapping the system under test with its own Program.Main() entry point, our next step is to add this code to the main service to enable the test project to access that entry point:

using System.Runtime.CompilerServices;

// You have to do this in order to reference the Program
// entry point in the test harness
[assembly:InternalsVisibleTo("Helpdesk.Api.Tests")]
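
As a quick aside, if you’d rather keep that attribute out of the C# code, my understanding is that recent .NET SDKs let you express the same thing in the API project’s csproj file instead:

<ItemGroup>
  <!-- Equivalent to the [assembly: InternalsVisibleTo] attribute above -->
  <InternalsVisibleTo Include="Helpdesk.Api.Tests" />
</ItemGroup>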

Switching finally to our testing project, I like to create a class, usually called AppFixture, that manages the lifetime of the system under test within the testing project like so:

// These usings are assumed for the testing code in this post
using Alba;
using Alba.Security;
using Marten;
using Oakton;
using Wolverine;
using Xunit;

public class AppFixture : IAsyncLifetime
{
    public IAlbaHost Host { get; private set; }

    // This is a one time initialization of the
    // system under test before the first usage
    public async Task InitializeAsync()
    {
        // Sorry folks, but this is absolutely necessary if you 
        // use Oakton for command line processing and want to 
        // use WebApplicationFactory and/or Alba for integration testing
        OaktonEnvironment.AutoStartHost = true;

        // This is bootstrapping the actual application using
        // its implied Program.Main() set up
        // This is using a library named "Alba". See https://jasperfx.github.io/alba for more information
        Host = await AlbaHost.For<Program>(x =>
        {
            x.ConfigureServices(services =>
            {
                // We'll be using Rabbit MQ messaging later...
                services.DisableAllExternalWolverineTransports();
                
                // We're going to establish some baseline data
                // for testing
                services.InitializeMartenWith<BaselineData>();
            });
        }, new AuthenticationStub());
    }

    public Task DisposeAsync()
    {
        if (Host != null)
        {
            return Host.DisposeAsync().AsTask();
        }

        return Task.CompletedTask;
    }
}

A few notes about the code above:

  • Alba is using WebApplicationFactory under the covers to bootstrap our help desk API service using the in memory TestServer in place of Kestrel. WebApplicationFactory also allows us to modify the IoC service registrations for our system and override parts of the system’s normal configuration
  • In this case, I’m telling Wolverine to effectively stub out all external transports. In later posts we’ll use Rabbit MQ, for example, to publish messages to an external process, but in this test harness we’re going to turn that off and simply have Wolverine “catch” the outgoing messages in our tests. See Wolverine’s test automation support documentation for more information about this.
  • More on this later, but Marten has a built in facility to establish baseline data sets that can be used in test automation to effectively rewind the database to an initial state with one command
  • The DisposeAsync() method is very important. If you want your integration tests to be repeatable and to run smoothly as you iterate, the tests need to clean up after themselves and not leave locks on resources like ports or files that could stop the next test run from functioning correctly
  • Pay attention to the `OaktonEnvironment.AutoStartHost = true;` call; that’s 100% necessary if your application is using Oakton for command parsing. Sorry.
  • As will inevitably be necessary, I’m using Alba’s facility for stubbing out web authentication, which lets us sidestep pesky authentication infrastructure in functional testing while still passing along user claims as test inputs in individual tests
  • Bootstrapping the IHost for your application can be expensive, so I prefer to share that host across tests whenever possible, and I generally rely on having individual tests establish their inputs at the beginning of each test. See the xUnit.Net documentation on sharing fixtures between tests for more context about the xUnit mechanics; the collection definition that wires this up is shown right after this list.
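
One more bit of xUnit.Net wiring is needed to make that fixture sharing work: a collection definition class that marries the “integration” collection name to AppFixture. It’s nothing but an empty marker class in the testing project:

// Directs xUnit.Net to create a single AppFixture for all test
// classes decorated with [Collection("integration")] and to dispose
// of it after the last of those tests has finished
[CollectionDefinition("integration")]
public class IntegrationCollection : ICollectionFixture<AppFixture>
{
}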

For the Marten baseline data, right now I’m just making sure there’s at least one valid Customer document that we’ll need later:

public class BaselineData : IInitialData
{
    public static Guid Customer1Id { get; } = Guid.NewGuid();
    
    public async Task Populate(IDocumentStore store, CancellationToken cancellation)
    {
        await using var session = store.LightweightSession();
        session.Store(new Customer
        {
            Id = Customer1Id,
            Region = "West Coast",
            Duration = new ContractDuration(DateOnly.FromDateTime(DateTime.Today.Subtract(100.Days())), DateOnly.FromDateTime(DateTime.Today.Add(100.Days())))
        });

        await session.SaveChangesAsync(cancellation);
    }
}
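
The Customer document and ContractDuration types above are defined over in the main API project. Just so you can follow along, a rough sketch of their shapes (treat the exact members as assumptions on my part) looks like:

// A rough sketch only; the real definitions live in the API project
public record ContractDuration(DateOnly Start, DateOnly End);

public class Customer
{
    public Guid Id { get; set; }
    public string Region { get; set; } = string.Empty;
    public ContractDuration? Duration { get; set; }
}

The 100.Days() extension method, by the way, comes from the JasperFx.Core library that Marten brings along for the ride.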

To simplify the usage a little bit, I like to have a base class for integration tests that I usually call IntegrationContext:

[Collection("integration")]
public abstract class IntegrationContext : IAsyncLifetime
{
    private readonly AppFixture _fixture;

    protected IntegrationContext(AppFixture fixture)
    {
        _fixture = fixture;
    }
    
    // more....

    public IAlbaHost Host => _fixture.Host;

    public IDocumentStore Store => _fixture.Host.Services.GetRequiredService<IDocumentStore>();

    async Task IAsyncLifetime.InitializeAsync()
    {
        // Using Marten, wipe out all data and reset the state
        // back to exactly what we described in BaselineData
        await Store.Advanced.ResetAllData();
    }

    // This is required because of the IAsyncLifetime 
    // interface. Note that I do *not* tear down database
    // state after the test. That's purposeful
    public Task DisposeAsync()
    {
        return Task.CompletedTask;
    }

    // This is just delegating to Alba to run HTTP requests
    // end to end
    public async Task<IScenarioResult> Scenario(Action<Scenario> configure)
    {
        return await Host.Scenario(configure);
    }

    // This method allows us to make HTTP calls into our system
    // in memory with Alba, but do so within Wolverine's test support
    // for message tracking to both record outgoing messages and to ensure
    // that any cascaded work spawned by the initial command is completed
    // before passing control back to the calling test
    protected async Task<(ITrackedSession, IScenarioResult)> TrackedHttpCall(Action<Scenario> configuration)
    {
        IScenarioResult result = null;

        // The outer part is tying into Wolverine's test support
        // to "wait" for all detected message activity to complete
        var tracked = await Host.ExecuteAndWaitAsync(async () =>
        {
            // The inner part here is actually making an HTTP request
            // to the system under test with Alba
            result = await Host.Scenario(configuration);
        });

        return (tracked, result);
    }
}

The first thing I want to draw your attention to is the call to await Store.Advanced.ResetAllData(); in the InitializeAsync() method that will be called before each of our integration tests executes. In my approach, I strongly prefer to reset the state of the database before each test in order to start from a known system state. I’m also assuming that each test, if necessary, will add any additional state it needs to the system’s Marten database. Philosophically, this is what I’ve long called “Self-Contained Tests.” I also think it’s important to have the tests leave the database state alone after a test run so that if you are running tests one at a time, you can use the leftover database state to help troubleshoot why a test might have failed.
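
To make that concrete, a test that needs more than the baseline data would just arrange that state at the top of its own test body, something like this hypothetical sketch:

[Fact]
public async Task some_test_that_needs_extra_state()
{
    // Arrange any extra state this particular test needs on top of
    // the BaselineData state that InitializeAsync() just reset us to
    using var session = Store.LightweightSession();
    session.Store(new Customer { Id = Guid.NewGuid(), Region = "East Coast" });
    await session.SaveChangesAsync();

    // ...then exercise the system under test and make assertions
}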

Other folks will try to spin up a separate database (maybe with TestContainers) or even a completely separate IHost per test, but I think that the cost of doing it that way is just too slow. I’d rather reset the system between tests than incur the cost of recycling database containers and/or the system’s IHost. This comes at the cost of forcing your test suite to run its integration tests serially, but I also think that xUnit.Net is not the best possible tool for parallel test runs anyway, so I’m not sure you lose much there.
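
By default, xUnit.Net will happily run separate test collections in parallel, so if you want to be explicit about serializing the test runs, you can add an xunit.runner.json file to the test project (set to be copied to the output directory) like this:

{
  "$schema": "https://xunit.net/schema/current/xunit.runner.schema.json",
  "parallelizeTestCollections": false
}

Strictly speaking, putting all of the integration tests into the single “integration” collection already keeps them from running in parallel with each other; the setting above just makes that behavior global.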

And now for an actual test. Earlier in this series we built an HTTP endpoint that can process a LogIncident command and create a new event stream for the new Incident with a single IncidentLogged event. I’ve skipped ahead a little bit and added a requirement that we capture a user id from an expected Claim on the ClaimsPrincipal for the current request, which you’ll see reflected in the test below:

public class log_incident : IntegrationContext
{
    public log_incident(AppFixture fixture) : base(fixture)
    {
    }

    [Fact]
    public async Task create_a_new_incident()
    {
        // We'll need a user
        var user = new User(Guid.NewGuid());
        
        // Log a new incident by calling the HTTP
        // endpoint in our system
        var initial = await Scenario(x =>
        {
            var contact = new Contact(ContactChannel.Email);
            x.Post.Json(new LogIncident(BaselineData.Customer1Id, contact, "It's broken")).ToUrl("/api/incidents");
            x.StatusCodeShouldBe(201);
            
            x.WithClaim(new Claim("user-id", user.Id.ToString()));
        });

        var incidentId = initial.ReadAsJson<NewIncidentResponse>().IncidentId;

        using var session = Store.LightweightSession();
        var events = await session.Events.FetchStreamAsync(incidentId);

        // FetchStreamAsync() returns IEvent wrappers, so the actual
        // event payload is on the Data property
        var logged = events.First().Data.ShouldBeOfType<IncidentLogged>();

        // This deserves more assertions, but you get the point...
        logged.CustomerId.ShouldBe(BaselineData.Customer1Id);
    }
}
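
We didn’t actually need the TrackedHttpCall() helper for that test, but to foreshadow the later posts on asynchronous processing, here’s a hypothetical sketch of how it could be used from a class extending IntegrationContext to assert on messages published as a side effect of an HTTP call. To be clear, the IncidentNotification message type below is made up purely for illustration:

[Fact]
public async Task hypothetical_test_with_message_tracking()
{
    var user = new User(Guid.NewGuid());

    // Execute the HTTP request, while also waiting for all of the
    // message activity that Wolverine detects to complete
    var (tracked, _) = await TrackedHttpCall(x =>
    {
        var contact = new Contact(ContactChannel.Email);
        x.Post.Json(new LogIncident(BaselineData.Customer1Id, contact, "It's broken"))
            .ToUrl("/api/incidents");
        x.StatusCodeShouldBe(201);
        x.WithClaim(new Claim("user-id", user.Id.ToString()));
    });

    // Interrogate the recorded messaging activity through Wolverine's
    // ITrackedSession, e.g. finding the single message of an expected type
    // (again, IncidentNotification is a made up type for this sketch)
    var notification = tracked.Sent.SingleMessage<IncidentNotification>();
    notification.ShouldNotBeNull();
}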

Summary and What’s Next

The “Critter Stack” core team and our community care very deeply about effective testing, so we’ve invested from the very beginning in making integration testing as easy as possible with both Marten and Wolverine.

Alba is another little library from the JasperFx family that makes it easier to write integration tests at the HTTP layer, and it’s perfect for integration testing your web services. I definitely find it advantageous to be able to quickly bootstrap a web service project and run tests completely in memory on demand. That’s a much easier and quicker feedback cycle than trying to deploy the service and writing tests that remotely interact with the web service over HTTP. And I shouldn’t even have to mention how absurdly slow it is in comparison to try to test the same web service functionality through the actual user interface with something like Selenium.

From the Marten side of things, PostgreSQL has a small Docker image, so it’s pretty painless to spin up on development boxes. Especially contrasted with situations where development teams share a centralized development database (shudder, I hope not many folks still do that), having an isolated database for each developer that they can tear down and rebuild at will certainly makes it a lot easier to succeed with automated integration testing.

I think that document databases in general are a lot easier to deal with in automated testing than a relational database fronted by an ORM, as there’s much less friction in setting up database schemas or tearing down database state. Marten goes a step further than most persistence tools by having built in APIs to tear down database state or reset to baseline data sets in between tests.

We’ll dig deeper into Wolverine’s integration testing support later in this series with message handler testing, testing handlers that in turn spawn other messages, and dealing with external messaging in tests.

I think the next post is just going to be a quick survey of “Marten as Document Database” before I get back to Wolverine’s HTTP endpoint model.
