
Using Storyteller with ASP.Net Core Systems

Continuing my rushed education into ASP.Net Core, today it’s time to talk about how to use Storyteller against ASP.Net Core systems.

As my shop has started to adopt ASP.Net Core on new projects, my team at work has started to translate some of the test automation support tooling we had with FubuMVC to new tooling that targets ASP.Net Core. A couple weeks back I released a new open source library called Alba for xUnit-based integration testing of ASP.Net Core applications. This week it’s on to our new recipe for using Storyteller to author specifications against ASP.Net Core systems.

It isn’t documented anywhere but here (yet), but we’ve created a new Storyteller addon called Storyteller.AspNetCore to provide a quick recipe for ASP.Net Core applications. The first step is to tell Storyteller how to bootstrap your ASP.Net Core system. At its very simplest, you can write this code in your Storyteller specification project (see the getting started documentation for some context on this):

    public class Program
    {
        public static void Main(string[] args)
        {
            // Run the application defined by the Startup class
            AspNetCoreSystem.Run(args);
        }
    }

More likely though, you're going to want to customize the bootstrapping or add other directives. In that case you can subclass AspNetCoreSystem, as in this example:

    public class HelloWorldSystem : AspNetCoreSystem
    {
        public HelloWorldSystem()
        {
            // Bootstrap with the application's Startup class
            UseStartup<Startup>();

            // You can add more directives to the IWebHostBuilder
            // like so:
            Configure(_ => _.UseKestrel());

            // No request should take longer than 250 milliseconds
            RequestPerformanceThresholdIs(250);
        }
    }

Keeping this ridiculously simple, let’s say you have a controller like so:

    [Route("api/[controller]")]
    public class TextController : Controller
    {
        public static int WaitTime = 0;

        [HttpGet]
        public string Get()
        {
            Thread.Sleep(WaitTime);

            // I'm an MVC newb, and I'm sure there's a better way to do this
            HttpContext.Response.Headers.Append("content-type", "text/plain");

            return "Hello, world";
        }
    }

To author specifications against that HTTP endpoint, I wrote a Fixture class that inherits from the new AspNetCoreFixture base class (and gets some help from Alba):

    public class FakeFixture : AspNetCoreFixture
    {

        public FakeFixture()
        {
            Title = "Hello World ASP.Net Core Application";
        }

        public override void SetUp()
        {
            TextController.WaitTime = 0;
        }

        // This is just to fake slow http requests for demonstration purposes
        [FormatAs("If the request takes at least {duration} milliseconds")]
        public void RequestTakes(int duration)
        {
            TextController.WaitTime = duration;
        }

        [FormatAs("The response text from {url} should be '{contents}'")]
        public async Task<string> TheContentsShouldBe(string url)
        {
            var result = await Scenario(_ =>
            {
                _.Get.Url(url);
            });

            return result.ResponseBody.ReadAsText().Trim();
        }
    }

The grammar method TheContentsShouldBe uses Alba to execute an HTTP request to the given url. Using the Fixture above, we can write a specification that looks like this:

[Screenshot: AspNetCoreSpecification, the specification as authored in Storyteller]

Before I show the results of the specification above, note that the HelloWorldSystem I'm using in the sample project sets a performance threshold of 250 milliseconds for any HTTP request. Any HTTP request that exceeds that duration will cause the specification to fail with a performance threshold violation. Knowing that, here's the result of the specification shown above:

[Screenshot: AspNetCoreResults, the results of the specification above]

The initial request incurs some kind of one-time "warmup" hit that trips the performance failure shown above for the first request. My recommendation for ASP.Net Core testing is to run a synthetic request as part of the system initialization just to get that warmup out of the way so it doesn't unnecessarily trip the performance threshold rules.
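Here's a rough sketch of what that kind of warmup request could look like, written against Alba's Scenario() API (introduced in the Alba post below) rather than any particular Storyteller.AspNetCore hook; the WarmUp class and the "/api/text" url are purely illustrative:

    public static class WarmUp
    {
        // Purely illustrative: issue one throwaway request during system
        // initialization so the first "real" request doesn't pay the one time
        // startup cost and trip the performance threshold
        public static Task Execute(SystemUnderTest system)
        {
            return system.Scenario(_ =>
            {
                _.Get.Url("/api/text");
            });
        }
    }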

For more context on the performance within the specification results, switch over to the “Performance” tab:

[Screenshot: AspNetCorePerformance, the Performance tab of the specification results]

The ASP.Net Core requests show up in this table with Type = "Http Request" and the Subject column being the relative url of the request. The red color coding designates performance records that exceeded performance thresholds. The performance tab can be invaluable for understanding where performance problems may be coming from in your end to end specifications — and not just in spotting slow requests. My shop has used this tab to spot "chattiness" problems in some of our specifications where our Javascript clients were making too many requests to the web server, and to identify opportunities to batch requests for a more responsive user interface.

Lastly, a great deal of the challenge in bigger, end to end integration tests is understanding and unraveling failures. To aid in troubleshooting, the new Storyteller.AspNetCore library adds another tab to the Storyteller results to provide some additional context on specifications:

[Screenshot: AspNetCoreRequests, the additional tab in the Storyteller results]

If you're curious, Storyteller pulls this off by using an IStartupFilter behind the scenes to wrap a custom middleware around the rest of the application that feeds information into Storyteller's results.
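To make that mechanism a bit more concrete, here's a rough sketch of the general IStartupFilter pattern (this is not Storyteller's actual code, and RequestRecorder is a made-up stand-in for whatever collects the results):

    // A sketch of the IStartupFilter approach, not Storyteller's real implementation.
    // The filter runs before the application's own Startup.Configure(), so the
    // middleware registered here wraps the rest of the pipeline.
    public class RecordingStartupFilter : IStartupFilter
    {
        public Action<IApplicationBuilder> Configure(Action<IApplicationBuilder> next)
        {
            return app =>
            {
                app.Use(async (context, nextMiddleware) =>
                {
                    var stopwatch = Stopwatch.StartNew();
                    await nextMiddleware();
                    stopwatch.Stop();

                    // RequestRecorder is hypothetical; Storyteller feeds this kind
                    // of information into its own specification results instead
                    RequestRecorder.Record(context.Request.Path, stopwatch.Elapsed);
                });

                // Then let the application build the rest of its pipeline as normal
                next(app);
            };
        }
    }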

Hopefully I’ll be able to complete the documentation on this and some of the other Storyteller extensions we’ve been using at work and get a full 4.2 release out, but that was a bridge too far today;-)

Introducing Alba for integration testing against ASP.Net Core applications

My shop has started to slowly transition from FubuMVC to ASP.Net Core (w/ and w/o MVC) in our web applications. Instead of going full blown Don Quixote and writing my own alternative web framework like I did in 2009, I'm trying to embrace the mainstream and concentrate on tactical additions where I think that makes sense.

I’ve been playing around with a small new project called Alba that seeks to make it easier to write integration tests against HTTP endpoints in ASP.Net Core applications by adapting the “Scenario” testing mechanism from FubuMVC. I’ve pushed up an alpha Nuget (1.0.0-alpha-28) if you’d like to kick the tires on it. Right now it’s very early, but we’re going to try to use it at work for a small trial ASP.Net Core project that just started. I’m also curious to see if anybody is interested in possibly helping out with either coding or just flat out testing it against your own application.

A Quick Example

First, let’s say we have a minimal MVC controller like this one:

    [Route("api/[controller]")]
    public class TextController : Controller
    {
        [HttpGet]
        public string Get()
        {
            // I'm an MVC newb, and I'm sure there's a better way
            HttpContext.Response.Headers
                .Append("content-type", "text/plain");

            return "Hello, world";
        }
    }

With that in place, I can use Alba to write a test that exercises that HTTP endpoint from end to end like this:

    public class examples : IDisposable
    {
        private readonly SystemUnderTest theSystem;

        public examples()
        {
            theSystem = SystemUnderTest.ForStartup<Startup>();
        }

        public void Dispose()
        {
            theSystem.Dispose();
        }


        [Fact]
        public async Task sample_spec()
        {
            var result = await theSystem.Scenario(_ =>
            {
                _.Get.Url("/api/text");
                _.StatusCodeShouldBeOk();
                _.ContentShouldContain("Hello, world");
                _.ContentTypeShouldBe("text/plain");
            });

            // If you so desire, you can interrogate the HTTP
            // response here:
            result.Context.Response.StatusCode
                .ShouldBe(200);
        }
    }

A couple points to note here:

  • The easiest way to tell Alba how to bootstrap your ASP.Net Core application is to pass the Startup type of your application to the SystemUnderTest.ForStartup<T>() method, as shown above in the constructor function of that test fixture class.
  • Alba is smart enough to set up the hosting content path to the base directory of your application project. To make that concrete, say your application is at "src/MyApp," you have a testing project called "src/MyApp.Testing," and you use the standard .Net idiom of using the same name for both the directory and the assembly name. In this case, Alba is able to interrogate your MyApp.Startup type, deduce that the "parallel" folder should be "MyApp," and automatically set the hosting content path to "src/MyApp" if that folder exists. This can of course be overridden.
  • When the Scenario() method is called, it internally builds up a new HttpContext to represent the request, calls the lambda passed into Scenario() to configure that HttpContext object and register any declarative assertions against the expected response, and executes the request using the raw "RequestDelegate" of your ASP.Net Core application. There is no need to be running Kestrel or any other HTTP server to use Alba — but it doesn't hurt anything if Kestrel is running inside of your application.
  • The Scenario() method returns a small object that exposes the HttpContext of the request and a helper object to more easily interrogate the http response body for possible further assertions (see the small sketch just after this list).
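As a small illustration of that last point, here's another test that could slot into the examples class above; ReadAsText() is the same response body helper used in the Storyteller fixture earlier on this page, and the assertions are Shouldly:

        [Fact]
        public async Task interrogate_the_response_directly()
        {
            var result = await theSystem.Scenario(_ =>
            {
                _.Get.Url("/api/text");
            });

            // The raw HttpContext of the completed request
            result.Context.Response.StatusCode.ShouldBe(200);

            // The helper for reading the response body
            result.ResponseBody.ReadAsText().ShouldBe("Hello, world");
        }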

Where would this fit in?

Alba itself isn’t a test runner, just a library that can be used within a testing harness like xUnit.Net or Storyteller to drive an ASP.Net Core application.

One of the things I’m trying to accomplish this quarter at work is to try to come up with some suggestions for how developers should decide which testing approach to take in common scenarios. Right now I’m worried that our automated testing frequently veers off into these two non-ideal extremes:

  1. Excessive mocking in unit tests where the test does very little to ascertain whether or not the code in question would actually work in the real system
  2. End to end tests using Selenium or Project White to drive business and persistence logic by manipulating the actual web application interface. These tests tend to be much more cumbersome to write and laborious to maintain as the user interface changes (especially when the developers don’t run the tests locally before committing code changes).

Alba is meant to live in the middle ground between these two extremes and give our teams an effective way to test directly against HTTP endpoints. These Scenario() tests originally came about in FubuMVC because of how aggressive we were being in moving cross cutting concerns like validation and transaction management to fubu’s equivalent to middleware. Unit testing an HTTP endpoint action was very simple, but you really needed to exercise the entire Russian Doll of attached middleware to adequately test any given endpoint.
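To put a hypothetical example behind that reasoning, say Startup.Configure() registers a small piece of cross cutting middleware like the one below (the "x-correlation-id" header is made up for illustration). A plain unit test against TextController.Get() would never touch it:

    // Hypothetical middleware registered in Startup.Configure(); a unit test
    // of the controller action alone never exercises this
    app.Use(async (context, next) =>
    {
        context.Response.Headers.Append("x-correlation-id", Guid.NewGuid().ToString());
        await next();
    });

A Scenario() test, on the other hand, runs the whole pipeline end to end and will fail if that middleware is missing or broken:

        [Fact]
        public async Task correlation_id_middleware_is_applied()
        {
            var result = await theSystem.Scenario(_ =>
            {
                _.Get.Url("/api/text");
                _.StatusCodeShouldBeOk();
            });

            // The middleware ran as part of the full "Russian Doll"
            result.Context.Response.Headers
                .ContainsKey("x-correlation-id").ShouldBeTrue();
        }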

How is this different than Microsoft.AspNetCore.TestHost?

While I’ve been very critical of Microsoft’s lack of attention to testability in some their development tools, let me give the ASP.Net team some credit here for their TestHost library that comes out of the box. Some of you are going to be perfectly content with TestHost, but Alba already comes with much more functionality for common set up and verifications against HTTP requests. I think Alba can provide a great deal of value to the .Net ecosystem even with an existing solution from Microsoft.

I did borrow a bit of code from an ASP.Net repository that was in turn copy/pasted from the TestHost repository. It's quite possible that Alba ends up using TestHost underneath the covers.