Integration Testing: IHost Lifecycle with xUnit.Net

I’m part of an initiative at work to analyze and ultimately improve our test automation practices. As part of that work, I’ll be blogging quite a bit about test automation starting with my brain dump on test automation last week and my most recent post on mocks and stubs last month. From here on out, I’m curating all of my posts and selected writings from other folks on my new Automated Testing page.

I’m already on record as saying that the generic host (IHost) in recent versions of .Net is one of the best things that’s ever happened to the .Net ecosystem. In my previous post I stated that I strongly prefer having the system under test running in process with the test harness for faster feedback cycles and easier debugging. The generic host builder introduced in .Net Core turns out to be a very effective way to bootstrap your system within automated test harnesses.

Before I dive into how to use the IHost in automated testing, here are a couple of issues I think you have to address in your integration testing strategy before we go willy-nilly spinning up an IHost:

  • You ideally want to test against your code running in a realistic way, so the way code is bootstrapped and configured should be relatively close to how that code is started up in the real application.
  • There will inevitably need to be at least some configuration that needs to be different in testing or some services — usually accessing resources external to your system — that need to be replaced with stubs or some other kind of fake implementation.
  • It’s important to cleanly dispose of or shut down any IHost object you create in memory to avoid potential locks on resources like database connections, files, or ports. Failing to clean up resources in tests can easily make it harder to iterate through test fixes if you find yourself needing to manually kill processes or restart your IDE to release locked resources (been there, done that).
  • The IHost can be expensive to build up, and sometimes there’s going to be some serious benefit in reusing the IHost between tests to make the test suite run faster.
  • But the IHost is stateful, and there could easily be resources (singleton scoped services, databases, and whatnot) that could impact later test runs in the suite.

Before I jump into solutions, let’s assume that I have two projects:

  1. WebApplication is an ASP.Net Core web service project. WebApplication uses Lamar as its underlying DI container.
  2. A test project that references WebApplication

xUnit.Net Mechanics

I’m more comfortable with xUnit.Net, so I’m going to use that first. My typical usage is to share the IHost through xUnit.Net’s CollectionFixture mechanism (and if you think the usage of this thing is confusing, welcome to the club). First up, I’ll build out a new class I usually call AppFixture to manage the lifecycle of the IHost. The example project I’ve built here is an ASP.Net Core web service project, so I’m going to use Alba to wrap the host inside of AppFixture as shown below:

    public class AppFixture : IDisposable, IAsyncLifetime
    {
        public IAlbaHost Host { get; private set; }
        public async Task InitializeAsync()
        {
            // Program.CreateHostBuilder() is the code from the WebApplication
            // that configures the HostBuilder for the system
            Host = await Program
                .CreateHostBuilder(Array.Empty<string>())
                
                // This extension method starts up the underlying IHost,
                // but Alba replaces Kestrel with a TestServer and
                // wraps the IHost
                .StartAlbaAsync();
        }

        public Task DisposeAsync()
        {
            return Host.StopAsync();
        }

        public void Dispose()
        {
            Host?.Dispose();
        }
    }

A couple things to note in that code above:

  • As we’ll set up next, that class above will be constructed once in memory by xUnit and shared between test fixture classes
  • The DisposeAsync() method stops the IHost when the test suite is finished, and Dispose() disposes it. By normal .Net mechanics, disposing the IHost will also dispose the underlying Lamar IoC container, which will in turn dispose any services created by Lamar at runtime that implement IDisposable. Shutting down the IHost also stops any registered IHostedService services that your application may be using for long running tasks (for my colleagues who may be reading this, both NServiceBus and MassTransit start and stop their message listeners in an IHostedService, so that might be in use even if you don’t explicitly use that technique).

Next, we’ll set up AppFixture to be shared between our integration test classes by using the [CollectionDefinition] attribute on a marker class:

    [CollectionDefinition("Integration")]
    public class AppFixtureCollection : ICollectionFixture<AppFixture>
    {
        
    }

Lastly, I like to build out a base class for integration tests like this one:

    [Collection("Integration")]
    public abstract class IntegrationContext
    {
        protected IntegrationContext(AppFixture fixture)
        {
            theHost = fixture.Host;
            
            // I am using Lamar as the underlying DI container
            // and want some Lamar specific things later on
            // in the tests
            Container = (IContainer)fixture.Host.Services;
        }

        public IAlbaHost theHost { get; }
        
        public IContainer Container { get; }
    }

The [Collection] attribute is meaningful here because it makes xUnit.Net run all the tests contained in test fixture classes that inherit from IntegrationContext serially on a single thread, so we don’t have to worry about concurrent test runs.*

And finally to bring this all together, let’s say that WebApplication has this simplistic web service code to do some arithmetic:

    public class Result
    {
        public int Sum { get; set; }
        public int Product { get; set; }
    }

    public class Numbers
    {
        public int[] Values { get; set; }
    }
    
    public class ArithmeticController : ControllerBase
    {
        [HttpPost("/math")]
        public Result DoMath([FromBody] Numbers input)
        {
            var product = 1;
            foreach (var value in input.Values)
            {
                product *= value;
            }

            return new Result
            {
                Sum = input.Values.Sum(),
                Product = product
            };
        }
    }

In the next code block, let’s finally see a test fixture class that uses the new IntegrationContext as a base class and tests the HTTP endpoint shown in the block above.

    public class ArithmeticApiTests : IntegrationContext
    {
        public ArithmeticApiTests(AppFixture fixture) : base(fixture)
        {
        }

        [Fact]
        public async Task post_to_the_math_endpoint()
        {
            // Building the input body
            var input = new Numbers
            {
                Values = new[] {2, 3, 4}
            };

            var response = await theHost.Scenario(x =>
            {
                // Alba deals with Json serialization for us
                x.Post.Json(input).ToUrl("/math");
                
                // Enforce that the HTTP Status Code is 200 Ok
                x.StatusCodeShouldBeOk();
            });

            var output = response.ReadAsJson<Result>();
            output.Sum.ShouldBe(9);
            output.Product.ShouldBe(24);
        }
    }

Alright, at this point we’ve got a way to share the system’s IHost in tests for better efficiency, and we’re making sure that all the resources in the IHost are cleaned up when the test suite is done. We’re using the WebApplication’s exact configuration for the IHost, but we still might need to alter that in testing. And there’s also the issue of needing to roll back state in our system between tests. I’ll pick up those subjects in my next couple posts, as well as using NUnit instead of xUnit.Net because that’s what the majority of code at my work uses for testing.

* It would be nice to be able to run parallel tests using our shared IHost, but that can often be problematic because of shared state, so I generally bypass test parallelization in integrated tests. The subject of parallelizing integration tests is worthy of a later blog post of some thoughts I haven’t quite elucidated yet.
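
If you want to be explicit about turning off parallelization, xUnit.Net also has an assembly-level switch that applies to the whole test project. The snippet below is just a sketch of that option; it isn’t strictly necessary with the collection approach shown above:

    // Put this attribute in any C# file in the test project to make
    // xUnit.Net run all test collections serially instead of in parallel
    using Xunit;

    [assembly: CollectionBehavior(DisableTestParallelization = true)]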

Lamar v6 is out! Expanded interception options, overriding test services, better documentation

Lamar is a modern IoC container library that is optimized for usage within .Net Core / .Net 5/6 applications and natively implements the .Net DI abstractions (AddTransient/Singleton/Scoped() etc). Lamar was specifically built to be a much higher performance version of the old StructureMap library with similar usage for relatively easy migration.

Lamar v6.0.0 went live today. It’s not a very large release at all, but I took this opportunity to tuck some previously public types into being internal to clean up your Intellisense and eliminated some obsolete public members. I don’t anticipate any code changes for the vast majority of Lamar users upgrading.

The highlights of this release are:

  • Lamar’s documentation website got a big facelift as it moved from an old Bootstrap theme with my old stdocs tool to using Vitepress and mdsnippets. I also did a sweep to remove old StructureMap nomenclature and replace that with the newer Lamar nomenclature that very purposely mimics the .Net DI abstraction terminology.
  • New functionality to reliably override service registrations at container creation time. This was specifically meant for automated integration testing where you may want to override specific service registrations. This hasn’t been a straightforward thing in the past because the generic host builder applies service registrations from Startup classes last no matter what, and that defeated many efforts to override services in testing.
  • AaronAllBright contributed an important pull request for some missing activation features. That was expanded into a full blown interception and activation feature set to go along with the support for decorators in Lamar v1+. That’s been a frequent concern for folks looking to move off of StructureMap and one of the biggest remaining features in Lamar’s backlog, so mark that one off.
  • A couple of bug fixes, only one of which I’d consider big.
  • I stopped and modernized the devops a bit to replace the old Rake build with Bullseye & SimpleExec. I also moved the CI from AppVeyor to GitHub actions. I wouldn’t say that either move was a big deal, but it is nice having the CI information right in the GitHub website.

Dynamic Code Generation in Marten V4

Marten V4 extensively uses runtime code generation backed by Roslyn runtime compilation for dynamic code. This is much more powerful than source generators in what it allows us to actually do, but it can have significant memory usage and “cold start” problems (these seem to depend on the exact configuration, so it’s not a given that you’ll hit them). In this post I’ll show the facility we have to “generate ahead” the code to dodge the memory and cold start issues at production time.

Before V4, Marten had used the common model of building up Expression trees and compiling them into lambda functions at runtime to “bake” some of our dynamic behavior into fast running code. That does work and a good percentage of the development tools you use every day probably use that technique internally, but I felt that we’d already outgrown the dynamic Expression generation approach as it was, and the new functionality requested in V4 was going to substantially raise the complexity of what we were going to need to do.

Instead, I (Marten is a team effort, but I get all the blame for this one) opted to use the dynamic code generation and compilation approach using LamarCodeGeneration and LamarCompiler that I’d originally built for other projects. This allowed Marten to generate much more complex code than I thought was practical with other models (we could have used IL generation too of course, but that’s an exercise in masochism). If you’re interested, I gave a talk about these tools and the approach at NDC London 2019.

I do think this has worked out in terms of performance improvements at runtime and certainly helped to be able to introduce the new document and event store metadata features in V4, but there’s a catch. Actually two:

  1. The Roslyn compiler sucks down a lot of memory sometimes and doesn’t seem to ever release it. It’s gotten better with newer releases and it’s not consistent, but still.
  2. There’s a sometimes significant lag in cold start scenarios on the first time Marten needs to generate and compile code at runtime

What we could do, though, is provide what I call…

Marten’s “Generate Ahead” Strategy

To side step the problems with the Roslyn compilation, I developed a model (I did this originally in Jasper) to generate the code ahead of time and have it compiled into the entry assembly for the system. The last step is to direct Marten to use the pre-compiled types instead of generating the types at runtime.

Jumping straight into a sample console project to show off this functionality, I’m configuring Marten with the AddMarten() method you can see in this code on GitHub.

The important line of code you need to focus on here is this flag:

opts.GeneratedCodeMode = TypeLoadMode.LoadFromPreBuiltAssembly;

This flag in the Marten configuration directs Marten to first look in the entry assembly of the application for any types that it would normally try to generate at runtime, and if that type exists, load it from the entry assembly and bypass any invocation of Roslyn. I think in a real application you’d wrap that call in something like this so that it only applies when the application is running in production mode:

if (Environment.IsProduction())
{

    options.GeneratedCodeMode = 
        TypeLoadMode.LoadFromPreBuiltAssembly;
}

The next thing to notice is that I have to tell Marten ahead of time what the possible document types are, and even about any compiled query types, in this code so that Marten will “know” what code to generate in the next section. The compiled query registration is new, but you already had to let Marten know about the document types to make the schema migration functionality work anyway.
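
Pulling that together with the earlier flag, a minimal sketch of the Marten configuration might look something like the code below. Assume this lives in a Startup.ConfigureServices() method where Configuration and Environment are available, and that User and Order are stand-ins for your real document types:

services.AddMarten(opts =>
{
    opts.Connection(Configuration.GetConnectionString("Marten"));

    // Tell Marten up front which document types it will need to build
    // storage code for, so the codegen command knows what to generate
    opts.RegisterDocumentType<User>();
    opts.RegisterDocumentType<Order>();

    // Only use the pre-generated types in production so that developers
    // can still iterate freely at development time
    if (Environment.IsProduction())
    {
        opts.GeneratedCodeMode = TypeLoadMode.LoadFromPreBuiltAssembly;
    }
});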

Generating and exporting the code ahead of time is done from the command line through an Oakton command. First though, add the LamarCodeGeneration.Commands Nuget to your entry project, which will also add a transitive reference to Oakton if you’re not already using it. This is all described in the Oakton getting started page, but you’ll need to change your Program.Main() method slightly to activate Oakton:

        // The return value needs to be Task<int> 
        // to communicate command success or failure
        public static Task<int> Main(string[] args)
        {
            return CreateHostBuilder(args)
                
                // This makes Oakton be your CLI
                // runner
                .RunOaktonCommands(args);
        }

If you’ll open up the command terminal of your preference at the root directory of the entry project, type this command to see the available Oakton commands:

dotnet run -- help

That’ll spit out a list of commands and the assemblies where Oakton looked for command types. You should see output similar to this:

Searching 'LamarCodeGeneration.Commands, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null' for commands
Searching 'Marten.CommandLine, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null' for commands

  -------------------------------------------------
    Available commands:
  -------------------------------------------------
        check-env -> Execute all environment checks against the application
          codegen -> Utilities for working with LamarCodeGeneration and LamarCompiler

and assuming you got that far, now type dotnet run -- help codegen to see the list of options with the codegen command.

If you just want to preview the generated code in the console, type:

dotnet run -- codegen preview

To just verify that the dynamic code can be successfully generated and compiled, use:

dotnet run -- codegen test

To actually export the generated code, use:

dotnet run -- codegen write

That command will write a single C# file at /Internal/Generated/DocumentStorage.cs for any document storage types and another at /Internal/Generated/Events.cs for the event storage and projections.

Just as a short cut to clear out any old code, you can use:

dotnet run -- codegen delete

If you’re curious, the generated code — and remember that it’s generated code so it’s going to be butt ugly — is going to look like this for the document storage, and this code for the event storage and projections.

The way that I see this being used is something like this:

  • The LoadFromPreBuiltAssembly option is only turned on in Production mode and left disabled at development time, so that developers can iterate at will while working locally.
  • As part of the CI/CD process for a project, you’ll run the dotnet run -- codegen write command as an initial step, then proceed to the normal compilation and testing cycles. That will bake the generated code right into the compiled assemblies and enable you to also take advantage of any kind of ahead-of-time compiler optimizations

Duh. Why not source generators doofus?

Why didn’t we use source generators you may ask? The Roslyn-based approach in Marten is both much better and much worse than source generators. Source generators are part of the compilation process itself and wouldn’t have any kind of cold start problem like Marten has with the runtime Roslyn compilation approach. That part is obviously much better, plus there’s no weird two step compilation process at CI/CD time. But on the downside, source generators can only use information that’s available at compilation time, and the Marten code generation relies very heavily on type reflection, conventions applied at runtime, and configuration built up through a fluent interface API. I do not believe that we could use source generators for what we’ve done in Marten because of that dependency on runtime information.

Using Alba to Test ASP.Net Services

TL;DR: Alba is a handy library for integration tests against ASP.Net web services, and the rapid feedback that enables is very useful

One of our projects at Calavista right now is helping a client modernize and optimize a large .Net application, with the end goal being everything running on .Net 5 and an order of magnitude improvement in system throughput. As part of the effort to upgrade the web services, I took on a task to replace this system’s usage of IdentityServer3 with IdentityServer4, but still use the existing Marten-backed data storage for user membership information.

Great, but there’s just one problem. I’ve never used IdentityServer4 before and it changed somewhat between the IdentityServer3 code I was trying to reverse engineer and its current model. I ended up getting through that work just fine. A key element of doing that was using the Alba library to create a test harness so I could iterate through configuration changes quickly by rerunning tests on the new IdentityServer4 project. It didn’t start out this way, but Alba is essentially a wrapper around the ASP.Net TestServer and just acts as a utility to make it easier to write automated tests around the HTTP services in your web service projects.

I ended up starting two new .Net projects:

  1. A new web service that hosts IdentityServer4 and is configured to use user membership information from our client’s existing Marten/Postgresql database
  2. A new xUnit.Net project to hold integration tests against the new IdentityServer4 web service.

Let’s dive right into how I set up Alba and xUnit.Net as an automated test harness for our new IdentityServer4 service. If you start a new ASP.Net project with one of the built in project templates, you’ll get a Program file that’s the main entry point for the application and a Startup class that has most of the system’s bootstrapping configuration. The templates will generate this method that’s used to configure the IHostBuilder for the application:

public static IHostBuilder CreateHostBuilder(string[] args)
{
    return Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.UseStartup<Startup>();
        });
}

For more information on the role of the IHostBuilder within your application, see .NET Generic Host in ASP.NET Core.

That’s important, because that gives us the ability to stand up the application exactly as it’s configured in an automated test harness. Switching to the new xUnit.Net test project, I referenced my new web service project that will host IdentityServer4. Because spinning up your ASP.Net system can be relatively expensive, I only want to do that once and share the IHost between tests. That’s a perfect usage for xUnit.Net’s shared context support.

First I make what will be the shared test fixture context class for the integration tests shown below:

public class AppFixture : IDisposable
{
    public AppFixture()
    {
        // This is calling into the actual application's
        // configuration to stand up the application in 
        // memory
        var builder = Program.CreateHostBuilder(new string[0]);

        // This is the Alba "SystemUnderTest" wrapper
        System = new SystemUnderTest(builder);
    }
    
    public SystemUnderTest System { get; }

    public void Dispose()
    {
        System?.Dispose();
    }
}

The Alba SystemUnderTest wrapper is responsible for building the actual IHost object for your system, and does so using the in memory TestServer in place of Kestrel.

Just as a convenience, I like to create a base class for integration tests I tend to call IntegrationContext:

    public abstract class IntegrationContext 
        // This is some xUnit.Net mechanics
        : IClassFixture<AppFixture>
    {
        public IntegrationContext(AppFixture fixture)
        {
            System = fixture.System;
            
            // Just casting the IServiceProvider of the underlying
            // IHost to a Lamar IContainer as a convenience
            Container = (IContainer)fixture.System.Services;
        }
        
        public IContainer Container { get; }
        
        // This gives the tests access to run
        // Alba "Scenarios"
        public ISystemUnderTest System { get; }
    }

We’re using Lamar as the underlying IoC container in this application, and I wanted to use Lamar-specific IoC diagnostics in the tests, so I expose the main Lamar container off of the base class as just a convenience.

Finally turning to the tests, the very first thing to try with IdentityServer4 was to hit the descriptive discovery endpoint just to see if the application was bootstrapping correctly and IdentityServer4 was functional *at all*. I started a new test class with this declaration:

    public class EndToEndTests : IntegrationContext
    {
        private readonly ITestOutputHelper _output;
        private IDocumentStore theStore;

        public EndToEndTests(AppFixture fixture, ITestOutputHelper output) : base(fixture)
        {
            _output = output;

            // I'm grabbing the Marten document store for the app
            // to set up user information later
            theStore = Container.GetInstance<IDocumentStore>();
        }

And then a new test just to exercise the discovery endpoint:

[Fact]
public async Task can_get_the_discovery_document()
{
    var result = await System.Scenario(x =>
    {
        x.Get.Url("/.well-known/openid-configuration");
        x.StatusCodeShouldBeOk();
    });
    
    // Just checking out what's returned
    _output.WriteLine(result.ResponseBody.ReadAsText());
}

The test above is pretty crude. All it does is try to hit the `/.well-known/openid-configuration` url in the application and see that it returns a 200 OK HTTP status code.

I tend to run tests while I’m coding by using keyboard shortcuts. Most IDEs support some kind of “re-run the last test” keyboard shortcut. Using that, my preferred workflow is to run the test once, then assuming that the test is failing the first time, work in a tight cycle of making changes and constantly re-running the test(s). This turned out to be invaluable as it took me a couple iterations of code changes to correctly re-create the old IdentityServer3 configuration into the new IdentityServer4 configuration.

Moving on to doing a simple authentication, I wrote a test like this one to exercise the system with known credentials:

[Fact]
public async Task get_a_token()
{
    // All I'm doing here is wiping out the existing database state
    // with Marten and using a little helper method to add a new
    // user to the database as part of the test setup
    // This would obviously be different in your system
    theStore.Advanced.Clean.DeleteAllDocuments();
    await theStore
        .CreateUser("aquaman@justiceleague.com", "dolphin", "System Administrator");
    
    
    var body =
        "client_id=something&client_secret=something&grant_type=password&scope=ourscope%20profile%20openid&username=aquaman@justiceleague.com&password=dolphin";

    // The test would fail here if the status
    // code was NOT 200
    var result = await System.Scenario(x =>
    {
        x.Post
            .Text(body)
            .ContentType("application/x-www-form-urlencoded")
            .ToUrl("/connect/token");
    });

    // As a convenience, I made a new class called ConnectData
    // just to deserialize the IdentityServer4 response into 
    // a .Net object
    var token = result.ResponseBody.ReadAsJson<ConnectData>();
    token.access_token.Should().NotBeNull();
    token.token_type.Should().Be("Bearer");
}

public class ConnectData
{
    public string access_token { get; set; }
    public int expires_in { get; set; }
    public string token_type { get; set; }
    public string scope { get; set; }
    
}

Now, this test took me several iterations to work through as I found exactly the right way to configure IdentityServer4 and adjusted our custom Marten-backed identity store (IResourceOwnerPasswordValidator and IProfileService in IdentityServer4 world) until the tests passed. I found it extremely valuable to be able to debug right into the failing tests as I worked, and I even needed to take advantage of JetBrains Rider’s capability to debug through external code to understand how IdentityServer4 itself worked. I’m very sure that I was able to get through this work much faster by iterating through tests as opposed to just trying to run the application and driving it through something like Postman or through the connected user interface.

Oakton v3 super charges the .Net Core/5 command line, and helps Lamar deliver uniquely useful IoC diagnostics

With the Great Texas Ice and Snow Storm of 2021 last week playing havoc with basically everything, I switched to easier work on Oakton and Lamar in the windows of time when we had power. The result of that is the planned v3 release that merged the former Oakton.AspNetCore library into Oakton itself, and took a dependency on Spectre.Console to make all the command output a whole lot more usable. As a logical follow up to the Oakton improvements, I mostly rewrote the Lamar.Diagnostics package that uses Oakton to add Lamar-related troubleshooting commands to your application. I used Spectre.Console to improve how the Lamar configuration was rendered to enhance its usability. I wanted the improved Lamar diagnostics for an ongoing Calavista client project, and that’ll be in heavy usage this very week.

Late last week I released version 3.0.1 of Oakton, a library for building rich command line functionality into .Net Core / .Net 5.0 applications. While competent command line parsing tools are a dime a dozen in the .Net space by this point, Oakton stands out by integrating directly into the generic HostBuilder and extending the command line operations of your currently configured .Net system (Oakton can still be used without HostBuilder as well, see Using the CommandExecutor for more information).

Jumping right into the integration effort, let’s assume that you have a .Net Core 3.1 or .Net 5 application generated by one of the built in project templates. If so, your application’s entry point probably looks like this Program.Main() method:

public static void Main(string[] args)
{
    CreateHostBuilder(args).Build().Run();
}

To integrate Oakton commands, add a reference to the Oakton Nuget, then change Program.Main() to this:

public static Task<int> Main(string[] args)
{
    return CreateHostBuilder(args)
        .RunOaktonCommands(args);
}

It’s important to change the return type to Task<int> because that’s used to communicate exit codes to the operating system to denote success or failure for commands that do some kind of system verification.

When you use dotnet run, you can pass arguments or flags to the dotnet run command itself, and also arguments and flags to your actual application. The way that works is that arguments are delimited by a double dash, so that in this usage: dotnet run --framework net5.0 -- check-env, the --framework flag is passed to dotnet run itself to modify its behavior, and the check-env argument is passed to your application.

Out of the box, Oakton adds a couple named commands to your application along with a default “run my application” command. You can access the built in Oakton command line help with dotnet run -- help. The output of that is shown below:

   ---------------------------------------------------------------------
     Available commands:
   ---------------------------------------------------------------------
     check-env -> Execute all environment checks against the application
      describe -> Writes out a description of your running application to either the console or a file
          help -> list all the available commands
           run -> Start and run this .Net application
   --------------------------------------------------------------------- 

Slightly Enhanced “Run”

Once Oakton is in place, your basic dotnet run command has a couple new flag options.

  1. dotnet run still starts up your application just like it did before you applied Oakton
  2. dotnet run -- -e Staging using the -e flag starts the application with the host environment name overridden to the value of the flag. In this example, I started the application running as “Staging.”
  3. dotnet run -- --check or dotnet run -- -c will run your application, but first perform any registered environment checks before starting the full application. More on this below.

Environment Checks

Oakton has a simple mechanism to embed environment checks directly into your application. The point of environment checks is to make your .Net application able to check out its configuration during deployment in order to “fail fast” if external dependencies like databases are unavailable, required files are missing, or necessary configuration items are invalid.

To use environment checks in your application with Oakton, just run this command:

dotnet run -- check-env

The command will fail if any of the environment checks fail, and you’ll get a report of every failure.
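
For completeness, registering a custom environment check is just another service registration. This is only a rough sketch — I’m assuming Oakton’s CheckEnvironment() extension method here, and the exact overloads may vary between Oakton versions; DatabaseSettings is an illustrative settings class, not part of Oakton:

// In Startup.ConfigureServices() or wherever you build the IServiceCollection
services.CheckEnvironment("Can open a connection to the application database", s =>
{
    var settings = s.GetRequiredService<DatabaseSettings>();
    using var conn = new SqlConnection(settings.ConnectionString);

    // If this throws, "dotnet run -- check-env" reports the failure
    conn.Open();
});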

I wrote about this a couple years ago, and Oakton v3 just makes the environment check output a little prettier. If you’re using Lamar as your application’s IoC container, the Lamar.Diagnostics package adds some easier recipes for exposing environment checks in your code in combination with Oakton’s check-env command. I’ll show that later in this post.

The environment check functionality can be absolutely invaluable when your application is being deployed to an environment that isn’t perfectly reliable or has dependencies outside the control of your team. Ask me how I know this to be the case…

Describing the Application

A couple of the OSS projects I support are very sensitive to application configuration, and it’s sometimes challenging to get the right information from users who are having problems with my OSS libraries. As a possible way to ameliorate that issue, there’s Oakton’s extensible describe command:

dotnet run -- describe

Out of the box it just lists basic information about the application and referenced assemblies, but it comes with a simple extensibility mechanism I’m hoping to exploit for embedding Jasper, Marten, and Lamar diagnostics in applications.

Here’s the sample output from a simplistic application built with the worker template:

── About WorkerService ─────────────────────────────────────────
          Entry Assembly: WorkerService
                 Version: 1.0.0.0
        Application Name: WorkerService
             Environment: Development
       Content Root Path: the content root folder
AppContext.BaseDirectory: the application's base directory


── Referenced Assemblies ───────────────────────────────────────
┌───────────────────────────────────────────────────────┬─────────┐
│ Assembly Name                                         │ Version │
├───────────────────────────────────────────────────────┼─────────┤
│ System.Runtime                                        │ 5.0.0.0 │
│ Microsoft.Extensions.Configuration.UserSecrets        │ 5.0.0.0 │
│ Microsoft.Extensions.Hosting.Abstractions             │ 5.0.0.0 │
│ Microsoft.Extensions.DependencyInjection.Abstractions │ 5.0.0.0 │
│ Microsoft.Extensions.Logging.Abstractions             │ 5.0.0.0 │
│ Oakton                                                │ 3.0.2.0 │
│ Microsoft.Extensions.Hosting                          │ 5.0.0.0 │
└───────────────────────────────────────────────────────┴─────────┘

Lamar Diagnostics

As an add-on to Oakton, I updated the Lamar.Diagnostics package that uses Oakton to add Lamar diagnostics to a .Net application. If this package is added to a .Net project that uses Oakton for command line parsing and Lamar as its IoC container, you’ll see new commands light up from dotnet run -- help:

    lamar-scanning -> Runs Lamar's type scanning diagnostics
    lamar-services -> List all the registered Lamar services
    lamar-validate -> Runs all the Lamar container validations

The lamar-scanning command just gives you access to Lamar’s type scanning diagnostics report (something my team & I needed post haste for an ongoing Calavista client project).

The lamar-validate command runs the AssertConfigurationIsValid() method on your application’s configured container to validate that every service registration is valid, can be built, and also tries out Lamar’s own built in environment check system. You can read more about what this method does in the Lamar documentation.

You can also add the Lamar container validation step into the Oakton environment check system with this service registration:

// This adds a Container validation
// to the Oakton "check-env" command
// "services" would be an IServiceCollection object
// either in StartUp.ConfigureServices() or in a Lamar
// ServiceRegistry
services.CheckLamarConfiguration();

As I said earlier, Lamar still supports a very old StructureMap feature to embed environment checks right into your application’s services with the [ValidationMethod] attribute. Let’s say that you have a class in your system called DatabaseUsingService that requires a valid connection string from configuration, and DatabaseUsingService is registered in your Lamar container. You could add a method on that class just to check if the database is reachable like the Validate() method below:

    public class DatabaseUsingService
    {
        private readonly DatabaseSettings _settings;

        public DatabaseUsingService(DatabaseSettings settings)
        {
            _settings = settings;
        }

        [ValidationMethod]
        public void Validate()
        {
            // For *now*, Lamar requires validate methods be synchronous
            using (var conn = new SqlConnection(_settings.ConnectionString))
            {
                // If this blows up, the environment check fails:)
                conn.Open();
            }
        }
    }

When Lamar validates the container, it will see the methods marked with [ValidationMethod], pull the service from the configured container, and try to call this method and record any exceptions.
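
If you’d rather exercise that same validation from a plain unit test instead of the command line, it’s a single method call on the container. A minimal sketch, where ApplicationRegistry is an illustrative Lamar ServiceRegistry holding your real registrations:

    [Fact]
    public void container_configuration_is_valid()
    {
        // Build the container from the application's registrations
        using var container = new Container(new ApplicationRegistry());

        // Tries to build every registration and calls any [ValidationMethod]
        // methods, throwing with a full report if anything fails
        container.AssertConfigurationIsValid();
    }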

Lamar Services Preview

The big addition in the latest Lamar.Diagnostics package is a big improvement in how it renders a report about the registered services compared to the old, built in WhatDoIHave() textual report.

Starting from a pretty basic ASP.Net Core application, the dotnet run -- lamar-services command renders a nicely formatted table of all the registered services.

If you want to dig into the dependency tree and see how Lamar is going to build a certain registration, you can add the --build-plans flag: dotnet run -- lamar-services --build-plans.

Hat tip to Spectre.Console for making the layout a lot prettier and easier to read.

Where does Oakton go next?

I’m somewhat hopeful that other folks would pick up Oakton’s command line model and add more utilities, application describer parts, and environment checks in either Oakton itself or in extension libraries. For my part, Marten v4’s command line support will use Oakton to add schema management and event store utilities directly to your application. I think it’d also be helpful to have an ASP.Net-specific set of application describer parts to at least preview routes.

Using Oakton for Development-Time Commands in .Net Core Applications

All of the sample code in this blog post is on GitHub in the OaktonDevelopmentCommands project.

Last year I released the Oakton.AspNetCore library that provides an expanded command line experience for ASP.Net Core applications (and environment tests too!) that adds additional command line flags for the basic dotnet run behavior. Oakton.AspNetCore also gives you the ability to embed completely different named commands directly into your application — either defined in your own application or pulled in from Nuget libraries.

To make this concrete, I started a new sample project on GitHub with the dotnet new webapi template. To add the new command line experience, I added a Nuget reference to Oakton.AspNetCore and modified the Program.Main() method to this:

// I changed the return type to Task
public static Task Main(string[] args)
{
    return CreateHostBuilder(args)
        // This extension method makes Oakton the active
        // command line parser and executor
        .RunOaktonCommands(args);
}

To show the ability to add commands from an external library, I also swapped out the built in DI container with Lamar, and added a reference to the Lamar.Diagnostics Nuget to expose Lamar’s diagnostics reports via the command line.

Just to show the Lamar integration, I added just the one line of code to the host builder configuration you can see below:

public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .UseLamar() // Overriding the IoC container to use Lamar
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.UseStartup<Startup>();
        });

Now, switching to the actual command line to see the results of all of that, you can see Oakton’s built-in command line help by typing dotnet run -- ?, which gives this output:

  ------------------------------------------------------------------------------
    Available commands:
  ------------------------------------------------------------------------------
         check-env -> Execute all environment checks against the application
          describe -> Writes out a description of your running application to either the console or a file
    lamar-scanning -> Runs Lamar's type scanning diagnostics
    lamar-services -> List all the registered Lamar services
    lamar-validate -> Runs all the Lamar container validations
               run -> Runs the configured AspNetCore application
          sayhello -> I'm a simple command that just starts up the app and says hello
  ------------------------------------------------------------------------------

A couple notes about the output above:

  • With the dotnet cli mechanics, the basic usage is dotnet [command] [optional flags to dotnet itself] -- [arguments to your application]. The double dash marks the boundary between arguments and flags meant for dotnet itself and arguments or flags for your application. So as an example, to run your application compiled as “Release” and using the “Testing” environment name for your .Net Core application, the command would be dotnet run --configuration Release -- --environment Testing.
  • The check-env, describe, and run commands come in from the base Oakton.AspNetCore library. The run command is the default, so dotnet run actually delegates to the run command.
  • All the lamar-******* commands come from Lamar.Diagnostics because Oakton.AspNetCore can happily find and apply commands from other assemblies in the project.
  • We’ll build out sayhello later in this post

Assuming that you are a Lamar (or a StructureMap) user, the first step to unravel most IoC configuration problems is the old WhatDoIHave() method to see what the service registrations are. To get at that data faster, you can use the dotnet run -- lamar-services command just to dump out the WhatDoIHave() output to the console or a text file.

However, adding Lamar.Diagnostics to your application adds some dependencies you may not want to be deployed. With a little help from the .Net Core SDK project system, we can just tell .Net to only use Lamar.Diagnostics at development time by using the <PrivateAssets> tag in the csproj file like this:

<PackageReference Include="Lamar.Diagnostics" Version="1.1.5">
    <PrivateAssets>all</PrivateAssets>
</PackageReference>


I don’t know how to do that without breaking into the csproj file, but once you do, the Lamar.Diagnostics assembly will not be deployed if you’re using dotnet publish to bundle up your application for deployment.

The “AspNetCore” naming is unfortunately a misnomer, because as of .Net Core 3.* the Oakton extension works for any project configured and bootstrapped with the new generic HostBuilder (purely console applications, worker applications, or any kind of web application).

Now, to add a custom command to the application directly into the application without polluting the deployed application, we’re just using some conditional compilation as shown in this super simple example:

// This is necessary to tell Oakton
// to search this assembly for Oakton commands
[assembly:Oakton.OaktonCommandAssembly]

namespace OaktonDevelopmentCommands
{
    // The conditional compilation here just keeps this command from
    // being present in the Release build of the application
#if DEBUG
    // This is also an OaktonAsyncCommand if you need to 
    // call async APIs
    [Description("I'm a simple command that just starts up the app and says hello")]
    public class SayHelloCommand : OaktonCommand<NetCoreInput>
    {
        public override bool Execute(NetCoreInput input)
        {
            // Super cheesy, just starting up the application
            // and shutting it right down
            using (var host = input.BuildHost())
            {
                // You do have access to the host's underlying
                // IoC provider, and hence to any application service
                // including the compiled IConfiguration as well
                
                Console.WriteLine("Hey, I can start up the application!");
            }

            // Gotta return true to let Oakton know that the command succeeded
            // This is important if you're using commands that need to report
            // success or failure to the command line.
            return true;
        }
    }
#endif
}

It’s hokey, but the command up above will only be compiled into your application if you are compiling as “Debug” — as you generally do when you’re working locally or running tests. When you deploy as a “Release” build, this command won’t be part of the compiled executable. As silly as it looks, this is being very useful in one of my client projects right now.

It’s an OSS Nuget Release Party! (Jasper v1.0, Lamar, Alba, Oakton)

My bandwidth for OSS work has been about zero for the past couple months. With the COVID-19 measures drastically reducing my commute and driving-kids-to-school time, getting to actually use some of my projects at work, and a threat from an early adopter to give up on Jasper if I didn’t get something out soon, I suddenly went active again and got quite a bit of backlogged work, pull requests, and bug fixes out.

My main focus in terms of OSS development for the past 3-4 years has been a big project called “Jasper” that was originally going to be a modernized .Net Core successor to FubuMVC. Just by way of explanation, Jasper, MO is my ancestral hometown and all the other projects I’m mentioning here are named after either other small towns or landmarks around the titular “Jasper.”

Alba

Alba is a library for HTTP contract testing with ASP.Net Core endpoints. It does quite a bit to reduce the repetitive code from using TestServer by itself in tests and makes your tests be much more declarative and intention revealing. Alba v3.1.1 was released a couple weeks ago to address a problem exposing the application’s IServiceProvider from the Alba SystemUnderTest. Fortunately for once, I caught this one myself while dogfooding Alba on a project at work.

Alba originated with the code we used to test FubuMVC HTTP behavior back in the day, but was modernized to work with ASP.Net Core instead of OWIN, and later retrofitted to just use TestServer under the covers.

Baseline

Baseline is a grab bag of utility code and extension methods on common .Net types that you can’t believe is missing from the BCL, most of which is originally descended from FubuCore.

As an ancillary project, I ripped out the type scanning and assembly finding code from Lamar into a separate BaselineTypeDiscovery Nuget that’s used by most of the other libraries in this post. There was a pretty significant pull request in the latest BaselineTypeDiscovery v1.1.0 release that should improve the application start up time for folks that use Lamar to discover assemblies in their application.

Oakton

Oakton is a command parsing library for .Net that was lifted originally from FubuCore that is used by Jasper. Oakton v2.0.4 and Oakton.AspNetCore v2.1.3 just upgrade the assembly discovery features to use the newer BaselineTypeDiscovery release above.

Lamar

Lamar is a modern, fast, ASP.Net Core compliant IoC container and the successor to StructureMap. I let a pretty good backlog of issues and pull requests amass, so I took some time yesterday to burn that down and the result is Lamar v4.2. This release upgrades the type scanning, fixes some bugs, and added quite a few fixes to the documentation website.

Jasper

Jasper at this point is a command executor ala MediatR (but much more ambitious) and a lightweight messaging framework — but the external messaging will mature much more in subsequent releases.

This feels remarkably anti-climactic seeing as how it’s been my main focus for years, but I pushed Jasper v1.0 today, specifically for some early adopters. The documentation is updated here. There’s also an up to date repository of samples that should grow. I’ll make a much bigger deal out of Jasper when I make the v1.1 or v2.0 release sometime after the Coronavirus has receded and my bandwidth slash ambition level is higher. For right now I’m just wanting to get some feedback from early users and let them beat it up.

Marten

There’s nothing new to say from me about Marten here except that my focus on Jasper has kept me from contributing too much to Marten. With Jasper v1.0 out, I’ll shortly (and finally) turn my attention to helping with the long planned Marten v4 release. For a little bit of synergy, part of my plans there is to use Jasper for some of the advanced Marten event store functionality we’re planning.

Lamar 4.1: Multithreading improvements, diagnostics, documentation updates, and some thoughts on troubleshooting

As promised in my previous post My (Big) OSS Plans for 2020, the very first thing out the gate for me this year is a bug fix release for Lamar.

Lamar 4.1 and its related libraries were released on Nuget late last week with a variety of bug fixes, a couple new features, and a documentation refresh at https://jasperfx.github.io/lamar — including some new guidance here and there as a direct reaction to GitHub issues. Continuing my personal theme of OSS interactions being more positive than not over the past couple years, I received a lot of help from Lamar users. This release was largely the result of users submitting pull requests with fixes or failing unit tests that reproduced issues — and I cannot stress enough how helpful those reproduction tests are for an OSS maintainer. Another user took the time to investigate how an error message could be greatly improved. Thank you to all the users who helped on this release with pull requests, suggestions, and feedback on GitHub.

All told, the libraries updated are:

  • Lamar 4.1.0 — Multi-threading issues were finally addressed, fixes for Lamar + ASP.Net Core logging, some finer grained control over type scanning registrations
  • Lamar.Microsoft.DependencyInjection v4.1.0 — Adds support back for IWebHostBuilder for .Net Core 3.0 applications
  • Lamar.Diagnostics v1.1.3 — More on this one below
  • LamarCompiler 2.1.1 — Just updated the Roslyn dependencies. Lamar itself doesn’t use this, so you’re very unlikely to be impacted
  • LamarCodeGeneration v1.4.0 — Some small additions to support a couple Jasper use cases
  • LamarCodeGeneration.Commands v1.0.2 — This isn’t documented yet, and is really just to support some diagnostics and pre-generation of code for Jasper

A Note on Troubleshooting

I have a partially written blog post slash treatise on troubleshooting and debugging. One of the things I try to suggest when troubleshooting a technical issue is to come up with a series of theories about why something isn’t working and figuring out the quickest way to prove or disprove that theory.

In the case of Lamar’s multi-threading issues addressed in this release and a very similar issue fixed previously, the “obvious” theory was that somewhere there was some issue with data structures or locking. Several others and I tried to investigate Lamar’s internals down this path, but we came up empty-handed.

The actual root cause turned out to be related to the Expression construction and compilation inside of Lamar that allowed variables to bleed through threads in heavily multi-threaded usage.

So, I still think that my idea of “build a theory about why something is failing, then try to knock it down” is a good approach, but it’s not 100% effective. I’m adding a section to that blog post entitled “don’t get tunnel vision” about fixating on one theory and not considering other explanations ;-)

Then again, some things are just hard sometimes.

Lamar.Diagnostics

Back in October I blogged about the new Oakton.AspNetCore package that extends the command line capabilities of standard .Net Core / ASP.Net Core applications with additional diagnostics. As a drop in extension to Oakton.AspNetCore, the new Lamar.Diagnostics package can be installed into a .Net Core application to give you ready access to all of Lamar’s built in diagnostics through the command line.

Your newly available commands from the root of your project with Lamar.Diagnostics are:

  1. dotnet run -- lamar-services — prints out the Container.WhatDoIHave() output to the console or a designated file
  2. dotnet run -- lamar-scanning — prints out the Container.WhatDidIScan() output to the console or a designated file
  3. dotnet run -- lamar-validate — runs the Container.AssertConfigurationIsValid() command

You can also opt into Lamar’s built in environment tests being used through Oakton.AspNetCore’s environment check capability.

In all cases, these commands work by calling IHostBuilder.Build() to build up your application — but they don’t call Start(), so none of your IHostedService objects will run — and then call the underlying Container methods. By doing this, you get ready access to the Lamar diagnostics against your application exactly the way that your application is configured, without having to add any additional code to your system to get at this diagnostic information.
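
In other words, what those commands do for you is roughly the equivalent of this hand-rolled sketch (using the same Program.CreateHostBuilder() convention as the standard templates):

// Build the application exactly as it is configured, but never start it
using var host = Program.CreateHostBuilder(args).Build();

// With Lamar as the container, the root IServiceProvider is the container itself
var container = (IContainer)host.Services;

// The same report that "dotnet run -- lamar-services" prints out
Console.WriteLine(container.WhatDoIHave());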

My (Big) OSS Plans for 2020

It’s now a yearly thing for me to blog about my aspirations and plans for various OSS projects at the beginning of the year. I was mostly on the nose in 2018, and way, way off in 2019. I’m hoping and planning for a much bigger year in 2020 as I’ve gotten much more enthusiastic and energetic about some ongoing efforts recently.

Most of my time and ambition next year is going toward Jasper, Marten, and Storyteller:

Jasper

Jasper is a toolkit for common messaging scenarios between .Net applications with a robust in process command runner that can be used either with or without the messaging. Jasper wholeheartedly embraces the .Net Core 3.0 ecosystem rather than trying to be its own standalone framework.

Jasper has been gestating for years, I almost gave up on it completely early last year, and purposely set it aside until after .Net Core 3.0 was released. However, I came back to it a few months ago with fresh eyes and full of new ideas from doing a lot of messaging scenario work for Calavista clients. I’m very optimistic right now about Jasper from a purely technical perspective. I’m furiously updating documentation, building sample projects, and dealing with last minute API improvements in an effort to kick out the big v1.0 release sometime in January of 2020.

Marten

Marten is a library that allows .Net developers to treat the outstanding Postgresql database as both a document database and an event store. Marten is mostly supported by a core team, but I’m hoping to be much more involved again this year. The big thing is a proposed v4 release that looks like it’s mostly going to be focused on the event sourcing functionality. There’s an initial GitHub issue for the overall work here, and I want to write a bigger post soon on some ideas about the approach. There’s going to be some new functionality, but the general theme is to make the event sourcing be much more scalable. Marten is a team effort now, and there’s plenty of room for more contributors or collaborators.

For some synergy, I’m also planning on building out some reference applications that use Jasper to integrate Marten with cloud based queueing and load distribution for the asynchronous projection support. I’m excited about this work just to level up on my cloud computing skills.

Storyteller

Storyteller is a tool I’ve worked on and used for what I prefer to call executable specification in .Net (but other folks would call Behavior Driven Development, which I don’t like only because BDD is overloaded). I did quite a bit of preliminary work last year on a proposed Storyteller v6 that would bring it more squarely into the .Net Core world, I wrote a post last year called Spitballing the Future of Storyteller that laid out all my thoughts. I liked where it was heading, but I got distracted by other things.

For more synergy, Storyteller v6 will use Jasper a little bit for its communication between the user interface and the specification process. It also dovetails nicely with my need to update my Javascript UI skillset.

Smaller Projects

Lamar — the modern successor to StructureMap and ASP.Net Core compliant IoC container. I will be getting a medium sized maintenance release out very soon as I’ve let the issue list back up. I’m only focused on dealing with problems and questions as they come in.

EDIT 1/6/2020 –> There’s a Lamar 4.1 release out!

Alba — a library that wraps TestServer to make integration testing against ASP.Net Core HTTP endpoints easier. The big work late last year was making it support ASP.Net Core v3. I don’t have any plans to do much with it this year, but that could change quickly if I get to use it on a client project this year.

Oakton — yet another command line parser. It’s used by Jasper, Storyteller, and the Marten command line package. I feel like it’s basically done and said so last year, but I added some specific features for ASP.Net Core applications and might add more along those lines this year.

StructureMap — Same as last year. I answer questions here and there, but otherwise it’s finished/abandoned.

FubuMVC — A fading memory. I’m pleasantly surprised when someone mentions it to me about once or twice a year.


Environment Checks and Better Command Line Abilities for your .Net Core Application

Oakton.AspNetCore is a new package built on top of the Oakton 2.0+ command line parser that adds extra functionality to the command line execution of ASP.Net Core and .Net Core 3.0 codebases. At the bottom of this blog post is a small section showing you how to set up Oakton.AspNetCore to run commands in your .Net Core application.

First though, you need to understand that when you use the dotnet run command to build and execute your ASP.Net Core application, you can pass arguments and flags both to dotnet run itself and to your application through the string[] args argument of Program.Main(). These two types of arguments or flags are separated by a double dash, like this example: dotnet run --framework netcoreapp2.0 -- ?. In this case, “--framework netcoreapp2.0” is used by dotnet run itself, and the values to the right of the “--” are passed into your application as the args array.
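
As a quick sanity check of what actually reaches your code, here’s a throwaway Program.Main() that just echoes the args array. This is purely an illustration of the double dash behavior, not anything from Oakton.AspNetCore:

public class Program
{
    public static void Main(string[] args)
    {
        // Running "dotnet run --framework netcoreapp2.0 -- run -e Testing"
        // means only the values after the double dash show up here,
        // i.e. args is ["run", "-e", "Testing"]
        foreach (var arg in args)
        {
            Console.WriteLine(arg);
        }
    }
}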

With that out of the way, let’s see what Oakton.AspNetCore brings to the table.

Extended “Run” Options

In the default ASP.Net Core templates, your application can be started with all its defaults by using dotnet run.  Oakton.AspNetCore retains that usage, but adds some new abilities with its “Run” command. To check the syntax options, type dotnet run -- ? run:

 Usages for 'run' (Runs the configured AspNetCore application)
  run [-c, --check] [-e, --environment <environment>] [-v, --verbose] [-l, --log-level <logleve>] [----config:<prop> <value>]

  ---------------------------------------------------------------------------------------------------------------------------------------
    Flags
  ---------------------------------------------------------------------------------------------------------------------------------------
                        [-c, --check] -> Run the environment checks before starting the host
    [-e, --environment <environment>] -> Use to override the ASP.Net Environment name
                      [-v, --verbose] -> Write out much more information at startup and enables console logging
          [-l, --log-level <logleve>] -> Override the log level
          [----config:<prop> <value>] -> Overwrite individual configuration items
  ---------------------------------------------------------------------------------------------------------------------------------------

To run your application under a different hosting environment name value, use a flag like so:

dotnet run -- --environment Testing

or

dotnet run -- -e Testing

To overwrite configuration key/value pairs, you’ve also got this option:

dotnet run -- --config:key1 value1 --config:key2 value2

which will overwrite the configuration values for “key1” and “key2” with “value1” and “value2” respectively.

Lastly, you can run any configured environment checks for your application immediately before starting it by using this flag:

dotnet run -- --check

More on this functionality in the next section.


Environment Checks

I’m a huge fan of building environment tests directly into your application. Environment tests allow your application to self-diagnose issues with deployment, configuration, or environmental dependencies upfront that would impact its ability to run.

As a very real world example, let’s say your ASP.Net Core application needs to access another web service that’s managed independently by other teams and maybe, just maybe your testers have occasionally tried to test your application when:

  • Your application configuration has the wrong Url for the other web service
  • The other web service isn’t running at all
  • There’s some kind of authentication issue between your application and the other web service

In the real world project that spawned the example above, we added a formal environment check that would try to touch the health check endpoint of the external web service and throw an exception if we couldn’t connect to the external system. The next step was to execute our application as it was configured and deployed with this environment check as part of our Continuous Deployment pipeline. If the environment check failed, the deployment itself failed and triggered off the normal set of failure alerts letting us know to go fix the environment rather than letting our testers waste time on a bad deployment.

With all that said, let’s look at what Oakton.AspNetCore does here to help you add environment checks. Let’s say your application uses a single Sql Server database, and the connection string should be configured in the “connectionString” key of your application’s configuration. You would probably want an environment check just to verify at a minimum that you can successfully connect to your database as it’s configured.

In your ASP.Net Core Startup class, you could add a new service registration for an environment check like this example:

// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
    // Other registrations we don't care about...
    
    // This extension method is in Oakton.AspNetCore
    services.CheckEnvironment<IConfiguration>("Can connect to the application database", config =>
    {
        var connectionString = config["connectionString"];
        using (var conn = new SqlConnection(connectionString))
        {
            // Just attempt to open the connection. If there's anything
            // wrong here, it's going to throw an exception
            conn.Open();
        }
    });
}
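
Going back to the external web service scenario from earlier, a similar check might look like the sketch below. The “otherServiceUrl” configuration key and the “/health” route are hypothetical placeholders for this example, and it uses the same synchronous CheckEnvironment() overload as above:

// Also inside ConfigureServices()
services.CheckEnvironment<IConfiguration>("Can reach the downstream web service", config =>
{
    // "otherServiceUrl" is a made up configuration key for this sketch
    var baseUrl = config["otherServiceUrl"];
    using (var client = new HttpClient { BaseAddress = new Uri(baseUrl) })
    {
        // GetAsync() throws if the service can't be reached at all, and
        // EnsureSuccessStatusCode() throws on any non-2xx response, so
        // either failure mode fails the environment check
        client.GetAsync("/health").GetAwaiter().GetResult()
            .EnsureSuccessStatusCode();
    }
});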

Now, during deployments or even just pulling down the code to run locally, we can run the environment checks on our application like so:

dotnet run -- check-env

Which in the case of our application above, blows up with output like this because I didn’t add configuration for the database in the first place:

Running Environment Checks
   1.) Failed: Can connect to the application database
System.InvalidOperationException: The ConnectionString property has not been initialized.
   at System.Data.SqlClient.SqlConnection.PermissionDemand()
   at System.Data.SqlClient.SqlConnectionFactory.PermissionDemand(DbConnection outerConnection)
   at System.Data.ProviderBase.DbConnectionInternal.TryOpenConnectionInternal(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
   at System.Data.ProviderBase.DbConnectionClosed.TryOpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
   at System.Data.SqlClient.SqlConnection.TryOpen(TaskCompletionSource`1 retry)
   at System.Data.SqlClient.SqlConnection.Open()
   at MvcApp.Startup.<>c.<ConfigureServices>b__4_0(IConfiguration config) in /Users/jeremydmiller/code/oakton/src/MvcApp/Startup.cs:line 41
   at Oakton.AspNetCore.Environment.EnvironmentCheckExtensions.<>c__DisplayClass2_0`1.<CheckEnvironment>b__0(IServiceProvider s, CancellationToken c) in /Users/jeremydmiller/code/oakton/src/Oakton.AspNetCore/Environment/EnvironmentCheckExtensions.cs:line 53
   at Oakton.AspNetCore.Environment.LambdaCheck.Assert(IServiceProvider services, CancellationToken cancellation) in /Users/jeremydmiller/code/oakton/src/Oakton.AspNetCore/Environment/LambdaCheck.cs:line 19
   at Oakton.AspNetCore.Environment.EnvironmentChecker.ExecuteAllEnvironmentChecks(IServiceProvider services, CancellationToken token) in /Users/jeremydmiller/code/oakton/src/Oakton.AspNetCore/Environment/EnvironmentChecker.cs:line 31

If you ran this command during continuous deployment scripts, the command should cause your build to fail when it detects environment problems.

In some of Calavista’s current projects, we’ve been adding environment tests to our applications for items like:

  • Can our application read certain configured directories?
  • Can our application as it’s configured connect to databases?
  • Can our application reach other web services?
  • Are required configuration items specified? That’s been an issue as we’ve had to build out Continuous Deployment pipelines to many, many different server environments.

I don’t see the idea of “Environment Tests” mentioned very often, and it might have other names I’m not aware of. I learned about the idea back in the Extreme Programming days from a blog post from Nat Pryce that I can’t find any longer, but there’s this paper from those days too.

 

Add Other Commands

I’ve frequently worked in projects where we’ve built parallel console applications that reproduce a lot of the same IoC and configuration setup to perform administrative tasks or add other diagnostics. It could be things like adding users, rebuilding an event store projection, executing database migrations, or loading some kind of data into the application’s database. What if, instead, you could just add these directly to your .Net Core application as additional dotnet run -- [command] options? Fortunately, Oakton.AspNetCore lets you do exactly that, and even allows you to package up reusable commands in other assemblies that could be distributed by Nuget.

If you use Lamar as your IoC container in an ASP.Net Core application (or a .Net Core 3.0 console app using the new unified HostBuilder), we now have an add-on Nuget package called Lamar.Diagnostics that will add new Oakton commands to your application that give you access to Lamar’s diagnostic tools from the command line. As an example, this library adds a command to write out the “WhatDoIHave()” report for the underlying Lamar IoC container of your application to the command line or a file like this:

dotnet run -- lamar-services

Now, using the command above as an example, to build or add your own commands start by decorating the assembly containing the command classes with this attribute:

[assembly:OaktonCommandAssembly]

Having this attribute tells Oakton.AspNetCore to search that assembly for additional Oakton commands. There is no other setup necessary.

If your command needs to use the application’s services or configuration, have the Oakton input type inherit from the NetCoreInput type in Oakton.AspNetCore like so:

public class LamarServicesInput : NetCoreInput
{
    // Lots of other flags
}

Next, the new command for “lamar-services” is just this:

[Description("List all the registered Lamar services", Name = "lamar-services")]
public class LamarServicesCommand : OaktonCommand<LamarServicesInput>
{
    public override bool Execute(LamarServicesInput input)
    {
        // BuildHost() will return an IHost for your application
        // if you're using .Net Core 3.0, or IWebHost for
        // ASP.Net Core 2.*
        using (var host = input.BuildHost())
        {
            // The actual execution using host.Services
            // to get at the underlying Lamar Container
        }

        return true;
    }
}
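
To make that elided block a little more concrete, here’s a rough sketch (not the actual Lamar.Diagnostics implementation) of how the command body might resolve the Lamar container from host.Services and write out its WhatDoIHave() report:

public override bool Execute(LamarServicesInput input)
{
    using (var host = input.BuildHost())
    {
        // Lamar registers itself as IContainer, so it can be resolved
        // straight out of the application's service provider. The
        // GetRequiredService() extension method comes from
        // Microsoft.Extensions.DependencyInjection
        var container = host.Services.GetRequiredService<IContainer>();

        // WhatDoIHave() returns a formatted report of every service
        // registration the container knows about
        Console.WriteLine(container.WhatDoIHave());
    }

    return true;
}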

Getting Started

In both cases I’m assuming that you’ve bootstrapped your application with one of the standard project templates like dotnet new webapi or dotnet new mvc. Either way, you’ll first add a reference to the Oakton.AspNetCore Nuget. Next, break into the Program.Main() entry point method in your project and modify it like the following samples.

If you’re absolutely cutting edge and using ASP.Net Core 3.0:

public class Program
{
    public static Task<int> Main(string[] args)
    {
        return CreateHostBuilder(args)
            
            // This extension method replaces the calls to
            // IWebHost.Build() and Start()
            .RunOaktonCommands(args);
    }

    public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureWebHostDefaults(x => x.UseStartup<Startup>());
    
}

For what I would guess is most folks, here’s the ASP.Net Core 2.* setup (and this would work for ASP.Net Core 3.0 as well):

public class Program
{
    public static Task<int> Main(string[] args)
    {
        return CreateWebHostBuilder(args)
            
            // This extension method replaces the calls to
            // IWebHost.Build() and Start()
            .RunOaktonCommands(args);
    }

    public static IWebHostBuilder CreateWebHostBuilder(string[] args) =>
        WebHost.CreateDefaultBuilder(args)
            .UseStartup<Startup>();
    
}

The two changes from the template defaults are to:

  1. Change the return value to Task<int>
  2. Replace the calls to Build() and Start() with the RunOaktonCommands(args) extension method, which hangs off IWebHostBuilder and off the new unified IHostBuilder if you’re targeting netcoreapp3.0.

And that’s it, you’re off to the races.