.Net 6 WebApplicationBuilder and Lamar

TL;DR — The latest Lamar V8.0.1 release has some bug fixes and mild breaking changes around the .Net Core DI integration that eliminate user reported problems with the new .Net 6 bootstrapping.

Hey, before I jump into the Lamar improvements for .Net 6, read Jimmy Bogard’s latest post for an example of why you might opt to use Lamar over the built in DI container.

I’ve had a rash of error reports against Lamar when used with the new WebApplicationBuilder bootstrapping model that came with ASP.Net Core 6. Fortunately, the common culprit (ahem, oddball .Net Core mechanics more than Lamar itself) was relatively easy to find, and the most recent Lamar V8 made some minor adjustments to the .Net Core adapter code to fix the issues.

To use Lamar with the new .Net 6 bootstrapping model, you need to install the Lamar.Microsoft.DependencyInjection Nuget and use the UseLamar() extension method on IHostBuilder to opt into using Lamar in place of the built in DI container.

You can find more information about using Lamar with the new WebApplicationBuilder model and Minimal APIs in the Lamar documentation.

As an example, consider this simplistic system from the Lamar testing code:

var builder = WebApplication.CreateBuilder(args);

// use Lamar as DI.
builder.Host.UseLamar((context, registry) =>
{
    // register services using Lamar
    registry.For<ITest>().Use<MyTest>();
    
    // Add your own Lamar ServiceRegistry collections
    // of registrations
    registry.IncludeRegistry<MyRegistry>();

    // discover MVC controllers -- this was problematic
    // inside of the UseLamar() method, but is "fixed" in
    // Lamar V8
    registry.AddControllers();
});

var app = builder.Build();
app.MapControllers();

// Add Minimal API routes
app.MapGet("/", (ITest service) => service.SayHello());

app.Run();

Notice that we’re adding service registrations directly within the nested lambda passed into the UseLamar() method. In previous versions of Lamar, those service registrations were completely isolated from the service registrations made in Startup.ConfigureServices() and merely added on top of them, which was very rarely an issue. In the new .Net 6 model that became problematic, because some of Microsoft’s out of the box service registration extension methods like AddControllers() depend on state being smuggled through the service collection, and those did not work inside of the UseLamar() method before Lamar v8.

The simple “fix” in Lamar v8 was to ensure that the service registrations inside of UseLamar() are applied additively to the existing set of service registrations built up by the core .Net host bootstrapping, like so:

        /// <summary>
        /// Shortcut to replace the built in DI container with Lamar using service registrations
        /// dependent upon the application's environment and configuration.
        /// </summary>
        /// <param name="builder"></param>
        /// <param name="configure"></param>
        /// <returns></returns>
        public static IHostBuilder UseLamar(this IHostBuilder builder, Action<HostBuilderContext, ServiceRegistry> configure = null)
        {
            return builder
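                // The Lamar factory is registered against both generic
                // signatures because the generic host and the newer
                // WebApplicationBuilder bootstrapping resolve the service
                // provider factory in different ways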
                .UseServiceProviderFactory<ServiceRegistry>(new LamarServiceProviderFactory())
                .UseServiceProviderFactory<IServiceCollection>(new LamarServiceProviderFactory())
                .ConfigureServices((context, services) =>
                {
                    var registry = new ServiceRegistry(services);
                
                    configure?.Invoke(context, registry);
                
                    // Hack-y, but this makes everything work as 
                    // expected
                    services.Clear();
                    services.AddRange(registry);

#if NET6_0_OR_GREATER
                    // This enables the usage of implicit services in Minimal APIs
                    services.AddSingleton(s => (IServiceProviderIsService) s.GetRequiredService<IContainer>());
#endif
                    
                });
        }

The downside of this “fix” was that I eliminated all other overloads of the UseLamar() extension method that relied on custom Lamar ServiceRegistry types. You can still use the IncludeRegistry<T>() method to use custom ServiceRegistry types though.
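
For reference, a custom ServiceRegistry like the MyRegistry type in the earlier sample is nothing more than a class with registrations in its constructor. Here’s a minimal sketch, where IWidget and Widget are placeholder types of my own invention:

public class MyRegistry : ServiceRegistry
{
    public MyRegistry()
    {
        // Any Lamar registration syntax is legal here
        For<IWidget>().Use<Widget>();
    }
}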

As always, if you have any issues with Lamar with or without ASP.Net Core, the Lamar Gitter room is the best and fastest way to ask questions.

My professional and OSS aspirations for 2022

I trot out one of these posts at the beginning of each year, but this time around it’s “aspirations” instead of “plans” because a whole lot of stuff is gonna be a repeat from 2020 and 2021, and I’m not going to lose any sleep over what doesn’t get done in the New Year, nor close myself off to brand new opportunities.

In 2022 I just want the chance to interact with other developers. I’ll be at ThatConference in Round Rock, TX in May speaking about Event Sourcing with Marten (my first in person conference since late 2019). Other than that, my only goal for the year (Covid-willing) is to maybe speak at a couple more in person conferences just to be able to interact with other developers in real space again.

My peak as a technical blogger was the late aughts, and I think I’m mostly good with not sweating any kind of attempt to regain that level of readership. I do plan to write material that I think would be useful for my shop, or just about what I’m doing in the OSS space when I feel like it.

Which brings me to the main part of this post, my involvement with the JasperFx (Marten, Lamar, etc.) family of OSS projects (plus Storyteller), which takes up most of my extracurricular software related time. Just for an idea of the interdependencies, here are the highlights of the JasperFx world:

.NET Transactional Document DB and Event Store on PostgreSQL

Marten took a big leap forward late in 2021 with the long running V4.0 release. I think that release might have been the single biggest, most complicated OSS release that I’ve ever been a part of — FubuMVC 1.0 notwithstanding. There’s also a 5.0-alpha release out that addresses .Net 6 support and the latest version of Npgsql.

Right now Marten is a victim of its own success, and our chat room is almost constantly hair on fire with activity, which directly led to some planned improvements for V5 (hopefully by the end of January?) in this discussion thread:

  • Multi-tenancy through a separate database per tenant (long planned, long delayed, finally happening now)
  • Some kind of ability to register and resolve services for more than one Marten database in a single application
  • And related to the previous two bullet points, improved database versioning and schema migrations that could accommodate there being more than one database within a single .Net codebase
  • Improve the “generate ahead” model to make it easier to adopt. Think faster cold start times for systems that use Marten

Beyond that, some of the things I’d like to maybe do with Marten this year are:

  • Investigate the usage of Postgresql table partitioning and database sharding as a way to increase scalability — especially with the event sourcing support
  • Projection snapshotting
  • In conjunction with Jasper, expand Marten’s asynchronous projection support to shard projection work across multiple running nodes, introduce some sort of optimized, no downtime projection rebuilds, and add some options for event streaming with Marten and Kafka or Pulsar
  • Try to build an efficient GraphQL adapter for Marten. And by efficient, I mean that you wouldn’t have to bounce through a Linq translation first and hopefully could opt into Marten’s JSON streaming wherever possible. This isn’t likely, but sounds kind of interesting to play with.

In a perfect, magic, unicorns and rainbows world, I’d love to see the Marten backlog in GitHub get under 50 items and stay there permanently. Commence laughing at me on that one :(

Jasper is a toolkit for common messaging scenarios between .Net applications with a robust in process command runner that can be used either with or without the messaging.

I started working on rebooting Jasper with a forthcoming V2 version late last year, and made quite a bit of progress before Marten got busy and the release of .Net 6 necessitated other work. There’s a non-zero chance I will be using Jasper at work, which makes that a much more viable project. I’m currently in flight with:

  • Building Open Telemetry tracing directly into Jasper
  • Bi-directional compatibility with MassTransit applications (absolutely necessary to adopt this in my own shop).
  • Performance optimizations
  • .Net 6 support
  • Documentation overhaul
  • Kafka as a message transport option (Pulsar was surprisingly easy to add, and I’m hopeful that Kafka is similar)

And maybe, just maybe, I might extend Jasper’s somewhat unique middleware approach to web services utilizing the new ASP.Net Core Minimal API support. The idea there is to more or less create an improved version of the old FubuMVC idiom for building web services.

Lamar is a modern IoC container and the successor to StructureMap

I don’t have any real plans for Lamar in the new year, but there are some holes in the documentation, and a couple advanced features could sure use some additional examples. 2021 ended up being busy for Lamar though with:

  1. Lamar v6 added interception (finally), a new documentation website, and a facility for overriding services at test time
  2. Lamar v7 added support for IAsyncEnumerable (also finally), a small enhancement for the Minimal API feature in ASP.Net Core, and .Net 6 support

Add Robust Command Line Options to .Net Applications

Oakton did have a major v4/4.1 release to accommodate .Net 6 and ASP.Net Core Minimal API usage late in 2021, but I have yet to update the documentation. I would like to shift Oakton’s documentation website to VitePress first. The only plan I have for Oakton this year is to maybe see if there’d be a good way for Oakton to enable “buddy” command line tools for your application like the dotnet ef tool, using the HostFactoryResolver class.

The bustling metropolis of Alba, MO

Alba is a wrapper around the ASP.Net Core TestServer for declarative, in process testing of ASP.Net Core web services. I don’t have any plans for Alba in the new year other than responding to any issues or opportunities to smooth out rough edges that surface from my shop’s usage of Alba.

Alba did get a couple major releases in 2021 though:

  1. Alba 5.0 streamlined the entry API to mimic IHost, converted the documentation website to VitePress, and introduced new facilities for dealing with security in testing.
  2. Alba 6.0 added support for WebApplicationFactory and ASP.Net Core 6

Solutions for creating robust, human readable acceptance tests for your .Net or CoreCLR system and a means to create “living” technical documentation.

Storyteller has been mothballed for years, and I was ready to abandon it last year, but…

We still use Storyteller for some big, long running integration style tests in both Marten and Jasper where I don’t think xUnit/NUnit is a good fit, and I think maybe I’d like to reboot Storyteller later this year. The “new” Storyteller (I’m playing with the idea of calling it “Bobcat” as it might be a different tool) would be quite a bit smaller and much more focused on enabling integration testing rather than trying to be a BDD tool.

I’m not sure what the approach might be. It could be:

  • “Just” write some extension helpers to xUnit or NUnit for more data intensive tests
  • “Just” write some extension helpers to SpecFlow
  • Rebuild the current Storyteller concept, but also support a Gherkin model
  • Something else altogether?

My goal, if this happens, is to have a tool for automated testing that maybe:

  • Supports much more data intensive tests
  • Better handles integration tests
  • Has strong support for test parallelization and even test run sharding in CI
  • Could help write characterization tests with a record/replay kind of model against existing systems (I’d *love* to have this at work)
  • Has some kind of model that is easy to use within an IDE like Rider or VS, even if there is a separate UI like Storyteller does today

And I’d still like to rewrite a subset of the existing Storyteller UI as an excuse to refresh my front end technology skillset.

To be honest, I don’t feel like Storyteller has ever been much of a success, but it’s the OSS project of mine that I’ve most enjoyed working on and most frequently used myself.

Weasel

Weasel is a set of libraries for database schema migrations and ADO.Net helpers that we spun out of Marten during its V4 release. I’m not super excited about doing this, but Weasel is getting some sort of database migration support very soon. Weasel itself isn’t documented yet, so documentation is the only major plan other than supporting whatever Marten and/or Jasper needs this year.

Baseline

Baseline is a grab bag of helpers and extension methods that dates back to the early FubuMVC project. I haven’t done much with Baseline in years, and it might be time to prune it a little bit as some of what Baseline does is now supported in the .Net framework itself. The file system helpers especially could be pruned down, with asynchronous versions added for what’s left.

StructureMap

I don’t think that I got a single StructureMap question last year and stopped following its Gitter room. There are still plenty of systems using StructureMap out there, but I think the mass migration to either Lamar or another DI container is well underway.

Lamar v7 meets .Net 6, Minimal APIs, and IAsyncDisposable

It’s been a busy couple weeks in OSS world for me scurrying around and getting things usable in .Net 6. Today I’m happy to announce the release of Lamar 7.0. The Nuget for Lamar itself and Lamar.Microsoft.DependencyInjection with adjusted dependencies for .Net 6 went up yesterday, and I made some additions to the documentation website just now. There are no breaking changes in the API, but Lamar dropped all support for any version of .Net < .Net 5.0. Before I get into the highlights, I’d like to thank:

  • Babu Annamalai for making the docs so easy to re-publish
  • Khalid Abuhakmeh and Stephan Steiger for their help with the Minimal API support
  • Andrew Lock for writing some very helpful blog posts about new .Net 6 internals that have helped me get through .Net 6 improvements to several tools the past couple weeks.

Lamar and Minimal API

Lamar v7 adds some specific support for better usability of the new Minimal API feature in ASP.Net Core. Below is the sample we use in the Lamar documentation and the internal tests:

var builder = WebApplication.CreateBuilder(args);

// use Lamar as DI.
builder.Host.UseLamar((context, registry) =>
{
    // register services using Lamar
    registry.For<ITest>().Use<MyTest>();
    registry.IncludeRegistry<MyRegistry>();

    // add the controllers
    registry.AddControllers();
});


var app = builder.Build();
app.MapControllers();

// [FromServices] is NOT necessary when using Lamar v7
app.MapGet("/", (ITest service) => service.SayHello());

app.Run();

Lamar and IAsyncDisposable

Just copy/pasting the documentation here…

The Lamar IContainer itself, and all nested containers (scoped containers in .Net DI nomenclature), implement both IDisposable and IAsyncDisposable. It is not necessary to call both Dispose() and DisposeAsync(); either method will dispose all tracked IDisposable / IAsyncDisposable objects.

// Asynchronously disposing the container
await container.DisposeAsync();

The following table explains what method is called on a tracked object when the creating container is disposed:

If an object implements…           | Container.Dispose()                     | Container.DisposeAsync()
IDisposable                        | Dispose()                               | Dispose()
IAsyncDisposable                   | DisposeAsync().GetAwaiter().GetResult() | DisposeAsync()
IDisposable and IAsyncDisposable   | DisposeAsync()                          | DisposeAsync()

If any objects are being created by Lamar that only implement IAsyncDisposable, it is probably best to strictly use Container.DisposeAsync() to avoid any problematic mixing of sync and async code.
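
As a minimal sketch of that guidance (the connection pool types below are hypothetical), C#’s await using statement makes the asynchronous disposal automatic:

// DisposeAsync() runs automatically at the end of the enclosing scope
await using var container = new Container(x =>
{
    // a hypothetical service that only implements IAsyncDisposable
    x.AddScoped<IConnectionPool, AsyncOnlyConnectionPool>();
});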

JasperFx OSS Plans for .Net 6 (Marten et al)

I’m going to have to admit that I got caught flat footed by the .Net 6 release a couple weeks ago. I hadn’t really been paying much attention to the forthcoming changes, maybe got cocky about how easy the transition from netcoreapp3.1 to .Net 5 was, and have been unpleasantly surprised by how much work it’s going to take to move some OSS projects up to .Net 6. All at the same time that the early adopters of the world are clamoring for all their dependencies to target .Net 6 yesterday.

All that being said, here’s my running list of plans to get the projects in the JasperFx GitHub organization successfully targeting .Net 6. I’ll make edits to this page as things get published to Nuget.

Baseline

Baseline is a grab bag utility library full of extension methods that I’ve relied on for years. Nobody uses it directly per se, but it’s a dependency of just about every other project in the organization, so it went first with the 3.2.2 release adding a .Net 6 target. No code changes were necessary other than adding .Net 6 to the CI testing. Easy money.

Oakton

EDIT: Oakton v4.0 is up on Nuget. WebApplication is supported, but you can’t override configuration in commands with this model like you can w/ HostBuilder only. I’ll do a follow up at some point to fill in this gap.

Oakton is a tool to add extensible command line options to .Net applications based on the HostBuilder model. Oakton is my problem child right now because it’s a dependency in several other projects and its current model does not play nicely with the new WebApplicationBuilder approach for configuring .Net 6 applications. I’d also like to get the Oakton documentation website moved to the VitePress + MarkdownSnippets model we’re using now for Marten and some of the other JasperFx projects. I think I’ll take a shortcut here and publish the Nuget and let the documentation catch up later.

Alba

Alba is an automated testing helper for ASP.Net Core. Just like Oakton, Alba worked very well with the HostBuilder model, but was thrown for a loop by the new WebApplicationBuilder configuration model that’s the mechanism for using the new Minimal API (*cough* inevitable Sinatra copy *cough*) model. Fortunately though, Hawxy came through with a big pull request to make Alba finally work with the WebApplicationFactory model that can accommodate the new WebApplicationBuilder model, so we’re back in business. Alba 5.1 will be published soon with that work after some documentation updates and hopefully some testing with the Oakton + WebApplicationBuilder + Alba combination.

EDIT: Alba 7.0 is up with the necessary changes, but the docs will come later this week

Lamar

Lamar is an IoC/DI container and the modern successor to StructureMap. The biggest issue with Lamar on v6 was Nuget dependencies on the IServiceCollection model, plus needing some extra implementation to light up the implied service model of Minimal APIs. All the current unit tests and even integration tests with ASP.Net Core are passing on .Net 6. What’s left to finish up a new Lamar 7.0 release:

  • One .Net 6 related bug in the diagnostics
  • Better Minimal API support
  • Upgrade Oakton & Baseline dependencies in some of the Lamar projects
  • Documentation updates for the new IAsyncDisposable support and usage with WebApplicationBuilder with or without Minimal API usage

EDIT: Lamar 7.0 is up on Nuget with .Net 6 support

Marten/Weasel

We just made the gigantic V4 release a couple months ago knowing that we’d have to follow up quickly with a V5 release with a few breaking changes to accommodate .Net 6 and the latest version of Npgsql. We are having to make a full point release, so that opens the door for other breaking changes that didn’t make it into V4 (don’t worry, I think shifting from V4 to V5 will be easy for most people). The other Marten core team members have been doing most of the work for this so far, but I’m going to jump into the fray later this week to do some last minute changes:

  • Review some internal changes to Npgsql that might have performance impacts on Marten
  • Consider adding an event streaming model within the new V4 async daemon. For folks that wanna use that to publish events to some kind of transport (Kafka? Some kind of queue?) with strict ordering. This won’t be much yet, but it keeps coming up so we might as well consider it.
  • Multi-tenancy through multiple databases. It keeps coming up, and potentially causes breaking API changes, so we’re at least going to explore it

I’m trying not to slow down the Marten V5 release with .Net 6 support for too long, so this is all either happening really fast or not at all. I’ll blog more later this week about multi-tenancy & Marten.

Weasel is a spin off library from Marten for database change detection and ADO.Net helpers that are reused in other projects now. It will be published simultaneously with Marten.

Jasper

Oh man, I’d love, love, love to have Jasper 2.0 done by early January so that it’ll be available for usage at my company on some upcoming work. This work is on hold while I deal with the other projects, my actual day job, and family and stuff.

Self Diagnosing Deployments with Oakton and Lamar

So here’s the deal: sometimes, somehow, you deploy a new version of a system into a testing, staging, or production environment and it doesn’t work. Shocking and sometimes distressing when that happens, right?

There are any number of problems that could be the cause. Maybe a database is in an invalid state, maybe a file got missed in the deployment, maybe some bit of configuration is wrong, maybe a downstream or upstream collaborating system is down or unreachable. Who knows, right? Well, what if we could make our systems self-diagnosing so they can do a quick rundown of how they’re configured and running right at start up time and fail fast if something is detected to be wrong with the deployment? And also have the system tell us exactly what is wrong through first causes without having to later trace runtime failures to the ultimate root cause?

To that end, let’s talk about some mechanisms in both the Oakton and Lamar libraries to quickly add in what I like to call “environment checks”, meaning self diagnosing checks on system startup to test out system configuration and validity. Oakton has a facility to run environment checks in either a separate diagnostic command directly in your application, or to run the environment checks at system start up time, make a detailed report of any failures, and make the process stop and return an error code. In turn, a continuous deployment script should be able to detect the failure to start the system and rollback to a previously known, good state.

In the past, I’ve worked in teams where we’ve embedded environment checks to:

  • Verify that a configured database is reachable
  • Ping an external web service dependency
  • Check for the existence of required files
  • Verify that an expected COM dependency was registered (shudder, there’s some bad memories in that one)
  • Test that security features are correctly configured and usable — and I think that’s a big one
  • Assert that the system has the ability to read and write configured file system directories — also some serious scar tissue

The point here is to make your deployments fail fast anytime there’s an environmental misconfiguration, and do so in a way that makes it easy for you to spot exactly what’s wrong. Environment checks can help teams avoid system down time and keep testers from blowing up the defect lists with false negatives from misconfigured systems.

Getting Started with Oakton

Just to get started, I spun up a new .Net 5 web service with dotnet new webapi. That template gives you this code in the Program.Main() method:

        public static void Main(string[] args)
        {
            CreateHostBuilder(args).Build().Run();
        }

I’m going to add a Nuget reference to Oakton, then change that method up above so that Oakton is handling the command line parsing and system startup like this:

        // It's important to return Task<int> so that
        // Oakton can signal failures with non-zero
        // return codes
        public static Task<int> Main(string[] args)
        {
            return CreateHostBuilder(args).RunOaktonCommands(args);
        }

There’s still nothing going on in our web service, but from the command line we could run dotnet run -- check-env just to start up our application’s IHost and run all the environment checks in a test mode. Nothing there yet, so you’d get output like this:

   ___            _      _
  / _ \    __ _  | | __ | |_    ___    _ __
 | | | |  / _` | | |/ / | __|  / _ \  | '_ \
 | |_| | | (_| | |   <  | |_  | (_) | | | | |
  \___/   \__,_| |_|\_\  \__|  \___/  |_| |_|

No environment checks.
All environment checks are good!

We can add environment checks directly to our application with Oakton’s built in mechanisms like this example that just validates that there’s an appsettings.json file in the application path:

        public void ConfigureServices(IServiceCollection services)
        {
            // Literally just proving out that appsettings.json is available
            services.CheckThatFileExists("appsettings.json");
            
            // Other registrations
        }

And now that same dotnet run -- check-env command gives us this output:

   ___            _      _
  / _ \    __ _  | | __ | |_    ___    _ __
 | | | |  / _` | | |/ / | __|  / _ \  | '_ \
 | |_| | | (_| | |   <  | |_  | (_) | | | | |
  \___/   \__,_| |_|\_\  \__|  \___/  |_| |_|

   1.) Success: File 'appsettings.json' exists

Running Environment Checks ---------------------------------------- 100%

All environment checks are good!

One of the other things that Oakton does is intercept the basic dotnet run command with extra options, so we can use the --check flag like so:

dotnet run -- --check

That command is going to:

  1. Bootstrap and start the IHost for your application
  2. Load and execute all the registered environment checks for your application
  3. Report the status of each environment check to the console
  4. Fail the executable if any environment checks fail

In my tiny sample app I’m building here, the output of that call is this:

C:\code\JasperSamples\EnvironmentChecks\WebApplication\WebApplication>dotnet run -- --check
Building...
   1.) Success: File 'appsettings.json' exists

Running Environment Checks ---------------------------------------- 100%

info: Microsoft.Hosting.Lifetime[0]
      Now listening on: https://localhost:5001
info: Microsoft.Hosting.Lifetime[0]
      Now listening on: http://localhost:5000
info: Microsoft.Hosting.Lifetime[0]
      Application started. Press Ctrl+C to shut down.
info: Microsoft.Hosting.Lifetime[0]
      Hosting environment: Development
info: Microsoft.Hosting.Lifetime[0]
      Content root path: C:\code\JasperSamples\EnvironmentChecks\WebApplication\WebApplication

At deployment time, that call is probably just [your application name] --check to start the application.

Environment Checks with Lamar

Now that we’ve got the basics for environment checks built into our new web service with Oakton, I want to introduce Lamar as the DI/IoC tool for the application. I’ll add a Nuget reference to Lamar.Microsoft.DependencyInjection to my project. First, to make Lamar the IoC container for my new service, I’ll add this line of code to the Program.CreateHostBuilder() method:

        public static IHostBuilder CreateHostBuilder(string[] args) =>
            Host.CreateDefaultBuilder(args)
                
                // Make Lamar the application container
                .UseLamar()
                .ConfigureWebHostDefaults(webBuilder =>
                {
                    webBuilder.UseStartup<Startup>();
                });

Now, I’m going to go a little farther and add a reference to the Lamar.Diagnostics Nuget as well. That adds some Lamar specific diagnostic commands to our command line options, but also allows us to add Lamar container checks at startup like this:

        public void ConfigureServices(IServiceCollection services)
        {
            // Literally just proving out that appsettings.json is available
            services.CheckThatFileExists("appsettings.json");
            
            // Do a full check of the Lamar configuration
            // and run Lamar environment checks too!
            services.CheckLamarConfiguration();
        }

Running our dotnet run -- check-env command again gives us:

   ___            _      _
  / _ \    __ _  | | __ | |_    ___    _ __
 | | | |  / _` | | |/ / | __|  / _ \  | '_ \
 | |_| | | (_| | |   <  | |_  | (_) | | | | |
  \___/   \__,_| |_|\_\  \__|  \___/  |_| |_|

   1.) Success: File 'appsettings.json' exists
   2.) Success: Lamar IoC Service Registrations
   3.) Success: Lamar IoC Type Scanning

Running Environment Checks ---------------------------------------- 100%

All environment checks are good!

The Lamar container checks are a little heavyweight, so watch out because you might want to dial them back. Those checks run through every single service registration in Lamar, verify that all the dependencies exist, and try to build every registration at least once.

Next though, let’s look at how Lamar lets us plug in its own form of environment checks. On any concrete class that is built by Lamar, you can directly embed environment checks with methods like this fake service:

    public class SometimesMisconfiguredService
    {
        private readonly IConfiguration _configuration;

        public SometimesMisconfiguredService(IConfiguration configuration)
        {
            _configuration = configuration;
        }

        [ValidationMethod]
        public void Validate() // The method name does not matter
        {
            var connectionString = _configuration.GetConnectionString("database");
            using var conn = new SqlConnection(connectionString);
            
            // Just try to connect to the configured database
            conn.Open();
        }
    }

The method name above doesn’t matter. All that Lamar needs to see is the [ValidationMethod] attribute. See Lamar’s Environment Tests for a little more information about using this feature. And I don’t really need to “do” anything other than throw an exception if the check logically fails.
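
Just to drive that point home, here’s one more hedged sketch of a validation method (the configuration key is made up, and the only contract is to throw if something is wrong):

    [ValidationMethod]
    public void AssertApiKeyIsConfigured()
    {
        // Throwing any exception is what fails the environment check
        if (string.IsNullOrEmpty(_configuration["ApiKey"]))
        {
            throw new InvalidOperationException("The ApiKey configuration value is missing");
        }
    }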

And I’ll register that service in the container in the Startup.ConfigureServices() method like so:

            services.AddSingleton<SometimesMisconfiguredService>();

Now, going back to the application, I’ll try to start it up with the environment check flag active — but I haven’t configured a database connection string or even attempted to spin up a database at all, so this should fail fast. And it does with a call to dotnet run -- --check:

   1.) Success: File 'appsettings.json' exists
   2.) Failed: Lamar IoC Service Registrations

ERROR: Lamar.IoC.ContainerValidationException: Error in WebApplication.SometimesMisconfiguredService.Validate()

AND A LOT OF STACK TRACE VERBIAGE FROM THE FAILURES

To get at just the Lamar container validation, you can also use dotnet run -- lamar-validate. IoC container tools are a dime a dozen in .Net, and many of them are perfectly competent. When I’m asked “why use Lamar?”, my stock answer is to use Lamar for its diagnostic capabilities.
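
Those diagnostics are also available programmatically through the container itself. Here’s a small sketch using MyRegistry as a stand in for your own registrations:

var container = new Container(new MyRegistry());

// Throws with a detailed report if any registration cannot be built
container.AssertConfigurationIsValid();

// Renders a textual report of every service registration
Console.WriteLine(container.WhatDoIHave());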

Integration Testing: IHost Lifecycle with xUnit.Net

I’m part of an initiative at work to analyze and ultimately improve our test automation practices. As part of that work, I’ll be blogging quite a bit about test automation starting with my brain dump on test automation last week and my most recent post on mocks and stubs last month. From here on out, I’m curating all of my posts and selected writings from other folks on my new Automated Testing page.

I’m already on record as saying that the generic host (IHost) in recent versions of .Net is one of the best things that’s ever happened to the .Net ecosystem. In my previous post I stated that I strongly prefer having the system under test running in process with the test harness for faster feedback cycles and easier debugging. The generic host builder introduced in .Net Core turns out to be a very effective way to bootstrap your system within automated test harnesses.

Before I dive into how to use the IHost in automated testing, here’s a couple issues I think you have to address in your integration testing strategy before we go willy nilly spinning up an IHost:

  • You ideally want to test against your code running in a realistic way, so the way code is bootstrapped and configured should be relatively close to how that code is started up in the real application.
  • There will inevitably need to be at least some configuration that needs to be different in testing or some services — usually accessing resources external to your system — that need to be replaced with stubs or some other kind of fake implementation.
  • It’s important to cleanly dispose or shutdown any IHost object you create in memory to avoid potential locks of resources like database connections, files, or ports. Failing to clean up resources in tests can easily make it harder to iterate through test fixes if you find yourself needing to manually kill processes or restart your IDE to release locked resources (been there, done that).
  • The IHost can be expensive to build up, and sometimes there’s going to be some serious benefit in reusing the IHost between tests to make the test suite run faster.
  • But the IHost is stateful, and there could easily be resources (singleton scoped services, databases, and whatnot) that could impact later test runs in the suite.

Before I jump into solutions, let’s assume that I have two projects:

  1. WebApplication is an ASP.Net Core web service project. WebApplication uses Lamar as its underlying DI container.
  2. A test project that references WebApplication

xUnit.Net Mechanics

I’m more comfortable with xUnit.Net, so I’m going to use that first. My typical usage is to share the IHost through xUnit.Net’s CollectionFixture mechanism (and if you think the usage of this thing is confusing, welcome to the club). First up, I’ll build out a new class I usually call AppFixture to manage the lifecycle of the IHost. The example project I’ve built here is an ASP.Net Core web service project, so I’m going to use Alba to wrap the host inside of AppFixture as shown below:

    public class AppFixture : IDisposable, IAsyncLifetime
    {
        public IAlbaHost Host { get; private set; }
        public async Task InitializeAsync()
        {
            // Program.CreateHostBuilder() is the code from the WebApplication
            // that configures the HostBuilder for the system
            Host = await Program
                .CreateHostBuilder(Array.Empty<string>())
                
                // This extension method starts up the underlying IHost,
                // but Alba replaces Kestrel with a TestServer and
                // wraps the IHost
                .StartAlbaAsync();
        }

        public Task DisposeAsync()
        {
            return Host.StopAsync();
        }

        public void Dispose()
        {
            Host?.Dispose();
        }
    }

A couple things to note in that code above:

  • As we’ll set up next, that class above will be constructed once in memory by xUnit and shared between test fixture classes
  • The Dispose() and DisposeAsync() methods both dispose the IHost. By normal .Net mechanics, that will also dispose the underlying Lamar IoC container, which will in turn dispose any services created by Lamar at runtime that implement IDisposable. Disposing the IHost also stops any registered IHostedService services that your application may be using for long running tasks (for my colleagues who may be reading this, both NServiceBus and MassTransit start and stop their message listeners in an IHostedService, so that might be in use even if you don’t explicitly use that technique).

Next, we’ll set up AppFixture to be shared between our integration test classes by using the [CollectionDefinition] attribute on a marker class:

    [CollectionDefinition("Integration")]
    public class AppFixtureCollection : ICollectionFixture<AppFixture>
    {
        
    }

Lastly, I like to build out a base class for integration tests like this one:

    [Collection("Integration")]
    public abstract class IntegrationContext
    {
        protected IntegrationContext(AppFixture fixture)
        {
            theHost = fixture.Host;
            
            // I am using Lamar as the underlying DI container
            // and want some Lamar specific things later on
            // in the tests
            Container = (IContainer)fixture.Host.Services;
        }

        public IAlbaHost theHost { get; }
        
        public IContainer Container { get; }
    }

The [Collection] attribute is meaningful here because that makes xUnit.Net run all the tests that are contained in test fixture classes that inherit from IntegrationContext in a single thread so we don’t have to worry about concurrent test runs.*

And finally to bring this all together, let’s say that WebApplication has this simplistic web service code to do some arithmetic:

    public class Result
    {
        public int Sum { get; set; }
        public int Product { get; set; }
    }

    public class Numbers
    {
        public int[] Values { get; set; }
    }
    
    public class ArithmeticController : ControllerBase
    {
        [HttpPost("/math")]
        public Result DoMath([FromBody] Numbers input)
        {
            var product = 1;
            foreach (var value in input.Values)
            {
                product *= value;
            }

            return new Result
            {
                Sum = input.Values.Sum(),
                Product = product
            };
        }
    }

In the next code block, let’s finally see a test fixture class that uses the new IntegrationContext as a base class and tests the HTTP endpoint shown in the block above.

    public class ArithmeticApiTests : IntegrationContext
    {
        public ArithmeticApiTests(AppFixture fixture) : base(fixture)
        {
        }

        [Fact]
        public async Task post_to_the_math_endpoint()
        {
            // Building the input body
            var input = new Numbers
            {
                Values = new[] {2, 3, 4}
            };

            var response = await theHost.Scenario(x =>
            {
                // Alba deals with Json serialization for us
                x.Post.Json(input).ToUrl("/math");
                
                // Enforce that the HTTP Status Code is 200 Ok
                x.StatusCodeShouldBeOk();
            });

            var output = response.ReadAsJson<Result>();
            output.Sum.ShouldBe(9);
            output.Product.ShouldBe(24);
        }
    }

Alright, at this point we’ve got a way to share the system’s IHost in tests for better efficiency, and we’re making sure that all the resources in the IHost are cleaned up when the test suite is done. We’re using the WebApplication’s exact configuration for the IHost, but we still might need to alter that in testing. And there’s also the issue of needing to roll back state in our system between tests. I’ll pick up those subjects in my next couple posts, as well as using NUnit instead of xUnit.Net because that’s what the majority of code at my work uses for testing.

* It would be nice to be able to run parallel tests using our shared IHost, but that can often be problematic because of shared state, so I generally bypass test parallelization in integrated tests. The subject of parallelizing integration tests is worthy of a later blog post of some thoughts I haven’t quite elucidated yet.
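
As an aside, if you do want to be explicit about bypassing parallelization across a whole test assembly, xUnit.Net supports an assembly-level attribute for exactly that:

// Goes in any .cs file within the test project
using Xunit;

[assembly: CollectionBehavior(DisableTestParallelization = true)]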

Lamar v6 is out! Expanded interception options, overriding test services, better documentation

Lamar is a modern IoC container library that is optimized for usage within .Net Core / .Net 5/6 applications and natively implements the .Net DI abstractions (AddTransient/Singleton/Scoped() etc). Lamar was specifically built to be a much higher performance version of the old StructureMap library with similar usage for relatively easy migration.
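
As a quick illustration of that point (the clock and widget types below are placeholders), Lamar’s ServiceRegistry implements IServiceCollection, so the standard .Net DI registrations and Lamar’s native syntax can be mixed freely:

using Lamar;
using Microsoft.Extensions.DependencyInjection;

var container = new Container(services =>
{
    // the standard .Net DI abstractions work as-is...
    services.AddSingleton<IClock, SystemClock>();

    // ...right next to Lamar's own registration API
    services.For<IWidget>().Use<RedWidget>();
});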

Lamar v6.0.0 went live today. It’s not a very large release at all, but I took this opportunity to tuck some previously public types into being internal to clean up your Intellisense and eliminated some obsolete public members. I don’t anticipate any changes in your code for the vast majority of Lamar users upgrading.

The highlights of this release are:

  • Lamar’s documentation website got a big facelift as it moved from an old Bootstrap theme with my old stdocs tool to using Vitepress and mdsnippets. I also did a sweep to remove old StructureMap nomenclature and replace that with the newer Lamar nomenclature that very purposely mimics the .Net DI abstraction terminology.
  • New functionality to reliably override service registrations at container creation time. This was specifically meant for automated integration testing where you may want to override specific service registrations. This hasn’t been a straightforward thing in the past because the generic host builder applies service registrations from Startup classes last no matter what, and that defeated many efforts to override services in testing (see the sketch after this list).
  • AaronAllBright contributed an important pull request for some missing activation features. That was expanded into a full blown interception and activation feature set to go along with the support for decorators in Lamar v1+. That’s been a frequent concern for folks looking to move off of StructureMap and one of the biggest remaining features in Lamar’s backlog, so mark that one off.
  • A couple bug fixes, but I think this is the only big one.
  • I stopped and modernized the devops a bit to replace the old Rake build with Bullseye & SimpleExec. I also moved the CI from AppVeyor to GitHub actions. I wouldn’t say that either move was a big deal, but it is nice having the CI information right in the GitHub website.
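
Here’s the sketch promised above for the new service override functionality. I’m writing this from memory of the Lamar documentation, so double check the exact method name there; IExternalService and its stub are hypothetical:

var host = Host.CreateDefaultBuilder()
    .UseLamar()

    // Wins over any registration made by the application itself,
    // including registrations applied from Startup classes
    .OverrideServices(s =>
    {
        s.For<IExternalService>().Use(new StubbedExternalService());
    })
    .Build();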

Dynamic Code Generation in Marten V4

Marten V4 extensively uses runtime code generation backed by Roslyn runtime compilation for dynamic code. This is much more powerful than source generators in terms of what it allows us to actually do, but it can have significant memory usage and “cold start” problems (this seems to depend on exact configurations, so it’s not a given that you’ll have these issues). In this post I’ll show the facility we have to “generate ahead” the code to dodge the memory and cold start issues at production time.

Before V4, Marten had used the common model of building up Expression trees and compiling them into lambda functions at runtime to “bake” some of our dynamic behavior into fast running code. That does work and a good percentage of the development tools you use every day probably use that technique internally, but I felt that we’d already outgrown the dynamic Expression generation approach as it was, and the new functionality requested in V4 was going to substantially raise the complexity of what we were going to need to do.
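
For context, that older technique looks something like this self-contained example of building an Expression tree at runtime and compiling it into a delegate:

using System;
using System.Linq.Expressions;

// Build the equivalent of x => x * 2 at runtime...
var x = Expression.Parameter(typeof(int), "x");
var body = Expression.Multiply(x, Expression.Constant(2));

// ...and compile it into a fast-running delegate
var doubler = Expression.Lambda<Func<int, int>>(body, x).Compile();

Console.WriteLine(doubler(21)); // prints 42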

Instead, I (Marten is a team effort, but I get all the blame for this one) opted to use the dynamic code generation and compilation approach using LamarCodeGeneration and LamarCompiler that I’d originally built for other projects. This allowed Marten to generate much more complex code than I thought was practical with other models (we could have used IL generation too of course, but that’s an exercise in masochism). If you’re interested, I gave a talk about these tools and the approach at NDC London 2019.
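
To give a rough idea of the LamarCompiler model, here’s a sketch from memory rather than a definitive API reference:

using System;
using LamarCompiler;

// Hand Roslyn a string of C# source code and get back an in-memory assembly
var generator = new AssemblyGenerator();
generator.ReferenceAssembly(typeof(Console).Assembly);

var assembly = generator.Generate(@"
public static class Greeter
{
    public static string Greet() => ""Hello from generated code"";
}");

var greeter = assembly.GetType("Greeter");
Console.WriteLine(greeter.GetMethod("Greet").Invoke(null, null));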

I do think this has worked out in terms of performance improvements at runtime and certainly helped to be able to introduce the new document and event store metadata features in V4, but there’s a catch. Actually two:

  1. The Roslyn compiler sucks down a lot of memory sometimes and doesn’t seem to ever release it. It’s gotten better with newer releases and it’s not consistent, but still.
  2. There’s a sometimes significant lag in cold start scenarios on the first time Marten needs to generate and compile code at runtime

What we could do though, is provide what I call…

Marten’s “Generate Ahead” Strategy

To side step the problems with the Roslyn compilation, I developed a model (I did this originally in Jasper) to generate the code ahead of time and have it compiled into the entry assembly for the system. The last step is to direct Marten to use the pre-compiled types instead of generating the types at runtime.

Jumping straight into a sample console project to show off this functionality, I’m configuring Marten with the AddMarten() method you can see in this code on GitHub.

The important line of code you need to focus on here is this flag:

opts.GeneratedCodeMode = TypeLoadMode.LoadFromPreBuiltAssembly;

This flag in the Marten configuration directs Marten to first look in the entry assembly of the application for any types that it would normally try to generate at runtime, and if that type exists, load it from the entry assembly and bypass any invocation of Roslyn. I think in a real application you’d wrap that call in something like this so that it only applies when the application is running in production mode:

if (Environment.IsProduction())
{
    options.GeneratedCodeMode = 
        TypeLoadMode.LoadFromPreBuiltAssembly;
}

The next thing to notice is that I have to tell Marten ahead of time what the possible document types are, and even about any compiled query types, so that Marten will “know” what code to generate in the next section. The compiled query registration is new, but you already had to let Marten know about the document types to make the schema migration functionality work anyway.
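
That pre-registration looks roughly like this sketch, where User and Order are hypothetical document types and RegisterDocumentType() is the Marten hook for exactly this purpose:

services.AddMarten(opts =>
{
    opts.Connection(connectionString);

    // Tell Marten up front about every document type so that
    // the codegen command knows what code to generate
    opts.RegisterDocumentType<User>();
    opts.RegisterDocumentType<Order>();

    // compiled query types can be registered in a similar fashion;
    // see the Marten documentation for the exact mechanism
});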

Generating and exporting the code ahead of time is done from the command line through an Oakton command. First though, add the LamarCodeGeneration.Commands Nuget to your entry project, which will also add a transitive reference to Oakton if you’re not already using it. This is all described in the Oakton getting started page, but you’ll need to change your Program.Main() method slightly to activate Oakton:

        // The return value needs to be Task<int> 
        // to communicate command success or failure
        public static Task<int> Main(string[] args)
        {
            return CreateHostBuilder(args)
                
                // This makes Oakton be your CLI
                // runner
                .RunOaktonCommands(args);
        }

If you’ll open up the command terminal of your preference at the root directory of the entry project, type this command to see the available Oakton commands:

dotnet run -- help

That’ll spit out a list of commands and the assemblies where Oakton looked for command types. You should see output similar to this:

Searching 'LamarCodeGeneration.Commands, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null' for commands
Searching 'Marten.CommandLine, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null' for commands

  -------------------------------------------------
    Available commands:
  -------------------------------------------------
        check-env -> Execute all environment checks against the application
          codegen -> Utilities for working with LamarCodeGeneration and LamarCompiler

and assuming you got that far, now type dotnet run -- help codegen to see the list of options with the codegen command.

If you just want to preview the generated code in the console, type:

dotnet run -- codegen preview

To just verify that the dynamic code can be successfully generated and compiled, use:

dotnet run -- codegen test

To actually export the generated code, use:

dotnet run -- codegen write

That command will write a single C# file at /Internal/Generated/DocumentStorage.cs for any document storage types and another at /Internal/Generated/Events.cs for the event storage and projections.

Just as a short cut to clear out any old code, you can use:

dotnet run -- codegen delete

If you’re curious, the generated code — and remember that it’s generated code so it’s going to be butt ugly — is going to look like this for the document storage, and this code for the event storage and projections.

The way that I see this being used is something like this:

  • The LoadFromPreBuiltAssembly option is only turned on in Production mode so that developers can iterate at will during development.
  • As part of the CI/CD process for a project, you’ll run the dotnet run -- codegen write command as an initial step, then proceed to the normal compilation and testing cycles. That will bake the generated code right into the compiled assemblies and enable you to also take advantage of any kind of AOT compiler optimizations

Duh. Why not source generators, doofus?

Why didn’t we use source generators you may ask? The Roslyn-based approach in Marten is both much better and much worse than source generators. Source generators are part of the compilation process itself and wouldn’t have any kind of cold start problem like Marten has with the runtime Roslyn compilation approach. That part is obviously much better, plus there’s no weird two step compilation process at CI/CD time. But on the downside, source generators can only use information that’s available at compilation time, and the Marten code generation relies very heavily on type reflection, conventions applied at runtime, and configuration built up through a fluent interface API. I do not believe that we could use source generators for what we’ve done in Marten because of that dependency on runtime information.

Using Alba to Test ASP.Net Services

TL;DR: Alba is a handy library for integration tests against ASP.Net web services, and the rapid feedback that enables is very useful

One of our projects at Calavista right now is helping a client modernize and optimize a large .Net application, with the end goal being everything running on .Net 5 and an order of magnitude improvement in system throughput. As part of the effort to upgrade the web services, I took on a task to replace this system’s usage of IdentityServer3 with IdentityServer4, but still use the existing Marten-backed data storage for user membership information.

Great, but there’s just one problem. I’ve never used IdentityServer4 before and it changed somewhat between the IdentityServer3 code I was trying to reverse engineer and its current model. I ended up getting through that work just fine. A key element of doing that was using the Alba library to create a test harness so I could iterate through configuration changes quickly by rerunning tests on the new IdentityServer4 project. It didn’t start out this way, but Alba is essentially a wrapper around the ASP.Net TestServer and just acts as a utility to make it easier to write automated tests around the HTTP services in your web service projects.

I ended up starting two new .Net projects:

  1. A new web service that hosts IdentityServer4 and is configured to use user membership information from our client’s existing Marten/Postgresql database
  2. A new xUnit.Net project to hold integration tests against the new IdentityServer4 web service.

Let’s dive right into how I set up Alba and xUnit.Net as an automated test harness for our new IdentityServer4 service. If you start a new ASP.Net project with one of the built in project templates, you’ll get a Program file that’s the main entry point for the application and a Startup class that has most of the system’s bootstrapping configuration. The templates will generate this method that’s used to configure the IHostBuilder for the application:

public static IHostBuilder CreateHostBuilder(string[] args)
{
    return Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.UseStartup<Startup>();
        });
}

For more information on what the role of the IHostBuilder is within your application, see .NET Generic Host in ASP.NET Core.

That’s important, because that gives us the ability to stand up the application exactly as it’s configured in an automated test harness. Switching to the new xUnit.Net test project, I referenced my new web service project that will host IdentityServer4. Because spinning up your ASP.Net system can be relatively expensive, I only want to do that once and share the IHost between tests. That’s a perfect usage for xUnit.Net’s shared context support.

First I make what will be the shared test fixture context class for the integration tests shown below:

public class AppFixture : IDisposable
{
    public AppFixture()
    {
        // This is calling into the actual application's
        // configuration to stand up the application in 
        // memory
        var builder = Program.CreateHostBuilder(new string[0]);

        // This is the Alba "SystemUnderTest" wrapper
        System = new SystemUnderTest(builder);
    }
    
    public SystemUnderTest System { get; }

    public void Dispose()
    {
        System?.Dispose();
    }
}

The Alba SystemUnderTest wrapper is responsible for building the actual IHost object for your system, and does so using the in memory TestServer in place of Kestrel.

Just as a convenience, I like to create a base class for integration tests I tend to call IntegrationContext:

    public abstract class IntegrationContext 
        // This is some xUnit.Net mechanics
        : IClassFixture<AppFixture>
    {
        public IntegrationContext(AppFixture fixture)
        {
            System = fixture.System;
            
            // Just casting the IServiceProvider of the underlying
            // IHost to a Lamar IContainer as a convenience
            Container = (IContainer)fixture.System.Services;
        }
        
        public IContainer Container { get; }
        
        // This gives the tests access to run
        // Alba "Scenarios"
        public ISystemUnderTest System { get; }
    }

We’re using Lamar as the underlying IoC container in this application, and I wanted to use Lamar-specific IoC diagnostics in the tests, so I expose the main Lamar container off of the base class as just a convenience.

To finally turn to the tests, the very first thing to try with IdentityServer4 was just to hit the descriptive discovery endpoint just to see if the application was bootstrapping correctly and IdentityServer4 was functional *at all*. I started a new test class with this declaration:

    public class EndToEndTests : IntegrationContext
    {
        private readonly ITestOutputHelper _output;
        private IDocumentStore theStore;

        public EndToEndTests(AppFixture fixture, ITestOutputHelper output) : base(fixture)
        {
            _output = output;

            // I'm grabbing the Marten document store for the app
            // to set up user information later
            theStore = Container.GetInstance<IDocumentStore>();
        }

And then a new test just to exercise the discovery endpoint:

[Fact]
public async Task can_get_the_discovery_document()
{
    var result = await System.Scenario(x =>
    {
        x.Get.Url("/.well-known/openid-configuration");
        x.StatusCodeShouldBeOk();
    });
    
    // Just checking out what's returned
    _output.WriteLine(result.ResponseBody.ReadAsText());
}

The test above is pretty crude. All it does is try to hit the `/.well-known/openid-configuration` url in the application and see that it returns a 200 OK HTTP status code.

I tend to run tests while I’m coding by using keyboard shortcuts. Most IDEs support some kind of “re-run the last test” keyboard shortcut. Using that, my preferred workflow is to run the test once, then assuming that the test is failing the first time, work in a tight cycle of making changes and constantly re-running the test(s). This turned out to be invaluable as it took me a couple iterations of code changes to correctly re-create the old IdentityServer3 configuration into the new IdentityServer4 configuration.

Moving on to doing a simple authentication, I wrote a test like this one to exercise the system with known credentials:

[Fact]
public async Task get_a_token()
{
    // All I'm doing here is wiping out the existing database state
    // with Marten and using a little helper method to add a new
    // user to the database as part of the test setup
    // This would obviously be different in your system
    theStore.Advanced.Clean.DeleteAllDocuments();
    await theStore
        .CreateUser("aquaman@justiceleague.com", "dolphin", "System Administrator");

    var body =
        "client_id=something&client_secret=something&grant_type=password&scope=ourscope%20profile%20openid&username=aquaman@justiceleague.com&password=dolphin";

    // The test would fail here if the status
    // code was NOT 200
    var result = await System.Scenario(x =>
    {
        x.Post
            .Text(body)
            .ContentType("application/x-www-form-urlencoded")
            .ToUrl("/connect/token");
    });

    // As a convenience, I made a new class called ConnectData
    // just to deserialize the IdentityServer4 response into 
    // a .Net object
    var token = result.ResponseBody.ReadAsJson<ConnectData>();
    token.access_token.Should().NotBeNull();
    token.token_type.Should().Be("Bearer");
}

public class ConnectData
{
    public string access_token { get; set; }
    public int expires_in { get; set; }
    public string token_type { get; set; }
    public string scope { get; set; }
}

Now, this test took me several iterations to work through: I had to find exactly the right way to configure IdentityServer4 and adjust our custom Marten-backed identity store (IResourceOwnerPasswordValidator and IProfileService in IdentityServer4 terms) until the tests passed. I found it extremely valuable to be able to debug right into the failing tests as I worked, and I even needed to take advantage of JetBrains Rider’s ability to debug through external code to understand how IdentityServer4 itself worked. I’m very sure I got through this work much faster by iterating through tests than I would have by running the application and driving it through something like Postman or the connected user interface.

Oakton v3 supercharges the .Net Core/5 command line, and helps Lamar deliver uniquely useful IoC diagnostics

With the Great Texas Ice and Snow Storm of 2021 wreaking havoc with basically everything last week, I switched to easier work on Oakton and Lamar in the windows of time when we had power. The result of that is the planned v3 release, which merged the former Oakton.AspNetCore library into Oakton itself and took a dependency on Spectre.Console to make all the command output a whole lot more usable. As a logical follow up to the Oakton improvements, I mostly rewrote the Lamar.Diagnostics package that uses Oakton to add Lamar-related troubleshooting commands to your application, using Spectre.Console to make the rendering of the Lamar configuration much more readable. I wanted the improved Lamar diagnostics for an ongoing Calavista client project, and they’ll be in heavy use this very week.

Late last week I released version 3.0.1 of Oakton, a library for building rich command line functionality into .Net Core / .Net 5.0 applications. While competent command line parsing tools are a dime a dozen in the .Net space by this point, Oakton stands out by integrating directly into the generic HostBuilder and extending the command line operations of your currently configured .Net system (Oakton can still be used without HostBuilder as well, see Using the CommandExecutor for more information).

Jumping right into the integration effort, let’s assume that you have a .Net Core 3.1 or .Net 5 application generated by one of the built in project templates. If so, your application’s entry point probably looks like this Program.Main() method:

public static void Main(string[] args)
{
    CreateHostBuilder(args).Build().Run();
}

To integrate Oakton commands, add a reference to the Oakton Nuget, then change Program.Main() to this:

public static Task<int> Main(string[] args)
{
    return CreateHostBuilder(args)
        .RunOaktonCommands(args);
}

It’s important to change the return type to Task<int> because that’s used to communicate exit codes to the operating system to denote success or failure for commands that do some kind of system verification.

When you use dotnet run, you can pass arguments or flags to the dotnet run command itself, and also arguments and flags to your actual application. The way that works is that arguments are delimited by a double dash, so that in this usage: dotnet run --framework net5.0 -- check-env, the --framework flag is passed to dotnet run itself to modify its behavior, and the check-env argument is passed to your application.

Out of the box, Oakton adds a couple named commands to your application along with a default “run my application” command. You can access the built in Oakton command line help with dotnet run -- help. The output of that is shown below:

   ---------------------------------------------------------------------
     Available commands:
   ---------------------------------------------------------------------
     check-env -> Execute all environment checks against the application
      describe -> Writes out a description of your running application to either the console or a file
          help -> list all the available commands
           run -> Start and run this .Net application
   --------------------------------------------------------------------- 

Slightly Enhanced “Run”

Once Oakton is in place, your basic dotnet run command has a couple new flag options.

  1. dotnet run still starts up your application just like it did before you applied Oakton
  2. dotnet run -- -e Staging uses the -e flag to start the application with the host environment name overridden to the value of the flag. In this example, the application starts as “Staging.”
  3. dotnet run -- --check or dotnet run -- -c will run your application, but first perform any registered environment checks before starting the full application. More on this below.

Environment Checks

Oakton has a simple mechanism to embed environment checks directly into your application. The point of environment checks is to let your .Net application verify its own configuration during deployment so that it can “fail fast” if external dependencies like databases are unavailable, required files are missing, or necessary configuration items are invalid.

To use environment checks in your application with Oakton, just run this command:

dotnet run -- check-env

The command will fail if any of the environment checks fail, and you’ll get a report of every failure.
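
And just to make the registration side concrete, here’s a rough sketch of embedding a few checks in your service registrations. Fair warning: the CheckFileExists(), CheckServiceIsRegistered<T>(), and CheckEnvironment() helpers shown here are approximations of Oakton’s registration API, and the connection string key is purely illustrative, so check the Oakton documentation for the exact signatures:

// Somewhere in your service registrations, e.g. Startup.ConfigureServices()
public void ConfigureServices(IServiceCollection services)
{
    // check-env fails if this file is missing at runtime
    services.CheckFileExists("appsettings.json");

    // check-env fails if nothing is registered for IDocumentStore
    services.CheckServiceIsRegistered<IDocumentStore>();

    // A completely custom check; any exception thrown in the
    // lambda is reported as an environment check failure
    services.CheckEnvironment("The 'main' connection string is configured", s =>
    {
        var configuration = s.GetRequiredService<IConfiguration>();
        if (string.IsNullOrEmpty(configuration.GetConnectionString("main")))
        {
            throw new Exception("No 'main' connection string in configuration");
        }
    });
}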

I wrote about this a couple years ago, and Oakton v3 just makes the environment check output a little prettier. If you’re using Lamar as your application’s IoC container, the Lamar.Diagnostics package adds some easier recipes for exposing environment checks in your code in combination with Oakton’s check-env command. I’ll show that later in this post.

The environment check functionality can be absolutely invaluable when your application is being deployed to an environment that isn’t perfectly reliable or has dependencies outside the control of your team. Ask me how I know this to be the case…

Describing the Application

A couple of the OSS projects I support are very sensitive to application configuration, and it’s sometimes challenging to get the right information from users who are having problems with my OSS libraries. As a possible way to ameliorate that issue, there’s Oakton’s extensible describe command:

dotnet run -- describe

Out of the box it just lists basic information about the application and referenced assemblies, but it comes with a simple extensibility mechanism I’m hoping to exploit for embedding Jasper, Marten, and Lamar diagnostics in applications.
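
Just to sketch out that extensibility model, a custom part might look something like this. The IDescribedSystemPart contract shown here is approximate, so verify the exact shape against the Oakton source before relying on it:

// A hypothetical custom part for the "describe" command
public class RouteTableDescription : IDescribedSystemPart
{
    // The section header in the "describe" output
    public string Title => "Application Routes";

    public Task Write(TextWriter writer)
    {
        // Write whatever diagnostic text you want to show up
        // under "dotnet run -- describe"
        return writer.WriteLineAsync("...a preview of the application's routes...");
    }
}

// Register the part like any other service so Oakton can find it:
// services.AddSingleton<IDescribedSystemPart, RouteTableDescription>();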

Here’s the sample output from a simplistic application built with the worker template:

── About WorkerService ─────────────────────────────────────────
          Entry Assembly: WorkerService
                 Version: 1.0.0.0
        Application Name: WorkerService
             Environment: Development
       Content Root Path: the content root folder
AppContext.BaseDirectory: the application's base directory


── Referenced Assemblies ───────────────────────────────────────
┌───────────────────────────────────────────────────────┬─────────┐
│ Assembly Name                                         │ Version │
├───────────────────────────────────────────────────────┼─────────┤
│ System.Runtime                                        │ 5.0.0.0 │
│ Microsoft.Extensions.Configuration.UserSecrets        │ 5.0.0.0 │
│ Microsoft.Extensions.Hosting.Abstractions             │ 5.0.0.0 │
│ Microsoft.Extensions.DependencyInjection.Abstractions │ 5.0.0.0 │
│ Microsoft.Extensions.Logging.Abstractions             │ 5.0.0.0 │
│ Oakton                                                │ 3.0.2.0 │
│ Microsoft.Extensions.Hosting                          │ 5.0.0.0 │
└───────────────────────────────────────────────────────┴─────────┘

Lamar Diagnostics

As an add-on to Oakton, I updated the Lamar.Diagnostics package that uses Oakton to add Lamar diagnostics to a .Net application. If this package is added to a .Net project that uses Oakton for command line parsing and Lamar as its IoC container, you’ll see new commands light up from dotnet run -- help:

    lamar-scanning -> Runs Lamar's type scanning diagnostics
    lamar-services -> List all the registered Lamar services
    lamar-validate -> Runs all the Lamar container validations

The lamar-scanning command just gives you access to Lamar’s type scanning diagnostics report (something my team & I needed post haste for an ongoing Calavista client project).
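
If you’d rather get at the same type scanning information programmatically, Lamar exposes it through the WhatDidIScan() method on the container, which I’d assume is what the command is surfacing under the covers:

// "container" is your application's Lamar IContainer
var report = container.WhatDidIScan();
Console.WriteLine(report);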

The lamar-validate command runs the AssertConfigurationIsValid() method on your application’s configured container to validate that every service registration is valid and can be built, and it also runs Lamar’s own built in environment checks. You can read more about what this method does in the Lamar documentation.
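
That same container validation can be valuable inside an integration test as well. Here’s a minimal sketch, assuming you can get at the application’s Lamar container in a test through something like the IntegrationContext base class shown earlier in this document:

[Fact]
public void container_configuration_is_valid()
{
    // Throws an exception with a full report of every
    // registration that cannot be resolved
    Container.AssertConfigurationIsValid();
}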

You can also add the Lamar container validation step into the Oakton environment check system with this service registration:

// This adds a Container validation
// to the Oakton "check-env" command
// "services" would be an IServiceCollection object
// either in StartUp.ConfigureServices() or in a Lamar
// ServiceRegistry
services.CheckLamarConfiguration();

As I said earlier, Lamar still supports a very old StructureMap feature to embed environment checks right into your application’s services with the [ValidationMethod] attribute. Let’s say that you have a class in your system called DatabaseUsingService that requires a valid connection string from configuration, and DatabaseUsingService is registered in your Lamar container. You could add a method on that class just to check if the database is reachable like the Validate() method below:

    public class DatabaseUsingService
    {
        private readonly DatabaseSettings _settings;

        public DatabaseUsingService(DatabaseSettings settings)
        {
            _settings = settings;
        }

        [ValidationMethod]
        public void Validate()
        {
            // For now, Lamar requires validation methods to be synchronous
            using (var conn = new SqlConnection(_settings.ConnectionString))
            {
                // If this blows up, the environment check fails :)
                conn.Open();
            }
        }
    }

When Lamar validates the container, it finds the methods marked with [ValidationMethod], pulls each declaring service from the configured container, calls those methods, and records any exceptions.

Lamar Services Preview

The big addition in the latest Lamar.Diagnostics package is a greatly improved rendering of the report about the registered services compared to the old, built in WhatDoIHave() textual report.

Starting from a pretty basic ASP.Net Core application, running the dotnet run -- lamar-services command renders a nicely formatted report of every service registration in the container.

If you want to dig into the dependency tree and see exactly how Lamar is going to build a certain registration, you can use the --build-plans flag: dotnet run -- lamar-services --build-plans renders the underlying build plan for each registration.

Hat tip to Spectre.Console for making the layout a lot prettier and easier to read.

Where does Oakton go next?

I’m somewhat hopeful that other folks will pick up Oakton’s command line model and add more utilities, application describer parts, and environment checks in either Oakton itself or in extension libraries. For my part, Marten v4’s command line support will use Oakton to add schema management and event store utilities directly to your application. I think it’d also be helpful to have an ASP.Net-specific set of application describer parts to at least preview routes.