
Environment Checks and Better Command Line Abilities for your .Net Core Application

Oakton.AspNetCore is a new package built on top of the Oakton 2.0+ command line parser that adds extra functionality to the command line execution of ASP.Net Core and .Net Core 3.0 codebases. At the bottom of this blog post is a small section showing you how to set up Oakton.AspNetCore to run commands in your .Net Core application.

First though, you need to understand that when you use the dotnet run command to build and execute your ASP.Net Core application, you can pass arguments and flags both to dotnet run itself and to your application through the string[] args argument of Program.Main(). These two types of arguments or flags are separated by a double dash, like this example: dotnet run --framework netcoreapp2.0 -- ?. In this case, “--framework netcoreapp2.0” is used by dotnet run itself, and the values to the right of the “--” are passed into your application as the args array.
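To see that split in action, here's a minimal sketch (an illustrative console entry point, not anything from Oakton) that just echoes whatever lands in args:

using System;

public class Program
{
    public static void Main(string[] args)
    {
        // Running `dotnet run --framework netcoreapp2.0 -- ?` prints only "?",
        // because everything to the left of the "--" was consumed by dotnet run
        foreach (var arg in args)
        {
            Console.WriteLine(arg);
        }
    }
}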

With that out of the way, let’s see what Oakton.AspNetCore brings to the table.

Extended “Run” Options

In the default ASP.Net Core templates, your application can be started with all its defaults by using dotnet run.  Oakton.AspNetCore retains that usage, but adds some new abilities with its “Run” command. To check the syntax options, type dotnet run -- ? run:

 Usages for 'run' (Runs the configured AspNetCore application)
 run [-c, --check] [-e, --environment <environment>] [-v, --verbose] [-l, --log-level <loglevel>] [--config:<prop> <value>]

  ---------------------------------------------------------------------------------------------------------------------------------------
    Flags
  ---------------------------------------------------------------------------------------------------------------------------------------
                        [-c, --check] -> Run the environment checks before starting the host
    [-e, --environment <environment>] -> Use to override the ASP.Net Environment name
                      [-v, --verbose] -> Write out much more information at startup and enables console logging
         [-l, --log-level <loglevel>] -> Override the log level
            [--config:<prop> <value>] -> Overwrite individual configuration items
  ---------------------------------------------------------------------------------------------------------------------------------------

To run your application under a different hosting environment name, use the environment flag like so:

dotnet run -- --environment Testing

or

dotnet run -- -e Testing

To overwrite configuration key/value pairs, you’ve also got this option:

dotnet run -- --config:key1 value1 --config:key2 value2

which will overwrite the configuration keys for “key1” and “key2” to “value1” and “value2” respectively.
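As a quick illustration of what that override does, here's a minimal sketch (assuming the standard IConfiguration injection into Startup; “key1” is just the sample key from above) of reading the overridden value:

using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    private readonly IConfiguration _configuration;

    public Startup(IConfiguration configuration)
    {
        _configuration = configuration;
    }

    public void ConfigureServices(IServiceCollection services)
    {
        // After `dotnet run -- --config:key1 value1`, this resolves to "value1"
        var key1 = _configuration["key1"];
    }
}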

Lastly, you can run any configured environment checks for your application immediately before starting it by using this flag:

dotnet run -- --check

More on this function in the next section.

 

Environment Checks

I’m a huge fan of building environment tests directly into your application. Environment tests allow your application to self-diagnose issues with deployment, configuration, or environmental dependencies upfront that would impact its ability to run.

As a very real world example, let’s say your ASP.Net Core application needs to access another web service that’s managed independently by other teams and maybe, just maybe your testers have occasionally tried to test your application when:

  • Your application configuration has the wrong Url for the other web service
  • The other web service isn’t running at all
  • There’s some kind of authentication issue between your application and the other web service

In the real world project that spawned the example above, we added a formal environment check that would try to touch the health check endpoint of the external web service and throw an exception if we couldn’t connect to the external system. The next step was to execute our application as it was configured and deployed with this environment check as part of our Continuous Deployment pipeline. If the environment check failed, the deployment itself failed and triggered off the normal set of failure alerts letting us know to go fix the environment rather than letting our testers waste time on a bad deployment.

With all that said, let’s look at what Oakton.AspNetCore does here to help you add environment checks. Let’s say your application uses a single Sql Server database, and the connection string should be configured in the “connectionString” key of your application’s configuration. You would probably want an environment check just to verify at a minimum that you can successfully connect to your database as it’s configured.

In your ASP.Net Core Startup class, you could add a new service registration for an environment check like this example:

// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
    // Other registrations we don't care about...
    
    // This extension method is in Oakton.AspNetCore
    services.CheckEnvironment<IConfiguration>("Can connect to the application database", config =>
    {
        var connectionString = config["connectionString"];
        using (var conn = new SqlConnection(connectionString))
        {
            // Just attempt to open the connection. If there's anything
            // wrong here, it's going to throw an exception
            conn.Open();
        }
    });
}

Now, during deployments or even just pulling down the code to run locally, we can run the environment checks on our application like so:

dotnet run -- check-env

Which, in the case of our application above, blows up with output like this because I didn’t add any configuration for the database in the first place:

Running Environment Checks
   1.) Failed: Can connect to the application database
System.InvalidOperationException: The ConnectionString property has not been initialized.
   at System.Data.SqlClient.SqlConnection.PermissionDemand()
   at System.Data.SqlClient.SqlConnectionFactory.PermissionDemand(DbConnection outerConnection)
   at System.Data.ProviderBase.DbConnectionInternal.TryOpenConnectionInternal(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
   at System.Data.ProviderBase.DbConnectionClosed.TryOpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
   at System.Data.SqlClient.SqlConnection.TryOpen(TaskCompletionSource`1 retry)
   at System.Data.SqlClient.SqlConnection.Open()
   at MvcApp.Startup.<>c.<ConfigureServices>b__4_0(IConfiguration config) in /Users/jeremydmiller/code/oakton/src/MvcApp/Startup.cs:line 41
   at Oakton.AspNetCore.Environment.EnvironmentCheckExtensions.<>c__DisplayClass2_0`1.<CheckEnvironment>b__0(IServiceProvider s, CancellationToken c) in /Users/jeremydmiller/code/oakton/src/Oakton.AspNetCore/Environment/EnvironmentCheckExtensions.cs:line 53
   at Oakton.AspNetCore.Environment.LambdaCheck.Assert(IServiceProvider services, CancellationToken cancellation) in /Users/jeremydmiller/code/oakton/src/Oakton.AspNetCore/Environment/LambdaCheck.cs:line 19
   at Oakton.AspNetCore.Environment.EnvironmentChecker.ExecuteAllEnvironmentChecks(IServiceProvider services, CancellationToken token) in /Users/jeremydmiller/code/oakton/src/Oakton.AspNetCore/Environment/EnvironmentChecker.cs:line 31

If you run this command from your continuous deployment scripts, it will cause the build to fail whenever it detects environment problems.

In some of Calavista’s current projects, we’ve been adding environment tests to our applications for items like the following (a sketch of a couple of these follows the list):

  • Can our application read certain configured directories?
  • Can our application as it’s configured connect to databases?
  • Can our application reach other web services?
  • Are required configuration items specified? That’s been an issue as we’ve had to build out Continuous Deployment pipelines to many, many different server environments
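Here's a rough sketch of what a couple of those checks could look like with the same CheckEnvironment() extension method shown earlier. The configuration keys (“uploadPath” and “serviceUrl”) are hypothetical stand-ins:

public void ConfigureServices(IServiceCollection services)
{
    // Can our application read a configured directory?
    services.CheckEnvironment<IConfiguration>("Can read the upload directory", config =>
    {
        // Throws if the directory is missing or inaccessible
        Directory.GetFiles(config["uploadPath"]);
    });

    // Are required configuration items specified?
    services.CheckEnvironment<IConfiguration>("The 'serviceUrl' setting is configured", config =>
    {
        if (string.IsNullOrEmpty(config["serviceUrl"]))
        {
            throw new Exception("Missing configuration for 'serviceUrl'");
        }
    });
}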

I don’t see the idea of “Environment Tests” mentioned very often, and it might have other names I’m not aware of. I learned about the idea back in the Extreme Programming days from a blog post from Nat Pryce that I can’t find any longer, but there’s this paper from those days too.

 

Add Other Commands

I’ve frequently worked in projects where we’ve built parallel console applications that reproduce a lot of the same IoC and configuration setup to perform administrative tasks or add other diagnostics. It could be things like adding users, rebuilding an event store projection, executing database migrations, or loading some kind of data into the application’s database. What if instead, you could just add these directly to your .Net Core application as additional dotnet run -- [command] options? Fortunately, Oakton.AspNetCore lets you do exactly that, and even allows you to package up reusable commands in other assemblies that could be distributed via Nuget.

If you use Lamar as your IoC container in an ASP.Net Core application (or a .Net Core 3.0 console app using the new unified HostBuilder), we now have an add-on Nuget called Lamar.Diagnostics that adds new Oakton commands giving you access to Lamar’s diagnostic tools from the command line. As an example, this library adds a command to write out the “WhatDoIHave()” report for the underlying Lamar IoC container of your application to the command line or a file like this:

dotnet run -- lamar-services

Now, using the command above as an example, to build or add your own commands, start by decorating the assembly containing the command classes with this attribute:

[assembly:OaktonCommandAssembly]

Having this attribute tells Oakton.AspNetCore to search that assembly for additional Oakton commands. There is no other setup necessary.

If your command needs to use the application’s services or configuration, have the Oakton input type inherit from the NetCoreInput type in Oakton.AspNetCore like so:

public class LamarServicesInput : NetCoreInput
{
    // Lots of other flags
}

Next, the new command for “lamar-services” is just this:

[Description("List all the registered Lamar services", Name = "lamar-services")]
public class LamarServicesCommand : OaktonCommand<LamarServicesInput>
{
    public override bool Execute(LamarServicesInput input)
    {
        // BuildHost() will return an IHost for your application
        // if you're using .Net Core 3.0, or IWebHost for
        // ASP.Net Core 2.*
        using (var host = input.BuildHost())
        {
            // The actual execution using host.Services
            // to get at the underlying Lamar Container
        }

        return true;
    }
}
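Just for illustration, the elided body might look something like this sketch (not the actual Lamar.Diagnostics code; it assumes Lamar is the application's container, so the root IServiceProvider can be cast to Lamar's IContainer):

using (var host = input.BuildHost())
{
    // With Lamar as the underlying container, the root service
    // provider is Lamar's own IContainer
    var container = (IContainer)host.Services;

    // WhatDoIHave() builds the textual report of all known registrations
    Console.WriteLine(container.WhatDoIHave());
}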

Getting Started

In both cases I’m assuming that you’ve bootstrapped your application with one of the standard project templates like dotnet new webapi or dotnet new mvc. First, add a reference to the Oakton.AspNetCore Nuget. Next, break into the Program.Main() entry point method in your project and modify it like the following samples.

If you’re absolutely cutting edge and using ASP.Net Core 3.0:

public class Program
{
    public static Task<int> Main(string[] args)
    {
        return CreateHostBuilder(args)
            
            // This extension method replaces the calls to
            // IWebHost.Build() and Start()
            .RunOaktonCommands(args);
    }

    public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureWebHostDefaults(x => x.UseStartup<Startup>());
    
}

For what I would guess is most folks, here’s the ASP.Net Core 2.* setup (and this works for ASP.Net Core 3.0 as well):

public class Program
{
    public static Task<int> Main(string[] args)
    {
        return CreateWebHostBuilder(args)
            
            // This extension method replaces the calls to
            // IWebHost.Build() and Start()
            .RunOaktonCommands(args);
    }

    public static IWebHostBuilder CreateWebHostBuilder(string[] args) =>
        WebHost.CreateDefaultBuilder(args)
            .UseStartup<Startup>();
    
}

The two changes from the template defaults are to:

  1. Change the return value to Task<int>
  2. Replace the calls to Build() and Run() with the RunOaktonCommands(args) extension method, which hangs off IWebHostBuilder, or off the new unified IHostBuilder if you’re targeting netcoreapp3.0.

And that’s it, you’re off to the races.

 


Alba 3.1 supercharges your ASP.Net Core HTTP Contract Testing

I was just able to push a Nuget for Alba 3.1 that adds support for ASP.Net Core 3.0 and updated the documentation website to reflect the mild additions. Big thanks are in order to Lauri Kotilainen and Jonathan Mezach for making this release happen.

If you’re not familiar with Alba, in its current incarnation it’s a wrapper around the ASP.Net Core TestServer that adds a whole lot of useful helpers to make testing ASP.Net Core HTTP services much more declarative and easier than it is with TestServer alone.

Alba is descended from some built in support for HTTP contract testing in FubuMVC that I salvaged and adapted for usage with ASP.Net Core a few years back. Finally, Alba has used TestServer rather than its own homegrown HTTP runner since the big 3.0 release this January.

Lamar and Oakton join the .Net Core 3.0 Party

Like many other .Net OSS authors, I’ve been putting in some time this week making sure that various .Net tools support the brand spanking new .Net Core and ASP.Net Core 3.0 bits that were just released. First up are Oakton and Lamar, with the rest of the SW Missouri projects to follow soon.

Oakton

Oakton is yet another command line parser tool for .Net. The main Oakton 2.0.1 library adds support for the Netstandard2.1 target, but does not change in any other way. The Oakton.AspNetCore library got a full 2.0.0 release. If you’re using ASP.Net Core v2.0, your usage is unchanged. However, if you are targeting netcoreapp3.0, the extension methods now depend on the newly unified IHostBuilder rather than the older IWebHostBuilder and the bootstrapping looks like this now:

public class Program
{
    public static Task<int> Main(string[] args)
    {
        return CreateHostBuilder(args)
            
            // This extension method replaces the calls to
            // IWebHost.Build() and Start()
            .RunOaktonCommands(args);
    }

    public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureWebHostDefaults(x => x.UseStartup<Startup>());
    
}

 

Oakton.AspNetCore provides an improved and extensible command line experience for ASP.Net Core. I’ve been meaning to write a bigger blog post about it, but that’s gonna wait for another day.

Lamar

The 3.0 support in Lamar was unpleasant, because the targeting covers the spread from .Net 4.6.1 to netstandard2.0 to netstandard2.1, and the test library covers several .Net runtimes. After this, I’d really like to never have to type “#if/#else/#endif” again (I do use ReSharper/Rider’s ALT-CTRL-J “surround with” feature religiously, which helps, but you get my point).

The new bits and releases are:

  • Lamar v3.2.0 adds a netstandard2.1 target
  • LamarCompiler v2.1.0 adds a netstandard2.1 target (probably only used by Jasper, but who knows who’s picked it up, so I updated it)
  • LamarCodeGeneration v1.1.0 adds a netstandard2.1 target but is otherwise unchanged
  • Lamar.Microsoft.DependencyInjection v4.0.0 — this is the adapter library to replace the built in DI container in ASP.Net Core with Lamar. This is a full major release because some of the method signatures changed. I deleted the special Lamar handling for IOptions<T> and ILogger<T> because it no longer added any value. If your application targets netcoreapp3.0, the UseLamar() method and its overloads hang off of IHostBuilder instead of IWebHostBuilder (see the sketch after this list). If you are remaining on ASP.Net Core v2.*, UseLamar() still hangs off of IWebHostBuilder
  • Lamar.Diagnostics v1.1.0 — assuming that you use the Oakton.AspNetCore command line adapter for your ASP.Net Core application, adding a reference to this library adds new commands to expose the Lamar diagnostic capabilities from the command line of your application. This version targets both netstandard2.0 for ASP.Net Core v2.* and netstandard2.1 for ASP.Net Core 3.*.
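As a quick sketch of that netcoreapp3.0 bootstrapping (based only on the description above, so treat the details as approximate), UseLamar() now chains off the unified IHostBuilder:

public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        // Swaps the built in DI container out for Lamar
        .UseLamar()
        .ConfigureWebHostDefaults(x => x.UseStartup<Startup>());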

 

The challenges

The biggest problem was that in both of these projects, I wanted to retain support for .Net 4.6+ and netstandard2.0 or netcoreapp2.* runtime targets. That unfortunately meant a helluva lot of conditional compilation and conditional Nuget references per target framework. In some cases, the move from the old ASP.Net Core <=2.* IWebHostBuilder to the newer, unified IHostBuilder took some doing in both the real code and in the unit tests. Adding to the fun is that there are sometimes real differences between the old, full .Net framework, netcoreapp2.0, netcoreapp2.1/2.2, and certainly the brand new netcoreapp3.0.

Another little hiccup was that the dotnet pack output pathing was changed back to what it was originally in the early project.json days, but that broke all our build scripts and they had to be adjusted (the artifacts path is now relative to the current directory rather than to the binary target path like it was).

At least on AppVeyor, we had to force the existing image to install the latest .Net SDK as part of the build because the image we were using didn’t yet support the brand new .Net SDK. I’d assume that is very temporary and I can’t speak to other hosted CI providers. If you’re wondering how to do that, see this example from Lamar that I got from Jimmy Byrd.

 

Other Projects I’m Involved With

  • StructureMap — If someone feels like doing it, I’d take a pull request to StructureMap.Microsoft.DependencyInjection to put out a .Net Core 3.0 compatible version (really just means supporting the new IHostBuilder instead of or in addition to the old IWebHostBuilder).
  • Alba — Some other folks are working on that one, and I’m anticipating an Alba on ASP.Net Core 3.0 release in the next couple days. I’ll write a follow up blog post when that’s ready
  • Marten — I’m not anticipating much effort here, but we should at least have our testing libraries target netcoreapp3.0 and see what happens. We did have some issues with netcoreapp2.1, so we’ll see I guess
  • Jasper — I’ve admittedly had Jasper mostly on ice until .Net Core 3.0 was released. I’ll start work on Jasper next week, but that is going to be a true conversion up to netcoreapp3.0 and some additional structural changes to finally get to a 1.0 release of that thing.
  • Storyteller — I’m not sure yet, but I’ve started doing a big overhaul of Storyteller that’s gotten derailed by the .Net Core 3.0 stuff

 

Storyteller 5.0 – Streamlined CLI, Netstandard 2.0, and easier debugging

I published the Storyteller 5.0 release last night. I punted on doing any kind of big user interface overhaul for now, and just released the back end improvements on their own with some vague idea that there’d be an improved or at least restyled user interface later this year.

The key improvements are:

  • Netstandard 2.0 support
  • An easier getting started story
  • Streamlined command line usage
  • Easier “F5 debugging” for specifications in your IDE
  • No changes whatsoever to your Fixture code from 4.0

Getting Started with Storyteller 5

Previous versions of Storyteller have been problematic for new users getting started and setting up projects with the right Nuget dependencies. I felt like things got a little better with the dotnet cli, but the enduring problem with that is how few .Net developers seem to be using it or familiar with it. When you use Storyteller 5, you need two dependencies in your Storyteller specification project:

  1. A reference to the Storyteller 5.0 assembly via Nuget
  2. The dotnet-storyteller command line tool referenced as a dotnet cli tool in your project, and that’s where most of the trouble comes in.

To start up a new Storyteller 5.0 specification project, first make the directory where you want the project to live. Next, use the dotnet new console command to create a new project with a csproj file and a Program.cs file.

In your csproj file, replace the contents with this, or just add the package reference for Storyteller and the cli tool reference for dotnet-storyteller as shown below:

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netcoreapp2.0</TargetFramework>
    <OutputType>Exe</OutputType>
  </PropertyGroup>

  <!-- The version numbers below are illustrative -->
  <ItemGroup>
    <PackageReference Include="Storyteller" Version="5.0.0" />
  </ItemGroup>

  <ItemGroup>
    <DotNetCliToolReference Include="dotnet-storyteller" Version="5.0.0" />
  </ItemGroup>

</Project>

Next, we need to get into the entry point to this new console application and change the Program.Main() method to activate the Storyteller engine within this new project:

    public class Program
    {
        public static int Main(string[] args)
        {
            return StorytellerAgent.Run(args);
        }
    }

Internally, the StorytellerAgent is using Oakton to parse the arguments and carry out one of these commands:

  ------------------------------------------------------------------------------
    Available commands:
  ------------------------------------------------------------------------------
       agent -> Used by dotnet storyteller to remote control the Storyteller specification engine
         run -> Executes Specifications and Writes Results
        test -> Try to start and warmup the system under test for diagnostics
    validate -> Use to validate specifications for syntax errors or missing grammars or fixtures
  ------------------------------------------------------------------------------

If you execute the console application with no arguments like this:

|> dotnet run

It will execute all the specifications and write the results to a file named “stresults.htm.”

You can customize the running behavior by passing in optional flags with the pattern dotnet run -- run --flag flagvalue like this example that just writes the results file to a different location:

|> dotnet run -- run Arithmetic -r ./artifacts/results.htm

If you’re not already familiar with the dotnet cli, what’s going on here is that anything to the right of the “--” double dash is considered to be the command line arguments passed into your application’s Main() method. The “run” argument tells the StorytellerAgent that you actually want to run specifications, and it’s unfortunately not redundant and very much mandatory if you want to customize how Storyteller runs specifications.

See the Storyteller 5.0 quickstart project for a working example.

Running the Storyteller Specification Editor

Assuming that you’ve got the cli tools reference to dotnet-storyteller and you’ve executed `dotnet restore` at least once (compiling through VS.Net or Rider does this for you), the only thing you need to do to launch the specification editor tool is this from the command line:

|> dotnet storyteller

F5 Debugging

Debugging complicated Storyteller specifications has been its Achilles’ heel from the very beginning. You can always attach a debugger to a running Storyteller process, but that’s clumsy (quicker in Rider than VS.Net, but still). As a cheap but effective improvement in v5, you can run a single specification from the command line with this signature:

|> dotnet run -- run "Suite1 / ChildSuite1 / Specification Name"

This is admittedly pretty ugly, but remember that you can tell either Rider or VS.Net to pass arguments to your console application when you press F5 to run an application in debug mode. I utilize this quite a bit in Jasper development to troubleshoot individual specifications. Here’s what the configuration looks like for this in Rider:

 

[Image: RunSingleSpec — the Rider run configuration with the specification path in the “Program arguments” field]

See the “Program arguments” specifically. Once the path to the specification is configured, I can just hit F5 and jump right into a debugging session running just that specification.

We looked pretty hard at supporting the dotnet test tooling so you could run Storyteller specifications from either Visual Studio.Net’s or Rider/ReSharper’s test runners, but all I could think about after trying to reverse engineer xUnit’s tooling around that was a certain Monty Python scene.

Introducing Oakton — Command line parsing minus the usual cruft

As the cool OSS kids would say, “I made another thing.” Oakton is a library I use for command line parsing in the console applications I build and maintain. For those who’ve followed me for a long time, Oakton is an improved version of the command line parsing in FubuCore that now targets Netstandard 1.3 as well as .Net 4.5.1 and 4.6 on the full framework.

What sets Oakton apart from the couple dozen other tools like this in the .Net ecosystem is how it allows you to cleanly separate the command line parsing from your actual command execution, so that you can write cleaner code and more easily test your command execution in automated tests.

Here’s the quick start example from the documentation (which has prettier code output). Let’s say you just want a command that will print out a name with an optional title and the option to override the color of the text.

A command in Oakton comes in two parts: a concrete input class that just establishes the required arguments and optional flags through public fields or settable properties, and the command class itself.

    public class NameInput
    {
        [Description("The name to be printed to the console output")]
        public string Name { get; set; }
        
        [Description("The color of the text. Default is black")]
        public ConsoleColor Color { get; set; } = ConsoleColor.Black;
        
        [Description("Optional title preceeding the name")]
        public string TitleFlag { get; set; }
    }

The [Description] attributes are optional and embed usage messages for the integrated help output.

Now then, the actual command would look like this:

    [Description("Print somebody's name")]
    public class NameCommand : OaktonCommand<NameInput>
    {
        public NameCommand()
        {
            // The usage pattern definition here is completely
            // optional
            Usage("Default Color").Arguments(x => x.Name);
            Usage("Print name with specified color").Arguments(x => x.Name, x => x.Color);
        }

        public override bool Execute(NameInput input)
        {
            var text = input.Name;
            if (!string.IsNullOrEmpty(input.TitleFlag))
            {
                text = input.TitleFlag + " " + text;
            }
            
            // This is a little helper in Oakton for getting
            // cute with colors in the console output
            ConsoleWriter.Write(input.Color, text);


            // Just telling the OS that the command
            // finished up okay
            return true;
        }
    }

Again, the [Description] attributes and the Usage() calls in the constructor are all optional, but they add more information to the user help display. You’ll note that your command is completely decoupled from any and all text parsing and does nothing but do work against the single input argument. That’s done very intentionally, and we believe that this sets Oakton apart from most other command line parsing tools in .Net that too freely commingle parsing with the actual functionality.

Finally, you need to execute the command in the application’s main function:

    class Program
    {
        static int Main(string[] args)
        {
            // As long as this doesn't blow up, we're good to go
            return CommandExecutor.ExecuteCommand<NameCommand>(args);
        }
    }
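Assuming Oakton’s idiomatic naming conventions hold here (the “Flag” suffix is dropped, so the TitleFlag property surfaces as a --title flag, and the arguments fill in positionally), invoking the finished tool would look something like:

dotnet run -- "Jeremy Miller"
dotnet run -- "Jeremy Miller" Red
dotnet run -- "Jeremy Miller" --title "Mr."

The first form hits the “Default Color” usage, the second supplies the Color argument, and the third adds the optional title flag.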

Oakton is fairly full-featured, so you have the ability to:

  1. Expose help information in your tool
  2. Support all the commonly used primitive types like strings, numbers, dates, and booleans
  3. Use idiomatic Unix style naming and usage conventions for optional flags
  4. Support multiple commands in a single tool with different arguments and flags (because the original tooling was too inspired by the git command line)

 

So a couple questions:

  • Does the .Net world really need a new library for command line parsing? Nope, there’s dozens out there and a semi-official one somewhere inside of ASP.Net Core. It’s no big deal on my part though because other than the docs I finally wrote up this week, this code is years old and “done.”
  • Where’s the code? The GitHub repo is here.
  • Is it documented, because you used to be terrible at that? Yep, the docs are at http://jasperfx.github.io/oakton.
  • If I really want to use this, where can I ask questions? You can always use GitHub issues, or try the Gitter room.
  • Are there any real world examples of this actually being used? Yep, try Marten.CommandLine, the dotnet-stdocs tool, and Jasper.CommandLine.
  • What’s the license? Apache v2.

Where does the name “Oakton” come from?

A complete lack of creativity on my part. Oakton is a bustling non-incorporated area not far from my grandparents’ farm on the back way to Lamar, MO that consists of a Methodist church, a cemetery, the crumbling ruins of the general store, and maybe 3-4 farmhouses. Fun fact: when I was really small, I tagged along with my grandfather when he’d take tractor parts to get fixed by the blacksmith that used to be there.

An Experience Report of Moving a Complicated Codebase to the CoreCLR

TL;DR – I like the CoreCLR, project.json project system, and the new dotnet CLI so far, but there are a lot of differences in API that could easily burn you when you try to port existing .Net code to the CoreCLR.

As I wrote about a couple weeks ago, I’ve been working to port my Storyteller project to the CoreCLR en route to it being cross platform and generally modernized. As of earlier this week I think I can declare that it’s (mostly) running correctly in the new world order. Moreover, I’ve been able to dogfood Storyteller’s documentation generation feature on Mac OSX today without much trouble so far.

As semi-promised, here’s my experience report of moving an existing codebase over to targeting the CoreCLR, the usage of the project.json project system, and the new dotnet CLI.

 

Random Differences

  • AppDomain is gone, and you’ll have to use a combination of AppContext or DependencyContext to replace some of the information you’ve gotten from AppDomain.CurrentDomain about the running process. This is probably an opportunity for polyfills
  • The Thread class is very different and a couple methods (Yield(), Abort()) were removed. This will eventually force me to drop down to P/Invoke in Storyteller
  • A lot of convenience methods that were probably just syntactic sugar anyway have been removed. I’ve found differences with Xml support and Streams. Again, I’ve gotten around this by adding extension methods to the Netstandard/CoreCLR code to add back in some of these things

 

Project.json May be Just a Flash in the Pan, but it’s a Good One

Having one tiny file that controls how a library is compiled, its Nuget dependencies, and how that library is packed up into a Nuget package has been a huge win. Even better yet, I really appreciate how the project.json system handles transitive dependencies so you don’t have to do so much bookkeeping within your downstream projects. I think at this point I even prefer the project.json + dotnet restore combination over the Nuget workflow we have been using with Paket (which I still think was much better than out of the box Nuget).

I’m really enjoying having the wildcard file inclusions in project.json so you don’t constantly have all the aggravation of merging the Xml-based csproj files. It’s a little bit embarrassing that it’s taken Microsoft so long to address that problem, but I’ll take it now.

I really hope that the new, trimmed down csproj file format is as usable as project.json. Honestly, assuming that their promised automatic conversion works as advertised, I’d recommend going to project.json as an interim solution rather than waiting.

 

I Love the Dotnet CLI

I think the new dotnet CLI is going to be a huge win for .Net development and it’s maybe my favorite part of .Net’s new world order. I love being able to so quickly restore packages, build, run tests, and even pack up Nuget files without having to invest time in writing out build scripts to piece it all together.

I’ve long been a fan of using Rake for automating build scripts and I’ve resisted the calls to move to an arbitrarily different Make clone. With some of my simpler OSS projects, I’m completely forgoing build scripts in favor of just depending on the dotnet cli commands. For example, I have a small project called Oakton using the dotnet CLI, and its entire CI build script is just:

rmdir artifacts
dotnet restore src
dotnet test src/Oakton.Testing
dotnet pack src/Oakton --output artifacts --version-suffix %build.number%

In Storyteller itself, I removed the Rake script altogether and replaced it with just a small shell script that delegates to both NPM and dotnet to do everything that needs to be done.

I’m also a fan of the “dotnet test” command too, especially when you want to quickly run the tests for just one .Net framework version. I don’t know if this is the test adapter or the CoreCLR itself being highly optimized, but I’ve been seeing a dramatic improvement in test execution time since switching over to the dotnet cli. In Marten I think it somehow cut the test execution time of the main testing suite down by 60-70%.

The best source I’ve found on the dotnet CLI has been Scott Hanselman’s blog posts on DotNetCore.

 

AppDomain No Longer Exists (For Now)

AppDomains getting ripped out of the CoreCLR (yes, I know they’re supposed to come back in Netstandard 2.0, but who knows when that’ll be) was the single biggest problem I had moving Storyteller over to the CoreCLR. I outlined the biggest problem in a previous post on how testing tools generally use AppDomains to isolate the system under test from the test harness itself.

I ended up having Storyteller spawn a separate process to run the system under test in a way that lets users rebuild that system without having to first shut down the Storyteller specification editor tool. The first step was to replace the little bit of Remoting I had been using between AppDomains with a different communication scheme that just shot JSON back and forth over sockets. Fortunately, I had already been doing something very similar through a remoting proxy, so it wasn’t that bad of a change.

The next step was to change Storyteller testing projects from a class library to an executable that could be invoked to start up the system under test and start listening on a supplied port for the JSON messages described in the previous paragraph. This was a lot of work, but it might end up being more usable. Instead of depending on something else to have pre-compiled the system under test, Storyteller can start up the system under test with a spawned call to “dotnet run” that does any necessary compilation for you. It also makes it pretty easy to direct Storyteller to run the system under test under different .Net versions.

Of course, the old System.Diagnostics.Process class behaves differently in the CoreCLR (and across platforms too), and that’s still causing me some grief.
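Roughly, that kind of spawning looks like the following sketch (the project directory and port handling here are made up purely for illustration):

using System.Diagnostics;

public static class SystemUnderTestLauncher
{
    public static Process Start(string projectDirectory, int port)
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = "dotnet",
            // The spawned process compiles if necessary, then starts
            // the system under test listening on the supplied port
            Arguments = $"run -- {port}",
            WorkingDirectory = projectDirectory,
            UseShellExecute = false
        };

        return Process.Start(startInfo);
    }
}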

 

Reflection Got Scrambled in the CoreCLR

So, that sucked…

Infrastructure or testing tools like Storyteller will frequently need to use a lot of reflection. Unfortunately, the System.Reflection namespace in the CoreCLR has a very different API than classic .Net, and that has consistently been a cause of friction as I’ve ported code to the CoreCLR. The challenge is even worse if you’re trying to target both classic .Net and the CoreCLR.

Here’s an example, in classic .Net I can check whether a Type is an enumeration type with “type.IsEnum.” In the CoreCLR, it’s “type.GetTypeInfo().IsEnum.” Not that big a change, but basically anything you need to do against a Type is now on the paired TypeInfo and you now have to bounce through Type.GetTypeInfo().

One way or another, if you want to multi-target both CoreCLR and .Net classic, you’ll be picking up the idea of “polyfills” you see all over the Javascript world. The “GetTypeInfo()” method doesn’t exist in .Net 4.6, so you might do:

public static Type GetTypeInfo(this Type type)
{
    return type;
}

to make your .Net 4.6 code look like the CoreCLR equivalent. Or in some cases, I’ve just built polyfill extension methods in the CoreCLR to make it look like the older .Net 4.6 API:

#if !NET45
public static IEnumerable<Type> GetInterfaces(this Type type)
{
    return type.GetTypeInfo().GetInterfaces();
}
#endif

Finally, you’re going to have to dip into conditional compilation fairly often like this sample from StructureMap’s codebase:

    public static class AssemblyLoader
    {
        public static Assembly ByName(string assemblyName)
        {
#if NET45
            // This method doesn't exist in the CoreCLR
            return Assembly.Load(assemblyName);
#else
            return Assembly.Load(new AssemblyName(assemblyName));
#endif
        }
    }

There are other random changes as well that I’ve bumped into, especially around generics and Assembly loading. Again, not something that the average codebase is going to get into, but if you try to do any kind of type scanning over the file system to auto-discover assemblies at runtime, expect some churn when you go to the CoreCLR.

If you’re thinking about porting some existing .Net code to the CoreCLR and it uses a lot of reflection, be aware that that’s potentially going to be a lot of work to convert over.

Definitely see Porting a .Net Framework Library to .Net Core from Michael Whelan.

Moving Storyteller to the CoreCLR and going Cross Platform

This is half me thinking out loud and half an experience report on the new .Net world order. If you want to know more about what Storyteller is, there’s an online webinar here or my blog post about the Storyteller 3 reboot and vision.

Storyteller 3 is an OSS acceptance test automation tool that we use at work for executable specifications and end to end regression testing. Storyteller doesn’t have a huge number of users, but the early feedback has been mostly positive from the community and it gets plenty of pull requests that have helped quite a bit with usability. Now that my Marten work is settling down, I’ve been able to start concentrating on Storyteller again.

My current focus for the moment is making Storyteller work on the CoreCLR as a precursor to being truly cross platform. Going a little farther than that, I’m proposing some changes to its architecture that I think will make it relatively painless to use the existing user interface and test runners with completely different, underlying test engines (I’m definitely thinking about a Node.js based runner and doing a port of the testing engine to Scala or maybe even Swift or Kotlin way down the road as a learning exercise).

My first step has been to chip away at Storyteller’s codebase by slowly replacing dependencies that aren’t supported on the CoreCLR (this work is in the project.json branch on Github):

Current State

  • Targets .Net 4.6
  • Self-hosted w/ Nowin
  • FubuMVC for the web application
  • Fleck for web sockets support
  • Tests execute in a separate AppDomain with all communication done via sending Json messages through .Net Remoting
  • FubuCore for the command line parsing
  • Uses Fixie for unit testing
  • RhinoMocks for mocking
  • Csproj/MSBuild for compiling, Paket for Nuget management
  • A single Nuget for the testing engine library and the test running/documentation generation executable
  • No Visual Studio or VS Code integration

Proposed End State

  • Targets .Net 4.6 and the CoreCLR
  • Self-hosted with Kestrel
  • Raw ASP.Net Core middleware
  • Kestrel/ASP.Net Core for Websockets
  • Tests will execute in a separate process, and the communication between processes will all be done with sockets
  • Using Oakton for the command line parsing
  • Uses xUnit for unit testing
  • NSubstitute for mocking
  • The dotnet CLI for CI builds and all Nuget management, project.json for all the projects
  • A nuget for the .Net testing engine library, a second one for the command line tooling for specification running and editing, a third nuget for the documentation generation
  • A fourth nuget for integrating Storyteller with dotnet test
  • A VS Code plugin?

Some thoughts on the work so far:

  • Kestrel and the bits of ASP.Net Core I’m using have been pretty easy to get up and going. The Websockets support doesn’t feel terribly discoverable, but it was easy to find good enough examples and get it going. I was a little irritated with the ASP.Net team for effectively ditching the community-driven OWIN specification (yes, I know that ASP.Net Core supports OWIN, but it’s an emulation) for their own middleware signature. However, I think that what they did do is probably going to be much more discoverable and usable for the average user. I will miss referring to OWIN as the “mystery meat” API.
  • I actually like the new dotnet CLI and I’m looking forward to it stabilizing a bit. I think that it does a lot to improve the .Net development experience. It’s an upside down world when an alt.net founder like me is defending a new Microsoft tool that isn’t universally popular with more mainstream .Net folks.
  • I still like Fixie and I hope that project continues to move forward, but xUnit is the only game in town for the dotnet CLI and CoreCLR.
  • Converting the projects to the new project.json format was relatively harmless compared to the nightmare I had doing the same with StructureMap, but I’m not yet targeting the CoreCLR.
  • I’ve always been gun shy about attempting any kind of Visual Studio.Net integration, but from a cursory look around xUnit’s dotnet runner code, I’m thinking that a “dotnet test” adapter for Storyteller is very feasible.
  • The new Storyteller specification editing user interface is a React.js-based SPA. I’m thinking that that architecture should make it fairly simple to build a VS Code extension for Storyteller.

 

The Killer Problem: AppDomains are Gone (for now)

Storyteller, like most tools for automating tests against .Net, relies on AppDomains to isolate the application under test from the test harness so that you can happily rebuild your application and rerun without having to completely drop and restart the testing tool. Great, and other than .Net Remoting not being the most developer-friendly thing in the world, that’s worked out fairly well in Storyteller 3 (it had been a mess in Storyteller 1 & 2).

There’s just one little problem: AppDomains and Remoting are no longer in the CoreCLR until at least next year (and I’m not wanting to count on them coming back). It would be perfect if you were able to unload the new AssemblyLoadContext, but as far as I know that’s not happening any time soon.

At this point, I’m thinking that Storyteller will work by running tests in a completely separate process to be launched and shut down by the Storyteller test running executable. To make that work, users will have to make their Storyteller specification project be an executable that bootstraps their system under test and then pass an ISystem object and the raw command line parameters into some kind of Storyteller runner.

I’ve been experimenting with using raw sockets for the cross process communication and so far, so good. I’m just shooting json strings back and forth. I thought about using HTTP between the processes, but I came down to just feeling like that would be too heavy. I also considered using our LightningQueues project in its “ZeroMQ” mode, but again, I opted for lighter weight. The other advantage for me is that the “dotnet test” adapter communication is done by json over sockets as well.
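A stripped down sketch of that kind of json-over-sockets messaging (purely illustrative; this is not Storyteller’s actual wire code):

using System.IO;
using System.Net.Sockets;

public static class JsonSocketSender
{
    public static void Send(string host, int port, string json)
    {
        using (var client = new TcpClient(host, port))
        using (var stream = client.GetStream())
        using (var writer = new StreamWriter(stream))
        {
            // One json message per line, shot straight across the socket
            writer.WriteLine(json);
            writer.Flush();
        }
    }
}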

I think this strategy of running separate processes would make Storyteller a little more complicated to set up compared to the existing “just make a class library and Storyteller will find your custom ISystem if one exists” strategy.  My big hope is that the combination of depending on a separate command line launched process and shooting json across sockets will make it much easier to bring on alternative test running engines that would be usable with the existing Storyteller user interface tooling.