Wolverine and “Clone n’ Go!” Development

I’ve been able to talk and write a bit about Wolverine in the last couple of weeks, and this post builds on the previous posts in that series.

When I start with a brand new codebase, I want to be up and going mere minutes after doing an initial clone of the Git repository. And by “going,” I mean being able to run all the tests and run any application in the codebase.

In most cases, an application codebase I work with these days is going to have infrastructure dependencies: usually a database, and possibly some messaging infrastructure as well. Not to worry, because Wolverine has you covered with a lot of functionality out of the box to get your infrastructure dependencies configured into the shape you need to start running your application.

Before I get into Wolverine specifics, I’m assuming that the basic developer box has some baseline infrastructure installed:

  • The latest .NET SDK
  • Docker Desktop
  • Git itself
  • Node.js — not used by this post at all, but it’s almost impossible to not need Node.js at some point these days
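As a quick sanity check before cloning anything, you can verify that all of that baseline tooling is actually on the PATH. This is just a convenience sketch, not part of Wolverine or the sample codebase:

```shell
#!/usr/bin/env sh
# Preflight check: report whether each baseline tool is installed.
# The tool list mirrors the bullets above; adjust to taste.
for tool in dotnet docker git node; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```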

Yet again, I want to go back to the simple banking application from previous posts that was using both Marten and Rabbit MQ for external messaging. Here’s the application bootstrapping:

using AppWithMiddleware;
using IntegrationTests;
using JasperFx.Core;
using Marten;
using Oakton;
using Wolverine;
using Wolverine.FluentValidation;
using Wolverine.Marten;
using Wolverine.RabbitMQ;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddMarten(opts =>
{
    // This would be from your configuration file in typical usage
    opts.Connection(Servers.PostgresConnectionString);
    opts.DatabaseSchemaName = "wolverine_middleware";
})
    // This is the Wolverine integration for the outbox/inbox,
    // transactional middleware, and the saga persistence that
    // we don't care about yet
    .IntegrateWithWolverine()
    
    // Just letting Marten build out known database schema elements upfront
    // Helps with Wolverine integration in development
    .ApplyAllDatabaseChangesOnStartup();

builder.Host.UseWolverine(opts =>
{
    // Middleware introduced in previous posts
    opts.Handlers.AddMiddlewareByMessageType(typeof(AccountLookupMiddleware));
    opts.UseFluentValidation();

    // Explicit routing for the AccountUpdated
    // message handling. This has precedence over conventional routing
    opts.PublishMessage<AccountUpdated>()
        .ToLocalQueue("signalr")

        // Throw the message away if it's not successfully
        // delivered within 10 seconds
        .DeliverWithin(10.Seconds())
        
        // Not durable
        .BufferedInMemory();
    
    var rabbitUri = builder.Configuration.GetValue<Uri>("rabbitmq-broker-uri");
    opts.UseRabbitMq(rabbitUri)
        // Just do the routing off of conventions, more or less
        // queue and/or exchange based on the Wolverine message type name
        .UseConventionalRouting()
        
        // This tells Wolverine to create any Rabbit MQ queues, exchanges,
        // or bindings needed by the application if they are missing
        .AutoProvision() 
        .ConfigureSenders(x => x.UseDurableOutbox());
});

var app = builder.Build();

// One Minimal API that just delegates directly to Wolverine
app.MapPost("/accounts/debit", (DebitAccount command, IMessageBus bus) => bus.InvokeAsync(command));

// This is important, I'm opting into Oakton to be my
// command line executor for extended options
return await app.RunOaktonCommands(args);
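For completeness, the "rabbitmq-broker-uri" value read in the bootstrapping code above has to come from configuration somewhere. A minimal appsettings.Development.json entry might look like the following sketch; the key name matches the code above, but the file name and the URI value itself (RabbitMQ's default guest credentials and port) are assumptions:

```json
{
  "rabbitmq-broker-uri": "amqp://guest:guest@localhost:5672"
}
```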

After cloning this codebase, I should be able to quickly run `docker compose up -d` from the root of the codebase to set up its dependencies with a compose file like this:

version: '3'
services:
  postgresql:
    image: "clkao/postgres-plv8:latest"
    ports:
     - "5433:5432"
    environment:
      - POSTGRES_DATABASE=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
  
  rabbitmq:
    image: "rabbitmq:3-management"
    ports:
     - "5672:5672"
     - "15672:15672"
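With the containers up and the application started via dotnet run, the single HTTP endpoint from the bootstrapping code can be exercised. To be clear, this is a hypothetical invocation: the port depends on your launch settings, and the JSON shape of the DebitAccount command isn't shown in this post, so both are assumptions:

```shell
# Hypothetical call to the Minimal API endpoint shown above.
# Port and payload fields are assumptions -- check your launchSettings.json
# and the DebitAccount type for the real values.
curl -X POST http://localhost:5000/accounts/debit \
  -H 'Content-Type: application/json' \
  -d '{"accountId": "A-123", "amount": 100}'
```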

As it is, the Wolverine setup I showed above would allow you to immediately be up and running because:

  • In its default configuration, Marten is able to detect and build out missing database schema objects in the underlying application database at runtime
  • The PostgreSQL schema objects necessary for Wolverine’s transactional outbox are created by Marten at bootstrapping time if they’re missing, through the combination of the IntegrateWithWolverine() call and the ApplyAllDatabaseChangesOnStartup() declaration
  • Any missing Rabbit MQ queues or exchanges are created at runtime because of the AutoProvision() declaration in the Rabbit MQ integration with Wolverine

Cool, right?

But there’s more! Wolverine leans heavily on the related Oakton library for expanded command line utilities that are helpful for diagnosing configuration issues, checking up on infrastructure, or applying infrastructure setup at deployment time instead of depending on doing everything at runtime.

If I go to the root of the main project and type `dotnet run -- help`, I’ll get a list of the available commands like this:

The available commands are:

  Alias           Description
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
  check-env       Execute all environment checks against the application
  codegen         Utilities for working with JasperFx.CodeGeneration and JasperFx.RuntimeCompiler
  db-apply        Applies all outstanding changes to the database(s) based on the current configuration
  db-assert       Assert that the existing database(s) matches the current configuration
  db-dump         Dumps the entire DDL for the configured Marten database
  db-patch        Evaluates the current configuration against the database and writes a patch and drop file if there are any differences
  describe        Writes out a description of your running application to either the console or a file
  help            List all the available commands
  marten-apply    Applies all outstanding changes to the database based on the current configuration
  marten-assert   Assert that the existing database matches the current Marten configuration
  marten-dump     Dumps the entire DDL for the configured Marten database
  marten-patch    Evaluates the current configuration against the database and writes a patch and drop file if there are any differences
  projections     Marten's asynchronous projection and projection rebuilds
  resources       Check, setup, or teardown stateful resources of this system
  run             Start and run this .Net application
  storage         Administer the envelope storage


Use dotnet run -- ? [command name] or dotnet run -- help [command name] to see usage help about a specific command

Let me call out just a few highlights:

  • `dotnet run -- resources setup` would do any necessary setup of both the Marten and Rabbit MQ items. Likewise, if we were using SQL Server as the backing storage and integrating that with Wolverine as the outbox storage, this command would set up the necessary SQL Server tables and functions if they were missing. The same applies to Wolverine’s Azure Service Bus and Amazon SQS integrations.
  • `dotnet run -- check-env` would run a set of environment checks to verify that the application can connect to the configured Rabbit MQ broker, the PostgreSQL database, and anything else you care to check. This is a great way to make deployments “fail fast.”
  • `dotnet run -- storage clear` would delete any persisted messages in the Wolverine inbox/outbox to keep old messages from interfering with successful testing.
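Pulled together, a deployment or CI step might look something like this sketch. The project path here is hypothetical, and the script assumes the Oakton integration shown earlier in the bootstrapping code:

```shell
#!/usr/bin/env sh
set -e  # fail fast on any non-zero exit code

# Hypothetical project path -- substitute your own
PROJECT=src/AppWithMiddleware

# Build out any missing Rabbit MQ and database resources first
dotnet run --project "$PROJECT" -- resources setup

# Then verify the app can actually reach its infrastructure
dotnet run --project "$PROJECT" -- check-env
```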

Questions, comments, feedback? Hopefully this shows that Wolverine is absolutely intended for “grown up development” in real life.


2 thoughts on “Wolverine and “Clone n’ Go!” Development”

  1. How would you compare Wolverine’s maturity to more established options, like MassTransit or NServiceBus? I am looking to bring something in where I work to push us into a more message-based architecture and am exploring options. Specifically, is there long-running transaction support, i.e. sagas? I think NSB includes monitoring capabilities OOTB as well, which we would need. I’m probably missing about a hundred or two other capabilities here, but that is a decent start.

    1. NSB is 15 years old and MT about the same, so yeah, they’re more mature:)

      Wolverine has built-in saga support for both Marten & EF Core so far (https://wolverine.netlify.app/guide/durability/sagas.html). The saga support is quite a bit lower ceremony than NSB, and a little bit lower ceremony than MT. OpenTelemetry tracing and .NET ILogger integration come out of the box with Wolverine; you just have to explicitly tell your app to export Wolverine data. It also has built-in metrics through System.Diagnostics.

      And yeah, there’s a ton of other functionality too :) I’m still working on the Wolverine documentation, and there’s plenty of existing functionality that isn’t documented yet.
