Lamar stays and how that unfolds

I wrote a blog post at the end of last week about Lamar when I just happened to be feeling discouraged about a couple of things (it happens sometimes, and I swear that having to support IoC tools exposes me to more difficult people than every other project I work on combined). I got some rest this weekend, a bit of positive reinforcement from other folks, and actually thought through how to fix the issues. Long story short, I’m not giving up on anything, and here’s what I think the very doable game plan is for Lamar (and the closely related Jasper project):

Lamar

  • Short term: Get a small bug fix release out soon that has some options to “force” all the compilation upfront into one dynamic assembly. That’s gonna hurt the cold start time, but should help the memory usage. We’ll also look to see if there are any places where Lamar is unnecessarily holding on to the Roslyn compilation objects, to ensure they can be garbage collected.
  • Medium term: Introduce an alternative compilation model based on Expressions compiled to lambdas with FastExpressionCompiler (there’s a small sketch of the technique after this list). This model will kick in automatically whenever there are one or more internal types in the “build plan”, and could be opted into globally through a container-level switch. I didn’t want to do this originally because the model just isn’t very fun to work with. After thinking it through quite a bit over the weekend, though, I think it won’t be bad at all to retrofit this alternative onto Lamar’s existing Frame and Variable model. This will knock out the performance issues with internal types and address all the memory issues.
  • Long term: Probably split up LamarCompiler a little bit to pull out the actual code compilation, which would significantly slim down Lamar’s dependency tree, and move Lamar to a purely Expression-based model. Introducing the Expression model will inevitably make the exception stack traces coming out of Lamar explode in size, so there might have to be an effort to rewrite them to be more user-friendly (I had to do this in StructureMap 3 several years ago).
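
For a rough idea of what that Expression-based model means in general terms, here’s a minimal sketch of the technique: express the “build plan” as an Expression tree and compile it to a delegate with FastExpressionCompiler instead of generating C# source for Roslyn. The IWidget/Widget types are made up for illustration, and none of this is Lamar’s actual internals.

```csharp
using System;
using System.Linq.Expressions;
using FastExpressionCompiler; // provides the CompileFast() extension method

public interface IWidget { }

// An internal class standing in for the "internal types in the build plan" case
internal class Widget : IWidget { }

public static class ExpressionPlanSketch
{
    public static Func<IWidget> BuildResolver()
    {
        // The "build plan" as an Expression tree: () => new Widget()
        NewExpression newWidget = Expression.New(typeof(Widget).GetConstructor(Type.EmptyTypes));
        var lambda = Expression.Lambda<Func<IWidget>>(newWidget);

        // Compile to a delegate with FastExpressionCompiler rather than
        // generating C# code and invoking the Roslyn compiler
        return lambda.CompileFast();
    }

    public static void Main()
    {
        IWidget widget = BuildResolver()();
        Console.WriteLine(widget.GetType().Name); // "Widget"
    }
}
```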

Jasper

I’m very close to pulling the trigger on a Jasper v1.0, with the understanding that there will inevitably be a Jasper v2.0 later in the year to incorporate .NET Core 3.0. I don’t think that Jasper will have any issues with memory usage related to Lamar, because it uses Lamar very differently than MVC does in any flavor. The changes to Lamar will impact Jasper though, so:

  • Short term: Jasper v1.0 with Lamar as it is.
  • Medium term: Jasper gets a model where you can happily use the runtime codegen and compilation during development while things are churning, but for production usage you can just write the code that would be generated out to disk, have it compiled into your system in the first place, and let Jasper use those types. That means ultra-fast cold start times in production and no worries at all about Roslyn doing bad things with memory. I’ve already done successful proof of concept development on this one (there’s a rough sketch of the lookup pattern after this list).
  • Long term: profit.
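
To make that “pre-generated code in production” idea concrete, here’s a rough sketch of the lookup pattern under my own assumptions; the names are hypothetical and this is not Jasper’s actual API. The point is just that production prefers types already compiled into the application, and runtime codegen is only the development-time fallback.

```csharp
using System;
using System.Reflection;

public static class HandlerTypeResolutionSketch
{
    // Hypothetical helper: prefer a handler type that was written to disk and
    // compiled into the application ahead of time, and only fall back to
    // runtime code generation while developing
    public static Type ResolveHandlerType(
        Assembly applicationAssembly,
        string handlerTypeName,
        Func<Type> generateAtRuntime)
    {
        // 1. Look for the pre-generated type baked into the application assembly
        var preBuilt = applicationAssembly.GetType(handlerTypeName, throwOnError: false);
        if (preBuilt != null) return preBuilt;

        // 2. Dev-time mode: generate and compile the type on the fly
        return generateAtRuntime();
    }
}
```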

As an aside, I got quizzed quite a bit about why Jasper has to be specific to Lamar as its IoC container and can’t just support whatever tool folks want to use. The reason is that Jasper uses Lamar’s very specific code generation in its pipeline to avoid using an IoC container at runtime whatsoever, and also to avoid forcing users to conform to all kinds of Jasper-specific adapter interfaces. I could maybe still force Jasper to pull this off with the built-in DI container, or with another IoC container plus Jasper-centric adapters that expose all of its metadata in a way the codegen understands, but just ick.
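
To make that concrete, here’s a purely illustrative example of the kind of code this style of codegen can emit for a single message type; every type name below is made up, and this is not Jasper’s real generated output. The generated “frame” bakes the constructor calls in directly, so there’s no service locator call and no adapter interface in sight at runtime.

```csharp
using System;

// Hypothetical application types
public class PlaceOrder { public string OrderId = ""; }

public class OrderRepository
{
    public void Save(PlaceOrder order) => Console.WriteLine($"Saved {order.OrderId}");
}

public class PlaceOrderHandler
{
    private readonly OrderRepository _repository;
    public PlaceOrderHandler(OrderRepository repository) => _repository = repository;
    public void Handle(PlaceOrder message) => _repository.Save(message);
}

// The shape of a generated pipeline frame: the dependencies are constructed
// inline because the codegen already knows the build plan, so nothing is
// resolved from an IoC container while handling the message
public static class GeneratedPlaceOrderFrame
{
    public static void Execute(PlaceOrder message)
    {
        var handler = new PlaceOrderHandler(new OrderRepository());
        handler.Handle(message);
    }
}
```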

If you took Lamar’s runtime codegen away, I think Jasper inevitably looks like a near clone of NServiceBus or Brighter, both in its usability and its runtime pipeline, and why does the world need that?
