Software Engineering · 6 min read

Event Sourcing and CQRS in .NET 9: When the Complexity Is Worth It

A practical guide to Event Sourcing and CQRS in .NET 9 — when the pattern justifies its complexity, implementation with Marten, and production lessons learned.


Event Sourcing and CQRS are powerful patterns. They are also the most over-applied patterns in enterprise software. Teams adopt them because they sound elegant, then spend months fighting infrastructure complexity that a simple CRUD application would have avoided.

This post is not an evangelism piece. It explains when Event Sourcing genuinely earns its keep, how to implement it in .NET 9 with Marten, and what production problems you will face that blog tutorials never mention.

When Event Sourcing Is Worth It

Before any implementation detail, the honest question: do you need this?

You Probably Need Event Sourcing If:

Complete audit trail is a regulatory requirement — Financial services, healthcare, and legal systems need to prove exactly what happened, when, and in what order. An append-only event log is legally defensible in ways that mutable database rows are not.

Your domain has complex state machines — An insurance claim that moves through Created → UnderReview → AdditionalInfoRequested → Approved → Paid → Disputed → Resolved has transitions that are best modelled as events. Each transition carries context (who, why, what data) that current-state storage loses.
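Modelled as events, each transition becomes its own type whose fields carry that context. A minimal sketch; the type and field names here are hypothetical, not from a real claims system:

```csharp
using System;

// Hypothetical claim events: each state transition carries who, why, and when,
// which a single mutable Status column would discard.
public sealed record ClaimCreated(Guid ClaimId, Guid PolicyId, DateTimeOffset At);
public sealed record AdditionalInfoRequested(Guid ClaimId, string RequestedBy, string Reason, DateTimeOffset At);
public sealed record ClaimApproved(Guid ClaimId, string ApprovedBy, decimal ApprovedAmount, DateTimeOffset At);
public sealed record ClaimDisputed(Guid ClaimId, string DisputedBy, string Reason, DateTimeOffset At);
```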

You need temporal queries — "What was this account's balance on March 15th?" or "What permissions did this user have when they performed this action?" Event replay to a point in time answers these naturally.
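With Marten this is close to a one-liner: `AggregateStreamAsync` accepts a timestamp (or version) and replays only the events recorded up to that point. A sketch, where `Account` is a hypothetical aggregate with `Apply(...)` handlers and `session` is an open Marten session:

```csharp
// Time travel: rebuild the aggregate's state as of a point in time.
// Only events with a timestamp at or before "asOf" are replayed.
var asOf = new DateTimeOffset(2025, 3, 15, 23, 59, 59, TimeSpan.Zero);
var accountOnMarch15 = await session.Events.AggregateStreamAsync<Account>(
    accountId,
    timestamp: asOf);
```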

Multiple read models from the same data — A single event stream can project into a relational reporting database, a search index, a notification system, and a real-time dashboard simultaneously.

You Probably Do Not Need Event Sourcing If:

  • Your domain is simple CRUD with no complex state transitions
  • You need strong consistency for reads immediately after writes
  • Your team has no experience with the pattern and no time to learn
  • Your audit requirements can be met with a simple change-log table

Architecture Overview

[Architecture diagram: commands enter the write side and append events to the event store; projections on the read side consume those events and build the query models.]

Write side: Commands are validated against the aggregate's current state (rebuilt from events). If valid, new events are appended to the event store.

Read side: Projections subscribe to events and update optimised read models (SQL tables, Elasticsearch indices, Redis caches). Queries hit these read models directly.
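Queries then become plain reads against the projected documents, with no event replay on the read path. A sketch using Marten's LINQ support, assuming a `CartSummary` read model with `CustomerId` and `CreatedAt` properties and an open `IQuerySession` named `querySession`:

```csharp
// Queries hit the projected read model directly; the event stream
// is never touched on the read path.
var activeCarts = await querySession.Query<CartSummary>()
    .Where(c => c.CustomerId == customerId)
    .OrderByDescending(c => c.CreatedAt)
    .ToListAsync();
```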

Implementation with Marten in .NET 9

Marten uses PostgreSQL as both event store and document database. It is the most pragmatic choice for .NET teams — no separate event store infrastructure needed.

The Aggregate

```csharp
public sealed class ShoppingCart
{
    public Guid Id { get; private set; }
    public CartStatus Status { get; private set; }
    public List<CartItem> Items { get; private set; } = new();
    public decimal TotalAmount => Items.Sum(i => i.Price * i.Quantity);

    // Marten calls Apply() when replaying events to rebuild state
    public void Apply(CartCreated @event)
    {
        Id = @event.CartId;
        Status = CartStatus.Active;
    }

    public void Apply(ItemAdded @event)
    {
        Items.Add(new CartItem(@event.ProductId, @event.ProductName, @event.Price, @event.Quantity));
    }

    public void Apply(ItemRemoved @event)
    {
        Items.RemoveAll(i => i.ProductId == @event.ProductId);
    }

    public void Apply(CartCheckedOut @event)
    {
        Status = CartStatus.CheckedOut;
    }
}
```
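The events consumed by those Apply handlers can be plain C# records. A minimal sketch; the field names are inferred from the handlers and projections in this post, and the exact shapes are assumptions:

```csharp
using System;

// Event definitions matching the Apply(...) handlers in ShoppingCart.
// Immutable records serialize cleanly and make each event's intent explicit.
public sealed record CartCreated(Guid CartId, Guid CustomerId, DateTimeOffset Timestamp);
public sealed record ItemAdded(Guid CartId, Guid ProductId, string ProductName, decimal Price, int Quantity);
public sealed record ItemRemoved(Guid CartId, Guid ProductId, int Quantity);
public sealed record CartCheckedOut(Guid CartId, DateTimeOffset Timestamp);
```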

Command Handler

```csharp
public sealed class AddItemHandler
{
    private readonly IDocumentSession _session;

    public AddItemHandler(IDocumentSession session) => _session = session;

    public async Task Handle(AddItemCommand command, CancellationToken ct)
    {
        var stream = await _session.Events.FetchForWriting<ShoppingCart>(command.CartId, ct);
        var cart = stream.Aggregate;

        if (cart.Status != CartStatus.Active)
            throw new InvalidOperationException("Cannot add items to a checked-out cart.");

        if (cart.Items.Count >= 50)
            throw new InvalidOperationException("Cart item limit reached.");

        stream.AppendOne(new ItemAdded(
            command.CartId,
            command.ProductId,
            command.ProductName,
            command.Price,
            command.Quantity
        ));

        await _session.SaveChangesAsync(ct);
    }
}
```

FetchForWriting loads the event stream, replays events to rebuild the aggregate, and uses optimistic concurrency control. If another write happened between fetch and save, a concurrency exception is thrown.
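In practice that concurrency exception needs handling: surface it to the caller as a conflict, or retry the command against the fresh stream. A hedged sketch; the retry count and policy are illustrative, and `handler` is an `AddItemHandler` from above:

```csharp
// Retry on Marten's optimistic concurrency failure. Re-invoking the
// handler re-fetches the stream, so validation runs against the
// latest aggregate state before re-appending.
var attempts = 0;
while (true)
{
    try
    {
        await handler.Handle(command, ct);
        break;
    }
    catch (Marten.Exceptions.ConcurrencyException) when (++attempts < 3)
    {
        // Another writer won the race; loop and try again.
    }
}
```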

Projections (Read Models)

```csharp
// CartSummary must be a record: the Apply methods below use `with` expressions.
public sealed class CartSummaryProjection : SingleStreamProjection<CartSummary>
{
    public CartSummaryProjection()
    {
        // Remove the read model once the cart is checked out
        DeleteEvent<CartCheckedOut>();
    }

    public CartSummary Create(CartCreated @event) => new()
    {
        Id = @event.CartId,
        CustomerId = @event.CustomerId,
        CreatedAt = @event.Timestamp,
        ItemCount = 0,
        TotalAmount = 0
    };

    public CartSummary Apply(ItemAdded @event, CartSummary current) => current with
    {
        ItemCount = current.ItemCount + @event.Quantity,
        TotalAmount = current.TotalAmount + (@event.Price * @event.Quantity)
    };

    // Assumes ItemRemoved carries the quantity and price that were removed
    public CartSummary Apply(ItemRemoved @event, CartSummary current) => current with
    {
        ItemCount = current.ItemCount - @event.Quantity,
        TotalAmount = current.TotalAmount - (@event.Price * @event.Quantity)
    };
}
```

Marten runs projections inline (same transaction as event append) or asynchronously (eventual consistency but higher throughput).

Registration in Program.cs

```csharp
builder.Services.AddMarten(opts =>
{
    opts.Connection(builder.Configuration.GetConnectionString("Postgres")!);
    opts.Events.StreamIdentity = StreamIdentity.AsGuid;
    opts.Projections.Add<CartSummaryProjection>(ProjectionLifecycle.Inline);
    opts.Projections.Add<DailyRevenueProjection>(ProjectionLifecycle.Async);
})
.UseLightweightSessions()
.AddAsyncDaemon(DaemonMode.HotCold); // For async projections
```

Production Challenges Nobody Mentions

Snapshotting

Replaying 10,000 events to rebuild an aggregate is slow. Marten supports automatic snapshotting:

```csharp
// Register the aggregate as a snapshot projection: Marten persists the
// latest ShoppingCart state as a document and starts from it instead of
// folding the whole stream on every load.
opts.Projections.Snapshot<ShoppingCart>(SnapshotLifecycle.Async);
```

Event Versioning and Upcasting

Your events will evolve. Never modify stored events — use upcasters:

```csharp
public sealed class OrderCreatedV1ToV2Upcaster : EventUpcaster<OrderCreatedV1, OrderCreatedV2>
{
    protected override OrderCreatedV2 Upcast(OrderCreatedV1 old) => new(
        old.OrderId,
        old.CustomerId,
        Currency: "EUR" // New field with default
    );
}

// Registered inside AddMarten:
// opts.Events.Upcast<OrderCreatedV1ToV2Upcaster>();
```

Projection Rebuilds

When you change a projection's logic, you need to replay all events to rebuild the read model. With Marten:

```bash
# Marten.CommandLine (Oakton) projections command: replays all events
# through the registered projections to rebuild their read models
dotnet run -- projections --rebuild
```

For large event stores (millions of events), this can take hours. Plan for it.

Eventual Consistency in Read Models

Async projections mean queries might return stale data. Solutions:

  • Return the event version with writes, let the client poll until the read model catches up
  • Use inline projections for critical read models (sacrifices throughput)
  • Design the UI to be optimistic (show the expected state immediately)
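The first option can be a small helper: the write returns the stream version it produced, and the client polls the read model until its projected version catches up. A sketch; `fetchProjectedVersion` is a hypothetical delegate, for example a query against a version column on the read model:

```csharp
using System;
using System.Threading.Tasks;

public static class ReadModelSync
{
    // Poll until the read model's projected version reaches the version
    // returned by the write, or give up after maxAttempts.
    public static async Task<bool> WaitForReadModel(
        long writtenVersion,
        Func<Task<long>> fetchProjectedVersion,
        int maxAttempts = 10,
        int delayMs = 100)
    {
        for (var attempt = 0; attempt < maxAttempts; attempt++)
        {
            if (await fetchProjectedVersion() >= writtenVersion)
                return true;
            await Task.Delay(delayMs);
        }
        return false;
    }
}
```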

Event Lifecycle: From Command to Read Model

[Sequence diagram: a command is validated against the aggregate rebuilt from its events, the resulting event is appended to the store, and inline or async projections update the read models.]

When to Combine with CQRS (and When Not To)

Use CQRS + Event Sourcing together when:

  • Read and write patterns differ significantly (many readers, few writers)
  • Different read models need different shapes of the same data
  • You need read model scalability independent of write throughput

Use Event Sourcing without CQRS when:

  • A single read model suffices
  • You primarily need the audit trail, not read optimization

Use CQRS without Event Sourcing when:

  • You need read/write separation for performance (read replicas)
  • Your domain is simple but read load is high

Cost-Benefit Summary

| Benefit | Cost |
| --- | --- |
| Complete audit trail | Event versioning complexity |
| Temporal queries | Projection rebuild time |
| Multiple read models | Eventual consistency |
| Natural domain modelling | Team learning curve |
| Debug production issues (replay events) | Storage growth (append-only) |

The complexity is real. But for the right domains — financial transactions, insurance claims, logistics workflows, compliance-heavy systems — Event Sourcing transforms impossible debugging into trivial event replay.


Considering Event Sourcing for your enterprise system? Contact us — we help teams adopt the pattern where it genuinely adds value and avoid it where it does not.

Topics

event sourcing .NET · CQRS pattern · Marten EventStore · domain-driven design · enterprise architecture patterns

Frequently Asked Questions

When is Event Sourcing worth it?

Event Sourcing is worth it when you need a complete audit trail (financial systems, healthcare), when your domain has complex state transitions that are better modelled as events, when you need temporal queries (what was the state at time T), or when multiple read models from the same data are required.
