Advanced · Software Engineer

Build an append-only event store with projection and replay

Implement a minimal event sourcing system for a bank account domain: define domain events (AccountOpened, MoneyDeposited, MoneyWithdrawn), build an append-only event store backed by a PostgreSQL table, write a projection that replays events to derive current account state, and verify that replaying the full history produces identical results to the current state.

Why this matters

Event sourcing makes auditing, debugging, and temporal queries trivial; because you never mutate state, you can reconstruct what any entity looked like at any point in time. Building it from scratch forces you to confront the real constraints: event ordering, schema versioning, and the projection performance trade-off that frameworks hide from you.

Step-by-step guide

  1. Define your domain events

    Create a Python dataclass or Pydantic model for each event: AccountOpened(account_id, owner_name, timestamp), MoneyDeposited(account_id, amount, timestamp), MoneyWithdrawn(account_id, amount, timestamp). Each event is immutable; no methods, no mutation, just data.
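A minimal sketch of the three events as frozen dataclasses (field names follow the step above; storing amounts in integer minor units is an assumption to avoid floating-point rounding):

```python
from dataclasses import dataclass
from datetime import datetime

# frozen=True makes instances immutable: assignment after construction raises.
@dataclass(frozen=True)
class AccountOpened:
    account_id: str
    owner_name: str
    timestamp: datetime

@dataclass(frozen=True)
class MoneyDeposited:
    account_id: str
    amount: int  # minor units (pence/cents), an assumption to avoid float error
    timestamp: datetime

@dataclass(frozen=True)
class MoneyWithdrawn:
    account_id: str
    amount: int
    timestamp: datetime
```

Frozen dataclasses give you structural equality for free, which makes event-by-event assertions in tests straightforward.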

  2. Create the event store table

    Write a CREATE TABLE statement with these columns: id SERIAL PRIMARY KEY, aggregate_id UUID, event_type VARCHAR, payload JSONB, sequence_number BIGINT, occurred_at TIMESTAMPTZ. The sequence_number is per-aggregate; it is what you use to detect concurrency conflicts. Run the migration.
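One possible shape for that DDL (the UNIQUE constraint is an addition beyond the column list above; it is what lets the database reject a duplicate per-aggregate sequence number for you):

```sql
CREATE TABLE events (
    id              SERIAL PRIMARY KEY,          -- global insertion order
    aggregate_id    UUID        NOT NULL,
    event_type      VARCHAR(64) NOT NULL,
    payload         JSONB       NOT NULL,
    sequence_number BIGINT      NOT NULL,        -- per-aggregate, starts at 1
    occurred_at     TIMESTAMPTZ NOT NULL DEFAULT now(),
    UNIQUE (aggregate_id, sequence_number)       -- optimistic-concurrency guard
);
```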

  3. Implement append

    Write an append(event) function that serialises the event to JSON, increments the sequence number for the aggregate, and INSERTs a row. Raise a concurrency error if the expected sequence number does not match. Test it: open an account, deposit, withdraw, and verify 3 rows appear.
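A sketch of the append contract using an in-memory list as a stand-in for the Postgres table (the class and method names are illustrative; in the real version the sequence check is enforced by the UNIQUE constraint rather than a Python count):

```python
from typing import Any

class ConcurrencyError(Exception):
    """Raised when the caller's expected sequence number is stale."""

class InMemoryEventStore:
    """In-memory stand-in for the events table, same append contract."""

    def __init__(self) -> None:
        self._rows: list[dict[str, Any]] = []

    def append(self, aggregate_id: str, event_type: str,
               payload: dict[str, Any], expected_sequence: int) -> int:
        # Count existing events for this aggregate to get the current sequence.
        current = sum(1 for r in self._rows if r["aggregate_id"] == aggregate_id)
        if current != expected_sequence:
            raise ConcurrencyError(
                f"expected sequence {expected_sequence}, store has {current}")
        row = {
            "aggregate_id": aggregate_id,
            "event_type": event_type,
            "payload": payload,
            "sequence_number": current + 1,
        }
        self._rows.append(row)
        return row["sequence_number"]
```

The caller passes the sequence number it last saw; if another writer got there first, the append fails instead of silently interleaving histories.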

  4. Implement replay

    Write a replay(aggregate_id) function that SELECTs all events ORDER BY sequence_number and applies each to an initial state dict using a match/case or dispatch table. The final state is the account balance. Verify it matches what you expect from the 3 events you stored.

  5. Build a snapshot for performance

    For accounts with 1000+ events, replay is slow. Add a snapshot table that stores the projected state at a given sequence number. Update replay to: load the latest snapshot, then apply only events after that sequence number. Measure replay time before and after for 10,000 events.
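A sketch of snapshot-aware replay, assuming a snapshot is a dict of `{"sequence_number", "state"}` and only the deposit/withdraw cases are shown for brevity:

```python
from typing import Any, Optional

def replay_with_snapshot(events: list[dict[str, Any]],
                         snapshot: Optional[dict[str, Any]] = None) -> dict[str, Any]:
    """Resume the projection from a snapshot instead of replaying from scratch."""
    if snapshot is not None:
        state = dict(snapshot["state"])          # copy, never mutate the snapshot
        start = snapshot["sequence_number"]
    else:
        state = {"balance": 0}
        start = 0
    for event in sorted(events, key=lambda e: e["sequence_number"]):
        if event["sequence_number"] <= start:
            continue  # already folded into the snapshot
        if event["event_type"] == "MoneyDeposited":
            state["balance"] += event["payload"]["amount"]
        elif event["event_type"] == "MoneyWithdrawn":
            state["balance"] -= event["payload"]["amount"]
    return state
```

The key invariant to verify: replaying from a snapshot must produce exactly the same state as replaying the full history, which makes snapshots a pure cache you can always discard and rebuild.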

  6. Test temporal queries

    Query events for a specific account up to a given timestamp. Replay up to that point. Verify you can reconstruct the account state at any point in its history. This is the feature that makes event sourcing worth the complexity; in a CRUD system, this would require a full audit log redesign.
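A sketch of the point-in-time query, again with only the money-moving cases shown; `occurred_at` is assumed to be a timezone-aware datetime on each fetched row:

```python
from datetime import datetime
from typing import Any

def state_at(events: list[dict[str, Any]], as_of: datetime) -> dict[str, Any]:
    """Reconstruct account state as of a point in time."""
    state: dict[str, Any] = {"balance": 0}
    for event in sorted(events, key=lambda e: e["sequence_number"]):
        if event["occurred_at"] > as_of:
            break  # events are time-ordered, so nothing later can apply
        if event["event_type"] == "MoneyDeposited":
            state["balance"] += event["payload"]["amount"]
        elif event["event_type"] == "MoneyWithdrawn":
            state["balance"] -= event["payload"]["amount"]
    return state
```

In production you would push the cut-off into SQL (`WHERE occurred_at <= %s ORDER BY sequence_number`) rather than filter in Python, but the fold itself is identical.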
