Recently, I’ve been reflecting on how modern applications handle data. We often focus so much on the current state that we lose the story of how we got there. This realization hit me while working on a financial system where audit trails were not just nice-to-have but mandatory. That’s when I discovered event sourcing—a pattern that fundamentally changes how we think about data persistence.
What if I told you there’s a way to never lose any business transaction? Event sourcing does exactly that by storing every change as an immutable event. Instead of overwriting data, we build state by replaying these events. This approach has transformed how I build resilient systems, especially with TypeScript’s type safety and EventStore’s robust event storage.
Have you ever wondered how systems like banking platforms or e-commerce sites maintain perfect audit trails? The secret often lies in event sourcing. Let me show you how to implement this pattern step by step.
First, let’s set up our environment. We’ll use TypeScript for type safety, EventStoreDB for event storage, and Express for our API layer. Here’s how to initialize the project:
npm init -y
npm install express @eventstore/db-client uuid class-validator
npm install -D typescript @types/express @types/uuid
The core of event sourcing lies in domain events. These represent business facts that have occurred. Think of them as the building blocks of your application’s history. Here’s how I define a base event class:
import { v4 as uuid } from 'uuid';

abstract class DomainEvent {
  constructor(
    public readonly eventId: string,
    public readonly aggregateId: string,
    public readonly occurredOn: Date,
    public readonly eventType: string
  ) {}
}

class OrderCreated extends DomainEvent {
  constructor(aggregateId: string, public readonly customerId: string) {
    super(uuid(), aggregateId, new Date(), 'OrderCreated');
  }
}
Notice how each event captures a specific business moment. But how do we reconstruct current state from these events? That’s where aggregates come in. An aggregate is a cluster of related objects that we treat as a unit for data changes.
Consider this order aggregate that rebuilds its state from events:
enum OrderStatus {
  DRAFT = 'DRAFT',
  CONFIRMED = 'CONFIRMED',
}

class Order {
  private status: OrderStatus = OrderStatus.DRAFT;

  constructor(private id: string, private events: DomainEvent[] = []) {
    this.replayEvents(events);
  }

  private replayEvents(events: DomainEvent[]) {
    events.forEach(event => {
      if (event.eventType === 'OrderCreated') {
        this.status = OrderStatus.CONFIRMED;
      }
      // Handle other event types...
    });
  }
}
Why would we choose this approach over traditional CRUD? The answer becomes clear when you need to debug production issues or analyze business trends. Since every change is stored, you can recreate the system’s state at any point in time.
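To make point-in-time reconstruction concrete, here is a minimal sketch: filter events by timestamp, then fold them into state. The event names and the flat state shape are illustrative assumptions, not a fixed API:

```typescript
interface RecordedEvent {
  eventType: string;
  occurredOn: Date;
}

type OrderState = { status: string };

// Fold only the events that happened on or before the requested moment.
function replayUntil(events: RecordedEvent[], pointInTime: Date): OrderState {
  return events
    .filter(e => e.occurredOn <= pointInTime)
    .reduce<OrderState>((state, event) => {
      switch (event.eventType) {
        case 'OrderCreated':
          return { ...state, status: 'CONFIRMED' };
        case 'OrderShipped':
          return { ...state, status: 'SHIPPED' };
        default:
          return state;
      }
    }, { status: 'DRAFT' });
}
```

Asking "what did this order look like last Tuesday?" becomes a pure function call over the stream, which is exactly what makes debugging production issues so much easier.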
Now, let’s look at storing these events. EventStoreDB is purpose-built for this pattern. Here’s how I connect to it:
import { EventStoreDBClient, jsonEvent } from '@eventstore/db-client';

const client = EventStoreDBClient.connectionString(
  'esdb://localhost:2113?tls=false'
);

async function appendEvent(streamName: string, event: DomainEvent) {
  // EventStoreDB expects events wrapped in its jsonEvent envelope.
  await client.appendToStream(
    streamName,
    jsonEvent({ type: event.eventType, data: { ...event } })
  );
}
But what about reading data? This is where CQRS (Command Query Responsibility Segregation) shines. Commands write events, while queries read from optimized projections. This separation allows each side to scale independently.
Here’s a simple command handler:
class CreateOrderHandler {
  async handle(command: CreateOrder) {
    const order = Order.create(command.orderId, command.customerId);
    const events = order.getUncommittedEvents();
    await eventStore.append(`order-${command.orderId}`, events);
  }
}
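The handler above assumes a static factory on the aggregate and a buffer of uncommitted events. Here is a minimal sketch of both mechanics; the names (`OrderAggregate`, `uncommitted`) are my assumptions, not a prescribed API:

```typescript
class OrderCreatedEvent {
  public readonly eventType = 'OrderCreated';
  constructor(
    public readonly aggregateId: string,
    public readonly customerId: string
  ) {}
}

class OrderAggregate {
  private uncommitted: OrderCreatedEvent[] = [];

  private constructor(public readonly id: string) {}

  static create(orderId: string, customerId: string): OrderAggregate {
    const order = new OrderAggregate(orderId);
    // Record the fact as an event instead of mutating state directly.
    order.uncommitted.push(new OrderCreatedEvent(orderId, customerId));
    return order;
  }

  getUncommittedEvents(): OrderCreatedEvent[] {
    return [...this.uncommitted];
  }
}
```

The key idea is that business methods never write state directly; they record events, and the handler persists whatever has accumulated in the buffer.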
And a corresponding query handler:
class OrderQueryHandler {
  async getOrder(orderId: string) {
    return await orderProjection.get(orderId);
  }
}
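The `orderProjection` the query handler reads from is just a fold over the event stream into a query-friendly shape. Here is a minimal projection sketch; the event names and read-model fields mirror the examples above but are assumptions:

```typescript
interface ProjectedEvent {
  eventType: string;
  aggregateId: string;
  customerId?: string;
}

interface OrderReadModel {
  orderId: string;
  customerId?: string;
  status: string;
}

class OrderProjection {
  private byId = new Map<string, OrderReadModel>();

  // Apply each event as it arrives, keeping the read model up to date.
  apply(event: ProjectedEvent): void {
    switch (event.eventType) {
      case 'OrderCreated':
        this.byId.set(event.aggregateId, {
          orderId: event.aggregateId,
          customerId: event.customerId,
          status: 'CONFIRMED',
        });
        break;
      case 'OrderCancelled': {
        const row = this.byId.get(event.aggregateId);
        if (row) row.status = 'CANCELLED';
        break;
      }
    }
  }

  get(orderId: string): OrderReadModel | undefined {
    return this.byId.get(orderId);
  }
}
```

In production this map would typically live in a database table optimized for your queries, rebuilt at any time by replaying the stream.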
Have you considered what happens when your event schema needs to change? Event versioning is crucial here. I always include a version number in events and use upcasters to transform old events to new formats.
interface OrderCreatedV1 {
  version: 1;
  customerId: string;
}

interface OrderCreatedV2 {
  version: 2;
  customerId: string;
  createdAt: Date;
}

function upcastV1ToV2(event: OrderCreatedV1): OrderCreatedV2 {
  return {
    ...event,
    version: 2,
    createdAt: new Date()
  };
}
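At load time, I normalize mixed-version events so the domain only ever sees the latest shape. A sketch of that step, repeating the V1/V2 types for self-containment (the missing `createdAt` on upcast V1 events is filled with the load time here, which is an assumption you may want to replace with a stored timestamp):

```typescript
interface OrderCreatedV1 {
  version: 1;
  customerId: string;
}

interface OrderCreatedV2 {
  version: 2;
  customerId: string;
  createdAt: Date;
}

// Upcast on read: V2 events pass through, V1 events are transformed.
function normalize(event: OrderCreatedV1 | OrderCreatedV2): OrderCreatedV2 {
  return event.version === 2
    ? event
    : { ...event, version: 2, createdAt: new Date() };
}
```

Chaining upcasters (V1→V2→V3...) keeps each migration small as the schema evolves, and the stored events themselves never change.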
Performance can become a concern when replaying thousands of events. That’s where snapshots help. Periodically, we save the current state, so we only need to replay events after the last snapshot.
class OrderSnapshot {
  constructor(
    public readonly orderId: string,
    public readonly state: any,
    public readonly version: number
  ) {}
}
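Loading then becomes: restore from the latest snapshot, replay only the events recorded after it. A sketch of that flow, assuming each stored event carries a monotonically increasing `streamPosition` (the field name is my assumption):

```typescript
interface VersionedEvent {
  eventType: string;
  streamPosition: number;
}

interface Snapshot {
  state: { status: string };
  version: number;
}

function loadWithSnapshot(
  snapshot: Snapshot | null,
  events: VersionedEvent[]
): { status: string } {
  // Start from the snapshot if we have one, otherwise from scratch.
  let state = snapshot ? { ...snapshot.state } : { status: 'DRAFT' };

  // Replay only events recorded after the snapshot was taken.
  const tail = events.filter(e => e.streamPosition > (snapshot?.version ?? -1));
  for (const event of tail) {
    if (event.eventType === 'OrderCreated') state = { status: 'CONFIRMED' };
    if (event.eventType === 'OrderShipped') state = { status: 'SHIPPED' };
  }
  return state;
}
```

With a snapshot every few hundred events, load time stays flat no matter how long the stream grows.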
Testing event-sourced systems requires a different approach. I focus on testing the behavior through events:
describe('Order', () => {
  it('should confirm order', () => {
    const order = Order.create('order-1', 'customer-1');
    order.confirm();
    expect(order.getUncommittedEvents()).toContainEqual(
      expect.objectContaining({ eventType: 'OrderConfirmed' })
    );
  });
});
Throughout my journey with event sourcing, I’ve found it particularly valuable for complex business domains. The initial setup requires more thought, but the long-term benefits in debuggability and business intelligence are immense.
What challenges have you faced with traditional data persistence? Could storing events instead of states solve some of those problems? I’d love to hear your thoughts in the comments.
If this approach to building systems resonates with you, please share this article with your team or colleagues. Let’s continue the conversation about building more transparent and resilient applications together. Your experiences and insights could help others in our community—feel free to leave a comment below!