I’ve been thinking about building modern, scalable systems that can handle complexity while remaining maintainable. The combination of event sourcing with microservices offers a robust solution, particularly when performance is critical. Let me walk you through creating a high-performance event-driven service using Fastify, EventStore, and TypeScript.
Why consider this approach? Traditional CRUD systems often lose valuable historical context. What if you could track every change to your data, replay events to rebuild state, and maintain a complete audit trail without complex database migrations? Event sourcing provides exactly that capability.
Let’s start by setting up our environment. I prefer using a structured approach that separates concerns clearly.
// package.json dependencies
{
  "dependencies": {
    "fastify": "^4.0.0",
    "@eventstore/db-client": "^3.0.0",
    "uuid": "^9.0.0"
  },
  "devDependencies": {
    "typescript": "^5.0.0"
  }
}
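By "structured approach" I mean a layout along these lines. The folder names are purely illustrative, nothing here is required by any of the libraries:

src/
  domain/        # event and command types, command handlers
  eventstore/    # EventStoreClient wrapper, serialization helpers
  http/          # Fastify server, routes, hooks
  projections/   # read models built from event streams
  index.ts       # composition root that wires everything together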
The domain model forms the heart of our system. How do we ensure our events properly represent business operations? We define clear event types with TypeScript’s type system.
interface OrderCreatedEvent {
  type: 'OrderCreated';
  data: {
    orderId: string;
    customerId: string;
    totalAmount: number;
  };
}
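A few later snippets lean on an OrderEvent union, a CreateOrderCommand, and a handleCreateOrder function, so let me sketch minimal versions here. Treat the exact shapes as illustrative, not prescriptive:

// Additional event types join this discriminated union as the domain grows
type OrderEvent = OrderCreatedEvent; // | OrderShippedEvent | ...

// Read-model shape produced by the projection later in this post
interface Order {
  orderId: string;
  customerId: string;
  totalAmount: number;
  status: string;
}

interface CreateOrderCommand {
  orderId: string;
  customerId: string;
  totalAmount: number;
}

// Command handler: enforce invariants, then return the resulting events
async function handleCreateOrder(
  command: CreateOrderCommand
): Promise<OrderEvent[]> {
  if (command.totalAmount <= 0) {
    throw new Error('Order total must be positive');
  }
  return [{
    type: 'OrderCreated',
    data: {
      orderId: command.orderId,
      customerId: command.customerId,
      totalAmount: command.totalAmount
    }
  }];
}

Note that these domain events still need converting to EventData (via jsonEvent, shown in the batching section near the end) before they can be appended.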
Connecting to EventStore requires careful configuration. I’ve found that a dedicated client wrapper helps manage connections and retries effectively.
import { EventStoreDBClient, EventData } from '@eventstore/db-client';

class EventStoreClient {
  private client: EventStoreDBClient;

  constructor() {
    // Single-node connection; disable TLS for local development only
    this.client = EventStoreDBClient.connectionString(
      'esdb://localhost:2113?tls=false'
    );
  }

  async appendToStream(streamName: string, events: EventData[]) {
    return this.client.appendToStream(streamName, events);
  }
}
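Since the opening promise was replaying events to rebuild state, I also give the wrapper a read path. readStream and its options are the client's real API; the readEvents method name and shape are mine:

import { FORWARDS, START, ResolvedEvent } from '@eventstore/db-client';

// Added inside EventStoreClient: replay a stream from its first event
async readEvents(streamName: string): Promise<ResolvedEvent[]> {
  const events: ResolvedEvent[] = [];
  const stream = this.client.readStream(streamName, {
    direction: FORWARDS,
    fromRevision: START
  });
  for await (const resolvedEvent of stream) {
    events.push(resolvedEvent);
  }
  return events;
}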
Fastify serves as our HTTP layer, offering excellent performance out of the box. The plugin system allows us to organize code cleanly.
import fastify from 'fastify';

const server = fastify({
  logger: {
    level: 'info',
    transport: {
      target: 'pino-pretty' // human-readable logs for development
    }
  }
});

const eventStore = new EventStoreClient();

server.post('/orders', async (request, reply) => {
  const command = request.body as CreateOrderCommand;
  const events = await handleCreateOrder(command);
  await eventStore.appendToStream(`order-${command.orderId}`, events);
  reply.code(201);
  return { success: true, orderId: command.orderId };
});
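One caveat: the handler casts request.body without checking it. Fastify's built-in JSON Schema validation can reject malformed commands with a 400 before the handler ever runs. Here's a sketch using the command fields assumed earlier:

server.post('/orders', {
  schema: {
    body: {
      type: 'object',
      required: ['orderId', 'customerId', 'totalAmount'],
      properties: {
        orderId: { type: 'string' },
        customerId: { type: 'string' },
        totalAmount: { type: 'number', exclusiveMinimum: 0 }
      }
    }
  }
}, async (request, reply) => {
  // By this point the body has already passed validation
  const command = request.body as CreateOrderCommand;
  const events = await handleCreateOrder(command);
  await eventStore.appendToStream(`order-${command.orderId}`, events);
  reply.code(201);
  return { success: true, orderId: command.orderId };
});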
Error handling in distributed systems requires special attention. What happens when EventStore is temporarily unavailable? We implement retry logic with exponential backoff.
async function withRetry<T>(
  operation: () => Promise<T>,
  maxAttempts: number = 3
): Promise<T> {
  let attempt = 0;
  while (attempt < maxAttempts) {
    try {
      return await operation();
    } catch (error) {
      attempt++;
      if (attempt === maxAttempts) throw error;
      // Exponential backoff: 2s, 4s, 8s, ...
      await new Promise(resolve =>
        setTimeout(resolve, Math.pow(2, attempt) * 1000)
      );
    }
  }
  throw new Error('Max retry attempts exceeded');
}
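In the route handler, the append then goes through the wrapper. In production I'd also add jitter and cap the delay, but the basic shape is:

// Retry the append up to three times before failing the request
await withRetry(() =>
  eventStore.appendToStream(`order-${command.orderId}`, events)
);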
Projections transform our events into readable views. They enable efficient querying without affecting the event stream’s integrity.
class OrderProjection {
  // In-memory read model keyed by order ID
  private orders: Map<string, Order> = new Map();

  applyEvent(event: OrderEvent) {
    switch (event.type) {
      case 'OrderCreated':
        this.orders.set(event.data.orderId, {
          ...event.data,
          status: 'created'
        });
        break;
      // Handle other event types
    }
  }

  getOrder(orderId: string): Order | undefined {
    return this.orders.get(orderId);
  }
}
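How do events reach the projection? One common pattern with this client is a catch-up subscription that feeds every committed event through applyEvent. A minimal sketch, assuming all our streams use the order- prefix and that payloads deserialize back to the OrderEvent union:

import { EventStoreDBClient, START, streamNameFilter } from '@eventstore/db-client';

async function runOrderProjection(
  client: EventStoreDBClient,
  projection: OrderProjection
) {
  const subscription = client.subscribeToAll({
    fromPosition: START,
    filter: streamNameFilter({ prefixes: ['order-'] })
  });

  for await (const resolvedEvent of subscription) {
    if (!resolvedEvent.event) continue; // skip unresolvable link events
    const { type, data } = resolvedEvent.event;
    // Assumes payloads match our OrderEvent union; validate in production
    projection.applyEvent({ type, data } as OrderEvent);
  }
}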
Monitoring proves crucial in production systems. We integrate logging and metrics to track performance and identify issues early.
server.addHook('onResponse', (request, reply, done) => {
  const responseTime = reply.getResponseTime();
  metrics.observeResponseTime(responseTime);
  request.log.info({ responseTime }, 'Request completed');
  done();
});
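I've left the metrics object abstract above; if you want something concrete, a prom-client histogram works well. The wrapper shape here is just one option:

import { Histogram, register } from 'prom-client';

const responseTimeHistogram = new Histogram({
  name: 'http_response_time_ms',
  help: 'HTTP response time in milliseconds',
  buckets: [5, 10, 25, 50, 100, 250, 500, 1000]
});

const metrics = {
  observeResponseTime: (ms: number) => responseTimeHistogram.observe(ms)
};

// Standard Prometheus scrape endpoint
server.get('/metrics', async (_request, reply) => {
  reply.header('Content-Type', register.contentType);
  return register.metrics();
});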
Testing event-sourced systems requires a different approach. We focus on verifying event sequences and state rebuilding.
describe('Order Aggregate', () => {
  it('should create order with correct events', async () => {
    const command: CreateOrderCommand = { /* test data */ };
    const events = await handleCreateOrder(command);

    expect(events).toHaveLength(1);
    expect(events[0].type).toBe('OrderCreated');
    expect(events[0].data.orderId).toBe(command.orderId);
  });
});
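The state-rebuilding half of that approach replays events through the projection and asserts on the resulting view:

describe('OrderProjection', () => {
  it('should rebuild order state from a replayed event', () => {
    const projection = new OrderProjection();

    projection.applyEvent({
      type: 'OrderCreated',
      data: { orderId: 'order-1', customerId: 'cust-1', totalAmount: 99.5 }
    });

    expect(projection.getOrder('order-1')?.status).toBe('created');
    expect(projection.getOrder('order-1')?.totalAmount).toBe(99.5);
  });
});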
Performance optimization becomes important at scale. We use connection pooling, batch processing, and efficient serialization.
import { jsonEvent } from '@eventstore/db-client';

const eventBatch: EventData[] = events.map(event =>
  jsonEvent({
    type: event.type,
    data: event.data,
    metadata: { timestamp: new Date().toISOString() }
  })
);

// Appending the whole batch costs a single round trip to EventStore
await eventStore.appendToStream(streamName, eventBatch);
This architecture scales well and stays maintainable. The clear separation between write and read models, combined with a complete historical record, creates systems that can evolve with business needs.
I’d love to hear your thoughts on this approach. Have you implemented event sourcing in your projects? What challenges did you face? Share your experiences in the comments below, and if you found this useful, please consider liking and sharing it with others who might benefit.