Ever wonder how e-commerce giants track user behavior instantly during flash sales? That question kept nagging at me while building analytics for a high-traffic online marketplace. We needed real-time insights without slowing down transactions—a perfect case for event-driven architecture. I’ll show you how to implement this using NestJS, Redis, and MongoDB. Stick around to see how these technologies form a powerhouse trio for live analytics.
First, let’s address why traditional approaches fall short. In synchronous systems, analytics tracking blocks critical operations like payment processing. Imagine your checkout freezing because an analytics service is overloaded! Event-driven patterns decouple these processes. When a user places an order, we emit an event instead of calling services directly:
// Order service - Event producer
async createOrder(orderData: CreateOrderDto) {
const order = await this.orderRepository.save(orderData);
await this.eventBus.publish(new OrderCreatedEvent(order)); // Non-blocking
return order;
}
This simple shift prevents cascading failures. But how do we ensure events reach their destinations reliably? Redis acts as our central nervous system. Its pub/sub model handles message routing while streams enable event sourcing. Here’s our event bus implementation:
// Redis Event Bus Core
async publish<T extends BaseEvent>(event: T) {
const channel = `events:${event.getEventName()}`;
await this.publisher.publish(channel, JSON.stringify({
eventName: event.getEventName(),
payload: event.getPayload(),
timestamp: new Date()
}));
// Event sourcing storage
await this.publisher.xadd(`event-stream`, '*',
'event', JSON.stringify(event));
}
Notice the dual approach? Events broadcast immediately via pub/sub while persisting in Redis Streams. This gives us replayability—crucial for debugging or regenerating analytics after failures. Now, what happens when services need to react? Consumers subscribe to specific channels:
// Analytics service - Event consumer
this.eventBus.subscribe('OrderCreatedEvent', async (event) => {
await this.calculateLifetimeValue(event.customerId);
this.updateRealTimeDashboard(event); // WebSocket push
});
Here’s where things get interesting. For real-time dashboards, we pair Redis with WebSockets. When the analytics service processes events, it pushes updates to connected clients via Socket.IO:
// WebSocket gateway for live dashboards
@WebSocketGateway()
export class AnalyticsGateway {
@WebSocketServer() server: Server;
updateDashboard(data: AnalyticsDTO) {
this.server.emit('analyticsUpdate', data); // Client-side updates
}
}
But how do we prevent data loss during outages? Our MongoDB event store provides durability. By persisting events in a collection, we create an audit trail:
// Event sourcing schema
@Schema({ versionKey: false })
export class EventStoreDocument {
@Prop({ required: true })
aggregateId: string;
@Prop({ required: true, index: true })
eventName: string;
@Prop({ type: Object })
payload: Record<string, any>;
@Prop({ default: Date.now })
timestamp: Date;
}
What about performance under load? We optimize through:
- Batching: Process events in groups
- Parallelism: Redis consumers in worker threads
- Selective indexing: Only index queried fields in MongoDB
Testing is critical—especially for failure scenarios. We simulate network partitions and Redis downtime using Jest:
// Failure recovery test
test('replays events after service restart', async () => {
await publishTestEvents(100); // Simulate downtime
const consumer = startEventConsumer();
expect(consumer.processedCount).toEqual(100); // Verifies replay
});
Common pitfalls? Watch for:
- Event ordering issues (use Redis Streams’ FIFO)
- Overly chatty services (batch small events)
- Missing idempotency (include event IDs in handlers)
After implementing this for several e-commerce platforms, I’ve seen 40% faster transaction times with real-time analytics. The beauty lies in Redis’ speed for routing, MongoDB’s reliability for storage, and NestJS’ structure for clean separation of concerns.
This architecture transformed how we handle peak traffic events. What challenges have you faced with real-time analytics? Share your experiences below—I’d love to hear different approaches. If this helped you, pass it along to others wrestling with these systems!