I’ve been thinking a lot about how we can build systems that not only process data quickly but also present it in a way that feels immediate and responsive. In today’s fast-paced digital environment, waiting for data to refresh feels outdated. That’s why I want to share a practical approach to building a high-performance real-time analytics dashboard using WebSockets, Redis Streams, and React Query.
Let’s start with the foundation. Why use Redis Streams instead of a traditional message queue? A classic queue delivers each message to one consumer and then discards it, which makes replay and multi-consumer fan-out awkward when you’re processing thousands of events per second. Redis Streams provides exactly what we need: an append-only log with persistent storage and consumer groups, so multiple processors can read from the same stream, each entry stays pending until it’s acknowledged, and nothing is lost if a worker crashes.
Here’s how I set up the Redis client and stream management:
// services/redis-client.ts
import Redis from 'ioredis';

export interface AnalyticsEvent {
  type: string;
  userId: string;
  timestamp: number;
  metadata: Record<string, unknown>;
}

export class RedisStreamManager {
  private redis: Redis;
  private readonly STREAM_KEY = 'analytics:events';

  constructor() {
    this.redis = new Redis({
      host: process.env.REDIS_HOST ?? '127.0.0.1',
      port: parseInt(process.env.REDIS_PORT ?? '6379', 10),
      // Back off a little more on each retry instead of hammering the server
      retryStrategy: (times) => Math.min(times * 100, 2000),
      maxRetriesPerRequest: 3
    });
  }

  // Append an event to the stream; '*' tells Redis to auto-generate the entry ID
  async addEvent(event: AnalyticsEvent): Promise<string | null> {
    const eventData = {
      type: event.type,
      userId: event.userId,
      timestamp: event.timestamp.toString(),
      metadata: JSON.stringify(event.metadata)
    };
    // XADD takes a flat field/value list: field1, value1, field2, value2, ...
    return this.redis.xadd(
      this.STREAM_KEY,
      '*',
      ...Object.entries(eventData).flat()
    );
  }
}
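Writing events is only half of the Redis story. On the consuming side, a consumer group lets several workers share the stream without double-processing anything. Here’s a minimal sketch of that read loop; the group name, consumer name, and handleEvent callback are placeholders I’ve chosen for illustration, not part of the production setup:

// services/stream-consumer.ts (illustrative sketch)
import Redis from 'ioredis';

const redis = new Redis();
const STREAM_KEY = 'analytics:events';
const GROUP = 'dashboard-aggregators';    // assumed group name
const CONSUMER = `worker-${process.pid}`; // unique per process

export async function consumeForever(
  handleEvent: (id: string, fields: string[]) => Promise<void>
): Promise<void> {
  // Create the group once; MKSTREAM also creates the stream if it's missing.
  // Redis raises BUSYGROUP when the group already exists, so we swallow that.
  await redis.xgroup('CREATE', STREAM_KEY, GROUP, '$', 'MKSTREAM').catch(() => {});

  while (true) {
    // '>' requests entries never delivered to this group; BLOCK waits up to 5s
    const res = await redis.xreadgroup(
      'GROUP', GROUP, CONSUMER,
      'COUNT', 100, 'BLOCK', 5000,
      'STREAMS', STREAM_KEY, '>'
    ) as [string, [string, string[]][]][] | null;
    if (!res) continue;
    for (const [, entries] of res) {
      for (const [id, fields] of entries) {
        await handleEvent(id, fields);
        // Acknowledge so the entry leaves this consumer's pending list
        await redis.xack(STREAM_KEY, GROUP, id);
      }
    }
  }
}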
Now, what happens when you need to process these events and push them to connected clients in real time? This is where WebSockets come into play. But have you considered how to handle backpressure when clients can’t keep up with the data flow?
I built a WebSocket server that acts as a bridge between Redis Streams and the frontend. It consumes events from the stream, aggregates them into meaningful metrics, and broadcasts updates to all connected clients. The key insight is to batch-process events rather than send each one individually: this cuts WebSocket overhead dramatically while preserving the real-time feel. (I sketch a simple batcher right after the server code below.)
// server/websocket-server.ts
import { WebSocket, WebSocketServer } from 'ws';

// Shape of the aggregated snapshot the server broadcasts
// (the aggregation section later explains where these numbers come from)
export interface MetricsSnapshot {
  windowStart: number; // epoch ms of the aggregation window
  activeUsers: number;
  conversionRate: number;
  revenue: number;
}

export class AnalyticsWebSocketServer {
  private wss: WebSocketServer;
  private connectedClients: Set<WebSocket> = new Set();

  constructor(port: number) {
    this.wss = new WebSocketServer({ port });
    this.wss.on('connection', (ws) => {
      this.connectedClients.add(ws);
      ws.on('close', () => this.connectedClients.delete(ws));
    });
  }

  // Serialize once, then fan out to every client that is still open
  broadcastMetrics(metrics: MetricsSnapshot): void {
    const data = JSON.stringify(metrics);
    this.connectedClients.forEach((client) => {
      if (client.readyState === WebSocket.OPEN) {
        client.send(data);
      }
    });
  }
}
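And here is the batching idea as a small sketch. The 250ms flush interval and the aggregate() callback are assumptions I’ve made for illustration; tune both to your traffic:

// server/event-batcher.ts (illustrative sketch)
import { AnalyticsWebSocketServer, MetricsSnapshot } from './websocket-server';

export class EventBatcher<T> {
  private buffer: T[] = [];

  constructor(
    private server: AnalyticsWebSocketServer,
    private aggregate: (events: T[]) => MetricsSnapshot, // hypothetical reducer
    flushIntervalMs = 250 // assumed: ~4 broadcasts/sec still feels live
  ) {
    setInterval(() => this.flush(), flushIntervalMs);
  }

  add(event: T): void {
    this.buffer.push(event);
  }

  private flush(): void {
    if (this.buffer.length === 0) return;
    // One WebSocket frame summarizes the whole batch instead of N frames
    const snapshot = this.aggregate(this.buffer);
    this.buffer = [];
    this.server.broadcastMetrics(snapshot);
  }
}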
On the frontend, React Query makes managing server state painless. Instead of manually handling WebSocket messages and wiring them into component state, we push each message into React Query’s cache and let components subscribe to it through a hook. The beauty of this approach is that caching, background updates, and error recovery come out of the box.
Here’s how I set up the React dashboard to consume WebSocket data:
// hooks/useLiveMetrics.ts
import { useQuery } from 'react-query';

// Mirrors the MetricsSnapshot shape the server broadcasts
export interface LiveMetrics {
  windowStart: number;
  activeUsers: number;
  conversionRate: number;
  revenue: number;
}

export const useLiveMetrics = () => {
  return useQuery<LiveMetrics | null>(
    'live-metrics',
    // Never called: with enabled: false the cache is fed exclusively
    // by the WebSocket handler below via queryClient.setQueryData
    () => null,
    {
      enabled: false,
      staleTime: Infinity,
      cacheTime: Infinity
    }
  );
};
// components/WebSocketManager.tsx
import { useEffect } from 'react';
import { useQueryClient } from 'react-query';

const WS_URL = process.env.REACT_APP_WS_URL ?? 'ws://localhost:8080'; // assumed env var

export const WebSocketManager = () => {
  const queryClient = useQueryClient();
  useEffect(() => {
    const ws = new WebSocket(WS_URL);
    // Each broadcast replaces the cached snapshot; every useLiveMetrics consumer re-renders
    ws.onmessage = (event) => {
      queryClient.setQueryData('live-metrics', JSON.parse(event.data));
    };
    return () => ws.close();
  }, [queryClient]);
  return null; // renders nothing; it only bridges the socket into the cache
};
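Any component can then read the live snapshot without knowing a WebSocket exists. A quick usage sketch (the Dashboard component and its markup are mine, not from the original code):

// components/Dashboard.tsx (usage sketch)
import { useLiveMetrics } from '../hooks/useLiveMetrics';

export const Dashboard = () => {
  const { data } = useLiveMetrics();
  if (!data) return <p>Waiting for the first snapshot…</p>;
  return <p>{data.activeUsers} active users right now</p>;
};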
Performance optimization becomes crucial when dealing with high-frequency updates. I found that throttling UI updates to 60fps while processing data at full speed provides the best user experience. The dashboard feels smooth while still processing every event behind the scenes.
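In practice that throttle can be as simple as committing to the cache once per animation frame. A minimal sketch, assuming the WebSocket handler above calls pushThrottled instead of writing to the cache directly:

// utils/push-throttled.ts (illustrative sketch)
import type { QueryClient } from 'react-query';
import type { LiveMetrics } from '../hooks/useLiveMetrics';

let latest: LiveMetrics | null = null;
let frameScheduled = false;

// Data is parsed at full speed, but the cache (and therefore the UI)
// is updated at most once per animation frame (~60fps).
export function pushThrottled(queryClient: QueryClient, raw: string): void {
  latest = JSON.parse(raw);
  if (frameScheduled) return;
  frameScheduled = true;
  requestAnimationFrame(() => {
    frameScheduled = false;
    queryClient.setQueryData('live-metrics', latest);
  });
}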
What about error handling and reconnection strategies? Network issues are inevitable, so I implemented exponential backoff for reconnections and built mechanisms to catch up on missed data when reconnecting. Redis Streams’ message persistence ensures we never lose data during connection drops.
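Here’s a sketch of the client side of that strategy. The catch-up handshake (the 'catch-up' message and the streamId the server tags onto each snapshot) is a protocol I’m assuming for illustration; your wire format may differ:

// utils/reconnecting-socket.ts (illustrative sketch)
export function connectWithBackoff(
  url: string,
  onMessage: (raw: string) => void,
  maxDelayMs = 30_000
): void {
  let attempt = 0;
  let lastId = '0-0'; // last Redis stream entry ID we saw

  const open = () => {
    const ws = new WebSocket(url);
    ws.onopen = () => {
      attempt = 0; // healthy again, reset the backoff
      // Ask the server to replay everything after lastId (e.g. via XRANGE)
      ws.send(JSON.stringify({ type: 'catch-up', since: lastId }));
    };
    ws.onmessage = (event) => {
      const msg = JSON.parse(event.data);
      if (msg.streamId) lastId = msg.streamId; // assumed: server tags snapshots
      onMessage(event.data);
    };
    ws.onclose = () => {
      // Exponential backoff with a cap: 1s, 2s, 4s, ... up to maxDelayMs
      const delay = Math.min(1000 * 2 ** attempt, maxDelayMs);
      attempt += 1;
      setTimeout(open, delay);
    };
  };
  open();
}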
The final piece is aggregation. Instead of sending raw events to the client, the server aggregates data into minute-by-minute snapshots that include computed metrics like active users, conversion rates, and revenue totals. This reduces the data transfer size by orders of magnitude while providing exactly what the dashboard needs.
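As a sketch, that per-minute reducer can look like this. The 'page_view' and 'purchase' event types are stand-ins I’ve picked; swap in whatever your event taxonomy defines:

// server/aggregator.ts (illustrative sketch)
import { MetricsSnapshot } from './websocket-server';

interface RawEvent {
  type: string;
  userId: string;
  revenue?: number;
}

export function aggregateMinute(events: RawEvent[], windowStart: number): MetricsSnapshot {
  const users = new Set(events.map((e) => e.userId));
  const views = events.filter((e) => e.type === 'page_view');    // assumed event type
  const purchases = events.filter((e) => e.type === 'purchase'); // assumed event type
  return {
    windowStart,
    activeUsers: users.size,
    conversionRate: views.length > 0 ? purchases.length / views.length : 0,
    revenue: purchases.reduce((sum, e) => sum + (e.revenue ?? 0), 0)
  };
}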
Building this system taught me that real-time analytics isn’t just about speed – it’s about building resilient systems that can handle scale while providing meaningful insights. The combination of Redis Streams for reliable event processing, WebSockets for real-time communication, and React Query for efficient state management creates a robust foundation for any real-time dashboard.
I’d love to hear your thoughts on this approach. What challenges have you faced with real-time data? Share your experiences in the comments below, and if you found this useful, please like and share with others who might benefit from these techniques.