Why Server-Sent Events Might Be the Real-Time Solution You Need

Discover how Server-Sent Events offer a simpler, scalable way to push real-time updates without the complexity of WebSockets.

I was building a dashboard for a client last week when I hit a wall. The page needed to show live updates—new user registrations, system alerts, that sort of thing. I immediately thought of WebSockets. But then I paused. Did I really need a full two-way communication channel? The server was sending data; the client was just listening. Setting up a WebSocket server felt like using a sledgehammer to crack a nut. That’s when I decided to take a closer look at Server-Sent Events, or SSE. It turned out to be the perfect, simpler tool for the job. If you’ve ever needed to push updates from your server without the complexity of a persistent two-way connection, this is for you. Let’s build something.

SSE is a web standard that lets a server send automatic updates to a client over a single, long-held HTTP connection. Think of it as a one-way street for data, from your server to the user’s browser. The client opens a connection and just listens. Whenever you have new data, you write it to that connection. The browser’s native EventSource API handles reconnecting if the link drops. It’s surprisingly straightforward and works over plain HTTP, which makes it friendly with most networks and proxies.

So, when should you choose SSE over something like WebSockets? It comes down to the direction of travel. Do you need the client to talk back to the server constantly, like in a chat app or a game? Use WebSockets. Is the flow mostly one-way, with the server broadcasting events like notifications, stock ticks, or live scores? SSE is your answer. It’s simpler to implement and manages reconnection logic for you. Have you considered how many of your app’s “real-time” features are actually just one-way notifications?

Let’s set up a project. Create a new directory and initialize it. We’ll use Express and TypeScript.

mkdir sse-app
cd sse-app
npm init -y
npm install express
npm install -D typescript @types/express @types/node ts-node nodemon
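We installed nodemon and ts-node but haven't wired them into the workflow yet. A couple of npm scripts take care of that; the script names and flags below are just one reasonable setup, not something the rest of this guide depends on. Add them to the "scripts" section of package.json:

"scripts": {
  "dev": "nodemon --watch src --ext ts --exec ts-node src/server.ts",
  "build": "tsc"
}

With that in place, npm run dev restarts the server whenever a file in src changes.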

Next, create a tsconfig.json file for TypeScript settings.

{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true
  }
}

Now, let’s write the core server code. Create a src folder for our source files. First, we’ll define what a message looks like, in src/types.ts.

// src/types.ts
export interface ServerEvent {
  event?: string;  // e.g., 'message', 'update'
  data: string;    // The actual content
  id?: string;     // For tracking the last received event
  retry?: number;  // How long to wait before reconnecting (ms)
}

The format is specific. Each message is just text, with lines prefixed by event:, data:, id:, or retry:. A blank line signals the end of one event. Let’s make a helper function to build these messages correctly.
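As a concrete reference, here is what a single complete event looks like on the wire; the values are invented for illustration:

event: update
id: 1
data: first line of the payload
data: second line of the payload

The browser joins the two data lines with a newline, and the blank line after them ends the event. The helper below assembles exactly this.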

// src/utils.ts
import { ServerEvent } from './types';

export function formatEventMessage(message: ServerEvent): string {
  let output = '';
  if (message.event) {
    output += `event: ${message.event}\n`;
  }
  if (message.id) {
    output += `id: ${message.id}\n`;
  }
  if (message.retry) {
    output += `retry: ${message.retry}\n`;
  }
  // Data can be multiple lines. Each line must be prefixed.
  const dataLines = message.data.split('\n');
  dataLines.forEach(line => {
    output += `data: ${line}\n`;
  });
  output += '\n'; // Crucial blank line
  return output;
}

With our tools ready, we can build the Express endpoint. The key is setting the correct headers to tell the browser this is an event stream.

// src/server.ts
import express, { Request, Response } from 'express';
import { formatEventMessage } from './utils';

const app = express();
const PORT = 3000;

// Store active client connections
const clients: Response[] = [];

app.get('/events', (req: Request, res: Response) => {
  // Set headers for SSE
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive',
  });

  // Send a comment initially to establish the connection
  res.write(': connection established\n\n');

  // Add this response object to our list of clients
  clients.push(res);

  // Remove client when connection closes
  req.on('close', () => {
    const index = clients.indexOf(res);
    if (index !== -1) {
      clients.splice(index, 1);
    }
    console.log(`Client disconnected. ${clients.length} active.`);
  });
});

// A route to simulate sending a message to all clients
app.post('/broadcast', (req: Request, res: Response) => {
  const message = {
    event: 'update',
    data: JSON.stringify({ time: new Date().toISOString(), message: 'Hello, client!' }),
    id: Date.now().toString(),
  };

  const formattedMessage = formatEventMessage(message);
  
  // Send to every connected client
  clients.forEach(client => {
    client.write(formattedMessage);
  });

  res.json({ sent: true, clients: clients.length });
});

app.listen(PORT, () => {
  console.log(`SSE server listening on http://localhost:${PORT}`);
});

Run npx ts-node src/server.ts and navigate to http://localhost:3000/events in your browser. You’ll see a hanging connection. Open another tab and POST to /broadcast using a tool like curl (curl -X POST http://localhost:3000/broadcast). Watch the event appear in your first tab. It’s alive!
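If you'd rather stay in the terminal, curl can watch the stream as well; the -N flag turns off output buffering so events print the moment they arrive:

curl -N http://localhost:3000/events

Leave that running, hit /broadcast from another shell, and the formatted event text scrolls by.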

But we have a problem. Our simple array doesn’t scale. What if we need to send a message to just one user? Or handle authentication? We need a better manager. Let’s create a service class.

// src/SSEManager.ts
import { Response } from 'express';
import { formatEventMessage } from './utils';

type ClientId = string;

export class SSEManager {
  private clients: Map<ClientId, Response> = new Map();

  addClient(clientId: ClientId, res: Response): void {
    // Set headers
    res.writeHead(200, {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      'Connection': 'keep-alive',
    });

    this.clients.set(clientId, res);
    this.sendToClient(clientId, { data: 'Welcome' });

    // Clean up on close
    res.on('close', () => {
      this.removeClient(clientId);
    });
  }

  sendToClient(clientId: ClientId, message: any): boolean {
    const client = this.clients.get(clientId);
    if (!client) return false;
    
    const eventMessage = formatEventMessage({
      data: JSON.stringify(message)
    });
    
    return client.write(eventMessage);
  }

  broadcast(message: any): void {
    const eventMessage = formatEventMessage({
      data: JSON.stringify(message)
    });
    
    this.clients.forEach(client => {
      client.write(eventMessage);
    });
  }

  removeClient(clientId: ClientId): void {
    this.clients.delete(clientId);
  }
}

// Singleton instance
export const sseManager = new SSEManager();

Now, our endpoint becomes cleaner and more powerful.

// Updated /events endpoint in server.ts
import { sseManager } from './SSEManager';

app.get('/events', (req: Request, res: Response) => {
  // In a real app, get this from a user session or token
  const userId = req.query.userId as string || 'anonymous';
  sseManager.addClient(userId, res);
});
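To see per-user targeting in action, here's a hypothetical route; the /notify/:userId path and the payload shape are my own additions, not part of the manager:

// Send an event to one specific user (hypothetical route)
app.post('/notify/:userId', (req: Request, res: Response) => {
  const delivered = sseManager.sendToClient(req.params.userId, {
    time: new Date().toISOString(),
    message: 'Just for you',
  });
  res.json({ delivered });
});

If that user isn't connected, sendToClient returns false and the route simply reports it.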

What about keeping the connection alive? Some proxies might close idle connections. A common trick is to send a comment line as a heartbeat.

// Add this constructor to the SSEManager class
constructor() {
  setInterval(() => {
    this.clients.forEach(client => {
      client.write(': heartbeat\n\n');
    });
  }, 15000); // Every 15 seconds
}

This sends a simple comment (: heartbeat) every 15 seconds, which keeps the TCP connection active without being a formal event the browser will process. It’s like a gentle nudge to say, “I’m still here.”

Now, imagine your app runs on multiple servers. A client connects to Server A, but an event is triggered on Server B. How does Server B tell Server A to send a message? This is where a pub/sub system like Redis comes in. Each server subscribes to a channel. When Server B has an event, it publishes it. Server A hears it and sends it to its connected client. It’s a crucial step for horizontal scaling.

// Example using ioredis library
import Redis from 'ioredis';

// A connection in subscriber mode can't run other commands,
// so keep one connection for subscribing and a separate one for publishing
const redis = new Redis();
const publisher = new Redis();

// When you need to broadcast from any server
publisher.publish('sse-channel', JSON.stringify({ userId: 'user123', data: 'Update!' }));

// In your server setup, listen for messages
redis.subscribe('sse-channel', (err, count) => {
  if (err) console.error('Subscription failed');
});

redis.on('message', (channel, message) => {
  const { userId, data } = JSON.parse(message);
  sseManager.sendToClient(userId, data); // This only works for clients on THIS server
});
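To close the loop, the /broadcast route would stop writing to local clients and publish to Redis instead; whichever instance holds the target user's connection picks the message up through the subscriber above. A rough sketch, assuming the publisher connection from the previous snippet:

// Hypothetical: push an event through Redis instead of writing locally
app.post('/broadcast', async (req: Request, res: Response) => {
  await publisher.publish('sse-channel', JSON.stringify({
    userId: 'user123',
    data: { time: new Date().toISOString(), message: 'Hello from any server!' },
  }));
  res.json({ published: true });
});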

The client-side code is beautifully simple. The browser’s EventSource API handles the heavy lifting.

<!DOCTYPE html>
<html>
<body>
  <div id="events"></div>
  <script>
    const eventSource = new EventSource('http://localhost:3000/events?userId=alice');
    
    eventSource.onmessage = (e) => {
      const data = JSON.parse(e.data);
      document.getElementById('events').innerHTML += `<p>${data.message}</p>`;
    };
    
    eventSource.addEventListener('update', (e) => {
      console.log('Custom event received:', e.data);
    });
    
    eventSource.onerror = (err) => {
      console.error('EventSource failed:', err);
    };
  </script>
</body>
</html>

Notice how you can listen with the generic onmessage handler, which only fires for events sent without an event name, or subscribe to specific event types (like our 'update' events) using addEventListener. This gives you fine-grained control on the front end. What kind of live data could you visualize in your own projects using this simple pattern?
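One practical caveat: the snippet above assumes the page is served from the same origin as the SSE server. If it isn't, the browser blocks the EventSource request unless the /events response allows it via CORS. A minimal sketch, with the allowed origin as a placeholder you would replace:

// In the /events handler, alongside the other SSE headers
res.writeHead(200, {
  'Content-Type': 'text/event-stream',
  'Cache-Control': 'no-cache',
  'Connection': 'keep-alive',
  'Access-Control-Allow-Origin': 'http://localhost:8080', // origin serving the HTML page
});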

There are a few things to watch for. Check your server and proxy timeout settings so long-lived responses aren't cut off, and clean up your client list when connections close to avoid leaking resources. Be mindful of the data format: the data field must be a string, so for structured payloads, JSON.stringify it first. And remember, browsers cap concurrent SSE connections per domain (typically six over HTTP/1.1, though HTTP/2 raises that limit considerably). For most applications, this is plenty.

I went from seeing a problem to having a working, scalable solution in an afternoon. SSE removed so much unnecessary complexity. It feels like discovering a secret door in a familiar room—a simpler path was there all along. The next time you need live updates, ask yourself: “Does this need a conversation, or just an announcement?” If it’s the latter, give Server-Sent Events a try.

Did this guide help you see a simpler path for your real-time features? If you built something cool with SSE, I’d love to hear about it. Share your thoughts or questions in the comments below. If you found this useful, please pass it along to another developer who might be overcomplicating their live data stream. Happy coding!


As a best-selling author, I invite you to explore my books on Amazon. Don’t forget to follow me on Medium and show your support. Thank you! Your support means the world!


101 Books

101 Books is an AI-driven publishing company co-founded by author Aarav Joshi. By leveraging advanced AI technology, we keep our publishing costs incredibly low—some books are priced as low as $4—making quality knowledge accessible to everyone.

Check out our book Golang Clean Code available on Amazon.

Stay tuned for updates and exciting news. When shopping for books, search for Aarav Joshi to find more of our titles. Use the provided link to enjoy special discounts!


📘 Check out my latest ebook for free on my channel!
Be sure to like, share, comment, and subscribe to the channel!


Our Creations

Be sure to check out our creations:

Investor Central | Investor Central Spanish | Investor Central German | Smart Living | Epochs & Echoes | Puzzling Mysteries | Hindutva | Elite Dev | JS Schools


We are on Medium

Tech Koala Insights | Epochs & Echoes World | Investor Central Medium | Puzzling Mysteries Medium | Science & Epochs Medium | Modern Hindutva



