Server-Sent Events in Production — Simpler Than WebSockets for Most Use Cases

Introduction

You need real-time updates from the server. Your first instinct: WebSockets. Your second thought: SSE might be simpler.

Server-Sent Events (SSE) are plain HTTP. No protocol upgrade handshake, no custom framing, no complex connection state machine. Just a long-lived GET request that streams, and the browser reconnects automatically. Use SSE when you only need server→client updates; save WebSockets for bidirectional chat and collaboration.

SSE vs WebSockets Decision Framework

SSE wins when:

  • Server sends, client receives (no bidirectional traffic)
  • Reconnection and retry are acceptable
  • You want minimal infrastructure
  • You're streaming AI responses, logs, or dashboard updates

WebSockets wins when:

  • Bidirectional: client and server both send independently
  • Latency <100ms is critical
  • Dozens of message exchanges per second
  • You're building chat, gaming, or collaborative features

Most real-time features (dashboards, live feeds, AI responses) are server→client, which is why SSE usually wins on production simplicity.

Implementing SSE in Express, Fastify, Hono

Express:

import express from 'express';

const app = express();

app.get('/events', (req, res) => {
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');

  // Send initial connection event
  res.write(': connected\n\n');

  // Send an event every second
  const interval = setInterval(() => {
    const data = JSON.stringify({
      timestamp: new Date().toISOString(),
      value: Math.random()
    });
    res.write(`data: ${data}\n\n`);
  }, 1000);

  // Clean up on disconnect
  req.on('close', () => {
    clearInterval(interval);
    res.end();
  });
});

app.listen(3000);

Fastify:

import Fastify from 'fastify';

const app = Fastify();

app.get('/events', async (request, reply) => {
  reply.type('text/event-stream');
  reply.header('Cache-Control', 'no-cache');
  reply.header('Connection', 'keep-alive');

  reply.raw.write(': connected\n\n');

  const interval = setInterval(() => {
    const data = JSON.stringify({
      timestamp: new Date().toISOString(),
      value: Math.random()
    });
    reply.raw.write(`data: ${data}\n\n`);
  }, 1000);

  request.socket.on('close', () => {
    clearInterval(interval);
  });
});

app.listen({ port: 3000 });

Hono:

import { Hono } from 'hono';

const app = new Hono();

app.get('/events', async (c) => {
  return c.streamText(async (stream) => {
    // Send heartbeat every 30 seconds
    const interval = setInterval(async () => {
      await stream.write(': heartbeat\n\n');
    }, 30000);

    // Send data
    for (let i = 0; i < 10; i++) {
      const data = JSON.stringify({
        timestamp: new Date().toISOString(),
        value: Math.random()
      });
      await stream.write(`data: ${data}\n\n`);
      await stream.sleep(1000);
    }

    clearInterval(interval);
  });
});

export default app;

All three share the same pattern: set headers, write events in data: {JSON}\n\n format, clean up on disconnect.
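The wire format itself is worth internalizing before moving on. A minimal serializer sketch (the `formatSSE` helper is hypothetical, not part of any of the frameworks above) makes the rules concrete:

```javascript
// Hypothetical helper: serialize one event into SSE wire format.
// `id` and `event` are optional fields; `data` may span multiple lines.
function formatSSE({ id, event, data }) {
  let out = '';
  if (id !== undefined) out += `id: ${id}\n`;
  if (event !== undefined) out += `event: ${event}\n`;
  // A multi-line payload needs one `data:` line per line of text
  for (const line of String(data).split('\n')) {
    out += `data: ${line}\n`;
  }
  return out + '\n'; // the blank line terminates the event
}

// res.write(formatSSE({ id: 42, event: 'tick', data: JSON.stringify({ ok: true }) }));
```

The browser joins consecutive `data:` lines with `\n` on receipt, so multi-line payloads round-trip cleanly.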

SSE Headers and Keep-Alive

Critical headers:

  • Content-Type: text/event-stream: Tells browser this is SSE
  • Cache-Control: no-cache: Prevent caching
  • Connection: keep-alive: Keep the TCP connection open (HTTP/1.1 only; HTTP/2 forbids connection-specific headers, so omit it there)

Heartbeats prevent proxies from timing out idle connections:

const heartbeat = setInterval(() => {
  res.write(': heartbeat\n\n');
}, 30000);

The colon prefix (:) marks comments; browsers ignore them. Proxies see traffic and keep the connection alive.

Client Reconnection With Last-Event-ID

The browser automatically reconnects on disconnect. But it might miss events. Use Last-Event-ID to resume from where it left off.

On the server, track event IDs and replay:

app.get('/events', (req, res) => {
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');

  const lastEventId = parseInt(req.headers['last-event-id'], 10) || 0;

  // Replay events from `lastEventId`
  const events = getStoredEvents();
  for (const event of events) {
    if (event.id > lastEventId) {
      res.write(`id: ${event.id}\n`);
      res.write(`data: ${JSON.stringify(event)}\n\n`);
    }
  }

  // Send new events
  const subscription = subscribeToNewEvents((event) => {
    res.write(`id: ${event.id}\n`);
    res.write(`data: ${JSON.stringify(event)}\n\n`);
  });

  req.on('close', () => {
    subscription.unsubscribe();
  });
});

On the client:

const eventSource = new EventSource('/events');

eventSource.onmessage = (event) => {
  const data = JSON.parse(event.data);
  console.log('Received:', data);
};

eventSource.onerror = () => {
  console.log('Connection lost, browser will reconnect');
  // Browser auto-reconnects with Last-Event-ID
};

The browser tracks the last id: field and includes it in the reconnection request. No client code needed.
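The getStoredEvents() call above implies some server-side buffer of recent events. A minimal in-memory sketch (the EventStore class and its capacity default are assumptions for illustration, not a library API):

```javascript
// Hypothetical bounded event buffer: keeps the last `capacity` events
// so reconnecting clients can replay anything they missed.
class EventStore {
  constructor(capacity = 1000) {
    this.capacity = capacity;
    this.events = [];
    this.nextId = 1;
  }

  // Assign a monotonically increasing id and store the event
  push(payload) {
    const event = { id: this.nextId++, ...payload };
    this.events.push(event);
    if (this.events.length > this.capacity) this.events.shift();
    return event;
  }

  // Everything after the client's Last-Event-ID (missed events only)
  since(lastEventId) {
    return this.events.filter((e) => e.id > lastEventId);
  }
}
```

If a client's gap is larger than the buffer, the oldest events are simply gone; that's acceptable for dashboards, but anything transactional needs a durable log instead.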

Event Types and Namespacing

Send different event types:

res.write(`event: user-update\n`);
res.write(`data: ${JSON.stringify(userData)}\n\n`);

res.write(`event: product-alert\n`);
res.write(`data: ${JSON.stringify(productAlert)}\n\n`);

On the client, listen to specific event types:

const eventSource = new EventSource('/events');

eventSource.addEventListener('user-update', (event) => {
  const user = JSON.parse(event.data);
  updateUserUI(user);
});

eventSource.addEventListener('product-alert', (event) => {
  const alert = JSON.parse(event.data);
  showAlertNotification(alert);
});

Namespace events for clarity and easy filtering.

SSE Behind Nginx

Nginx must not buffer SSE responses. Add to your config:

location /events {
  proxy_pass http://backend:3000;
  proxy_buffering off;
  proxy_cache off;
  proxy_set_header Connection "";
}

proxy_buffering off: Send chunks to the client immediately instead of buffering them. proxy_set_header Connection "" clears the Connection header so Nginx keeps the upstream connection open. Alternatively, the app can send an X-Accel-Buffering: no response header, which Nginx honors per response.

Without this, events queue up in Nginx's buffer and arrive in bursts. You'll see latency spikes.

SSE Behind Cloudflare

Cloudflare buffers by default. Workarounds:

Option 1: Upgrade to Enterprise and use Cache Rules to disable buffering.

Option 2: Serve SSE directly from a Cloudflare Worker, which sidesteps the buffering in front of your origin:

export default {
  async fetch(request) {
    // Worker response streams carry bytes, so encode each chunk
    const encoder = new TextEncoder();
    return new Response(
      new ReadableStream({
        async start(controller) {
          controller.enqueue(encoder.encode(': connected\n\n'));

          for (let i = 0; i < 10; i++) {
            const data = JSON.stringify({
              timestamp: new Date().toISOString(),
              value: Math.random()
            });
            controller.enqueue(encoder.encode(`data: ${data}\n\n`));
            await new Promise((r) => setTimeout(r, 1000));
          }

          controller.close();
        }
      }),
      {
        headers: {
          'Content-Type': 'text/event-stream',
          'Cache-Control': 'no-cache'
        }
      }
    );
  }
};

This avoids Cloudflare's buffering by serving directly from Workers.

Scaling SSE With Redis Pub/Sub

A single server can hold thousands of idle SSE connections; the harder problem is delivering an event to a client connected to a different instance. Redis pub/sub handles the cross-instance fan-out:

import { createClient } from 'redis';

const redisPub = createClient();
const redisSub = createClient();
await redisPub.connect();
await redisSub.connect();

app.get('/events/:userId', async (req, res) => {
  const userId = req.params.userId;

  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');

  res.write(': connected\n\n');

  // Subscribe this response to the user's event channel
  const listener = (message) => {
    res.write(`data: ${message}\n\n`);
  };
  await redisSub.subscribe(`events:${userId}`, listener);

  req.on('close', () => {
    redisSub.unsubscribe(`events:${userId}`, listener);
  });
});

// From another process, publish events
await redisPub.publish(
  'events:user-123',
  JSON.stringify({ type: 'update', data: 'new-value' })
);

Now events published to Redis reach all connected clients, even across multiple server instances.

SSE for AI Streaming Responses

When calling a streaming LLM API (OpenAI's Chat Completions with stream: true, or Anthropic's Messages streaming), forward the tokens to the client as SSE:

import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

// EventSource can only issue GET requests, so accept the prompt as a query param
app.get('/chat', async (req, res) => {
  const message = String(req.query.message ?? '');

  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');

  const stream = await client.messages.stream({
    model: 'claude-3-5-sonnet-20241022',
    max_tokens: 1024,
    messages: [{ role: 'user', content: message }]
  });

  for await (const event of stream) {
    if (event.type === 'content_block_delta' && event.delta.type === 'text_delta') {
      res.write(`data: ${JSON.stringify({ text: event.delta.text })}\n\n`);
    }
  }

  res.write('event: done\ndata: {}\n\n');
  res.end();
});

On the client, accumulate tokens and close on the done event (otherwise the browser will auto-reconnect and replay the request):

let fullText = '';

const eventSource = new EventSource(
  `/chat?message=${encodeURIComponent('Hello')}`
);

eventSource.onmessage = (event) => {
  const { text } = JSON.parse(event.data);
  fullText += text;
  updateDisplay(fullText);
};

eventSource.addEventListener('done', () => {
  eventSource.close();
});

This pattern powers every AI chat UI using streaming: low latency, simple protocol, great UX.

SSE for Live Dashboard Updates

Dashboards need server→client updates: metrics, alerts, status.

app.get('/dashboard/:userId/events', async (req, res) => {
  const userId = req.params.userId;

  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');

  // Send initial state
  const dashboard = await fetchDashboard(userId);
  res.write(`data: ${JSON.stringify({ type: 'initial', data: dashboard })}\n\n`);

  // Subscribe to metric updates
  const subscription = metrics.subscribe(userId, (metric) => {
    res.write(
      `data: ${JSON.stringify({ type: 'metric-update', data: metric })}\n\n`
    );
  });

  req.on('close', () => {
    subscription.unsubscribe();
  });
});

Dashboard updates without polling. Users see fresh data immediately.

Browser Support and Polyfills

SSE is supported in all modern browsers. For IE11 or other legacy browsers, a polyfill such as event-source-polyfill can restore EventSource:

<script src="https://cdn.jsdelivr.net/npm/event-source-polyfill/src/eventsource.min.js"></script>

Most teams can drop IE11 support; polyfill only if you need it.

Checklist

  • Identify server→client updates in your application
  • Design your event schema
  • Implement SSE endpoint in your backend
  • Add event IDs for reconnection replay
  • Configure proxies (Nginx, Cloudflare) correctly
  • Test reconnection scenarios
  • Set up Redis pub/sub for horizontal scaling
  • Monitor connection count and event throughput
  • Document event types and formats

Conclusion

SSE is simpler than WebSockets for one-way streaming. Use it for dashboards, AI responses, live feeds, and server-to-client updates. Save WebSockets for chat and collaboration.

Get the headers right (especially proxy_buffering off), handle reconnection with Last-Event-ID, and scale with Redis pub/sub. You'll have production-grade real-time updates without the complexity.