
Build a Distributed Rate Limiter with Redis, Express and TypeScript: Complete Implementation Guide

Learn to build a scalable distributed rate limiter using Redis, Express & TypeScript. Implement Token Bucket, Sliding Window algorithms with complete code examples & deployment guide.


I recently worked on a project where our API started getting hammered by unexpected traffic spikes. We needed a way to protect our services without slowing down legitimate users. That’s when I decided to build a robust rate limiter using Redis, Express, and TypeScript. If you’ve ever faced similar challenges, you’ll find this guide practical and insightful.

Have you considered what happens when your API suddenly gets flooded with requests? Rate limiting acts as a traffic cop for your application. It ensures fair usage and prevents system overload. In distributed environments, this becomes trickier because multiple servers must coordinate their limits.

Let me show you how to set things up. First, install the necessary packages. Run npm install express redis ioredis and npm install -D @types/express typescript ts-node. This gives you the core tools. Now, create a basic project structure with folders for algorithms, storage, middleware, and services.

Here’s a simple TypeScript configuration to get started:

{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "strict": true,
    "outDir": "./dist"
  }
}

Why use Redis? It allows multiple servers to share rate limit data. Without it, each server would track limits independently, leading to inconsistencies. Redis provides fast, distributed storage that scales well.

Let’s implement the Token Bucket algorithm. Imagine a bucket that holds tokens. Each request consumes a token, and tokens refill over time. This method smooths out bursts and allows for flexible rate control.

class TokenBucket {
  private tokens: number;     // tokens currently available
  private lastRefill: number; // timestamp of the last refill, in ms

  constructor(private capacity: number, private refillRate: number) {
    this.tokens = capacity; // start with a full bucket
    this.lastRefill = Date.now();
  }

  consume(): boolean {
    this.refill();
    if (this.tokens >= 1) {
      this.tokens--;
      return true; // a token was available: allow the request
    }
    return false; // bucket empty: reject the request
  }

  private refill(): void {
    const now = Date.now();
    const timePassed = (now - this.lastRefill) / 1000; // seconds elapsed
    // Add refillRate tokens per second of elapsed time, capped at capacity.
    this.tokens = Math.min(this.capacity, this.tokens + timePassed * this.refillRate);
    this.lastRefill = now;
  }
}

But how do you make this work across servers? Store the bucket state in Redis using a Lua script for atomic operations. This prevents race conditions when multiple processes update the same key.
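Here is one way that distributed version might look. This is a sketch, not the article's original implementation: the Lua script, the key layout (a Redis hash holding the token count and last refill time), and the names `TOKEN_BUCKET_SCRIPT` and `consumeToken` are all illustrative, and the `RedisLike` interface stands in for the small slice of the ioredis API the sketch relies on.

```typescript
// Illustrative Lua script: refill the bucket from elapsed time, then try to
// consume one token. Redis runs the whole script atomically, so concurrent
// servers cannot race on the same bucket. Returns [allowed, remainingTokens].
const TOKEN_BUCKET_SCRIPT = `
local key = KEYS[1]
local capacity = tonumber(ARGV[1])
local refillRate = tonumber(ARGV[2]) -- tokens per second
local now = tonumber(ARGV[3])        -- current time in ms

local state = redis.call('HMGET', key, 'tokens', 'lastRefill')
local tokens = tonumber(state[1]) or capacity
local lastRefill = tonumber(state[2]) or now

local elapsed = math.max(0, now - lastRefill) / 1000
tokens = math.min(capacity, tokens + elapsed * refillRate)

local allowed = 0
if tokens >= 1 then
  tokens = tokens - 1
  allowed = 1
end

redis.call('HSET', key, 'tokens', tokens, 'lastRefill', now)
redis.call('PEXPIRE', key, 60000) -- drop idle buckets after a minute
return { allowed, tokens }
`;

// Minimal subset of the ioredis API this sketch assumes.
interface RedisLike {
  eval(script: string, numKeys: number, ...args: (string | number)[]): Promise<unknown>;
}

async function consumeToken(
  redis: RedisLike,
  key: string,
  capacity: number,
  refillRate: number
): Promise<{ allowed: boolean; remaining: number }> {
  const [allowed, remaining] = (await redis.eval(
    TOKEN_BUCKET_SCRIPT,
    1,
    key,
    capacity,
    refillRate,
    Date.now()
  )) as [number, number];
  return { allowed: allowed === 1, remaining };
}
```

Because the refill-and-consume decision happens inside Redis, two servers calling consumeToken for the same key at the same instant can never both spend the last token.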

Next, the Sliding Window Log algorithm tracks exact request timestamps within a time window. It’s precise but can use more memory. Here’s a basic idea:

import Redis from 'ioredis';

const redis = new Redis(); // shared client; configure host/port for your environment

async function checkSlidingWindow(key: string, windowMs: number, maxRequests: number): Promise<boolean> {
  const now = Date.now();
  // Read the stored timestamps and discard any that fall outside the window.
  const timestamps = await redis.lrange(key, 0, -1);
  const validTimestamps = timestamps.filter(ts => now - parseInt(ts, 10) < windowMs);

  if (validTimestamps.length >= maxRequests) {
    return false;
  }

  // Record this request and cap the list so stale entries don't accumulate.
  await redis.lpush(key, now.toString());
  await redis.ltrim(key, 0, maxRequests - 1);
  await redis.pexpire(key, windowMs);
  // Note: this read-then-write sequence is not atomic; under heavy
  // concurrency, wrap it in a Lua script or MULTI/EXEC.
  return true;
}

Ever wondered which algorithm fits your needs? Token Bucket is great for burst handling, while Sliding Window offers accuracy. Fixed Window Counter is simpler but can allow double the limit at window edges.
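To make that window-edge weakness concrete, here is a minimal in-memory Fixed Window Counter (a local sketch only; the distributed version is typically a single Redis INCR plus PEXPIRE per window). The class name and its API are illustrative.

```typescript
// Fixed Window Counter: count requests per discrete time window.
// Simple and cheap, but a burst at the end of one window plus another at the
// start of the next can briefly admit up to 2x the configured limit.
class FixedWindowCounter {
  private windowStart = 0;
  private count = 0;

  constructor(private windowMs: number, private maxRequests: number) {}

  consume(now: number = Date.now()): boolean {
    // Align the timestamp to the start of its window.
    const currentWindow = Math.floor(now / this.windowMs) * this.windowMs;
    if (currentWindow !== this.windowStart) {
      this.windowStart = currentWindow; // new window: reset the counter
      this.count = 0;
    }
    if (this.count >= this.maxRequests) return false;
    this.count++;
    return true;
  }
}
```

With a 1-second window and a limit of 2, requests at t=900ms and t=950ms fill one window, yet requests at t=1001ms and t=1050ms are admitted again: four requests in roughly 150ms, double the nominal rate.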

Now, integrate this into Express with middleware. Middleware checks each request and applies limits before passing control to your route handlers.

import { Request, Response, NextFunction } from 'express';

// Minimal contract the middleware expects from any limiter implementation.
interface RateLimiter {
  consume(key: string): Promise<{ allowed: boolean; remaining: number }>;
}

function rateLimitMiddleware(limiter: RateLimiter) {
  return async (req: Request, res: Response, next: NextFunction) => {
    const key = req.ip ?? 'unknown'; // or a custom key, e.g. an authenticated user ID
    const result = await limiter.consume(key);

    if (!result.allowed) {
      res.status(429).json({ error: 'Too many requests' });
      return;
    }

    res.set('X-RateLimit-Remaining', result.remaining.toString());
    next();
  };
}

What about edge cases? Decide whether failed requests should count against the limit, and whether health checks or internal endpoints should be exempt. Always plan for graceful degradation: if Redis becomes unavailable, fall back to an in-process limit or temporarily allow all requests rather than rejecting everything.
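A small sketch of that fallback (the function name and shape are mine, not from the original project): wrap the Redis-backed check so an infrastructure failure fails open instead of turning every request into an error. Whether to fail open or closed is a product decision; failing open favors availability.

```typescript
// Wraps a (possibly Redis-backed) limit check so infrastructure failures
// fail open: if the check itself throws, the request is allowed through.
async function checkWithFallback(
  check: () => Promise<boolean>,
  onError?: (err: unknown) => void
): Promise<boolean> {
  try {
    return await check();
  } catch (err) {
    onError?.(err); // surface the failure to logs/metrics so it's noticed
    return true;    // fail open: availability over strict enforcement
  }
}
```

Usage would look like checkWithFallback(() => checkSlidingWindow(key, 60000, 100), err => console.error(err)), so a Redis outage degrades to unlimited traffic instead of a wall of 500s.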

Testing is crucial. Use tools like Jest to simulate high traffic and verify limits. Monitor performance with metrics on request counts and denial rates. In production, deploy with proper Redis clustering and health checks.
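A minimal burst simulation along those lines, written with plain assertions (in a real suite the same checks would live inside Jest test cases). A tiny in-memory bucket stands in for the Redis-backed limiter so the test needs no server; both names here are illustrative.

```typescript
// Burst simulation: fire more requests than the bucket holds and assert
// that only `capacity` of them are admitted.
class SimpleBucket {
  private tokens: number;
  constructor(private capacity: number) { this.tokens = capacity; }
  consume(): boolean {
    if (this.tokens >= 1) { this.tokens--; return true; }
    return false;
  }
}

function simulateBurst(bucket: SimpleBucket, requests: number): number {
  let allowed = 0;
  for (let i = 0; i < requests; i++) {
    if (bucket.consume()) allowed++;
  }
  return allowed;
}

// With capacity 10 and no refill during the burst, exactly 10 succeed.
const admitted = simulateBurst(new SimpleBucket(10), 50);
if (admitted !== 10) throw new Error(`expected 10 admitted, got ${admitted}`);
```

The same pattern extends to the distributed limiter: point the burst at a staging deployment and assert on the ratio of 200s to 429s.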

I’ve found that starting with a simple Fixed Window Counter and evolving based on metrics works well. Remember, the goal is to protect your API while maintaining a good user experience.

If this helps you build a safer application, please like and share this article. Your comments and experiences would be valuable—drop them below to continue the conversation!
