I’ve been thinking about gateways lately. Not the physical kind with hinges and locks, but the digital ones that manage the flow of information in our applications. In a world where a single app might talk to dozens of microservices, how do you keep things organized, secure, and fast? The answer often lies in a central piece of software: the API gateway. Today, I want to walk you through building one from the ground up. We’ll use modern tools to create something robust, flexible, and ready for production. If you find this guide helpful, I’d love for you to share your thoughts in the comments at the end.
Let’s start with the basics. An API gateway is like a receptionist for your backend services. Instead of clients needing to know the address of every single service, they talk to the gateway. The gateway then figures out where the request needs to go. It handles the messy details, like finding which service instance is healthy, balancing load between them, checking user permissions, and even translating between different data formats. This centralization simplifies client code and lets you manage security and traffic in one place.
Why choose Fastify for this job? Speed is a major factor. Fastify is built with performance as a core principle, capable of handling a massive number of requests per second with very little overhead. But it’s not just fast. It has excellent built-in support for validating data using JSON Schema, which is crucial for a gateway that must trust but verify incoming requests. Its plugin system is a joy to use, making our gateway modular and easy to extend. And, of course, it works beautifully with TypeScript, giving us confidence in our code as we build.
So, what will our gateway actually do? Imagine a user wants to see their order history. Their app sends a request to /api/orders. Our gateway receives this. It first checks if the user is logged in. Then, it looks up where the “orders” service lives, perhaps finding three running instances. It picks one, forwards the request, gets the response, and sends it back to the user. All this happens seamlessly. Now, what if another part of the app uses GraphQL or needs a live WebSocket connection? A good gateway should handle those too, acting as a single entry point for all communication styles.
Setting up the project is our first concrete step. We’ll use a clear folder structure to keep our code manageable. We need a place for configuration, plugins, different protocol handlers, and utilities. Let’s create the foundation.
mkdir api-gateway
cd api-gateway
npm init -y
npm install fastify @fastify/cors @fastify/helmet
npm install -D typescript @types/node tsx
npx tsc --init
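Here is roughly where everything will live by the end of this guide; the config and utils folders are placeholders for the configuration and utilities mentioned above, and the routes folder holds the per-service wiring we'll sketch later.
api-gateway/
  src/
    index.ts        # entry point
    server.ts       # Fastify server factory
    plugins/        # service discovery, security, metrics
    handlers/       # protocol handlers such as the REST proxy
    services/       # load balancer and similar helpers
    routes/         # per-service route wiring
    config/         # environment-specific configuration
    utils/          # shared utilities
  package.json
  tsconfig.json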
Our tsconfig.json file ensures TypeScript is set up for a modern Node.js project.
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true
  }
}
With the project skeleton ready, we can create the heart of our gateway: the Fastify server. We’ll start simple and add features step by step. Here’s a basic server in src/server.ts.
import Fastify, { FastifyInstance } from 'fastify';
import cors from '@fastify/cors';
import helmet from '@fastify/helmet';

export async function buildServer(): Promise<FastifyInstance> {
  const server = Fastify({
    logger: true
  });

  // Register security and CORS plugins
  await server.register(helmet);
  await server.register(cors, {
    origin: true // Configure appropriately for production
  });

  // A simple health check route
  server.get('/health', async () => {
    return { status: 'ok', timestamp: new Date().toISOString() };
  });

  return server;
}
And an entry point in src/index.ts to start it all.
import { buildServer } from './server';

async function start() {
  const server = await buildServer();
  try {
    // listen() resolves with the address the server is bound to
    const address = await server.listen({ port: 3000, host: '0.0.0.0' });
    server.log.info(`Gateway running at ${address}`);
  } catch (err) {
    server.log.error(err);
    process.exit(1);
  }
}

start();
Run npx tsx src/index.ts and visit http://localhost:3000/health. You should see a JSON response. Congratulations, the gateway is alive! But it doesn’t do much yet. It’s a receptionist sitting at an empty desk. How do we tell it about the services it needs to manage? This is where service discovery comes in.
In dynamic environments, services can come and go. Their IP addresses change. Manually updating gateway configurations is not feasible. We need a system where services can announce themselves and where the gateway can find them. This is a perfect job for HashiCorp Consul. Consul maintains a registry of healthy services. Our gateway will ask Consul, “Where can I find the ‘user-service’?” and get a list of addresses back.
Let’s add Consul to our project. First, we need the Node.js client, along with fastify-plugin so the decorator we’re about to create is visible outside the plugin’s own encapsulated scope.
npm install consul fastify-plugin
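Before wiring up the gateway side, it helps to see the other half of the handshake. Here is a rough sketch of how a backend service might announce itself to Consul on startup, using the same consul client. The service name, address, port, and health-check path are illustrative values, not something the gateway defines.
// register-with-consul.ts (runs inside a backend service, not the gateway)
import Consul from 'consul';

const consul = new Consul({ host: process.env.CONSUL_HOST || 'localhost' });

async function registerSelf() {
  // Illustrative values: a real service would read these from its own config
  await consul.agent.service.register({
    name: 'orders',
    id: 'orders-1',
    address: '10.0.0.12',
    port: 4001,
    check: {
      // Consul polls this endpoint and marks the instance unhealthy if it stops responding
      http: 'http://10.0.0.12:4001/health',
      interval: '10s'
    }
  });
}

registerSelf().catch((err) => {
  console.error('Failed to register with Consul', err);
  process.exit(1);
});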
Now, let’s create a plugin that wraps the Consul logic. This plugin will attach a discover function to our Fastify server instance. We’ll put it in src/plugins/service-discovery.ts.
import { FastifyPluginAsync } from 'fastify';
import fp from 'fastify-plugin';
import Consul from 'consul';

declare module 'fastify' {
  interface FastifyInstance {
    discover: (serviceName: string) => Promise<{ address: string; port: number }[]>;
  }
}

const serviceDiscoveryPlugin: FastifyPluginAsync = async (server) => {
  const consul = new Consul({
    host: process.env.CONSUL_HOST || 'localhost',
    port: process.env.CONSUL_PORT || '8500'
  });

  server.decorate('discover', async (serviceName: string) => {
    try {
      // Ask Consul's catalog for every registered instance of this service
      const instances = await consul.catalog.service.nodes(serviceName);
      return instances.map((instance: any) => ({
        address: instance.ServiceAddress || instance.Address,
        port: instance.ServicePort
      }));
    } catch (error) {
      server.log.error({ err: error }, `Failed to discover service ${serviceName}`);
      return [];
    }
  });

  server.addHook('onClose', async () => {
    // Cleanup if needed
  });
};

// fastify-plugin breaks encapsulation so the discover decorator
// is available on the root instance, not just inside this plugin
export default fp(serviceDiscoveryPlugin);
We register this plugin in our server build function. Now, anywhere in our gateway code, we can call fastify.discover('orders') to get a list of live order service instances. But getting an address is only half the battle. What do we do with it? We need to forward the incoming request. Let’s build a generic REST proxy handler.
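For completeness, here is a minimal sketch of that registration inside buildServer, trimmed down to the discovery wiring from earlier; the debug route is purely hypothetical, just to show the decorator in use.
// src/server.ts (excerpt, trimmed to the discovery wiring)
import Fastify, { FastifyInstance } from 'fastify';
import serviceDiscoveryPlugin from './plugins/service-discovery';

export async function buildServer(): Promise<FastifyInstance> {
  const server = Fastify({ logger: true });

  // Thanks to fastify-plugin, the discover decorator lands on the root instance
  await server.register(serviceDiscoveryPlugin);

  // Hypothetical debug route showing the decorator in use
  server.get('/debug/services/:name', async (req) => {
    const { name } = req.params as { name: string };
    return server.discover(name);
  });

  return server;
}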
Think about it: we need to take the original request—its method, path, headers, and body—and send it to a chosen backend service. We’ll use the undici library, which is a fast and reliable HTTP client. Let’s install it and create a handler.
npm install undici
Here’s a simplified version of a REST proxy in src/handlers/rest-proxy.ts. This would be part of a larger routing setup.
import { FastifyRequest, FastifyReply } from 'fastify';
import { request, Dispatcher } from 'undici';

export async function proxyRestRequest(
  req: FastifyRequest,
  reply: FastifyReply,
  serviceUrl: string
) {
  const { method, url, headers, body } = req;
  const targetUrl = `${serviceUrl}${url}`;

  // Drop hop-specific headers: the backend should see its own host,
  // and undici recomputes content-length for the re-serialized body
  const { host, 'content-length': _contentLength, ...forwardHeaders } = headers;

  try {
    const { statusCode, headers: resHeaders, body: resBody } = await request(targetUrl, {
      method: method as Dispatcher.HttpMethod,
      headers: forwardHeaders,
      // Fastify has already parsed JSON bodies, so re-serialize them here;
      // other content types would need a raw-body content type parser
      body: method !== 'GET' && method !== 'HEAD' && body !== undefined
        ? JSON.stringify(body)
        : undefined,
    });

    reply.code(statusCode);
    reply.headers(resHeaders as any);
    return reply.send(resBody);
  } catch (error) {
    req.log.error({ err: error }, `Proxy error for ${targetUrl}`);
    return reply.code(502).send({ error: 'Bad Gateway' });
  }
}
This handler takes the request and forwards it. But we’re missing a key piece: choosing which instance to send it to from the list we got from Consul. This is load balancing. The simplest strategy is round-robin, picking the next one in line each time. Let’s implement a basic version.
We create a small load balancer service. It keeps track of the last used index for each service.
// src/services/load-balancer.ts
export class LoadBalancer {
  // Tracks the next index to use, keyed by the current set of instances
  private counters: Map<string, number> = new Map();

  pickInstance(instances: { address: string; port: number }[]): string {
    if (instances.length === 0) {
      throw new Error('No healthy instances available');
    }
    // Keying by the full instance list means the counter resets whenever
    // the set of instances changes, which also keeps the index in bounds
    const serviceKey = instances.map(i => `${i.address}:${i.port}`).join(',');
    let index = this.counters.get(serviceKey) || 0;
    const picked = instances[index];
    index = (index + 1) % instances.length;
    this.counters.set(serviceKey, index);
    return `http://${picked.address}:${picked.port}`;
  }
}
Now, we can tie it all together in a route: when a request comes to /api/orders/*, we discover the ‘orders’ service, use the load balancer to pick an instance, and proxy the request.
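Here is a hedged sketch of that wiring, reusing the pieces above. It forwards the original path unchanged, so the orders service is assumed to serve under the /api/orders prefix; a fuller gateway would map service names and path rewrites from configuration.
// src/routes/orders.ts (illustrative wiring)
import { FastifyPluginAsync } from 'fastify';
import { proxyRestRequest } from '../handlers/rest-proxy';
import { LoadBalancer } from '../services/load-balancer';

const loadBalancer = new LoadBalancer();

const ordersRoute: FastifyPluginAsync = async (server) => {
  // Forward anything under /api/orders to a healthy orders instance
  server.all('/api/orders/*', async (req, reply) => {
    const instances = await server.discover('orders');
    if (instances.length === 0) {
      return reply.code(503).send({ error: 'Service Unavailable' });
    }
    const target = loadBalancer.pickInstance(instances);
    return proxyRestRequest(req, reply, target);
  });
};

export default ordersRoute;
But what happens if the picked instance is down? The user would get an error. We need a way to avoid sending requests to failing services, and that is where the circuit breaker pattern comes in.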
A circuit breaker monitors failures. If a service fails too many times in a short period, the circuit “trips.” For a cooldown period, all requests to that service fail immediately without even trying, giving the failing service time to recover. After the cooldown, it lets a test request through to see if the service is healthy again. It’s a crucial pattern for building resilient systems. Fastify has a plugin for this.
npm install @fastify/circuit-breaker
We register it and can then wrap our proxy calls. The plugin will automatically track failures and open the circuit when a threshold is crossed. This prevents a single failing service from causing cascading failures and wasting resources.
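As a rough sketch of the wiring, based on the plugin's threshold, timeout, and resetTimeout options (the numbers are placeholders to tune for your own services): register the plugin in buildServer, then have each route opt in with a preHandler. Below, the orders route from earlier is reworked to use it; note the switch from the shorthand to server.route so the preHandler can be attached.
// src/routes/orders.ts (reworked to fail fast via the circuit breaker)
// In buildServer, register the plugin first, e.g.:
//   await server.register(circuitBreaker, { threshold: 5, timeout: 10000, resetTimeout: 10000 });
import { FastifyPluginAsync } from 'fastify';
import { proxyRestRequest } from '../handlers/rest-proxy';
import { LoadBalancer } from '../services/load-balancer';

const loadBalancer = new LoadBalancer();

const ordersRoute: FastifyPluginAsync = async (server) => {
  server.route({
    method: ['GET', 'POST', 'PUT', 'DELETE'],
    url: '/api/orders/*',
    // While the circuit is open, requests fail immediately instead of hitting a sick service
    preHandler: server.circuitBreaker(),
    handler: async (req, reply) => {
      const instances = await server.discover('orders');
      if (instances.length === 0) {
        return reply.code(503).send({ error: 'Service Unavailable' });
      }
      return proxyRestRequest(req, reply, loadBalancer.pickInstance(instances));
    }
  });
};

export default ordersRoute;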
We’ve covered REST, but modern apps often use other protocols. GraphQL, for instance, typically has a single endpoint. Proxying it is simpler in one way—we always forward to the same path—but we still need service discovery and load balancing. WebSockets are trickier because they are persistent connections. The gateway must upgrade the initial HTTP request and then pipe data back and forth between the client and the chosen backend instance. This requires stateful connection management.
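For the GraphQL case, a minimal passthrough can reuse the same discovery, load-balancing, and proxy pieces. The sketch below assumes a backend registered in Consul under the name ‘graphql’ that serves a single /graphql endpoint; WebSocket piping needs a different, stateful approach and is left out here.
// src/routes/graphql.ts (illustrative)
import { FastifyPluginAsync } from 'fastify';
import { proxyRestRequest } from '../handlers/rest-proxy';
import { LoadBalancer } from '../services/load-balancer';

const loadBalancer = new LoadBalancer();

const graphqlRoute: FastifyPluginAsync = async (server) => {
  // GraphQL traffic is all POSTs to one endpoint, so routing is trivial:
  // discover an instance and forward the query as-is
  server.post('/graphql', async (req, reply) => {
    const instances = await server.discover('graphql');
    if (instances.length === 0) {
      return reply.code(503).send({ error: 'Service Unavailable' });
    }
    return proxyRestRequest(req, reply, loadBalancer.pickInstance(instances));
  });
};

export default graphqlRoute;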
Security cannot be an afterthought. Every request should be authenticated and authorized. We can use a Fastify plugin to verify a JWT token on incoming requests before they even reach our proxy logic. Rate limiting is another essential feature to protect our backend from being overwhelmed, whether by a bug or a bad actor. Fastify’s rate limit plugin makes this easy to add.
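As a sketch of what that can look like with the official @fastify/jwt and @fastify/rate-limit plugins (after npm install @fastify/jwt @fastify/rate-limit), assuming the signing secret comes from the environment and treating the limits as placeholders:
// src/plugins/security.ts (illustrative)
import { FastifyPluginAsync } from 'fastify';
import fp from 'fastify-plugin';
import jwt from '@fastify/jwt';
import rateLimit from '@fastify/rate-limit';

const securityPlugin: FastifyPluginAsync = async (server) => {
  await server.register(jwt, {
    secret: process.env.JWT_SECRET || 'change-me' // placeholder: load a real secret in production
  });

  await server.register(rateLimit, {
    max: 100,              // placeholder: 100 requests...
    timeWindow: '1 minute' // ...per client per minute
  });

  // Verify the token before any proxy logic runs; keep the health check open
  server.addHook('onRequest', async (req, reply) => {
    if (req.url === '/health') return;
    try {
      await req.jwtVerify();
    } catch {
      return reply.code(401).send({ error: 'Unauthorized' });
    }
  });
};

export default fp(securityPlugin);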
Finally, we need to see what’s happening. Structured logging with Pino (which Fastify uses) is a great start. We should also expose metrics, like request counts and durations, in a format Prometheus can scrape. This allows us to set up alerts and dashboards to monitor the gateway’s health and performance.
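One way to expose those metrics is with the prom-client library (npm install prom-client). The sketch below times every request with a histogram and serves the results at /metrics; the metric name, buckets, and labels are placeholders, and a production setup would label by matched route pattern rather than raw URL to keep cardinality bounded.
// src/plugins/metrics.ts (illustrative)
import { FastifyPluginAsync } from 'fastify';
import fp from 'fastify-plugin';
import client from 'prom-client';

const metricsPlugin: FastifyPluginAsync = async (server) => {
  const registry = new client.Registry();
  client.collectDefaultMetrics({ register: registry });

  const httpDuration = new client.Histogram({
    name: 'gateway_http_request_duration_seconds',
    help: 'Duration of requests handled by the gateway',
    labelNames: ['method', 'route', 'status_code'],
    buckets: [0.01, 0.05, 0.1, 0.5, 1, 5],
    registers: [registry]
  });

  // Time each request from arrival to response
  const startTimes = new WeakMap<object, bigint>();
  server.addHook('onRequest', async (req) => {
    startTimes.set(req, process.hrtime.bigint());
  });
  server.addHook('onResponse', async (req, reply) => {
    const start = startTimes.get(req);
    if (start === undefined) return;
    const seconds = Number(process.hrtime.bigint() - start) / 1e9;
    // Caution: req.url is fine for a sketch, but a matched route pattern is safer as a label
    httpDuration.observe(
      { method: req.method, route: req.url, status_code: reply.statusCode },
      seconds
    );
  });

  // Prometheus scrapes this endpoint
  server.get('/metrics', async (_req, reply) => {
    reply.header('Content-Type', registry.contentType);
    return registry.metrics();
  });
};

export default fp(metricsPlugin);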
Putting it all together, we start with a simple, fast HTTP server. We teach it to find services dynamically with Consul. We give it the ability to forward requests, balance load, and break circuits to prevent failures from spreading. We layer on security and observability. The result is a powerful, central piece of infrastructure that makes our entire system more reliable and easier to manage.
Building this piece by piece helps you understand the role of each component. You start to see the gateway not as magic, but as a collection of sensible, programmable patterns. It’s a fascinating project that touches on many core concepts of distributed systems. What challenges have you faced when connecting clients to multiple services? Have you tried building or configuring a gateway before? I’d be very interested to hear about your experiences. If this guide sparked some ideas, please feel free to like, share, or comment below. Let’s keep the conversation going.