How to Build an HLS Video Streaming Server with Node.js and FFmpeg

Learn how to create your own adaptive bitrate video streaming server using Node.js, FFmpeg, and HLS. Step-by-step guide included.

I was watching a movie last night when my internet connection stuttered. The video quality dropped instantly, but the playback never stopped. That seamless experience got me thinking: how do streaming services make this magic happen? The answer is adaptive bitrate streaming, specifically using a protocol called HLS. Today, I want to show you how to build this technology yourself with Node.js. It’s simpler than you might think, and the results are incredibly powerful.

Have you ever wondered what happens behind the scenes when you hit play on a video? The server doesn’t just send one big file. Instead, it uses a smart system that breaks the video into many small pieces. This method is called HTTP Live Streaming, or HLS. It was created by Apple, but now everyone uses it, from Netflix to your local news website.

The core idea is preparation. Before you even click play, the original video file is processed into several different quality versions. Think of it like having a 4K, HD, and mobile version of the same movie. Each of these versions is then chopped into short segments, usually between 2 and 10 seconds long. A special file called a playlist acts as a map, telling the player where to find each segment.

Here’s a basic look at the folder structure this process creates.

video_assets/
├── master.m3u8
├── 1080p/
│   ├── segment1.ts
│   ├── segment2.ts
│   └── playlist.m3u8
└── 480p/
    ├── segment1.ts
    ├── segment2.ts
    └── playlist.m3u8
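Each per-quality playlist.m3u8 is just a plain text file listing its segments in order. A minimal example (the exact tags and durations depend on how the stream was packaged) looks like this:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:6.000000,
segment1.ts
#EXTINF:6.000000,
segment2.ts
#EXT-X-ENDLIST
```

The player reads this file, fetches segments one by one, and the #EXT-X-ENDLIST tag tells it the video is complete rather than a live stream.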

The player’s job is to monitor your internet speed in real time. If it detects a slowdown, it fetches the next segment from a lower-quality playlist. If your connection improves, it switches back up. This all happens so quickly you rarely notice the change. It’s a brilliant solution to the unpredictable nature of networks.
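To make that decision concrete, here is a toy sketch of the core choice a player makes before fetching the next segment: pick the highest-bandwidth variant the measured throughput can sustain, with a safety margin. Real players like hls.js use far more sophisticated heuristics; this just illustrates the idea, and the safety factor value is my own choice.

```javascript
// Pick the best variant the measured throughput (bits/sec) can sustain.
// safetyFactor leaves headroom so a small dip doesn't cause a stall.
function pickVariant(variants, measuredBps, safetyFactor = 0.8) {
  const affordable = variants
    .filter((v) => v.bandwidth <= measuredBps * safetyFactor)
    .sort((a, b) => b.bandwidth - a.bandwidth);

  // Fall back to the lowest quality if nothing fits the budget.
  return (
    affordable[0] ||
    variants.reduce((lo, v) => (v.bandwidth < lo.bandwidth ? v : lo))
  );
}
```

With three variants at roughly 5.2, 2.9, and 1.5 Mbps, a measured throughput of 4 Mbps would select the middle rendition, since 4 × 0.8 = 3.2 Mbps covers 2.9 but not 5.2.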

So, how do we turn a standard MP4 file into this adaptive stream? We use a tool called FFmpeg. It’s a command-line powerhouse for video and audio processing. With just one command, we can instruct it to create multiple outputs. Let’s set up a Node.js service to manage this transcoding process.

First, we define the quality profiles we want. This is like creating a recipe for each video version.

// qualityProfiles.js
const qualityProfiles = [
  {
    name: '1080p',
    videoBitrate: '5000k',
    height: 1080,
    audioBitrate: '192k'
  },
  {
    name: '720p',
    videoBitrate: '2800k',
    height: 720,
    audioBitrate: '128k'
  },
  {
    name: '480p',
    videoBitrate: '1400k',
    height: 480,
    audioBitrate: '128k'
  }
];

module.exports = qualityProfiles;

Next, we write a function that uses these profiles. It loops through each one and tells FFmpeg to create the corresponding video stream and segments. We use the fluent-ffmpeg npm package to run FFmpeg commands from our Node.js code cleanly.

const ffmpeg = require('fluent-ffmpeg');
const fs = require('fs').promises;
const qualityProfiles = require('./qualityProfiles');

async function transcodeToHLS(inputPath, outputDir) {
  const promises = qualityProfiles.map(async (profile) => {
    // FFmpeg won't create the segment directory for us
    await fs.mkdir(`${outputDir}/${profile.name}`, { recursive: true });

    return new Promise((resolve, reject) => {
      ffmpeg(inputPath)
        .outputOptions([
          `-vf scale=-2:${profile.height}`, // downscale to the profile height, preserving aspect ratio
          `-c:v libx264`,
          `-b:v ${profile.videoBitrate}`,
          `-c:a aac`,
          `-b:a ${profile.audioBitrate}`,
          `-f hls`,
          `-hls_time 6`,
          `-hls_playlist_type vod`,
          `-hls_segment_filename ${outputDir}/${profile.name}/segment_%03d.ts`
        ])
        .output(`${outputDir}/${profile.name}/playlist.m3u8`)
        .on('end', resolve)
        .on('error', reject)
        .run();
    });
  });

  await Promise.all(promises);
  console.log('Transcoding finished for all profiles.');
}

Did you notice the hls_time 6 option? That tells FFmpeg to make each segment 6 seconds long. Shorter segments mean quicker quality switches, but they also mean more small files for the server to manage. It’s a classic trade-off between responsiveness and overhead.
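One caveat worth knowing: FFmpeg can only cut a segment at a keyframe, so if the source's keyframe interval doesn't line up with hls_time, segments come out uneven. A common remedy (these options are a standard FFmpeg pattern, not part of the command shown above) is to force a keyframe at every segment boundary:

```javascript
// Extra FFmpeg output options that force a keyframe every 6 seconds,
// aligning keyframes with segment boundaries for even segment lengths.
const keyframeOptions = [
  '-force_key_frames', 'expr:gte(t,n_forced*6)', // keyframe at each 6 s mark
  '-sc_threshold', '0'                           // disable scene-cut keyframes (libx264)
];
```

These can be appended to the outputOptions array in the transcoding function if your segments come out with irregular durations.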

Once we have our segments, we need to create the master playlist. This file is the entry point for the video player. It lists all the available quality streams and their bandwidth requirements. We can generate this dynamically after transcoding.

const fs = require('fs').promises;
const qualityProfiles = require('./qualityProfiles');

async function createMasterPlaylist(outputDir) {
  let masterContent = '#EXTM3U\n';

  qualityProfiles.forEach((profile) => {
    // Rough bandwidth estimate: video bitrate + audio bitrate, in bits per second
    const bandwidth = (parseInt(profile.videoBitrate) + parseInt(profile.audioBitrate)) * 1000;
    // RESOLUTION must be WIDTHxHEIGHT per the HLS spec; derive the width assuming 16:9
    const width = Math.round((profile.height * 16) / 9 / 2) * 2;

    masterContent += `#EXT-X-STREAM-INF:BANDWIDTH=${bandwidth},RESOLUTION=${width}x${profile.height}\n`;
    masterContent += `${profile.name}/playlist.m3u8\n`;
  });

  await fs.writeFile(`${outputDir}/master.m3u8`, masterContent);
  console.log('Master playlist created.');
}
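For the three profiles above, a valid master playlist comes out looking roughly like this (per the HLS spec, RESOLUTION is written as WIDTHxHEIGHT, and the widths here assume a 16:9 source):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=5192000,RESOLUTION=1920x1080
1080p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2928000,RESOLUTION=1280x720
720p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1528000,RESOLUTION=854x480
480p/playlist.m3u8
```

The player reads the BANDWIDTH attribute of each entry to decide which variant it can afford at any moment.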

Now we have our HLS packages ready. But we need a way to serve them. This is where Node.js and Express come in. We’ll build a simple server that does two main things: allows video uploads and streams the processed HLS files.

Setting up the server is straightforward. We need endpoints for uploading a video file and for serving the HLS playlists and segments. For uploads, we’ll use the multer middleware to handle the file.

const express = require('express');
const multer = require('multer');
const path = require('path');
const app = express();

// Configure storage for uploaded files
const storage = multer.diskStorage({
  destination: './uploads/raw',
  filename: (req, file, cb) => {
    const uniqueName = `${Date.now()}-${file.originalname}`;
    cb(null, uniqueName);
  }
});
const upload = multer({ storage });

// Upload endpoint
app.post('/upload', upload.single('video'), async (req, res) => {
  const videoId = path.parse(req.file.filename).name;
  const outputDir = `./uploads/processed/${videoId}`;

  try {
    // 1. Transcode the uploaded file
    await transcodeToHLS(req.file.path, outputDir);
    // 2. Create the master playlist
    await createMasterPlaylist(outputDir);

    res.json({
      videoId: videoId,
      masterPlaylist: `/stream/${videoId}/master.m3u8`
    });
  } catch (error) {
    res.status(500).json({ error: 'Processing failed' });
  }
});

The streaming endpoint is even simpler. It just needs to statically serve the folder containing our HLS files. We use Express’s static middleware for this.

// Serve the processed HLS files
app.use('/stream/:videoId', (req, res, next) => {
  const { videoId } = req.params;
  // Reject ids that could escape the processed directory
  if (videoId.includes('..')) {
    return res.status(400).json({ error: 'Invalid video id' });
  }
  express.static(`./uploads/processed/${videoId}`)(req, res, next);
});

app.listen(3000, () => console.log('Streaming server ready on port 3000'));

With this, you have a working backend. You can upload a video via /upload and then play it using an HLS-compatible player like Video.js or hls.js by pointing it to /stream/{videoId}/master.m3u8.
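A minimal playback page using hls.js might look like this (the CDN URL, element id, and VIDEO_ID placeholder are my own choices, not part of the server code):

```html
<!-- Minimal playback page, assuming the server above runs on the same origin.
     Safari can play HLS natively, so it gets the plain video.src path. -->
<video id="player" controls></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<script>
  const video = document.getElementById('player');
  const src = '/stream/VIDEO_ID/master.m3u8'; // replace VIDEO_ID with the id returned by /upload

  if (Hls.isSupported()) {
    const hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    video.src = src; // native HLS support (Safari)
  }
</script>
```

hls.js handles the bandwidth monitoring and variant switching automatically once it has the master playlist.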

But what about scale? Storing and serving thousands of video segments from a single Node.js server won’t work for long. This is where cloud storage and a Content Delivery Network (CDN) become essential. After processing, you would upload the entire processed/{videoId} folder to a cloud bucket like AWS S3.

Then, instead of serving files directly, your Node.js server becomes a manager. It handles uploads, triggers processing, and stores the final cloud URL in a database. The actual video segments are streamed globally from the CDN, which is built for this exact purpose.

You might ask, is the player smart enough to handle all this switching? Absolutely. Modern JavaScript players like hls.js handle the complex logic of monitoring bandwidth and fetching the correct segment. Your server just provides the roadmap—the master playlist—and the CDN delivers the content. It’s a beautiful separation of concerns.

Building this has been a fascinating journey from a simple video file to a resilient streaming system. The principles behind HLS are a testament to solving a hard problem with an elegant, file-based solution. It doesn’t require special server software, just HTTP and well-organized files.

I hope walking through this process demystifies how your favorite videos reach you without buffering. Try building it yourself; start with a simple Express server and a single quality profile. The feeling of streaming your own video adaptively is incredibly rewarding.

If you found this guide helpful, please share it with someone who might be curious about how streaming works. Have you tried implementing something similar? What challenges did you face? Let me know in the comments below. I'd love to hear about your projects and answer any questions.

