Node.js Streams — Complete Guide

Sanjeev Sharma

Streams process data in chunks instead of loading everything into memory at once. That makes them essential for large files, network I/O, and real-time data.

Types of Streams

import fs from "fs";
import { Transform } from "stream";

// Readable: emits data in chunks as the file is read
const readable = fs.createReadStream("large-file.txt");
readable.on("data", (chunk) => {
  console.log(`Read ${chunk.length} bytes`);
});

// Writable: accepts data written to it
const writable = fs.createWriteStream("output.txt");
writable.write("Hello");
writable.end(); // signal no more writes; flush and close

// Transform: reads, modifies, and re-emits data
const uppercase = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  },
});

// Pipe fresh streams through the transform (the writable above has
// already been ended, so it can't receive piped data)
fs.createReadStream("large-file.txt")
  .pipe(uppercase)
  .pipe(fs.createWriteStream("upper.txt"));
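There is also a fourth type, Duplex, which is independently readable and writable (TCP sockets are the classic example). A minimal sketch of a custom duplex stream, where the `EchoDuplex` class name and its internal queue are illustrative, not part of any real API:

```javascript
import { Duplex } from "stream";

// Hypothetical EchoDuplex: whatever is written in comes back out
// the readable side.
class EchoDuplex extends Duplex {
  constructor(options) {
    super(options);
    this.queue = []; // chunks written but not yet read
  }

  _write(chunk, encoding, callback) {
    this.queue.push(chunk); // buffer the incoming chunk
    callback();
  }

  _read() {
    if (this.queue.length > 0) {
      this.push(this.queue.shift()); // hand a buffered chunk to readers
    }
  }
}

const echo = new EchoDuplex();
echo.write("ping");
console.log(echo.read().toString()); // "ping"
```

The two sides are independent: `_write` and `_read` only meet through the queue we added ourselves, which is exactly what distinguishes Duplex from Transform.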

Piping

// Chain streams; pipe() also handles backpressure automatically
fs.createReadStream("input.txt")
  .pipe(fs.createWriteStream("output.txt"));
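One caveat with .pipe(): if a stream in the chain errors, the others are not destroyed automatically. The pipeline utility from stream/promises wires up error propagation and cleanup for you. A self-contained sketch (the temp-file setup and file names are just for the example):

```javascript
import { pipeline } from "stream/promises";
import { Transform } from "stream";
import fs from "fs";
import os from "os";
import path from "path";

// Work in a temp directory so the example runs anywhere.
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "streams-"));
const input = path.join(dir, "input.txt");
const output = path.join(dir, "output.txt");
fs.writeFileSync(input, "hello streams");

// Same uppercasing transform as above
const uppercase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  },
});

// pipeline destroys every stream and rejects if any of them errors,
// unlike .pipe(), which can leak open streams on failure
await pipeline(
  fs.createReadStream(input),
  uppercase,
  fs.createWriteStream(output),
);

console.log(fs.readFileSync(output, "utf8")); // "HELLO STREAMS"
```

Because pipeline returns a promise, you can await it inside a try/catch instead of attaching "error" listeners to each stream by hand.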

FAQ

Q: When should you use streams? A: For large files, network I/O, and any data that arrives incrementally or in real time.


Streams are fundamental to efficient Node.js applications.

Written by Sanjeev Sharma, Full Stack Engineer · E-mopro