Handle Data Streams in Node.js

Node.js·2 min read·Jan 1, 2025

In Node.js, a stream is an abstract interface, inheriting from the EventEmitter class, for reading data from or writing data to a source in a continuous fashion.

Unlike buffers, streams allow you to efficiently process data in chunks instead of loading it into memory all at once.

This is particularly useful for large amounts of data that would exceed the available memory space, such as large files, or for data that is consumed over time, such as network packets.

Beyond that, streams can be piped together — just like command-line interface programs on Unix-like operating systems — to create more complex data flows by connecting the output of one stream to the input of another.

In Node.js, there are four fundamental stream types:

  • Readable: to read data from a file or any readable stream source.
  • Writable: to write data to a file or any writable stream destination.
  • Duplex: to both read and write.
  • Transform: to transform the data as it is read and written.

Create a readable stream

To read data from a file or file descriptor in a continuous fashion, chunk by chunk, you can create a readable stream using the fs.createReadStream() method:

const fs = require('node:fs');

const stream = fs.createReadStream(path, {
  fd,            // optional: existing file descriptor, used instead of path
  encoding,      // optional: e.g. 'utf8'; defaults to null (raw Buffer chunks)
  highWaterMark, // optional: internal buffer size in bytes (defaults to 64 KiB)
});
