Author - Sahil Kumar

Streams in Node.js: Readable, Writable, Duplex, Transform

When working with Node.js, one of the most powerful features you’ll come across is streams. They let you handle data in chunks instead of loading it all into memory at once, which makes them especially useful for large files, network requests, and real-time applications.

In this blog, we’ll break down the four main types of streams in Node.js: Readable, Writable, Duplex, and Transform. Don’t worry—we’ll also see examples to make it simple and clear.

What are Streams in Node.js?

Streams are like pipelines. Instead of waiting for all the data to be available, you can start processing it as it comes in. Think of watching YouTube: you don’t need to download the full video before watching. Data keeps flowing, and you can use it immediately—that’s how streams work.

1. Readable Streams

A Readable stream is used to read data. It produces data that can be consumed piece by piece.

Example: Reading a File

const fs = require('fs');

const readableStream = fs.createReadStream('example.txt', { encoding: 'utf8' });

readableStream.on('data', (chunk) => {
  console.log('New chunk received:', chunk);
});

readableStream.on('end', () => {
  console.log('No more data to read.');
});

Here, the file content is read in chunks, not all at once.

2. Writable Streams

A Writable stream is used to write data. Instead of writing everything in one go, you can write data gradually.

Example: Writing into a File

const fs = require('fs');

const writableStream = fs.createWriteStream('output.txt');

writableStream.write('Hello, this is the first line.\n');
writableStream.write('And this is the second line.\n');

writableStream.end(() => {
  console.log('Writing finished.');
});

This writes content into a file line by line.

3. Duplex Streams

A Duplex stream is both Readable and Writable. It can read and write data at the same time.

Example: Using a Duplex Stream

const { Duplex } = require('stream');

const duplexStream = new Duplex({
  read(size) {
    this.push('Hello from Read side!\n');
    this.push(null); // No more data
  },
  write(chunk, encoding, callback) {
    console.log('Writing:', chunk.toString());
    callback();
  }
});

duplexStream.on('data', (chunk) => {
  console.log('Received:', chunk.toString());
});

duplexStream.write('Message to Duplex Stream');
duplexStream.end();

Here, the same stream can output data and also accept data.

4. Transform Streams

A Transform stream is a specialized form of Duplex stream. It can modify the data as it passes through, transforming what is written before it is read out the other side.

Example: Converting Text to Uppercase

 
const { Transform } = require('stream');

const transformStream = new Transform({
  transform(chunk, encoding, callback) {
    const upperCase = chunk.toString().toUpperCase();
    this.push(upperCase);
    callback();
  }
});

process.stdin.pipe(transformStream).pipe(process.stdout);

In this example, when you type something in the terminal, it will be converted to uppercase before being shown.

Benefits of Streams:

  • Memory efficient: Data is processed in chunks, not fully loaded.
  • Faster: Start using data as soon as it arrives.
  • Best for large data: File reading, video streaming, logs, etc.

Summary: 

Streams in Node.js are like water pipes—you don’t wait for the whole tank to fill before using water. You use it while it’s flowing.

  • Use Readable to get data.
  • Use Writable to send data.
  • Use Duplex for both.

  • Use Transform to modify data on the fly.
