
How to work with Streams in Node.js

Streams are data flows that are read or written continuously, allowing for faster and more efficient handling of large amounts of data.

Streams let us work with large amounts of data without loading all the content into memory, which is especially useful when working with large files or network communications.

In Node.js, streams are divided into different types:

  • Readable Streams: Allow reading data from a source, such as a file or an HTTP request
  • Writable Streams: Allow writing data to a destination, such as a file or an HTTP response
  • Duplex Streams: Allow both reading and writing data, as in a TCP network connection
  • Transform Streams: A special kind of Duplex stream that modifies the data as it passes through, such as compressing or re-encoding it
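As a quick illustration of the Duplex case, a Duplex stream can be built directly from the Duplex class in node:stream. This is a toy sketch (the queue-based implementation is illustrative, not how a real TCP socket works): the writable side stores chunks, and the readable side hands them back.

```javascript
import { Duplex } from 'node:stream';

// Toy Duplex stream: data written to it is queued,
// and reads pull the queued items back out.
const queue = [];

const duplex = new Duplex({
  write(chunk, encoding, callback) {
    queue.push(chunk.toString());
    callback();
  },
  read() {
    // Emit the next queued item, or end the readable side.
    this.push(queue.length > 0 ? queue.shift() : null);
  }
});

duplex.write('ping');
duplex.end();

const out = [];
for await (const chunk of duplex) {
  out.push(chunk.toString());
}
console.log(out); // [ 'ping' ]
```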

Examples of Stream Usage

Reading a File with Readable Streams

To read a file using a Readable Stream in Node.js, we can do the following:

import fs from 'node:fs';

const readableStream = fs.createReadStream('archivo.txt', 'utf8');

readableStream.on('data', (chunk) => {
  console.log('Data received:', chunk);
});

readableStream.on('end', () => {
  console.log('File read complete');
});

readableStream.on('error', (err) => {
  console.error('Error reading the file:', err);
});


In this example, we create a Readable Stream using the createReadStream() method from the fs module. This method takes two arguments: the name of the file we want to read (archivo.txt in this case) and the character encoding (utf8 in this case).

On the other hand, we manage three events:

  • data: This event is triggered every time a chunk of data is received from the file.
  • end: This event is emitted when the reading of the file is complete.
  • error: This event is triggered if an error occurs during the reading of the file.
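Besides the event-based API, readable streams are also async iterables, so they can be consumed with for await...of. A minimal sketch using Readable.from as an in-memory source (so no file is needed for the demonstration):

```javascript
import { Readable } from 'node:stream';

// Readable.from turns any iterable into a Readable stream,
// which makes it easy to demonstrate consumption patterns.
const readable = Readable.from(['first chunk', 'second chunk']);

const chunks = [];
for await (const chunk of readable) {
  chunks.push(chunk);
}

console.log(chunks.join(' | ')); // first chunk | second chunk
```

The loop ends when the stream emits 'end', and a stream error rejects the iteration, so no separate 'data'/'end'/'error' handlers are needed.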

Writing to a File with Writable Streams

To write to a file using a Writable Stream in Node.js, we can do the following:

import fs from 'node:fs';

const writableStream = fs.createWriteStream('nuevoArchivo.txt', 'utf8');

writableStream.write('This is a text that will be written to the file.\n');
writableStream.write('We can write data incrementally.\n');

writableStream.end('Finishing writing to the file.\n');

writableStream.on('finish', () => {
  console.log('File write complete');
});

writableStream.on('error', (err) => {
  console.error('Error writing to the file:', err);
});

In this example, we create a Writable Stream using the createWriteStream() method from the fs module. This method takes two arguments: the name of the file we want to write to (nuevoArchivo.txt in this case) and the character encoding (utf8 in this case).

We use the write() method to write data to the file. Finally, we call the end() method to indicate that we have finished writing to the file.

On the other hand, we manage two events:

  • finish: This event is emitted when the writing to the file has been completed successfully.
  • error: This event is emitted if an error occurs during the writing to the file.
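One detail worth knowing: write() returns false when the stream's internal buffer is full, and a well-behaved producer should wait for the 'drain' event before writing more. A minimal sketch with a deliberately tiny buffer (the highWaterMark value and stream name are illustrative):

```javascript
import { Writable } from 'node:stream';

const received = [];

// A Writable with a very small buffer so backpressure shows up immediately.
const slowWritable = new Writable({
  highWaterMark: 4, // bytes; deliberately tiny for the example
  write(chunk, encoding, callback) {
    received.push(chunk.toString());
    callback();
  }
});

const ok = slowWritable.write('a fairly long chunk of text');
console.log('Buffer has room:', ok); // false: the chunk exceeded highWaterMark

slowWritable.on('drain', () => {
  console.log('Buffer drained, safe to write again');
});
```

Ignoring this return value does not lose data, but it lets the internal buffer grow without bound, which defeats the memory benefits of streaming.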

Piping between streams

Piping between streams is a common technique in Node.js that allows redirecting the output of one stream to the input of another. For example, we can copy a file using piping as follows:

import { createReadStream, createWriteStream } from 'node:fs';

const readStream = createReadStream('archivo.txt');
const writeStream = createWriteStream('copiaArchivo.txt');

readStream.pipe(writeStream);

writeStream.on('finish', () => {
  console.log('File copied successfully.');
});

writeStream.on('error', (err) => {
  console.error('Error copying the file:', err);
});

In this example, we create a Readable Stream using createReadStream() and a Writable Stream using createWriteStream(). Then, we use the pipe() method to redirect the data from the read stream to the write stream.
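Note that pipe() does not forward errors from one stream to the next, so each stream needs its own error handler. The pipeline function from node:stream/promises wires the streams together, propagates errors, and cleans up all of them on failure. A minimal sketch using in-memory streams instead of files:

```javascript
import { pipeline } from 'node:stream/promises';
import { Readable, Writable } from 'node:stream';

const source = Readable.from(['streamed', ' ', 'data']);

let result = '';
const sink = new Writable({
  write(chunk, encoding, callback) {
    result += chunk.toString();
    callback();
  }
});

// pipeline rejects if any stream errors, destroying the others
// so resources such as file descriptors do not leak.
await pipeline(source, sink);
console.log('Received:', result); // Received: streamed data
```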

Combining multiple streams

We can also combine multiple streams to perform more complex operations. For example, we can transform a file to uppercase while copying it using transform streams:

import { createReadStream, createWriteStream } from 'node:fs';
import { Transform } from 'node:stream';

const readStream = createReadStream('archivo.txt', 'utf8');
const writeStream = createWriteStream('mayusculas.txt');

const transformStream = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

readStream.pipe(transformStream).pipe(writeStream);

writeStream.on('finish', () => {
  console.log('File transformed and written in uppercase.');
});

writeStream.on('error', (err) => {
  console.error('Error transforming and writing the file:', err);
});

In this example, we create a transform stream using the Transform class and define the transformation logic in the transform() method. Then, we use the pipe() method to chain the read, transform, and write streams.
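The same chaining works with Node's built-in compression streams, which are Transform streams under the hood. As a sketch, this gzips a string and un-gzips it again entirely in memory (the variable names are illustrative):

```javascript
import { createGzip, createGunzip } from 'node:zlib';
import { Readable, Writable } from 'node:stream';
import { pipeline } from 'node:stream/promises';

const original = 'text to compress';
const pieces = [];

// Chain: source -> gzip -> gunzip -> sink; the round trip
// should return the original text unchanged.
await pipeline(
  Readable.from([original]),
  createGzip(),
  createGunzip(),
  new Writable({
    write(chunk, encoding, callback) {
      pieces.push(chunk);
      callback();
    }
  })
);

console.log(Buffer.concat(pieces).toString()); // text to compress
```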

Download the code

All the code for this post is available for download on GitHub.