
como-trabajar-con-streams-nodejs

How to work with Streams in Node.js

Streams are continuous flows of data that can be read or written incrementally, allowing large amounts of data to be handled faster and more efficiently.

Streams allow us to work with large amounts of data without having to load all the content into memory. This is especially useful when working with large files or network streams.

In Node.js, streams are divided into different types:

  • Readable Streams: They allow reading data from a source, such as a file or an HTTP request.
  • Writable Streams: They allow writing data to a destination, such as a file or an HTTP response.
  • Duplex Streams: They allow reading and writing data, such as in a TCP network connection.
  • Transform Streams: A kind of Duplex stream that modifies data as it passes through, which we will use later in this article.

When working with streams, we will make heavy use of both their methods and their events.

Examples of using Streams

Reading a File with Readable Streams

To read a file using a Readable Stream in Node.js, we can do the following:

import fs from 'node:fs';

const readableStream = fs.createReadStream('archivo.txt', 'utf8');

readableStream.on('data', (chunk) => {
  console.log('Data received:', chunk);
});

readableStream.on('end', () => {
  console.log('File reading complete');
});

readableStream.on('error', (err) => {
  console.error('Error reading the file:', err);
});

In this example, we create a Readable Stream using the createReadStream() method from the fs module. This method takes two arguments: the name of the file we want to read (archivo.txt in this case) and the character encoding (utf8 in this case).

In addition, we handle three events:

  • data: This event is triggered each time a data chunk is received from the file.
  • end: This event is emitted when the file reading is complete.
  • error: This event is triggered if an error occurs during the file reading.
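As an alternative to the event-based style, Readable streams are also async iterables, so the same file can be consumed with for await...of. A sketch, using a hypothetical readTextFile helper and assuming archivo.txt exists as in the example above:

```javascript
import fs from 'node:fs';

// Hypothetical helper: accumulates a file's contents chunk by chunk.
// With 'utf8' set, each chunk arrives as a string rather than a Buffer.
async function readTextFile(path) {
  const stream = fs.createReadStream(path, 'utf8');
  let content = '';
  for await (const chunk of stream) {
    content += chunk;
  }
  return content;
}
```

Errors surface as rejections of the for await loop, so a try/catch replaces the 'error' listener in this style.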

Writing to a File with Writable Streams

To write to a file using a Writable Stream in Node.js, we can do the following:

import fs from 'node:fs';

const writableStream = fs.createWriteStream('newFile.txt', 'utf8');

writableStream.write('This is text that will be written to the file.\n');
writableStream.write('We can write data incrementally.\n');

writableStream.end('Finishing writing to the file.\n');

writableStream.on('finish', () => {
  console.log('File writing complete');
});

writableStream.on('error', (err) => {
  console.error('Error writing to the file:', err);
});

In this example, we create a Writable Stream using the createWriteStream() method from the fs module. This method takes two arguments: the name of the file we want to write to (newFile.txt in this case) and the character encoding (utf8 in this case).

We use the write() method to write data to the file. Finally, we call the end() method to indicate that we have finished writing to the file.

In addition, we handle two events:

  • finish: This event is emitted when the writing to the file is completed successfully.
  • error: This event is emitted if an error occurs during the writing to the file.
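One detail the example above glosses over is backpressure: write() returns false when the stream's internal buffer is full, and a well-behaved producer should pause until the 'drain' event fires. A sketch of that pattern, using a hypothetical writeLines helper:

```javascript
import fs from 'node:fs';

// Hypothetical helper: writes lines one at a time, pausing on backpressure.
function writeLines(stream, lines, done) {
  let i = 0;
  function writeNext() {
    while (i < lines.length) {
      const ok = stream.write(lines[i] + '\n');
      i++;
      if (!ok) {
        // Buffer full: resume only when the stream has drained
        stream.once('drain', writeNext);
        return;
      }
    }
    stream.end(done); // done is invoked once writing has finished
  }
  writeNext();
}

writeLines(fs.createWriteStream('demo-output.txt'), ['a', 'b', 'c'], () => {
  console.log('All lines written');
});
```

For small writes like these the buffer rarely fills, but the same pattern keeps memory bounded when producing data faster than the destination can absorb it.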

Piping between streams

Piping between streams is a common technique in Node.js that allows redirecting the output of one stream to the input of another. For example, we can copy a file using piping as follows:

import { createReadStream, createWriteStream } from 'node:fs';

const readStream = createReadStream('archivo.txt');
const writeStream = createWriteStream('copiaArchivo.txt');

readStream.pipe(writeStream);

writeStream.on('finish', () => {
  console.log('File copied successfully.');
});

writeStream.on('error', (err) => {
  console.error('Error copying the file:', err);
});

In this example, we create a Readable Stream using createReadStream() and a Writable Stream using createWriteStream(). Then, we use the pipe() method to redirect the data from the read stream to the write stream.

Combining multiple streams

We can also combine multiple streams to perform more complex operations. For example, we can transform a file to uppercase while copying it using transformation streams:

import { createReadStream, createWriteStream } from 'node:fs';
import { Transform } from 'node:stream';

const readStream = createReadStream('archivo.txt', 'utf8');
const writeStream = createWriteStream('uppercase.txt');

const transformStream = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

readStream.pipe(transformStream).pipe(writeStream);

writeStream.on('finish', () => {
  console.log('File transformed and written in uppercase.');
});

writeStream.on('error', (err) => {
  console.error('Error transforming and writing the file:', err);
});

In this example, we create a transformation stream using the Transform class and define the transformation logic in the transform() method. Then, we use the pipe() method to chain the read, transformation, and write streams.