Design patterns in Node.js

  1. What Node.js is and how it boosts your business
  2. Essential design patterns in Node.js
  3. Singleton and Factory patterns in JavaScript
  4. Practical implementation of Singleton and Factory in JavaScript
  5. Implementing the Observer pattern with EventEmitter in Node.js
  6. Implementing middlewares in Node.js without Express
  7. Decorators and dependency injection in JavaScript

Data flow with Node.js

  8. Learn what Buffers and Streams are in Node.js
  9. How to use streams and pipelines in Node.js
  10. How the Event Loop works in Node.js
  11. What libuv is and how it handles asynchrony in Node.js
  12. Strategies for running asynchronous code in Node.js

Debugging and diagnostics in Node.js

  13. How to use the Node.js debugger to solve problems
  14. Using Diagnostic Channels in Node.js for observability and diagnostics
  15. Instrumentation and key performance metrics for Node.js applications
  16. Global error handling and signal handling in Node.js
  17. Efficient logging with Pino in Node.js

Performance in Node.js

  18. Analyzing the event loop in Node.js applications using Nsolid
  19. How to diagnose and fix memory leaks in Node.js applications
  20. Optimizing Node.js performance with Worker Threads and Child Processes
  21. Optimize and scale Node.js applications with caching techniques

Building CLIs with Node.js

  22. How to build CLI applications with Node.js
  23. How to build a CLI with Minimist and handle arguments in Node.js
  24. Building a CLI with Node.js and Google Generative AI
  25. Building an AI chat using a CLI in Node
  26. How to build and install your own Node CLI with npm


How to use streams and pipelines in Node.js


Node.js works efficiently with streams, a key tool for improving performance when handling large amounts of data. Knowing the available stream types (readable, writable, duplex, and transform) opens up multiple possibilities. A particularly practical utility is pipeline, which connects several processing steps in a clear, controlled sequence.

What types of streams are there and how do they work?

Streams can be categorized into four main types (a minimal readable/writable sketch follows the list):

  • Readable: reads information from a source.
  • Writable: writes information to a destination.
  • Duplex: combines both, allowing reading and writing at the same time.
  • Transform: modifies data as it passes through.
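As a minimal sketch of the first two types (assuming Node.js 12 or later, which provides Readable.from), a readable source built from an in-memory array can be piped straight into a custom writable destination:

const { Readable, Writable } = require('stream');

// Readable: produces data from a source (here, an in-memory array).
const source = Readable.from(['hello', ' ', 'streams', '\n']);

// Writable: consumes data and sends it to a destination (here, stdout).
const destination = new Writable({
  write(chunk, encoding, callback) {
    process.stdout.write(chunk);
    callback();
  }
});

source.pipe(destination);

Duplex and Transform streams are built on these same primitives; the Transform example in the next section covers the most common case.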

How to transform content using streams?

To convert text to uppercase with streams, you use a Transform stream, which is specifically designed to modify information as it flows through.

const { Transform, pipeline } = require('stream');
const fs = require('fs');

// Transform stream: converts every chunk that passes through it to uppercase.
const toUpperCase = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

// pipeline connects the read, transform, and write steps and centralizes error handling.
pipeline(
  fs.createReadStream('input.txt'),
  toUpperCase,
  fs.createWriteStream('output.txt'),
  (err) => {
    if (err) console.error('Error:', err);
  }
);

In this operation:

  • A file (input.txt) is read.
  • Its contents are converted to uppercase.
  • The result is saved to another file (output.txt).
  • Any errors are reported through the pipeline's final callback.
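As a side note, recent Node.js versions (15 and later) also expose a promise-based pipeline in the stream/promises module, which lets you replace the error callback with async/await; a minimal sketch under that assumption:

const { pipeline } = require('stream/promises');
const { Transform } = require('stream');
const fs = require('fs');

const toUpperCase = new Transform({
  transform(chunk, encoding, callback) {
    // Passing the transformed chunk to the callback pushes it downstream.
    callback(null, chunk.toString().toUpperCase());
  }
});

async function run() {
  // Rejects if any stream in the chain emits an error.
  await pipeline(
    fs.createReadStream('input.txt'),
    toUpperCase,
    fs.createWriteStream('output.txt')
  );
  console.log('File transformed successfully');
}

run().catch((err) => console.error('Error:', err));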

How to read files line by line with Node.js?

The readline utility lets you process a file line by line asynchronously, making it easy to read content in an organized way; this is especially useful with large files.

const fs = require('fs');
const readline = require('readline');

async function readLines() {
  const fileStream = fs.createReadStream('content.txt');

  // readline emits one line at a time; crlfDelay: Infinity treats \r\n as a single line break.
  const rl = readline.createInterface({
    input: fileStream,
    crlfDelay: Infinity
  });

  try {
    for await (const line of rl) {
      console.log(line);
    }
  } catch (error) {
    console.error(error);
  }
}

readLines();

This approach lets you:

  • Iterate over the individual lines of the file.
  • Perform a specific action on each line read; here, each line is printed to the console.

What are async iterables and how do they work?

Async iterables and for await...of simplify handling asynchronous operations, keeping results in order and producing linear, easy-to-follow code. In practice, this lets you process streams in Node.js with little effort while preserving the structure and sequence of the content.
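For example, readable streams are themselves async iterables, so the same for await...of syntax used with readline also works directly on a file stream; a small sketch assuming the same content.txt file:

const fs = require('fs');

async function printChunks() {
  const stream = fs.createReadStream('content.txt', { encoding: 'utf8' });

  // Each iteration waits for the next chunk, so the order of the
  // original content is preserved.
  for await (const chunk of stream) {
    console.log('Received chunk of', chunk.length, 'characters');
  }
}

printChunks().catch(console.error);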

Go ahead and practice combining these methods! You can write an operation that reads lines from a file, transforms them to uppercase, and displays or saves them. Share how you solved the exercise in the comments.
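If you want a starting point, one possible (not the only) way to combine readline with the uppercase transformation, assuming the same content.txt file, looks like this:

const fs = require('fs');
const readline = require('readline');

async function uppercaseLines() {
  const rl = readline.createInterface({
    input: fs.createReadStream('content.txt'),
    crlfDelay: Infinity
  });

  // Read each line, transform it to uppercase, and display it.
  for await (const line of rl) {
    console.log(line.toUpperCase());
  }
}

uppercaseLines().catch(console.error);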
