Design Patterns in Node.js
What Node.js Is and How It Drives Your Business
Essential Design Patterns in Node.js
The Singleton and Factory Patterns in JavaScript
Practical Implementation of Singleton and Factory in JavaScript
Implementing the Observer Pattern with EventEmitter in Node.js
Implementing Middleware in Node.js without Express
Decorators and Dependency Injection in JavaScript
Data Flow with Node.js
Learn What Buffers and Streams Are in Node.js
How to Use Streams and Pipelines in Node.js
How the Event Loop Works in Node.js
What Libuv Is and How It Handles Asynchrony in Node.js
Strategies for Running Asynchronous Code in Node.js
Debugging and Diagnostics in Node.js
How to Use the Node.js Debugger to Troubleshoot Problems
Using Diagnostic Channels in Node.js for Observability and Diagnostics
Instrumentation and Key Performance Metrics for Node.js Applications
Global Error Handling and Signal Management in Node.js
Efficient Logging with Pino in Node.js
Performance in Node.js
Analyzing the Event Loop in Node.js Applications with Nsolid
How to Diagnose and Fix Memory Leaks in Node.js Applications
Optimizing Node.js Performance with Worker Threads and Child Processes
Optimize and Scale Node.js Applications with Caching Techniques
Building CLIs with Node.js
How to Build CLI Applications with Node.js
How to Build a CLI with Minimist and Handle Arguments in Node.js
Building a CLI with Node.js and Google Generative AI
Building an AI Chat with a CLI in Node
How to Create and Install Your Own Node CLI with npm
Node.js enables efficient operations with streams, a key tool for improving performance when handling large amounts of data. Knowing the available stream types, readable, writable, duplex, and transform, opens up multiple possibilities. A particularly practical utility is the pipeline operation, which connects several streams in a clear, controlled sequence and reports errors through a single callback.
Streams can be categorized into four main types:
- Readable: sources you read data from, such as fs.createReadStream.
- Writable: destinations you write data to, such as fs.createWriteStream.
- Duplex: streams that are both readable and writable, such as a TCP socket.
- Transform: duplex streams that modify the data as it passes through.
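To make the four categories concrete, here is a minimal sketch that builds one stream of each type using only the built-in stream module; the sample data and log messages are made up for illustration.

const { Readable, Writable, Duplex, Transform } = require('stream');

// Readable: a source of data (here built from an in-memory array).
const readable = Readable.from(['hello', 'world']);

// Writable: a destination that consumes data.
const writable = new Writable({
  write(chunk, encoding, callback) {
    console.log('received:', chunk.toString());
    callback();
  }
});

// Duplex: readable and writable at the same time (like a TCP socket).
// Shown only for illustration; it is not used in the pipe chain below.
const duplex = new Duplex({
  read() { this.push(null); },                    // nothing to read in this toy example
  write(chunk, encoding, callback) { callback(); }
});

// Transform: a duplex stream that modifies data as it passes through.
const upper = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

readable.pipe(upper).pipe(writable); // logs "received: HELLO" and "received: WORLD"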
To convert text to uppercase with streams, you use a transform stream, which is specifically designed to modify the information as it flows through.
const { Transform, pipeline } = require('stream');
const fs = require('fs');

const toUpperCase = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

pipeline(
  fs.createReadStream('input.txt'),
  toUpperCase,
  fs.createWriteStream('output.txt'),
  (err) => {
    if (err) console.error('Error:', err);
  }
);
In this operation:
- fs.createReadStream reads input.txt in chunks.
- toUpperCase converts each chunk to uppercase.
- fs.createWriteStream writes the result to output.txt.
- The final callback runs when the pipeline ends and reports any error raised by any of the streams.
The readline utility lets you process a file line by line asynchronously, making it easy to read content in an organized way; it is especially useful with large files.
const fs = require('fs');
const readline = require('readline');

async function readLines() {
  const fileStream = fs.createReadStream('content.txt');
  const rl = readline.createInterface({
    input: fileStream,
    crlfDelay: Infinity
  });

  try {
    for await (const line of rl) {
      console.log(line);
    }
  } catch (error) {
    console.error(error);
  }
}

readLines();
This method allows you to:
- Read the file line by line without loading it entirely into memory.
- Process each line in the same order it appears in the file.
- Handle errors with a regular try/catch block thanks to for await.
Async iterables and for await simplify asynchronous operations: results stay in order and the code reads as a linear, clear process. In practice, this lets you process streams in Node.js while preserving the structure and sequence of the content.
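Readable streams are themselves async iterables, so for await can consume them directly. A minimal sketch, assuming the same content.txt file used in the previous example:

const fs = require('fs');

async function printChunks() {
  const stream = fs.createReadStream('content.txt', { encoding: 'utf8' });

  // Each chunk arrives in the same order it appears in the file.
  for await (const chunk of stream) {
    console.log('chunk of', chunk.length, 'characters');
  }
}

printChunks().catch(console.error);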
Go ahead and practice combining these methods! You can write an operation that reads lines from a file, transforms them to uppercase, and displays or saves them; one possible sketch follows below. Share how you solved the exercise in the comments.
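If you want a starting point, this sketch combines readline with for await, assuming the same file names used above (content.txt as input and output.txt as output); it is just one way to approach the exercise.

const fs = require('fs');
const readline = require('readline');

async function upperCaseLines() {
  const rl = readline.createInterface({
    input: fs.createReadStream('content.txt'),
    crlfDelay: Infinity
  });
  const output = fs.createWriteStream('output.txt');

  // Read line by line, transform each one, and write it in the same order.
  for await (const line of rl) {
    output.write(line.toUpperCase() + '\n');
  }

  output.end();
}

upperCaseLines().catch(console.error);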