Introduction to Node.js
Introduction to Node.js
Installing and configuring the Node.js environment
First project with Node.js
Quiz: Introduction to Node.js
Modules and package management
Types of modules in Node.js
Package management with NPM
Creating a package with NPM
Publishing packages with NPM
Quiz: Modules and package management
Native modules in Node.js
Introduction to the FS module in Node.js
Reading and writing files in Node.js
FS module: implementing audio transcription with OpenAI
Console module: info, warn, error, table
Console module: group, assert, clear, trace
OS module: operating system information in Node.js
Crypto module: encryption and security in Node.js
Process module: process handling in Node.js
Timers: setTimeout, setInterval in Node.js
Streams: real-time data handling in Node.js
Buffers: binary data manipulation in Node.js
Quiz: Native modules in Node.js
Servers with Node.js
HTTP: server fundamentals in Node.js
Native server and video streaming in Node.js
Programming with streams in Node.js represents a powerful tool for handling large volumes of data without overloading the system memory. This technique is especially useful when we need to process large files, video or audio streams, allowing us to work with fragments of information instead of loading all the content at once.
Node.js streams are collections of data that can be processed in chunks, rather than having to load all the content into memory at once. This makes them extremely efficient when working with large files, since memory usage stays low and processing can begin before the entire file has been read.
In essence, streams work like pipes that allow data to flow from a source to a destination, processing it along the way according to our needs.
To work with streams in Node.js, we will use the fs (file system) module, which provides specific methods to create read and write streams. Let's implement it step by step.

First, we need to import the fs module and configure our read and write streams:
const fs = require('fs');

// Create a read stream
const readStream = fs.createReadStream('js.txt', { encoding: 'utf8' });

// Create a write stream
const writeStream = fs.createWriteStream('output-js.txt');
In this code we are:

- Importing the fs module to work with files.
- Creating a read stream with createReadStream(), specifying the source file and the UTF-8 encoding.
- Creating a write stream with createWriteStream() for the target file.

Streams in Node.js work with an event system. The main events we need to handle are:
// Event to process each data chunk
readStream.on('data', (chunk) => {
  console.log('Reading chunk');
  writeStream.write(chunk);
});

// Event when reading ends
readStream.on('end', () => {
  console.log('Finished reading file');
  writeStream.end();
});

// Error handling on read
readStream.on('error', (err) => {
  console.log('Error reading file', err);
});

// Error handling on write
writeStream.on('error', (err) => {
  console.log('Error writing file', err);
});
This code sets up:

- A 'data' event that is triggered each time a fragment of the file is read.
- An 'end' event that is triggered when the read is completed.
- Error handlers for both the read stream and the write stream.

When running our program with node strings.js, the following will occur: "Reading chunk" is logged for each fragment of the source file, each chunk is written to the destination file, and "Finished reading file" is logged once the read completes and the write stream is closed.
It is important to note that each time the program is run, a new empty target file is created. If the program fails before processing data (for example, if the source file does not exist), the destination file will be empty.
When implementing stream-based solutions, remember to handle errors on both the read and the write stream, and to close the write stream once reading ends. Streams are particularly useful when processing large files or continuous sources such as video and audio, where loading everything into memory at once is not viable.
Node.js streams represent a fundamental tool for developing efficient applications that handle large volumes of data. Mastering this technique will allow you to create more robust and optimized solutions. Have you used streams in any of your projects? Share your experience and any questions you have about this powerful Node.js feature.