Working with files using streams in Node.js is a powerful way to handle large files efficiently without loading the entire content into memory. Streams let you read and write data in chunks, which is useful for processing large files and for I/O-bound work such as network operations. Here's an overview:
Reading a File using Readable Streams:
const fs = require('fs');

const readStream = fs.createReadStream('input.txt', { encoding: 'utf8' });

readStream.on('data', (chunk) => {
  // Process each chunk of data
  console.log(chunk);
});

readStream.on('end', () => {
  // File reading is complete
  console.log('File reading finished.');
});

readStream.on('error', (err) => {
  // Handle any errors
  console.error('Error reading file:', err);
});
Writing to a File using Writable Streams:
const fs = require('fs');

const writeStream = fs.createWriteStream('output.txt', { encoding: 'utf8' });

writeStream.write('Hello, ');
writeStream.write('Node.js Streams!');
writeStream.end(); // Close the stream after writing

writeStream.on('finish', () => {
  // Writing to the file is complete
  console.log('File writing finished.');
});

writeStream.on('error', (err) => {
  // Handle any errors
  console.error('Error writing to file:', err);
});
Piping Streams:
You can pipe data from a readable stream to a writable stream to simplify the process.
const fs = require('fs');

const readableStream = fs.createReadStream('input.txt', { encoding: 'utf8' });
const writableStream = fs.createWriteStream('output.txt', { encoding: 'utf8' });

readableStream.pipe(writableStream);

readableStream.on('error', (err) => {
  console.error('Error reading file:', err);
});

writableStream.on('finish', () => {
  console.log('File writing finished.');
});

writableStream.on('error', (err) => {
  console.error('Error writing to file:', err);
});
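Note that pipe() does not forward errors from the readable stream to the writable one, which is why each stream above attaches its own error handler. On Node.js 10 and later, the built-in pipeline() helper from the stream module connects the streams, cleans them up on failure, and reports any error through a single callback. A minimal sketch of the same copy operation:

const fs = require('fs');
const { pipeline } = require('stream');

pipeline(
  fs.createReadStream('input.txt', { encoding: 'utf8' }),
  fs.createWriteStream('output.txt', { encoding: 'utf8' }),
  (err) => {
    // Invoked once the pipeline finishes, with an error if any stream failed
    if (err) {
      console.error('Pipeline failed:', err);
    } else {
      console.log('File writing finished.');
    }
  }
);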
Transform Streams:
Transform streams are duplex streams where the output is computed based on the input. They can be used to modify the data while it's being read or written.
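As an illustration, here is a minimal sketch of a Transform stream that upper-cases text as it flows from input.txt to output.txt (the file names and the upper-casing logic are placeholders for whatever transformation you need):

const fs = require('fs');
const { Transform } = require('stream');

// A Transform stream that upper-cases each chunk passing through it
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // Pass the modified chunk downstream and signal that this chunk is done
    callback(null, chunk.toString().toUpperCase());
  }
});

fs.createReadStream('input.txt', { encoding: 'utf8' })
  .pipe(upperCase)
  .pipe(fs.createWriteStream('output.txt', { encoding: 'utf8' }));

Because the transform sits between the readable and writable streams, each chunk is modified in flight; the whole file is never buffered in memory at once.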
Using streams in Node.js enables efficient file handling, especially for large files or I/O-bound tasks, because data is processed in smaller, manageable chunks rather than all at once. Depending on your use case, you can choose between readable, writable, duplex, and transform streams.