Streams in Node.js are a powerful way to handle reading and writing data in a continuous manner. They allow you to process data efficiently, especially when dealing with large amounts of information or I/O operations. This guide will cover the types of streams, how to use them, and practical examples to help you understand how streams work in Node.js.
Streams are objects that let you read data from a source or write data to a destination incrementally. They are ideal for processing data piece by piece rather than reading or writing entire files or buffers at once, which can significantly reduce memory usage when working with large datasets.
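To illustrate the difference, here is a minimal sketch contrasting a buffered read with a streamed one ('large-file.log' is a placeholder filename):

const fs = require('fs');

// Buffered: the whole file is loaded into memory at once.
fs.readFile('large-file.log', (err, data) => {
  if (err) throw err;
  console.log(`Read ${data.length} bytes in one buffer`);
});

// Streamed: the file arrives in chunks (64 KiB by default for file
// streams), so memory usage stays roughly flat regardless of file size.
let total = 0;
fs.createReadStream('large-file.log')
  .on('data', (chunk) => { total += chunk.length; })
  .on('end', () => console.log(`Read ${total} bytes in chunks`));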
Readable Streams: These streams allow you to read data. Examples include fs.createReadStream() and http.IncomingMessage.
Writable Streams: These streams allow you to write data. Examples include fs.createWriteStream() and http.ServerResponse.
Duplex Streams: These streams can both read and write data. Examples include TCP sockets (net.Socket).
Transform Streams: These are a type of duplex stream that can modify the data as it is being read or written. Examples include zlib.createGzip() for compression. All four base classes come from the built-in stream module, as the snippet below shows.
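For orientation, a minimal sketch of where these classes come from (Readable.from requires Node 12+):

// The four stream base classes all live in the built-in 'stream' module.
const { Readable, Writable, Duplex, Transform } = require('stream');

// Readable.from() wraps any iterable in a readable stream.
const letters = Readable.from(['a', 'b', 'c']);
letters.on('data', (chunk) => console.log(chunk));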
You can create a readable stream using the built-in fs module to read files or using stream.Readable to create custom readable streams.
const fs = require('fs');

// Create a readable stream
const readableStream = fs.createReadStream('example.txt', { encoding: 'utf8' });

// Handling the 'data' event
readableStream.on('data', (chunk) => {
  console.log('New chunk received:', chunk);
});

// Handling the 'end' event
readableStream.on('end', () => {
  console.log('No more data to read.');
});
const { Readable } = require('stream');

class MyReadableStream extends Readable {
  constructor(options) {
    super(options);
    this.current = 0;
  }

  _read(size) {
    if (this.current < 5) {
      this.push(`Chunk ${this.current}\n`);
      this.current++;
    } else {
      this.push(null); // Signal that there is no more data
    }
  }
}

const myReadableStream = new MyReadableStream();
myReadableStream.on('data', (chunk) => {
  console.log('Received:', chunk.toString());
});
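As a side note, readable streams are also async iterables (Node 10+), so the same data can be consumed without event handlers. A brief sketch, reusing the example.txt file from above:

const fs = require('fs');

// Readable streams are async iterables, so chunks can also be
// consumed with for await...of instead of 'data' events.
async function readChunks() {
  const stream = fs.createReadStream('example.txt', { encoding: 'utf8' });
  for await (const chunk of stream) {
    console.log('Chunk:', chunk);
  }
  console.log('No more data to read.');
}

readChunks().catch(console.error);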
You can create writable streams using the fs module or by extending the stream.Writable class.
const fs = require('fs');

// Create a writable stream
const writableStream = fs.createWriteStream('output.txt');

// Write data to the stream
writableStream.write('Hello, World!\n');
writableStream.write('Writing to a file using streams.\n');

// End the stream
writableStream.end(() => {
  console.log('Finished writing to file.');
});
const { Writable } = require('stream');

class MyWritableStream extends Writable {
  _write(chunk, encoding, callback) {
    console.log('Writing:', chunk.toString());
    callback(); // Call when done
  }
}

const myWritableStream = new MyWritableStream();
myWritableStream.write('Hello, World!\n');
myWritableStream.write('Writing to custom writable stream.\n');
myWritableStream.end();
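One detail worth knowing when writing in a loop: write() returns false once the stream's internal buffer is full, and you should wait for the 'drain' event before continuing. A sketch of that pattern (writeMany is a hypothetical helper, not part of the stream API):

const fs = require('fs');

// Write many lines while respecting backpressure: pause when write()
// returns false and resume on the 'drain' event.
function writeMany(stream, count) {
  let i = 0;
  function writeChunk() {
    let ok = true;
    while (i < count && ok) {
      ok = stream.write(`line ${i++}\n`);
    }
    if (i < count) {
      stream.once('drain', writeChunk); // buffer full: wait, then continue
    } else {
      stream.end();
    }
  }
  writeChunk();
}

writeMany(fs.createWriteStream('output.txt'), 100000);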
Duplex streams can read and write data simultaneously. A common use case is TCP sockets.
const { Duplex } = require('stream');

class MyDuplexStream extends Duplex {
  _read(size) {
    this.push('Data from duplex stream\n');
    this.push(null); // No more data
  }

  _write(chunk, encoding, callback) {
    console.log('Received:', chunk.toString());
    callback();
  }
}

const myDuplexStream = new MyDuplexStream();

myDuplexStream.on('data', (chunk) => {
  console.log('Reading:', chunk.toString());
});

// Write to the duplex stream
myDuplexStream.write('Hello, Duplex!\n');
myDuplexStream.end();
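Since the example above is synthetic, here is a sketch of the TCP use case mentioned earlier: a net.Socket is a real duplex stream, readable and writable on the same object (port 8124 is an arbitrary choice):

const net = require('net');

// A TCP socket is a duplex stream: you read what the peer sends and
// write responses back on the same object.
const server = net.createServer((socket) => {
  socket.write('Connected to echo server\n'); // writable side
  socket.pipe(socket);                        // readable side echoed back
});

server.listen(8124, () => {
  console.log('Echo server listening on port 8124');
});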
Transform streams are useful for modifying data as it flows through the stream. For example, you might use a transform stream to compress data.
const { Transform } = require('stream');

class MyTransformStream extends Transform {
  _transform(chunk, encoding, callback) {
    const upperChunk = chunk.toString().toUpperCase();
    this.push(upperChunk);
    callback();
  }
}

const myTransformStream = new MyTransformStream();

// Pipe stdin through the transform stream to stdout;
// everything you type is echoed back in upper case.
process.stdin.pipe(myTransformStream).pipe(process.stdout);
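And here is the compression case mentioned above, using the built-in zlib.createGzip() transform ('input.txt' is a placeholder filename):

const fs = require('fs');
const zlib = require('zlib');

// zlib.createGzip() returns a transform stream that compresses
// whatever flows through it.
fs.createReadStream('input.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('input.txt.gz'))
  .on('finish', () => {
    console.log('Compressed input.txt to input.txt.gz');
  });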
One of the powerful features of streams is the ability to pipe them together. Piping allows you to connect a readable stream to a writable stream, which makes it easy to transfer data.
const fs = require('fs');

// Create a readable stream
const readableStream = fs.createReadStream('input.txt');

// Create a writable stream
const writableStream = fs.createWriteStream('output.txt');

// Pipe the readable stream to the writable stream
readableStream.pipe(writableStream);

writableStream.on('finish', () => {
  console.log('Data has been written to output.txt');
});
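One caveat: pipe() does not forward errors from the source to the destination, so each stream needs its own 'error' handler. For anything beyond a quick script, the built-in stream.pipeline() utility (Node 10+) is a safer variant:

const fs = require('fs');
const { pipeline } = require('stream');

// pipeline() wires the streams together like pipe(), but it also
// forwards errors from every stage and cleans up on failure.
pipeline(
  fs.createReadStream('input.txt'),
  fs.createWriteStream('output.txt'),
  (err) => {
    if (err) {
      console.error('Pipeline failed:', err);
    } else {
      console.log('Pipeline succeeded.');
    }
  }
);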
Readable streams emit several important events that help you manage data flow, most notably 'data', 'end', 'error', and 'close':
const fs = require('fs');

const readableStream = fs.createReadStream('example.txt');

readableStream.on('data', (chunk) => {
  console.log('Received chunk:', chunk.toString());
});

readableStream.on('end', () => {
  console.log('No more data to read.');
});

readableStream.on('error', (err) => {
  console.error('Error occurred:', err);
});

readableStream.on('close', () => {
  console.log('Stream closed.');
});
Writable streams also emit several events, most notably 'finish' (all data has been flushed after end() is called) and 'error':
const fs = require('fs');

const writableStream = fs.createWriteStream('output.txt');

writableStream.on('finish', () => {
  console.log('All data has been written to output.txt');
});

writableStream.on('error', (err) => {
  console.error('Error occurred:', err);
});

// Writing data
writableStream.write('Hello, World!\n');
writableStream.write('Writing to a file using streams.\n');
writableStream.end(); // Call end to finish the writing process
Transform streams inherit events from both readable and writable streams, so they emit 'data', 'end', and 'error', among others:
const { Transform } = require('stream');

class MyTransformStream extends Transform {
  _transform(chunk, encoding, callback) {
    const upperChunk = chunk.toString().toUpperCase();
    this.push(upperChunk);
    callback();
  }
}

const myTransformStream = new MyTransformStream();

myTransformStream.on('data', (chunk) => {
  console.log('Transformed chunk:', chunk.toString());
});

myTransformStream.on('end', () => {
  console.log('No more data to transform.');
});

myTransformStream.on('error', (err) => {
  console.error('Error occurred:', err);
});

// Write data to the transform stream
myTransformStream.write('Hello, World!\n');
myTransformStream.write('Transforming this text.\n');
myTransformStream.end(); // End the stream
Streams in Node.js provide a powerful and efficient way to handle data in a continuous manner. They allow you to read and write data piece-by-piece, making them particularly useful for I/O operations and working with large datasets. Understanding how to create and use different types of streams, as well as how to handle events, will help you build more efficient and scalable applications in Node.js.
Whether you're creating readable, writable, duplex, or transform streams, the flexibility of the stream API allows you to handle data processing in a way that best suits your application's needs.