Node.js Streams & Buffers

Working with Streams in Node.js

Streams are collections of data that might not be available all at once and don't have to fit in memory. They're perfect for handling large files or real-time data.

Types of Streams

  • Readable: Streams from which data can be read (e.g., fs.createReadStream)
  • Writable: Streams to which data can be written (e.g., fs.createWriteStream)
  • Duplex: Streams that are both Readable and Writable (e.g., TCP sockets)
  • Transform: Duplex streams that can modify data (e.g., zlib.createGzip)

Reading Files with Streams

    const fs = require('fs');

    const readStream = fs.createReadStream('large-file.txt', {
      encoding: 'utf8',
      highWaterMark: 64 * 1024 // 64KB chunks
    });

    readStream.on('data', (chunk) => {
      console.log(`Received ${chunk.length} bytes`);
    });

    readStream.on('end', () => { console.log('Finished reading'); });

    readStream.on('error', (err) => { console.error('Error:', err); });

Piping Streams

    const fs = require('fs');
    const zlib = require('zlib');

    // Compress a file
    fs.createReadStream('input.txt')
      .pipe(zlib.createGzip())
      .pipe(fs.createWriteStream('input.txt.gz'));

Creating Custom Streams

    const { Transform } = require('stream');

    const upperCaseTransform = new Transform({
      transform(chunk, encoding, callback) {
        this.push(chunk.toString().toUpperCase());
        callback();
      }
    });

    process.stdin
      .pipe(upperCaseTransform)
      .pipe(process.stdout);

Buffers

Buffers are used to handle binary data:

    // Create a buffer
    const buf = Buffer.from('Hello World');

    // Convert to different encodings
    console.log(buf.toString('hex'));
    console.log(buf.toString('base64'));

    // Allocate a buffer
    const empty = Buffer.alloc(10); // 10 bytes, filled with zeros
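Beyond creation and encoding, Buffers support joining, direct byte access, and byte-length queries. A few common operations:

```javascript
// Concatenate buffers
const a = Buffer.from('Hello, ');
const b = Buffer.from('World');
const joined = Buffer.concat([a, b]);
console.log(joined.toString()); // Hello, World

// Write raw bytes directly (big-endian 32-bit unsigned int)
const raw = Buffer.alloc(4);
raw.writeUInt32BE(0xdeadbeef, 0);
console.log(raw.toString('hex')); // deadbeef

// Byte length can differ from string length for multi-byte characters
console.log(Buffer.byteLength('é', 'utf8')); // 2
```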