Node.js
Senior
Question 6 of 6

When would you use streams in Node.js instead of loading everything into memory?

Quick Answer

Streams process data piece by piece rather than loading everything into memory, ideal for handling large files or data transfers.

Detailed Answer

Streams in Node.js are collections of data that might not be available all at once. Instead of reading entire files or datasets into memory, streams allow you to process data piece by piece, making them essential for handling large amounts of data efficiently.
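For contrast, here is a minimal sketch of the buffered approach next to the streaming one (the large.log file name is just for illustration):

```js
const fs = require('fs');

// Buffered: the entire file is read into memory before the callback runs.
fs.readFile('large.log', (err, data) => {
  if (err) throw err;
  console.log(`Loaded ${data.length} bytes in one buffer`);
});

// Streamed: data arrives in chunks (64 KiB by default for fs streams),
// so memory use stays flat no matter how big the file is.
fs.createReadStream('large.log')
  .on('data', (chunk) => console.log(`Received a ${chunk.length}-byte chunk`))
  .on('end', () => console.log('Finished reading'));
```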

There are four types of streams: Readable (sources you read from, like fs.createReadStream), Writable (destinations you write to, like fs.createWriteStream), Duplex (both readable and writable, like TCP sockets), and Transform (duplex streams that modify data passing through, like zlib compression).
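To make the Transform type concrete, here is a small sketch of a stream that upper-cases whatever text passes through it:

```js
const { Transform } = require('stream');

// A Transform stream modifies each chunk as it flows from source to destination.
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // Pass the transformed chunk downstream; the first argument is an error, if any.
    callback(null, chunk.toString().toUpperCase());
  },
});

// Pipe stdin through the transform and out to stdout.
process.stdin.pipe(upperCase).pipe(process.stdout);
```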

Streams are event-based. Readable streams emit 'data', 'end', 'error', and 'close' events. The pipe() method connects a readable stream to a writable stream and handles backpressure automatically, for example: fs.createReadStream('large.txt').pipe(fs.createWriteStream('copy.txt')).
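One caveat: pipe() forwards data but not errors, so for anything beyond a quick copy, stream.pipeline() is usually the safer choice because it propagates errors and cleans up every stream in the chain. A sketch, assuming a large.txt file you want to gzip:

```js
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');

pipeline(
  fs.createReadStream('large.txt'),     // Readable source
  zlib.createGzip(),                    // Transform: compress on the fly
  fs.createWriteStream('large.txt.gz'), // Writable destination
  (err) => {
    // A single callback reports failure from any stream in the chain.
    if (err) console.error('Pipeline failed:', err);
    else console.log('Pipeline succeeded');
  }
);
```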

Backpressure is crucial: if the source produces data faster than the destination can consume it, chunks queue up in memory. Streams deal with this by pausing the readable stream until the writable stream drains, and pipe() manages that pause-and-resume cycle for you.
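To see what pipe() automates, here is roughly the manual version of that pause-and-drain cycle (a simplified sketch, not the actual internals):

```js
const fs = require('fs');

const source = fs.createReadStream('large.txt');
const dest = fs.createWriteStream('copy.txt');

source.on('data', (chunk) => {
  // write() returns false when the writable's internal buffer is full.
  if (!dest.write(chunk)) {
    source.pause();                              // stop reading for now
    dest.once('drain', () => source.resume());   // resume once the buffer empties
  }
});

source.on('end', () => dest.end()); // close the destination when the source is done
```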

Use streams when: reading or writing large files, handling HTTP request/response bodies, processing data in real-time, working with compression/encryption, or when memory efficiency is important. Modern async iteration (for await...of) works with streams, making them easier to use with async/await patterns.
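As an example of the async-iteration style, here is a small sketch that totals the bytes in a file (the countBytes name is just for illustration):

```js
const fs = require('fs');

async function countBytes(path) {
  let total = 0;
  // Readable streams are async iterables, so each chunk can be awaited in turn.
  for await (const chunk of fs.createReadStream(path)) {
    total += chunk.length;
  }
  return total;
}

countBytes('large.txt').then((n) => console.log(`${n} bytes`));
```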

Key Takeaway

Reach for streams whenever data is too large, too slow, or too continuous to hold in memory at once: they process it chunk by chunk, and pipe() handles backpressure for you.
