Node Streams Module
# Why

We’ve used streams, but you may not have a complete grasp of what they are. Let’s dive a little deeper into streams and how we use them.
Streams in Node.js allow you to handle data with better memory and time efficiency.
Streams are memory efficient because you load and use small chunks of your data as they arrive, rather than loading all of the data into memory before using it.
They are also time efficient: since you can use data as it arrives in chunks, it takes far less time to begin processing it. Without streams, you would request the data and wait until all of it arrived before you could process and use any of it.
# What

Streams are fundamental to Node.js, and we’ve actually been using them this whole time. Any time we have worked with network communications (server requests and responses) or file system operations (reading and writing files), we’ve used streams.
In computing, streams have been around for a while. The concept of streams is the ability to process data in chunks, as opposed to as a whole, while remaining memory and time efficient, as mentioned above.
Node.js provides us with a `stream` core module to access streaming APIs in our applications. Something to keep in mind: all streams are instances of EventEmitter. Remember our Request ReadableStream and Response WritableStream being EventEmitters? Well, that goes for several of our file system methods and operations as well.
# Four Types of Streams

- Readable: a stream you can pipe data from, but not pipe into
- Writable: a stream you can pipe data into, but not pipe from
- Duplex: a stream you can both pipe data into and pipe data from
- Transform: similar to a Duplex, but the data output is transformed from its input (see the sketch below)
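To make that last type concrete, here is a minimal Transform sketch (the `upperCase` name is ours for illustration) that uppercases whatever is piped through it:

```js
const { Transform } = require('stream');

// A Transform is both Writable (data piped in) and Readable (data piped out),
// and its output is a transformed version of its input
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  },
});

// process.stdin is Readable, process.stdout is Writable,
// so the Transform can sit in the middle of the pipe chain
process.stdin.pipe(upperCase).pipe(process.stdout);
```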
# pipe()

The `pipe()` method lets us directly send data from one source to another. In the first case below, our first source is `readStream`, where we use the file system to read the contents of the `popular-articles.json` file. Our second source, which receives each chunk, is our response body to the client. This is how the `pipe()` method works: `sourceOne.pipe(sourceTwo)`.
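A sketch of that first case, assuming an HTTP server and port 3000 for illustration:

```js
const http = require('http');
const fs = require('fs');

const server = http.createServer((req, res) => {
  // sourceOne: a Readable stream of the file's contents
  const readStream = fs.createReadStream('popular-articles.json');

  // sourceTwo: the Writable response body to the client
  readStream.pipe(res);
});

server.listen(3000);
```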
# How

## Readable Stream

When we used the Node.js file system module, we read and sent a file in response to a request on our server:
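A minimal sketch of that earlier server; the `popular-articles.json` file name and port are assumptions for illustration:

```js
const http = require('http');
const fs = require('fs');

const server = http.createServer((req, res) => {
  // readFile loads the entire file into memory before the callback runs
  fs.readFile('popular-articles.json', (err, data) => {
    if (err) {
      res.statusCode = 500;
      return res.end('Something went wrong');
    }
    res.setHeader('Content-Type', 'application/json');
    res.end(data);
  });
});

server.listen(3000);
```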
The `readFile()` method reads the specified file in full, then sends the contents to the client. This is fine for JSON, HTML, or other smaller, lightweight files. However, bigger files would take significantly longer to load and then send.
We can alternatively write the same functionality with streams:
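A sketch of the same server using a read stream instead (same assumed file and port):

```js
const http = require('http');
const fs = require('fs');

const server = http.createServer((req, res) => {
  res.setHeader('Content-Type', 'application/json');

  // Each chunk is piped to the client as soon as it is read from disk,
  // so the whole file never has to sit in memory at once
  const readStream = fs.createReadStream('popular-articles.json');
  readStream.pipe(res);
});

server.listen(3000);
```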
We can use the `createReadStream()` method to write, or pipe, chunks of data to the client as soon as each chunk is read.
## Writable Stream

Let’s say we use `writeFile()` to write a list of pokemon JSON objects to a new file:
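A minimal sketch; the `pokemon.json` file name and the sample data are assumptions for illustration:

```js
const fs = require('fs');

const pokemon = JSON.stringify([
  { name: 'bulbasaur', type: 'grass' },
  { name: 'charmander', type: 'fire' },
  { name: 'squirtle', type: 'water' },
]);

// The file is created and the data written in full before the callback runs
fs.writeFile('pokemon.json', pokemon, (err) => {
  if (err) throw err;
  console.log('pokemon.json written');
});
```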
Above, we pass the path, data, and callback into our `writeFile()` method. The file is created and the data written to it in full. Once that process is complete, the callback is invoked.
We can alternatively use `createWriteStream()`:
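The same write, sketched with a write stream (same assumed file and data), sending the data to the file one chunk at a time:

```js
const fs = require('fs');

const pokemon = [
  { name: 'bulbasaur', type: 'grass' },
  { name: 'charmander', type: 'fire' },
  { name: 'squirtle', type: 'water' },
];

const writeStream = fs.createWriteStream('pokemon.json');

// Each call to write() sends one chunk to the file
writeStream.write('[');
pokemon.forEach((p, i) => {
  writeStream.write((i > 0 ? ',' : '') + JSON.stringify(p));
});
writeStream.write(']');

// end() signals that no more chunks are coming
writeStream.end();
```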
Writable streams have a `write()` method that takes in data (chunks) to be written to the specified path.
## Echo Example

Previously, we did an exercise that involved receiving a request body, url, and method, and echoing that information back to the client. It resembled:
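A sketch of what that echo server may have looked like, collecting the body chunks by hand (port assumed for illustration):

```js
const http = require('http');

const server = http.createServer((req, res) => {
  let body = '';

  // Buffer each chunk of the request body as it arrives
  req.on('data', (chunk) => {
    body += chunk;
  });

  // Once the whole body has arrived, echo everything back
  req.on('end', () => {
    res.end(`${req.method} ${req.url}\n${body}`);
  });
});

server.listen(3000);
```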
Since our request and response are both streams (Readable and Writable, respectively), we can use our new `pipe()` method to refactor our echo.
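A sketch of that refactor, using the same assumed server:

```js
const http = require('http');

const server = http.createServer((req, res) => {
  res.write(`${req.method} ${req.url}\n`);

  // The Readable request pipes its body straight into the Writable response;
  // the response ends automatically when the request stream ends
  req.pipe(res);
});

server.listen(3000);
```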
# Takeaways

- Streams in Node.js allow you to handle data with better memory and time efficiency
- We can utilize read and write streams in most of our I/O operations
- Read streams have a `pipe()` method, used as `readSrc.pipe(writeSrc)`, to pipe (send) contents from a read stream to a write stream