In 2014 I presented a talk called “The Power of Streams in JS” for the BogotaJS meetup group.
Recently I found the slides and realized that they may be useful for developers learning how streams work in node.
Streams in node are great. Quoting Dominic Tarr:
“Streams in node are one of the rare occasions when doing something the fast way is actually easier. SO USE THEM. not since bash has streaming been introduced into a high level language as nicely as it is in node.” — @dominictarr
Readable streams emit chunks of data read from a data source.
Writable streams receive chunks and write them to a data destination.
Transform streams read chunks of data, process them, and emit the results.
Duplex streams have two “modes”: they can act as writable streams, receiving chunks and writing them to some data source, and they can emit chunks of data read from another data source. The input and output sides are independent of each other.
Concat streams read all the input chunks and group them into a single chunk of data (e.g. concatenating chunks of words into one String).
Filter streams work like the .filter function for arrays: the only chunks that “pass” are the ones that satisfy some condition.
Split streams break incoming data into chunks using some separator.
A Join stream intersperses stream chunks with separators.
With through/through2 you can easily create transform streams. In this example the stream receives arrays (each chunk is an array) and transforms them into JSON objects.
Mux/demux is a technique that lets you send the data chunks of multiple (and different) streams over one channel and separate them again at the other end.
In the diagram, a websocket created with the Shoe package is used as the “channel” that transports the data chunks of the various streams.