The Basics of Node.js Streams

By Sandeep Panda

Node.js is asynchronous and event-driven in nature. As a result, it’s very good at handling I/O-bound tasks. If you are working on an app that performs I/O operations, you can take advantage of the streams available in Node.js. So, let’s explore streams in detail and understand how they can simplify I/O.

What are Streams?

Streams work like Unix pipes: they let you easily read data from a source and pipe it to a destination. Simply put, a stream is nothing but an EventEmitter that implements some special methods. Depending on the methods implemented, a stream becomes Readable, Writable, or Duplex (both readable and writable). Readable streams let you read data from a source, while writable streams let you write data to a destination.

If you have already worked with Node.js, you may have come across streams. For example, in a Node.js based HTTP server, request is a readable stream and response is a writable stream. You might have used the fs module, which lets you work with both readable and writable file streams.
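
To make this concrete, here is a minimal sketch of such an HTTP server (the port and messages are arbitrary, not from the article), treating the request as a readable stream and the response as a writable stream:

var http = require('http');

var server = http.createServer(function(req, res) {
    // req is a readable stream, res is a writable stream
    var body = '';
    req.setEncoding('utf8');

    req.on('data', function(chunk) {
        body += chunk; // collect the request body chunk by chunk
    });

    req.on('end', function() {
        res.write('You sent: ' + body); // write to the response stream
        res.end();
    });
});

server.listen(8000);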

Now that you know the basics, let’s understand the different types of streams. In this article, we will discuss readable and writable streams. Duplex streams are beyond the scope of this article.

Readable Stream

A readable stream lets you read data from a source. The source can be anything. It can be a simple file on your file system, a buffer in memory or even another stream. As streams are EventEmitters, they emit several events at various points. We will use these events to work with the streams.

Reading From Streams

The best way to read data from a stream is to listen to the data event and attach a callback. When a chunk of data is available, the readable stream emits a data event and your callback executes. Take a look at the following snippet:

var fs = require('fs');
var readableStream = fs.createReadStream('file.txt');
var data = '';

readableStream.on('data', function(chunk) {
    data += chunk;
});

readableStream.on('end', function() {
    console.log(data);
});

The function call fs.createReadStream() gives you a readable stream. Initially, the stream is in a static (paused) state. As soon as you listen to the data event and attach a callback, it starts flowing. After that, chunks of data are read and passed to your callback. The stream implementor decides how often the data event is emitted. For example, an HTTP request may emit a data event once every few KB of data are read. When you are reading data from a file, you may decide to emit a data event once a line is read.

When there is no more data to read (end is reached), the stream emits an end event. In the above snippet, we listen to this event to get notified when the end is reached.

There is also another way to read from a stream. You just need to call read() on the stream instance repeatedly until every chunk of data has been read.

var fs = require('fs');
var readableStream = fs.createReadStream('file.txt');
var data = '';
var chunk;

readableStream.on('readable', function() {
    while ((chunk = readableStream.read()) != null) {
        data += chunk;
    }
});

readableStream.on('end', function() {
    console.log(data);
});

The read() function reads some data from the internal buffer and returns it. When there is nothing to read, it returns null. So, in the while loop we check for null and terminate the loop. Note that the readable event is emitted when a chunk of data can be read from the stream.

Setting Encoding

By default, the data you read from a stream is a Buffer object. If you are reading strings, this may not be suitable for you. So, you can set the encoding on the stream by calling Readable.setEncoding(), as shown below.

var fs = require('fs');
var readableStream = fs.createReadStream('file.txt');
var data = '';
readableStream.setEncoding('utf8');

readableStream.on('data', function(chunk) {
    data += chunk;
});

readableStream.on('end', function() {
    console.log(data);
});

In the above snippet we set the encoding to utf8. As a result, the data is interpreted as utf8 and passed to your callback as a string.


Piping

Piping is a great mechanism in which you can read data from the source and write it to the destination without managing the flow yourself. Take a look at the following snippet:

var fs = require('fs');
var readableStream = fs.createReadStream('file1.txt');
var writableStream = fs.createWriteStream('file2.txt');

readableStream.pipe(writableStream);
The above snippet makes use of the pipe() function to write the content of file1 to file2. As pipe() manages the data flow for you, you don’t have to worry about the destination being slower or faster than the source. This makes pipe() a neat tool to read and write data. You should also note that pipe() returns the destination stream, so you can easily utilize this to chain multiple streams together. Let’s see how!


Chaining

Assume that you have an archive and want to decompress it. There are a number of ways to achieve this, but the easiest and cleanest way is to use piping and chaining. Have a look at the following snippet:

var fs = require('fs');
var zlib = require('zlib');

fs.createReadStream('input.txt.gz')
  .pipe(zlib.createGunzip())
  .pipe(fs.createWriteStream('output.txt'));
First, we create a simple readable stream from the file input.txt.gz. Next, we pipe this stream into another stream, zlib.createGunzip(), to un-gzip the content. Lastly, as streams can be chained, we add a writable stream in order to write the un-gzipped content to the file.

Additional Methods

We discussed some of the important concepts in readable streams. Here are some more stream methods you need to know; a short sketch using pause() and resume() follows the list:

  1. Readable.pause() – This method pauses the stream. If the stream is already flowing, it won’t emit data events anymore. The data will be kept in the internal buffer. If you call this on a static (non-flowing) stream, the stream starts flowing, but data events won’t be emitted.
  2. Readable.resume() – Resumes a paused stream.
  3. Readable.unpipe() – This removes destination streams from pipe destinations. If an argument is passed, it stops the readable stream from piping into that particular destination stream. Otherwise, all the destination streams are removed.
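
Here is that short sketch, assuming a file.txt exists to read from; it pauses the stream on each data event and resumes it a second later:

var fs = require('fs');
var readableStream = fs.createReadStream('file.txt');

readableStream.on('data', function(chunk) {
    console.log('Read ' + chunk.length + ' bytes of data.');
    readableStream.pause(); // no more data events for now; data stays buffered
    console.log('There will be no data for one second.');

    setTimeout(function() {
        readableStream.resume(); // data events start flowing again
    }, 1000);
});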

Writable Streams

Writable streams let you write data to a destination. Like readable streams, these are also EventEmitters and emit various events at various points. Let’s see various methods and events available in writable streams.

Writing to Streams

To write data to a writable stream you need to call write() on the stream instance. The following snippet demonstrates this technique.

var fs = require('fs');
var readableStream = fs.createReadStream('file1.txt');
var writableStream = fs.createWriteStream('file2.txt');


readableStream.on('data', function(chunk) {

The above code is straightforward. It simply reads chunks of data from an input stream and writes them to the destination using write(). This function returns a Boolean value indicating whether the operation was successful. If true is returned, the write was successful and you can keep writing more data. If false is returned, it means the stream’s internal buffer is full and the chunk will be flushed later; it does not mean something went wrong, but you should stop writing for the moment. The writable stream will let you know when you can start writing more data by emitting a drain event.
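
As a rough sketch (file names are arbitrary), the snippet above can be extended to respect the return value of write() and wait for the drain event before resuming:

var fs = require('fs');
var readableStream = fs.createReadStream('file1.txt');
var writableStream = fs.createWriteStream('file2.txt');

readableStream.on('data', function(chunk) {
    // write() returning false means the internal buffer is full
    if (!writableStream.write(chunk)) {
        readableStream.pause(); // stop reading for now

        writableStream.once('drain', function() {
            readableStream.resume(); // safe to write again
        });
    }
});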

End of Data

When you don’t have any more data to write, you can simply call end() to notify the stream that you have finished writing. Assuming res is an HTTP response object, you often do the following to send the response to the browser:

res.write('Some Data!!');
res.end('Ended.');

When end() is called and every chunk of data has been flushed, a finish event is emitted by the stream. Just note that you can’t write to the stream after calling end(). For example, the following will result in an error:

res.write('Some Data!!');
res.end('Ended.');
res.write('Trying to write again'); //Error!

Here are some important events related to writable streams; a short sketch follows the list:

  1. error – Emitted to indicate that an error has occurred while writing/piping.
  2. pipe – When a readable stream is piped into a writable stream, this event is emitted by the writable stream.
  3. unpipe – Emitted when you call unpipe on the readable stream and stop it from piping into the destination stream.
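
Here is that short sketch (file names are arbitrary) showing how you might listen for these events:

var fs = require('fs');
var readableStream = fs.createReadStream('file1.txt');
var writableStream = fs.createWriteStream('file2.txt');

writableStream.on('error', function(err) {
    console.error('Something went wrong:', err.message);
});

writableStream.on('pipe', function(src) {
    console.log('A readable stream started piping into this stream.');
});

writableStream.on('unpipe', function(src) {
    console.log('The readable stream stopped piping.');
});

readableStream.pipe(writableStream);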


Conclusion

This was all about the basics of streams. Streams, pipes, and chaining are among the core and most powerful features in Node.js. If used responsibly, streams can indeed help you write neat and performant code to perform I/O.

Did you like the article? Do let us know what you think via comments.

  • nastgeraldcha

    I don’t have that big ideas about this, but I would love to learn more, gotta visit your blog Sir Sandeep. thanks

  • WooDzu

    Mmmm. I’d be interested to know if it’d be possible to use this kind of streaming to pipe an RTCP stream from an IP camera to a WebRTC stream, in short to emulate a WebRTC client.

  • Kevdev

    Nice clear and concise article Sandeep, thank you.

    • Sandeep Panda

      Thanks! Glad that you liked it. :)

  • Wayne

    Great article, even better if you can explain how pipe works in gulp.js

  • pixelBender67

    better yet when are streams coming to Gruntjs

  • TheMarchMyth

    This is the best “Intro to Node.js Streams” article I have encountered. Something finally clicked!

  • Colin May

    This article seems like a first or second draft to me. It could be improved a lot. Try adding some links to other sources as you go along tossing about terminology, especially if you don’t intend to explain things yourself. The article heading and introduction imply that this is a sort of novice-level article, but what exactly are your assumptions about your readers’ levels of knowledge? Near the beginning you describe Node streams as “unix pipes” without explaining what that means. Are your readers mostly Unix programmers? Later you provide examples of two ways to read from a stream, but no hints about why one or the other might be favorable. Give us some context! Why read a stream instead of a whole file?

    • dotmagic

      If you’re reading some manual or following a tutorial, it will always happen that you read something you don’t understand. Explaining every detail is out of scope of such a post. Regarding “unix pipes”, this is something every student has learned, especially nowadays since Linux is so popular. And your last question is better placed in a more general introduction about nodejs and asynchronous programming.

      • JakeInDC

        Not all of us cared to learn much about Linux (for better or worse). I’ve been in programming 10 years now, only learned about piping and chaining a few years ago with node. I may be in the minority (maybe not), but when I read that I just assumed they were the same thing.

        • Swivelgames

          I would argue that regardless of whether or not a student is interested in or “cares” to learn about things like piping and chaining, the understanding of such concepts is imperatively relevant. The choice to not learn something is entirely up to the developer, just as it is not the author’s job to explain every little detail. Google is a click away, where you can learn more about “unix pipes”. If a basic concept like piping or chaining is esoteric to you, make sure you’re at the level the tutorial requires. If not, go research, and come back when you’re freshened up on piping and chaining :)

    • Kurt Farao

      Streams in Nodejs are not a novice concept to grasp; I think you’re expecting too much from a blog post. This article is fine the way it is. The Node docs can fill in the missing holes.

    • Klassy

      I am a beginner and this article was exactly what I was looking for. All the other ones were too in depth. When I reached a topic I didn’t understand I simply googled it in another tab really quick, read that article, then went back to this.

  • David Chase

    I agree with Colin, this seems a bit lacking… nothing about streams1 vs streams2, such as how adding event listeners to data switches to the classic stream… there’s no talk about those fundamentals which IMHO are key when learning about streams in the beginning

  • Connor James Leech

    readableStream.on('readable', function() {
    while ((chunk = readableStream.read()) != null) {
    data += chunk;

    you need to pass chunk as argument to callback right? Great article!

  • Andy

    great help, thks

  • Nastya

    thank you for this post.

  • bolddane

    Misses the whole point of streams, as in the following snippet of code from the article:

    readableStream.on('data', function(chunk) {
        data += chunk;
    });

    readableStream.on('end', function() {
        console.log(data);
    });

    You don’t read from a stream into your app simply to buffer it there, and write it all out in one fell swoop. Streams are intended to work very differently: you should send the data to console as you get it. (Otherwise, replace the readable stream with a synchronous file read, which is effectively what is happening in the snippet above.) To fix this, do the following:

    readableStream.on('data', function(chunk) {
        console.log(chunk);
    });

    readableStream.on('end', function() {
        // done
    });

  • Rodrigo

    It is a great article for novices in nodejs like me. Thanks

  • jinmatt

    pipe and writable streams can be used almost similarly, right? except that writable streams give you more control over what you do with the data

  • MH Raihan

    This is not for beginners and needs to improve many things :/

  • Carol Chung

    thanks for posting this article. I have been going through the node school learnyounode tutorials and think this article is a good accompanying piece to put together the high level concepts. also I think it may be a good intro to the stream-adventure node tutorial since I had no prior experience with unix pipes.
    I think I will still need to look for an intro on unix pipes.

  • Deepak

    thank you for this post :)

  • Alex Mills

    One thing I am curious about with reading/writing streams is how much data usually is consumed at a time by default, and how do you change those defaults?

    • Bob Craver

      You can pass an options object to createReadStream:

      var readableStream = fs.createReadStream('file.txt', { encoding: 'utf8', highWaterMark: 32 * 1024 });

      The highWaterMark option sets the size of the internal buffer. Here the buffer size is 32 KB (32 * 1024 bytes).

  • Ravi Guru

    great article, it helped me a lot

  • roflmyeggo

    Great article, helped me a lot. Thanks.

  • Klassy

    Very helpful, thank you.

  • Tanel Tammik

    How can I know if a net.Socket stream chunk is a partial one waiting for the next, or not? I could check for the \n at the end of the line, but what if it just waits for the user input? The goal is to read a net stream line by line, and the stream line doesn’t always end with a newline.

  • Johny Thinker

    Very good for beginners like me! Thanks a lot.

  • praveen kumar

    good article sandeep panda..

  • Kutsan Kaplan

    Seems to make sense now. Thank you for this. Good tutorial.


