Simple Chunked Data Handling in NodeJS/ExpressJS

There are often cases where you receive streamed data from an upstream system and want to stream it on to your clients. A good use-case is Elasticsearch, which can deliver results as a stream, so your application server layer only ever sees chunks of the full result. You could hold this data in memory and send it to clients as a whole, but that risks your application server running out of memory when the data is really large. Besides, most HTTP clients support chunked responses: they can either assemble the chunks at the end or, in some cases, process each chunk as soon as it arrives.

The following example shows a simple chunked data server. This Express app waits 500ms before sending each word of the response.

var express = require('express');
var app = express();
var sentence = ["The", "quick", "brown", "fox", "jumps", "over", "the", "lazy", "dog"];

app.get('/', (req, res) => {
    var i = 0;
    var regId = setInterval(() => {
        if (i >= sentence.length) {
            clearInterval(regId);
            res.end(); // terminates the chunked response
            return;
        }
        res.write(sentence[i] + ' '); // each write is flushed as a separate chunk
        i++;
    }, 500);
});

app.listen(11000, () => {});

Just try running this server on your machine and see how it works perfectly fine with your existing browsers or clients such as curl. On the other hand, you can also leverage the chunked nature of the server and work with individual chunks directly if you are really keen about performance.

var http = require('http');

http.get({
    hostname: 'localhost',
    port: 11000,
    path: '/',
    agent: false
}).on('response', (response) => {
    response.on('data', (chunk) => {
        // Do your processing on the chunk here
        console.log('received chunk:', chunk.toString());
    });
});
Very simple and straightforward, as you would expect. You may also put these chunks together and get the final sentence in the response callback. Most clients do this automatically for you, unless you are using something as low-level as the http module.

Notice the server response

Thanks for reading.