Multiple consumption of a single stream

Asked by Saransh Mohapatra · Jul 30, 2013 · Viewed 7.8k times

I want to know if it's possible for multiple functions to consume a single stream in Node.js. If yes, how can this be done? Is it possible to pipe to multiple destinations?

I want to use the stream in two different functions that run in parallel. I am handling the parallel flow with the async module. So is it possible to, say, issue the pipe() statement inside each of these functions?

Thanks in advance.

Answer

Answered by EhevuTov · Jul 31, 2013

Yes, it's possible, easy and common. The following example sends data from a single readable stream to multiple destinations. It attaches one anonymous 'data' handler to the readable stream and calls each write stream's write() method inside it to do the actual writing:

var fs  = require('fs');

var rs1 = fs.createReadStream('input1.txt');
var ws1 = fs.createWriteStream('output1.txt');
var ws2 = fs.createWriteStream('output2.txt');

// Every chunk emitted by the readable stream is handed to both write streams.
rs1.on('data', function (data) {
  console.log(data.toString('utf8'));
  ws1.write('1: ' + data);
  ws2.write('2: ' + data);
});

An easier (and more robust) way is to use the .pipe() method, which also handles backpressure for you:

var fs  = require('fs');

var rs1 = fs.createReadStream('input1.txt');
var ws1 = fs.createWriteStream('output1.txt');
var ws2 = fs.createWriteStream('output2.txt');

// A readable stream can be piped to any number of writable destinations.
rs1.pipe(ws1);
rs1.pipe(ws2);

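As for the asker's scenario: calling pipe() inside each of two functions run with the async module works too, since each call simply attaches another destination to the same readable stream. A rough sketch, assuming async.parallel and placeholder file names:

var fs    = require('fs');
var async = require('async');

var rs = fs.createReadStream('input1.txt'); // placeholder input file

async.parallel([
  function (done) {
    // First consumer: pipe the stream to output1.txt
    var ws = fs.createWriteStream('output1.txt');
    rs.pipe(ws);
    ws.on('finish', done);
  },
  function (done) {
    // Second consumer: pipe the same stream to output2.txt
    var ws = fs.createWriteStream('output2.txt');
    rs.pipe(ws);
    ws.on('finish', done);
  }
], function (err) {
  // Both writes have finished (or one of them reported an error).
});

Both tasks attach their pipes before any data flows, so neither consumer misses chunks.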
.pipe() also lets you do nifty things like chaining pipes through transform streams, very similar to the Unix shell concept of something like du . | sort -rn | less, where the output of each stage is piped into the next handler.
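
For instance, a minimal sketch of such a chain, using Node's built-in zlib module (the file names here are just placeholders), gzips the data on its way to one destination while the same readable stream still feeds a second, uncompressed one:

var fs   = require('fs');
var zlib = require('zlib');

var rs = fs.createReadStream('input1.txt');

// Chain: readable -> gzip transform -> writable
rs.pipe(zlib.createGzip()).pipe(fs.createWriteStream('output1.txt.gz'));

// The same readable can still be piped to another destination
rs.pipe(fs.createWriteStream('output1.copy.txt'));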