Stdout of Node.js child_process exec is cut short

Alex H Hadik · Jan 17, 2014 · Viewed 15.6k times

In Node.js I'm using the exec command of the child_process module to call an algorithm in Java that returns a large amount of text to standard out, which I then parse and use. I'm able to capture most of it, but when it exceeds a certain number of lines, the content is cut off.

exec("sh target/bin/solver "+fields.dimx+" "+fields.dimy, function(error, stdout, stderr){
    //do stuff with stdout
}

I've tried using setTimeouts and callbacks without success, but I suspect this is occurring because I'm referencing stdout in my code before it has been retrieved completely. I have verified that stdout is in fact where the data loss first occurs; it's not an asynchronous issue further down the line. I've also tested this on my local machine and on Heroku, and the exact same issue occurs, truncating at the exact same line number every time.

Any ideas or suggestions as to what might help with this?

Answer

olamotte · Apr 15, 2014

With @damphat's solution, my exec.stdout.on('end') callbacks hung forever.
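For context, the accepted solution reads stdout incrementally instead of relying on the buffered callback. A minimal sketch of that style, using spawn (which does no internal buffering and therefore has no maxBuffer cap); the command and fields are adapted from the question, not quoted from the accepted answer:

var spawn = require('child_process').spawn;

// Stream the solver's output instead of buffering it all at once.
var child = spawn('sh', ['target/bin/solver', String(fields.dimx), String(fields.dimy)]);
child.stdout.setEncoding('utf8');

var chunks = [];
child.stdout.on('data', function (chunk) {
    chunks.push(chunk); // output arrives in arbitrarily sized chunks
});
child.stdout.on('end', function () {
    var stdout = chunks.join(''); // reassemble the full output
    // do stuff with stdout
});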

Another solution is to increase the buffer size in the options of exec; these are the defaults from the child_process documentation:

{ encoding: 'utf8',
  timeout: 0,
  maxBuffer: 200*1024, //increase here
  killSignal: 'SIGTERM',
  cwd: null,
  env: null }

To quote the documentation: maxBuffer specifies the largest amount of data allowed on stdout or stderr; if this value is exceeded, the child process is killed. I now use the following: unlike the accepted solution, it does not require reassembling stdout from the separate chunks it arrives in.

var exec = require('child_process').exec;

exec('dir /b /O-D ^2014*', {
    maxBuffer: 2000 * 1024 // quick fix: 2000 KB instead of the 200 KB default
}, function (error, stdout, stderr) {
    var list_of_filenames = stdout.split('\r\n'); // adapt to your line-ending char
    console.log("Found %s files in the replay folder", list_of_filenames.length);
});
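The same option applies directly to the question's solver command. A sketch, assuming a 10 MB ceiling is enough for the solver's output (the limit is an assumption; size it to your data):

var exec = require('child_process').exec;

exec('sh target/bin/solver ' + fields.dimx + ' ' + fields.dimy,
    { maxBuffer: 10 * 1024 * 1024 }, // assumed 10 MB ceiling; adjust as needed
    function (error, stdout, stderr) {
        if (error) {
            // exec reports an error (and kills the child) if maxBuffer is exceeded
            return console.error(error);
        }
        // do stuff with stdout
    }
);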