node and Error: EMFILE, too many open files

xaverras · Jan 23, 2012

For some days I have been searching for a working solution to this error:

Error: EMFILE, too many open files

It seems that many people have the same problem. The usual answer involves increasing the number of file descriptors. So, I've tried this:

sysctl -w kern.maxfiles=20480

The default value is 10240. This seems a little strange to me, because the number of files I'm handling in the directory is under 10240. Even stranger, I still receive the same error after increasing the limit.

Second question:

After a number of searches I found a workaround for the "too many open files" problem:

var fs = require('fs');

var requestBatches = {};
function batchingReadFile(filename, callback) {
  // First check to see if there is already a batch
  if (requestBatches.hasOwnProperty(filename)) {
    requestBatches[filename].push(callback);
    return;
  }

  // Otherwise start a new one and make a real request
  var batch = requestBatches[filename] = [callback];
  fs.readFile(filename, onRealRead);
  
  // Flush out the batch on complete
  function onRealRead() {
    delete requestBatches[filename];
    for (var i = 0, l = batch.length; i < l; i++) {
      batch[i].apply(null, arguments);
    }
  }
}

// fs.readFile's callback receives (err, data), so accept both arguments here
function printFile(err, file){
    console.log(file);
}

dir = "/Users/xaver/Downloads/xaver/xxx/xxx/"

var files = fs.readdirSync(dir);

for (var i = 0; i < files.length; i++){
    var filename = dir + files[i];
    console.log(filename);
    batchingReadFile(filename, printFile);
}

Unfortunately I still receive the same error. What is wrong with this code?

Answer

blak3r · Jan 12, 2014

For when graceful-fs doesn't work, or you just want to understand where the leak is coming from, follow this process.

(e.g. graceful-fs isn't gonna fix your wagon if your issue is with sockets.)
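(If you haven't tried it yet: graceful-fs is meant to be a drop-in replacement for the built-in fs module, so giving it a shot is usually a one-line change. A minimal sketch, with the file path just an example:)

var fs = require('graceful-fs');  // instead of require('fs')

// graceful-fs queues fs operations and retries them once descriptors free up,
// instead of letting EMFILE bubble up from file-system calls.
fs.readFile('some-file.txt', function (err, data) {
  if (err) throw err;
  console.log(data.toString());
});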

From My Blog Article: http://www.blakerobertson.com/devlog/2014/1/11/how-to-determine-whats-causing-error-connect-emfile-nodejs.html

How To Isolate

This command will list the open handles for nodejs processes:

lsof -i -n -P | grep nodejs
COMMAND     PID    USER   FD   TYPE    DEVICE SIZE/OFF NODE NAME
...
nodejs    12211    root 1012u  IPv4 151317015      0t0  TCP 10.101.42.209:40371->54.236.3.170:80 (ESTABLISHED)
nodejs    12211    root 1013u  IPv4 151279902      0t0  TCP 10.101.42.209:43656->54.236.3.172:80 (ESTABLISHED)
nodejs    12211    root 1014u  IPv4 151317016      0t0  TCP 10.101.42.209:34450->54.236.3.168:80 (ESTABLISHED)
nodejs    12211    root 1015u  IPv4 151289728      0t0  TCP 10.101.42.209:52691->54.236.3.173:80 (ESTABLISHED)
nodejs    12211    root 1016u  IPv4 151305607      0t0  TCP 10.101.42.209:47707->54.236.3.172:80 (ESTABLISHED)
nodejs    12211    root 1017u  IPv4 151289730      0t0  TCP 10.101.42.209:45423->54.236.3.171:80 (ESTABLISHED)
nodejs    12211    root 1018u  IPv4 151289731      0t0  TCP 10.101.42.209:36090->54.236.3.170:80 (ESTABLISHED)
nodejs    12211    root 1019u  IPv4 151314874      0t0  TCP 10.101.42.209:49176->54.236.3.172:80 (ESTABLISHED)
nodejs    12211    root 1020u  IPv4 151289768      0t0  TCP 10.101.42.209:45427->54.236.3.171:80 (ESTABLISHED)
nodejs    12211    root 1021u  IPv4 151289769      0t0  TCP 10.101.42.209:36094->54.236.3.170:80 (ESTABLISHED)
nodejs    12211    root 1022u  IPv4 151279903      0t0  TCP 10.101.42.209:43836->54.236.3.171:80 (ESTABLISHED)
nodejs    12211    root 1023u  IPv4 151281403      0t0  TCP 10.101.42.209:43930->54.236.3.172:80 (ESTABLISHED)
....

Notice the 1023u (last line): that's the 1024th file handle, which is the default maximum.

Now, look at the last column. That indicates which resource is open. You'll probably see a number of lines with the same resource name; hopefully that tells you where to look in your code for the leak.

If you're running multiple node processes and don't know which is which, first look up which process has pid 12211; that will tell you the process.

In my case above, I noticed a bunch of very similar IP addresses, all 54.236.3.###. By doing IP address lookups, I was able to determine that they were pubnub related.
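Once lsof points at outbound HTTP sockets like this, one common mitigation (my own sketch, not part of the original answer, with the URL just a placeholder) is to cap concurrent sockets on the default agent so extra requests queue instead of each grabbing a new descriptor:

var http = require('http');

// Limit concurrent sockets per host on the shared agent. Older Node versions
// defaulted to 5; newer ones default to Infinity.
http.globalAgent.maxSockets = 50;

http.get('http://example.com/', function (res) {
  res.resume(); // drain the body so the socket is released back to the pool
});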

Command Reference

Use this syntax to determine how many handles a process has open...

To get a count of open files for a certain pid

I used this command to count the number of files that were open after triggering various events in my app.

lsof -i -n -P | grep "8465" | wc -l
# lsof -i -n -P | grep "nodejs.*8465" | wc -l
28
# lsof -i -n -P | grep "nodejs.*8465" | wc -l
31
# lsof -i -n -P | grep "nodejs.*8465" | wc -l
34
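If you'd rather watch the count from inside the process itself, here is a small sketch of one way to do it (Linux-only, since it reads /proc/self/fd; on OS X you'd stick with lsof):

var fs = require('fs');

// On Linux, /proc/self/fd has one entry per file descriptor the current
// process holds, so the directory length is the live handle count.
function openFdCount() {
  return fs.readdirSync('/proc/self/fd').length;
}

// Log the count every few seconds to see whether it keeps climbing.
setInterval(function () {
  console.log('open file descriptors:', openFdCount());
}, 5000);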

What is your process limit?

ulimit -a

The line you want will look like this:

open files                      (-n) 1024

Permanently change the limit:

  • tested on Ubuntu 14.04, nodejs v. 7.9

In case you are expecting to open many connections (websockets is a good example), you can permanently increase the limit:

  • file: /etc/pam.d/common-session (add to the end)

      session required pam_limits.so
    
  • file: /etc/security/limits.conf (add to the end, or edit if already exists)

      root soft  nofile 40000
      root hard  nofile 100000
    
  • restart your nodejs process and log out / log back in via ssh.

  • on older NodeJS this may not take effect until you restart the server.

  • in limits.conf, use your node user instead of root if your node process runs under a different uid.
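To double-check that the new limit is what your Node process actually inherits after logging back in, one possible sketch (child_process.execSync runs the command through /bin/sh, where ulimit is a builtin):

var execSync = require('child_process').execSync;

// Reports the soft open-files limit inherited by this process.
// If it still prints 1024, the limits.conf / pam_limits change has not
// taken effect for this session yet.
console.log('open files limit:', execSync('ulimit -n').toString().trim());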