node and Error: EMFILE, too many open files


For some days I have searched for a working solution to an error

Error: EMFILE, too many open files

It seems that many people have the same problem. The usual answer involves increasing the number of file descriptors. So, I've tried this:

sysctl -w kern.maxfiles=20480

The default value is 10240. This is a little strange in my eyes, because the number of files I'm handling in the directory is under 10240. Even stranger, I still receive the same error after I've increased the number of file descriptors.
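One thing worth checking here (my addition, not from the original question): kern.maxfiles is only the system-wide cap. Each process is also bound by a per-process descriptor limit, which is usually far lower and is the one EMFILE actually trips on:

```shell
# kern.maxfiles is only the system-wide cap; each process is also bound by a
# per-process descriptor limit, which is usually far lower and is the one
# EMFILE actually trips on:
ulimit -n                # soft (effective) per-process limit for this shell
ulimit -Hn               # hard ceiling the soft limit can be raised to
```

If the soft limit is, say, 256, raising kern.maxfiles alone will not help.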

Second question:

After a number of searches I found a workaround for the "too many open files" problem:

var fs = require('fs');

var requestBatches = {};
function batchingReadFile(filename, callback) {
  // First check to see if there is already a batch
  if (requestBatches.hasOwnProperty(filename)) {
    requestBatches[filename].push(callback);
    return;
  }

  // Otherwise start a new one and make a real request
  var batch = requestBatches[filename] = [callback];
  fs.readFile(filename, onRealRead);

  // Flush out the batch on complete
  function onRealRead() {
    delete requestBatches[filename];
    for (var i = 0, l = batch.length; i < l; i++) {
      batch[i].apply(null, arguments);
    }
  }
}

function printFile(file) {
  console.log(file);
}

var dir = "/Users/xaver/Downloads/xaver/xxx/xxx/";

var files = fs.readdirSync(dir);

for (var i in files) {
  var filename = dir + files[i];
  batchingReadFile(filename, printFile);
}
Unfortunately I still receive the same error. What is wrong with this code?

One last question (I'm new to JavaScript and Node): I'm in the process of developing a web application with a lot of requests for about 5000 daily users. I have many years of experience programming in other languages like Python and Java, so originally I thought I would develop this application with Django or the Play framework. Then I discovered Node, and I must say that the idea of a non-blocking I/O model is really nice, seductive, and most of all very fast!

But what kind of problems should I expect with Node? Is it a production-proven web server? What are your experiences?

1/23/2012 5:05:02 AM

For when graceful-fs doesn't work... or you just want to understand where the leak is coming from. Follow this process.

(e.g. graceful-fs isn't gonna fix your wagon if your issue is with sockets.)

From My Blog Article:

How To Isolate

This command will list the open network handles for nodejs processes:

lsof -i -n -P | grep nodejs

nodejs    12211    root 1012u  IPv4 151317015      0t0  TCP> (ESTABLISHED)
nodejs    12211    root 1013u  IPv4 151279902      0t0  TCP> (ESTABLISHED)
nodejs    12211    root 1014u  IPv4 151317016      0t0  TCP> (ESTABLISHED)
nodejs    12211    root 1015u  IPv4 151289728      0t0  TCP> (ESTABLISHED)
nodejs    12211    root 1016u  IPv4 151305607      0t0  TCP> (ESTABLISHED)
nodejs    12211    root 1017u  IPv4 151289730      0t0  TCP> (ESTABLISHED)
nodejs    12211    root 1018u  IPv4 151289731      0t0  TCP> (ESTABLISHED)
nodejs    12211    root 1019u  IPv4 151314874      0t0  TCP> (ESTABLISHED)
nodejs    12211    root 1020u  IPv4 151289768      0t0  TCP> (ESTABLISHED)
nodejs    12211    root 1021u  IPv4 151289769      0t0  TCP> (ESTABLISHED)
nodejs    12211    root 1022u  IPv4 151279903      0t0  TCP> (ESTABLISHED)
nodejs    12211    root 1023u  IPv4 151281403      0t0  TCP> (ESTABLISHED)

Notice the 1023u (last line): that's the 1024th file handle, which is the default maximum.

Now, Look at the last column. That indicates which resource is open. You'll probably see a number of lines all with the same resource name. Hopefully, that now tells you where to look in your code for the leak.
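To make the repeated resource names easier to spot (my addition, not from the original answer), you can group the handles by that column and count each; the leaking resource then floats to the top. "nodejs" is the process name from the example above, so substitute your own:

```shell
# Group open handles by the resource column (second-to-last field when a
# state like "(ESTABLISHED)" is present) and count duplicates:
lsof -i -n -P | grep nodejs | awk '{print $(NF-1)}' | sort | uniq -c | sort -rn | head
```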

If you are running multiple node processes and don't know which is which, first look up which process has pid 12211. That'll tell you the process.

In my case above, I noticed that there were a bunch of very similar IP addresses. They were all 54.236.3.###. By doing IP address lookups, I was able to determine that in my case it was PubNub related.

Command Reference

Use this syntax to determine how many open handles a process has...

To get a count of open files for a certain pid

I used this command to test the number of files that were opened after doing various events in my app.

lsof -i -n -P | grep "8465" | wc -l

Run the same count repeatedly while exercising your app; a steadily climbing number points to a leak:

# lsof -i -n -P | grep "nodejs.*8465" | wc -l
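On Linux, counting the entries in /proc/&lt;pid&gt;/fd gives the same number without lsof (my addition, not from the original answer; $$ below is a stand-in for your node process's pid):

```shell
# Count every descriptor the process currently holds open:
pid=$$            # substitute your node process's pid here
ls /proc/"$pid"/fd | wc -l
```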

What is your process limit?

ulimit -a

The line you want will look like this:

open files                      (-n) 1024

Permanently change the limit:

  • tested on Ubuntu 14.04, nodejs v. 7.9

If you expect to open many connections (websockets are a good example), you can permanently increase the limit:

  • file: /etc/pam.d/common-session (add to the end)

    session required pam_limits.so
  • file: /etc/security/limits.conf (add to the end, or edit if already exists)

    root soft  nofile 40000
    root hard  nofile 100000
  • restart your nodejs and logout/login from ssh.

  • this may not work for older Node.js versions; in that case you'll need to restart the server
  • use your node user's name instead of root in limits.conf if your node runs with a different uid.
6/12/2017 3:37:23 PM

Licensed under: CC-BY-SA with attribution
Not affiliated with: Stack Overflow