Sharing one port among multiple node.js HTTP processes

buschtoens · May 29, 2012

I have a root server running with several node.js projects on it. They are supposed to run separately in their own processes and directories. Consider this file structure:

/home
+-- /node
    +-- /someProject      | www.some-project.com
    |   +-- index.js
    |   +-- anotherFile.img
    |   +-- ...
    +-- /anotherProject   | www.another-project.com
    |   +-- /stuff
    |   +-- index.js
    |   +-- ...
    +-- /myWebsite        | www.my-website.com
    |   +-- /static
    |   +-- index.js
    |   +-- ...
    +-- ...               | ...

Each index.js should be started as an individual process with its cwd set to its parent folder (someProject, anotherProject, etc.).

Think of vHosts: each project starts a webserver that listens on its own domain. And there's the problem. Only one script can start, since they all try to bind to port 80. I dug into the node.js API and looked for a possible solution: child_process.fork().

Sadly this doesn't work very well. When I try to send a server instance to the master process (to emit a request later on), or an object consisting of request and response from the master to the slave, I get errors. This is because node.js internally converts these advanced objects to a JSON string and then back to their original form. This makes all the objects lose their references and functionality.
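
The effect is easy to reproduce with a plain stand-in object (a real req/res pair additionally holds sockets and circular references, so JSON.stringify would throw outright):

var fakeRes = { statusCode: 200, end: function() {} };
var copy = JSON.parse(JSON.stringify(fakeRes));

console.log(typeof copy.end); // "undefined" – the method did not survive the round trip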

Second approach: child.js

var http = require("http");

var server = http.createServer(function(req, res) {
    // stuff...
});
server.listen(80);

process.send(server); // Nope

First approach: master.js

var http = require("http"),
    cp = require("child_process");

// note: the option is cwd, not env – env expects an object of environment variables
var child = cp.fork("/home/node/someProject/index.js", [], { cwd: "/home/node/someProject" });

var router = http.createServer(function(req, res) {
    // domaincheck, etc...
    child.send({ request: req, response: res }); // Nope
});
router.listen(80);

So this is a dead end. But hey, Node.js offers sendable handles. Here's an example from the documentation:

master.js

var server = require('net').createServer();
var child = require('child_process').fork(__dirname + '/child.js');
// Open up the server object and send the handle.
server.listen(1337, function() {
  child.send({ server: true }, server._handle);
});

child.js

process.on('message', function(m, serverHandle) {
  if (serverHandle) {
    var server = require('net').createServer();
    server.listen(serverHandle);
  }
});

Here the child listens directly on the master's server, so there is no domain check in between. So this is a dead end too.

I also thought about Cluster, but this uses the same technology as the handle and therefore has the same limitations.

So... are there any good ideas?

What I currently do is rather hackish. I've made a package called distroy. It binds to port 80 and internally proxies all requests to Unix domain socket paths like /tmp/distroy/http/www.example.com, on which the separate apps listen. This also (kinda) works for HTTPS (see my question on SNI). The remaining problem is that the original IP address is lost, as it's now always 127.0.0.1. I think I can circumvent this by monkeypatching net.Server so that I can transmit the IP address before opening the connection.
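
For illustration, here is a minimal sketch of that idea (not distroy's actual source; the one-socket-per-hostname layout under /tmp/distroy/http is assumed from the description above):

var http = require("http"),
    path = require("path");

var SOCKET_DIR = "/tmp/distroy/http"; // assumed layout: one Unix socket per hostname

http.createServer(function(req, res) {
    // route by the Host header; a real implementation should validate it
    // to prevent path traversal and handle missing sockets gracefully
    var host = (req.headers.host || "").split(":")[0];

    var proxyReq = http.request({
        socketPath: path.join(SOCKET_DIR, host), // forward over the Unix domain socket
        method: req.method,
        path: req.url,
        headers: req.headers
    }, function(proxyRes) {
        res.writeHead(proxyRes.statusCode, proxyRes.headers);
        proxyRes.pipe(res);
    });

    proxyReq.on("error", function() {
        res.writeHead(502);
        res.end("Bad Gateway");
    });

    req.pipe(proxyReq);
}).listen(80);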

Answer

almypal · May 29, 2012

If you are interested in a node.js solution, check out bouncy, a websocket- and https-capable http router proxy/load balancer in node.js.

Define your routes.json like

 {
     "beep.example.com" : 8000,
     "boop.example.com" : 8001
 }

and then run bouncy using

 bouncy routes.json 80
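
Alternatively, bouncy can be used programmatically; here is a minimal sketch along the lines of its README (the routing logic inside the callback is up to you):

var bouncy = require('bouncy');

var server = bouncy(function(req, res, bounce) {
    if (req.headers.host === 'beep.example.com') {
        bounce(8000); // hand the raw connection over to the beep app
    } else if (req.headers.host === 'boop.example.com') {
        bounce(8001);
    } else {
        res.statusCode = 404;
        res.end('no such host');
    }
});
server.listen(80);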