I have built a simple HTTP server following tutorials I found online, using Sun's lightweight HttpServer.
Basically the main function looks like this:
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;

public static void main(String[] args) throws Exception {
    HttpServer server = HttpServer.create(new InetSocketAddress(8000), 0);
    // Create the context for the server.
    server.createContext("/", new BaseHandler());
    server.setExecutor(null); // creates a default executor
    server.start();
}
And I have implemented the HttpHandler interface's handle method in BaseHandler to process the HTTP request and return a response.
static class BaseHandler implements HttpHandler {
    // Handler method
    @Override
    public void handle(HttpExchange t) throws IOException {
        // Implementation of HTTP request processing:
        // read the request, get the parameters and print them
        // in the console, then build a response and send it back.
    }
}
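Roughly, the body of handle does something like this (simplified sketch; the query parsing and response text here are just illustrative, not my real code, and it needs import java.io.OutputStream):

public void handle(HttpExchange t) throws IOException {
    // Read the query string (e.g. "int=3") and print it to the console.
    String query = t.getRequestURI().getQuery();
    System.out.println("Received request with query: " + query);
    // Build a plain-text response and send it back.
    String response = "Handled request: " + query;
    t.sendResponseHeaders(200, response.getBytes().length);
    OutputStream os = t.getResponseBody();
    os.write(response.getBytes());
    os.close();
}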
I have also created a Client that sends multiple requests via threads. Each thread sends the following request to the server:
http://localhost:8000/[context]?int="+threadID
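For completeness, each client thread does roughly the following (the class name RequestTask and the console output are just for illustration):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

class RequestTask implements Runnable {
    private final int threadID;

    RequestTask(int threadID) {
        this.threadID = threadID;
    }

    public void run() {
        try {
            // Send the request with this thread's ID as the query parameter.
            URL url = new URL("http://localhost:8000/?int=" + threadID);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println("Thread " + threadID + " got: " + line);
            }
            in.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}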
On each client run, the requests seem to arrive at the server in a different order, but they are served serially.
What I wish to accomplish is for the requests to be processed in parallel, if that is possible.
Is it possible, for example, to run each handler in a separate thread, and if so, is it a good thing to do?
Or should I just drop Sun's lightweight server altogether and focus on building something from scratch?
Thanks for any help.
As you can see in ServerImpl, the default executor just runs the task:
private static class DefaultExecutor implements Executor {
    public void execute(Runnable task) {
        task.run();
    }
}
You must provide a real executor for your HttpServer, like this:
server.setExecutor(java.util.concurrent.Executors.newCachedThreadPool());
and your server will handle requests in parallel. Careful, this is an unbounded executor; see Executors.newFixedThreadPool to limit the number of threads.
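For example, a bounded pool looks like this (the pool size of 10 is arbitrary):

// At most 10 requests are handled concurrently; the rest wait in the executor's queue.
server.setExecutor(java.util.concurrent.Executors.newFixedThreadPool(10));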