I was running a few tests with sockets, and I encountered some strange behavior: A ServerSocket will refuse connections after the 50th client Socket connects to it, even if that client socket is closed before the next one is opened, and even if a delay is added between connections.
The following program is my experimental code, which, in its current state, throws no exceptions and terminates normally. However, if the size of the Socket[] clients array is increased beyond 50, any client socket attempting to connect after the 50th connection is refused by the server socket.
Question: Why is 50 the count at which socket connections are refused by a server socket?
import java.net.ServerSocket;
import java.net.Socket;

public class SocketTest {
    public static void main(String[] args) {
        try (ServerSocket server = new ServerSocket(2123)) {
            Socket[] clients = new Socket[50]; // connections fail if this exceeds 50
            for (int i = 0; i < clients.length; i++) {
                clients[i] = new Socket("localhost", 2123);
                System.out.printf("Client %2d: %s%n", i, clients[i]);
                clients[i].close(); // closed before the next connection opens
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
I have run tests where another 50 sockets connect to a second local server, and no issue occurred with 100 sockets being opened and closed in total. From this I've deduced that it's the server socket refusing connections, not some limit on how many client sockets can be opened. But I have been unable to discover why the server socket is limited to 50 connections, even though they are never connected simultaneously. The two-server variant looked roughly like the sketch below.
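Sketch of that test (the ports and class name are arbitrary; this is a reconstruction of what I described, not the exact code I ran):

import java.net.ServerSocket;
import java.net.Socket;

public class TwoServerTest {
    public static void main(String[] args) throws Exception {
        try (ServerSocket serverA = new ServerSocket(2123);
             ServerSocket serverB = new ServerSocket(2124)) {
            for (int i = 0; i < 100; i++) {
                int port = (i < 50) ? 2123 : 2124; // at most 50 per server
                try (Socket client = new Socket("localhost", port)) {
                    System.out.printf("Client %2d connected to port %d%n", i, port);
                }
            }
        }
    }
}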
It's all in the ServerSocket JavaDoc:
The maximum queue length for incoming connection indications (a request to connect) is set to 50. If a connection indication arrives when the queue is full, the connection is refused.
Apparently your ServerSocket never accepts any connections; it only listens. Every incoming connection therefore stays in the backlog queue, and closing the client socket does not remove its entry from that queue; only accept() does. You must either call accept() and start handling the connections, or increase the backlog queue size:

new ServerSocket(port, 100)
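For example, here is a minimal sketch of the accept() approach (the class name and the throwaway connection handling are placeholders; a real server would hand each accepted Socket off for processing). A background thread drains the listen queue, so the backlog never fills no matter how many clients connect in sequence:

import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

public class AcceptingServer {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(2123)) {
            Thread acceptor = new Thread(() -> {
                try {
                    while (true) {
                        // accept() removes one pending connection from the backlog queue
                        Socket connection = server.accept();
                        connection.close(); // placeholder: handle the connection here
                    }
                } catch (IOException e) {
                    // server socket closed; stop accepting
                }
            });
            acceptor.setDaemon(true);
            acceptor.start();

            for (int i = 0; i < 100; i++) { // now well past 50
                try (Socket client = new Socket("localhost", 2123)) {
                    System.out.printf("Client %3d connected%n", i);
                }
            }
        }
    }
}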