I have been working with the following code published on MSDN:
http://msdn.microsoft.com/en-us/library/fx6588te.aspx
I understand that the server application is not blocked whilst it is waiting for new clients.
However, can this application (or even sockets, for that matter) handle multiple concurrent requests?
What would happen if client A and B connected at the same time?
If client A connects and the handling of its request takes 5 seconds, and client B connects a second later, must client B wait for client A to finish before its own processing can start?
Or will client A and client B's requests be handled concurrently?
I have done some testing with this by putting Thread.Sleep(n) calls in between the receive/send data in the socket listener code. I can then send multiple requests to the socket and they appear to be handled. However, the socket always handles them on the same thread ID, which makes me believe that it isn't actually happening concurrently.
Especially given Microsoft's description that this app simply doesn't block whilst waiting for new connections - does that mean it can handle concurrent connections?
[Update 2014]: It seems that the example has been modified since this answer was posted, as noted in this thread. The MSDN example now handles multiple incoming connections properly. Anyway, the general approach described here is correct and perhaps it can provide additional clarification.
When doing socket communication, you basically have a single listener socket for all incoming connections, and a separate handler socket for each connected client.
When you start listening on a port, you create a socket with a callback method for incoming connections (this is referencing the example you mentioned). That's the one-and-only listener socket for that port number:
listener.BeginAccept(new AsyncCallback(AcceptCallback), listener);
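For context, that call typically sits at the end of a listener setup along these lines (just a sketch; the port number, backlog and IPAddress.Any binding are illustrative, not taken from the article):

// create the one-and-only listener socket for the port (sketch)
Socket listener = new Socket(AddressFamily.InterNetwork,
    SocketType.Stream, ProtocolType.Tcp);
listener.Bind(new IPEndPoint(IPAddress.Any, 11000)); // illustrative port
listener.Listen(100);                                // backlog of pending connections
// ...followed by the BeginAccept call shown above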
That BeginAccept call tells the listener to invoke the AcceptCallback method whenever a new client is connected (new connection callback). That method is the one which should do its work quickly, since it blocks other incoming connections. That is also why AcceptCallback must immediately create a dedicated "handler" socket with its own background data callback method (ReadCallback):
// inside AcceptCallback, we switch to the handler socket for communication
handler.BeginReceive(state.buffer, 0, StateObject.BufferSize, 0,
new AsyncCallback(ReadCallback), state); // fired on a background thread
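In the example, the handler socket and its state object come from the accepted connection itself, roughly like this (a sketch; StateObject is the small helper class from the article that holds the buffer and the client socket):

// inside AcceptCallback: get the per-client socket and prepare its state
Socket listener = (Socket)ar.AsyncState;
Socket handler = listener.EndAccept(ar);   // dedicated socket for this client

StateObject state = new StateObject();     // buffer + reference to the handler socket
state.workSocket = handler;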
From that moment on, the ReadCallback method is invoked whenever some data is received from your newly connected client.

Also, before returning, AcceptCallback needs to call listener.BeginAccept again, to continue listening for new incoming connections:
// this is the same server socket we opened previously, which will now
// continue waiting for other client connections: it doesn't care about
// the actual data transmission between individual clients
listener.BeginAccept(new AsyncCallback(AcceptCallback), listener);
This part is omitted from the MSDN example, meaning it can only receive a single connection.
As soon as you get a packet of data from your client, the ReadCallback method will be invoked. So, inside this data callback method, you need to read and process the received data, and then invoke the same BeginReceive method again (again, with ReadCallback as its data callback method).
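A minimal ReadCallback along those lines might look like this (a sketch; the StringBuilder accumulation mirrors the example's StateObject, and error handling is omitted):

private static void ReadCallback(IAsyncResult ar)
{
    StateObject state = (StateObject)ar.AsyncState;
    Socket handler = state.workSocket;

    int bytesRead = handler.EndReceive(ar);
    if (bytesRead > 0)
    {
        // process the chunk that just arrived (here we simply accumulate it)
        state.sb.Append(Encoding.ASCII.GetString(state.buffer, 0, bytesRead));

        // re-issue the receive so the next chunk from this client is picked up
        handler.BeginReceive(state.buffer, 0, StateObject.BufferSize, 0,
            new AsyncCallback(ReadCallback), state);
    }
}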
[Edit]
The problem with the MSDN example is that it allows only a single client to connect (listener.BeginAccept is called only once). To allow multiple concurrent connections, you need to start receiving on the handler socket using handler.BeginReceive, and then call listener.BeginAccept again to continue listening for new clients.
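Putting it together, an AcceptCallback that supports multiple concurrent clients could look roughly like this (again just a sketch using the example's names, without error handling):

private static void AcceptCallback(IAsyncResult ar)
{
    Socket listener = (Socket)ar.AsyncState;
    Socket handler = listener.EndAccept(ar);   // per-client socket

    // resume accepting right away, so other clients are not kept waiting
    listener.BeginAccept(new AsyncCallback(AcceptCallback), listener);

    // start the asynchronous receive loop for this client
    StateObject state = new StateObject();
    state.workSocket = handler;
    handler.BeginReceive(state.buffer, 0, StateObject.BufferSize, 0,
        new AsyncCallback(ReadCallback), state);
}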