How to specify which SSL/TLS protocol the WebClient class uses

Asked by JoeMjr2 · May 27, 2015

I have an application that sends data to a server using an HTTPS POST. I use a System.Net.WebClient object to do this. Here is a function that sends some data:

    private byte[] PostNameValuePairs(string uri, NameValueCollection pairs)
    {
        byte[] response;
        string responseString = "";
        using (WebClient client = new WebClient())
        {
            client.Headers = GetAuthenticationHeader();

            string dataSent = GetNameValueCollectionValuesString(pairs);

            try
            {
                // Issue the HTTPS POST and decode the server's reply
                response = client.UploadValues(uri, pairs);
                responseString = Encoding.ASCII.GetString(response);
            }
            catch (Exception e)
            {
                responseString = "CONNECTION ERROR: " + e.Message;
                return Encoding.ASCII.GetBytes(responseString);
            }
            finally
            {
                // Log the request/response pair whether or not the call succeeded
                _communicationLogger.LogCommunication(uri, client.Headers.ToString(), dataSent, responseString);
            }
        }

        return response;
    }

We are passing in a URI beginning with https://.

This has been working great for a long time. Today, we started getting the following connection error: "The underlying connection was closed: An unexpected error occurred on a send". After some troubleshooting with the owner of the server, they narrowed it down to a change on their end: they blocked TLS 1.0, so we now need to send our data using either TLS 1.1 or 1.2.

What do I need to set in my WebClient object (or elsewhere in my function) to make it use TLS 1.1 or 1.2 instead of TLS 1.0?

We are using .NET Framework 4.5, if that makes a difference.

Answer

Answered by JoeMjr2 · Jul 1, 2015

From the suggested other questions, I was able to solve it by adding the following line to my code:

    System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;

This stops the client from offering TLS 1.0, and the server then accepted the connection.
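
For context, ServicePointManager.SecurityProtocol is a static, process-wide setting, so it is typically assigned once at startup rather than per request. Below is a minimal sketch of how the fix fits together; the URL and POST data are placeholders for illustration, not from the original code:

    using System;
    using System.Collections.Specialized;
    using System.Net;
    using System.Text;

    class Program
    {
        static void Main()
        {
            // Process-wide: offer only TLS 1.1 and TLS 1.2 on outgoing connections.
            // Set this once, before any HTTPS request is made.
            ServicePointManager.SecurityProtocol =
                SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;

            var pairs = new NameValueCollection
            {
                { "name", "value" } // placeholder POST data
            };

            using (var client = new WebClient())
            {
                // The handshake now negotiates TLS 1.1 or 1.2, never TLS 1.0.
                byte[] response = client.UploadValues("https://example.com/endpoint", pairs);
                Console.WriteLine(Encoding.ASCII.GetString(response));
            }
        }
    }

One caveat worth noting: SecurityProtocolType.Tls11 and Tls12 were added in .NET Framework 4.5, so the assignment compiles there, but the framework's default protocols did not include them until 4.6, which is why the explicit assignment is needed on 4.5.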

Hope this helps someone else with the same issue. Although the answer is similar to those of the other suggested questions, it wasn't obvious from the way they were asked that they covered this case, so I don't feel that this is a duplicate.