Calculating SHA1 hash of a 'nvarchar' string using T-SQL

Xaqron · Mar 17, 2011 · Viewed 7.1k times

I'm trying to calculate the SHA1 hash of a Unicode string using T-SQL. The code below works fine with ASCII strings:

declare @input varchar(50)
set @input = 'some text'
print 'SHA1 Hash: ' + UPPER(master.dbo.fn_varbintohexsubstring(0, HashBytes('SHA1', @input), 1, 0))

but it calculates the wrong hash when I replace the first line of code with declare @input nvarchar(50).
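
That is, the failing version is the same snippet with only the declaration changed:

declare @input nvarchar(50)
set @input = 'some text'
print 'SHA1 Hash: ' + UPPER(master.dbo.fn_varbintohexsubstring(0, HashBytes('SHA1', @input), 1, 0))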

Calculated hash (nvarchar): BBA91B680CE2685E9465DE24967E425CF055B10F
Calculated hash by a tool : 37AA63C77398D954473262E1A0057C1E632EDA77

How can I calculate the SHA1 hash of an nvarchar?

[EDIT]:

The C# code below generates the same hash as the tool I use for hashing:

// Computes the SHA1 hash of a given string
// (requires System.Text and System.Security.Cryptography)
string ComputeHash(string input)
{
    string result = string.Empty;
    byte[] hash;
    // A single-byte (UTF-8) encoding is assumed here; this is what makes the
    // result match the tool, whereas SQL Server hashes UTF-16 bytes for nvarchar
    byte[] bytes = Encoding.UTF8.GetBytes(input);

    using (var sha = SHA1Managed.Create())
        hash = sha.ComputeHash(bytes);

    // Format each byte as two uppercase hex digits
    foreach (var b in hash)
        result += b.ToString("X2");

    return result;
}

Answer

Lamak · Mar 17, 2011

Are you sure that the tool is hashing the UTF-16 (Unicode) representation of the string when you compare its result with the one returned by SQL Server? SHA1 (like any hash algorithm) works on the raw bytes of its input, so the varchar and nvarchar representations of the same string are different inputs and should return different values. Take a look at this link for a more detailed explanation.
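
A quick way to see this in T-SQL (a minimal sketch reusing the same helper function as the question) is to hash the same text as both varchar and nvarchar:

declare @ascii varchar(50), @unicode nvarchar(50)
set @ascii = 'some text'
set @unicode = N'some text'

-- The two hashes differ because HashBytes sees different byte sequences:
-- single-byte characters for varchar, UTF-16 code units for nvarchar
select
    master.dbo.fn_varbintohexsubstring(0, HashBytes('SHA1', @ascii), 1, 0)   as varchar_hash,
    master.dbo.fn_varbintohexsubstring(0, HashBytes('SHA1', @unicode), 1, 0) as nvarchar_hash

-- For ASCII-only text, converting the nvarchar value to varchar before hashing
-- reproduces the varchar result (and the hash reported by tools that hash
-- single-byte ASCII/UTF-8 input)
select master.dbo.fn_varbintohexsubstring(0, HashBytes('SHA1', convert(varchar(50), @unicode)), 1, 0) as converted_hash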