I've tried the example from Wikipedia: http://en.wikipedia.org/wiki/Longitudinal_redundancy_check
This is my C# code for the LRC:
/// <summary>
/// Longitudinal Redundancy Check (LRC) calculator for a byte array.
/// ex) DATA (hex 6 bytes): 02 30 30 31 23 03
///     LRC  (hex 1 byte ): EC
/// </summary>
public static byte calculateLRC(byte[] bytes)
{
    byte LRC = 0x00;
    for (int i = 0; i < bytes.Length; i++)
    {
        LRC = (byte)((LRC + bytes[i]) & 0xFF);
    }
    return (byte)(((LRC ^ 0xFF) + 1) & 0xFF);
}
It says the result is "EC", but I get "71". What am I doing wrong?
Thanks.
Here's a cleaned-up version that skips the redundant work (instead of discarding the high bits on every iteration, they're discarded all at once at the end), and it gives the result you observed. The addition-based version would need a negation at the end, so we might as well subtract as we go and skip the negation. That transformation is valid even when the sum overflows, because everything is modulo 256.
public static byte calculateLRC(byte[] bytes)
{
    int LRC = 0;
    for (int i = 0; i < bytes.Length; i++)
    {
        LRC -= bytes[i];
    }
    return (byte)LRC;
}
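Quick check: the sample bytes sum to 0xB9, and 0x100 - 0xB9 = 0x47, which is 71 in decimal, so the "71" you're getting is the correct additive LRC, just printed in decimal. For example (assuming the method above is in scope):
byte[] data = { 0x02, 0x30, 0x30, 0x31, 0x23, 0x03 };
byte lrc = calculateLRC(data);
Console.WriteLine(lrc);                // prints 71 (decimal)
Console.WriteLine(lrc.ToString("X2")); // prints 47 (hex)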
Here's the alternative LRC (a simple XOR of the bytes):
public static byte calculateLRC(byte[] bytes)
{
    byte LRC = 0;
    for (int i = 0; i < bytes.Length; i++)
    {
        LRC ^= bytes[i];
    }
    return LRC;
}
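For the same sample data this XOR variant gives 0x13, so neither flavor produces the article's EC. For example (again assuming the method above is in scope):
byte[] data = { 0x02, 0x30, 0x30, 0x31, 0x23, 0x03 };
Console.WriteLine(calculateLRC(data).ToString("X2")); // prints 13 (hex)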
And Wikipedia is simply wrong in this case, both in the code (doesn't compile) and in the expected result.
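You can convince yourself that 47 is right and EC isn't by checking the defining property of an additive LRC: appending the check byte must make the total sum come out to zero modulo 256. A minimal sanity check:
byte[] data = { 0x02, 0x30, 0x30, 0x31, 0x23, 0x03 };
int sum = 0;
foreach (byte b in data) sum += b;             // sum == 0xB9
Console.WriteLine(((sum + 0x47) & 0xFF) == 0); // True:  0x47 is a valid LRC
Console.WriteLine(((sum + 0xEC) & 0xFF) == 0); // False: 0xEC is not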