What is the minimum number of bits needed to correct all 2 bit errors?

user623879 · Apr 12, 2011

I learned about Hamming codes and how to use them to correct 1-bit errors and detect all 2-bit errors, but how do I extend this to correcting 2 bits, and maybe more?

What is the minimum number of bits needed to correct all 2 bit errors?

Answer

user623879 · Apr 12, 2011

I think I figured it out.

N = number of data bits, k = number of error-correcting bits (e.g. parity bits for Hamming)

In any ECC scheme, you have 2^(N+k) possible bit strings.

For single-bit errors:

You must find k such that the total number of possible bit strings is at least the number of valid codewords (2^N) multiplied by the number of strings within at most 1 bit error of each codeword.

The total number of strings within at most 1 bit error of some codeword is (2^N)*(N+k+1):

each codeword accounts for 1 string with no error, plus N+k strings with a 1-bit error.

2^(N+k) >= (2^N)*(N+k+1)

You simply have to plug in increasing values of k until you find the smallest one that satisfies the above (or solve it however you wish).
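As a rough sketch (my own helper, not part of the original answer), a small Python loop that does this search; the function name min_parity_bits_1 is made up for illustration:

```python
# Minimal sketch: find the smallest k with
#   2^(N+k) >= 2^N * (N + k + 1),
# i.e. the single-bit-error-correction bound from above.
def min_parity_bits_1(N):
    k = 0
    while 2 ** (N + k) < (2 ** N) * (N + k + 1):
        k += 1
    return k

# Example: 4 data bits need 3 check bits (the classic Hamming(7,4) code).
print(min_parity_bits_1(4))  # 3
```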

Similarly, for 2-bit errors:

each codeword accounts for 1 string with no error, N+k strings with a 1-bit error, and (N+k choose 2) strings with a 2-bit error.

2^(N+k) >= (2^N)*(1 + (N+k) + (N+k choose 2))
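Assuming the same search approach, here is a sketch extending the loop to the 2-bit-error bound (again, min_parity_bits_2 is just an illustrative name). Note this sphere-packing inequality is only a necessary condition; a real code may need more check bits than the bound suggests.

```python
import math

# Minimal sketch: find the smallest k with
#   2^(N+k) >= 2^N * (1 + (N+k) + C(N+k, 2)),
# i.e. the sphere-packing bound for correcting all 2-bit errors.
def min_parity_bits_2(N):
    k = 0
    while 2 ** (N + k) < (2 ** N) * (1 + (N + k) + math.comb(N + k, 2)):
        k += 1
    return k

# Example: for 4 data bits the bound requires at least 6 check bits.
print(min_parity_bits_2(4))  # 6
```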