I'm trying to remember how to work out the math for computing the remainder in the XOR (modulo-2) division that a Cyclic Redundancy Check uses to verify the check bits of a network message.
I shouldn't have tossed that textbook.
This is easily done in code, but how is it worked out by hand?
I know it looks something like standard long division, but I can't remember how to proceed from there to get the remainder.
       _________
1010 | 101101000
Note: I did Google it, but wasn't able to find a page that mapped out the steps for working out the remainder.
The rule is the same as in long division, except subtraction is replaced by XOR (modulo-2 subtraction, with no borrows): align the divisor under the leftmost 1 of the working value, XOR, bring down the remaining bits, and repeat until no 1 remains that the divisor can be aligned under. Whatever is left is the remainder.

1010 | 101101000
       1010          1011 XOR 1010 = 0001
       ----
       000101000     bring down the remaining bits
          1010       1010 XOR 1010 = 0000
          ----
          000000     the final 3 bits are the remainder: 000, thus no remainder.
Since the remainder is zero, 101101000 checks out: no error was detected in transmission/reception.
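For reference, here's a minimal Python sketch of the same bit-by-bit XOR division. The function name crc_remainder and the bit-string representation are just my choices for illustration, not any standard API:

def crc_remainder(dividend: str, divisor: str) -> str:
    """Modulo-2 (XOR) long division on bit strings; returns the remainder."""
    bits = list(dividend)
    n = len(divisor)
    # Slide left to right; whenever the leading bit of the current
    # window is 1, XOR the divisor into that window (no borrows).
    for i in range(len(bits) - n + 1):
        if bits[i] == '1':
            for j in range(n):
                bits[i + j] = '0' if bits[i + j] == divisor[j] else '1'
    # The last len(divisor) - 1 bits are what's left: the remainder.
    return ''.join(bits[len(bits) - n + 1:])

print(crc_remainder('101101000', '1010'))  # prints 000 -> no remainder

The sender produces the check bits the same way: append divisor-length-minus-one zeros to the data bits (here the data would be 101101, padded to 101101000), divide, and replace the zeros with the remainder (which happens to be 000 in this example). The receiver divides the whole received string; a zero remainder means no error was detected.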