While studying for a class in computer networks, the professor spoke about the Hamming distance between two valid code words in a sample code.
I have read about Hamming distance, and I understand it in terms of describing the distance between two strings, for example:
Code Word 1 = 10110
The sender sends code word 1, an error is introduced, and the receiver receives 10100. So you can see that the fourth bit was corrupted. This would be a Hamming distance of 1:
valid code word: 10110
error code word: 10100
                 ----- XOR
                 00010
The XOR of the two strings results in a single 1, so the Hamming distance is 1. I understand it up to that point. But then the professor asks:
- What is the Hamming distance of the standard CRC-16 protocol?
- What is the Hamming distance of the standard CRC-32 protocol?
I'm a bit confused, and wondering if someone can help. Thank you.
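To make the XOR-and-count step above concrete, here is a minimal Python sketch (just for illustration; hamming_distance is my own helper, not part of any standard library):

    def hamming_distance(a: str, b: str) -> int:
        """Number of bit positions in which two equal-length bit strings differ."""
        if len(a) != len(b):
            raise ValueError("code words must have the same length")
        # Comparing bit by bit and counting the mismatches is the same as
        # XOR-ing the two strings and counting the 1s in the result.
        return sum(x != y for x, y in zip(a, b))

    print(hamming_distance("10110", "10100"))  # prints 1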
You might have figured it out by now, but what he is asking for is the minimum Hamming distance of the CRC code, which depends on the width, the polynomial, and the length of the message. For example, the widely used CRC-32 polynomial 0x1EDC6F41 (the Castagnoli polynomial) has a Hamming distance of 6 or more for messages of up to 5,275 bits (Castagnoli, Bräuer, and Herrmann, "Optimization of cyclic redundancy-check codes with 24 and 32 parity bits", IEEE Transactions on Communications, Vol. 41, No. 6, June 1993), which means that any 5 flipped bits in a message of 5,275 bits or less are guaranteed to be detected.
BTW, the checksum is part of the code word, so your example isn't quite right as stated.
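To make "the minimum Hamming distance of a CRC code" concrete, here is a rough brute-force sketch in Python (my own illustration, not part of the answer above). Because a CRC with a zero initial value is linear, the minimum distance over all pairs of code words equals the minimum weight of a single nonzero code word (message bits plus CRC bits), so for a toy-sized CRC it can be found by enumerating every nonzero message of a given length. The 8-bit polynomial and the 10-bit message length below are arbitrary choices, picked only because they are small enough to search exhaustively.

    def crc_remainder(message: int, msg_len: int, poly: int, width: int) -> int:
        """Remainder of message(x) * x^width divided by poly(x) over GF(2).
        'poly' includes the leading x^width term; zero initial value, no reflection."""
        rem = message << width                # append 'width' zero bits
        for shift in range(msg_len - 1, -1, -1):
            if rem & (1 << (shift + width)):  # leading 1 at this position?
                rem ^= poly << shift          # cancel it with the polynomial
        return rem                            # the CRC, at most 'width' bits

    def min_distance(msg_len: int, poly: int, width: int) -> int:
        """Minimum Hamming distance of the CRC code for messages of msg_len bits,
        found as the minimum weight of a nonzero code word (by linearity)."""
        best = msg_len + width
        for message in range(1, 1 << msg_len):
            codeword = (message << width) | crc_remainder(message, msg_len, poly, width)
            best = min(best, bin(codeword).count("1"))
        return best

    # Toy example: an 8-bit CRC with polynomial x^8 + x^2 + x + 1 (0x107 with the
    # leading term) over 10-bit messages -- small enough to brute-force.
    print(min_distance(msg_len=10, poly=0x107, width=8))

A real CRC-16 or CRC-32 is far too wide to search this way; that is the point of the answer above: the distance has to be looked up or derived for the specific polynomial and message length, as in the cited paper.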