
1) Since we want to attach n-m check digits to each word, we first select an (n-m) x n matrix H (with entries 0 or 1) in such a manner that if we take any d-1 or fewer columns of it and add them up, we get a non-zero column. Mathematically, this means that every set of d-1 columns is linearly independent. The matrix H can be chosen in the form
H = (A | I),
where A is an (n-m) x m matrix and I is the (n-m) x (n-m) identity matrix.
1) Here, n = 5, m = 2 and d = 3. Take
H =
1 0 1 0 0
1 1 0 1 0
0 1 0 0 1
No column of H is zero and no two columns are equal, so any d-1 = 2 or fewer columns add up to a non-zero column.
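To check the condition on the columns concretely, here is a minimal Python sketch (Python is used here purely for illustration and is not part of the method itself) that verifies that no single column and no sum of two columns of this H is zero:

from itertools import combinations

# The parity check matrix H of the example (3 x 5, entries 0 or 1).
H = [[1, 0, 1, 0, 0],
     [1, 1, 0, 1, 0],
     [0, 1, 0, 0, 1]]

columns = list(zip(*H))   # the five columns of H, each a 3-tuple

# d = 3: every single column and every sum of two columns must be non-zero (mod 2).
single_ok = all(any(c) for c in columns)
pair_ok = all(any((a + b) % 2 for a, b in zip(c1, c2))
              for c1, c2 in combinations(columns, 2))
print(single_ok and pair_ok)   # True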
2) From the matrix H, we construct the m x n matrix
G = (I | A^T).
(Here, I is the m x m identity matrix, and A^T is the transpose of A, the matrix obtained from A by rewriting the rows as columns.) The check digits will be determined by the choice of the matrix A.
2) Here, G =
1 0 1 1 0
0 1 0 1 1
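As an illustration, the following Python sketch (the variable names are ours) builds G from the matrix A that appears in H = (A | I):

# A is the first two columns of H above; G = (I | A^T).
A = [[1, 0],
     [1, 1],
     [0, 1]]                                   # (n-m) x m = 3 x 2

m = len(A[0])
A_T = [list(col) for col in zip(*A)]           # transpose: rows become columns
I_m = [[1 if i == j else 0 for j in range(m)] for i in range(m)]

G = [I_m[i] + A_T[i] for i in range(m)]        # paste I and A^T side by side
for row in G:
    print(row)   # [1, 0, 1, 1, 0] and [0, 1, 0, 1, 1]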
3) The word w to be transmitted is first changed to its
corresponding codeword f(w), and it is this codeword that is transmitted.
3) Suppose w = 01. It is treated as the 1 x 2 matrix (0 1).
f(w) = w.G = (0 1 0 1 1).
This represents the codeword 01011, and this is transmitted.
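The encoding is just a matrix product with all arithmetic done mod 2. A minimal Python sketch (the helper name encode is ours):

G = [[1, 0, 1, 1, 0],
     [0, 1, 0, 1, 1]]

def encode(w):
    # w.G with mod-2 arithmetic: each codeword digit is the mod-2 dot product
    # of w with the corresponding column of G.
    return [sum(wi * g for wi, g in zip(w, col)) % 2 for col in zip(*G)]

print(encode([0, 1]))   # [0, 1, 0, 1, 1], i.e. the codeword 01011
print(encode([1, 0]))   # [1, 0, 1, 1, 0]
print(encode([1, 1]))   # [1, 1, 1, 0, 1]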
4) When a word x is received, it is first compared with the codewords. If x is indeed a codeword, it is decoded by dropping the last n-m check digits to obtain the original word. If x is not a codeword, we know that some error has occurred. We determine the codeword y that is at the least distance from x, and correct x to y. We then decode y as before by discarding the check digits.
4) Suppose one error occurs, and the received word x is 01111. Now, the set of codewords is {00000, 01011, 10110, 11101}. Clearly, x is not a codeword. The codeword that is at the least distance (one) from x is 01011. All other codewords are at a distance of at least two from 01111. So, we correct 01111 to 01011, which is decoded as 01 by dropping the last three check digits.
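The nearest-codeword correction of this example can be sketched in Python as follows (illustrative only; the helper names are ours):

codewords = ["00000", "01011", "10110", "11101"]

def distance(a, b):
    # Hamming distance: the number of positions in which a and b differ.
    return sum(x != y for x, y in zip(a, b))

def correct(x):
    # Pick the codeword at the least distance from the received word x.
    return min(codewords, key=lambda c: distance(c, x))

y = correct("01111")
print(y)       # 01011
print(y[:2])   # 01, after dropping the last n-m = 3 check digits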
- We first define the syndrome of a word v in B^n as
syndrome (v) = v.H^T.
5) For example,
syndrome (10100) = (0 1 0)
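A small Python sketch of this computation (the function name syndrome is chosen to match the text):

H = [[1, 0, 1, 0, 0],
     [1, 1, 0, 1, 0],
     [0, 1, 0, 0, 1]]

def syndrome(v):
    # v.H^T with mod-2 arithmetic: one mod-2 dot product of v with each row of H.
    return tuple(sum(vi * hi for vi, hi in zip(v, row)) % 2 for row in H)

print(syndrome((1, 0, 1, 0, 0)))   # (0, 1, 0)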
- It is easily shown that for every codeword v,
syndrome (v) = 0_(n-m), where 0_(n-m) = (0 0 ... 0) is the zero word of length n-m;
and conversely, if syndrome (v) = 0_(n-m), then v must be a codeword.
6) For example,
syndrome (01011) = (0 0 0).
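Since B^5 has only 32 words, both directions of this statement can be confirmed by brute force; the following Python check (illustrative only) does exactly that:

from itertools import product

H = [[1, 0, 1, 0, 0],
     [1, 1, 0, 1, 0],
     [0, 1, 0, 0, 1]]
codewords = {(0, 0, 0, 0, 0), (0, 1, 0, 1, 1), (1, 0, 1, 1, 0), (1, 1, 1, 0, 1)}

def syndrome(v):
    return tuple(sum(vi * hi for vi, hi in zip(v, row)) % 2 for row in H)

# A word of B^5 has the zero syndrome exactly when it is a codeword.
print(all((syndrome(v) == (0, 0, 0)) == (v in codewords)
          for v in product((0, 1), repeat=5)))   # True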
- Suppose we send the codeword w and, due to an error, certain digits are changed. We represent this error as a word e in B^n as follows:
If the i-th digit of w has been changed, the i-th digit of e is 1, else it is 0.
If the received word is v, then clearly v = w + e. Therefore,
syndrome (v) = v.H^T = (w + e).H^T = w.H^T + e.H^T = 0 + e.H^T = syndrome (e).
7) Message to send : 1 0
Codeword sent (w) : 1 0 1 1 0
Word received (v) : 1 0 1 0 0
Error word (e) : 0 0 0 1 0
syndrome (10100)
= (0 1 0)
= syndrome (00010)
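The same calculation as a short Python sketch (illustrative only), confirming that v = w + e and that v and e have the same syndrome:

H = [[1, 0, 1, 0, 0],
     [1, 1, 0, 1, 0],
     [0, 1, 0, 0, 1]]

def syndrome(v):
    return tuple(sum(vi * hi for vi, hi in zip(v, row)) % 2 for row in H)

w = (1, 0, 1, 1, 0)                                # codeword sent
e = (0, 0, 0, 1, 0)                                # error in the fourth digit
v = tuple((wi + ei) % 2 for wi, ei in zip(w, e))   # received word 1 0 1 0 0

print(syndrome(v))   # (0, 1, 0)
print(syndrome(e))   # (0, 1, 0), the same, as the relation above predicts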
8) Now, e is the word obtained when the zero codeword suffers the same error as the transmitted codeword w. Therefore, in view of the above relation, it is enough to store just the acceptable errors in the zero word and their corresponding syndromes. If the syndrome of the received word v is not zero, an error has occurred. We then determine the error e corresponding to this syndrome, and since v = w + e and e + e = 0, we can calculate w = v + e. As before, we obtain the original message by dropping the n-m check digits at the end of the codeword w.
8) v = 1 0 1 0 0
e = 0 0 0 1 0
Hence, w = v + e = 1 0 1 1 0,
which is the codeword sent.
Discarding the check digits, we get the original message: 10.
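A minimal Python sketch of this syndrome-table decoding, assuming (as in this example) that the acceptable errors are at most one wrong digit; the names error_table and decode are ours:

H = [[1, 0, 1, 0, 0],
     [1, 1, 0, 1, 0],
     [0, 1, 0, 0, 1]]
n, m = 5, 2

def syndrome(v):
    return tuple(sum(vi * hi for vi, hi in zip(v, row)) % 2 for row in H)

# Acceptable errors: no error, or exactly one wrong digit.
acceptable = [(0,) * n] + [tuple(1 if j == i else 0 for j in range(n)) for i in range(n)]
error_table = {syndrome(e): e for e in acceptable}     # only 6 entries to store

def decode(v):
    e = error_table[syndrome(v)]                       # look up the error word
    w = tuple((vi + ei) % 2 for vi, ei in zip(v, e))   # w = v + e (mod 2)
    return w[:m]                                       # drop the n-m check digits

print(decode((1, 0, 1, 0, 0)))   # (1, 0), the original message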
In real-life situations, we cannot waste time comparing the received word with every codeword to determine the one it is "closest" to. So, we build a simple lookup table as follows: the codewords form the top row, and below each codeword we list the words formed (as the result of errors) which have to be corrected to that codeword. When a word is received, we check if it is a codeword, and if it is not, we correct it to the codeword at the top of its column.
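As an illustration, such a table for our 5-digit code can be built in Python as follows (ties between equally close codewords are broken arbitrarily here):

from itertools import product

codewords = ["00000", "01011", "10110", "11101"]

def distance(a, b):
    return sum(x != y for x, y in zip(a, b))

# One column per codeword; every word of B^5 is listed under the codeword
# it would be corrected to.
table = {c: [] for c in codewords}
for bits in product("01", repeat=5):
    word = "".join(bits)
    table[min(codewords, key=lambda c: distance(c, word))].append(word)

for c in codewords:
    print(c, ":", " ".join(table[c]))   # all 32 words of B^5 appear in the table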
Here, however, we have a potential problem: to correct messages reasonably quickly, we have to maintain a table that contains not only all the codewords but also the words formed as the result of various errors. Together, these words form the entire set B^n. In many applications, words are twenty bits or more in length. Now B^25, for instance, has over 33 million words! Clearly, we need a streamlined error-correcting method that does not require so much storage.
Our linear coding technique solves this problem by decoding the received words slightly differently.
