Syndrome Decoding


DECODING

Proposition (1) (# of errors a code can detect) Let C be an [n, k, d]-code. Then C can
detect ≤ s errors if d ≥ s + 1.
Proof: Assume d ≥ s + 1. Suppose a codeword c ∈ C is sent and at least one but no more than
s errors occur, so the received word r satisfies 1 ≤ d_H(c, r) ≤ s. Then r cannot be a codeword:
if r ∈ C, then r ≠ c, so by the definition of d we would have d_H(c, r) ≥ d ≥ s + 1, a
contradiction. Hence the errors are detected. □
Proposition (2) (# of errors a code can correct) Let C be an [n, k, d]-code. Then C
can correct ≤ t errors if d ≥ 2t + 1.
Proof: Suppose that d ≥ 2t + 1. Assume that the codeword c is sent and the received word
r has ≤ t errors, i.e., d_H(c, r) ≤ t. We will show that if c1 ∈ C is any other codeword then
d_H(c1, r) ≥ t + 1. Assume, for a contradiction, that d_H(c1, r) ≤ t. It follows from the definition
of d (the minimal Hamming distance between distinct codewords in C) and the triangle
inequality for Hamming distance that
2t + 1 ≤ d ≤ d_H(c, c1) ≤ d_H(c, r) + d_H(r, c1) ≤ t + t = 2t.
This is a contradiction, so we must have d_H(c1, r) ≥ t + 1. It follows that c is the unique
codeword x ∈ C which satisfies d_H(x, r) ≤ t, and we can correct the errors by replacing r
with the codeword closest to r in Hamming distance. □
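These two bounds are easy to check computationally. The following minimal Python sketch (an
added illustration; the tiny example code in it is not one of the codes studied below) computes
the minimum distance d of a code given as a list of codewords and prints how many errors it
can detect and correct according to Propositions 1 and 2:

from itertools import combinations

def min_distance(code):
    # Minimum Hamming distance over all pairs of distinct codewords.
    return min(sum(a != b for a, b in zip(c1, c2)) for c1, c2 in combinations(code, 2))

# A small example code (the even-weight [3,2]-code), just to exercise the bounds.
code = ["000", "011", "101", "110"]
d = min_distance(code)
print("d =", d)                                   # 2
print("detects up to", d - 1, "errors")           # 1  (Proposition 1: d >= s + 1)
print("corrects up to", (d - 1) // 2, "errors")   # 0  (Proposition 2: d >= 2t + 1)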

SYNDROMES

Let G = (I_k | P) be a generator matrix for an [n, k] binary linear code (which we denote by C),
where I_k is the k × k identity matrix and P is a k × (n − k) matrix with entries in F_2. Let
H = (−P^T | I_{n−k})
be the check matrix for C, where P^T denotes the transpose of the matrix P. (Over F_2 we have
−P^T = P^T, so the minus sign can be dropped.)
Definition (Syndrome) Let u ∈ F_2^n. Then the syndrome of u (denoted S(u)) is defined to be
S(u) := u · H^T.
Remark: The most important property of syndromes is that S(u) = 0 if and only if u ∈ C.
Examples of Syndromes: Let

G = ( 1 0 0 1 0 )        H = ( 1 1 0 1 0 )
    ( 0 1 0 1 1 )            ( 0 1 0 0 1 )
    ( 0 0 1 0 0 )

be the generator matrix and check matrix of a [5,3]-code.
Consider the following vectors in F_2^5:
u1 = 01010, u2 = 10010, u3 = 11100.
To compute the syndromes of u1, u2, u3 we first write down the transpose of the matrix H:

H^T = ( 1 0 )
      ( 1 1 )
      ( 0 0 )
      ( 1 0 )
      ( 0 1 )

Then

S(u1) = u1 · H^T = (0, 1),  S(u2) = u2 · H^T = (0, 0),  S(u3) = u3 · H^T = (0, 1).

We see that only u2 is a codeword.
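The same computation can be carried out mechanically. Here is a minimal Python sketch of it
(the function name and bit-list representation are just for illustration): vectors and matrix
rows are plain lists of bits, and the syndrome is the dot product of u with each row of H,
taken mod 2.

def syndrome(u, H):
    # S(u) = u * H^T over F_2: the j-th entry is the dot product of u with row j of H.
    return tuple(sum(a * b for a, b in zip(u, row)) % 2 for row in H)

# Generator and check matrix of the [5,3]-code from the example above.
G = [[1, 0, 0, 1, 0],
     [0, 1, 0, 1, 1],
     [0, 0, 1, 0, 0]]
H = [[1, 1, 0, 1, 0],
     [0, 1, 0, 0, 1]]

for u in ([0, 1, 0, 1, 0], [1, 0, 0, 1, 0], [1, 1, 1, 0, 0]):
    print(u, syndrome(u, H))   # a vector is a codeword exactly when its syndrome is (0, 0)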
COSETS OF THE VECTOR SPACE F_2^n
Definition (Coset of C in F_2^n) Let C be an [n, k]-code and let u ∈ F_2^n. Then the set of vectors
u + C := {u + c1, u + c2, . . . , u + cN} is called a coset of C in F_2^n.
Remark: Since C is an [n, k]-code it is clear that N = 2^k.
Remark: The trivial coset is C itself, obtained when u = 000 . . . 0 (more generally, u + C = C
whenever u ∈ C).
Examples of cosets: Let C = {00000, 10010, 01011, 00100, 11001, 01111, 10110, 11101} be
the [5,3]-code with generator matrix

G = ( 1 0 0 1 0 )
    ( 0 1 0 1 1 )
    ( 0 0 1 0 0 )

(the same code as in the syndrome examples above).

• When u = 00000, then u + C = C is the trivial coset.
• When u = 10000, then u + C = {10000, 00010, 11011, 10100, 01001, 11111, 00110, 01101}.
• When u = 01000, then u + C = {01000, 11010, 00011, 01100, 10001, 00111, 11110, 10101}.
Table of all the distinct cosets of the above code C:
Each coset must have the same number of elements as the code C; in the above, each coset
has 8 vectors. Since F_2^5 has 32 vectors it follows that there must be exactly 4 cosets. We have
already found 3 of the cosets. To determine the remaining one we note that 00001 is
not in any of these 3 cosets, so we can take u = 00001 to determine the fourth coset. We now
list all 4 cosets.

• 00000 + C = {00000, 10010, 01011, 00100, 11001, 01111, 10110, 11101}
• 10000 + C = {10000, 00010, 11011, 10100, 01001, 11111, 00110, 01101}
• 01000 + C = {01000, 11010, 00011, 01100, 10001, 00111, 11110, 10101}
• 00001 + C = {00001, 10011, 01010, 00101, 11000, 01110, 10111, 11100}
Definition (Coset Leader) A coset leader of a coset is an element of the coset with
minimal Hamming weight (i.e., with the fewest ones).
Remark: In the example above the vectors 00000, 10000, 01000, 00001 are all coset leaders.
Remark: Coset leaders do not have to be unique. Note that in the coset 10000 + C we could
also choose 00010 to be a coset leader.

How to construct all the cosets (a short code sketch follows this list):

Step 1: The trivial coset is just the code C itself.
Step 2: Choose a vector u1 ∈ F_2^n which is not in C and has minimal Hamming weight.
Construct the coset u1 + C. Note: the vector u1 may not be unique.
Step 3: Choose a vector u2 ∈ F_2^n which is not in C or u1 + C and has minimal Hamming
weight. Construct the coset u2 + C.
Step 4: Assuming the cosets C, u1 + C, . . . , uk + C have been found, choose a vector
u_{k+1} ∈ F_2^n which is not in any of C, u1 + C, . . . , uk + C and has minimal Hamming
weight. Construct the coset u_{k+1} + C. Keep repeating this procedure until all the
cosets are found.
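The following Python sketch (an illustration; the function and variable names are my own)
implements this greedy procedure, assuming the code is given as an explicit list of codewords:

from itertools import product

def all_cosets(C, n):
    # Greedy coset construction: repeatedly pick an unused vector of minimal
    # Hamming weight and form its coset, until all of F_2^n is exhausted.
    C = [tuple(c) for c in C]
    vectors = sorted(product([0, 1], repeat=n), key=sum)   # ordered by Hamming weight
    seen, cosets = set(), []
    for u in vectors:
        if u in seen:
            continue
        coset = {tuple((a + b) % 2 for a, b in zip(u, c)) for c in C}
        cosets.append((u, coset))   # u is a coset leader, since it has minimal weight
        seen |= coset
    return cosets

# For the [5,3]-code above this returns 4 cosets; the leaders it picks may differ from
# the ones chosen in the notes, since coset leaders need not be unique.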

Theorem: Let C be an [n, k]-code. Let u ≠ u′ be vectors in F_2^n. Then either u + C = u′ + C
or the two cosets u + C, u′ + C have no elements in common.
Proof: Assume some vector u + c in the coset u + C equals some vector u′ + c′ in the coset
u′ + C. This implies that u + c + C = u′ + c′ + C. But u + c + C = u + C and u′ + c′ + C = u′ + C
(because c + C = C and c′ + C = C for codewords c, c′ ∈ C).
So the two cosets are the same. □
Remark: The above theorem guarantees that the method for constructing all the cosets
described above has to work.
An immediate corollary of the above theorem is the following.
Corollary: Two vectors u, v ∈ F_2^n belong to the same coset if and only if they have the same
syndrome, i.e., S(u) = S(v).
Proof: It follows from the above theorem that two vectors u, v ∈ F_2^n belong to the same
coset if and only if u − v ∈ C, which happens if and only if S(u − v) = 0. Since

S(u − v) = (u − v) · H^T = u · H^T − v · H^T = S(u) − S(v),

this is equivalent to S(u) = S(v). □

Definition (Syndrome of a Coset) The syndrome of a coset u + C is defined to be S(u).


Remark: This definition is well defined since every vector in the coset has the same syndrome.
In particular, even if we choose a different coset leader, the syndrome of the coset is the same.

SYNDROME OR NEAREST NEIGHBOR DECODING


Let C be an [n, k] binary linear code. Assume that a codeword c ∈ C is transmitted across a
noisy channel and received as r ∈ F_2^n.
Syndrome Decoding Protocol (a short code sketch follows the steps):
• Precomputation: Make a table of coset leaders and their syndromes.
• Step (1): Compute the syndrome S(r) of the received vector r.
• Step (2): Find the coset leader u with the same syndrome as r, i.e., S(u) = S(r).
• Step (3): Decode r as r − u.
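Here is a minimal Python sketch of the protocol (the function names are only illustrative); it
takes the check matrix H and a list of coset leaders as given:

def syndrome(u, H):
    # S(u) = u * H^T over F_2.
    return tuple(sum(a * b for a, b in zip(u, row)) % 2 for row in H)

def syndrome_table(coset_leaders, H):
    # Precomputation: map each syndrome to its coset leader.
    return {syndrome(u, H): u for u in coset_leaders}

def decode(r, H, table):
    # Steps (1)-(3): look up the coset leader with syndrome S(r) and subtract it.
    u = table[syndrome(r, H)]
    return tuple((a + b) % 2 for a, b in zip(r, u))   # over F_2, r - u = r + u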

EXAMPLE (Syndrome Decoding in the Hamming [7,4]-code)


The Hamming [7,4]-code (let us denote it by C) has 16 codewords:

C = { 0000000, 1000110, 0100101, 0010011,
      0001111, 1100011, 1010101, 1001001,
      0110110, 0101010, 0011100, 1110000,
      1011010, 1101100, 0111001, 1111111 }.

We see that every non-zero codeword has at least 3 ones in it, and since the minimum distance
of a linear code equals the minimum Hamming weight of a non-zero codeword, this tells us that
d = 3. It then follows from Propositions 1 and 2 that C can detect up to 2 errors and correct
1 error.
Step (1) The first step in syndrome decoding is to make a table of coset leaders and their
syndromes. Note that there will be exactly 2^7/2^4 = 8 cosets.
Coset Leader Syndrome
0000000 000
0000001 001
0000010 010
0000100 100
0001000 111
0010000 011
0100000 101
1000000 110

Assume the sender transmits the codeword c = 1100011 and it is received as r = 0100011.
Step (2) We compute S(r) = r · H^T, where

H^T = ( 1 1 0 )
      ( 1 0 1 )
      ( 0 1 1 )
      ( 1 1 1 )
      ( 1 0 0 )
      ( 0 1 0 )
      ( 0 0 1 )

so S(r) = (0, 1, 0, 0, 0, 1, 1) · H^T = (1, 1, 0). We see that the coset leader u = 1000000 has
the same syndrome 110.
Step (3) We decode r as r − u = 0100011 − 1000000 = 1100011.

Assume the sender transmits the codeword c = 0100101 and it is received as r = 0110101.
Step (2) We compute S(r) = r · H^T = (0, 1, 1, 0, 1, 0, 1) · H^T = (0, 1, 1), with H^T as above.
We see that the coset leader u = 0010000 has the same syndrome 011.
Step (3) We decode r as r − u = 0110101 − 0010000 = 0100101.
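Both decodings can be reproduced with a short, self-contained Python sketch (an illustration;
the check matrix below is H = (P^T | I3) read off from the generator of this Hamming code):

H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]

def S(u):
    # Syndrome u * H^T over F_2.
    return tuple(sum(a * b for a, b in zip(u, row)) % 2 for row in H)

# Coset leaders of weight <= 1, keyed by their syndrome (the table above).
leaders = [(0,) * 7] + [tuple(int(i == j) for i in range(7)) for j in range(7)]
table = {S(e): e for e in leaders}

def decode(r):
    u = table[S(r)]                        # coset leader with the same syndrome as r
    return tuple((a + b) % 2 for a, b in zip(r, u))

print(decode((0, 1, 0, 0, 0, 1, 1)))   # gives 1100011
print(decode((0, 1, 1, 0, 1, 0, 1)))   # gives 0100101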

DUAL CODES

Definition (Dot Product) The dot product of two vectors u = (u1, u2, . . . , un), v = (v1, v2, . . . , vn)
in F_2^n is defined to be u · v = u1 v1 + u2 v2 + · · · + un vn.

Definition (Dual Code) Let C be an [n, k] binary linear code. The dual code (denoted C⊥) is
defined as the set of all vectors u ∈ F_2^n satisfying u · c = 0 for all c ∈ C.

Proposition Let C be an [n, k] binary linear code with generator matrix G = (I_k | P) and check
matrix H = (−P^T | I_{n−k}). Then the dual code C⊥ has generator matrix H and check matrix
G.
Proof: See page 415 in Introduction to Cryptography with Coding Theory.
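As a quick numerical illustration (a sketch added here, not part of the proposition's proof), one
can verify for the [5,3]-code used earlier that every vector in the row span of H is orthogonal
to every codeword generated by G:

from itertools import product

G = [[1, 0, 0, 1, 0], [0, 1, 0, 1, 1], [0, 0, 1, 0, 0]]
H = [[1, 1, 0, 1, 0], [0, 1, 0, 0, 1]]

def span(M):
    # All F_2-linear combinations of the rows of M.
    rows = [tuple(r) for r in M]
    return {tuple(sum(c * x for c, x in zip(coeffs, col)) % 2 for col in zip(*rows))
            for coeffs in product([0, 1], repeat=len(rows))}

C, C_dual = span(G), span(H)
# Every vector generated by H is orthogonal to every codeword of C, so span(H) ⊆ C⊥;
# comparing sizes (both have 2^(n-k) = 4 elements here) gives equality.
assert all(sum(a * b for a, b in zip(u, c)) % 2 == 0 for u in C_dual for c in C)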
