ECE4007
INFORMATION THEORY AND
CODING
Dr. Sangeetha R.G
Associate Professor Senior
SENSE
Syllabus
Module 6: Channel Coding I (4 hours, CO: 1)
Introduction to error control codes - Block codes, linear block codes, cyclic codes and their
properties, Encoder and decoder design - serial and parallel concatenated block codes, Convolutional
codes - Properties, Encoder - tree diagram, trellis diagram, state diagram, transfer function of
convolutional codes, Viterbi decoding, Trellis coding, Reed-Solomon codes.
Channel Coding
Why channel coding?
The challenge in a digital communication system is to
provide a cost-effective facility for transmitting information
at a rate, and with a level of reliability and quality, that are
acceptable to the user at the receiver.
The two key system parameters are:
1) Transmitted signal power.
2) Channel bandwidth.
Together with the power spectral density of the receiver noise (another important parameter),
these determine the signal energy per bit to noise power spectral density ratio Eb/N0.
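For a quick numerical illustration (not from the original slides), the sketch below computes Eb/N0 from an assumed received signal power, bit rate, and noise power spectral density; all numerical values are hypothetical.

```python
import math

# Hypothetical link parameters (illustrative values only)
received_power_W = 1e-12      # received signal power in watts
bit_rate_bps = 1e6            # transmission rate in bits per second
noise_psd_W_per_Hz = 4e-21    # noise power spectral density N0 in W/Hz

# Signal energy per bit: Eb = received power / bit rate (joules per bit)
energy_per_bit = received_power_W / bit_rate_bps

# Eb/N0 as a plain ratio and in decibels
ebno = energy_per_bit / noise_psd_W_per_Hz
ebno_dB = 10 * math.log10(ebno)
print(f"Eb/N0 = {ebno:.1f} ({ebno_dB:.1f} dB)")
```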
Contd…
In practice, there is a limit on the value that we can assign
to Eb/N0.
Hence, it may be impossible to provide acceptable data quality
(i.e., low enough error performance) with these parameters alone.
For a fixed Eb/N0, the only practical option available for
changing the data quality is to use
ERROR-CONTROL CODING
The two main methods of error control are:
i. Forward Error Correction (FEC).
ii. Automatic Repeat request (ARQ).
CHANNEL CODING
[Figure: block diagram of the channel coding system]
Forward Error Correction (FEC)
The key idea of FEC is to transmit enough redundant data to
allow the receiver to recover from errors by itself; no sender
retransmission is required.
The major categories of FEC codes are
i. Block codes
ii. Cyclic codes
iii. BCH codes
iv. Reed-Solomon codes
v. Convolutional codes
Forward Error Correction (FEC)
FEC requires only a one-way link between the transmitter and receiver.
In the use of error-control coding there are trade-offs between:
i. Efficiency & reliability
ii. Encoding/decoding complexity & bandwidth
Channel Coding Theorem
The channel coding theorem states that if a discrete memoryless channel
has capacity C and the source generates information at a rate less than C, then there
exists a coding technique such that the output of the source may be transmitted
over the channel with an arbitrarily low probability of symbol error.
For the special case of the BSC, the theorem tells us that if the code rate is less
than the channel capacity, it is possible to find a code that achieves error-free
transmission over the channel.
What matters is not the signal-to-noise ratio but how the channel input
is encoded.
The theorem asserts the existence of good codes but does not tell us how to
find them.
By good codes we mean families of channel codes that are capable of
providing reliable (error-free) transmission of information over a noisy channel of
interest at bit rates up to a maximum value less than the capacity of the channel.
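As a concrete illustration of the BSC case (a sketch added here, not part of the slides), the code below evaluates the capacity C = 1 - H(p) of a binary symmetric channel and checks whether an assumed code rate R satisfies the theorem's condition R < C; the crossover probability and rate are assumed values.

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) in bits, with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

p = 0.05      # assumed crossover probability
R = 4 / 7     # rate of the (7, 4) code used later in these notes

C = bsc_capacity(p)
print(f"C = {C:.3f} bits per channel use, R = {R:.3f}")
print("Reliable transmission possible (R < C):", R < C)
```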
Linear Block Codes
The encoder generates a block of n coded bits from k
information bits, and we call this an (n, k) block code. The
coded bits are also called code word symbols.
Why linear?
A code is linear if the modulo-2 sum of any two code words is
also a code word.
Contd…
The n code word symbols can take 2^n possible values. From
these we select 2^k code words to form the code.
A block code is useful only when there is a one-to-one
mapping between a message m and its code word c, as shown
above.
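The linearity condition above (the modulo-2 sum of any two code words is again a code word) can be verified by brute force for a small code. Below is an added sketch using a toy (3, 2) even-parity code as an assumed example.

```python
from itertools import product

# Toy (3, 2) even-parity code: c = (m1, m2, m1 XOR m2)
codewords = {(m1, m2, m1 ^ m2) for m1, m2 in product((0, 1), repeat=2)}

# Closure check: the modulo-2 sum of every pair of code words is a code word
linear = all(
    tuple(a ^ b for a, b in zip(c1, c2)) in codewords
    for c1 in codewords
    for c2 in codewords
)
print("Code words:", sorted(codewords))
print("Closed under modulo-2 addition:", linear)   # expected: True
```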
Generator Matrix
All code words can be obtained as linear combinations of a set of basis vectors.
The basis vectors can be designated as {g1, g2, g3, …, gk}.
For a linear code, there exists a k-by-n generator matrix G such that
c_(1×n) = m_(1×k) · G_(k×n)
where c = (c1, c2, …, cn) and m = (m1, m2, …, mk).
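A minimal sketch of this encoding rule, assuming NumPy; the 2-by-4 generator matrix below is an arbitrary illustrative choice, not one of the codes from these slides.

```python
import numpy as np

def encode(m: np.ndarray, G: np.ndarray) -> np.ndarray:
    """Encode a length-k message m with a k-by-n generator matrix G over GF(2):
    c = m . G with all arithmetic taken modulo 2."""
    return (m @ G) % 2

# Arbitrary 2-by-4 generator matrix, for illustration only
G = np.array([[1, 0, 1, 1],
              [0, 1, 0, 1]])
m = np.array([1, 1])
print(encode(m, G))   # -> [1 1 1 0]
```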
Block Codes in Systematic Form
In this form, the code word consists of (n-k)
parity check bits followed by k bits of the
message.
The structure of the code word in systematic
form is shown below.
The rate (or efficiency) of this code is R = k/n.
Contd…
G = [P  I_k]  and  c = m · G = [mP  m]
where mP is the parity part ((n-k) bits) and m is the message part (k bits).
Example:
Let us consider a (7, 4) linear code where k = 4 and n = 7.
Let m = (1 1 1 0) and

        g0     1 1 0 1 0 0 0
G  =    g1  =  0 1 1 0 1 0 0
        g2     1 1 1 0 0 1 0
        g3     1 0 1 0 0 0 1
Contd…
Let m = (m1, m2, m3, m4) and c = (c1, c2, c3, c4, c5, c6, c7)

                                1 1 0 1 0 0 0
c = m · G = (m1, m2, m3, m4) .  0 1 1 0 1 0 0
                                1 1 1 0 0 1 0
                                1 0 1 0 0 0 1
By matrix multiplication we obtain :
𝑐1=𝑚1 + 𝑚3 + 𝑚4, 𝑐2=𝑚1 + 𝑚2 + 𝑚3, 𝑐3= 𝑚2 + 𝑚3 + 𝑚4,
𝑐4=𝑚1, 𝑐5=𝑚2, 𝑐6=𝑚3, 𝑐7=𝑚4
The code word corresponding to the message (1110) is (0101110) .
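The worked example can be reproduced numerically. A short sketch (assuming NumPy) that encodes m = (1 1 1 0) with the G above and confirms the code word (0 1 0 1 1 1 0):

```python
import numpy as np

# Generator matrix of the (7, 4) code from the example, G = [P | I_4]
G = np.array([[1, 1, 0, 1, 0, 0, 0],
              [0, 1, 1, 0, 1, 0, 0],
              [1, 1, 1, 0, 0, 1, 0],
              [1, 0, 1, 0, 0, 0, 1]])

m = np.array([1, 1, 1, 0])
c = (m @ G) % 2                            # modulo-2 matrix multiplication
print(c)                                   # -> [0 1 0 1 1 1 0]
assert list(c) == [0, 1, 0, 1, 1, 1, 0]    # matches the slide result (0101110)
```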
Parity Check Matrix (H)
When G is in systematic form, it is easy to determine the parity check matrix H as
H = [I_(n-k)  P^T]
The parity check matrix H of a generator matrix G is an (n-k)-by-n
matrix satisfying
H_((n-k)×n) · G^T_(n×k) = 0
Then the code words satisfy the (n-k) parity check equations
c_(1×n) · H^T_(n×(n-k)) = m_(1×k) · G_(k×n) · H^T_(n×(n-k)) = 0
Example:
Consider the generator matrix of the (7, 4) linear block code above, with
H = [I_(n-k)  P^T] and G = [P  I_k].
The corresponding parity check matrix is

        1 0 0 1 0 1 1
H  =    0 1 0 1 1 1 0
        0 0 1 0 1 1 1

Check:

               1 1 0 1 0 0 0     1 0 0
               0 1 1 0 1 0 0     0 1 0
G · H^T   =    1 1 1 0 0 1 0  .  0 0 1   =  0
               1 0 1 0 0 0 1     1 1 0
                                 0 1 1
                                 1 1 1
                                 1 0 1
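The relation G · H^T = 0 can also be confirmed numerically, and the same parity checks flag corrupted words: a valid code word c satisfies c · H^T = 0, while a word with a single flipped bit does not. A short sketch assuming NumPy:

```python
import numpy as np

G = np.array([[1, 1, 0, 1, 0, 0, 0],
              [0, 1, 1, 0, 1, 0, 0],
              [1, 1, 1, 0, 0, 1, 0],
              [1, 0, 1, 0, 0, 0, 1]])

H = np.array([[1, 0, 0, 1, 0, 1, 1],
              [0, 1, 0, 1, 1, 1, 0],
              [0, 0, 1, 0, 1, 1, 1]])

# Every row of G is orthogonal (mod 2) to every row of H
print((G @ H.T) % 2)           # expected: 4-by-3 all-zero matrix

# A valid code word passes all (n-k) parity checks; a corrupted word does not
c = (np.array([1, 1, 1, 0]) @ G) % 2
print((c @ H.T) % 2)           # -> [0 0 0]
r = c.copy()
r[2] ^= 1                      # flip one bit to simulate a channel error
print((r @ H.T) % 2)           # nonzero result exposes the error
```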