First Edition : 2009

Information Coding Techniques

J. S. Chitode

Technical Publications Pune

ISBN 978-81-89411-69-5

All rights reserved with Technical Publications. No part of this book should be
reproduced in any form, Electronic, Mechanical, Photocopy or any information storage and
retrieval system without prior permission in writing, from Technical Publications, Pune.

Published by :
Technical Publications Pune
#1, Amit Residency, 412, Shaniwar Peth, Pune - 411 030, India.
Printer :
Alert DTPrinters
Sr. No. 10/3, Sinhgad Road,
Pune - 411 041

Table of Contents
1.2 Uncertainty
1.3 Definition of Information (Measure of Information)
1.3.1 Properties of Information
1.3.2 Physical Interpretation of Amount of Information
1.4 Entropy (Average Information)
1.4.1 Properties of Entropy
1.5 Information Rate
1.6 Extension of Discrete Memoryless Source
1.7 Source Coding Theorem (Shannon's First Theorem)
1.7.1 Code Redundancy
1.7.2 Code Variance
1.8 Variable Length Source Coding Algorithms (Entropy Coding)
1.8.1 Prefix Coding (Instantaneous Coding)
1.8.1.1 Properties of Prefix Code
1.8.2 Shannon-Fano Algorithm
1.8.3 Huffman Coding
1.9 Discrete Memoryless Channels
1.9.1 Binary Communication Channel
1.9.2 Equivocation (Conditional Entropy)
1.9.4 Capacity of a Discrete Memoryless Channel
1.10 Mutual Information
1.10.1 Properties of Mutual Information
1.10.2 Channel Capacity
1.11 Differential Entropy and Mutual Information for Continuous Ensembles
1.11.1 Differential Entropy
1.11.2 Mutual Information
1.12 Shannon's Theorems on Channel Capacity
1.12.1 Channel Coding Theorem (Shannon's Second Theorem)
1.12.2 Shannon-Hartley Theorem for Gaussian Channel (Continuous Channel)
1.12.3 Tradeoff between Bandwidth and Signal to Noise Ratio
1.12.4 Rate/Bandwidth and Signal to Noise Ratio Trade-off
1.13 Short Answered Questions
2.1 Introduction
2.2 Pulse Code Modulation
2.2.1 PCM Generator
2.2.2 Transmission Bandwidth in PCM
2.2.3 PCM Receiver
2.3 Delta Modulation
2.3.1 Advantages of Delta Modulation
2.3.2 Disadvantages of Delta Modulation
2.3.2.1 Slope Overload Distortion (Startup Error)
2.3.2.2 Granular Noise (Hunting)
2.4 Adaptive Delta Modulation
2.4.1 Advantages of Adaptive Delta Modulation
2.5 Differential Pulse Code Modulation
2.6 Comparison of Digital Pulse Modulation Methods
2.7 Coding Speech at Low Bit Rates
2.7.1 ADPCM for Low Bit Rate Speech Coding
2.8 Short Answered Questions
3.1 Introduction
3.1.1 Rationale for Coding and Types of Codes
3.1.2 Types of Codes
3.1.3 Discrete Memoryless Channels
3.1.4 Examples of Error Control Coding
3.1.5 Methods of Controlling Errors
3.1.6 Types of Errors
3.1.7 Some of the Important Terms Used in Error Control Coding
3.2 Linear Block Codes
3.2.1 Hamming Codes
3.2.2 Error Detection and Correction Capabilities of Hamming Codes
3.2.3 Encoder of (7, 4) Hamming Code
3.2.4 Syndrome Decoding
3.2.4.1 Error Correction using Syndrome Vector
3.2.5 Hamming Bound
3.2.6 Syndrome Decoder for (n, k) Block Code
3.2.7 Other Linear Block Codes
3.2.7.1 Single Parity Check Bit Code
3.2.7.2 Repeated Codes
3.2.7.3 Hadamard Code
3.2.7.4 Extended Codes
3.2.7.5 Dual Code
3.3 Cyclic Codes
3.3.1 Definition of Cyclic Code
3.3.2 Properties of Cyclic Codes
3.3.2.1 Linearity Property
3.3.2.2 Cyclic Property
3.3.3 Representation of Codewords by a Polynomial
3.3.4 Generation of Codevectors in Nonsystematic Form
3.3.5 Generation of Codevectors in Systematic Form
3.3.6 Generator and Parity Check Matrices of Cyclic Codes
3.3.6.1 Nonsystematic Form of Generator Matrix
3.3.6.2 Systematic Form of Generator Matrix
3.3.6.3 Parity Check Matrix
3.3.7 Encoders for Cyclic Codes
3.3.8 Syndrome Decoding for Cyclic Codes
3.3.8.1 Block Diagram of Syndrome Calculator
3.3.9 Decoder for Cyclic Codes
3.3.10 Advantages and Disadvantages of Cyclic Codes
3.3.11 BCH Codes (Bose-Chaudhuri-Hocquenghem Codes)
3.3.12 Reed-Solomon (RS) Codes
3.3.13 Golay Codes
3.3.14 Shortened Cyclic Codes
3.3.15 Burst Error Correcting Codes
3.3.16 Interleaving of Coded Data for Burst Error Correction
3.3.17 Interlaced Codes for Burst and Random Error Correction
3.3.18 Cyclic Redundancy Check (CRC) Codes
3.3.19 Concatenated Block Codes
3.4 Convolutional Codes
3.4.1 Definition of Convolutional Coding
3.4.1.1 Code Rate of Convolutional Encoder
3.4.1.2 Constraint Length (K)
3.4.1.3 Dimension of the Code
3.4.2 Time Domain Approach to Analysis of Convolutional Encoder
3.4.3 Transform Domain Approach to Analysis of Convolutional Encoder
3.4.4 Code Tree, Trellis and State Diagram for a Convolutional Encoder
3.4.4.1 States of the Encoder
3.4.4.2 Development of the Code Tree
3.4.4.3 Code Trellis (Represents Steady State Transitions)
3.4.4.4 State Diagram
3.4.5 Decoding Methods of Convolutional Codes
3.4.5.1 Viterbi Algorithm for Decoding of Convolutional Codes (Maximum Likelihood Decoding)
3.4.5.2 Sequential Decoding for Convolutional Codes
3.4.5.3 Free Distance and Coding Gain
3.4.6 Probability of Errors for Soft and Hard Decision Decoding
3.4.6.1 Probability of Error with Soft Decision Decoding
3.4.6.2 Probability of Error with Hard Decision Decoding
3.4.7 Transfer Function of the Convolutional Code
3.4.8 Distance Properties of Binary Convolutional Codes
3.4.9 Advantages and Disadvantages of Convolutional Codes
3.4.10 Comparison between Linear Block Codes and Convolutional Codes
3.5 Short Answered Questions
4.1 Principles of Data Compression
4.1.1 Lossless and Lossy Compression
4.1.2 Entropy Coding
4.1.2.1 Run Length Encoding
4.1.2.2 Statistical Encoding
4.1.3 Source Encoding
4.1.3.1 Differential Encoding
4.1.3.2 Transform Encoding
4.2 Text Compression
4.2.1 Static Huffman Coding
4.2.2 Dynamic Huffman Coding
4.2.3 Arithmetic Coding
4.2.4 Lempel-Ziv (ZIP) Coding
4.3 Image Compression
4.3.1 Graphics Interchange Format (GIF)
4.3.2 Tagged Image File Format (TIFF)
4.3.3 Digitized Documents
4.3.4 JPEG Standards
4.3.4.1 Types of JPEG
4.3.4.2 JPEG Encoder
4.3.4.3 JPEG Decoder
4.4 Short Answered Questions
5.1 Introduction
5.2 Linear Predictive Coding Principle
5.2.1 LPC Encoder for Speech
5.2.2 LPC Decoder
5.3 Code Excited LPC
5.4 Perceptual Coding
5.5 MPEG Audio Coders
5.5.1 Encoder for MPEG Audio Coding
5.5.2 Decoder for MPEG Audio Decoding
5.5.3 MPEG Layers 1, 2 and 3
5.5.4 Dolby Audio Coders
5.5.4.1 Dolby AC-1
5.5.4.2 Dolby AC-2
5.6 Video Compression
5.6.1 Video Compression Principles
5.6.2 MPEG Algorithm
5.6.3 H.261
5.6.4 MPEG Video Standards
5.6.4.1 MPEG-1
5.6.4.2 MPEG-2
5.6.4.3 MPEG-4
5.6.4.4 MPEG-3 or MP-3 Standard for Sound
5.7 Musical Instrument Digital Interface (MIDI)
5.8 Short Answered Questions

Information Entropy Fundamentals
1.1 Introduction
The performance of a communication system is measured in terms of its error
probability. Errorless transmission is possible when the probability of error at the
receiver approaches zero. The performance of the system depends upon the available
signal power, channel noise and bandwidth. Based on these parameters it is possible
to establish the conditions for errorless transmission. These conditions are referred to as
Shannon's theorems. Information theory is related to the concepts of the statistical
properties of messages/sources, channels, noise interference etc. Information
theory is used for mathematical modeling and analysis of communication systems.
With information theory and its modeling of communication systems, the following
two main points are resolved :
i) The irreducible complexity below which a signal cannot be compressed.
ii) The transmission rate for reliable communication over a noisy channel.
In this chapter we will study the concepts of Information, Entropy, Channel
capacity, Information rate etc. and some source coding techniques.
1.2 Uncertainty
Consider a source which emits discrete symbols randomly from a fixed alphabet, i.e.,
X = {x0, x1, x2, ..., x_{K-1}}
The various symbols in 'X' have probabilities p0, p1, p2, ..., p_{K-1}. This can
be written as,
P(X = x_k) = p_k,   k = 0, 1, 2, ..., K-1   ... (1.2.1)
This set of probabilities satisfies the following condition,
Σ_{k=0}^{K-1} p_k = 1   ... (1.2.2)
We have discussed such an information source earlier. It is called a discrete information
source. The concept of 'information' produced by the source is discussed in the next
section. This idea of information is related to 'uncertainty' or 'surprise'. Consider the
emission of symbol X = x_k from the source. If the probability of x_k is p_k = 0, then such
a symbol is impossible. Similarly, when the probability is p_k = 1, then such a symbol is sure. In
both cases there is no 'surprise' and hence no information is produced when the
symbol x_k is emitted. When the probability p_k is low, there is more surprise or
uncertainty. Before the event X = x_k occurs, there is an amount of uncertainty.
When the symbol X = x_k occurs, there is an amount of surprise. After the occurrence of
the symbol X = x_k, there is a gain in the amount of information.
Review Question
1. What is uncertainty ? Explain the difference between uncertainty and information.
1.3 Definition of Information (Measure of Information)
Let us consider a communication system which transmits messages
m1, m2, m3, ... with probabilities of occurrence p1, p2, p3, .... The amount of
information transmitted through the message m_k with probability p_k is given as,
Amount of Information : I_k = log2(1/p_k)   ... (1.3.1)
Unit of information :
In the above equation, log2(1/p_k) = log10(1/p_k) / log10(2). Normally the unit of information
is the 'bit'. The term bit is also commonly used as an abbreviation for binary digit; hence in this
chapter we will use the abbreviation 'binit'* for a binary digit, and we will measure the
amount of information I_k in bits. The definition of information will become clearer
through the following examples.
* Unit of information : bit. A binary digit is represented by 'binit'.
1.3.1 Properties of Information
The following properties can be written for information.
i) If there is more uncertainty about the message, the information carried is also
more.
ii) If the receiver knows the message being transmitted, the amount of information
carried is zero.
iii) If I1 is the information carried by message m1 and I2 is the information
carried by m2, then the amount of information carried compositely due to m1 and
m2 is I1 + I2.
iv) If there are M = 2^N equally likely messages, then the amount of information
carried by each message will be N bits.
These properties are proved in the next examples.
Example 1.3.1 : Calculate the amount of information if p_k = 1/4.
Solution : From equation 1.3.1 we know that the amount of information is given as,
I_k = log2(1/p_k)
Putting the value of p_k = 1/4,
I_k = log2(4) = log10(4)/log10(2)
= 2 bits   ... (1.3.2)
Example 1.3.2 : Calculate the amount of information if binary digits (binits) occur
with equal likelihood in binary PCM.
Solution : We know that in binary PCM there are only two binary levels, i.e. 1 or 0.
Since they occur with equal likelihood, their probabilities of occurrence will be,
p1 ('0' level) = p2 ('1' level) = 1/2
Hence the amount of information carried will be given by equation 1.3.1 as,
I1 = log2(1/p1)   and   I2 = log2(1/p2)
I1 = log2 2   and   I2 = log2 2
I1 = I2 = log10(2)/log10(2) = 1 bit   ... (1.3.3)
Thus the correct identification of a binary digit (binit) in binary PCM carries 1 bit of
information.
Example 1.3.3 : In binary PCM, if '0' occurs with probability 1/4 and '1' occurs with
probability 3/4, then calculate the amount of information conveyed by each binit.
Solution : Here binit '0' has p1 = 1/4
and binit '1' has p2 = 3/4.
Then the amount of information is given by equation 1.3.1 as,
I_k = log2(1/p_k)
With p1 = 1/4,   I1 = log2 4 = log10(4)/log10(2) = 2 bits   ... (1.3.4)
and with p2 = 3/4,   I2 = log2(4/3) = log10(4/3)/log10(2) = 0.415 bits
Here observe that binit '0' has probability 1/4 and it carries 2 bits of information,
whereas binit '1' has probability 3/4 and it carries 0.415 bits of information. This shows
that if the probability of occurrence is less, the information carried is more, and vice versa.
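The relation between probability and information illustrated in these examples can also be checked numerically. The following short Python sketch (an illustration added here, not part of the book's text; the function name is arbitrary) evaluates I_k = log2(1/p_k) for the probabilities used above.

import math

def amount_of_information(p):
    # Information in bits carried by a message of probability p (0 < p <= 1).
    return math.log2(1.0 / p)

for p in (1/4, 3/4, 1/2, 1.0):
    print("p =", p, " I =", round(amount_of_information(p), 3), "bits")
# p = 1/4 gives 2 bits, p = 3/4 gives about 0.415 bits,
# p = 1/2 gives 1 bit and p = 1 (a certain message) gives 0 bits.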
Example 1.3.4 : If there are M equally likely and independent messages, then prove
that the amount of information carried by each message will be,
I = N bits,
where M = 2^N and N is an integer.
Solution : Since all the M messages are equally likely and independent, the probability of
occurrence of each message will be 1/M. From equation 1.3.1 we know that the amount of
information is given as,
I_k = log2(1/p_k)
Here the probability of each message is p_k = 1/M. Hence the above equation becomes,
I_k = log2 M
We know that M = 2^N, hence the above equation becomes,
I_k = log2 2^N = N (log10 2 / log10 2) = N log2 2
= N bits   ... (1.3.5)
Thus the amount of information carried by each message will be 'N' bits. We know
that M = 2^N; that is, there are 'N' binary digits (binits) in each message. This shows
that when the messages are equally likely and coded with an equal number of binary
digits (binits), then the information carried by each message (measured in bits) is
numerically the same as the number of binits used for each message.
Example 1.3.5 : Prove the following statement :
"If the receiver knows the message being transmitted, the amount of information carried is
zero."
Solution : Here it is stated that the receiver "knows" the message. This means only one
message is transmitted. Hence the probability of occurrence of this message will be p_k = 1.
This is because there is only one message and its occurrence is certain (the probability of a certain
event is 1). The amount of information carried by this type of message is,
I_k = log2(1/p_k)
= log10(1)/log10(2)   putting p_k = 1
= 0 bits   ... (1.3.6)
This proves the statement that if the receiver knows the message, the amount of
information carried is zero.
As p_k is decreased from 1 to 0, I_k increases monotonically from 0 to infinity. This
shows that the amount of information conveyed is greater when the receiver correctly
identifies less likely messages.
Example 1.3.6 : If I1 is the information carried by message m1 and I2 is the
information carried by message m2, then prove that the amount of information carried
compositely due to m1 and m2 is I_{1,2} = I1 + I2.
Solution : The definition of the amount of information is,
I_k = log2(1/p_k)
The individual amounts carried by messages m1 and m2 are,
I1 = log2(1/p1)   and   I2 = log2(1/p2)   ... (1.3.7)
Here p1 is the probability of message m1 and p2 is the probability of message m2. Since
messages m1 and m2 are independent, the probability of the composite message is p1 p2.
Therefore the information carried compositely due to m1 and m2 is,
I_{1,2} = log2(1/(p1 p2))   by definition
= log2(1/p1) + log2(1/p2)   since log(AB) = log A + log B
From equation 1.3.7 we can write the RHS of the above equation as,
I_{1,2} = I1 + I2   ... (1.3.8)
1.3.2 Physical Interpretation of Amount of Information
We know that Pune University declares a large number of results. Suppose that you
receive the following messages :
1. Result is declared today.
2. Result is declared today.
3. Result is declared today.
4. Electronics and E & TC results are declared today.
When you receive the first three messages, you will say "Oh ! what is new ? It is the job
of the University to declare results" and you will forget them. Thus these three messages are
very common. Since a large number of examinations are conducted, results are
declared by the university every day. Therefore such messages give very little information to you.
But when you receive the last, i.e. 4th message, you will forget everything and go to
collect the result. Thus the amount of information received from the 4th message is very large.
And the 4th message can occur only two times in a year, since results are declared
semesterwise for every branch. This shows that the probability of occurrence of the 4th
message is very small but the amount of information received is great.
Review Questions
1. Explain the concept of amount of information. Also explain what is infinite information and
zero information.
2. With the help of an example give physical interpretation of amount of information.
Unsolved Example
1. A source emits four symbols with probabilities p0 = 0.4, p1 = 0.3, p2 = 0.2 and p3 = 0.1. Find
out the amount of information obtained due to these four symbols. [Ans. : 8.703 bits]
1.4 Entropy (Average Information)
Consider that we have M different messages. Let these messages be
m1, m2, m3, ..., mM and let their probabilities of occurrence be p1, p2, p3, ..., pM.
Suppose that a sequence of L messages is transmitted. Then, if L is very large,
we may say that,
p1 L messages of m1 are transmitted,
p2 L messages of m2 are transmitted,
p3 L messages of m3 are transmitted,
...
pM L messages of mM are transmitted.
Hence the information due to message m1 will be,
I1 = log2(1/p1)
Since there are p1 L messages of m1, the total information due to all
messages of m1 will be,
I1(total) = p1 L log2(1/p1)
Similarly the total information due to all messages of m2 will be,
I2(total) = p2 L log2(1/p2)   and so on.
Thus the total information carried due to the sequence of L messages will be,
I(total) = I1(total) + I2(total) + ... + IM(total)   ... (1.4.1)
I(total) = p1 L log2(1/p1) + p2 L log2(1/p2) + ... + pM L log2(1/pM)   ... (1.4.2)
The average information per message will be,
Average information = Total information / Number of messages = I(total)/L   ... (1.4.3)
Average information is called Entropy. It is represented by H. Thus,
Entropy (H) = I(total)/L   ... (1.4.4)
From equation 1.4.2 we can write the above equation as,
Entropy (H) = p1 log2(1/p1) + p2 log2(1/p2) + ... + pM log2(1/pM)   ... (1.4.5)
We can write the above equation using the Σ sign as follows :
Entropy : H = Σ_{k=1}^{M} p_k log2(1/p_k)   ... (1.4.6)
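Equation 1.4.6 translates directly into a small program. The Python sketch below is an illustrative helper (not from the book); terms with p_k = 0 are skipped, which corresponds to the limiting argument used in example 1.4.1.

import math

def entropy(probabilities):
    # H = sum over k of p_k * log2(1/p_k), in bits per message.
    return sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))                 # two equally likely messages -> 1.0 bit/message
print(entropy([1.0]))                      # a certain message -> 0.0
print(entropy([0.25, 0.25, 0.25, 0.25]))   # four equally likely messages -> 2.0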
1.4.1 Properties of Entropy
1. Entropy is zero if the event is sure or if it is impossible, i.e.,
H = 0 if p_k = 0 or 1.
2. When p_k = 1/M for all the 'M' symbols, the symbols are equally likely.
For such a source the entropy is given as H = log2 M.
3. The upper bound on entropy is given as,
H_max = log2 M
Proofs of these properties are given in the next examples.
Example 1.4.1 : Calculate the entropy when p_k = 0 and when p_k = 1.
Solution : Consider equation 1.4.6,
H = Σ_{k=1}^{M} p_k log2(1/p_k)
Since p_k = 1 (there is a single, certain message), the above equation will be,
H = log2(1) = log10(1)/log10(2)
= 0   since log10 1 = 0
Now consider the second case, when p_k = 0. Instead of putting p_k = 0 directly, let us
consider the limiting case, i.e.,
H = Σ_{k=1}^{M} p_k log2(1/p_k)   by equation 1.4.6
With p_k tending to '0' the above equation will be,
H = lim_{p_k → 0} Σ_{k=1}^{M} p_k log2(1/p_k)
The RHS of the above equation will be zero when p_k → 0. Hence the entropy will be zero,
i.e.,
H = 0
Thus the entropy is zero for both a certain and a most rare message.
Example 1.4.2 : A source transmits two independent messages with probabilities of p
and (1 - p) respectively. Prove that the entropy is maximum when both the messages are
equally likely. Plot the variation of entropy (H) as a function of the probability 'p' of the
messages.
Solution : We know that entropy is given as,
H = Σ_{k=1}^{M} p_k log2(1/p_k)
For two messages the above equation will be,
H = Σ_{k=1}^{2} p_k log2(1/p_k)
= p1 log2(1/p1) + p2 log2(1/p2)
Here we have two messages with probabilities p1 = p and p2 = 1 - p. Then the above
equation becomes,
H = p log2(1/p) + (1 - p) log2(1/(1 - p))   ... (1.4.7)
A plot of H as a function of p is shown in Fig. 1.4.1.
[Fig. 1.4.1 Plot of entropy 'H' with probability 'p' for two messages]
The maximum of H occurs at p = 1/2, i.e. when the messages are equally likely.
As shown in the figure, the entropy is maximum at p = 1/2. Putting p = 1/2 in
equation 1.4.7 we get,
H_max = (1/2) log2(2) + (1/2) log2(2)
= log2(2) = log10(2)/log10(2) = 1 bit/message.
This shows that H_max occurs when both the messages have the same probability, i.e.
when they are equally likely.
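The curve of Fig. 1.4.1 can be reproduced numerically. The sketch below (illustrative only) tabulates H(p) = p log2(1/p) + (1-p) log2(1/(1-p)) and shows that the largest value, 1 bit/message, occurs at p = 0.5.

import math

def binary_entropy(p):
    # Entropy of a two-message source with probabilities p and 1 - p (equation 1.4.7).
    if p in (0.0, 1.0):
        return 0.0
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

for i in range(11):
    p = i / 10
    print("p =", p, "  H =", round(binary_entropy(p), 4), "bits")
# The printed values rise from 0 at p = 0 to 1.0 at p = 0.5 and fall back to 0 at p = 1.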
Example 1.4.3 : Show that if there are 'M' equally likely messages, then the
entropy of the source is log2 M.
Solution : We know that for 'M' equally likely messages the probability of each message
is 1/M. This probability is the same for all 'M' messages, i.e.,
p1 = p2 = p3 = p4 = ... = pM = 1/M   ... (1.4.8)
Entropy is given by equation 1.4.6,
H = Σ_{k=1}^{M} p_k log2(1/p_k)
= p1 log2(1/p1) + p2 log2(1/p2) + ... + pM log2(1/pM)
Putting the probabilities from equation 1.4.8 in the above equation we get,
H = (1/M) log2(M) + (1/M) log2(M) + ... + (1/M) log2(M)
(Add 'M' number of terms)
In the above equation there are 'M' terms in the summation. Hence after
adding these terms the above equation becomes,
H = log2(M)   ... (1.4.9)
Example 1.4.4 : Prove that the upper bound on entropy is given as H_max ≤ log2 M.
Here 'M' is the number of messages emitted by the source.
Solution : To prove the above property we will use the following property of the
natural logarithm :
ln x ≤ x - 1   for x ≥ 0   ... (1.4.10)
Let us consider any two probability distributions {p1, p2, ..., pM} and
{q1, q2, ..., qM} on the alphabet X = {x1, x2, ..., xM} of the discrete memoryless source.
Then let us consider the term Σ_{k=1}^{M} p_k log2(q_k / p_k). This term can be written as,
Σ_{k=1}^{M} p_k log2(q_k / p_k) = Σ_{k=1}^{M} p_k [log10(q_k / p_k) / log10 2]
Multiplying and dividing the RHS by log10 e and rearranging the terms,
= log2 e Σ_{k=1}^{M} p_k log_e(q_k / p_k)
Here log_e(q_k / p_k) = ln(q_k / p_k). Hence the above equation becomes,
Σ_{k=1}^{M} p_k log2(q_k / p_k) = log2 e Σ_{k=1}^{M} p_k ln(q_k / p_k)
From equation 1.4.10 we can write ln(q_k / p_k) ≤ (q_k / p_k - 1). Hence the above equation becomes,
Σ_{k=1}^{M} p_k log2(q_k / p_k) ≤ log2 e Σ_{k=1}^{M} p_k (q_k / p_k - 1)
≤ log2 e Σ_{k=1}^{M} (q_k - p_k)
Here note that Σ_{k=1}^{M} q_k = 1 as well as Σ_{k=1}^{M} p_k = 1. Hence the above equation
becomes,
Σ_{k=1}^{M} p_k log2(q_k / p_k) ≤ 0   ... (1.4.11)
Now let us consider that q_k = 1/M for all k. That is, all symbols in the alphabet are
equally likely. Then the above equation becomes,
Σ_{k=1}^{M} p_k [log2 q_k + log2(1/p_k)] ≤ 0
Σ_{k=1}^{M} p_k log2 q_k + Σ_{k=1}^{M} p_k log2(1/p_k) ≤ 0
Σ_{k=1}^{M} p_k log2(1/p_k) ≤ - Σ_{k=1}^{M} p_k log2 q_k = Σ_{k=1}^{M} p_k log2(1/q_k)
Putting q_k = 1/M in the above equation,
Σ_{k=1}^{M} p_k log2(1/p_k) ≤ Σ_{k=1}^{M} p_k log2 M ≤ log2 M Σ_{k=1}^{M} p_k
Since Σ_{k=1}^{M} p_k = 1, the above equation becomes,
Σ_{k=1}^{M} p_k log2(1/p_k) ≤ log2 M   ... (1.4.12)
The LHS of the above equation is the entropy H(X) with an arbitrary probability distribution,
i.e.,
H(X) ≤ log2 M
This is the proof of the upper bound on entropy, and the maximum value of entropy is,
H_max(X) = log2 M
Example 1.4.5 : If 'X' is a random variable assuming values x1, x2, ..., xK, what
should be the probability density function of X to get maximum entropy H(X) ? Determine
the value of H(X).
Solution : In the previous example we have seen that the upper bound on entropy is given
as,
H(X) ≤ log2 M
Hence the maximum value of entropy for 'K' messages will be,
H_max(X) = log2 K
For the entropy to be maximum, all the symbols must be equally likely. Hence the
probability of each symbol will be,
P(x1) = P(x2) = ... = P(xK) = 1/K
The above result shows that 'X' must have a uniform probability density function. This
is because all values of 'X' have the same probability of occurrence,
f_X(x) = 1/K   for x = x1, x2, ..., xK
Example 1.4.6 : For a binary memoryless source with two symbols x1 and x2, show that the
entropy H(X) is maximum when both x1 and x2 are equiprobable. What are the lower
and upper bounds on H(X) ?
Solution : (I) To prove that H(X) is maximum when x1 and x2 are equiprobable :
Let the probability of x1 be P(x1) = p. Then,
P(x2) = 1 - p
The following steps are required to solve the problem :
(i) Determine the entropy H(X).
(ii) Differentiate H(X) with respect to p, i.e. obtain (d/dp) H(X).
(iii) Solve (d/dp) H(X) = 0 for p.
(i) To obtain the entropy of the source, i.e. H(X) :
Entropy is given as,
H(X) = Σ_k P(x_k) log2(1/P(x_k))
= P(x1) log2(1/P(x1)) + P(x2) log2(1/P(x2))
= p log2(1/p) + (1 - p) log2(1/(1 - p))
We know that log2(1/x) = - log2 x. Hence the above equation becomes,
H(X) = - p log2 p - (1 - p) log2(1 - p)
(ii) To determine (d/dp) H(X) :
(d/dp) H(X) = (d/dp) [- p log2 p - (1 - p) log2(1 - p)]
= - [log2 p + p (d/dp) log2 p] - [(-1) log2(1 - p) + (1 - p) (d/dp) log2(1 - p)]
We know that log2 x = log_e x / log_e 2 and (d/dx) log_e x = 1/x. Hence the above equation becomes,
(d/dp) H(X) = - [log2 p + p · 1/(p log_e 2)] - [(-1) log2(1 - p) + (1 - p) · (-1)/((1 - p) log_e 2)]
= - log2 p - 1/log_e 2 + log2(1 - p) + 1/log_e 2
= - log2 p + log2(1 - p)
(iii) To determine the value of p for (d/dp) H(X) = 0 :
When the derivative of H(X) is zero, H(X) has its maximum value. Hence by putting
(d/dp) H(X) = 0 we obtain the value of p at which H(X) is maximum, i.e.,
0 = - log2 p + log2(1 - p)
log2 p = log2(1 - p)
p = 1 - p
or p = 1/2
Thus P(x1) = p = 1/2
and P(x2) = 1 - p = 1/2.
The above result shows that the entropy is maximum when x1 and x2 are equiprobable.
(II) To obtain the upper and lower bounds on H(X) :
We know that H(X) is non-negative. Hence the lower bound on H(X) is '0'. In
example 1.4.4 it is proved that the upper bound on H(X) is log2 M, i.e.,
0 ≤ H(X) ≤ log2 M
The above equation gives the lower and upper bounds on H(X). Here 'M' is the number of
messages emitted by the source.
Example 1.4.7 : For a discrete memoryless source there are three symbols with
probabilities p1 = α and p2 = p3. Determine the entropy of the source and sketch its
variation for different values of α.
Solution : The three probabilities are,
p1 = α and p2 = p3
We know that, p1 + p2 + p3 = 1
α + p2 + p2 = 1   since p2 = p3
p2 = (1 - α)/2
Hence p2 = p3 = (1 - α)/2
Entropy is given as,
H = Σ_{k=1}^{3} p_k log2(1/p_k)
= α log2(1/α) + ((1 - α)/2) log2(2/(1 - α)) + ((1 - α)/2) log2(2/(1 - α))
= α log2(1/α) + (1 - α) log2(2/(1 - α))
The sketch of H with respect to α is given in Fig. 1.4.2.
[Fig. 1.4.2 Plot of H versus α]
Review Question
1. What is average information or Entropy ?
Unsolved Example
1. A source generates four messages m0, m1, m2 and m3 with given probabilities. The
successive messages emitted by the source are statistically independent.
Calculate the entropy of the source. [Ans. : 1.959 bits/message]
1.5 Information Rate
The information rate is represented by R and it is given as,
Information Rate : R = r H   ... (1.5.1)
Here R is the information rate,
H is the entropy or average information,
and r is the rate at which messages are generated.
The information rate R is expressed as an average number of bits of information per
second. It is calculated as follows :
R = r (messages/second) × H (information bits/message)
= information bits / second
Example 1.5.1 : An analog signal is bandlimited to B Hz and sampled at the Nyquist
rate. The samples are quantized into 4 levels. Each level represents one message; thus
there are 4 messages. The probabilities of occurrence of these 4 levels (messages) are
p1 = p4 = 1/8 and p2 = p3 = 3/8. Find out the information rate of the source.
Solution : (i) To calculate the entropy (H) :
We have four messages with probabilities p1 = p4 = 1/8 and p2 = p3 = 3/8. The average
information H (or entropy) is given by equation 1.4.5 as,
H = p1 log2(1/p1) + p2 log2(1/p2) + p3 log2(1/p3) + p4 log2(1/p4)
= (1/8) log2 8 + (3/8) log2(8/3) + (3/8) log2(8/3) + (1/8) log2 8
H = 1.8 bits/message   ... (1.5.2)
(ii) To calculate the message rate (r) :
We know that the signal is sampled at the Nyquist rate. The Nyquist rate for a signal
bandlimited to B Hz is,
Nyquist rate = 2B samples/sec
Since every sample generates one message,
r = 2B messages/sec
(iii) To calculate the information rate (R) :
The information rate is given by equation 1.5.1 as,
R = r H
Putting the values of r and H in the above equation,
R = 2B messages/sec × 1.8 bits/message
= 3.6B bits/sec  (Ans.)   ... (1.5.3)
Comment :
In the example discussed above there are four levels. These four levels can be
coded using binary PCM as shown in Table 1.5.1.

Message or level   Probability   Binary digits
Q1                 1/8           0 0
Q2                 3/8           0 1
Q3                 3/8           1 0
Q4                 1/8           1 1

Table 1.5.1
Thus two binary digits (binits) are required to send each message. We know that
messages are sent at the rate of 2B messages/sec. Hence the transmission rate of binary
digits will be,
Binary digit (binit) rate = 2 binits/message × 2B messages/sec
= 4B binits/sec
Since one binit is capable of conveying 1 bit of information, the above coding
scheme is capable of conveying 4B bits of information per second. But in example 1.5.1
we have obtained that we are transmitting 3.6B bits of information per second
(see equation 1.5.3). This shows that the information carrying ability of binary PCM is
not completely utilized by the transmission scheme discussed in example 1.5.1. This
situation is improved in the next example.
Example 1.5.2 : In the transmission scheme of example 1.5.1, calculate the information
rate if all messages are equally likely. Comment on the result you obtain.
Solution : We know that there are four messages. Since they are equally likely, their
probabilities will all be equal to 1/4, i.e.,
p1 = p2 = p3 = p4 = 1/4
The average information per message (entropy) is given by equation 1.4.5 as,
H = p1 log2(1/p1) + p2 log2(1/p2) + p3 log2(1/p3) + p4 log2(1/p4)
= 4 p1 log2(1/p1)   since p1 = p2 = p3 = p4
= log2(4)   since p1 = p2 = p3 = p4 = 1/4
= 2 bits/message
The information rate is given by equation 1.5.1 as,
R = r H
Here r = 2B messages/sec as obtained in example 1.5.1. Putting these values in the
above equation we get,
R = 2B messages/sec × 2 bits/message
= 4B bits/sec  (Ans.)   ... (1.5.4)
Comment :
Just before this example we have seen that binary coded PCM with 2 binits per
message is capable of conveying 4B bits of information per second. The transmission
scheme discussed in the above example transmits 4B bits of information per second
(see equation 1.5.4). This has been made possible since all the messages are equally
likely. Thus with binary PCM coding, the maximum information rate is achieved if all
messages are equally likely.
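The two information rates obtained in examples 1.5.1 and 1.5.2 can be verified with a few lines of Python. The sketch below is illustrative; since the bandwidth B is left symbolic in the examples, the rates are computed per hertz of bandwidth (the printed numbers are the coefficients of B).

import math

def entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

r_per_B = 2.0                            # Nyquist rate: 2B messages/sec

H1 = entropy([1/8, 3/8, 3/8, 1/8])       # example 1.5.1
H2 = entropy([1/4, 1/4, 1/4, 1/4])       # example 1.5.2 (equally likely)

print(round(r_per_B * H1, 2))            # about 3.62 -> R = 3.6B bits/sec with H rounded to 1.8
print(round(r_per_B * H2, 2))            # 4.0 -> R = 4B bits/sec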
Example 1.5.3 : Consider a telegraph source having two symbols, dot and dash. The
dot duration is 0.2 sec and the dash duration is 3 times the dot duration. The
probability of the dot's occurring is twice that of the dash, and the time between symbols is
0.2 seconds. Calculate the information rate of the telegraph source.
Solution : (i) To calculate the probabilities of dot and dash :
Let the probability of dash be 'p'. Then the probability of dot will be '2p'. And,
p + 2p = 1, hence p = 1/3
Thus P(dash) = 1/3
and P(dot) = 2/3
(ii) To calculate the entropy of the source :
Entropy is given by equation 1.4.6 as,
H = Σ_k p_k log2(1/p_k)
For dot and dash the entropy will be,
H = (1/3) log2 3 + (2/3) log2(3/2)
= 0.9183 bits/symbol
(iii) To calculate the average symbol rate :
It is given that,
dot duration = T_dot = 0.2 sec
dash duration = T_dash = 3 × 0.2 = 0.6 sec
duration between symbols = T_space = 0.2 sec
Now let us consider a string of 1200 symbols. On average the dots and dashes will
appear in this string according to their probabilities.
Hence,
Number of dots = 1200 × P(dot) = 1200 × 2/3 = 800
Number of dashes = 1200 × P(dash) = 1200 × 1/3 = 400
Now let us calculate the total time for this string, i.e.,
T = dots duration + dashes duration + (1200 × time between symbols)
= (800 × 0.2) + (400 × 0.6) + (1200 × 0.2)
= 640 sec
Hence the average symbol rate will be,
r = 1200 / T = 1200 / 640 = 1.875 symbols/sec
(iv) To calculate the information rate :
The average information rate becomes (from equation 1.5.1),
R = r H = 1.875 × 0.9183
= 1.7218 bits/sec
Thus the average information rate of the telegraph source is 1.7218 bits/sec.
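The steps of example 1.5.3 can be condensed into the following Python sketch (illustrative only). The average time per symbol is the symbol duration weighted by its probability plus the 0.2 sec gap between symbols, which is equivalent to the 1200-symbol string used above.

import math

p_dot, p_dash = 2/3, 1/3
T_dot, T_dash, T_space = 0.2, 0.6, 0.2

H = p_dot * math.log2(1 / p_dot) + p_dash * math.log2(1 / p_dash)   # bits/symbol
t_avg = p_dot * T_dot + p_dash * T_dash + T_space                   # average time per symbol (sec)
r = 1 / t_avg                                                       # symbols/sec

print(round(H, 4), round(r, 3), round(r * H, 4))   # 0.9183  1.875  1.7218 bits/sec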
Example 1.5.4 : A zero mean, unit variance white Gaussian noise is bandlimited to
4 kHz. This noise is then uniformly sampled at the Nyquist rate. The samples are digitized
using a quantizer with the characteristic shown in Fig. 1.5.1. Determine the information
rate at the quantizer output.
Assume Q(0.5) = 0.31, Q(1.0) = 0.16, Q(1.5) = 0.07 and Q(-1.0) = 0.84.
[Fig. 1.5.1 Uniform quantizer characteristic (output versus input)]
Solution : This example can be solved by the following steps :
(i) Determine p(0.5), p(-0.5), p(1.5) and p(-1.5).
(ii) Determine the source entropy (H).
(iii) Determine the symbol rate (r).
(iv) Determine the information rate (R).
(i) To obtain the probabilities :
Let the input be represented by Y and the output be represented by X.
For 0 ≤ Y < 1, the output is X = 0.5
For -1 ≤ Y < 0, the output is X = -0.5
For 1 ≤ Y, the output is X = 1.5
H(X) = Σ_k p_k log2(1/p_k)
= p1 log2(1/p1) + p2 log2(1/p2) + p3 log2(1/p3) + p4 log2(1/p4)
Putting the values in the above equation,
H(X) = (1/2) log2 2 + (1/4) log2 4 + (1/8) log2 8 + (1/8) log2 8
= 7/4 bits/symbol
(ii) To obtain the second order extension of the source :
The source alphabet X contains four symbols. Hence its second order extension
will contain sixteen symbols. These symbols, their probabilities and the entropy
calculations are shown in Table 1.6.1.
Sr. No.   Second order extension symbol σi   Probability of symbol p(σi)   p(σi) log2(1/p(σi))
1         x1 x1                              (1/2)(1/2) = 1/4              (1/4) log2 4 = 2/4
2         x1 x2                              (1/2)(1/4) = 1/8              (1/8) log2 8 = 3/8
3         x1 x3                              (1/2)(1/8) = 1/16             (1/16) log2 16 = 4/16
4         x1 x4                              (1/2)(1/8) = 1/16             (1/16) log2 16 = 4/16
5         x2 x1                              (1/4)(1/2) = 1/8              (1/8) log2 8 = 3/8
6         x2 x2                              (1/4)(1/4) = 1/16             (1/16) log2 16 = 4/16
7         x2 x3                              (1/4)(1/8) = 1/32             (1/32) log2 32 = 5/32
8         x2 x4                              (1/4)(1/8) = 1/32             (1/32) log2 32 = 5/32
9         x3 x1                              (1/8)(1/2) = 1/16             (1/16) log2 16 = 4/16
10        x3 x2                              (1/8)(1/4) = 1/32             (1/32) log2 32 = 5/32
11        x3 x3                              (1/8)(1/8) = 1/64             (1/64) log2 64 = 6/64
12        x3 x4                              (1/8)(1/8) = 1/64             (1/64) log2 64 = 6/64
13        x4 x1                              (1/8)(1/2) = 1/16             (1/16) log2 16 = 4/16
14        x4 x2                              (1/8)(1/4) = 1/32             (1/32) log2 32 = 5/32
15        x4 x3                              (1/8)(1/8) = 1/64             (1/64) log2 64 = 6/64
16        x4 x4                              (1/8)(1/8) = 1/64             (1/64) log2 64 = 6/64

Table 1.6.1 : Calculations for the second order extension of the source
The entropy of the second order extension of the source can be obtained as,
H(X^2) = Σ_{i=1}^{16} p(σi) log2(1/p(σi))
Putting the values from Table 1.6.1 in the above equation,
H(X^2) = 2/4 + 3/8 + 4/16 + 4/16 + 3/8 + 4/16 + 5/32 + 5/32
+ 4/16 + 5/32 + 6/64 + 6/64 + 4/16 + 5/32 + 6/64 + 6/64
= 7/2 bits/symbol (extended)
Here observe that,
H(X) = 7/4 bits/symbol
H(X^2) = 7/2 bits/symbol (extended)
The above results confirm the following :
H(X^2) = 2 H(X) = 2 × 7/4 = 7/2
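The sixteen-symbol calculation of Table 1.6.1 is easy to automate. The sketch below (illustrative, not from the book) forms the second order extension of the source of this example and confirms H(X^2) = 2 H(X).

import math
from itertools import product

def entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

p = [1/2, 1/4, 1/8, 1/8]                            # first order source X
p2 = [pi * pj for pi, pj in product(p, repeat=2)]   # 16 symbol probabilities of X^2

print(entropy(p))     # 1.75 bits/symbol (= 7/4)
print(entropy(p2))    # 3.5  bits/extended symbol (= 7/2 = 2 x 7/4)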
Example 1.6.2 : For a DMS 'X' with two symbols x1 and x2 and p(x1) = 0.9 and
p(x2) = 0.1, find out the second order extension of the source. Find the efficiency 'η'
and the redundancy of this extended code.
Solution : (i) To obtain the entropy (H) :
Entropy is given as,
H = Σ_{k=1}^{2} p_k log2(1/p_k)
Here p1 = 0.9 and p2 = 0.1. Hence,
H = 0.9 log2(1/0.9) + 0.1 log2(1/0.1)
= 0.1368 + 0.3322
= 0.469 bits/message.
(ii) To obtain the second order extension :
The source alphabet 'X' contains two symbols. Hence its second order extension
will contain four symbols. These symbols, their probabilities and the entropy calculations
are given in Table 1.6.2.

Sr. No.   Second order extension symbol σi   Probability of symbol p(σi)   p(σi) log2(1/p(σi))
1         x1 x1                              0.9 × 0.9 = 0.81              0.81 log2(1/0.81) = 0.2462
2         x1 x2                              0.9 × 0.1 = 0.09              0.09 log2(1/0.09) = 0.3126
3         x2 x1                              0.1 × 0.9 = 0.09              0.09 log2(1/0.09) = 0.3126
4         x2 x2                              0.1 × 0.1 = 0.01              0.01 log2(1/0.01) = 0.0664

Table 1.6.2
The entropy of the second order extension of the source can be obtained as,
H(X^2) = Σ_i p(σi) log2(1/p(σi))
Putting the values from the above table,
H(X^2) = 0.2462 + 0.3126 + 0.3126 + 0.0664
= 0.938 bits/message
Here note that H(X^2) = 2 H(X) = 2 × 0.469 = 0.938.
(iii) To obtain the efficiency :
Here no source coding is employed. There are four symbols in the extension of the
source. Hence two bits will be required to code the four symbols, so the average number
of bits per symbol will be two, i.e. N̄ = 2 bits/symbol.
Code efficiency η = H(X^2)/N̄ = 0.938/2 = 0.469
1.7 Source Coding Theorem (Shannon's First Theorem)
In section 1.5 we have seen that if the messages have different probabilities
and they are assigned the same number of binary digits, then the information carrying
capability of binary PCM is not completely utilized. That is, the actual information
rate is less than the maximum achievable rate. But if all the messages have the same
probabilities (i.e. they are equally likely), then the maximum information rate is possible
with binary PCM coding. The binary PCM method used to code the discrete messages
from the source is one of the source coding methods. The device which performs
source coding (like PCM, DM, ADM etc.) is called a source encoder. Efficient source
encoders can be designed which use the statistical properties of the source. For
example, the messages occurring frequently can be assigned short codewords, whereas
messages which occur rarely are assigned long codewords. Such coding is called
variable length coding. An efficient source encoder should satisfy the following
requirements :
i) The codewords generated by the encoder should be binary in nature.
ii) The source code should be unique in nature. That is, every codeword should
represent a unique message.
Let there be 'L' messages emitted by the source. The probability of the
k-th message is p_k and the number of bits assigned to this message is n_k. Then the
average number of bits (N̄) in the codeword of a message is given as,
N̄ = Σ_{k=0}^{L-1} p_k n_k   ... (1.7.1)
Let N_min be the minimum possible value of N̄. Then the coding efficiency of the source
encoder is defined as,
η = N_min / N̄   ... (1.7.2)
The source encoder is called efficient if the coding efficiency (η) approaches unity. In
other words, N_min ≤ N̄ and the coding efficiency is maximum when N_min = N̄. The value
of N_min can be determined with the help of Shannon's first theorem, called the source coding
theorem. This theorem is also called Shannon's theorem on source coding. It is stated
as follows :
Given a discrete memoryless source of entropy H, the average codeword length N̄ for any
distortionless source encoding is bounded as,
N̄ ≥ H   ... (1.7.3)
Here the entropy H represents the fundamental limit on the average number of
bits per symbol, i.e. on N̄. This limit says that the average number of bits per symbol
cannot be made smaller than the entropy H. Hence N_min = H, and we can write the
efficiency of the source encoder from equation 1.7.2 as,
η = H / N̄   ... (1.7.4)
1.7.1 Code Redundancy
It is the measure of the redundancy of bits in the encoded message sequence. It is
given as,
Redundancy (γ) = 1 - code efficiency
= 1 - η   ... (1.7.5)
Redundancy should be as low as possible.
1.7.2 Code Variance
The variance of the code is given as,
σ² = Σ_{k=0}^{M-1} p_k (n_k - N̄)²   ... (1.7.6)
Here σ² is the variance of the code,
M is the number of symbols,
p_k is the probability of the k-th symbol,
n_k is the number of bits assigned to the k-th symbol,
N̄ is the average codeword length.
Variance is a measure of the variability in codeword lengths. The variance should be as
small as possible.
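Equations 1.7.1 to 1.7.6 can be gathered into one small routine. The Python sketch below is an illustration (the function names are arbitrary): given the symbol probabilities and the codeword lengths, it returns the average length N̄, the efficiency, the redundancy and the variance of the code. The example call uses the probabilities 0.5, 0.25, 0.125, 0.125 with codeword lengths 1, 2, 3, 3 (these match the prefix code discussed later in Table 1.8.1), for which the efficiency works out to 1.

import math

def entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

def code_figures(probs, lengths):
    N_bar = sum(p * n for p, n in zip(probs, lengths))                    # eq. 1.7.1
    eta = entropy(probs) / N_bar                                          # eq. 1.7.4
    gamma = 1 - eta                                                       # eq. 1.7.5
    variance = sum(p * (n - N_bar) ** 2 for p, n in zip(probs, lengths))  # eq. 1.7.6
    return N_bar, eta, gamma, variance

print(code_figures([0.5, 0.25, 0.125, 0.125], [1, 2, 3, 3]))
# -> (1.75, 1.0, 0.0, 0.6875) : average length equals the entropy, so efficiency is 100 %.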
Example 1.7.1 : For a discrete memoryless source with K equiprobable symbols, the use of a
fixed length code will provide the same efficiency as any other coding technique. Justify the
above statement.
State the condition to be satisfied by K for achieving 100% efficiency.
Solution : (i) Justification :
If there are 'K' equiprobable symbols, the probability of each symbol will be,
p_k = 1/K
The average codeword length is given as,
N̄ = Σ_{k=0}^{K-1} p_k n_k
= (1/K) Σ_{k=0}^{K-1} n_k
Here let us assume that K = 4. For these 4 symbols, two bits/symbol will be
required in fixed length coding. Hence n_k = n = 2 for all symbols, and the above equation can
be written as,
N̄ = (1/K) Σ_{k=0}^{3} n_k
= (1/4)[n0 + n1 + n2 + n3]
= (1/4)[2 + 2 + 2 + 2] = 2 = n
The above result shows that N̄ = n for fixed length coding with equiprobable symbols.
Here note that 'N̄' is the minimum number of bits per symbol. With any other coding
technique, N̄ will not be less than 'n'. This statement confirms that N̄ will remain the
same. The efficiency of the code is given as,
η = H / N̄
Since N̄ remains the same in both cases, the efficiency will also be the same. This can also be
verified through a numerical example.
(ii) Condition for 100% efficiency, i.e. η = 1 :
The entropy of the source is given as,
H = Σ_{k=0}^{K-1} p_k log2(1/p_k)
= p0 log2(1/p0) + p1 log2(1/p1) + ... + p_{K-1} log2(1/p_{K-1})
Here p0 = p1 = p2 = ... = 1/K. Hence,
H = (1/K) log2 K + (1/K) log2 K + ...   (add 'K' number of terms)
= log2 K
The code efficiency is given as,
η = H / N̄
For 100% efficiency,
η = 1, i.e. H = N̄
We know that N̄ = n for fixed length coding. Hence,
H = n
Putting for H in the above equation,
log2 K = n
log10 K / log10 2 = n
log10 K = n log10 2 = log10 2^n
K = 2^n
Thus for 100% efficiency the above condition must be satisfied.
Review Question
1. State and explain the source coding theorem. What is coding efficiency ?
1.8 Variable Length Source Coding Algorithms (Entropy Coding)
Variable length coding is done by the source encoder to get higher efficiencies. Two
algorithms which use variable length coding are discussed next: the Shannon-Fano
algorithm and Huffman coding. These algorithms are also called entropy coding
algorithms.
1.8.1 Prefix Coding (Instantaneous Coding)
This is a variable length coding algorithm. It assigns binary digits to the messages as
per their probabilities of occurrence. A prefix of a codeword means any sequence
which is an initial part of that codeword. In a prefix code, no codeword is the prefix of any
other codeword. Table 1.8.1 shows four source symbols, their probabilities and the
codewords assigned to them by prefix coding.

Source symbol   Probability of occurrence   Prefix code
s0              0.5                         0
s1              0.25                        1 0
s2              0.125                       1 1 0
s3              0.125                       1 1 1

Table 1.8.1 Prefix code
In the above table observe that message s0 has the codeword '0', and message s1 has the
codeword '1 0', i.e. '0' preceded by the prefix '1'. Observe that no codeword is the prefix of any
other codeword. Fig. 1.8.1 shows the decision tree for decoding the prefix code of the above table.
As shown in the figure, the tree has an initial state. The decoder always starts
from the initial state. If the first received bit is '0' then the decoder decides in favour of
symbol s0.
Σ_{k=0}^{3} 2^{-n_k} = 2^{-1} + 2^{-2} + 2^{-3} + 2^{-3} = 1
Thus the Kraft-McMillan inequality is satisfied.
2. The probability of the k-th message is related to 2^{-n_k} as,
p_k = 2^{-n_k}   ... (1.8.2)
For the messages of Table 1.8.1,
p0 = 0.5 = 2^{-1}
p1 = 0.25 = 2^{-2}
p2 = 0.125 = 2^{-3}
p3 = 0.125 = 2^{-3}
3. The average codeword length of the prefix code is bounded as,
H(S) ≤ N̄   ... (1.8.3)
where N̄ = Σ_{k=0}^{3} p_k n_k as per equation 1.7.1.
Putting p_k = 2^{-n_k} = 1/2^{n_k} from equation 1.8.2 in the above equation,
N̄ = Σ_{k=0}^{3} n_k / 2^{n_k}   ... (1.8.4)
The entropy of the source is given as,
H(S) = Σ_{k=0}^{3} p_k log2(1/p_k)
Putting p_k = 1/2^{n_k} in the above equation,
H(S) = Σ_{k=0}^{3} (1/2^{n_k}) log2 2^{n_k} = Σ_{k=0}^{3} n_k / 2^{n_k}   ... (1.8.5)
i.e. H(S) = N̄
Thus this prefix code is matched to the source. This means the average codeword
length of the prefix code is equal to the entropy of the source.
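The Kraft-McMillan check used above is one line of code. The following sketch (illustrative) verifies that the codeword lengths of Table 1.8.1 satisfy Σ 2^(-n_k) ≤ 1.

def kraft_sum(lengths):
    # Sum of 2^(-n_k) over all codeword lengths n_k.
    return sum(2.0 ** (-n) for n in lengths)

lengths = [1, 2, 3, 3]          # codewords 0, 10, 110, 111 of Table 1.8.1
s = kraft_sum(lengths)
print(s, s <= 1)                # 1.0 True : a prefix code with these lengths exists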
Example 1.8.1 : State the condition for unique decodability of codes. Consider the four
codes given below :

Symbol   Code A   Code B   Code C   Code D
s0       0        0        0        00
s1       10       01       01       01
s2       110      001      011      10

Identify which of the above codes are uniquely decodable codes and construct their
individual decision trees.
Solution : (i) Condition for unique decodability :
A code is uniquely decodable if no codeword is the prefix of any other codeword.
(ii) Uniquely decodable codes :
Examining the given codes for prefixes : in code B, the codeword s2 = 0 0 1 is a prefix of
s3 = 0 0 1 0. Similarly, in code C, the codeword s1 = 0 1 is a prefix of s2 = 0 1 1. Hence codes B
and C are not uniquely decodable. But codes A and D are prefix codes and they are uniquely
decodable.
(iii) Decision tree for code A :
Fig. 1.8.2 shows the decision tree for code A.
[Fig. 1.8.2 Decision tree for code A]
(iv) Decision tree for code D :
Fig. 1.8.3 shows the decision tree for code D.
[Fig. 1.8.3 Decision tree for code D]
1.8.2 Shannon-Fano Algorithm
In section 1.5 we have seen that if the probabilities of occurrence of all the
messages are not equally likely, then the average information (i.e. entropy) is reduced. This
in turn reduces the information rate (R). This problem is solved by coding the messages
with different numbers of bits. As the probability of a message increases, fewer
bits are used to code it.
The Shannon-Fano algorithm is used to encode the messages depending upon their
probabilities. This algorithm allots fewer bits to highly probable messages
and more bits to rarely occurring messages. The Shannon-Fano algorithm can
be best explained with the help of an example.
A complete encoding process of the Shannon-Fano algorithm is shown in Table 1.8.2.
As shown in Table 1.8.2 there are eight messages m1 to m8. The probabilities of
occurrence of these messages are shown in the 2nd column. For example the probability of m1
is 16/32, the probabilities of m2 and m3 are 4/32 each, and so on. The algorithm proceeds as follows :
Message   Probability   I   II   III   IV   V   Codeword    No. of bits per message
m1        16/32         0                      0            1
m2        4/32          1   0    0             1 0 0        3
m3        4/32          1   0    1             1 0 1        3
m4        2/32          1   1    0     0       1 1 0 0      4
m5        2/32          1   1    0     1       1 1 0 1      4
m6        2/32          1   1    1     0       1 1 1 0      4
m7        1/32          1   1    1     1    0  1 1 1 1 0    5
m8        1/32          1   1    1     1    1  1 1 1 1 1    5

Table 1.8.2 Shannon-Fano algorithm (columns I to V show the bit assigned at each partitioning stage)
As shown in column-I, a dotted line is drawn between m1 and m2. This line makes
two partitions. In the upper partition there is only one message and its probability is 16/32.
The lower partition contains m2 to m8 and the sum of their probabilities is also 16/32.
Thus the partition is made such that the sums of the probabilities in both partitions are
almost equal. The messages in the upper partition are assigned bit '0' and those in the lower partition
are assigned bit '1'. These partitions are further subdivided into new partitions
following the same rule. The partitioning is stopped when there is only one message
in a partition. Thus in column-I the upper partition has only one message, hence no further
partition is possible. But the lower partition of column-I is further subdivided in
column-II.
In column-II, the dotted line is drawn between m3 and m4. Observe that in the upper
partition we have two messages m2 and m3. The sum of the probabilities of m2 and m3 is
4/32 + 4/32 = 8/32.
The lower partition in column-II has messages m4 to m8. Their sum of
probabilities is 2/32 + 2/32 + 2/32 + 1/32 + 1/32 = 8/32. Thus the sum of probabilities in both
partitions is equal. The messages in the upper partition are assigned '0' and those in the lower
partition are assigned '1'. Since both partitions in column-II contain more than
one message, they are further subdivided. This subdivision is shown in column-III.
This partitioning is continued till there is only one message in each partition. The
partitioning process is self explanatory in columns III, IV and V of Table 1.8.2. In the last
columns of the table the codeword for each message and the number of bits per message are
shown. The codeword is obtained by reading the bits of a particular message row-wise
through all the columns. For example message m1 has only one bit, i.e. 0; message m2 has
three bits, i.e. 1 0 0; message m3 also has three bits, i.e. 1 0 1; message m8 has five bits,
i.e. 1 1 1 1 1. This shows that the message m1 has the highest probability, hence it is coded
using a single bit, i.e. '0'. As the probabilities of the messages go on decreasing, the bits in the
codewords increase.
We know by equation 1.4.6 that the average information per message (entropy) is
given as,
H = Σ_k p_k log2(1/p_k)
= p1 log2(1/p1) + p2 log2(1/p2) + ... + p8 log2(1/p8)
= (16/32) log2(32/16) + (4/32) log2(32/4) + (4/32) log2(32/4) + (2/32) log2(32/2)
+ (2/32) log2(32/2) + (2/32) log2(32/2) + (1/32) log2(32/1) + (1/32) log2(32/1)
= (1/2) log2 2 + (1/4) log2 8 + (3/16) log2 16 + (1/16) log2 32
= 2.3125 bits of information / message   ... (1.8.6)
Now let us calculate the average number of binary digits (binits) per message. Since
each message is coded with a different number of binits, we should use their
probabilities to calculate the average number of binary digits (binits) per message. It is
calculated as follows :
Average number of binary digits per message
= 1(16/32) + 3(4/32) + 3(4/32) + 4(2/32) + 4(2/32) + 4(2/32) + 5(1/32) + 5(1/32)
= 2.3125 binary digits / message   ... (1.8.7)
It is clear from equations 1.8.6 and 1.8.7 that the average bits of information
per message is the same as the average number of binary digits per message. This means one binary
digit (binit) carries one bit of information, which is the maximum information that can be
conveyed by one binit. In general, to transmit eight messages (m1 to m8) we need three
binary digits per message without any special coding. Thus with the Shannon-Fano coding
algorithm we require only 2.3125 binary digits per message. Thus with a special type of
coding, like the Shannon-Fano algorithm, the average number of binary digits per message is
reduced and maximum information is conveyed by every binary digit (binit).
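The partitioning procedure described above can be written as a short recursive program. The sketch below is illustrative (it is not the book's own program): it splits the probability-ordered message list where the running sum of probabilities comes closest to half of the total, assigns '0' to the upper part and '1' to the lower part, and recurses. For the eight messages of Table 1.8.2 it reproduces the codewords obtained above.

def shannon_fano(symbols):
    # symbols : list of (name, probability) pairs sorted by decreasing probability.
    # Returns a dictionary mapping each name to its codeword string.
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    best, run, split = float("inf"), 0.0, 1
    for i, (_, p) in enumerate(symbols[:-1]):
        run += p
        if abs(total - 2 * run) < best:          # make the partition sums as nearly equal as possible
            best, split = abs(total - 2 * run), i + 1
    codes = {}
    for name, code in shannon_fano(symbols[:split]).items():
        codes[name] = "0" + code                 # upper partition gets '0'
    for name, code in shannon_fano(symbols[split:]).items():
        codes[name] = "1" + code                 # lower partition gets '1'
    return codes

probs = [16/32, 4/32, 4/32, 2/32, 2/32, 2/32, 1/32, 1/32]
print(shannon_fano([("m" + str(i + 1), p) for i, p in enumerate(probs)]))
# m1:0, m2:100, m3:101, m4:1100, m5:1101, m6:1110, m7:11110, m8:11111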
1.8.3 Huffman Coding
In the last section we have seen the Shannon-Fano algorithm. This algorithm assigns
different numbers of binary digits to the messages according to their probabilities of
occurrence. Huffman coding also uses the same principle. This type of coding makes the
average number of binary digits per message nearly equal to the entropy (average bits of
information per message). Huffman coding can be best explained with the help of an
example.
Consider that the source generates five messages m0, m1, ..., m4. The probabilities
of these messages are as shown in the 2nd column of Table 1.8.3.
1. The messages are arranged according to their decreasing probabilities. For
example m3 and m4 have the lowest probabilities and hence they are put at the
bottom in the column of stage-I.
2. The two messages of lowest probabilities are assigned binary '0' and '1'.
3. The two lowest probabilities in stage-I are added. Observe that the sum of the two
probabilities is 0.1 + 0.1 = 0.2.
4. The sum of probabilities obtained in stage-I is placed in stage-II such that the
probabilities remain in descending order. Observe that 0.2 is placed last in stage-II.
5. Now the last two probabilities in stage-II are assigned '0' and '1' and they are added. Thus
the sum of the last two probabilities in stage-II is 0.2 + 0.2 = 0.4.
6. The sum of the last two probabilities (i.e. 0.4) is placed in stage-III such that the
probabilities are in descending order. Again '0' and '1' are assigned to the last
two probabilities.
7. Similarly the values in stage-IV are obtained. Since there are only two values in
stage-IV, these two values are assigned digits 0 and 1 and no further repetition
is required.
Now let us see how the codewords for the messages are obtained.
To obtain the codeword for message m4 :
Table 1.8.3 is reproduced in Table 1.8.4 for explanation.
The sequence of 0s and 1s is traced as shown in Table 1.8.4. The tracing is started
from stage-I. The dotted line (......) shows the path of tracing. Observe that the tracing
is in the direction of the arrows. In stage-I digit '1' is traced. In stage-II digit '1' is traced.
In stage-III digit '0' is traced. In stage-IV digit '0' is traced. Thus the traced sequence
is 1100. We get the codeword for the message by reading this sequence from LSB to
MSB, i.e. 0011. Thus the codeword for m4 is 0011.
To obtain the codeword for m0 :
The centre line (-------) shows the tracing path for message m0. The tracing is
started from stage-I. No binary digits occur in stages I, II and III. Only digit '1'
occurs in stage-IV in the path of tracing. Hence the codeword for message m0 is 1.
Thus a single digit is assigned to m0 since its probability is highest.
Message   Probability   Stage I     Stage II    Stage III   Stage IV
m0        0.4           0.4         0.4         0.4         0.6 (0)
m1        0.2           0.2         0.2         0.4 (0)     0.4 (1)
m2        0.2           0.2         0.2 (0)     0.2 (1)
m3        0.1           0.1 (0)     0.2 (1)
m4        0.1           0.1 (1)

Table 1.8.3 Huffman coding (the two lowest probabilities of each stage are assigned the digits
'0' and '1' shown in parentheses; their sum is carried forward to the next stage and placed so
that the probabilities remain in descending order)
[Table 1.8.4 Huffman coding : Table 1.8.3 redrawn with the tracing path for m4 marked; the
digits 1 1 0 0 traced from stage-I to stage-IV are read in reversed order to obtain the
codeword 0011 for m4]
To obtain the codeword for m2 :
Similarly, if we trace in the direction of the arrows for message m2, we obtain the
sequence 000 (this tracing path is not shown in Table 1.8.4). Reading from the LSB side we
get the codeword for m2 as 000 again. Table 1.8.5 shows the messages, their probabilities,
the sequence obtained by tracing and the codeword obtained by reading that sequence from LSB to
MSB.

Message   Probability   Digits obtained by tracing   Codeword obtained by reading digits of column 3 from LSB side
m0        0.4           1                            1
m1        0.2           1 0                          0 1
m2        0.2           0 0 0                        0 0 0
m3        0.1           0 1 0 0                      0 0 1 0
m4        0.1           1 1 0 0                      0 0 1 1

Table 1.8.5 Huffman coding
We know from equation 1.4.6 that the average information per message (entropy) is
given as,
H = Σ_{k=0}^{4} p_k log2(1/p_k)
For five messages the above equation can be expanded as,
H = p0 log2(1/p0) + p1 log2(1/p1) + p2 log2(1/p2) + p3 log2(1/p3) + p4 log2(1/p4)
Here we started from k = 0. Putting the values of the probabilities in the above equation from
Table 1.8.5 we get,
H = 0.4 log2(1/0.4) + 0.2 log2(1/0.2) + 0.2 log2(1/0.2) + 0.1 log2(1/0.1) + 0.1 log2(1/0.1)
= 0.52877 + 0.46439 + 0.46439 + 0.33219 + 0.33219
= 2.12193 bits of information / message   ... (1.8.8)
Now let us calculate the average number of binary digits (binits) per message.
Since each message is coded with a different number of binits, we should use their
probabilities to calculate the average number of binary digits (binits) per message. It is
calculated as follows :
Average number of binary digits per message = Σ (probability of message × number of digits in codeword)
= (0.4 × 1) + (0.2 × 2) + (0.2 × 3) + (0.1 × 4) + (0.1 × 4)
= 2.2 binary digits / message   ... (1.8.9)
Thus it is clear from equations 1.8.8 and 1.8.9 that Huffman coding assigns
binary digits to each message such that the average number of binary digits per message
is nearly equal to the average bits of information per message (i.e. H). This means that
because of Huffman coding one binary digit carries almost one bit of information,
which is the maximum information that can be conveyed by one digit.
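For completeness, the combining procedure described above can also be programmed. The sketch below is an illustrative implementation using a priority queue (Python's heapq), not the book's tabular method: it repeatedly merges the two least probable entries and prefixes '0' and '1' to their codewords. Because Huffman codes are not unique (the point of example 1.8.4 later), the individual codewords printed here may differ from those of Table 1.8.5, but the average codeword length is the same 2.2 binits/message.

import heapq, itertools

def huffman(probabilities):
    # probabilities : dictionary message -> probability. Returns message -> codeword.
    counter = itertools.count()      # tie-breaker so equal probabilities never compare the dicts
    heap = [(p, next(counter), {m: ""}) for m, p in probabilities.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)          # two least probable entries
        p2, _, c2 = heapq.heappop(heap)
        merged = {m: "0" + code for m, code in c1.items()}
        merged.update({m: "1" + code for m, code in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(counter), merged))
    return heap[0][2]

probs = {"m0": 0.4, "m1": 0.2, "m2": 0.2, "m3": 0.1, "m4": 0.1}
codes = huffman(probs)
print(codes)
print(sum(probs[m] * len(c) for m, c in codes.items()))   # average length = 2.2 binits/message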
Example 1.8.2 : A discrete memoryless source has five symbols x1, x2, x3, x4 and x5
with probabilities 0.4, 0.19, 0.16, 0.15 and 0.15 respectively attached to every symbol.
i) Construct a Shannon-Fano code for the source and calculate the code efficiency η.
ii) Repeat (i) for the Huffman code and compare the two techniques of source coding.
Solution : i) To obtain the Shannon-Fano code :
The Shannon-Fano algorithm is explained in Table 1.8.2. Table 1.8.6 shows the
procedure and the calculations for obtaining the Shannon-Fano code for this example.

Message   Probability of message   I   II   III   Codeword
x1        0.4                      0              0
x2        0.19                     1   0    0     1 0 0
x3        0.16                     1   0    1     1 0 1
x4        0.15                     1   1    0     1 1 0
x5        0.15                     1   1    1     1 1 1

Table 1.8.6 To obtain the Shannon-Fano code
The entropy (H) is given by equation 1.4.6 as,
H = Σ_{k=1}^{M} p_k log2(1/p_k)
Here M = 5. Putting the values of the probabilities in the above equation,
H = 0.4 log2(1/0.4) + 0.19 log2(1/0.19) + 0.16 log2(1/0.16)
+ 0.15 log2(1/0.15) + 0.15 log2(1/0.15)
= 2.2281 bits/message
The average number of bits per message N̄ is given by equation 1.7.1 as,
N̄ = Σ_{k=0}^{L-1} p_k n_k
Here p_k is the probability of the k-th message and n_k is the number of bits assigned to it.
Putting the values in the above equation,
N̄ = 0.4(1) + 0.19(3) + 0.16(3) + 0.15(3) + 0.15(3)
= 2.35
The code efficiency is given by equation 1.7.4, i.e.,
η = H/N̄ = 2.2281/2.35 = 0.948
ii) To obtain the Huffman code :
Table 1.8.7 illustrates the Huffman coding. Huffman coding is explained in
Table 1.8.3 with the help of an example. The coding shown in Table 1.8.7 below is
based on that explanation.

Message   Probability   Stage I      Stage II     Stage III    Stage IV
x1        0.4           0.4          0.4          0.4          0.65 (0)
x2        0.19          0.19         0.3          0.35 (0)     0.4 (1)
x3        0.16          0.16         0.19 (0)     0.3 (1)
x4        0.15          0.15 (0)     0.16 (1)
x5        0.15          0.15 (1)

Table 1.8.7 To obtain the Huffman code
Table 1.8.4 shows how the codewords are obtained by tracing along the path. These
codewords are given in Table 1.8.8.

Message   Probability   Digits obtained by tracing   Codeword (digits of column 3 read from LSB side)   Number of digits n_k
x1        0.4           1                            1                                                  1
x2        0.19          0 0 0                        0 0 0                                              3
x3        0.16          1 0 0                        0 0 1                                              3
x4        0.15          0 1 0                        0 1 0                                              3
x5        0.15          1 1 0                        0 1 1                                              3

Table 1.8.8 Huffman coding
Now let us determine the average number of bits per message (\bar{N}). It is given as,

\bar{N} = \sum_{k} p_k n_k

Putting the values in the above equation,

\bar{N} = 0.4(1) + 0.19(3) + 0.16(3) + 0.15(3) + 0.15(3)
  = 2.35

Hence the code efficiency is,

\eta = \frac{H}{\bar{N}} = \frac{2.2281}{2.35} = 0.948

Thus the code efficiency of the Shannon-Fano code and the Huffman code is the same in this example.
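To make the Huffman construction of Table 1.8.7 concrete, the short Python sketch below (not the book's tabular procedure; it uses the standard heap-based merging and assumes no particular tie-breaking rule) reproduces the codeword lengths, \bar{N} = 2.35 and the efficiency of this example.

# A minimal Huffman sketch using heapq; the exact codewords may differ from
# Table 1.8.8, but the codeword lengths and the average length are the same.
import heapq, math

probs = {'x1': 0.4, 'x2': 0.19, 'x3': 0.16, 'x4': 0.15, 'x5': 0.15}

# Each heap entry: (probability, tie-breaker, {symbol: partial code})
heap = [(p, i, {s: ''}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
counter = len(heap)
while len(heap) > 1:
    p1, _, c1 = heapq.heappop(heap)   # two least probable groups
    p2, _, c2 = heapq.heappop(heap)
    merged = {s: '0' + c for s, c in c1.items()}      # bit nearer the root goes first
    merged.update({s: '1' + c for s, c in c2.items()})
    heapq.heappush(heap, (p1 + p2, counter, merged))
    counter += 1

codes = heap[0][2]
H = sum(p * math.log2(1 / p) for p in probs.values())
N = sum(probs[s] * len(c) for s, c in codes.items())
print(codes)                                               # one valid prefix code
print(f"H = {H:.4f}, N(bar) = {N:.2f}, efficiency = {H / N:.3f}")   # 2.2281, 2.35, ~0.948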
Example 1.8.3 : Compare the Huffman coding and Shannon-Fano coding algorithms for data compression. For a discrete memoryless source 'X' with six symbols x1, x2, ..., x6, find a compact code for every symbol if the probability distribution is as follows :

p(x1) = 0.3    p(x2) = 0.25    p(x3) = 0.2
p(x4) = 0.12   p(x5) = 0.08    p(x6) = 0.05

Calculate the entropy of the source, average length of the code, efficiency and redundancy of the code.
Solution : (I) Entropy of the source :
Entropy is given as,

H = \sum_{k=1}^{M} p_k \log_2 \frac{1}{p_k}

For six messages the above equation becomes,

H = p_1 \log_2 \frac{1}{p_1} + p_2 \log_2 \frac{1}{p_2} + p_3 \log_2 \frac{1}{p_3} + p_4 \log_2 \frac{1}{p_4} + p_5 \log_2 \frac{1}{p_5} + p_6 \log_2 \frac{1}{p_6}

Putting values,

H = 0.3 \log_2 \frac{1}{0.3} + 0.25 \log_2 \frac{1}{0.25} + 0.2 \log_2 \frac{1}{0.2} + 0.12 \log_2 \frac{1}{0.12} + 0.08 \log_2 \frac{1}{0.08} + 0.05 \log_2 \frac{1}{0.05}
  = 0.521 + 0.5 + 0.4643 + 0.367 + 0.2915 + 0.216
  = 2.3568 bits of information/message.

(II) Shannon-Fano Coding :
(i) To obtain codewords :
Table 1.8.9 shows the procedure for Shannon-Fano coding. The partitioning is made as per the procedure discussed earlier.
Symbol | Probability | Stage-I | Stage-II | Stage-III | Stage-IV | Codeword | No. of bits per message
x1     | 0.30        | 0       | 0        |           |          | 00       | 2
x2     | 0.25        | 0       | 1        |           |          | 01       | 2
x3     | 0.20        | 1       | 0        |           |          | 10       | 2
x4     | 0.12        | 1       | 1        | 0         |          | 110      | 3
x5     | 0.08        | 1       | 1        | 1         | 0        | 1110     | 4
x6     | 0.05        | 1       | 1        | 1         | 1        | 1111     | 4

Table 1.8.9 Shannon-Fano algorithm

(ii) To obtain average number of bits per message (\bar{N}) :
\bar{N} is given as,

\bar{N} = \sum_{k} p_k n_k

Putting values in the above equation from Table 1.8.9,

\bar{N} = (0.3)(2) + (0.25)(2) + (0.2)(2) + (0.12)(3) + (0.08)(4) + (0.05)(4)
  = 2.38
(iii) To obtain code efficiency :
Code efficiency is given as,

\eta = \frac{H}{\bar{N}} = \frac{2.3568}{2.38} = 0.99

(iv) To obtain redundancy of the code :
Redundancy is given as,

Redundancy (\gamma) = 1 - \eta = 1 - 0.99 = 0.01

Here 0.01 indicates that there are 1 % redundant bits in the code.
(III) Huffman coding :
(i) To obtain codewords :
Table 1.8.10 lists the Huffman coding algorithm.

Table 1.8.10 : Huffman coding (combining stages; the two least probable symbols are merged at every stage)

Based on the above encoding arrangement the following codes are generated. The codewords of x1, x2 and x3 have 2 digits each, that of x4 has 3 digits, and those of x5 and x6 have 4 digits each.

Table 1.8.11 : Huffman codes
(ii) To obtain average number of bits per message (\bar{N}) :
\bar{N} is given as,

\bar{N} = \sum_{k} p_k n_k

Putting values in the above equation from Table 1.8.11,

\bar{N} = (0.3)(2) + (0.25)(2) + (0.2)(2) + (0.12)(3) + (0.08)(4) + (0.05)(4)
  = 2.38

(iii) To obtain code efficiency :
Code efficiency is given as,

\eta = \frac{H}{\bar{N}} = \frac{2.3568}{2.38} = 0.99

Thus the code efficiency of the Shannon-Fano algorithm and Huffman coding is the same.

(iv) Redundancy of the code :
Redundancy (\gamma) is given as,

\gamma = 1 - \eta = 1 - 0.99 = 0.01
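The Shannon-Fano partitioning of Table 1.8.9 can also be sketched in a few lines of Python. This is a minimal illustration using one reasonable splitting rule (at ties a textbook construction may split differently), not the book's exact tabular procedure.

# Recursive Shannon-Fano sketch for the source of Example 1.8.3.
import math

symbols = [('x1', 0.3), ('x2', 0.25), ('x3', 0.2), ('x4', 0.12), ('x5', 0.08), ('x6', 0.05)]

def shannon_fano(items, prefix=''):
    """Split the (probability-sorted) list into two groups of nearly equal
    total probability, assign 0/1, and recurse on each group."""
    if len(items) == 1:
        return {items[0][0]: prefix or '0'}
    total, running, split, best_diff = sum(p for _, p in items), 0.0, 1, float('inf')
    for i in range(1, len(items)):
        running += items[i - 1][1]
        diff = abs(running - (total - running))
        if diff < best_diff:
            best_diff, split = diff, i
    codes = shannon_fano(items[:split], prefix + '0')
    codes.update(shannon_fano(items[split:], prefix + '1'))
    return codes

codes = shannon_fano(symbols)
H = sum(p * math.log2(1 / p) for _, p in symbols)
N = sum(p * len(codes[s]) for s, p in symbols)
print(codes)                                                # lengths 2, 2, 2, 3, 4, 4
print(f"H = {H:.4f}, N(bar) = {N:.2f}, efficiency = {H / N:.3f}")   # ~2.36, 2.38, ~0.99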
Example 1.8.4 : A DMS has five symbols s0, s1, ..., s4, characterized by the probability distribution 0.4, 0.2, 0.1, 0.2 and 0.1 respectively. Evaluate two distinct variable length Huffman codes for the source to illustrate the nonuniqueness of the Huffman technique. Calculate the variance of the ensemble as defined by,

\sigma^2 = \sum_{k=0}^{4} p_k \left( l_k - l_{avg} \right)^2

where p_k and l_k are the probability and length of the codeword respectively for symbol s_k, and l_{avg} is the average length. Conclude on the result.
Solution : Huffman coding can be implemented by placing the combined probability as high as possible or as low as possible. Let us solve the above example with these two techniques.

(I) Placing combined symbol as high as possible :
(i) To obtain codeword :
Table 1.8.12 lists the coding. The combined symbol is placed as high as possible.

Table 1.8.12 : Huffman coding algorithm (combining stages I to IV, with every combined probability moved as high as possible)
As per the above table, the codes are listed below :
Symbol | Probability p_k | Digits obtained by tracing b2 b1 b0 | Codeword b0 b1 b2 | No. of bits per symbol n_k
s0     | 0.4             | 00                                  | 00                | 2
s1     | 0.2             | 01                                  | 10                | 2
s2     | 0.1             | 010                                 | 010               | 3
s3     | 0.2             | 11                                  | 11                | 2
s4     | 0.1             | 110                                 | 011               | 3

Table 1.8.13
(ii) To obtain average codeword length :
Average codeword length is given as,

\bar{N} = \sum_{k} p_k n_k = 0.4(2) + 0.2(2) + 0.1(3) + 0.2(2) + 0.1(3)
  = 2.2

(iii) To obtain variance of the code :
Variance can be calculated as,

\sigma^2 = \sum_{k} p_k \left( n_k - \bar{N} \right)^2
  = 0.4[2-2.2]^2 + 0.2[2-2.2]^2 + 0.1[3-2.2]^2 + 0.2[2-2.2]^2 + 0.1[3-2.2]^2
  = 0.16

(II) Placing combined symbol as low as possible :
(i) To obtain codeword :
Table 1.8.14 lists the coding when the combined probability is placed as low as possible.
As per the above table, the codes are listed below :

Symbol | Probability p_k | Digits obtained by tracing b3 b2 b1 b0 | Codeword b0 b1 b2 b3 | No. of bits per symbol n_k
s0     | 0.4             | 1                                      | 1                    | 1
s1     | 0.2             | 10                                     | 01                   | 2
s2     | 0.1             | 0100                                   | 0010                 | 4
s3     | 0.2             | 000                                    | 000                  | 3
s4     | 0.1             | 1100                                   | 0011                 | 4

Table 1.8.15

(ii) To obtain average codeword length :
Average codeword length is given as,

\bar{N} = \sum_{k} p_k n_k = 0.4(1) + 0.2(2) + 0.1(4) + 0.2(3) + 0.1(4)
  = 2.2

(iii) To obtain variance of the code :
Variance can be calculated as,

\sigma^2 = \sum_{k} p_k \left( n_k - \bar{N} \right)^2
  = 0.4[1-2.2]^2 + 0.2[2-2.2]^2 + 0.1[4-2.2]^2 + 0.2[3-2.2]^2 + 0.1[4-2.2]^2
  = 1.36
Results :

Method              | Average length | Variance
As high as possible | 2.2            | 0.16
As low as possible  | 2.2            | 1.36

The above results show that the average length of the codeword is the same in both the methods. But the minimum variance of the Huffman code is obtained by moving the probability of a combined symbol as high as possible.
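As a small numerical check (not from the book), the average length and variance of the two codes can be recomputed from the codeword lengths of Tables 1.8.13 and 1.8.15:

# Variance comparison for the two Huffman placements of Example 1.8.4.
probs    = [0.4, 0.2, 0.1, 0.2, 0.1]   # s0 ... s4
len_high = [2, 2, 3, 2, 3]             # combined symbol placed as high as possible
len_low  = [1, 2, 4, 3, 4]             # combined symbol placed as low as possible

def avg_and_var(p, n):
    n_bar = sum(pi * ni for pi, ni in zip(p, n))
    var   = sum(pi * (ni - n_bar) ** 2 for pi, ni in zip(p, n))
    return n_bar, var

for name, lengths in [("as high as possible", len_high), ("as low as possible", len_low)]:
    n_bar, var = avg_and_var(probs, lengths)
    print(f"{name:22s}: N(bar) = {n_bar:.2f}, variance = {var:.2f}")
# Both give N(bar) = 2.2; the variances are 0.16 and 1.36 respectively.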
Example 1.8.5 : A DMS has the following alphabet with the probability of occurrence as shown below :

Symbol      | s0    | s1     | s2   | s3     | s4    | s5    | s6
Probability | 0.125 | 0.0625 | 0.25 | 0.0625 | 0.125 | 0.125 | 0.25

Generate the Huffman code with minimum code variance. Determine the code variance and code efficiency. Comment on the code efficiency.    (April/May-2004, 16 Marks)

Solution : (i) To obtain codewords :
Minimum code variance can be obtained in Huffman coding by putting the combined symbol probability as high as possible. Table 1.8.16 shows the coding based on this principle.
Table 1.8.16 : Huffman coding (combining stages for the seven symbols, with every combined probability placed as high as possible)
Based on the above table, the codes are traced and generated as follows :

Symbol | Probability | Digits obtained by tracing | Codeword | No. of bits per symbol n_k
s0     | 0.125       | 100                        | 001      | 3
s1     | 0.0625      | 0000                       | 0000     | 4
s2     | 0.25        | 01                         | 10       | 2
s3     | 0.0625      | 1000                       | 0001     | 4
s4     | 0.125       | 010                        | 010      | 3
s5     | 0.125       | 110                        | 011      | 3
s6     | 0.25        | 11                         | 11       | 2

Table 1.8.17 : Codewords
(ii) To obtain average codeword length :
Average codeword length is given as,

\bar{N} = \sum_{k} p_k n_k
  = 0.125(3) + 0.0625(4) + 0.25(2) + 0.0625(4) + 0.125(3) + 0.125(3) + 0.25(2)
  = 2.625 bits/symbol.
(iii) To obtain entropy of the source :
Entropy is given as,

H = \sum_{k=0}^{6} p_k \log_2 \frac{1}{p_k}
  = 0.125 \log_2 \frac{1}{0.125} + 0.0625 \log_2 \frac{1}{0.0625} + 0.25 \log_2 \frac{1}{0.25} + 0.0625 \log_2 \frac{1}{0.0625} + 0.125 \log_2 \frac{1}{0.125} + 0.125 \log_2 \frac{1}{0.125} + 0.25 \log_2 \frac{1}{0.25}
  = 2.625 bits/symbol.

(iv) To obtain code efficiency :
Code efficiency is given as,

\eta = \frac{H}{\bar{N}} = \frac{2.625}{2.625} = 1 or 100 %

Here the efficiency of the code is 100 % because every symbol probability is an integer power of 1/2, so each codeword length equals \log_2(1/p_k) and the entropy exactly equals the average codeword length.
(v) To obtain variance :

\sigma^2 = \sum_{k=0}^{6} p_k \left( n_k - \bar{N} \right)^2
  = 0.125(3-2.625)^2 + 0.0625(4-2.625)^2 + 0.25(2-2.625)^2 + 0.0625(4-2.625)^2 + 0.125(3-2.625)^2 + 0.125(3-2.625)^2 + 0.25(2-2.625)^2
  = 0.4843
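A brief numerical check of this example (not part of the original text), using the probabilities and codeword lengths of Table 1.8.17:

# Dyadic probabilities give 100 % efficiency: H equals the average length.
import math

probs   = [0.125, 0.0625, 0.25, 0.0625, 0.125, 0.125, 0.25]   # s0 ... s6
lengths = [3, 4, 2, 4, 3, 3, 2]                               # from Table 1.8.17

H     = sum(p * math.log2(1 / p) for p in probs)
N_bar = sum(p * n for p, n in zip(probs, lengths))
var   = sum(p * (n - N_bar) ** 2 for p, n in zip(probs, lengths))

print(f"H = {H:.4f}, N(bar) = {N_bar:.4f}, efficiency = {H / N_bar:.2%}, variance = {var:.4f}")
# H = 2.6250, N(bar) = 2.6250, efficiency = 100.00%, variance = 0.4844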
Example 1.8.6 : A discrete memoryless source consists of three symbols x1, x2, x3 with probabilities 0.45, 0.35 and 0.2 respectively. Determine the minimum variance Huffman codes for the source for the following two alternatives :
(i) Considering symbol by symbol occurrence.
(ii) Considering second order block extension of the source.
Determine the code efficiency for the two alternatives and comment on the efficiencies.

Solution : Following steps are required to solve this problem.
(I) Symbol by symbol occurrence
  (i) Entropy of the source (H)
  (ii) To determine Huffman code
  (iii) To determine average codeword length (\bar{N})
  (iv) To determine code efficiency (\eta)
(II) Second order extension
  (i) To determine entropy of the second order extension H(X^2)
  (ii) To determine Huffman code
  (iii) To determine average codeword length (\bar{N})
  (iv) To determine code efficiency (\eta)
(I) Symbol by symbol occurrence :
(i) To determine entropy of the source :
Entropy is given as,

H = \sum_{k} p_k \log_2 \frac{1}{p_k}
  = 0.45 \log_2 \frac{1}{0.45} + 0.35 \log_2 \frac{1}{0.35} + 0.2 \log_2 \frac{1}{0.2}
  = 0.5184 + 0.5301 + 0.4643
  = 1.5128
(ii) To determine Huffman code :
Table 1.8.18 lists the Huffman coding.

Symbol | Stage I | Stage II
x1     | 0.45    | 0.55
x2     | 0.35    | 0.45
x3     | 0.20    |

Table 1.8.18 : Huffman coding

Based on the above table, codes are prepared as follows :

Symbol | Probability p_k | Digits obtained by tracing | Codeword | No. of bits n_k
x1     | 0.45            | 1                          | 1        | 1
x2     | 0.35            | 00                         | 00       | 2
x3     | 0.20            | 10                         | 01       | 2

Table 1.8.19
(iii) To determine average codeword length (\bar{N}) :
Average codeword length is given as,

\bar{N} = \sum_{k} p_k n_k = 0.45(1) + 0.35(2) + 0.2(2) = 1.55

(iv) To determine code efficiency :
Code efficiency is given as,

\eta = \frac{H}{\bar{N}} = \frac{1.5128}{1.55} = 0.976
(II) Second order extension of the source :
(i) To determine entropy of second order extension :
In the second order extension, a sequence of two symbols is considered. Since there are three symbols, the second order extension will have nine symbols. The following table lists this second order extension.

Sr. No. | Symbol of X^2, \sigma_i | Probability P(\sigma_i) | P(\sigma_i) \log_2 \frac{1}{P(\sigma_i)}
1       | x1 x1                   | 0.45 x 0.45 = 0.2025    | 0.4668
2       | x1 x2                   | 0.45 x 0.35 = 0.1575    | 0.42
3       | x1 x3                   | 0.45 x 0.2  = 0.09      | 0.3126
4       | x2 x1                   | 0.35 x 0.45 = 0.1575    | 0.42
5       | x2 x2                   | 0.35 x 0.35 = 0.1225    | 0.371
6       | x2 x3                   | 0.35 x 0.2  = 0.07      | 0.2686
7       | x3 x1                   | 0.2 x 0.45  = 0.09      | 0.3126
8       | x3 x2                   | 0.2 x 0.35  = 0.07      | 0.2686
9       | x3 x3                   | 0.2 x 0.2   = 0.04      | 0.1858

Table 1.8.20 : Second order extension
Entropy of the second order extension can be calculated as,

H(X^2) = \sum_{i=1}^{9} P(\sigma_i) \log_2 \frac{1}{P(\sigma_i)}
  = 3.0254 (from the above table)

The above entropy can also be directly calculated as,

H(X^2) = 2 H(X) = 2 x 1.5128 = 3.0256
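A short check of this doubling property (not part of the original text) in Python:

# Entropy of the second order extension of a memoryless source equals 2 H(X).
import math
from itertools import product

p = {'x1': 0.45, 'x2': 0.35, 'x3': 0.2}

# Probabilities of the nine pair-symbols of X^2 (the source is memoryless).
p2 = {a + b: p[a] * p[b] for a, b in product(p, repeat=2)}

H1 = sum(q * math.log2(1 / q) for q in p.values())
H2 = sum(q * math.log2(1 / q) for q in p2.values())
print(f"H(X)   = {H1:.4f}")    # ~1.5128
print(f"H(X^2) = {H2:.4f}")    # ~3.0256 = 2 * H(X)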
(ii) To determine Huffman code :
The minimum variance Huffman code can be obtained by placing the combined probabilities as high as possible. Table 1.8.21 lists the coding based on the above principle.

Table 1.8.21 : Huffman coding (combining stages for the nine pair-symbols)
Based on the above table, codes are prepared as follows :

Symbol \sigma_i | Probability P(\sigma_i) | Digits obtained by tracing | Codeword | No. of bits n_i
\sigma_1        | 0.2025                  | 01                         | 10       | 2
\sigma_2        | 0.1575                  | 100                        | 001      | 3
\sigma_3        | 0.09                    | 010                        | 010      | 3
\sigma_4        | 0.1575                  | 110                        | 011      | 3
\sigma_5        | 0.1225                  | 111                        | 111      | 3
\sigma_6        | 0.07                    | 0000                       | 0000     | 4
\sigma_7        | 0.09                    | 1000                       | 0001     | 4
\sigma_8        | 0.07                    | 0011                       | 1100     | 4
\sigma_9        | 0.04                    | 1011                       | 1101     | 4

Table 1.8.22
(iii) To determine average codeword length :
Average codeword length is given as,

\bar{N} = \sum_{i=1}^{9} P(\sigma_i) n_i
  = 0.2025(2) + 0.1575(3) + 0.09(3) + 0.1575(3) + 0.1225(3) + 0.07(4) + 0.09(4) + 0.07(4) + 0.04(4)
  = 3.0675

(iv) To determine code efficiency :
Code efficiency is given as,

\eta = \frac{H(X^2)}{\bar{N}} = \frac{3.0256}{3.0675} = 0.9863

Comment : The code efficiency is more for the second order extension of the source.
Review Questions
1. With the help of an example explain Shannon-Fano Algorithm.
2. Explain Huffman coding with the help of an example.
Unsolved Examples
1. A source generates 5 messages with probabilities of occurrence as shown below.

Message     | m0   | m1   | m2   | m3   | m4
Probability | 0.55 | 0.15 | 0.15 | 0.10 | 0.05

Apply the Huffman coding algorithm and place the combined message as low as possible when its probability is equal to that of another message.
i) Calculate the codewords for the messages.
ii) Calculate the average codeword length (i.e. average number of binary digits per message).

[Ans. : (i)

Message | Codeword
m0      | 0
m1      | 11
m2      | 100
m3      | 1010
m4      | 1011

(ii) 1.9 binary digits / message]
2. Apply the Huffman coding algorithm to the messages in example 3. Place the combined message as high as possible when its probability is equal to that of another message.
i) Calculate the codewords for the messages.
ii) Calculate the average codeword length (i.e. average number of binary digits per message).

Hint : The stage-II for the given data is shown below.

Message | Stage-I | Stage-II
m0      | 0.55    | 0.55
m1      | 0.15    | 0.15  <- combined probability of m3 and m4, placed on the higher side when probabilities are equal
m2      | 0.15    | 0.15
m3      | 0.10    | 0.15
m4      | 0.05    |

In the above table observe that the combined probability of m3 and m4 is equal to that of m1 and m2. The combined probability of m3 and m4 is placed on the higher side in stage-II instead of at the bottom. This changes the codewords.
Ans. : (i)

Message | Codeword
m0      | 0
m1      | 100
m2      | 101
m3      | 110
m4      | 111

(ii) 1.9 binary digits / message.

3. Apply the Shannon-Fano algorithm to code the messages given in example 3.
1.9 Discrete Memoryless Channels
In the preceding sections we discussed discrete memoryless sources and their coding. Now let us discuss discrete memoryless channels. The discrete memoryless channel has input X and output Y. Both X and Y are random variables. The channel is discrete when both X and Y are discrete. The channel is called memoryless (zero memory) when the current output depends only on the current input.
The channel is described in terms of the input alphabet, output alphabet and the set of transition probabilities. The transition probability P(y_j / x_i) is the conditional probability that y_j is received, given that x_i was transmitted. If i = j then P(y_j / x_i) represents the conditional probability of correct reception. And if i \neq j, then P(y_j / x_i) represents a conditional probability of error. The transition probabilities of the channel can be represented by a matrix as follows :

P = \begin{bmatrix} P(y_1/x_1) & P(y_2/x_1) & \cdots & P(y_m/x_1) \\ P(y_1/x_2) & P(y_2/x_2) & \cdots & P(y_m/x_2) \\ \vdots & \vdots & & \vdots \\ P(y_1/x_n) & P(y_2/x_n) & \cdots & P(y_m/x_n) \end{bmatrix}    ... (1.9.1)

The above matrix has the size n x m. It is called the channel matrix or probability transition matrix. Each row of the above matrix represents a fixed input, and each column represents a fixed output. The summation of all transition probabilities along a row is equal to 1, i.e.,

P(y_1/x_1) + P(y_2/x_1) + \cdots + P(y_m/x_1) = 1

This is applicable to the other rows also. Hence we can write,

\sum_{j=1}^{m} P(y_j / x_i) = 1    ... (1.9.2)
mnInformation Coding Technique: 1-62 Information Entropy Fundamentals
For the ixed input x), the output can be any of 1, Y2/ Y3,eo0Ym- The
summation. | all these possibilities is equal to 1. From the probability theory we know
that,
P(AB) = P(B/ A) P(A) o» (1.93)
Here P(AB) is the joint probability of A and B. Here if we let A=x; and B=y;,
ther
Pox yj) = Ply / mi) P(x) on (19.4)
Here P(x; yj) is the joint probability of x; and y;. If we add all the joint
probabilities for fixed y, then we get P(y)) ie,
¥ Poy) = Py) w (1.95)
This is written as per the standard probability relations. The above equation gives
probability of getting symbol y;. From equation 19.4 and above equation we can
write,
Py) = ¥ Py / x) Pex) ~- 0.96)
im
Here j = 1,2,.
Thus if we are given the probabilities of input symbols and transition probabilities,
then it is possible to calculate the probabilities of output symbols. Error will result
when i” symbol is transmitted but j* symbol is received.
Hence the error probability P. can be obtained as,
Pom y Pty) ww (1.97)
a
jai
Thus all the probabilities will contribute to error except i=. This is because in
case of i = j, correct symbol is received. From equation 1.9.6 and above equation,
jeticl
P= FS py /m) Peo ww (1.9.8)
Bi
And the probability of correct reception will be,
Pp. =1-P, « (1.99)Information Coding Techniques 1-63 information Entropy Fundamentals
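As an illustration (with assumed input probabilities and channel matrix, not taken from the text), equations 1.9.6, 1.9.8 and 1.9.9 can be evaluated numerically:

# Output probabilities, error probability and correct-reception probability
# for a discrete memoryless channel (illustrative values).
import numpy as np

# Channel matrix P(y_j / x_i): rows = inputs x_i, columns = outputs y_j.
P_Y_given_X = np.array([[0.9, 0.1],
                        [0.2, 0.8]])
P_X = np.array([0.6, 0.4])            # assumed input probabilities

P_Y = P_X @ P_Y_given_X               # equation 1.9.6
P_joint = P_X[:, None] * P_Y_given_X  # P(x_i, y_j) = P(y_j/x_i) P(x_i)

P_e = P_joint.sum() - np.trace(P_joint)   # off-diagonal joint probabilities (eq. 1.9.8)
P_c = 1 - P_e                             # equation 1.9.9

print("P(Y) =", P_Y)            # [0.62 0.38]
print("P_e  =", round(P_e, 3))  # 0.14
print("P_c  =", round(P_c, 3))  # 0.86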
1.9.1 Binary Communication Channel
Consider the case of the discrete channel where only two symbols are transmitted. Fig. 1.9.1 shows the diagram of a binary communication channel: the inputs x_0 and x_1 reach the outputs y_0 and y_1 through the transition probabilities P(y_0/x_0), P(y_1/x_1), P(y_1/x_0) and P(y_0/x_1).

Fig. 1.9.1 Binary communication channel

We can write the equations for the probabilities of y_0 and y_1 as,

P(y_0) = P(y_0 / x_0) P(x_0) + P(y_0 / x_1) P(x_1)    ... (1.9.10)
and
P(y_1) = P(y_1 / x_1) P(x_1) + P(y_1 / x_0) P(x_0)    ... (1.9.11)

The above equations can be written in the matrix form as,

[P(y_0) \quad P(y_1)] = [P(x_0) \quad P(x_1)] \begin{bmatrix} P(y_0/x_0) & P(y_1/x_0) \\ P(y_0/x_1) & P(y_1/x_1) \end{bmatrix}    ... (1.9.12)

Note that the 2 x 2 matrix in the above equation is a probability transition matrix. It is similar to equation 1.9.1.

Binary symmetric channel :
The binary communication channel of Fig. 1.9.1 is said to be symmetric if P(y_0 / x_0) = P(y_1 / x_1) = p. Such a channel is shown in Fig. 1.9.2, where P(y_1 / x_0) = P(y_0 / x_1) = 1 - p.

Fig. 1.9.2 Binary symmetric channel

For the above channel, we can write equation 1.9.12 as,

[P(y_0) \quad P(y_1)] = [P(x_0) \quad P(x_1)] \begin{bmatrix} p & 1-p \\ 1-p & p \end{bmatrix}    ... (1.9.13)
1.9.2 Equivocation (Conditional Entropy)
The conditional entropy H(X/Y) is called equivocation. It is defined as,

H(X/Y) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(x_i / y_j)}    ... (1.9.14)

And the joint entropy H(X, Y) is given as,

H(X, Y) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(x_i, y_j)}    ... (1.9.15)

The conditional entropy H(X/Y) represents the uncertainty of X, on average, when Y is known. Similarly the conditional entropy H(Y/X) represents the uncertainty of Y, on average, when X is transmitted. H(Y/X) can be given as,

H(Y/X) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(y_j / x_i)}    ... (1.9.16)
The conditional entropy H(X/Y) is an average measure of uncertainty in
X after Y is received. In other words H(X /Y) represents the information lost in the
noisy channel.
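As a numerical illustration (the joint probabilities below are assumed for the example and are not from the text), these three entropies can be computed directly from a joint probability matrix:

# H(X,Y), H(X/Y) and H(Y/X) from an assumed joint probability matrix P(x_i, y_j).
import numpy as np

P_XY = np.array([[0.4, 0.1],      # rows = x_i, columns = y_j
                 [0.1, 0.4]])

P_X = P_XY.sum(axis=1)            # marginals
P_Y = P_XY.sum(axis=0)

H_XY        = -(P_XY * np.log2(P_XY)).sum()                    # eq. 1.9.15
H_X_given_Y = -(P_XY * np.log2(P_XY / P_Y[None, :])).sum()     # eq. 1.9.14
H_Y_given_X = -(P_XY * np.log2(P_XY / P_X[:, None])).sum()     # eq. 1.9.16

print(f"H(X,Y) = {H_XY:.4f}")
print(f"H(X/Y) = {H_X_given_Y:.4f}")
print(f"H(Y/X) = {H_Y_given_X:.4f}")
# Note H(X,Y) = H(X/Y) + H(Y) = H(Y/X) + H(X), as proved in Example 1.9.1.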
Example 1.9.1 : Prove that
H(X, Y) = H(X/Y) + H(Y)
         = H(Y/X) + H(X)

Solution : Consider equation 1.9.15,

H(X, Y) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(x_i, y_j)}
        = -\sum_{i} \sum_{j} P(x_i, y_j) \log_2 P(x_i, y_j)    ... (1.9.17)

From probability theory we know that,

P(AB) = P(A/B) P(B)
P(x_i, y_j) = P(x_i / y_j) P(y_j)

Putting this result in the \log_2 term of equation 1.9.17 we get,

H(X, Y) = -\sum_{i} \sum_{j} P(x_i, y_j) \log_2 [P(x_i / y_j) P(y_j)]

We know that \log_2 [P(x_i / y_j) P(y_j)] = \log_2 P(x_i / y_j) + \log_2 P(y_j). Hence the above equation becomes,

H(X, Y) = -\sum_{i} \sum_{j} P(x_i, y_j) \log_2 P(x_i / y_j) - \sum_{i} \sum_{j} P(x_i, y_j) \log_2 P(y_j)
        = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(x_i / y_j)} - \sum_{j} \left\{ \sum_{i} P(x_i, y_j) \right\} \log_2 P(y_j)

The first term in the above equation is H(X/Y) as per equation 1.9.14. From standard probability theory,

\sum_{i} P(x_i, y_j) = P(y_j)

Hence H(X, Y) will be written as,

H(X, Y) = H(X/Y) - \sum_{j} P(y_j) \log_2 P(y_j)
        = H(X/Y) + \sum_{j} P(y_j) \log_2 \frac{1}{P(y_j)}

As per the definition of entropy, the second term in the above equation is H(Y). Hence,

H(X, Y) = H(X/Y) + H(Y)    ... (1.9.18)
Thus the first given equation is proved. From probability theory we also know that,

P(AB) = P(B/A) P(A)
P(x_i, y_j) = P(y_j / x_i) P(x_i)

Putting this result in the \log_2 term of equation 1.9.17 we get,

H(X, Y) = -\sum_{i} \sum_{j} P(x_i, y_j) \log_2 [P(y_j / x_i) P(x_i)]
        = -\sum_{i} \sum_{j} P(x_i, y_j) \log_2 P(y_j / x_i) - \sum_{i} \sum_{j} P(x_i, y_j) \log_2 P(x_i)
        = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(y_j / x_i)} - \sum_{i} \left\{ \sum_{j} P(x_i, y_j) \right\} \log_2 P(x_i)

As per equation 1.9.16, the first term of the above equation is H(Y/X). And from standard probability theory,

\sum_{j} P(x_i, y_j) = P(x_i)

Hence H(X, Y) will be written as,

H(X, Y) = H(Y/X) - \sum_{i} P(x_i) \log_2 P(x_i)
        = H(Y/X) + \sum_{i} P(x_i) \log_2 \frac{1}{P(x_i)}

As per the definition of entropy, the second term in the above equation is H(X). Hence,

H(X, Y) = H(Y/X) + H(X)    ... (1.9.19)

Thus the second part of the given equation is proved.
Example 1.9.2 : Two BSCs are connected in cascade as shown in Fig. 1.9.3.

Fig. 1.9.3 Cascaded BSCs of Example 1.9.2 (the first channel has transition probabilities 0.8 and 0.2, the second 0.7 and 0.3)

i) Find the channel matrix of the resultant channel.
ii) Find P(z_1) and P(z_2) if P(x_1) = 0.6 and P(x_2) = 0.4.

Solution : i) To obtain the channel matrix using equation 1.9.1 :
Here we can write the two matrices of the channels as follows :

P(Y/X) = \begin{bmatrix} P(y_1/x_1) & P(y_2/x_1) \\ P(y_1/x_2) & P(y_2/x_2) \end{bmatrix} = \begin{bmatrix} 0.8 & 0.2 \\ 0.2 & 0.8 \end{bmatrix}

and

P(Z/Y) = \begin{bmatrix} P(z_1/y_1) & P(z_2/y_1) \\ P(z_1/y_2) & P(z_2/y_2) \end{bmatrix} = \begin{bmatrix} 0.7 & 0.3 \\ 0.3 & 0.7 \end{bmatrix}

Hence the resultant channel matrix is given as,

P(Z/X) = P(Y/X) \cdot P(Z/Y) = \begin{bmatrix} 0.8 & 0.2 \\ 0.2 & 0.8 \end{bmatrix} \begin{bmatrix} 0.7 & 0.3 \\ 0.3 & 0.7 \end{bmatrix} = \begin{bmatrix} 0.62 & 0.38 \\ 0.38 & 0.62 \end{bmatrix}

ii) To obtain P(z_1) and P(z_2) :
The probabilities of z_1 and z_2 are given as,

P(Z) = P(X) \cdot P(Z/X)
     = [P(x_1) \quad P(x_2)] \begin{bmatrix} 0.62 & 0.38 \\ 0.38 & 0.62 \end{bmatrix}
     = [0.6 \quad 0.4] \begin{bmatrix} 0.62 & 0.38 \\ 0.38 & 0.62 \end{bmatrix}
     = [0.524 \quad 0.476]

Thus P(z_1) = 0.524 and P(z_2) = 0.476.
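A quick check of this matrix product (not from the book), assuming numpy is available:

# Resultant channel matrix of two cascaded BSCs = product of the two matrices.
import numpy as np

P_Y_given_X = np.array([[0.8, 0.2],
                        [0.2, 0.8]])
P_Z_given_Y = np.array([[0.7, 0.3],
                        [0.3, 0.7]])

P_Z_given_X = P_Y_given_X @ P_Z_given_Y      # resultant channel matrix
P_X = np.array([0.6, 0.4])
P_Z = P_X @ P_Z_given_X

print(P_Z_given_X)    # [[0.62 0.38] [0.38 0.62]]
print(P_Z)            # [0.524 0.476]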
1.9.3 Rate of Information Transmission Over a Discrete Channel
The entropy of the source symbols gives the average amount of information going into the channel, i.e.,

H(X) = \sum_{i} P(x_i) \log_2 \frac{1}{P(x_i)}    ... (1.9.20)

Let the symbols be generated at the rate of 'r' symbols per second. Then the average rate of information going into the channel is given as,

D_{in} = r H(X) bits/sec    ... (1.9.21)

Errors are introduced in the data during transmission. Because of these errors, some information is lost in the channel. The conditional entropy H(X/Y) is the measure of the information lost in the channel. Hence the information transmitted over the channel will be,

Transmitted information = H(X) - H(X/Y)    ... (1.9.22)

Hence the average rate of information transmission D_t across the channel will be,

D_t = [H(X) - H(X/Y)] r bits/sec    ... (1.9.23)

When the noise becomes very large, X and Y become statistically independent. Then H(X/Y) = H(X) and hence no information is transmitted over the channel. In the case of errorless transmission H(X/Y) = 0, hence D_t = D_{in}. That is, the input information rate is the same as the information rate across the channel. No information is lost when H(X/Y) = 0.
Example 1.9.3 : Fig. 1.9.4 shows the binary symmetric channel. Find the rate of information transmission across this channel for p = 0.8 and 0.6. The symbols are generated at the rate of 1000 per second. P(x_0) = P(x_1) = \frac{1}{2}. Also determine the channel input information rate.

Fig. 1.9.4 Binary symmetric channel of Ex. 1.9.3 (transition probabilities p and 1 - p, equiprobable inputs)

Solution : (i) To obtain entropy of the source :
The entropy of the source can be obtained as,

H(X) = P(x_0) \log_2 \frac{1}{P(x_0)} + P(x_1) \log_2 \frac{1}{P(x_1)}
     = \frac{1}{2} \log_2 2 + \frac{1}{2} \log_2 2
     = 1 bit/symbol.

(ii) To obtain input information rate :
The input information rate is,

D_{in} = r H(X)

Here r = 1000 symbols/sec. Hence the above equation gives,

D_{in} = 1000 x 1 = 1000 bits/sec.

(iii) To obtain P(y_0) and P(y_1) :
The probability transition matrix for the binary symmetric channel of Fig. 1.9.4 can be written as,

P = \begin{bmatrix} P(y_0/x_0) & P(y_1/x_0) \\ P(y_0/x_1) & P(y_1/x_1) \end{bmatrix} = \begin{bmatrix} p & 1-p \\ 1-p & p \end{bmatrix}

From equation 1.9.13 we can write the probabilities of the output symbols as,

[P(y_0) \quad P(y_1)] = [P(x_0) \quad P(x_1)] \begin{bmatrix} p & 1-p \\ 1-p & p \end{bmatrix}
  = \left[ \frac{1}{2} p + \frac{1}{2}(1-p) \quad\quad \frac{1}{2}(1-p) + \frac{1}{2} p \right]
  = \left[ \frac{1}{2} \quad \frac{1}{2} \right]

Thus P(y_0) = \frac{1}{2} and P(y_1) = \frac{1}{2}.
(iv) To obtain P(x_i, y_j) and P(x_i / y_j) :
Now let us obtain the conditional probabilities P(X/Y). From probability theory we know that,

P(AB) = P(A/B) P(B) = P(B/A) P(A)    ... (1.9.24)

Hence the joint probabilities are,

P(x_0, y_0) = P(y_0 / x_0) P(x_0) = p \times \frac{1}{2} = \frac{1}{2} p

P(x_1, y_0) = P(y_0 / x_1) P(x_1) = (1-p) \times \frac{1}{2} = \frac{1}{2}(1-p)

P(x_0, y_1) = P(y_1 / x_0) P(x_0) = (1-p) \times \frac{1}{2} = \frac{1}{2}(1-p)

P(x_1, y_1) = P(y_1 / x_1) P(x_1) = p \times \frac{1}{2} = \frac{1}{2} p

Now from equation 1.9.24 we can obtain the conditional probabilities P(X/Y). Since P(x_0 / y_0) P(y_0) = P(y_0 / x_0) P(x_0),

P(x_0 / y_0) = \frac{P(y_0 / x_0) P(x_0)}{P(y_0)} = \frac{p \times \frac{1}{2}}{\frac{1}{2}} = p

Similarly,

P(x_1 / y_0) = \frac{P(y_0 / x_1) P(x_1)}{P(y_0)} = \frac{(1-p) \times \frac{1}{2}}{\frac{1}{2}} = 1 - p

P(x_0 / y_1) = \frac{P(y_1 / x_0) P(x_0)}{P(y_1)} = \frac{(1-p) \times \frac{1}{2}}{\frac{1}{2}} = 1 - p

and

P(x_1 / y_1) = \frac{P(y_1 / x_1) P(x_1)}{P(y_1)} = \frac{p \times \frac{1}{2}}{\frac{1}{2}} = p

All the calculated results are given in Table 1.9.1 below :

Quantity                  | Value
P(x_0) = P(x_1)           | 1/2
P(y_0) = P(y_1)           | 1/2
P(y_0/x_0) = P(y_1/x_1)   | p
P(y_1/x_0) = P(y_0/x_1)   | 1 - p
P(x_0, y_0)               | (1/2) p
P(x_1, y_0)               | (1/2)(1 - p)
P(x_0, y_1)               | (1/2)(1 - p)
P(x_1, y_1)               | (1/2) p
P(x_0/y_0)                | p
P(x_1/y_1)                | p
P(x_1/y_0)                | 1 - p
P(x_0/y_1)                | 1 - p

Table 1.9.1 Conditional probabilities of Fig. 1.9.4
(v) To obtain information rate across the channel :
The conditional entropy H(X/Y) is given by equation 1.9.14 as,

H(X/Y) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(x_i / y_j)}

Here M = 2 and expanding the above equation we get,

H(X/Y) = P(x_0, y_0) \log_2 \frac{1}{P(x_0/y_0)} + P(x_0, y_1) \log_2 \frac{1}{P(x_0/y_1)} + P(x_1, y_0) \log_2 \frac{1}{P(x_1/y_0)} + P(x_1, y_1) \log_2 \frac{1}{P(x_1/y_1)}

Putting the values in the above equation from Table 1.9.1,

H(X/Y) = \frac{1}{2} p \log_2 \frac{1}{p} + \frac{1}{2}(1-p) \log_2 \frac{1}{1-p} + \frac{1}{2}(1-p) \log_2 \frac{1}{1-p} + \frac{1}{2} p \log_2 \frac{1}{p}

i.e.

H(X/Y) = p \log_2 \frac{1}{p} + (1-p) \log_2 \frac{1}{1-p}    ... (1.9.25)

For p = 0.8, H(X/Y) becomes,

H(X/Y) = 0.8 \log_2 \frac{1}{0.8} + (1-0.8) \log_2 \frac{1}{1-0.8}
       = 0.721928 bits/symbol

Hence from equation 1.9.23, the information transmission rate across the channel will be,

D_t = (1 - 0.721928) \times 1000
    = 278 bits/sec

For p = 0.6, H(X/Y) of equation 1.9.25 becomes,

H(X/Y) = 0.6 \log_2 \frac{1}{0.6} + (1-0.6) \log_2 \frac{1}{1-0.6}
       = 0.97 bits/symbol

Hence from equation 1.9.23, the information transmission rate across the channel will be,

D_t = (1 - 0.97) \times 1000
    = 29 bits/sec

The above results indicate that the information transmission rate decreases rapidly as p approaches \frac{1}{2}.
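A brief numerical restatement of this calculation (not part of the original text):

# Equivocation and information transmission rate for the BSC of Example 1.9.3.
import math

r, H_X = 1000, 1.0          # symbols/sec and source entropy for P(x0) = P(x1) = 1/2

for p in (0.8, 0.6):
    H_X_given_Y = p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))  # eq. 1.9.25
    D_t = (H_X - H_X_given_Y) * r                                          # eq. 1.9.23
    print(f"p = {p}: H(X/Y) = {H_X_given_Y:.4f} bits/symbol, D_t = {D_t:.0f} bits/sec")
# p = 0.8 gives roughly 278 bits/sec; p = 0.6 gives roughly 29 bits/sec.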
1.9.4 Capacity of a Discrete Memoryless Channel
In the previous subsection we discussed the rate of information transfer across the discrete memoryless channel. The channel capacity is denoted as C. It is given as,

C = \max_{P(x_i)} D_t    ... (1.9.26)

Putting for D_t from equation 1.9.23,

C = \max_{P(x_i)} [H(X) - H(X/Y)] \cdot r    ... (1.9.27)

The maximum is taken with respect to the probability distribution of the random variable X. The channel capacity can thus be defined as the maximum possible rate of information transmission across the channel.
Example 1.9.4 : A channel has the following channel matrix,

[P(Y/X)] = \begin{bmatrix} 1-p & p & 0 \\ 0 & p & 1-p \end{bmatrix}

(i) Draw the channel diagram.
(ii) If the source has equally likely outputs, compute the probabilities associated with the channel outputs for p = 0.2.

Solution : The given data is,

P(Y/X) = \begin{bmatrix} P(y_1/x_1) & P(y_2/x_1) & P(y_3/x_1) \\ P(y_1/x_2) & P(y_2/x_2) & P(y_3/x_2) \end{bmatrix} = \begin{bmatrix} 1-p & p & 0 \\ 0 & p & 1-p \end{bmatrix}

(i) To obtain channel diagram :
From the above matrices we can prepare the channel diagram as follows :

Fig. 1.9.5 Binary erasure channel (x_1 goes to y_1 with probability 1 - p and to y_2 with probability p; x_2 goes to y_3 with probability 1 - p and to y_2 with probability p)

In the above channel diagram, observe that there are two input symbols and three output symbols. The output symbol y_2 represents an erasure (e). Such a channel is called a binary erasure channel.
(ii) To calculate P(y_1), P(y_2) and P(y_3) :
It is given that the source emits x_1 and x_2 with equal probabilities. Hence,

P(x_1) = \frac{1}{2} and P(x_2) = \frac{1}{2}

The output probabilities are given as,

[P(y_1) \quad P(y_2) \quad P(y_3)] = [P(x_1) \quad P(x_2)] \begin{bmatrix} 1-p & p & 0 \\ 0 & p & 1-p \end{bmatrix}

It is given that p = 0.2. Putting the values in the above equation,

[P(y_1) \quad P(y_2) \quad P(y_3)] = \left[ \frac{1}{2} \quad \frac{1}{2} \right] \begin{bmatrix} 0.8 & 0.2 & 0 \\ 0 & 0.2 & 0.8 \end{bmatrix}
  = \left[ \frac{1}{2} \times 0.8 + 0 \quad\quad \frac{1}{2} \times 0.2 + \frac{1}{2} \times 0.2 \quad\quad 0 + \frac{1}{2} \times 0.8 \right]
  = [0.4 \quad 0.2 \quad 0.4]

Thus P(y_1) = 0.4, P(y_2) = 0.2 and P(y_3) = 0.4.
Example 1.9.5 : A discrete memoryless source with three symbols x_1, x_2, x_3, with probabilities P(x_2) = P(x_3) and P(x_1) = \alpha, feeds into the discrete memoryless channel shown in Fig. 1.9.6.

Fig. 1.9.6 A discrete channel with inputs x_1, x_2, x_3 and outputs y_1, y_2

(i) Sketch the variation of H(X) with \alpha. Determine the maximum value of H(X) and show it on the sketch.
(ii) Determine the transition matrix for the discrete memoryless channel.
(iii) Determine the maximum value of entropy H(Y) at the channel output.

Solution : (i) To obtain variation of H(X) with \alpha :
It is given that,

P(x_1) = \alpha

We know that P(x_1) + P(x_2) + P(x_3) = 1. Since P(x_2) = P(x_3),

P(x_1) + 2 P(x_2) = 1
\alpha + 2 P(x_2) = 1
P(x_2) = P(x_3) = \frac{1 - \alpha}{2}

The entropy of the source is given as,

H(X) = \sum_{i} P(x_i) \log_2 \frac{1}{P(x_i)}
     = \alpha \log_2 \frac{1}{\alpha} + (1 - \alpha) \log_2 \frac{2}{1 - \alpha}

The following table shows the calculation of H(X) with respect to \alpha.

\alpha | 0 | 0.1   | 0.2   | 0.3   | 0.4  | 0.5 | 0.6  | 0.7  | 0.8   | 0.9   | 1.0
H(X)   | 1 | 1.368 | 1.522 | 1.581 | 1.57 | 1.5 | 1.37 | 1.18 | 0.921 | 0.569 | 0

Table 1.9.2 H(X) with respect to \alpha

Fig. 1.9.7 shows the sketch of H(X) versus \alpha as per the above calculations: H(X) rises from 1 at \alpha = 0 to its maximum near \alpha = 1/3 and then falls to 0 at \alpha = 1.

Fig. 1.9.7 Plot of H(X) versus \alpha
To obtain maximum value of H(X) :
The maximum value of H(X) can be obtained by differentiating H(X) with respect to \alpha and equating it to zero, i.e.,

\frac{d H(X)}{d\alpha} = \frac{d}{d\alpha}\left[ \alpha \log_2 \frac{1}{\alpha} \right] + \frac{d}{d\alpha}\left[ (1-\alpha) \log_2 \frac{2}{1-\alpha} \right]

We know that \log_2 \frac{1}{\alpha} = -\log_2 \alpha. Hence the above equation becomes,

\frac{d H(X)}{d\alpha} = \frac{d}{d\alpha}\left[ -\alpha \log_2 \alpha \right] + \frac{d}{d\alpha}\left[ (1-\alpha) \left\{ 1 - \log_2 (1-\alpha) \right\} \right]

The first term gives -\log_2 \alpha - \log_2 e, and the second term gives -1 + \log_2 (1-\alpha) + \log_2 e. Hence,

\frac{d H(X)}{d\alpha} = -\log_2 \alpha - \log_2 e - 1 + \log_2 (1-\alpha) + \log_2 e
  = -\log_2 \alpha + \log_2 \left( \frac{1-\alpha}{2} \right)

The above derivative is zero when H(X) is maximum. Hence \frac{d H(X)}{d\alpha} = 0 gives,

0 = -\log_2 \alpha + \log_2 \left( \frac{1-\alpha}{2} \right)
\log_2 \alpha = \log_2 \left( \frac{1-\alpha}{2} \right)
\alpha = \frac{1-\alpha}{2}
or 2\alpha = 1 - \alpha \Rightarrow \alpha = \frac{1}{3}

Thus at \alpha = \frac{1}{3}, H(X) is maximum. This maximum value will be,

H_{max}(X) = \frac{1}{3} \log_2 3 + \frac{2}{3} \log_2 3
           = \log_2 3 = 1.5849
(ii) To determine the transition matrix :
The given channel diagram is redrawn in Fig. 1.9.8 with the transition probability values P(y_1/x_1) = 0.6, P(y_2/x_1) = 0.4, P(y_2/x_2) = 1, P(y_1/x_3) = 0.2 and P(y_2/x_3) = 0.8.

Fig. 1.9.8 Channel diagram with transition probability values

Based on the above diagram, the transition probability matrix can be written as follows :

P = \begin{bmatrix} P(y_1/x_1) & P(y_2/x_1) \\ P(y_1/x_2) & P(y_2/x_2) \\ P(y_1/x_3) & P(y_2/x_3) \end{bmatrix} = \begin{bmatrix} 0.6 & 0.4 \\ 0 & 1 \\ 0.2 & 0.8 \end{bmatrix}

This is the 3 x 2 channel transition matrix.

(iii) To obtain the maximum value of H(Y) :
First we have to obtain the probabilities of the output, i.e.,

[P(y_1) \quad P(y_2)] = [P(x_1) \quad P(x_2) \quad P(x_3)] \begin{bmatrix} 0.6 & 0.4 \\ 0 & 1 \\ 0.2 & 0.8 \end{bmatrix}
  = \left[ \alpha \quad \frac{1-\alpha}{2} \quad \frac{1-\alpha}{2} \right] \begin{bmatrix} 0.6 & 0.4 \\ 0 & 1 \\ 0.2 & 0.8 \end{bmatrix}
  = \left[ 0.6\alpha + 0 + 0.2 \cdot \frac{1-\alpha}{2} \quad\quad 0.4\alpha + \frac{1-\alpha}{2} + 0.8 \cdot \frac{1-\alpha}{2} \right]

i.e. P(y_1) = 0.1 + 0.5\alpha and P(y_2) = 0.9 - 0.5\alpha.

Hence the entropy of the output becomes,

H(Y) = P(y_1) \log_2 \frac{1}{P(y_1)} + P(y_2) \log_2 \frac{1}{P(y_2)}
     = -P(y_1) \log_2 P(y_1) - P(y_2) \log_2 P(y_2)
     = -(0.1 + 0.5\alpha) \log_2 (0.1 + 0.5\alpha) - (0.9 - 0.5\alpha) \log_2 (0.9 - 0.5\alpha)
Now the maximum value can be obtained by differentiating H(Y) with respect to \alpha and equating it to zero, i.e.,

\frac{d H(Y)}{d\alpha} = -\frac{d}{d\alpha}\left\{ (0.1+0.5\alpha) \log_2 (0.1+0.5\alpha) \right\} - \frac{d}{d\alpha}\left\{ (0.9-0.5\alpha) \log_2 (0.9-0.5\alpha) \right\}
  = -\left\{ 0.5 \log_2 (0.1+0.5\alpha) + 0.5 \log_2 e \right\} - \left\{ -0.5 \log_2 (0.9-0.5\alpha) - 0.5 \log_2 e \right\}
  = -\left\{ 0.5 \log_2 (0.1+0.5\alpha) - 0.5 \log_2 (0.9-0.5\alpha) \right\}

For the maximum, \frac{d H(Y)}{d\alpha} = 0 gives,

0 = -\left\{ 0.5 \log_2 (0.1+0.5\alpha) - 0.5 \log_2 (0.9-0.5\alpha) \right\}
\log_2 (0.1+0.5\alpha) = \log_2 (0.9-0.5\alpha)
0.1 + 0.5\alpha = 0.9 - 0.5\alpha
\alpha = 0.8

And H(Y) at \alpha = 0.8 will be,

H(Y) = -(0.1 + 0.5 \times 0.8) \log_2 (0.1 + 0.5 \times 0.8) - (0.9 - 0.5 \times 0.8) \log_2 (0.9 - 0.5 \times 0.8)
     = -0.5 \log_2 0.5 - 0.5 \log_2 0.5
     = -\log_2 0.5 = 1

Thus the maximum entropy of Y is H(Y) = 1 and it occurs at \alpha = 0.8.
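A quick numerical cross-check of this result (not from the book):

# Sweep alpha and locate the maximum of the output entropy H(Y) for this channel.
import math

def H_Y(alpha):
    p1 = 0.1 + 0.5 * alpha
    p2 = 0.9 - 0.5 * alpha
    return -(p1 * math.log2(p1) + p2 * math.log2(p2))

alphas = [i / 1000 for i in range(1001)]
best = max(alphas, key=H_Y)
print(f"H(Y) is maximum at alpha = {best:.2f}, H(Y) = {H_Y(best):.4f}")   # alpha = 0.8, H(Y) = 1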
1.10 Mutual Information
Mutual information is defined as the amount of information transferred when x_i is transmitted and y_j is received. It is represented by I(x_i, y_j) and given as,

I(x_i, y_j) = \log_2 \frac{P(x_i / y_j)}{P(x_i)} bits    ... (1.10.1)

Here I(x_i, y_j) is the mutual information,
P(x_i / y_j) is the conditional probability that x_i was transmitted given that y_j is received, and
P(x_i) is the probability of symbol x_i for transmission.

The average mutual information is represented by I(X; Y). It is calculated in bits/symbol. The average mutual information is defined as the amount of source information gained per received symbol. Note that average mutual information is different from entropy. It is given as,

I(X; Y) = \sum_{i} \sum_{j} P(x_i, y_j) I(x_i, y_j)

Thus I(x_i, y_j) is weighted by the joint probabilities P(x_i, y_j) over all the possible joint events. Putting for I(x_i, y_j) from equation 1.10.1 we get,

I(X; Y) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{P(x_i / y_j)}{P(x_i)}    ... (1.10.2)
1.10.1 Properties of Mutual Information
The mutual information has the following properties :
i) The mutual information of the channel is symmetric, i.e.,
   I(X; Y) = I(Y; X)
ii) The mutual information can be expressed in terms of the entropies of the channel input or output and the conditional entropies, i.e.,
   I(X; Y) = H(X) - H(X/Y)
           = H(Y) - H(Y/X)
   Here H(X/Y) and H(Y/X) are conditional entropies.
iii) The mutual information is always positive (non-negative), i.e.,
   I(X; Y) \geq 0
iv) The mutual information is related to the joint entropy H(X, Y) by the following relation :
   I(X; Y) = H(X) + H(Y) - H(X, Y)
These properties are proved in the examples that follow; a short numerical check is also given below.
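A numerical check of these properties (the joint distribution below is an assumed illustration, not from the text):

# Verify I(X;Y) = H(X) + H(Y) - H(X,Y) and that it agrees with the definition.
import numpy as np

P_XY = np.array([[0.30, 0.10],     # P(x_i, y_j)
                 [0.15, 0.45]])

P_X, P_Y = P_XY.sum(axis=1), P_XY.sum(axis=0)

def H(p):
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

H_X, H_Y, H_XY = H(P_X), H(P_Y), H(P_XY.ravel())
I_1 = H_X + H_Y - H_XY                                   # property (iv)
I_2 = (P_XY * np.log2(P_XY / np.outer(P_X, P_Y))).sum()  # definition, eq. 1.10.2

print(f"I(X;Y) = {I_1:.4f} = {I_2:.4f} >= 0")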
Example 1.10.1 : Prove that the mutual information of the channel is symmetric, i.e.,
I(X; Y) = I(Y; X)

Solution : Let us consider some standard relationships from probability theory. These are as follows :

P(x_i, y_j) = P(x_i / y_j) P(y_j)    ... (1.10.3)
and
P(x_i, y_j) = P(y_j / x_i) P(x_i)    ... (1.10.4)

Here P(x_i, y_j) is the joint probability that x_i is transmitted and y_j is received,
P(x_i / y_j) is the conditional probability that x_i was transmitted given that y_j is received,
P(y_j / x_i) is the conditional probability that y_j is received given that x_i is transmitted,
P(x_i) is the probability of symbol x_i for transmission, and
P(y_j) is the probability that symbol y_j is received.

From equation 1.10.3 and equation 1.10.4 we can write,

P(x_i / y_j) P(y_j) = P(y_j / x_i) P(x_i)

\frac{P(x_i / y_j)}{P(x_i)} = \frac{P(y_j / x_i)}{P(y_j)}    ... (1.10.5)

The average mutual information is given by equation 1.10.2 as,

I(X; Y) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{P(x_i / y_j)}{P(x_i)}    ... (1.10.6)

Hence we can write I(Y; X) as follows :

I(Y; X) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{P(y_j / x_i)}{P(y_j)}    ... (1.10.7)

From equation 1.10.5 the above equation can be written as,

I(Y; X) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{P(x_i / y_j)}{P(x_i)}
        = I(X; Y) from equation 1.10.6

Thus I(X; Y) = I(Y; X)    ... (1.10.8)

Thus the mutual information of the discrete memoryless channel is symmetric.
Example 1.10.2 : Prove the following relationships :
I(X; Y) = H(X) - H(X/Y)
I(X; Y) = H(Y) - H(Y/X)

Solution : Here H(X/Y) is the conditional entropy and it is given as,

H(X/Y) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(x_i / y_j)}    ... (1.10.9)

H(X/Y) is the information or uncertainty in X after Y is received. In other words H(X/Y) is the information lost in the noisy channel. It is the average conditional self information.

Consider equation 1.10.2,

I(X; Y) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{P(x_i / y_j)}{P(x_i)}

Let us write the above equation as,

I(X; Y) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(x_i)} - \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(x_i / y_j)}

From equation 1.10.9, the above equation can be written as,

I(X; Y) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(x_i)} - H(X/Y)    ... (1.10.10)

Here let us use the standard probability relation,

\sum_{j} P(x_i, y_j) = P(x_i)

Hence equation 1.10.10 will be,

I(X; Y) = \sum_{i} P(x_i) \log_2 \frac{1}{P(x_i)} - H(X/Y)    ... (1.10.11)

The first term of the above equation represents entropy, i.e.,

H(X) = \sum_{i} P(x_i) \log_2 \frac{1}{P(x_i)}    ... (1.10.12)

Hence equation 1.10.11 becomes,

I(X; Y) = H(X) - H(X/Y)    ... (1.10.13)

Here note that I(X; Y) is the average information transferred per symbol across the channel. It is equal to the source entropy minus the information lost in the noisy channel, as given by the above equation.
Similarly, consider the average mutual information I(Y; X) of equation 1.10.7,

I(Y; X) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{P(y_j / x_i)}{P(y_j)}
        = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(y_j)} - \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(y_j / x_i)}    ... (1.10.14)

The conditional entropy H(Y/X) is given as,

H(Y/X) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(y_j / x_i)}    ... (1.10.15)

Here H(Y/X) is the uncertainty in Y when X was transmitted. With this result, equation 1.10.14 becomes,

I(Y; X) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{1}{P(y_j)} - H(Y/X)    ... (1.10.16)

Here let us use the standard probability equation,

\sum_{i} P(x_i, y_j) = P(y_j)    ... (1.10.17)

Hence equation 1.10.16 becomes,

I(Y; X) = \sum_{j} P(y_j) \log_2 \frac{1}{P(y_j)} - H(Y/X)

The entropy is given by equation 1.4.6, hence the first term of the above equation represents H(Y). Hence the above equation becomes,

I(Y; X) = H(Y) - H(Y/X)    ... (1.10.18)

Note that the above result is similar to that of equation 1.10.13.
Example 1.10.3 : Prove that the mutual information is always positive (non-negative), i.e.,
I(X; Y) \geq 0

Solution : Mutual information is given by equation 1.10.2 as,

I(X; Y) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{P(x_i / y_j)}{P(x_i)}    ... (1.10.19)

From equation 1.10.3, P(x_i / y_j) = \frac{P(x_i, y_j)}{P(y_j)}. Putting this value of P(x_i / y_j) in equation 1.10.19,

I(X; Y) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{P(x_i, y_j)}{P(x_i) P(y_j)}

We know that \log_2 x can be written as -\log_2 \frac{1}{x}. Hence the above equation becomes,

-I(X; Y) = \sum_{i} \sum_{j} P(x_i, y_j) \log_2 \frac{P(x_i) P(y_j)}{P(x_i, y_j)}    ... (1.10.20)

Earlier we derived the result given by equation 1.4.11. It states that,

\sum_{k} p_k \log_2 \left( \frac{q_k}{p_k} \right) \leq 0

This result can be applied to equation 1.10.20. We can consider p_k to be P(x_i, y_j) and q_k to be P(x_i) P(y_j). Both p_k and q_k are probability distributions on the same alphabet. Then equation 1.10.20 becomes,

-I(X; Y) \leq 0
i.e. I(X; Y) \geq 0    ... (1.10.21)

The above equation is the required proof. It says that the mutual information is always non-negative.
Example 1.10.4 : Prove the following :
I(X; Y) = H(X) + H(Y) - H(X, Y)

Solution : In Example 1.9.1 we derived the following relation :

H(X, Y) = H(X/Y) + H(Y)
H(X/Y) = H(X, Y) - H(Y)    ... (1.10.22)

Mutual information is given by equation 1.10.13, i.e.,

I(X; Y) = H(X) - H(X/Y)

Putting for H(X/Y) in the above equation from equation 1.10.22,

I(X; Y) = H(X) + H(Y) - H(X, Y)    ... (1.10.23)

Thus the required relation is proved.
Example 1.10.5 : Consider the binary symmetric channel shown in Fig. 1.10.1. Calculate H(X), H(Y), H(Y/X) and I(X; Y).

Fig. 1.10.1 Binary symmetric channel with P(x_1) = p, P(x_2) = 1 - p and transition (error) probability \alpha

Solution : Here for the BSC, the various probabilities are as follows :

P(x_1) = p and P(x_2) = 1 - p
P(y_1 / x_2) = P(y_2 / x_1) = \alpha
P(y_1 / x_1) = P(y_2 / x_2) = 1 - \alpha
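A numerical sketch of these four quantities (with assumed illustrative values p = 0.6 and \alpha = 0.1, which are not specified in the statement) is shown below; it uses the relation I(X; Y) = H(Y) - H(Y/X) from Section 1.10.1.

# H(X), H(Y), H(Y/X) and I(X;Y) for a binary symmetric channel (assumed values).
import math

p, alpha = 0.6, 0.1

P_X = [p, 1 - p]
# Channel matrix of the BSC: P(y_j / x_i)
P_Y_given_X = [[1 - alpha, alpha],
               [alpha, 1 - alpha]]

P_XY = [[P_X[i] * P_Y_given_X[i][j] for j in range(2)] for i in range(2)]
P_Y  = [sum(P_XY[i][j] for i in range(2)) for j in range(2)]

def H(dist):
    return sum(q * math.log2(1 / q) for q in dist if q > 0)

H_X = H(P_X)
H_Y = H(P_Y)
H_Y_given_X = sum(P_XY[i][j] * math.log2(1 / P_Y_given_X[i][j])
                  for i in range(2) for j in range(2))
I_XY = H_Y - H_Y_given_X            # property (ii) of Section 1.10.1

print(f"H(X) = {H_X:.4f}, H(Y) = {H_Y:.4f}, H(Y/X) = {H_Y_given_X:.4f}, I(X;Y) = {I_XY:.4f}")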
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
book.aa
You have either reached a page that is unavailable for viewing or reached your viewing limit for this
© Information Entropy Fundamentals
Uncertainty - Information and entropy - Source coding theorem - Huffman coding - Shannon-Fano coding - Discrete
memoryless channels - Channel capacity - Channel coding theorem - Channel capacity theorem.
© Data and Voice Coding
Differential pulse code modulation - Adaptive differential pulse code modulation - Adaptive subband coding - Delta
modulation - Adaptive delta modulation - Coding of speech signal at low bit rates (Vocoders, LPC).
© Error Control Coding
Linear block codes - Syndrome decoding - Minimum distance consideration - Cyclic codes - Generator polynomial -
Parity check polynomial - Encoder for cyclic codes - Calculation of syndrome - Convolutional codes.
© Compression Techniques
Principles - Text compression - Static Huffman coding - Dynamic Huffman coding - Arithmetic coding - Image
compression - Graphics interchange format - Tagged image file format - Digitized documents - Introduction to JPEG
standards.
© Audio and Video Coding
Linear predictive coding - Code excited LPC - Perceptual coding, MPEG audio coders - Dolby audio coders - Video
compression - Principles - Introduction to H.261 and MPEG video standards.
First Edition : 2009
Price INR 250/-
ISBN 81-89411-69-1
ISBN 978-81-89411-69-5
Technical Publications Pune