Chap_07
Digital Communications
ECE 6640 2
Sklar’s Communications System
Notes and figures are based on or taken from materials in the course textbook:
Bernard Sklar, Digital Communications: Fundamentals and Applications, 2nd ed., Prentice Hall PTR, 2001.
Signal Processing Functions
Waveform Coding Structured Sequences
• Structured Sequences:
– Transforming waveforms into “better” waveform representations
that contain redundant bits
– Use redundancy for error detection and correction
• Block Codes are memoryless
• Convolutional Codes have memory!
Convolutional Encodings
Convolutional Encoder Diagram
• Each message, mi, may be a
k-tuple (or k = 1, a single bit)
• K messages are in the
encoder
• For each message input, an
n-tuple is generated
• The code rate is k/n
Proakis Convolutional Encoder
Connection Representation
• k=1, n=3
• Generator Polynomials
– G1 = 1 + X + X^2
– G2 = 1 + X^2
• To end a message, K-1 “zero” messages are transmitted.
This allows the encoder to be flushed.
– the effective code rate then differs from k/n: for a message of L input
k-tuples, the actual rate is (k·L) / (n·(L + K − 1))
– this arrangement is called a zero-tailed encoder
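The zero-tail flushing and effective-rate calculation above can be sketched in a few lines of Python. This is a minimal sketch, not code from the text; the function name conv_encode and the list-based shift register are illustrative.

```python
# Sketch of a rate-1/2, K=3 convolutional encoder using the generator
# polynomials above: G1 = 1 + X + X^2 (taps 1 1 1) and G2 = 1 + X^2 (taps 1 0 1).
def conv_encode(msg, gens=((1, 1, 1), (1, 0, 1))):
    K = len(gens[0])                  # constraint length
    reg = [0] * K                     # shift register; reg[0] holds the newest bit
    out = []
    for bit in msg + [0] * (K - 1):   # K-1 trailing zeros flush the encoder
        reg = [bit] + reg[:-1]
        for g in gens:                # one output bit per generator polynomial
            out.append(sum(gi * ri for gi, ri in zip(g, reg)) % 2)
    return out

msg = [1, 0, 1]
code = conv_encode(msg)               # 10 code bits for 3 message bits
# effective rate: k*L info bits over n*(L + K - 1) code bits
rate_eff = len(msg) / len(code)       # 3/10 here, below the nominal 1/2
```

The zero tail costs rate: 3 message bits produce 10 code bits, so the effective rate 3/10 is below the nominal 1/2, as the slide notes.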
Impulse Response of the Encoder
• allow a single “1” to transition through the K stages
– 100 -> 11
– 010 -> 10
– 001 -> 11
– 000 -> 00
• If the input message were 1 0 1
– 1 11 10 11
– 0 00 00 00
– 1 11 10 11
– Bsum 11 10 00 10 11
– Bsum is the transmitted n-tuple sequence …. if a 2 zero tail follows
– The sequence/summation involves superposition, i.e., linear (mod-2) addition.
• The impulse response of one k-tuple sums with the impulse responses of
successive k-tuples!
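The superposition property above can be checked numerically. A minimal sketch, assuming the K = 3, rate-1/2 generators (111, 101) of this example; conv_encode is an illustrative helper, not from the text.

```python
# The code sequence for m = 1 0 1 equals the mod-2 sum of the time-shifted
# impulse responses of its individual "1" inputs (linearity of the encoder).
def conv_encode(msg, gens=((1, 1, 1), (1, 0, 1))):
    K = len(gens[0])
    reg, out = [0] * K, []
    for bit in msg + [0] * (K - 1):          # 2-zero tail flushes the encoder
        reg = [bit] + reg[:-1]
        out += [sum(g * r for g, r in zip(gen, reg)) % 2 for gen in gens]
    return out

a = conv_encode([1, 0, 0])                    # impulse response at t = 0
b = conv_encode([0, 0, 1])                    # impulse response shifted to t = 2
summed = [(x + y) % 2 for x, y in zip(a, b)]  # superposition (mod-2 sum)
assert summed == conv_encode([1, 0, 1])       # 11 10 00 10 11, as tabulated above
```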
Convolutional Encoding the Message
Proakis (3,1), rate 1/3, K=3 Pictorial
• Generator Polynomials
– G1 = 1
– G2 = 1 + X^2
– G3 = 1 + X + X^2
Solid Lines
are 0 inputs
Dashed Lines
are 1 inputs
John G. Proakis, Digital Communications, 4th ed., McGraw Hill, 2001. ISBN: 0-07-232111-3.
Tree Diagram
Proakis (3,1) K=3
Polynomial and Tree
g1 = [1 0 0]
g2 = [1 0 1]
g3 = [1 1 1]
Trellis Diagram
A more complicated example follows
Proakis (3,2) K=2
Pictorial
Proakis (3,2) K=2
State Diagram
Solid Lines
are 0 inputs
Dashed Lines
are 1 inputs
Proakis (3,2) K=2
Trellis Diagram
Encoding
Decoding Convolutional Codes
ECE 5820 MAP and ML
P(X = x | Y = y) = P(Y = y | X = x) · P(X = x) / P(Y = y)
ECE 5820 Markov Process
– where Zi is the ith branch of the received sequence Z, zji is the jth
code symbol of Zi and similarly for U and u.
ML Computed Using Logs
Channel Models:
Hard vs. Soft Decisions
• Our previous symbol determinations selected a detected symbol with
no other considerations … a hard decision.
• The decision had computed metrics that were used to make the
determination that were then discarded.
• What if the relative certainty of a decision were maintained along with
the decision itself?
– if one decision influences another, hard decisions keep that certainty
from being used.
– maintaining a soft decision may allow higher overall decision accuracy
when such interactions exist.
ML in Binary Symmetric Channels
P(Z | U(m)) = p^dm · (1 − p)^(L − dm)
– dm is the Hamming distance between Z and U(m); L is the sequence length
log P(Z | U(m)) = dm · log( p / (1 − p) ) + L · log(1 − p)
– The constant L·log(1 − p) is identical for all possible U(m) and can be pre-computed
– The log of the probability ratio is also a constant, so the metric takes the form
log P(Z | U(m)) = −dm · A + B, with A = log( (1 − p)/p ) and B = L · log(1 − p)
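As a numerical illustration of the BSC metric above, maximizing log P(Z | U(m)) over candidate codewords is equivalent to minimizing the Hamming distance dm. This is a sketch; the helper name log_likelihood and the sample vectors are illustrative.

```python
import math

# BSC log-likelihood: log P(Z|U) = dm*log(p/(1-p)) + L*log(1-p) = -A*dm + B,
# where dm is the Hamming distance between received Z and candidate U.
def log_likelihood(Z, U, p):
    dm = sum(z != u for z, u in zip(Z, U))
    return dm * math.log(p / (1 - p)) + len(Z) * math.log(1 - p)

Z = [1, 1, 0, 1]
candidates = [[1, 1, 1, 1], [1, 1, 0, 1], [0, 0, 0, 0]]
best = max(candidates, key=lambda U: log_likelihood(Z, U, p=0.1))
assert best == [1, 1, 0, 1]   # the minimum-Hamming-distance candidate wins
```

Since A = log((1 − p)/p) > 0 for p < 0.5, the likelihood falls as dm grows, so ML decoding reduces to a minimum-distance search.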
Viterbi Decoder Trellis
encoder trellis
Viterbi Example
• m: 1 1 0 1 1
• U: 11 01 01 00 01
• Z: 11 01 01 10 01
merging
paths
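A compact hard-decision Viterbi decoder for this example can be sketched as follows. The helper names are illustrative; the code is the rate-1/2, K = 3 code with G1 = 1 + X + X^2 and G2 = 1 + X^2 used above.

```python
# Hard-decision Viterbi sketch decoding Z = 11 01 01 10 01 (one channel
# error relative to U = 11 01 01 00 01) back to m = 1 1 0 1 1.
def step_out(state, bit):
    """Next state and 2-bit branch output; state = (s1, s2), s1 the newer bit."""
    s1, s2 = state
    out = ((bit + s1 + s2) % 2, (bit + s2) % 2)   # G1 = 1+X+X^2, G2 = 1+X^2
    return (bit, s1), out

def viterbi(Z):
    # survivor per state: state -> (path metric, decoded bits so far)
    paths = {(0, 0): (0, [])}
    for z in Z:
        nxt = {}
        for state, (metric, bits) in paths.items():
            for bit in (0, 1):
                ns, out = step_out(state, bit)
                m = metric + sum(a != b for a, b in zip(out, z))
                if ns not in nxt or m < nxt[ns][0]:   # add-compare-select
                    nxt[ns] = (m, bits + [bit])
        paths = nxt
    return min(paths.values())[1]                     # best-metric survivor

Z = [(1, 1), (0, 1), (0, 1), (1, 0), (0, 1)]
m_hat = viterbi(Z)                                    # [1, 1, 0, 1, 1]
```

Despite the single channel error in Z, the minimum-metric survivor recovers the original message, illustrating the merging-paths behavior in the trellis.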
Add Compare Select
Viterbi Decoding Implementation
• Section 7.3.5.1, p. 406
Possible Connections
Add Compare Select
Viterbi Decoding Implementation
• State Metric Update based on new Branch Metric Values
– Hard decoding uses a bit-difference (Hamming distance) measure
– Soft decoding uses rms (Euclidean) distances between actual and expected
branch values
– The minimum path value is maintained after comparing incoming paths.
– Paths that are not maintained are eliminated.
• When all remaining paths use the same branch, update the output
sequence
• Path history does not have to go back to the beginning anymore …
MATLAB
– t2 = poly2trellis([3],[7 7 5])
– t2.outputs
– t2.nextStates
MATLAB Simulations
• Communication Objects.
– see ViterbiComm directory for demos
– TCM Modulation
• comm.PSKTCMModulator
• comm.RectangularQAMTCMModulator
• comm.GeneralQAMTCMModulator
– Convolutional Coding
• comm.ConvolutionalEncoder
• comm.ViterbiDecoder (Hard and Soft)
Properties of Convolutional Codes
• Distance Properties
– If an all zero sequence is input and there is a bit error, how and
how long will it take to return to an all zeros path?
– Find the “minimum free distance”
• The number of code bit errors required before returning
• Note that this is not time steps and not states moved through
• This determines the error correction capability
t = ⌊ (df − 1) / 2 ⌋
Computing Distance Caused by a One
Computing Distance, Number of Branches,
and Branch Transition caused by a One
• Split the state diagram to start at 00.. and end at 0..
– Show state transitions with the following notations
– D: code bit errors for a path
– L: one factor for every branch
– N: one factor for every branch taken due to a “1” input
• Define the state equations using the state diagram
– Determine the result with the smallest power of D and interpret
– See Figure 7.18 and page p. 412
Computing Distance, Number of Branches,
and Branch Transition caused by a One
Interpretation:
N = 1: branch transitions caused by a 1 input
L = 3: number of branches traversed
D = 5: number of 1 outputs (the Hamming weight of the error path)
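The transfer-function result can be cross-checked by brute force. The sketch below is an exhaustive search (not the state-equation method of the slide), assuming the rate-1/2, K = 3 code with generators 111 and 101: it enumerates paths that leave the all-zeros state and first return to it.

```python
from itertools import product

def run(bits):
    """Simulate the (111, 101) encoder; return output weight and state sequence."""
    s1 = s2 = 0
    weight, states = 0, []
    for b in bits:
        weight += (b + s1 + s2) % 2 + (b + s2) % 2   # G1 and G2 output bits
        s1, s2 = b, s1                               # shift the register
        states.append((s1, s2))
    return weight, states

best = None
for length in range(1, 9):
    for tail in product((0, 1), repeat=length - 1):
        bits = (1,) + tail                 # first input 1 leaves the zero state
        w, states = run(bits)
        # keep only "first return" paths: end at 00 without revisiting 00 earlier
        if states[-1] == (0, 0) and (0, 0) not in states[:-1]:
            cand = (w, length, sum(bits))  # (D, L, N) for this path
            best = cand if best is None or cand < best else best

# best == (5, 3, 1): minimum free distance D = 5, L = 3 branches, N = 1 input one
```

The search confirms the interpretation above: the lowest-power term of T(D, L, N) corresponds to a weight-5 path of three branches caused by a single 1 input.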
Performance Bounds
• Upper bound on bit error probability:
PB ≤ dT(D, N)/dN |_(N=1, D=2√(p(1−p)))
– for Figure 7.18 and Eq. 7.15 on p. 412:
T(D, N) = D^5·N / (1 − 2DN)
dT(D, N)/dN = D^5 / (1 − 2DN)^2
– evaluating at N = 1 and D = 2√(p(1−p)):
PB ≤ D^5 / (1 − 2D)^2 = ( 2√(p(1−p)) )^5 / ( 1 − 4√(p(1−p)) )^2
Performance Bounds
• For a code of rate r = k/n, the energy per code symbol is
Ec/N0 = r·Eb/N0 = (k/n)·Eb/N0
• For the rate-1/2, df = 5 example code with soft decisions:
PB ≤ Q(√(5·Eb/N0)) · exp(5Eb/2N0) · dT(D, N)/dN |_(N=1, D=exp(−Eb/2N0))
= Q(√(5·Eb/N0)) / ( 1 − 2·exp(−Eb/2N0) )^2
Coding Gain Bounds
• From Eq. 6.19:
G (dB) = (Eb/N0)_uncoded (dB) − (Eb/N0)_coded (dB), for the same value of Pb
• This is bounded by
– the 10·log base 10 of the product of the code rate and the minimum free distance:
G (dB) ≤ 10·log10( r·df )
• Coding Gains are shown in Tables 7.2 and 7.3, p. 417
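For the rate-1/2, df = 5 example code, this bound works out to about 4 dB. A quick sketch; the function name is illustrative.

```python
import math

# Coding-gain upper bound: G(dB) <= 10*log10(r * d_f).
def coding_gain_bound_db(r, d_free):
    return 10 * math.log10(r * d_free)

g = coding_gain_bound_db(0.5, 5)       # 10*log10(2.5), about 3.98 dB
```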
Proakis Error Bounds (1)
• Pairwise error probability for a path at Hamming distance d (hard decisions, d odd):
P2(d) = Σ_(k=(d+1)/2 .. d) C(d, k) · p^k · (1 − p)^(d−k)
– for d even, add the tie term (1/2)·C(d, d/2)·p^(d/2)·(1 − p)^(d/2)
• Union bound over the code's weight spectrum {a_d}:
Pe ≤ Σ_(d=dfree .. ∞) a_d · P2(d)
John G. Proakis, Digital Communications, 5th ed., McGraw Hill, 2008. ISBN: 978-0-07-295716-6.
Soft Decision Viterbi
Other Decoding Methods
Sequential Decoding
• Complexity
– Viterbi grows exponentially with constraint length
– Sequential is independent of the constraint length
• Can have buffer memory problems at low SNR (many trials)
Feedback Decoding
References
• http://home.netcom.com/~chip.f/viterbi/tutorial.html
• http://www.eccpage.com/
Practical Considerations (1)
Practical Considerations (2)
• Two important issues in the implementation of Viterbi decoding are
1. The effect of path memory truncation, which is a desirable feature that ensures a
fixed decoding delay.
2. The degree of quantization of the input signal to the Viterbi decoder.
• As a rule of thumb, we stated that path memory truncation to about five
constraint lengths has been found to result in negligible performance loss.
• In addition to path memory truncation, the computations were performed with
eight-level (three bits) quantized input signals from the demodulator.
Practical Considerations (3)
MATLAB
• ViterbiComm
– poly2trellis: define the convolutional code and trellis to be used
– istrellis: checking that the trellis structure is valid
– distspec: computes the free distance and the first N components of
the weight and distance spectra of a linear convolutional code.
– comm.ConvolutionalEncoder
– quantiz: returns a quantization index and a quantized output value,
allowing either hard or soft decoder inputs
– comm.ViterbiDecoder – either hard or soft decoding
– bercoding
– Viterbi_Hard.m
– Viterbi_Soft.m
Supplemental Information
Figure 8.1-2
Polynomial Representation
Example 8.1-1
Decoding Convolutional Codes
Transfer Function Example (1)
Transfer Function Example (2)
Transfer Function Example (3)
Augmenting the Transfer Function (1)
• The transfer function can be used to provide more detailed information
than just the distance of the various paths.
– Suppose we introduce a factor Y into all branch transitions caused by the
input bit 1. Thus, as each branch is traversed, the cumulative exponent on
Y increases by 1 only if that branch transition is due to an input bit 1.
– Furthermore, we introduce a factor of J into each branch of the state
diagram so that the exponent of J will serve as a counting variable to
indicate the number of branches in any given path from node a to node e.
Augmenting the Transfer Function (2)
• This form for the transfer functions gives the properties of all the paths
in the convolutional code.
– That is, the first term in the expansion of T (Y, Z, J ) indicates that the distance
d = 6 path is of length 3 and of the three information bits, one is a 1.
– The second and third terms in the expansion of T (Y, Z, J ) indicate that of the two
d = 8 terms, one is of length 4 and the second has length 5.
References (Conv. Codes)
• K. Larsen, "Short convolutional codes with maximal free distance for rates
1/2, 1/3, and 1/4 (Corresp.)," in IEEE Transactions on Information Theory,
vol. 19, no. 3, pp. 371-372, May 1973.
– http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1055014
• E. Paaske, "Short binary convolutional codes with maximal free distance for
rates 2/3 and 3/4 (Corresp.)," in IEEE Transactions on Information Theory,
vol. 20, no. 5, pp. 683-689, Sep 1974.
– http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1055264
• J. Conan, "The Weight Spectra of Some Short Low-Rate Convolutional
Codes," in IEEE Transactions on Communications, vol. 32, no. 9, pp. 1050-
1053, Sep 1984.
– http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1096180
• Jinn-Ja Chang, Der-June Hwang and Mao-Chao Lin, "Some extended results
on the search for good convolutional codes," in IEEE Transactions on
Information Theory, vol. 43, no. 5, pp. 1682-1697, Sep 1997.
– http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=623175