
Digital Communications (EECS-4214)

[Fall-2021]
Hina Tabassum
Department of Electrical Engineering and Computer Science
York University, Canada
Review of Probability Concepts Random Variables Random Processes

Week 0

Introduction to Course
Communication System
Analog vs Digital Communication
Transmitter
Channel
Receiver

(Hina Tabassum) 2 / 34

A Walk-through!


Analog vs Digital Sources

Analog Sources: a microphone actuated by speech, a TV camera scanning a scene.


Digital Sources: a teletype or the numerical output of a computer, which
produces a sequence of discrete symbols or letters.
Analog information is transformed into discrete information through the processes
of sampling and quantization.


Analog to Digital Conversion

Sampling: makes the signal discrete in time


Quantization: makes the signal discrete in amplitude
A good quantizer uses few bits while minimizing distortion.
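The two steps can be sketched numerically. A minimal example (sample rate, tone frequency, and bit depth are assumed values, not from the slides): a sine is sampled on a discrete time grid, then passed through a uniform mid-rise quantizer, and the mean-squared quantization error is measured.

```python
import numpy as np

# Sample a 5 Hz sine at fs = 100 Hz (above the Nyquist rate), quantize to 3 bits.
fs, f, n_bits = 100, 5, 3
t = np.arange(0, 1, 1 / fs)           # sampling: discrete in time
x = np.sin(2 * np.pi * f * t)         # analog-model signal evaluated at sample times

levels = 2 ** n_bits                  # 8 quantization levels
step = 2 / levels                     # uniform step over the range [-1, 1]
xq = step * (np.floor(x / step) + 0.5)  # quantization: discrete in amplitude

distortion = np.mean((x - xq) ** 2)   # mean-squared quantization error
```

With a uniform quantizer the per-sample error magnitude is at most half a step, so the distortion is bounded by (step/2)²; more bits shrink the step and hence the distortion.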


Source Encoder and Decoder

The Source Encoder converts the input (i.e., a symbol sequence) into a binary sequence
of 0's and 1's by assigning code words to the symbols in the input sequence.

The Source Decoder converts the binary output of the channel decoder back into a symbol
sequence at the receiver. The decoder for a system using fixed-length code words
is quite simple, but the decoder for a system using variable-length code words
is considerably more complex.
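A minimal sketch of the fixed-length case, using a hypothetical 4-symbol alphabet (the codebook and helper names are illustrative, not from the slides). With fixed-length words the decoder simply splits the bit stream every 2 bits, which is why it is so simple:

```python
# Hypothetical 4-symbol source: ceil(log2(4)) = 2 bits per symbol suffice.
codebook = {"a": "00", "b": "01", "c": "10", "d": "11"}
decodebook = {v: k for k, v in codebook.items()}

def encode(symbols):
    """Source encoder: symbol sequence -> binary string of 0's and 1's."""
    return "".join(codebook[s] for s in symbols)

def decode(bits, width=2):
    """Source decoder: fixed-length words, so just split every `width` bits."""
    return [decodebook[bits[i:i + width]] for i in range(0, len(bits), width)]

bits = encode(["b", "a", "d"])   # "010011"
```

A variable-length decoder, by contrast, cannot split at fixed positions and must walk the stream bit by bit, matching prefixes against the codebook.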


Channel Encoder and Decoder


Channel Encoder: Error control is accomplished by the channel coding operation,
which systematically adds extra bits to the output of the source coder.
These extra bits convey no information themselves but help the receiver detect
and/or correct some of the errors in the information-bearing bits.
Types of Channel Coding:
Block Coding: The encoder takes a block of 'k' information bits from
the source encoder and adds 'r' error control bits, where 'r' depends
on 'k' and the error control capability desired.
Convolutional Coding: The information-bearing message stream is
encoded in a continuous fashion by continuously interleaving
information bits and error control bits.
Channel Decoder: recovers the information-bearing bits from the coded binary
stream and performs error detection and correction.
The important parameters of a coder/decoder are: the method of coding, efficiency,
error control capability, and circuit complexity.
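The simplest block code that makes the k/r trade-off concrete is the (3, 1) repetition code (chosen here as an illustration, not taken from the slides): k = 1 information bit, r = 2 extra bits, and majority-vote decoding corrects any single bit error per block.

```python
def channel_encode(info_bits, r=2):
    """(k=1, r=2) repetition code: each info bit becomes r+1 identical coded bits."""
    return [b for b in info_bits for _ in range(r + 1)]

def channel_decode(coded_bits, n=3):
    """Majority vote over each n-bit block corrects up to one error per block."""
    return [int(sum(coded_bits[i:i + n]) > n // 2)
            for i in range(0, len(coded_bits), n)]

coded = channel_encode([1, 0, 1])   # [1,1,1, 0,0,0, 1,1,1]
coded[1] ^= 1                       # channel flips one bit in the first block
recovered = channel_decode(coded)   # majority vote repairs the error
```

The cost of this protection is rate: 3 channel bits per information bit, which is why practical systems use more efficient block and convolutional codes.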


Modulation

Modulation: converts digital data into a continuous waveform suitable for
transmission.


Amplitude Shift Keying (ASK)
Frequency Shift Keying (FSK)
Phase Shift Keying (PSK)
Demodulation: converts the received continuous waveform back into digital data.
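A minimal sketch of binary PSK, the simplest of the three keying schemes listed (sample rate, carrier frequency, and samples-per-bit are assumed values): each bit selects the carrier's phase (0 or π), and the demodulator correlates each bit interval with the carrier and reads the sign.

```python
import numpy as np

# BPSK: bit 1 -> +cos(2*pi*fc*t), bit 0 -> -cos(2*pi*fc*t)
fs, fc, spb = 1000, 50, 100          # sample rate, carrier, samples per bit
t = np.arange(spb) / fs
carrier = np.cos(2 * np.pi * fc * t)  # 5 full carrier cycles per bit interval

def modulate(bits):
    """Modulation: map each bit to a phase-shifted carrier segment."""
    return np.concatenate([(2 * b - 1) * carrier for b in bits])

def demodulate(wave):
    """Demodulation: correlate each bit interval with the carrier; sign -> bit."""
    chunks = wave.reshape(-1, spb)
    return [int(np.dot(c, carrier) > 0) for c in chunks]

rx = demodulate(modulate([1, 0, 1, 1]))
```

ASK and FSK follow the same pattern with the bit steering the carrier's amplitude or frequency instead of its phase.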


Mathematical Channel Models: Additive Channel

Additive noise channel


Attenuation, additive noise
r(t) = As(t) + n(t),
where
r(t) is the received signal,
A is the attenuation,
s(t) is the transmitted signal,
n(t) is the noise.
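The additive channel r(t) = As(t) + n(t) is straightforward to simulate; a sketch with assumed values for the attenuation and noise level (Gaussian noise is chosen as the standard assumption for n(t)):

```python
import numpy as np

rng = np.random.default_rng(0)

def additive_channel(s, A=0.5, noise_std=0.1):
    """r = A*s + n: attenuation plus additive Gaussian noise."""
    return A * s + rng.normal(0.0, noise_std, size=s.shape)

s = np.sin(2 * np.pi * np.arange(1000) / 100)   # transmitted signal s(t)
r = additive_channel(s)                          # received signal r(t)
```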


Mathematical Channel Models: LTI

Linear Time Invariant Filter Channels


Attenuation, additive noise, modification of frequency spectrum
due to filter characteristics of the channel
A linear filter changes only the magnitude and phase of the signal's frequency components.
The impulse response remains constant over the communication period.
Examples
Twisted Pair (∼ 100 kHz)
Coaxial (∼ 100 MHz)
Waveguide (∼ 100 GHz)
Choice when input data rate is 99 kHz?


Mathematical Channel Models: LTI

r(t) = As(t) ∗ h(t) + n(t),


where
r(t) is the received signal,
A is the attenuation,
s(t) is the transmitted signal,
h(t) is the impulse response,
n(t) is the noise.
In the frequency domain: R(f) = AS(f)H(f) + N(f).
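The time- and frequency-domain forms of the LTI model can be checked against each other numerically; a sketch with an assumed 3-tap impulse response, where np.convolve plays the role of ∗ and noise is omitted:

```python
import numpy as np

A = 0.8
h = np.array([1.0, 0.5, 0.25])        # assumed 3-tap channel impulse response
s = np.array([1.0, -1.0, 1.0, 1.0])   # short transmitted pulse sequence

r = A * np.convolve(s, h)             # time domain: r = A*(s * h), length 4+3-1 = 6

# Frequency domain: R(f) = A*S(f)*H(f); zero-pad so circular == linear convolution
n = len(r)
R_time = np.fft.fft(r)
R_freq = A * np.fft.fft(s, n) * np.fft.fft(h, n)
```

The two spectra agree exactly because the FFTs are zero-padded to the full linear-convolution length.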


Mathematical Channel Models: LTV

Linear Time Variant Filter Channel


The channel varies with time but is linear at any given time instant.
r(t) = As(t) ∗ h(t, τ) + n(t),
where
r(t) is the received signal,
A is the attenuation,
s(t) is the transmitted signal,
h(t, τ) is the time-varying impulse response and τ is the path delay,
n(t) is the noise.
Example


Week 1

1. Review of Probability Concepts

2. Random Variables

3. Random Processes


Axioms and Counting Methods

Given the sample space S and an event A, a probability function P(·)
assigns to each event A a real number such that
P(A) ≥ 0 for every event A
P(S) = 1
For countably many mutually exclusive events A1, A2, · · · , An,
P(A1 ∪ A2 ∪ · · · ∪ An) = P(A1) + P(A2) + · · · + P(An).

Permutations and Combinations are two important counting


methods.
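Python's standard library covers both counting methods directly; for example, with n = 5 and k = 2 (assumed illustrative values):

```python
from math import comb, perm

# Permutations: ordered selections of k items from n, P(n,k) = n!/(n-k)!
p = perm(5, 2)   # 5 * 4 = 20 ordered pairs

# Combinations: unordered selections, C(n,k) = n!/(k!(n-k)!) = P(n,k)/k!
c = comb(5, 2)   # 10 unordered pairs
```

Each combination corresponds to k! = 2 permutations, which is why p = 2c here.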


Example-1

Urn A has 5 red balls and 2 white balls. Urn B has 3 red balls and 2 white
balls. An urn is selected at random, each urn being equally likely, and 2
balls are drawn successively without replacement. Find the probability
that both balls drawn are white.
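A quick check of this problem with exact fractions: condition on the urn, multiply the two without-replacement draw probabilities, and combine by total probability.

```python
from fractions import Fraction as F

# Each urn is picked with probability 1/2, then two balls drawn without replacement.
# Urn A: 5 red, 2 white (7 balls); Urn B: 3 red, 2 white (5 balls).
p_a = F(1, 2) * F(2, 7) * F(1, 6)   # pick A, white, then white again
p_b = F(1, 2) * F(2, 5) * F(1, 4)   # pick B, white, then white again
p_two_white = p_a + p_b             # total probability: 1/42 + 1/20 = 31/420
```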


Properties, Conditional Probability, Bayes' Rule

Range of the probability of event A: 0 ≤ P(A) ≤ 1


Probability of the impossible event: P(∅) = 0
Probability of the complement of event A: P(Ā) = 1 − P(A)
Probability of the union of two events:

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

For two independent events:


P(A ∪ B) = P(A) + P(B) − P(A)P(B)


Properties, Conditional Probability, Bayes' Rule

Conditional Probability:

P(B|A) = P(A ∩ B) / P(A)

Bayes' Rule then relates the two conditional orders:

P(B|A) = P(A|B)P(B) / P(A)

Total Probability Theorem (for events A1, . . . , An that partition the sample space):

P(A) = P(A|A1)P(A1) + P(A|A2)P(A2) + · · · + P(A|An)P(An)


Example-2

Urn A: 5 Red, 6 Green, 2 White


Urn B: 3 Red, 3 Green, 4 White
Urn C: 6 Red, 2 Green, 1 White

(a) P(W|Urn A) =?

(b) P(Urn B| W) =?
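Both parts follow from the conditional-probability definitions above; a sketch in exact fractions, under the assumption (not stated on the slide) that each urn is selected with probability 1/3:

```python
from fractions import Fraction as F

# P(White | urn) = white balls / total balls per urn
p_w_given = {"A": F(2, 13), "B": F(4, 10), "C": F(1, 9)}

# (a) P(W | Urn A): read directly from the urn contents
p_w_a = p_w_given["A"]

# (b) Bayes' rule: P(Urn B | W) = P(W|B)P(B) / sum_i P(W|i)P(i)
p_w = sum(F(1, 3) * p for p in p_w_given.values())   # total probability of white
p_b_w = (F(1, 3) * p_w_given["B"]) / p_w
```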


Discrete Random Variable

If a random variable X can take values in a finite or countably infinite set, X is
said to be a discrete random variable.

P(Xi) ≥ 0
Σ_{i=1}^{∞} P(Xi) = 1


Example-3

Consider the experiment of rolling two dice. Let X represent the


total that shows on the upper faces of the two dice.

Find P(4 ≤ X ≤ 6)

Find P(X ≥ 5)

Sketch the PMF and CDF of X
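The two requested probabilities can be checked by enumerating all 36 equally likely outcomes; a sketch:

```python
from fractions import Fraction as F
from itertools import product

# All 36 equally likely (die1, die2) outcomes; X is their sum.
sums = [a + b for a, b in product(range(1, 7), repeat=2)]

p_4_to_6 = F(sum(4 <= s <= 6 for s in sums), 36)   # (3 + 4 + 5)/36 = 1/3
p_ge_5 = F(sum(s >= 5 for s in sums), 36)          # 1 - (1+2+3)/36 = 5/6
```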


Continuous Random Variable - PDF

If X is a continuous random variable:

FX(x) = P(X ≤ x) = ∫_{−∞}^{x} fX(t) dt

fX(x) ≥ 0

∫_{−∞}^{∞} fX(x) dx = 1

fX(x) = dFX(x)/dx

Physical Meaning of CDF and PDF


Continuous Random Variable - CDF

The CDF of X satisfies the following properties:

0 ≤ FX(x) ≤ 1, FX(∞) = 1, FX(−∞) = 0

P(a < X ≤ b) = FX(b) − FX(a) (Any other alternative?)

P(X > a) = 1 − P(X ≤ a) = 1 − FX(a)


Expectation and Moments

For a general function g(X) of a random variable X, its expectation
can be derived as follows:

E[g(X)] = ∫_{−∞}^{∞} g(x) fX(x) dx    (1)

Special Cases
g(X) = c
First moment of X: g(X) = X
Second moment of X: g(X) = X^2
nth moment of X: g(X) = X^n


Expectation and its Special Cases

For a general function g(X) of a random variable X, its expectation
can be derived as follows:

E[g(X)] = ∫_{−∞}^{∞} g(x) fX(x) dx    (2)

Special Cases
nth central moment of X: g(X) = (X − E[X])^n
Second central moment of X: g(X) = (X − E[X])^2 (Interpretation, second
definition)

How are the mean, variance, and standard deviation related to moments?


Can the expectation be computed using the CDF?


Example-4
Consider fX(x) = 2e^{−2x}, x > 0. Find the mean and standard
deviation.
Find P(X ≤ 3)

Find P(1 ≤ X ≤ 3)

If the mean and standard deviation are equal, which random variable


could it be?

Examples

Significance of the Gaussian distribution
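Recognizing fX(x) = 2e^{−2x} as an exponential density with rate λ = 2 (the distribution whose mean and standard deviation coincide, both equal to 1/λ), the requested quantities follow directly; a sketch:

```python
import math

lam = 2.0                 # rate of the exponential density fX(x) = lam*exp(-lam*x)
mean = 1 / lam            # E[X] = 1/2
std = 1 / lam             # mean == std is the signature of the exponential

def cdf(x):
    """CDF of the exponential: F_X(x) = 1 - exp(-lam*x) for x > 0."""
    return 1 - math.exp(-lam * x)

p_le_3 = cdf(3)           # P(X <= 3) = 1 - e^{-6}
p_1_to_3 = cdf(3) - cdf(1)  # P(1 <= X <= 3) = e^{-2} - e^{-6}
```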


Summary: Random Variables

A random variable X is characterized by:
its Probability Density Function (PDF);
its Cumulative Distribution Function (CDF);
its moments;
its Moment Generating Function (MGF) or Laplace Transform (LT). The
characteristic function of X is similar to the LT of X with s = jω.

Moments of X:
1st moment of X, i.e., the mean: E[X]
2nd moment of X: E[X^2]
nth moment of X: E[X^n]

Central moments of X (moments around the mean):
1st central moment: E[X − E[X]] = 0
2nd central moment, i.e., the variance: E[(X − E[X])^2];
the standard deviation of X is the square root of the variance
nth central moment: E[(X − E[X])^n]


Transformation of Random Variables

Single Random Variable Transformation


Direct Method
CDF Method

Multi-Variable Transformation (Jacobian Method)

Example: Y = 2X , fY (y) =?

Example: Z = X + Y , fZ (z) =?
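The CDF method can be sanity-checked by simulation. For Y = 2X, the method gives F_Y(y) = P(2X ≤ y) = F_X(y/2), hence f_Y(y) = f_X(y/2)/2. A Monte Carlo sketch with X assumed Exponential(1) purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# X ~ Exponential(1); by the CDF method, F_Y(y) = F_X(y/2) = 1 - exp(-y/2).
x = rng.exponential(1.0, size=200_000)
y = 2 * x

y0 = 3.0
empirical = np.mean(y <= y0)        # Monte Carlo estimate of F_Y(3)
analytic = 1 - np.exp(-y0 / 2)      # F_X(3/2) from the CDF method
```

The same simulation idea extends to Z = X + Y, whose density is the convolution of f_X and f_Y.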


Random Processes

A random process is a collection of random variables


indexed by time.

At any given time t0, the


samples across all realizations
constitute a random variable with PDF fX(x, t0).

If the PDF fX(x, t0) varies


over time, the process is a non-stationary
random process.


Strict Stationary Random Processes

Amplitude Domain: the PDF and CDF do not change over time.

Time Domain: the autocorrelation of the random process is


independent of absolute time and depends only on the delay τ between two
time instants:
RXX(t, τ) = RXX(τ)


Wide-Sense Stationary Random Processes

Amplitude Domain: the mean and variance do not change over


time.

Time Domain: the autocorrelation of the random process is


independent of absolute time and depends only on the delay τ between two
time instants:
RXX(t, τ) = RXX(τ)
Example


Ensemble and Time Averaging

Ensemble Averaging: by freezing time, we average over all


possible realizations:

E[g(X(t))]|_{t=t0} = ∫_{−∞}^{∞} g(x) fX(x, t0) dx

Time Averaging: averaging a single realization over time:

⟨g(X(t))⟩ = (1/T0) ∫_{T0} g(X(t)) dt


Ergodic Random Process

A random process is ergodic if its time average equals


its ensemble average:

E[X(t)]|_{t=t0} = ⟨X(t)⟩    (3)
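A classic process that is ergodic in the mean is a sinusoid with uniformly random phase, X(t) = cos(2πft + Θ) with Θ ~ Uniform[0, 2π): both the ensemble average and the time average are zero. A sketch comparing the two (signal parameters and sample counts are assumed):

```python
import numpy as np

rng = np.random.default_rng(2)

f = 1.0
t = np.linspace(0, 10, 10_000, endpoint=False)   # ten full periods

# Ensemble average: freeze time at t0 = 0.3 s, average over many phase realizations
thetas = rng.uniform(0, 2 * np.pi, size=100_000)
ensemble_mean = np.mean(np.cos(2 * np.pi * f * 0.3 + thetas))

# Time average: a single realization, averaged over an integer number of periods
theta0 = rng.uniform(0, 2 * np.pi)
time_mean = np.mean(np.cos(2 * np.pi * f * t + theta0))
```

Both averages come out (numerically) zero, matching Eq. (3); a process with, say, a random DC offset per realization would fail this test.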


Summary: Random Process

A random process can be classified by ergodicity, strict-sense
stationarity, and wide-sense stationarity.
Its autocorrelation (time domain) and power spectral density
(frequency domain) are related by the Wiener-Khintchine theorem.
Its statistics can be computed by ensemble averaging or time averaging.
Questions?
