CHAPTER 04 M.Sc. 2014
$$ f(y) = \begin{cases} 2y, & 0 \le y \le 1, \\ 0, & \text{elsewhere.} \end{cases} $$
The company is paid at the rate of Rs. 300 per ton for the refined sugar, but it also has a fixed overhead cost of Rs. 100 per day. Thus the daily profit, in hundreds of rupees, is U = 3Y − 1. Find the probability density function for U.
Example 02:
We considered the random variables Y1, the proportional amount of gasoline stocked at the beginning of a week, and Y2, the proportional amount of gasoline sold during the week. The joint density function of Y1 and Y2 is given by
$$ f(y_1, y_2) = \begin{cases} 3y_1, & 0 \le y_2 \le y_1 \le 1, \\ 0, & \text{elsewhere.} \end{cases} $$
Find the probability density function for U = Y1 − Y2, the proportional amount of gasoline remaining at the end of the week. Use the density function of U to find E(U).
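As a numerical sanity check on E(U), here is a minimal Monte Carlo sketch in Python (assuming NumPy is available); it samples (Y1, Y2) from the joint density above and estimates E(U) by averaging:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000

    # Marginal of Y1 is 3*y1**2 on (0, 1), i.e. F(y1) = y1**3,
    # so Y1 can be sampled by inverse CDF as V**(1/3) with V ~ Uniform(0, 1).
    y1 = rng.uniform(size=n) ** (1.0 / 3.0)
    # Given Y1 = y1, Y2 is conditionally Uniform(0, y1).
    y2 = y1 * rng.uniform(size=n)

    u = y1 - y2   # proportional amount of gasoline remaining
    print("estimated E(U):", u.mean())   # should be close to the analytic E(U)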
Example 03:
Let (Y1, Y2) denote a random sample of size n = 2 from the uniform distribution on the interval (0, 1). Find the probability density function for U = Y1 + Y2.
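A sketch of the answer: applying the distribution-function method (a convolution over the unit square) gives the triangular density
$$ f_U(u) = \begin{cases} u, & 0 \le u \le 1, \\ 2 - u, & 1 \le u \le 2, \\ 0, & \text{elsewhere,} \end{cases} $$
since $P(Y_1 + Y_2 \le u) = u^2/2$ for $0 \le u \le 1$ and $1 - (2-u)^2/2$ for $1 \le u \le 2$.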
Example 04:
Let Y have the probability density function given by
$$ f_Y(y) = \begin{cases} \dfrac{y + 1}{2}, & -1 \le y \le 1, \\ 0, & \text{elsewhere.} \end{cases} $$
(2) Method of Transformations
Let Y have probability density function $f_Y(y)$. If h(y) is either increasing or decreasing in y, then U = h(Y) has density function
$$ f_U(u) = f_Y\big(h^{-1}(u)\big)\left|\frac{dy}{du}\right|, \qquad \text{where } y = h^{-1}(u). $$
Summary of the Transformation Method
Let U be an increasing or decreasing function of the random variable Y; say, U = h(Y).
1. Find the inverse function, $y = h^{-1}(u)$.
2. Evaluate $\dfrac{dy}{du} = \dfrac{d\,[h^{-1}(u)]}{du}$.
3. Find $f_U(u) = f_Y\big(h^{-1}(u)\big)\left|\dfrac{dy}{du}\right|$, where $y = h^{-1}(u)$.
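This formula is easy to mechanize symbolically. Below is a minimal Python sketch (assuming SymPy is installed; transform_density is a hypothetical helper name used only for illustration). It returns the expression $f_Y(h^{-1}(u))\,|dy/du|$; the support of U still has to be worked out by hand.

    import sympy as sp

    y, u = sp.symbols('y u', real=True)

    def transform_density(f_y, h):
        # Solve u = h(y) for y to get the inverse y = h^{-1}(u);
        # assumes h is monotone, so there is a single real solution.
        y_inv = sp.solve(sp.Eq(u, h), y)[0]
        # Apply f_U(u) = f_Y(h^{-1}(u)) * |d h^{-1}(u) / du|.
        return sp.simplify(f_y.subs(y, y_inv) * sp.Abs(sp.diff(y_inv, u)))

    # Example 05 below: f_Y(y) = 2y on 0 <= y <= 1 and U = 3Y - 1.
    print(transform_density(2*y, 3*y - 1))   # prints 2*u/9 + 2/9, i.e. 2(u + 1)/9 on -1 <= u <= 2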
Example 05:
In Example 01 we worked with a random variable Y (the amount of sugar produced) with a density function given by
$$ f(y) = \begin{cases} 2y, & 0 \le y \le 1, \\ 0, & \text{elsewhere.} \end{cases} $$
We were interested in a new random variable (the profit) given by U = 3Y − 1. Find the probability density function for U by the transformation method.
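A sketch of the transformation steps for this example: here h(y) = 3y − 1 is increasing, with inverse $y = h^{-1}(u) = (u+1)/3$ and $dy/du = 1/3$, so
$$ f_U(u) = f_Y\!\left(\frac{u+1}{3}\right)\left|\frac{1}{3}\right| = 2\left(\frac{u+1}{3}\right)\frac{1}{3} = \frac{2(u+1)}{9}, \qquad -1 \le u \le 2, $$
and $f_U(u) = 0$ elsewhere, agreeing with the distribution-function calculation sketched under Example 01.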
Example 06:
Let Y have the probability density function given by
$$ f_Y(y) = \begin{cases} 2y, & 0 \le y \le 1, \\ 0, & \text{elsewhere.} \end{cases} $$
Find the probability density function for U = 4Y + 3.
Example 07:
Let Y1 and Y2 have a joint density function given by
$$ f(y_1, y_2) = \begin{cases} e^{-(y_1 + y_2)}, & 0 \le y_1,\ 0 \le y_2, \\ 0, & \text{elsewhere.} \end{cases} $$
Example 08:
Let Y1 and Y2 have a joint density function given by
$$ f(y_1, y_2) = \begin{cases} 2(1 - y_1), & 0 \le y_1 \le 1,\ 0 \le y_2 \le 1, \\ 0, & \text{elsewhere.} \end{cases} $$
We are interested in U = Y1Y2, which denotes the proportion of type I impurities in the sample. Find the probability density function for U and use it to find E(U).
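A quick check on E(U): the joint density above factors as $2(1 - y_1)\cdot 1$ over the unit square, so Y1 and Y2 are independent with $E(Y_1) = \int_0^1 2y_1(1 - y_1)\,dy_1 = 1/3$ and $E(Y_2) = 1/2$; hence
$$ E(U) = E(Y_1)\,E(Y_2) = \frac{1}{3}\cdot\frac{1}{2} = \frac{1}{6}. $$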
(3) The Moment-Generating Function Method
The moment-generating function method for finding the probability distribution of a function of random variables Y1, Y2, …, Yn is based on the following uniqueness theorem.
Theorem
Suppose that for each of two random variables, X and Y, moment-generating functions exist and are given by $m_X(t)$ and $m_Y(t)$, respectively. If $m_X(t) = m_Y(t)$ for all values of t, then X and Y have the same probability distribution.
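For instance, if some random variable X has $m_X(t) = e^{t^2/2}$ for all t, then X must be a standard normal random variable, because $e^{t^2/2}$ is the moment-generating function of the normal distribution with mean 0 and variance 1.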
Example 09:
Suppose that Y is a normally distributed random variable with mean μ and variance σ². Show that
$$ Z = \frac{Y - \mu}{\sigma} $$
has a standard normal distribution.
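A sketch of the moment-generating-function argument: since $m_Y(t) = e^{\mu t + \sigma^2 t^2/2}$,
$$ m_Z(t) = E\!\left[e^{t(Y-\mu)/\sigma}\right] = e^{-\mu t/\sigma}\, m_Y(t/\sigma) = e^{-\mu t/\sigma}\, e^{\mu t/\sigma + t^2/2} = e^{t^2/2}, $$
which is the moment-generating function of a normal random variable with mean 0 and variance 1; by the uniqueness theorem, Z is standard normal.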
Theorem
Let Y1, …, Yn be independent random variables with moment-generating functions $m_{Y_1}(t), \ldots, m_{Y_n}(t)$, respectively. If U = Y1 + Y2 + ⋯ + Yn, then
$$ m_U(t) = m_{Y_1}(t)\, m_{Y_2}(t)\cdots m_{Y_n}(t). $$
Theorem
Let Y1, …, Yn be independent normally distributed random variables with $E(Y_i) = \mu_i$ and $V(Y_i) = \sigma_i^2$, for i = 1, …, n. Define U by
$$ U = \sum_{i=1}^{n} a_i Y_i, $$
where a1, …, an are constants. Then U is a normally distributed random variable with
$$ E(U) = \sum_{i=1}^{n} a_i \mu_i = a_1\mu_1 + a_2\mu_2 + \cdots + a_n\mu_n $$
and
$$ V(U) = \sum_{i=1}^{n} a_i^2 \sigma_i^2 = a_1^2\sigma_1^2 + a_2^2\sigma_2^2 + \cdots + a_n^2\sigma_n^2. $$
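A small numerical instance of this theorem (the numbers are chosen only for illustration): if Y1 ~ N(1, 4) and Y2 ~ N(2, 9) are independent and U = 2Y1 − Y2, then
$$ E(U) = 2(1) - 1(2) = 0 \qquad\text{and}\qquad V(U) = 2^2(4) + (-1)^2(9) = 25, $$
so U has a normal distribution with mean 0 and variance 25.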
Theorem
Let Y1, …, Yn be defined as in the previous theorem and define Zi by
$$ Z_i = \frac{Y_i - \mu_i}{\sigma_i}, \qquad i = 1, \ldots, n. $$
Then $\sum_{i=1}^{n} Z_i^2$ has a $\chi^2$ distribution with n degrees of freedom.
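A minimal simulation sketch in Python (assuming NumPy; the means and standard deviations below are arbitrary) checking that $\sum_{i=1}^{n} Z_i^2$ behaves like a $\chi^2$ variable with n degrees of freedom, whose mean is n and variance is 2n:

    import numpy as np

    rng = np.random.default_rng(1)
    n, reps = 5, 200_000
    mu = np.array([1.0, -2.0, 0.5, 3.0, 0.0])     # arbitrary means
    sigma = np.array([1.0, 2.0, 0.5, 1.5, 3.0])   # arbitrary standard deviations

    y = rng.normal(mu, sigma, size=(reps, n))     # independent normal draws
    z = (y - mu) / sigma                          # standardize each Y_i
    s = (z ** 2).sum(axis=1)                      # sum of squared Z_i for each replicate

    print(s.mean(), s.var())   # should be close to n = 5 and 2n = 10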
(4) Order Statistics
Many functions of random variables that are of interest in practice depend on the relative magnitudes of the observed variables. For instance, we may be interested in the fastest time in an automobile race or the heaviest mouse among those fed on a certain diet. Thus we often order observed random variables according to their magnitudes. The resulting ordered variables are called order statistics.
Formally, let Y1, Y2, …, Yn denote independent continuous random variables with distribution function F(y) and density function f(y). We will denote the ordered random variables Yi by Y(1), Y(2), …, Y(n), where Y(1) ≤ Y(2) ≤ ⋯ ≤ Y(n). Because the random variables are continuous, the equality signs can be ignored. That is,
Y(1) = min(Y1, …, Yn) and Y(n) = max(Y1, …, Yn).
The probability density functions for Y(1) and Y(n) can be found using the method of distribution
functions.
We will derive the density function of Y(n) first. Because Y(n) is the maximum of Y1, Y2, …, Yn, the event (Y(n) ≤ y) will occur if and only if the events (Yi ≤ y) occur for every i = 1, 2, …, n. That is,
$$ P(Y_{(n)} \le y) = P(Y_1 \le y,\, Y_2 \le y,\, \ldots,\, Y_n \le y). $$
Because the Yi are independent and P(Yi ≤ y) = F(y) for i = 1, 2, …, n, it follows that
$$ P(Y_{(n)} \le y) = P(Y_1 \le y)\,P(Y_2 \le y)\cdots P(Y_n \le y) = [F(y)]^n. $$
Letting $g_n(y)$ denote the density function of Y(n), we see that, on taking derivatives of both sides,
$$ g_n(y) = n\,[F(y)]^{n-1} f(y). $$
The density function for Y(1) can be found in a similar manner. We have that
$$ P(Y_{(1)} \le y) = 1 - P(Y_{(1)} > y). $$
Because Y(1) is the minimum of Y1, Y2, …, Yn, it follows that the event (Y(1) > y) occurs if and only if the events (Yi > y) occur for i = 1, 2, …, n. Because the Yi are independent and P(Yi > y) = 1 − F(y) for i = 1, 2, …, n, we see that
$$ P(Y_{(1)} \le y) = 1 - P(Y_1 > y,\, Y_2 > y,\, \ldots,\, Y_n > y) = 1 - \big[P(Y_1 > y)\,P(Y_2 > y)\cdots P(Y_n > y)\big] = 1 - [1 - F(y)]^n. $$
Thus, if $g_1(y)$ denotes the density function of Y(1), differentiation of both sides of the last expression yields
$$ g_1(y) = n\,[1 - F(y)]^{n-1} f(y). $$
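As a quick illustration of these two formulas, for a random sample of size n from the uniform distribution on (0, 1) we have F(y) = y and f(y) = 1 for 0 ≤ y ≤ 1, so
$$ g_n(y) = n\,y^{\,n-1} \qquad\text{and}\qquad g_1(y) = n\,(1 - y)^{n-1}, \qquad 0 \le y \le 1. $$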
Example 10:
Electronic components of a certain type have a length of life Y, with a probability density given by
$$ f(y) = \begin{cases} (1/100)\,e^{-y/100}, & y > 0, \\ 0, & \text{elsewhere} \end{cases} $$
(length of life measured in hours). Suppose that two such components operate independently and in series in a certain system (that is, the system fails when either component fails). Find the density function for X, the length of life of the system.
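A sketch of the answer using the formula for $g_1(y)$ with n = 2: here $F(y) = 1 - e^{-y/100}$ for y > 0, and the series system fails at X = min(Y1, Y2), so
$$ g_1(y) = 2\,\big[1 - F(y)\big]\,f(y) = 2\,e^{-y/100}\,\frac{1}{100}\,e^{-y/100} = \frac{1}{50}\,e^{-y/50}, \qquad y > 0, $$
that is, an exponential density with mean 50 hours.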
Example 11:
Suppose that in Example 10 the components operate in parallel (that is, the system does not fail until both components fail). Find the density function for X, the length of life of the system.
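Similarly, a sketch using the formula for $g_n(y)$ with n = 2: the parallel system fails at X = max(Y1, Y2), so
$$ g_2(y) = 2\,F(y)\,f(y) = 2\,\big(1 - e^{-y/100}\big)\,\frac{1}{100}\,e^{-y/100}, \qquad y > 0, $$
and zero elsewhere.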