
Math 472 Homework Assignment 1

Problem 1.9.2. Let $p(x) = 1/2^x$, $x = 1, 2, 3, \ldots$, zero elsewhere, be the pmf of the random variable $X$. Find the mgf, the mean, and the variance of $X$.
Solution 1.9.2. Using the geometric series $a/(1-r) = \sum_{x=1}^{\infty} a r^{x-1}$ for $|r| < 1$, we are able to compute the mgf of $X$,
$$m(t) = E[e^{tX}] = \sum_{x=1}^{\infty} e^{tx} p(x) = \sum_{x=1}^{\infty} \frac{e^{tx}}{2^x} = \sum_{x=1}^{\infty} \left(\frac{e^t}{2}\right)^{x} = \frac{e^t/2}{1 - e^t/2} = (2e^{-t} - 1)^{-1},$$
for $t < \ln 2$. With $m(t) = (2e^{-t} - 1)^{-1}$, we are able to compute the first and second derivatives of $m(t)$,
$$m'(t) = 2e^{-t}(2e^{-t} - 1)^{-2}$$
$$m''(t) = 2e^{-t}(2e^{-t} + 1)(2e^{-t} - 1)^{-3}.$$
The first and second moments of $X$ are $\mu = m'(0) = 2$ and $E[X^2] = m''(0) = 6$, and the variance is $\sigma^2 = 6 - 2^2 = 2$. Therefore the mgf, the mean, and the variance of $X$ are
$$m(t) = (2e^{-t} - 1)^{-1}, \qquad \mu = 2, \qquad \sigma^2 = 2.$$
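The closed forms above are easy to sanity-check numerically. The sketch below (assuming SciPy is available) compares them against `scipy.stats.geom`, whose pmf with $p = 1/2$ is exactly $p(x) = (1/2)^x$ on $x = 1, 2, 3, \ldots$

```python
import numpy as np
from scipy import stats

# X has pmf p(x) = (1/2)^x, x = 1, 2, 3, ..., i.e. a geometric(p = 1/2) variable.
X = stats.geom(0.5)

# Mean and variance found from the mgf: mu = 2, sigma^2 = 2.
print(X.mean(), X.var())            # 2.0 2.0

# Check m(t) = (2 e^{-t} - 1)^{-1} against a truncated series for several t < ln 2.
x = np.arange(1, 200)
for t in [-1.0, 0.0, 0.3, 0.6]:
    series = np.sum(np.exp(t * x) * 0.5 ** x)
    closed = 1.0 / (2.0 * np.exp(-t) - 1.0)
    assert np.isclose(series, closed), (t, series, closed)
```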

Problem 1.9.3. For each of the following distributions, compute
$$P(\mu - 2\sigma < X < \mu + 2\sigma).$$
(1) $f(x) = 6x(1-x)$, $0 < x < 1$, zero elsewhere.
(2) $p(x) = 1/2^x$, $x = 1, 2, 3, \ldots$, zero elsewhere.
Solution 1.9.3. (1) The mean and second moment are
$$\mu = \int_0^1 x f(x)\, dx = \int_0^1 6x^2(1-x)\, dx = 1/2$$
$$E[X^2] = \int_0^1 x^2 f(x)\, dx = \int_0^1 6x^3(1-x)\, dx = 3/10,$$
so the variance is $\sigma^2 = E[X^2] - \mu^2 = 3/10 - (1/2)^2 = 1/20$ and the standard deviation is $\sigma = 1/\sqrt{20} = \sqrt{5}/10 \approx 0.224$. Hence
$$P(\mu - 2\sigma < X < \mu + 2\sigma) = P\!\left(\tfrac{1}{2} - \tfrac{\sqrt{5}}{5} < X < \tfrac{1}{2} + \tfrac{\sqrt{5}}{5}\right) = \int_{\frac{1}{2} - \frac{\sqrt{5}}{5}}^{\frac{1}{2} + \frac{\sqrt{5}}{5}} 6x(1-x)\, dx = \frac{11\sqrt{5}}{25} \approx 0.984.$$

Remark: $f(x) = 6x(1-x)$ is the density for a Beta distribution with parameters $\alpha = 2$, $\beta = 2$, so you can quickly find the mean and variance using the equations on page 667.

(2) From problem 1.9.2, we know that $\mu = 2$ and $\sigma = \sqrt{2}$. Since $\mu - 2\sigma = 2 - 2\sqrt{2} < 0$ and $\mu + 2\sigma = 2 + 2\sqrt{2} \approx 4.83$,
$$P(\mu - 2\sigma < X < \mu + 2\sigma) = P(X \le 4) = \sum_{x=1}^{4} \frac{1}{2^x} = \frac{15}{16} = 0.9375.$$
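As a quick numerical cross-check (a sketch assuming SciPy is available), both probabilities can be recomputed directly from `scipy.stats.beta` and the geometric pmf:

```python
import numpy as np
from scipy import stats

# Part (1): X ~ Beta(2, 2), whose density is 6x(1 - x) on (0, 1).
X1 = stats.beta(2, 2)
mu1, sd1 = X1.mean(), X1.std()
p1 = X1.cdf(mu1 + 2 * sd1) - X1.cdf(mu1 - 2 * sd1)
print(round(p1, 3))                  # ~0.984

# Part (2): X has pmf (1/2)^x on {1, 2, 3, ...}; mu = 2, sigma = sqrt(2),
# so only x = 1, ..., 4 fall inside (mu - 2 sigma, mu + 2 sigma).
x = np.arange(1, 5)
p2 = np.sum(0.5 ** x)
print(p2)                            # 0.9375
```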

Problem 1.9.5. Let a random variable $X$ of the continuous type have a pdf $f(x)$ whose graph is symmetric with respect to $x = c$. If the mean value of $X$ exists, show that $E[X] = c$.
Solution 1.9.5. Given that $f(c-x) = f(c+x)$, we will show that $E[X - c] = E[X] - c = 0$.
$$E[X - c] = \int_{-\infty}^{\infty} (x-c) f(x)\, dx = \int_{-\infty}^{c} (x-c) f(x)\, dx + \int_{c}^{\infty} (x-c) f(x)\, dx.$$
In the first integral, make the substitution $x = c - u$, $dx = -du$, and in the second integral make the substitution $x = c + u$, $dx = du$. Then
$$E[X - c] = -\int_{0}^{\infty} u f(c-u)\, du + \int_{0}^{\infty} u f(c+u)\, du = -\int_{0}^{\infty} u f(c+u)\, du + \int_{0}^{\infty} u f(c+u)\, du = 0,$$
as desired. We conclude that if the density function for a random variable $X$ is symmetric about the point $c$, then $\mu = E[X] = c$.
Problem 1.9.6. Let the random variable $X$ have mean $\mu$, standard deviation $\sigma$, and mgf $M(t)$, $-h < t < h$. Show that
$$E\!\left[\frac{X - \mu}{\sigma}\right] = 0, \qquad E\!\left[\left(\frac{X - \mu}{\sigma}\right)^{2}\right] = 1, \quad \text{and}$$
$$E\!\left[\exp\!\left(t\,\frac{X - \mu}{\sigma}\right)\right] = e^{-\mu t/\sigma}\, M\!\left(\frac{t}{\sigma}\right), \qquad -h\sigma < t < h\sigma.$$

Solution 1.9.6. Using the linear properties of expected value (see Theorem 1.8.2) and the definition of $\mu = E[X]$, we calculate
$$E\!\left[\frac{X-\mu}{\sigma}\right] = \frac{E[X-\mu]}{\sigma} = \frac{E[X]-\mu}{\sigma} = \frac{\mu - \mu}{\sigma} = 0,$$
which verifies the first equation.

Using the linear properties of expected value again and the definition of $\sigma^2 = E[(X-\mu)^2]$, we calculate
$$E\!\left[\left(\frac{X-\mu}{\sigma}\right)^{2}\right] = E\!\left[\frac{(X-\mu)^2}{\sigma^2}\right] = \frac{E[(X-\mu)^2]}{\sigma^2} = \frac{\sigma^2}{\sigma^2} = 1,$$
which verifies the second equation.

If $-h\sigma < t < h\sigma$ then $-h < t/\sigma < h$, which shows that $t/\sigma$ is in the domain of $M$. Using the definition of $M(t) = E[\exp(tX)]$ and the linear properties of the expected value, we calculate
$$e^{-\mu t/\sigma} M\!\left(\frac{t}{\sigma}\right) = e^{-\mu t/\sigma} E\!\left[e^{\frac{t}{\sigma}X}\right] = E\!\left[e^{-\frac{\mu t}{\sigma}} e^{\frac{t}{\sigma}X}\right] = E\!\left[e^{\frac{t}{\sigma}(X-\mu)}\right] = E\!\left[\exp\!\left(t\,\frac{X-\mu}{\sigma}\right)\right],$$
which verifies the third equation.


Problem 1.9.7. Show that the moment generating function of the random variable $X$ having the pdf $f(x) = 1/3$, $-1 < x < 2$, zero elsewhere, is
$$M(t) = \begin{cases} \dfrac{e^{2t} - e^{-t}}{3t}, & t \neq 0 \\ 1, & t = 0. \end{cases}$$
Solution 1.9.7. As with every mgf, $M(0) = E[e^{0}] = E[1] = 1$. For $t \neq 0$,
$$M(t) = E\!\left[e^{tX}\right] = \int_{-1}^{2} e^{tx} f(x)\, dx = \int_{-1}^{2} \frac{e^{tx}}{3}\, dx = \left.\frac{e^{tx}}{3t}\right|_{-1}^{2} = \frac{e^{2t} - e^{-t}}{3t}.$$
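A quick check (a sketch, assuming SciPy) that the closed form agrees with direct numerical integration of $e^{tx}/3$ over $(-1, 2)$:

```python
import numpy as np
from scipy import integrate

def mgf_closed(t):
    # M(t) = (e^{2t} - e^{-t}) / (3t) for t != 0, and M(0) = 1
    return 1.0 if t == 0 else (np.exp(2 * t) - np.exp(-t)) / (3 * t)

for t in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    numeric, _ = integrate.quad(lambda x: np.exp(t * x) / 3.0, -1.0, 2.0)
    assert np.isclose(numeric, mgf_closed(t)), (t, numeric)
```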

Problem 1.9.11. Let $X$ denote a random variable such that $K(t) = E[t^{X}]$ exists for all real values of $t$ in a certain open interval that includes the point $t = 1$. Show that $K^{(m)}(1)$ is equal to the $m$th factorial moment $E[X(X-1)\cdots(X-m+1)]$.
Solution 1.9.11. Differentiating $k(t) = t^{x}$ $m$ times we find
$$k^{(m)}(t) = x(x-1)\cdots(x-m+1)\, t^{x-m}.$$
We may therefore expand $t^{x}$ in its Taylor series about $t = 1$:
$$t^{x} = \sum_{m=0}^{\infty} k^{(m)}(1)\, \frac{(t-1)^{m}}{m!} = \sum_{m=0}^{\infty} x(x-1)\cdots(x-m+1)\, \frac{(t-1)^{m}}{m!}.$$
Using this Taylor series we see that
$$K(t) = E\!\left[t^{X}\right] = E\!\left[\sum_{m=0}^{\infty} X(X-1)\cdots(X-m+1)\, \frac{(t-1)^{m}}{m!}\right] = \sum_{m=0}^{\infty} E\!\left[X(X-1)\cdots(X-m+1)\right] \frac{(t-1)^{m}}{m!} = \sum_{m=0}^{\infty} K^{(m)}(1)\, \frac{(t-1)^{m}}{m!}.$$
Comparing the last two series shows that
$$K^{(m)}(1) = E\!\left[X(X-1)\cdots(X-m+1)\right].$$
Problem 1.9.12. Let $X$ be a random variable. If $m$ is a positive integer, the expectation $E[(X-b)^{m}]$, if it exists, is called the $m$th moment of the distribution about the point $b$. Let the first, second, and third moments of the distribution about the point 7 be 3, 11, and 15, respectively. Determine the mean $\mu$ of $X$, and then find the first, second, and third moments of the distribution about the point $\mu$.
Solution 1.9.12. We are given $E[X-7] = 3$, $E[(X-7)^2] = 11$, and $E[(X-7)^3] = 15$. Expanding the first equation gives
$$E[X-7] = E[X] - 7 = \mu - 7 = 3,$$
and therefore $\mu = 10$. Continuing the calculations,
$$E[(X-\mu)^2] = E[(X-10)^2] = E\!\left[\bigl((X-7) - 3\bigr)^2\right] = E[(X-7)^2] - 6E[X-7] + 9 = 11 - 18 + 9 = 2,$$
$$E[(X-\mu)^3] = E[(X-10)^3] = E\!\left[\bigl((X-7) - 3\bigr)^3\right] = E[(X-7)^3] - 9E[(X-7)^2] + 27E[X-7] - 27 = 15 - 99 + 81 - 27 = -30.$$
Thus the first, second, and third moments of $X$ about the mean $\mu = 10$ are respectively $0$, $2$, and $-30$.
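The moment bookkeeping above is mechanical, so it lends itself to a symbolic check (a sketch assuming SymPy is available; the symbol `d` below stands for $X - 7$ and is purely illustrative):

```python
import sympy as sp

d = sp.symbols('d')                      # d stands for X - 7
moments = {0: 1, 1: 3, 2: 11, 3: 15}     # given E[d^k] for k = 0, ..., 3

def expect(poly):
    """Apply linearity of expectation to a polynomial in d."""
    poly = sp.Poly(sp.expand(poly), d)
    return sum(c * moments[k] for (k,), c in poly.terms())

print(expect(d + 7))          # E[X] = 10
print(expect((d - 3)**2))     # E[(X - 10)^2] = 2
print(expect((d - 3)**3))     # E[(X - 10)^3] = -30
```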
Problem 1.9.25. Let $X$ be a random variable with a pdf $f(x)$ and mgf $M(t)$. Suppose $f$ is symmetric about 0; i.e., $f(-x) = f(x)$. Show that $M(-t) = M(t)$.

Solution 1.9.25. We will use the substitution $x = -u$, $dx = -du$ in the following calculation.
$$M(-t) = \int_{-\infty}^{\infty} e^{(-t)x} f(x)\, dx = \int_{-\infty}^{\infty} e^{t(-x)} f(x)\, dx = \int_{-\infty}^{\infty} e^{tu} f(-u)\, du = \int_{-\infty}^{\infty} e^{tu} f(u)\, du = M(t).$$
Problem 1.10.3. If $X$ is a random variable such that $E[X] = 3$ and $E[X^2] = 13$, use Chebyshev's inequality to determine a lower bound for the probability $P(-2 < X < 8)$.
Solution 1.10.3. Chebyshev's inequality states that $P(|X - \mu| < k\sigma) \geq 1 - 1/k^2$. In this problem $\mu = 3$ and $\sigma^2 = 13 - 9 = 4$, giving $\sigma = 2$. Thus
$$P(-2 < X < 8) = P(-5 < X - 3 < 5) = P(|X - 3| < 5) = P\!\left(|X - 3| < \tfrac{5}{2}\cdot 2\right) \geq 1 - \left(\tfrac{2}{5}\right)^{2} = 1 - \tfrac{4}{25} = \tfrac{21}{25}.$$
From the Chebyshev inequality we conclude that $P(-2 < X < 8) \geq 21/25$.
Problem 1.10.4. Let $X$ be a random variable with mgf $M(t)$, $-h < t < h$. Prove that
$$P(X \geq a) \leq e^{-at} M(t), \qquad 0 < t < h,$$
and that
$$P(X \leq a) \leq e^{-at} M(t), \qquad -h < t < 0.$$
Solution 1.10.4. We will use Markov's inequality (Theorem 1.10.2), which states that if $u(X) \geq 0$ and if $c > 0$ is a constant, then
$$P(u(X) \geq c) \leq E[u(X)]/c.$$
We will also use the facts that increasing functions preserve order and decreasing functions reverse order.
Consider the function $u(x) = e^{tx} > 0$, which is an increasing function of $x$ as long as $t > 0$. So, for $t > 0$ we have that $X \geq a$ if and only if $e^{tX} = u(X) \geq u(a) = e^{ta}$. Applying Markov's inequality and the definition of $M(t)$ we have for $0 < t < h$
$$P(X \geq a) = P(e^{tX} \geq e^{ta}) \leq E[e^{tX}]/e^{ta} = e^{-ta} M(t).$$
When $t < 0$, $u(x) = e^{tx} > 0$ is a decreasing function of $x$, so $X \leq a$ if and only if $e^{tX} = u(X) \geq u(a) = e^{ta}$. Applying Markov's inequality again we have for $-h < t < 0$
$$P(X \leq a) = P(e^{tX} \geq e^{ta}) \leq E[e^{tX}]/e^{ta} = e^{-ta} M(t).$$

Problem 1.10.5. The mgf of $X$ exists for all real values of $t$ and is given by
$$M(t) = \frac{e^{t} - e^{-t}}{2t}, \quad t \neq 0, \qquad M(0) = 1.$$
Use the result of the preceding exercise to show that $P(X \geq 1) = 0$ and $P(X \leq -1) = 0$.
Solution 1.10.5. Taking $a = 1$ in problem 1.10.4, we see that for all $t > 0$, $P(X \geq 1) \leq e^{-t} M(t) = (1 - e^{-2t})/(2t)$. Taking the limit as $t \to \infty$,
$$0 \leq P(X \geq 1) \leq \lim_{t \to \infty} \frac{1 - e^{-2t}}{2t} = 0,$$
which shows that $P(X \geq 1) = 0$.

Taking $a = -1$ in problem 1.10.4, we see that for all $t < 0$, $P(X \leq -1) \leq e^{t} M(t) = (e^{2t} - 1)/(2t)$. Taking the limit as $t \to -\infty$,
$$0 \leq P(X \leq -1) \leq \lim_{t \to -\infty} \frac{e^{2t} - 1}{2t} = 0,$$
which shows that $P(X \leq -1) = 0$.

Problem 3.3.1. If $(1 - 2t)^{-6}$, $t < 1/2$, is the mgf of the random variable $X$, find $P(X < 5.23)$.
Solution 3.3.1. The mgf of $X$ is that of a $\chi^2$-distribution with $r = 12$ degrees of freedom. Using Table II on page 658, we see that the probability $P(X < 5.23) \approx 0.050$.
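Table lookups like this one are easy to confirm with SciPy (a sketch, assuming it is installed):

```python
from scipy import stats

# The mgf (1 - 2t)^(-6) is that of a chi-square with r = 12 degrees of freedom.
print(round(stats.chi2.cdf(5.23, df=12), 3))   # ~0.050
```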
Problem 3.3.2. If $X$ is $\chi^2(5)$, determine the constants $c$ and $d$ so that $P(c < X < d) = 0.95$ and $P(X < c) = 0.025$.
Solution 3.3.2. Using Table II on page 658 we find $P(X < 0.831) \approx 0.025$ and $P(X < 12.833) \approx 0.975$. So, with $c = 0.831$ and $d = 12.833$ we have $P(c < X < d) = 0.975 - 0.025 = 0.95$ and $P(X < c) = 0.025$.
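The constants can likewise be read off from the quantile function instead of the table (sketch, assuming SciPy):

```python
from scipy import stats

c = stats.chi2.ppf(0.025, df=5)   # ~0.831
d = stats.chi2.ppf(0.975, df=5)   # ~12.83
print(round(c, 3), round(d, 3))
```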
Problem 3.3.3. Find $P(3.28 < X < 25.2)$ if $X$ has a gamma distribution with $\alpha = 3$ and $\beta = 4$.
Solution 3.3.3. The mgf of $X$ is $M_X(t) = (1 - 4t)^{-3}$. From this we see that $M_X(t/2) = E[e^{tX/2}] = (1 - 2t)^{-3}$, which is the mgf of a $\chi^2$ with $r = 6$ degrees of freedom. Using Table II on page 658 we calculate
$$P(3.28 < X < 25.2) = P(1.64 < X/2 < 12.6) \approx 0.950 - 0.050 = 0.900.$$
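A direct check with the gamma cdf itself (sketch, assuming SciPy) gives the same answer without passing through the $\chi^2$ table:

```python
from scipy import stats

# X ~ Gamma(alpha = 3, beta = 4); SciPy's `scale` plays the role of beta.
X = stats.gamma(a=3, scale=4)
print(round(X.cdf(25.2) - X.cdf(3.28), 3))   # ~0.900
```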
Problem 3.3.5. Show that
$$\int_{\mu}^{\infty} \frac{z^{k-1} e^{-z}}{\Gamma(k)}\, dz = \sum_{x=0}^{k-1} \frac{\mu^{x} e^{-\mu}}{x!}, \qquad k = 1, 2, 3, \ldots. \tag{1}$$
This demonstrates the relationship between the cdfs of the gamma and Poisson distributions.

Solution 3.3.5. An easy calculation shows that equation (1) is valid for $k = 1$, which establishes the base case for an induction proof.
The key to establishing the induction step is the following calculation:
$$\frac{\mu^{k} e^{-\mu}}{\Gamma(k+1)} = -\int_{\mu}^{\infty} \frac{d}{dz}\!\left[\frac{z^{k} e^{-z}}{\Gamma(k+1)}\right] dz = -\int_{\mu}^{\infty} \left[\frac{k\, z^{k-1} e^{-z}}{\Gamma(k+1)} - \frac{z^{k} e^{-z}}{\Gamma(k+1)}\right] dz = -\int_{\mu}^{\infty} \frac{z^{k-1} e^{-z}}{\Gamma(k)}\, dz + \int_{\mu}^{\infty} \frac{z^{k} e^{-z}}{\Gamma(k+1)}\, dz, \tag{2}$$
using $k\,\Gamma(k) = \Gamma(k+1)$ in the last step. Now add $\mu^{k} e^{-\mu}/\Gamma(k+1)$ to both sides of equation (1) to obtain
$$\int_{\mu}^{\infty} \frac{z^{k-1} e^{-z}}{\Gamma(k)}\, dz + \frac{\mu^{k} e^{-\mu}}{\Gamma(k+1)} = \sum_{x=0}^{k-1} \frac{\mu^{x} e^{-\mu}}{x!} + \frac{\mu^{k} e^{-\mu}}{\Gamma(k+1)} = \sum_{x=0}^{k} \frac{\mu^{x} e^{-\mu}}{x!}.$$
Using equation (2) to simplify the left-hand side of the previous equation yields
$$\int_{\mu}^{\infty} \frac{z^{k} e^{-z}}{\Gamma(k+1)}\, dz = \sum_{x=0}^{k} \frac{\mu^{x} e^{-\mu}}{x!},$$
which shows that equation (1) is true for $k + 1$ if it is true for $k$. Therefore, by the principle of mathematical induction, equation (1) is true for all $k = 1, 2, 3, \ldots$.
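The identity can be spot-checked numerically (a sketch, assuming SciPy): the left side of (1) is the survival function of a Gamma$(k, 1)$ variable and the right side is a Poisson cdf.

```python
import numpy as np
from scipy import stats

for k in [1, 2, 5, 10]:
    for mu in [0.5, 2.0, 7.3]:
        lhs = stats.gamma.sf(mu, a=k)        # integral of z^(k-1) e^(-z) / Gamma(k) over (mu, inf)
        rhs = stats.poisson.cdf(k - 1, mu)   # sum_{x=0}^{k-1} mu^x e^(-mu) / x!
        assert np.isclose(lhs, rhs), (k, mu, lhs, rhs)
print("gamma-Poisson identity verified on sample values")
```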
Problem 3.3.9. Let $X$ have a gamma distribution with parameters $\alpha$ and $\beta$. Show that $P(X \geq 2\alpha\beta) \leq (2/e)^{\alpha}$.
Solution 3.3.9. From Appendix D on page 667, we see that the mgf for $X$ is $M(t) = (1 - \beta t)^{-\alpha}$ for $t < 1/\beta$. Problem 1.10.4 shows that for every constant $a$ and for every $t > 0$ in the domain of $M(t)$, $P(X \geq a) \leq e^{-at} M(t)$. Applying this result to our gamma distributed random variable we have for all $0 < t < 1/\beta$
$$P(X \geq 2\alpha\beta) \leq \frac{e^{-2\alpha\beta t}}{(1 - \beta t)^{\alpha}}.$$
Let us try to find the minimum value of $y = e^{-2\alpha\beta t}(1 - \beta t)^{-\alpha}$ over the interval $0 < t < 1/\beta$. A short calculation shows that the first two derivatives of $y$ are
$$y' = \alpha\beta\, e^{-2\alpha\beta t}(1 - \beta t)^{-\alpha - 1}(2\beta t - 1)$$
$$y'' = \alpha\beta^{2} e^{-2\alpha\beta t}(1 - \beta t)^{-\alpha - 2}\left[1 + \alpha(2\beta t - 1)^{2}\right].$$
Since $y' = 0$ at $t = 1/(2\beta) \in (0, 1/\beta)$ and since $y'' > 0$, we see that $y$ takes its minimum value at $t = 1/(2\beta)$ and therefore
$$P(X \geq 2\alpha\beta) \leq \left.\frac{e^{-2\alpha\beta t}}{(1 - \beta t)^{\alpha}}\right|_{t = \frac{1}{2\beta}} = \left(\frac{2}{e}\right)^{\alpha}.$$
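As a sanity check (sketch, assuming SciPy), the exact tail probability sits below the bound for several parameter choices:

```python
import numpy as np
from scipy import stats

for alpha in [0.5, 1.0, 3.0, 10.0]:
    for beta in [0.5, 1.0, 4.0]:
        tail = stats.gamma.sf(2 * alpha * beta, a=alpha, scale=beta)   # P(X >= 2*alpha*beta)
        bound = (2 / np.e) ** alpha
        assert tail <= bound, (alpha, beta, tail, bound)
print("Chernoff-type bound holds on sample values")
```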

Problem 3.3.15. Let $X$ have a Poisson distribution with parameter $m$. If $m$ is an experimental value of a random variable having a gamma distribution with $\alpha = 2$ and $\beta = 1$, compute $P(X = 0, 1, 2)$.
Solution 3.3.15. We will be using techniques from the topic of joint and conditional distributions. We are given that $m$ has a gamma distribution with $\alpha = 2$ and $\beta = 1$, therefore its marginal probability density function is $f_m(m) = m e^{-m}$ for $m > 0$, zero elsewhere. For the random variable $X$, we are given the conditional probability mass function given $m$ is $p(x \mid m) = m^{x} e^{-m}/x!$ for $x = 0, 1, 2, \ldots$ and $m > 0$, zero elsewhere. From the given information we are able to determine that the joint mass-density function is
$$f(x, m) = p(x \mid m)\, f_m(m) = m^{x+1} e^{-2m}/x!$$
for $x = 0, 1, 2, \ldots$ and $m > 0$, zero elsewhere. We calculate the marginal probability mass function for $X$,
$$p_X(x) = \int_{0}^{\infty} f(x, m)\, dm = \int_{0}^{\infty} \frac{m^{x+1} e^{-2m}}{x!}\, dm \qquad (\text{letting } u = 2m,\ du = 2\, dm)$$
$$= \frac{1}{2^{x+2}\, x!} \int_{0}^{\infty} u^{(x+2)-1} e^{-u}\, du = \frac{\Gamma(x+2)}{2^{x+2}\, x!} = \frac{x+1}{2^{x+2}},$$
for $x = 0, 1, 2, \ldots$, zero elsewhere. This allows us to find
$$P(X = 0) = p_X(0) = \frac{1}{4}, \qquad P(X = 1) = p_X(1) = \frac{1}{4}, \qquad P(X = 2) = p_X(2) = \frac{3}{16}.$$
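The marginal pmf $(x+1)/2^{x+2}$ can be confirmed by integrating the joint mass-density function numerically (a sketch assuming SciPy); it also happens to equal the negative binomial pmf with $r = 2$ and $p = 1/2$, which gives an independent check.

```python
import math
import numpy as np
from scipy import integrate, stats

def marginal_pmf(x):
    # p_X(x) = integral over m of m^(x+1) e^(-2m) / x!
    val, _ = integrate.quad(lambda m: m ** (x + 1) * np.exp(-2 * m) / math.factorial(x), 0, np.inf)
    return val

for x in range(3):
    closed = (x + 1) / 2 ** (x + 2)
    assert np.isclose(marginal_pmf(x), closed)
    assert np.isclose(stats.nbinom.pmf(x, 2, 0.5), closed)
print([round(marginal_pmf(x), 4) for x in range(3)])   # [0.25, 0.25, 0.1875]
```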

Problem 3.3.19. Determine the constant $c$ in each of the following so that each $f(x)$ is a pdf:
(1) $f(x) = cx(1-x)^3$, $0 < x < 1$, zero elsewhere.
(2) $f(x) = cx^4(1-x)^5$, $0 < x < 1$, zero elsewhere.
(3) $f(x) = cx^2(1-x)^8$, $0 < x < 1$, zero elsewhere.

Solution 3.3.19.
(1) $c = 1/B(2, 4) = \Gamma(6)/(\Gamma(2)\Gamma(4)) = 5!/(1!\,3!) = 20$.
(2) $c = 1/B(5, 6) = \Gamma(11)/(\Gamma(5)\Gamma(6)) = 10!/(4!\,5!) = 1260$.
(3) $c = 1/B(3, 9) = \Gamma(12)/(\Gamma(3)\Gamma(9)) = 11!/(2!\,8!) = 495$.
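These constants are reciprocals of beta function values and can be checked directly (sketch, assuming SciPy):

```python
from scipy import special

print(round(1 / special.beta(2, 4), 6))   # ~20.0
print(round(1 / special.beta(5, 6), 6))   # ~1260.0
print(round(1 / special.beta(3, 9), 6))   # ~495.0
```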
Problem 3.3.22. Show, for $k = 1, 2, \ldots, n$, that
$$\int_{p}^{1} \frac{n!}{(k-1)!\,(n-k)!}\, z^{k-1} (1-z)^{n-k}\, dz = \sum_{x=0}^{k-1} \binom{n}{x} p^{x} (1-p)^{n-x}.$$
This demonstrates the relationship between the cdfs of the $\beta$ and binomial distributions.
Solution 3.3.22. This problem is very similar to problem 3.3.5. In this case the key to the induction step is the following calculation:
$$\binom{n}{k} p^{k} (1-p)^{n-k} = -\frac{n!}{k!\,(n-k)!}\Bigl[z^{k}(1-z)^{n-k}\Bigr]_{z=p}^{1} = -\int_{p}^{1} \frac{n!}{k!\,(n-k)!} \frac{d}{dz}\Bigl[z^{k}(1-z)^{n-k}\Bigr]\, dz$$
$$= -\int_{p}^{1} \left[\frac{k\,n!}{k!\,(n-k)!}\, z^{k-1}(1-z)^{n-k} - \frac{(n-k)\,n!}{k!\,(n-k)!}\, z^{k}(1-z)^{n-k-1}\right] dz$$
$$= -\int_{p}^{1} \frac{n!}{(k-1)!\,(n-k)!}\, z^{k-1}(1-z)^{n-k}\, dz + \int_{p}^{1} \frac{n!}{k!\,(n-k-1)!}\, z^{k}(1-z)^{n-k-1}\, dz.$$
The rest of the proof is similar to the argument in problem 3.3.5.

Remark: an alternate proof is to observe that $\sum_{x=0}^{k-1} \binom{n}{x} p^{x} (1-p)^{n-x}$ is a telescoping series, when you take advantage of the above calculation.
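Numerically, the incomplete beta integral on the left and the binomial cdf on the right agree (a sketch, assuming SciPy; the values $n = 10$, $p = 0.3$ are arbitrary test inputs):

```python
import math
import numpy as np
from scipy import integrate, stats

n, p = 10, 0.3
for k in range(1, n + 1):
    coef = math.factorial(n) / (math.factorial(k - 1) * math.factorial(n - k))
    lhs, _ = integrate.quad(lambda z: coef * z ** (k - 1) * (1 - z) ** (n - k), p, 1)
    rhs = stats.binom.cdf(k - 1, n, p)
    assert np.isclose(lhs, rhs), (k, lhs, rhs)
print("beta-binomial identity verified for n = 10, p = 0.3")
```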
Problem 3.3.23. Let $X_1$ and $X_2$ be independent random variables. Let $X_1$ and $Y = X_1 + X_2$ have chi-square distributions with $r_1$ and $r$ degrees of freedom, respectively. Here $r_1 < r$. Show that $X_2$ has a chi-square distribution with $r - r_1$ degrees of freedom.
Solution 3.3.23. From Appendix D on page 667, we see that the mgfs of $X_1$ and $Y$ are, respectively, $M_{X_1}(t) = (1-2t)^{-r_1/2}$ and $M_Y(t) = (1-2t)^{-r/2}$. Since $Y = X_1 + X_2$ is the sum of independent random variables, $M_Y(t) = M_{X_1}(t)\, M_{X_2}(t)$ (by Theorem 2.2.5). Solving, we find that
$$M_{X_2}(t) = \frac{M_Y(t)}{M_{X_1}(t)} = \frac{(1-2t)^{-r/2}}{(1-2t)^{-r_1/2}} = (1-2t)^{-(r-r_1)/2},$$
which is the mgf of a chi-square random variable with $r - r_1$ degrees of freedom. Therefore, by Theorem 1.9.1, $X_2$ has a chi-square distribution with $r - r_1$ degrees of freedom.


Problem 3.3.24. Let $X_1$, $X_2$ be two independent random variables having gamma distributions with parameters $\alpha_1 = 3$, $\beta_1 = 3$ and $\alpha_2 = 5$, $\beta_2 = 1$, respectively.
(1) Find the mgf of $Y = 2X_1 + 6X_2$.
(2) What is the distribution of $Y$?
Solution 3.3.24. (1) The mgfs of $X_1$ and $X_2$ are, respectively,
$$M_{X_1}(t) = (1-3t)^{-3} \quad \text{and} \quad M_{X_2}(t) = (1-t)^{-5}.$$
Since $X_1$ and $X_2$ are independent, Theorem 2.5.4 implies that
$$M_Y(t) = E\!\left[e^{t(2X_1 + 6X_2)}\right] = E\!\left[e^{2tX_1} e^{6tX_2}\right] = E\!\left[e^{2tX_1}\right] E\!\left[e^{6tX_2}\right] = M_{X_1}(2t)\, M_{X_2}(6t) = (1-3(2t))^{-3}(1-1(6t))^{-5} = (1-6t)^{-8}.$$
(2) Since $(1-6t)^{-8}$ is the mgf of a gamma distribution with parameters $\alpha = 8$ and $\beta = 6$, Theorem 1.9.1 shows that $Y$ has a gamma distribution with parameters $\alpha = 8$ and $\beta = 6$.
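A Monte Carlo sketch (assuming NumPy/SciPy) supports the conclusion: simulated values of $2X_1 + 6X_2$ have the moments of a Gamma$(8, 6)$ distribution, $E[Y] = 48$ and $\mathrm{Var}(Y) = 288$, and can be compared against that distribution with a Kolmogorov-Smirnov test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x1 = rng.gamma(shape=3, scale=3, size=200_000)
x2 = rng.gamma(shape=5, scale=1, size=200_000)
y = 2 * x1 + 6 * x2

print(y.mean(), y.var())   # ~48 and ~288, the Gamma(alpha=8, beta=6) moments
# Under the (true) null hypothesis this p-value is Uniform(0, 1), so it is not systematically small.
print(stats.kstest(y, stats.gamma(a=8, scale=6).cdf).pvalue)
```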
