One Thousand Exercises
in Probability
Geoffrey R. Grimmett
Statistical Laboratory, Cambridge University
David R. Stirzaker
Mathematical Institute, Oxford University

18 April 2001

Preface
This book contains more than 1000 exercises in probability and random processes, together
with their solutions. Apart from being a volume of worked exercises in its own right, it is
also a solutions manual for exercises and problems appearing in our textbook Probability and
Random Processes (3rd edn), Oxford University Press, 2001, henceforth referred to as PRP.
These exercises are not merely for drill, but complement and illustrate the text of PRP, or are
entertaining, or both. The current volume extends our earlier book Probability and Random
Processes: Problems and Solutions, and includes in addition around 400 new problems. Since
many exercises have multiple parts, the total number of interrogatives exceeds 3000.
Despite being intended in part as a companion to PRP, the present volume is as self-contained as reasonably possible. Where knowledge of a substantial chunk of bookwork is
unavoidable, the reader is provided with a reference to the relevant passage in PRP. Expressions
such as ‘clearly’ appear frequently in the solutions. Although we do not use such terms in
their Laplacian sense to mean ‘with difficulty’, to call something ‘clear’ is not to imply that
explicit verification is necessarily free of tedium.
The table of contents reproduces that of PRP; the section and exercise numbers correspond to those of PRP; there are occasional references to examples and equations in PRP.
The covered range of topics is broad, beginning with the elementary theory of probability
and random variables, and continuing, via chapters on Markov chains and convergence, to
extensive sections devoted to stationarity and ergodic theory, renewals, queues, martingales,
and diffusions, including an introduction to the pricing of options. Generally speaking, exercises are questions which test knowledge of particular pieces of theory, while problems are
less specific in their requirements. There are questions of all standards, the great majority
being elementary or of intermediate difficulty. We ourselves have found some of the later
ones to be rather tricky, but have refrained from magnifying any difficulty by adding asterisks
or equivalent devices. If you are using this book for self-study, our advice would be not to
attempt more than a respectable fraction of these at a first read.
We pay tribute to all those anonymous pedagogues whose examination papers, work
assignments, and textbooks have been so influential in the shaping of this collection. To them
and to their successors we wish, in turn, much happy plundering. If you find errors, try to
keep them secret, except from us. If you know a better solution to any exercise, we will be
happy to substitute it in a later edition.
We acknowledge the expertise of Sarah Shea-Simonds in preparing the TEX script of this
volume, and of Andy Burbanks in advising on the front cover design, which depicts a favourite
confluence of the authors.
Cambridge and Oxford
April 2001
G. R. G.
D. R. S.
Life is good for only two things, discovering mathematics and teaching it.
Siméon Poisson
In mathematics you don’t understand things, you just get used to them.
John von Neumann
Probability is the bane of the age.
Anthony Powell
Casanova’s Chinese Restaurant
The traditional professor writes a, says b, and means c; but it should be d.
George Pólya

Contents

1 Events and their probabilities
1.1 Introduction
1.2 Events as sets
1.3 Probability
1.4 Conditional probability
1.5 Independence
1.6 Completeness and product spaces
1.7 Worked examples
1.8 Problems

2 Random variables and their distributions
2.1 Random variables
2.2 The law of averages
2.3 Discrete and continuous variables
2.4 Worked examples
2.5 Random vectors
2.6 Monte Carlo simulation
2.7 Problems

3 Discrete random variables
3.1 Probability mass functions
3.2 Independence
3.3 Expectation
3.4 Indicators and matching
3.5 Examples of discrete variables
3.6 Dependence
3.7 Conditional distributions and conditional expectation
3.8 Sums of random variables
3.9 Simple random walk
3.10 Random walk: counting sample paths
3.11 Problems

4 Continuous random variables
4.1 Probability density functions
4.2 Independence
4.3 Expectation
4.4 Examples of continuous variables
4.5 Dependence
4.6 Conditional distributions and conditional expectation
4.7 Functions of random variables
4.8 Sums of random variables
4.9 Multivariate normal distribution
4.10 Distributions arising from the normal distribution
4.11 Sampling from a distribution
4.12 Coupling and Poisson approximation
4.13 Geometrical probability
4.14 Problems

5 Generating functions and their applications
5.1 Generating functions
5.2 Some applications
5.3 Random walk
5.4 Branching processes
5.5 Age-dependent branching processes
5.6 Expectation revisited
5.7 Characteristic functions
5.8 Examples of characteristic functions
5.9 Inversion and continuity theorems
5.10 Two limit theorems
5.11 Large deviations
5.12 Problems

6 Markov chains
6.1 Markov processes
6.2 Classification of states
6.3 Classification of chains
6.4 Stationary distributions and the limit theorem
6.5 Reversibility
6.6 Chains with finitely many states
6.7 Branching processes revisited
6.8 Birth processes and the Poisson process
6.9 Continuous-time Markov chains
6.10 Uniform semigroups
6.11 Birth–death processes and imbedding
6.12 Special processes
6.13 Spatial Poisson processes
6.14 Markov chain Monte Carlo
6.15 Problems

7 Convergence of random variables
7.1 Introduction
7.2 Modes of convergence
7.3 Some ancillary results
7.4 Laws of large numbers
7.5 The strong law
7.6 The law of the iterated logarithm
7.7 Martingales
7.8 Martingale convergence theorem
7.9 Prediction and conditional expectation
7.10 Uniform integrability
7.11 Problems

8 Random processes
8.1 Introduction
8.2 Stationary processes
8.3 Renewal processes
8.4 Queues
8.5 The Wiener process
8.6 Existence of processes
8.7 Problems

9 Stationary processes
9.1 Introduction
9.2 Linear prediction
9.3 Autocovariances and spectra
9.4 Stochastic integration and the spectral representation
9.5 The ergodic theorem
9.6 Gaussian processes
9.7 Problems

10 Renewals
10.1 The renewal equation
10.2 Limit theorems
10.3 Excess life
10.4 Applications
10.5 Renewal–reward processes
10.6 Problems

11 Queues
11.1 Single-server queues
11.2 M/M/1
11.3 M/G/1
11.4 G/M/1
11.5 G/G/1
11.6 Heavy traffic
11.7 Networks of queues
11.8 Problems

12 Martingales
12.1 Introduction
12.2 Martingale differences and Hoeffding’s inequality
12.3 Crossings and convergence
12.4 Stopping times
12.5 Optional stopping
12.6 The maximal inequality
12.7 Backward martingales and continuous-time martingales
12.8 Some examples
12.9 Problems

13 Diffusion processes
13.1 Introduction
13.2 Brownian motion
13.3 Diffusion processes
13.4 First passage times
13.5 Barriers
13.6 Excursions and the Brownian bridge
13.7 Stochastic calculus
13.8 The Itô integral
13.9 Itô’s formula
13.10 Option pricing
13.11 Passage probabilities and potentials
13.12 Problems

Bibliography
Index

1 Events and their probabilities

1.2 Exercises. Events as sets
1. Let {Ai : i ∈ I} be a collection of sets. Prove ‘De Morgan’s Laws’†:

( ⋃i Ai )^c = ⋂i Ai^c,   ( ⋂i Ai )^c = ⋃i Ai^c.

2. Let A and B belong to some σ-field F. Show that F contains the sets A ∩ B, A \ B, and A △ B.

3. A conventional knock-out tournament (such as that at Wimbledon) begins with 2^n competitors
and has n rounds. There are no play-offs for the positions 2, 3, . . . , 2^n − 1, and the initial table of
draws is specified. Give a concise description of the sample space of all possible outcomes.

4. Let F be a σ-field of subsets of Ω and suppose that B ∈ F. Show that G = {A ∩ B : A ∈ F} is a
σ-field of subsets of B.
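The identities of Exercise 1 are easy to sanity-check on finite sets before attempting a proof. The following sketch is an editorial addition, not part of the original text: it tests both De Morgan laws on randomly generated families of subsets of a small universe.

```python
import random

def de_morgan_holds(universe, sets):
    """Check both De Morgan laws for a finite family of subsets of `universe`."""
    union = set().union(*sets)
    intersection = set(universe).intersection(*sets)
    complements = [universe - s for s in sets]
    # (union of the A_i)^c should equal the intersection of the complements,
    # and (intersection of the A_i)^c should equal the union of the complements.
    law1 = (universe - union) == set(universe).intersection(*complements)
    law2 = (universe - intersection) == set().union(*complements)
    return law1 and law2

random.seed(0)
universe = set(range(20))
for _ in range(100):
    family = [set(random.sample(sorted(universe), random.randint(0, 20)))
              for _ in range(random.randint(1, 5))]
    assert de_morgan_holds(universe, family)
print("De Morgan's laws hold on all random trials")
```

Such a check proves nothing, of course, but it catches misremembered identities quickly.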
5. Which of the following are identically true? For those that are not, say when they are true.
(a) A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C);
(b) A ∩ (B ∩ C) = (A ∩ B) ∩ C;
(c) (A ∪ B) ∩ C = A ∪ (B ∩ C);
(d) A \ (B ∩ C) = (A \ B) ∪ (A \ C).

1.3 Exercises. Probability
1. Let A and B be events with probabilities P(A) = 3/4 and P(B) = 1/3. Show that
1/12 ≤ P(A ∩ B) ≤ 1/3,
and give examples to show that both extremes are possible. Find corresponding bounds for P(A ∪ B).

2. A fair coin is tossed repeatedly. Show that, with probability one, a head turns up sooner or later.
Show similarly that any given finite sequence of heads and tails occurs eventually with probability
one. Explain the connection with Murphy’s Law.
3. Six cups and saucers come in pairs: there are two cups and saucers which are red, two white, and
two with stars on. If the cups are placed randomly onto the saucers (one each), find the probability
that no cup is upon a saucer of the same pattern.
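Exercise 3 can be settled by brute force as well as by inclusion–exclusion; the following enumeration is our editorial addition, not part of the book. It checks every equally likely placement of the six cups.

```python
from itertools import permutations
from fractions import Fraction

# Saucers in a fixed order: two red, two white, two starred.
saucers = ['R', 'R', 'W', 'W', 'S', 'S']
cups = ['R', 'R', 'W', 'W', 'S', 'S']

# All 6! orderings of the cups are equally likely; count those in which
# no cup sits on a saucer of its own pattern.
hits = sum(all(c != s for c, s in zip(perm, saucers))
           for perm in permutations(cups))
prob = Fraction(hits, 720)
print(prob)  # 1/9
```

The enumeration gives 80 of the 720 placements, i.e. probability 1/9.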
†Augustus De Morgan is well known for having given the first clear statement of the principle of mathematical
induction. He applauded probability theory with the words: “The tendency of our study is to substitute the
satisfaction of mental exercise for the pernicious enjoyment of an immoral stimulus”.

4. Let A1, A2, . . . , An be events where n ≥ 2, and prove that

P( ⋃_{i=1}^n Ai ) = Σ_i P(Ai) − Σ_{i<j} P(Ai ∩ Aj) + Σ_{i<j<k} P(Ai ∩ Aj ∩ Ak) − · · · + (−1)^{n+1} P(A1 ∩ A2 ∩ · · · ∩ An).

In each packet of Corn Flakes may be found a plastic bust of one of the last five Vice-Chancellors
of Cambridge University, the probability that any given packet contains any specific Vice-Chancellor
being 1/5, independently of all other packets. Show that the probability that each of the last three
Vice-Chancellors is obtained in a bulk purchase of six packets is 1 − 3(4/5)^6 + 3(3/5)^6 − (2/5)^6.
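The quoted inclusion–exclusion value can be evaluated exactly and compared against a simulation; this check is our addition, not the book's.

```python
from fractions import Fraction
import random

# Inclusion–exclusion value quoted in the exercise.
exact = 1 - 3 * Fraction(4, 5)**6 + 3 * Fraction(3, 5)**6 - Fraction(2, 5)**6

# Monte Carlo check: each packet holds one of five busts, uniformly at random;
# success means busts 0, 1, 2 (the 'last three') all appear among six packets.
random.seed(1)
trials = 200_000
hits = sum({0, 1, 2} <= set(random.randrange(5) for _ in range(6))
           for _ in range(trials))
print(float(exact), hits / trials)  # both close to 0.349
```

The exact value is 1092/3125 ≈ 0.349.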
5. Let Ar, r ≥ 1, be events such that P(Ar) = 1 for all r. Show that P( ⋂_{r=1}^∞ Ar ) = 1.

6. You are given that at least one of the events Ar, 1 ≤ r ≤ n, is certain to occur, but certainly no
more than two occur. If P(Ar) = p, and P(Ar ∩ As) = q, r ≠ s, show that p ≥ 1/n and q ≤ 2/n.

7. You are given that at least one, but no more than three, of the events Ar, 1 ≤ r ≤ n, occur, where
n ≥ 3. The probability of at least two occurring is 1/2. If P(Ar) = p, P(Ar ∩ As) = q, r ≠ s, and
P(Ar ∩ As ∩ At) = x, r < s < t, show that p ≥ 3/(2n), and q ≤ 4/n.

1.4 Exercises. Conditional probability
1. Prove that P(A | B) = P(B | A)P(A)/P(B) whenever P(A)P(B) ≠ 0. Show that, if P(A | B) >
P(A), then P(B | A) > P(B).
2. For events A1, A2, . . . , An satisfying P(A1 ∩ A2 ∩ · · · ∩ An−1) > 0, prove that

P(A1 ∩ A2 ∩ · · · ∩ An) = P(A1)P(A2 | A1)P(A3 | A1 ∩ A2) · · · P(An | A1 ∩ A2 ∩ · · · ∩ An−1).

3. A man possesses five coins, two of which are double-headed, one is double-tailed, and two are
normal. He shuts his eyes, picks a coin at random, and tosses it. What is the probability that the lower
face of the coin is a head?
He opens his eyes and sees that the coin is showing heads; what is the probability that the lower
face is a head?
He shuts his eyes again, and tosses the coin again. What is the probability that the lower face is
a head?
He opens his eyes and sees that the coin is showing heads; what is the probability that the lower
face is a head?
He discards this coin, picks another at random, and tosses it. What is the probability that it shows
heads?
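For the first two questions of Exercise 3, a Monte Carlo sketch (our addition, not the book's solution) gives a quick numerical check on the conditional-probability reasoning.

```python
import random

# Five coins: two double-headed, one double-tailed, two normal.
coins = [('H', 'H'), ('H', 'H'), ('T', 'T'), ('H', 'T'), ('H', 'T')]

random.seed(2)
n = 200_000
lower_heads = upper_heads = both_heads = 0
for _ in range(n):
    coin = random.choice(coins)
    # The toss puts a uniformly random face uppermost.
    upper, lower = coin if random.random() < 0.5 else coin[::-1]
    lower_heads += (lower == 'H')
    upper_heads += (upper == 'H')
    both_heads += (upper == 'H' and lower == 'H')

print(lower_heads / n)           # P(lower face is a head) ≈ 3/5
print(both_heads / upper_heads)  # P(lower is a head | coin shows a head) ≈ 2/3
```

The estimates 3/5 and 2/3 agree with the direct conditioning argument for the first two questions; the later parts of the exercise condition on more history and are left to the reader.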
4. What do you think of the following ‘proof’ by Lewis Carroll that an urn cannot contain two balls
of the same colour? Suppose that the urn contains two balls, each of which is either black or white;
thus, in the obvious notation, P(BB) = P(BW) = P(WB) = P(WW) = 1/4. We add a black ball, so
that P(BBB) = P(BBW) = P(BWB) = P(BWW) = 1/4. Next we pick a ball at random; the chance
that the ball is black is (using conditional probabilities) 1 · 1/4 + (2/3) · 1/4 + (2/3) · 1/4 + (1/3) · 1/4 = 2/3. However, if
there is probability 2/3 that a ball, chosen randomly from three, is black, then there must be two black
and one white, which is to say that originally there was one black and one white ball in the urn.
5. The Monty Hall problem: goats and cars. (a) Cruel fate has made you a contestant in a game
show; you have to choose one of three doors. One conceals a new car, two conceal old goats. You
choose, but your chosen door is not opened immediately. Instead, the presenter opens another door
to reveal a goat, and he offers you the opportunity to change your choice to the third door (unopened
and so far unchosen). Let p be the (conditional) probability that the third door conceals the car. The
value of p depends on the presenter’s protocol. Devise protocols to yield the values p = 1/2, p = 2/3.
Show that, for α ∈ [1/2, 2/3], there exists a protocol such that p = α. Are you well advised to change
your choice to the third door?
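One concrete protocol is the familiar one in which the presenter always opens an unchosen door hiding a goat, choosing at random when two such doors qualify; it yields p = 2/3. The simulation below is our sketch, not part of the exercise.

```python
import random

def switch_wins(rng):
    """One play under the protocol: the presenter always opens an unchosen
    door that hides a goat (picked at random if two doors qualify)."""
    car = rng.randrange(3)
    pick = rng.randrange(3)
    goats = [d for d in range(3) if d != pick and d != car]
    opened = rng.choice(goats)
    third = next(d for d in range(3) if d not in (pick, opened))
    return third == car  # does switching to the third door win the car?

rng = random.Random(3)
n = 100_000
p = sum(switch_wins(rng) for _ in range(n)) / n
print(p)  # ≈ 2/3 under this protocol
```

Other protocols (for example, a presenter who opens a second door only when your first pick hides the car) give different values of p, which is the point of the exercise.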
(b) In a variant of this question, the presenter is permitted to open the first door chosen, and to reward
you with whatever lies behind. If he chooses to open another door, then this door invariably conceals
a goat. Let p be the probability that the unopened door conceals the car, conditional on the presenter
having chosen to open a second door. Devise protocols to yield the values p = 0, p = 1, and deduce
that, for any α ∈ [0, 1], there exists a protocol with p = α.

6. The prosecutor’s fallacy†. Let G be the event that an accused is guilty, and T the event that
some testimony is true. Some lawyers have argued on the assumption that P(G | T ) = P(T | G).
Show that this holds if and only if P(G) = P(T).

7. Urns. There are n urns of which the rth contains r − 1 red balls and n − r magenta balls. You
pick an urn at random and remove two balls at random without replacement. Find the probability that:
(a) the second ball is magenta;
(b) the second ball is magenta, given that the first is magenta.

1.5 Exercises. Independence
1. Let A and B be independent events; show that A^c, B are independent, and deduce that A^c, B^c
are independent.
2. We roll a die n times. Let Aij be the event that the ith and jth rolls produce the same number.
Show that the events {Aij : 1 ≤ i < j ≤ n} are pairwise independent but not independent.

3. A fair coin is tossed repeatedly. Show that the following two statements are equivalent:
(a) the outcomes of different tosses are independent,
(b) for any given finite sequence of heads and tails, the chance of this sequence occurring in the first
m tosses is 2−m , where m is the length of the sequence. 4. Let " = {1, 2, . . . , p} where p is prime, F be the set of all subsets of ", and P(A) = | A|/ p for
all A ∈ F. Show that, if A and B are independent events, then at least one of A and B is either ∅ or
".
5. Show that the conditional independence of A and B given C neither implies, nor is implied by,
the independence of A and B. For which events C is it the case that, for all A and B, the events A and
B are independent if and only if they are conditionally independent given C?
6. Safe or sorry? Some form of prophylaxis is said to be 90 per cent effective at prevention during
one year’s treatment. If the degrees of effectiveness in different years are independent, show that the
treatment is more likely than not to fail within 7 years.
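The arithmetic behind Exercise 6 is a one-liner: under independence, the probability of no failure in seven years is 0.9^7, so the failure probability is 1 − 0.9^7. A quick check (our addition):

```python
# Failure probability within 7 independent years, each 90% effective.
p_fail = 1 - 0.9**7
print(round(p_fail, 4))  # 0.5217, i.e. more likely than not
```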
7. Families. Jane has three children, each of which is equally likely to be a boy or a girl independently
of the others. Define the events:
A = {all the children are of the same sex},
B = {there is at most one boy},
C = {the family includes a boy and a girl}.
†The prosecution made this error in the famous Dreyfus case of 1894.

(a) Show that A is independent of B, and that B is independent of C.
(b) Is A independent of C?
(c) Do these results hold if boys and girls are not equally likely?
(d) Do these results hold if Jane has four children?

8. Galton’s paradox. You flip three fair coins. At least two are alike, and it is an evens chance that
the third is a head or a tail. Therefore P(all alike) = 1/2. Do you agree?
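A simulation settles Galton's paradox empirically (this check is our addition): with three fair coins, only HHH and TTT of the eight equally likely outcomes are all alike.

```python
import random

random.seed(4)
n = 100_000
# Count trials in which all three coins show the same face.
alike = sum(len(set(random.choice('HT') for _ in range(3))) == 1
            for _ in range(n))
print(alike / n)  # ≈ 1/4, not 1/2
```

The flaw in the quoted argument is that 'the third coin' is not a fixed coin but one selected by looking at the outcome.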
9. Two fair dice are rolled. Show that the event that their sum is 7 is independent of the score shown
by the first die.

1.7 Exercises. Worked examples
1. There are two roads from A to B and two roads from B to C. Each of the four roads is blocked by
snow with probability p, independently of the others. Find the probability that there is an open road
from A to B given that there is no open route from A to C.
If, in addition, there is a direct road from A to C, this road being blocked with probability p
independently of the others, find the required conditional probability.
2. Calculate the probability that a hand of 13 cards dealt from a normal shuffled pack of 52 contains
exactly two kings and one ace. What is the probability that it contains exactly one ace given that it
contains exactly two kings?
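One standard route through Exercise 2 is hypergeometric counting with binomial coefficients; the sketch below (our addition, not the book's worked solution) computes both probabilities exactly.

```python
from math import comb
from fractions import Fraction

# Exactly two kings and exactly one ace in a 13-card hand from 52:
# choose 2 of 4 kings, 1 of 4 aces, and 10 of the other 44 cards.
p_both = Fraction(comb(4, 2) * comb(4, 1) * comb(44, 10), comb(52, 13))

# Exactly two kings, any number of aces.
p_kings = Fraction(comb(4, 2) * comb(48, 11), comb(52, 13))

# Conditional probability of exactly one ace given exactly two kings.
p_ace_given_kings = p_both / p_kings
print(float(p_both), float(p_ace_given_kings))
```

Note that the conditional probability simplifies to C(4,1)·C(44,10)/C(48,11): given the two kings, the remaining 11 cards are a uniform sample from the other 48.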
3. A symmetric random walk takes place on the integers 0, 1, 2, . . . , N with absorbing barriers at 0
and N , starting at k. Show that the probability that the walk is never absorbed is zero.
4. The so-called ‘sure thing principle’ asserts that if you prefer x to y given C, and also prefer x to
y given C c , then you surely prefer x to y. Agreed?
5. A pack contains m cards, labelled 1, 2, . . . , m. The cards are dealt out in a random order, one
by one. Given that the label of the kth card dealt is the largest of the first k cards dealt, what is the
probability that it is also the largest in the pack? 1.8 Problems
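The answer to Exercise 5 turns out to be k/m, since the overall maximum is at position k with probability 1/m while the maximum of the first k cards is at position k with probability 1/k. A conditional simulation (ours, not the book's) supports this.

```python
import random

def estimate(m, k, trials=200_000, seed=5):
    """Estimate P(kth card is largest in the pack | largest of first k)."""
    rng = random.Random(seed)
    hits = cond = 0
    for _ in range(trials):
        deck = list(range(1, m + 1))
        rng.shuffle(deck)
        if deck[k - 1] == max(deck[:k]):   # condition: kth largest so far
            cond += 1
            hits += (deck[k - 1] == m)     # also largest in the pack
    return hits / cond

print(estimate(10, 4))  # ≈ 4/10 = 0.4
```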
1. A traditional fair die is thrown twice. What is the probability that:
(a) a six turns up exactly once?
(b) both numbers are odd?
(c) the sum of the scores is 4?
(d) the sum of the scores is divisible by 3?

2. A fair coin is thrown repeatedly. What is the probability that on the nth throw:
(a) a head appears for the first time?
(b) the numbers of heads and tails to date are equal?
(c) exactly two heads have appeared altogether to date?
(d) at least two heads have appeared to date?

3. Let F and G be σ-fields of subsets of Ω.
(a) Use elementary set operations to show that F is closed under countable intersections; that is, if
A1, A2, . . . are in F, then so is ⋂i Ai.
(b) Let H = F ∩ G be the collection of subsets of Ω lying in both F and G. Show that H is a σ-field.
(c) Show that F ∪ G, the collection of subsets of Ω lying in either F or G, is not necessarily a σ-field.

4. Describe the underlying probability spaces for the following experiments:
(a) a biased coin is tossed three times;
(b) two balls are drawn without replacement from an urn which originally contained two ultramarine
and two vermilion balls;
(c) a biased coin is tossed repeatedly until a head turns up.
5. Show that the probability that exactly one of the events A and B occurs is
P(A) + P(B) − 2P(A ∩ B).

6. Prove that P(A ∪ B ∪ C) = 1 − P(A^c | B^c ∩ C^c)P(B^c | C^c)P(C^c).

7. (a) If A is independent of itself, show that P(A) is 0 or 1.
(b) If P(A) is 0 or 1, show that A is independent of all events B.

8. Let F be a σ-field of subsets of Ω, and suppose P : F → [0, 1] satisfies: (i) P(Ω) = 1, and (ii) P
i...
