Probability With Applications and R 1st Edition by Robert Dobrow – Ebook PDF Instant Download/Delivery: 9781118589380, 1118589386
Full download of Probability With Applications and R 1st Edition after payment
Product details:
• ISBN 10: 1118589386
• ISBN 13: 9781118589380
• Author: Robert Dobrow
Probability: With Applications and R
"I like the organization of chapters; definitely out-of-the-box thinking." (Referee at a state university, USA.) This book, written at a renowned liberal arts college, introduces beginning probability at a reasonable pace, with mathematical maturity in mind, and through the use of trusted simulation techniques.
Probability With Applications and R 1st Edition Table of Contents:
I.1 WALKING THE WEB
FIGURE I.1: A subgraph of the BGP (Border Gateway Protocol) web graph consisting of major Internet routers. It has about 6,400 vertices and 13,000 edges. Image produced by Ross Richardson and rendered by Fan Chung Graham.
I.2 BENFORD’S LAW
FIGURE I.2: Benford’s law describes the frequencies of first digits for many real-life datasets.
I.3 SEARCHING THE GENOME
FIGURE I.3: Reconstructed evolutionary tree for bears. Polar bears may have evolved from brown bears four to five million years ago with occasional exchange of genes between the two species (shaded gray areas) fluctuating with key climatic events. Credit: Penn State University.
I.4 BIG DATA
FIGURE I.4: Researchers at Rice University have developed a one-pixel camera based on compressed sensing that randomly alters where the light hitting the single pixel originates from within the camera’s field of view as it builds the image. Image courtesy: Rice University.
I.5 FROM APPLICATION TO THEORY
1 FIRST PRINCIPLES
1.1 RANDOM EXPERIMENT, SAMPLE SPACE, EVENT
Example 1.1
Example 1.2
Example 1.3
Example 1.4
1.2 WHAT IS A PROBABILITY?
1.3 PROBABILITY FUNCTION
PROBABILITY FUNCTION
Example 1.5
Example 1.6
TABLE 1.1: Probabilities for majors.
Example 1.7
1.4 PROPERTIES OF PROBABILITIES
TABLE 1.2: Events and sets.
FIGURE 1.1: Venn diagrams.
ADDITION RULE FOR MUTUALLY EXCLUSIVE EVENTS
PROPERTIES OF PROBABILITIES
Example 1.8
FIGURE 1.2: Venn diagram.
1.5 EQUALLY LIKELY OUTCOMES
Example 1.9
Example 1.10
Example 1.11
FIGURE 1.3: Trees in a field.
1.6 COUNTING I
MULTIPLICATION PRINCIPLE
Example 1.12
Example 1.13
Example 1.14
COUNTING PERMUTATIONS
Example 1.15
Example 1.16
Example 1.17
SAMPLING WITH AND WITHOUT REPLACEMENT
Example 1.18
1.7 PROBLEM-SOLVING STRATEGIES: COMPLEMENTS, INCLUSION–EXCLUSION
Example 1.19
Proposition 1.2.
INCLUSION–EXCLUSION
Example 1.20
Example 1.21
1.8 RANDOM VARIABLES
RANDOM VARIABLE
Example 1.22
Example 1.23
Example 1.24
UNIFORM RANDOM VARIABLE
Example 1.25
Example 1.26
TABLE 1.3: Probability distribution for the sum of two dice.
1.9 A CLOSER LOOK AT RANDOM VARIABLES
1.10 A FIRST LOOK AT SIMULATION
MONTE CARLO SIMULATION
R: SIMULATING THE PROBABILITY OF THREE HEADS IN THREE COIN TOSSES
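As a rough illustration of the kind of Monte Carlo simulation this box describes (not the book's code; the number of trials below is an arbitrary choice):
# Monte Carlo estimate of the probability of three heads in three coin tosses
# (the exact answer is 1/8 = 0.125)
trials <- 100000
sims <- replicate(trials, sum(sample(0:1, 3, replace = TRUE)) == 3)
mean(sims)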
R: SIMULATING THE DIVISIBILITY PROBABILITY
1.11 SUMMARY
EXERCISES
Sample space, event, random variable
Probability functions
Equally likely outcomes and counting
Properties of probabilities
FIGURE 1.4: Venn diagram.
Simulation and R
2 CONDITIONAL PROBABILITY
2.1 CONDITIONAL PROBABILITY
CONDITIONAL PROBABILITY
Example 2.1
Example 2.2
FIGURE 2.1
Example 2.3
Example 2.4
R: SIMULATING A CONDITIONAL PROBABILITY
Example 2.5
2.2 NEW INFORMATION CHANGES THE SAMPLE SPACE
Example 2.6
2.3 FINDING P(A AND B)
GENERAL FORMULA FOR P(A AND B)
Example 2.7
Example 2.8
FIGURE 2.2: Tree diagram for picking two balls from a bag of two red and three blue balls.
FIGURE 2.3: Tree diagram for Example 2.8.
Example 2.9
FIGURE 2.4: Tree diagram for blackjack.
R: SIMULATING BLACKJACK
2.3.1 Birthday Problem
FIGURE 2.5: Solving the birthday problem with a tree diagram.
TABLE 2.1: Birthday probabilities.
Example 2.10
FIGURE 2.6: This graph of actual birthday frequencies in the United States is based on 20 years of census data. It was generated in R by Chris Mulligan (http://chmullig.com/2012/06/births-by-day-of-year/). Observe the seasonal trends and how unlikely it is for someone to be born on New Year’s Day, July 4, or Christmas.
2.4 CONDITIONING AND THE LAW OF TOTAL PROBABILITY
LAW OF TOTAL PROBABILITY
FIGURE 2.7: The events B1,…, B5 partition the sample space. The circle represents event A, which is decomposed into the disjoint union A = AB1 ∪ ⋅⋅⋅ ∪ AB5.
Example 2.11
TABLE 2.2: Insurance predictions for probability of auto accident.
Example 2.12 How to ask a sensitive question?
Example 2.13 Finding the largest number.
R: FINDING THE LARGEST NUMBER
Example 2.14 Random permutations.
R: SIMULATING RANDOM PERMUTATIONS
2.5 BAYES FORMULA AND INVERTING A CONDITIONAL PROBABILITY
BAYES FORMULA
Example 2.15 Diagnostic tests.
Example 2.16 Color blindness continued.
Example 2.17 Auto accidents continued.
Example 2.18 Bertrand’s box paradox.
2.6 SUMMARY
EXERCISES
Basics of conditional probability
FIGURE 2.8: Nontransitive dice.
Conditioning, law of total probability, and Bayes formula
Simulation and R
3 INDEPENDENCE AND INDEPENDENT TRIALS
3.1 INDEPENDENCE AND DEPENDENCE
INDEPENDENT EVENTS
Example 3.1
COIN TOSSING IN THE REAL WORLD
Example 3.2
INDEPENDENCE FOR COLLECTION OF EVENTS
Example 3.3
Example 3.4
TABLE 3.1: Distribution of blood type in the United States.
Example 3.5 Coincidences and the birthday problem.
Example 3.6 A before B.
A BEFORE B
Example 3.7 Craps.
3.2 INDEPENDENT RANDOM VARIABLES
INDEPENDENCE OF RANDOM VARIABLES
Example 3.8
TABLE 3.2: Distribution of number of children in U.S. households.
INDEPENDENT RANDOM VARIABLES
3.3 BERNOULLI SEQUENCES
BERNOULLI DISTRIBUTION
Example 3.9
INDEPENDENT AND IDENTICALLY DISTRIBUTED (i.i.d.) SEQUENCES
Example 3.10
Example 3.11
3.4 COUNTING II
TABLE 3.3: Correspondence between subsets and binary lists.
COUNTING SUBSETS
COUNTING k-ELEMENT SUBSETS AND LISTS WITH k ONES
Example 3.12
TABLE 3.4: Common values of binomial coefficients.
Example 3.13
Example 3.14 Texas hold ’em.
Example 3.15
Example 3.16 Lottery.
Example 3.17 Bridge.
2,235,197,406,895,366,368,301,599,999 to one
Example 3.18
Example 3.19
Theorem 3.1. Binomial theorem.
Proof.
Example 3.20 Ballot problem.
TABLE 3.5: Voting outcomes for the ballot problem. A receives three votes and B receives two votes.
FIGURE 3.1: Illustrating the correspondence between “bad” lists that start with A and lists that start with B.
3.5 BINOMIAL DISTRIBUTION
BINOMIAL DISTRIBUTION
VISUALIZING THE BINOMIAL DISTRIBUTION
FIGURE 3.2: Four examples of the binomial distribution.
Example 3.21
Example 3.22
R: WORKING WITH PROBABILITY DISTRIBUTIONS
Example 3.23
R: SIMULATING THE OVERBOOKING PROBABILITY
Example 3.24
TABLE 3.6: Nucleotide frequencies in human DNA.
Example 3.25
Example 3.26 Random graphs.
FIGURE 3.3: Three random graphs on n = 12 vertices generated, respectively, with p = 0.2, 0.5, and 0.9.
3.6 STIRLING’S APPROXIMATION
Example 3.27
TABLE 3.7: Accuracy of Stirling’s approximation.
3.7 POISSON DISTRIBUTION
POISSON DISTRIBUTION
FIGURE 3.4: Four Poisson distributions.
Example 3.28
Example 3.29
Example 3.30 Death by horse kicks.
TABLE 3.8: Deaths by horse kicks in the Prussian cavalry.
R: POISSON DISTRIBUTION
Example 3.31
R: SIMULATING ANNUAL ACCIDENT COST
Example 3.32
3.7.1 Poisson Approximation of Binomial Distribution
Example 3.33
ON THE EDGE
Example 3.34
Example 3.35 Balls, bowls, and bombs.
TABLE 3.9: Bomb hits over London during World War II.
3.7.2 Poisson Limit*
3.8 PRODUCT SPACES
3.9 SUMMARY
EXERCISES
Independence
Counting
Binomial distribution
Poisson distribution
TABLE 3.10: No-hitter baseball games.
Simulation and R
4 RANDOM VARIABLES
PROBABILITY MASS FUNCTION
TABLE 4.1: Discrete probability distributions.
A NOTE ON NOTATION
4.1 EXPECTATION
EXPECTATION
Example 4.1 Scrabble.
TABLE 4.2: Tile values in Scrabble.
Example 4.2 Roulette.
R: PLAYING ROULETTE
Example 4.3 Expectation of uniform distribution.
Example 4.4 Expectation of Poisson distribution.
4.2 FUNCTIONS OF RANDOM VARIABLES
Example 4.5
R: LEMONADE PROFITS
EXPECTATION OF FUNCTION OF A RANDOM VARIABLE
Example 4.6
Example 4.7
Example 4.8
EXPECTATION OF A LINEAR FUNCTION OF X
Example 4.9
4.3 JOINT DISTRIBUTIONS
Example 4.10
Example 4.11 Red ball, blue ball.
MARGINAL DISTRIBUTIONS
Example 4.12 Red ball, blue ball, continued.
Example 4.13
EXPECTATION OF FUNCTION OF TWO RANDOM VARIABLES
4.4 INDEPENDENT RANDOM VARIABLES
Example 4.14
R: DICE AND COINS
FUNCTIONS OF INDEPENDENT RANDOM VARIABLES ARE INDEPENDENT
EXPECTATION OF A PRODUCT OF INDEPENDENT RANDOM VARIABLES
Example 4.15 Random cone.
R: EXPECTED VOLUME
4.4.1 Sums of Independent Random Variables
Example 4.16
TABLE 4.3: Distribution of U.S. households by number of TVs.
Example 4.17
THE SUM OF INDEPENDENT POISSON RANDOM VARIABLES IS POISSON
Example 4.18 Sum of uniforms.
4.5 LINEARITY OF EXPECTATION
LINEARITY OF EXPECTATION
4.5.1 Indicator Random Variables
EXPECTATION OF INDICATOR VARIABLE
Example 4.19 Expectation of binomial distribution.
Example 4.20 Problem of coincidences.
TABLE 4.4: Fixed points of permutations for n = 3.
R: SIMULATING THE MATCHING PROBLEM
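A minimal sketch of such a matching-problem simulation, assuming a random permutation of 1..n with n = 10 and 10,000 trials chosen only for illustration:
# Estimate the probability a random permutation has at least one fixed point
n <- 10
trials <- 10000
matches <- replicate(trials, any(sample(n) == 1:n))
mean(matches)   # compare with 1 - exp(-1), about 0.632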
Example 4.21 St. Petersburg paradox.
4.6 VARIANCE AND STANDARD DEVIATION
VARIANCE
STANDARD DEVIATION
FIGURE 4.1: Four distributions with μ = 4. Variances are (a) 0, (b) 2.08, (c) 4, and (d) 9.
Example 4.22
Example 4.23 Variance of uniform distribution.
Example 4.24 Variance of an indicator.
EXPECTATION AND VARIANCE OF INDICATOR VARIABLE
PROPERTIES OF VARIANCE, STANDARD DEVIATION, AND EXPECTATION
Example 4.25
R: SIMULATION OF POISSON DISTRIBUTION
FIGURE 4.2: Simulation of Poisson(25) distribution. Vertical lines are drawn at x = 15 and x = 35, two standard deviations from the mean.
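A quick sketch reproducing this experiment; the sample size 10,000 is an arbitrary choice, not the book's:
# Simulate the Poisson(25) distribution and mark two standard deviations (sd = 5)
n <- 10000
x <- rpois(n, lambda = 25)
hist(x, breaks = seq(min(x) - 0.5, max(x) + 0.5, by = 1))
abline(v = c(15, 35))
mean(x >= 15 & x <= 35)   # proportion within two standard deviations of the mean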
VARIANCE OF THE SUM OF INDEPENDENT VARIABLES
Example 4.26 Variance of binomial distribution.
Example 4.27
Example 4.28 Roulette continued—how the casino makes money.
R: A MILLION RED BETS
4.7 COVARIANCE AND CORRELATION
COVARIANCE
FIGURE 4.3: Covariance is a measure of linear association between two random variables. Vertical and horizontal lines are drawn at the mean of the marginal distributions.
CORRELATION
UNCORRELATED RANDOM VARIABLES
Example 4.29
Example 4.30
Example 4.31 Red ball, blue ball continued.
GENERAL FORMULA FOR VARIANCE OF A SUM
Example 4.32
VARIANCE OF SUM OF N RANDOM VARIABLES
Example 4.33
Example 4.34 Coincidences continued.
4.8 CONDITIONAL DISTRIBUTION
CONDITIONAL PROBABILITY MASS FUNCTION
Example 4.35
Example 4.36 Red ball, blue ball continued.
Example 4.37 Hip fractures continued.
Example 4.38
Example 4.39
4.8.1 Introduction to Conditional Expectation
CONDITIONAL EXPECTATION OF Y GIVEN X = x
Example 4.40
Example 4.41
Example 4.42
R: SIMULATING A CONDITIONAL EXPECTATION
4.9 PROPERTIES OF COVARIANCE AND CORRELATION*
COVARIANCE PROPERTY: LINEARITY
Theorem 4.1.
Proof.
4.10 EXPECTATION OF A FUNCTION OF A RANDOM VARIABLE*
Proof.
4.11 SUMMARY
EXERCISES
Expectation
TABLE 4.5: Household size by vehicles available.
Joint distribution
FIGURE 4.4: Randomly colored six-by-six board. There are 19 one-by-one black squares, two two-by-two black boards, and no three-by-three or larger black boards.
Variance, standard deviation, covariance, correlation
Conditional distribution, expectation, functions of random variables
Simulation and R
5 A BOUNTY OF DISCRETE DISTRIBUTIONS
5.1 GEOMETRIC DISTRIBUTION
GEOMETRIC DISTRIBUTION
TAIL PROBABILITY OF GEOMETRIC DISTRIBUTION
Example 5.1 Expectation of geometric distribution.
5.1.1 Memorylessness
Example 5.2
MEMORYLESS PROPERTY
Example 5.3
Example 5.4 A hard day’s night at the casino.
5.1.2 Coupon Collecting and Tiger Counting
R: NUMERICAL SOLUTION TO TIGER PROBLEM
5.1.3 How R Codes the Geometric Distribution
R: GEOMETRIC DISTRIBUTION
5.2 NEGATIVE BINOMIAL—UP FROM THE GEOMETRIC
NEGATIVE BINOMIAL DISTRIBUTION
R: NEGATIVE BINOMIAL DISTRIBUTION
Example 5.5
Example 5.6 World Series.
TABLE 5.1: Lengths of 104 World Series, 1903–2011.
Example 5.7
Example 5.8 Problem of points.
R: PROBLEM OF POINTS
5.3 HYPERGEOMETRIC—SAMPLING WITHOUT REPLACEMENT
R: HYPERGEOMETRIC DISTRIBUTION
Example 5.9 Independents.
Example 5.10 Inferring independents and maximum likelihood.
R: MAXIMIZING THE HYPERGEOMETRIC PROBABILITY
Example 5.11
R: SIMULATING ACES IN A BRIDGE HAND
Example 5.12 Counting the homeless.
5.4 FROM BINOMIAL TO MULTINOMIAL
MULTINOMIAL DISTRIBUTION
Example 5.13
TABLE 5.2: Distribution of colors in a bag of candies.
5.4.1 Multinomial Counts
Example 5.14
Theorem 5.1. Multinomial theorem.
Proof.
Example 5.15
R: MULTINOMIAL CALCULATION
Example 5.16 Genetics.
TABLE 5.3: Genotype frequencies for a sample of 60 fruit flies.
5.5 BENFORD’S LAW
TABLE 5.4: Benford’s law.
FIGURE 5.1: Graph of P(n) = arn.
5.6 SUMMARY
EXERCISES
Geometric distribution
Negative binomial distribution
Hypergeometric distribution
Multinomial distribution
Benford’s law
Other
Simulation and R
6 CONTINUOUS PROBABILITY
Example 6.1
6.1 PROBABILITY DENSITY FUNCTION
PROBABILITY DENSITY FUNCTION
Example 6.2
FIGURE 6.1: P(X > 1).
Example 6.3
FIGURE 6.2: Three density shapes.
Example 6.4
Example 6.5
6.2 CUMULATIVE DISTRIBUTION FUNCTION
CUMULATIVE DISTRIBUTION FUNCTION
Example 6.6
Example 6.7
FIGURE 6.3: Density function and cdf.
FIGURE 6.4: Cumulative distribution function P(X ≤ x) for X ∼ Binom(2, 1/2). The cdf has points of discontinuity at x = 0, 1, 2.
CUMULATIVE DISTRIBUTION FUNCTION
6.3 UNIFORM DISTRIBUTION
UNIFORM DISTRIBUTION
Example 6.8
R: UNIFORM DISTRIBUTION
6.4 EXPECTATION AND VARIANCE
EXPECTATION AND VARIANCE FOR CONTINUOUS RANDOM VARIABLES
PROPERTIES OF EXPECTATION AND VARIANCE
Example 6.9 Uniform distribution: expectation and variance.
Example 6.10
EXPECTATION OF FUNCTION OF CONTINUOUS RANDOM VARIABLE
Example 6.11
R: SIMULATING BALLOON VOLUME
6.5 EXPONENTIAL DISTRIBUTION
EXPONENTIAL DISTRIBUTION
Example 6.12
R: EXPONENTIAL DISTRIBUTION
6.5.1 Memorylessness
FIGURE 6.5: Zach arrives at the bus station 10 minutes after Amy. But the distribution of their waiting times is the same.
R: BUS WAITING TIME
MEMORYLESSNESS FOR EXPONENTIAL DISTRIBUTION
FIGURE 6.6: Bathtub curves are used in reliability engineering to model failure rates of a product or component.
Example 6.13
6.6 FUNCTIONS OF RANDOM VARIABLES I
Example 6.14
R: COMPARING THE EXACT DISTRIBUTION WITH A SIMULATION
FIGURE 6.7: Simulated distribution of A = πR², where R ∼ Unif(0, 2). The curve is the density function f(a) = 1/(4√(πa)).
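A short sketch of how such a comparison might be run, assuming only the caption above (sample size chosen arbitrarily):
# Simulate A = pi * R^2 with R ~ Unif(0, 2) and overlay f(a) = 1/(4*sqrt(pi*a))
n <- 10000
r <- runif(n, 0, 2)
a <- pi * r^2
hist(a, prob = TRUE, breaks = 50)
curve(1 / (4 * sqrt(pi * x)), from = 0.01, to = 4 * pi, add = TRUE)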
HOW TO FIND THE DENSITY OF Y = g(X)
Example 6.15 Linear function of a uniform random variable.
Example 6.16 Linear function of a random variable.
Example 6.17
FIGURE 6.8: The geometry of the lighthouse problem.
R: SIMULATING AN EXPECTATION THAT DOES NOT EXIST
6.6.1 Simulating a Continuous Random Variable
INVERSE TRANSFORM METHOD
R: IMPLEMENTING THE INVERSE TRANSFORM METHOD
FIGURE 6.9: Simulating from density f(x) = 2x, for 0 < x < 1 using the inverse transform method.
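A minimal sketch of the inverse transform method for this density, using F(x) = x² so that F⁻¹(u) = √u (not necessarily the book's code):
# Inverse transform sample from f(x) = 2x on (0, 1)
n <- 10000
u <- runif(n)
x <- sqrt(u)
hist(x, prob = TRUE)
curve(2 * x, from = 0, to = 1, add = TRUE)   # overlay the target density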
Example 6.18
6.7 JOINT DISTRIBUTIONS
JOINT DENSITY FUNCTION
Example 6.19
FIGURE 6.10: Joint density f(x, y) = 4xy/15, 1 < x < 4, 0 < y < 1.
JOINT CUMULATIVE DISTRIBUTION FUNCTION
UNIFORM DISTRIBUTION IN TWO DIMENSIONS
Example 6.20
FIGURE 6.11: Shaded region defined by P(X < Y).
FIGURE 6.12: Top: joint density f(x, y) = 3x²y/64. Bottom: domain of joint density. Shaded region shows event {X < Y }.
Example 6.21
FIGURE 6.13: Top: joint density. Bottom: shaded region shows part of event {X ≥ 2 Y}.
Example 6.22
MARGINAL DISTRIBUTIONS FROM JOINT DENSITIES
Example 6.23
EXPECTATION OF FUNCTION OF JOINTLY DISTRIBUTED RANDOM VARIABLES
Example 6.24
6.8 INDEPENDENCE
INDEPENDENCE AND DENSITY FUNCTIONS
Example 6.25
Example 6.26
Example 6.27
6.8.1 Accept–Reject Method
Proof.
FIGURE 6.14: Accept–reject method for simulating points uniformly distributed in the top-left set S. In the top-right, 400 points are generated uniformly on the rectangle. Points inside S are accepted in the bottom left. In bottom right, the accept–reject method is used with an initial 5000 points of which 1970 are accepted.
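A bare-bones accept–reject sketch in this spirit; the unit disk stands in for the unspecified set S, and the point counts are arbitrary:
# Generate points uniformly on a bounding rectangle and keep those inside S
n <- 5000
x <- runif(n, -1, 1)
y <- runif(n, -1, 1)
accept <- x^2 + y^2 <= 1           # membership test for the illustrative region S
plot(x[accept], y[accept], pch = ".", asp = 1)
mean(accept)                       # acceptance rate, about pi/4 for this region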
Example 6.28
Example 6.29
FIGURE 6.15: One thousand points generated in the unit sphere using the accept–reject method.
6.9 COVARIANCE, CORRELATION
COVARIANCE
Example 6.30
R: SIMULATION OF COVARIANCE, CORRELATION
6.10 FUNCTIONS OF RANDOM VARIABLES II
6.10.1 Maximums and Minimums
INEQUALITIES FOR MAXIMUMS AND MINIMUMS
Example 6.31
FIGURE 6.16: Simulated distribution of three independent exponential random variables and their minimum.
MINIMUM OF INDEPENDENT EXPONENTIAL DISTRIBUTIONS
Example 6.32
6.10.2 Sums of Random Variables
Example 6.33 Sum of independent exponentials.
Example 6.34 Sum of independent uniforms.
FIGURE 6.17: Distributions from left to right: uniform, sum of two independent uniforms, sum of three independent uniforms. Histogram is from 100,000 trials. Curve is the theoretical density.
6.11 GEOMETRIC PROBABILITY
Example 6.35 When Angel meets Lisa.
FIGURE 6.18: The shaded region is the event that Angel and Lisa meet.
Example 6.36
Example 6.37
FIGURE 6.19: Let (X, Y) be uniformly distributed on the triangle. The shaded region is the event {X ≤ x}.
Example 6.38 Sum of uniforms revisited.
FIGURE 6.20: Region {X + Y ≤ t} for 0 < t < 1 and 1 < t < 2.
Example 6.39 Buffon’s needle.
FIGURE 6.21: Geometry of Buffon’s needle problem. The needle intersects a line if sin(θ)/2 > D.
FIGURE 6.22: Buffon’s needle problem is solved by finding the area under the curve d = sin(θ)/2 as a proportion of the area of the [0, π] × [0, 1/2] rectangle.
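A hedged Monte Carlo sketch of Buffon's needle using the condition in the captions above, with θ uniform on [0, π] and D uniform on [0, 1/2]; the number of trials is arbitrary:
# Needle crosses a line when D < sin(theta)/2
n <- 100000
theta <- runif(n, 0, pi)
d <- runif(n, 0, 1/2)
crossing <- d < sin(theta) / 2
mean(crossing)        # estimates 2/pi, about 0.6366
2 / mean(crossing)    # crude Monte Carlo estimate of pi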
Ants, fish, and noodles
6.12 SUMMARY
EXERCISES
Density, cdf, expectation, variance
Exponential distribution
Functions of random variables
Joint distributions
Geometric probability
Simulation and R
7 CONTINUOUS DISTRIBUTIONS
7.1 NORMAL DISTRIBUTION
NORMAL DISTRIBUTION
FIGURE 7.1: Three normal distributions.
R: NORMAL DISTRIBUTION
7.1.1 Standard Normal Distribution
LINEAR FUNCTION OF NORMAL RANDOM VARIABLE
Example 7.1
7.1.2 Normal Approximation of Binomial Distribution
FIGURE 7.2: Normal approximation of the binomial distribution. For fixed p, as n gets large the binomial distribution tends to a normal distribution.
Example 7.2
FIGURE 7.3: Approximating the binomial probability P(11 ≤ X ≤ 13) using a normal distribution with and without the continuity correction.
Example 7.3 Dice, continued.
QUANTILE
Example 7.4
Example 7.5 IQR.
Example 7.6
10,000 coin flips
7.1.3 Sums of Independent Normals
SUM OF INDEPENDENT NORMAL RANDOM VARIABLES IS NORMAL
Example 7.7
Example 7.8
AVERAGES OF i.i.d. RANDOM VARIABLES
Example 7.9 Averages are better than single measurements.
Example 7.10 Darts.
FIGURE 7.4: A German 10-mark note honoring the contributions of Carl Friedrich Gauss. The note features a bell curve representing the Gaussian normal distribution.
7.2 GAMMA DISTRIBUTION
FIGURE 7.5: Four gamma distributions.
GAMMA DISTRIBUTION
SUM OF INDEPENDENT EXPONENTIALS HAS GAMMA DISTRIBUTION
Example 7.11
R: SIMULATING THE GAMMA DISTRIBUTION FROM A SUM OF EXPONENTIALS
FIGURE 7.6: The histogram is from simulating the sum of 20 exponential variables with λ = 2. The curve is the density of a gamma distribution with parameters a = 20 and λ = 2.
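A short sketch of the simulation the caption describes, assuming only the stated parameters (20 exponentials, λ = 2; the number of trials is an arbitrary choice):
# Sum 20 independent Exponential(rate = 2) variables and compare with Gamma(20, 2)
trials <- 10000
sums <- replicate(trials, sum(rexp(20, rate = 2)))
hist(sums, prob = TRUE)
curve(dgamma(x, shape = 20, rate = 2), add = TRUE)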
7.2.1 Probability as a Technique of Integration
Example 7.12
Example 7.13
Example 7.14
7.2.2 Sum of Independent Exponentials*
7.3 POISSON PROCESS
FIGURE 7.7: Poisson process: relationships of underlying random variables.
DISTRIBUTION OF Nt FOR POISSON PROCESS WITH PARAMETER λ
Example 7.15 Marketing.
FIGURE 7.8: The calls that Jack sees starting at 10 a.m. have the same probabilistic behavior as the calls that Jill sees starting at 8 a.m. Both are Poisson processes with the same parameter.
PROPERTIES OF POISSON PROCESS
Example 7.16
Example 7.17
R: SIMULATING A POISSON PROCESS
Example 7.18 Waiting time paradox.
R: WAITING TIME PARADOX
R: WAITING TIME SIMULATION CONTINUED
7.4 BETA DISTRIBUTION
FIGURE 7.9: Four beta distributions.
Example 7.19
Example 7.20 Order statistics.
DISTRIBUTION OF ORDER STATISTICS
Example 7.21 Simulating beta random variables.
7.5 PARETO DISTRIBUTION, POWER LAWS, AND THE 80-20 RULE
PARETO DISTRIBUTION
Example 7.22
TABLE 7.1: Comparison of tail probabilities for normal and Pareto distributions.
FIGURE 7.10: Scale-invariance of the Pareto distribution. The density curve is for a Pareto distribution with m = 1 and a = 1.16. The curve is shown on intervals of the form c < x < 5c, for c = 1, 3, 9, 27.
SCALE-INVARIANCE
7.5.1 Simulating the Pareto Distribution
R: SIMULATING THE 80-20 RULE
7.6 SUMMARY
EXERCISES
Normal distribution
TABLE 7.2: SAT statistics for 2011 college-bound seniors.
Gamma distribution, Poisson process
Beta distribution
Pareto, scale-invariant distribution
Simulation and R
8 CONDITIONAL DISTRIBUTION, EXPECTATION, AND VARIANCE
8.1 CONDITIONAL DISTRIBUTIONS
CONDITIONAL DENSITY FUNCTION
FIGURE 8.1: Left: graph of a joint density function. Right: conditional density of Y given X = 4.
Example 8.1
Example 8.2
Example 8.3
Example 8.4
BAYES FORMULA
Example 8.5
FIGURE 8.2: Graph of conditional density of Λ given X = 2. The density function takes its largest value at λ = 1/2.
8.2 DISCRETE AND CONTINUOUS: MIXING IT UP
Example 8.6 Will the sun rise tomorrow?
Example 8.7
R: SIMULATING EXPONENTIAL-POISSON TRAFFIC FLOW MODEL
FIGURE 8.3: Simulations of (i) joint distribution of (Λ, M), (ii) marginal distribution of Λ together with exponential density curve (λ = 1), (iii) marginal distribution of M, and (iv) geometric distribution (p = 1/101).
8.3 CONDITIONAL EXPECTATION
CONDITIONAL EXPECTATION OF Y GIVEN X = x
Example 8.8
Example 8.9
Example 8.10
8.3.1 From Function to Random Variable
CONDITIONAL EXPECTATION E[Y|X]
Example 8.11
LAW OF TOTAL EXPECTATION
Example 8.12
Example 8.13
Example 8.14 At the gym, continued.
PROPERTIES OF CONDITIONAL EXPECTATION
Example 8.15
Example 8.16
Example 8.17
8.3.2 Random Sum of Random Variables
8.4 COMPUTING PROBABILITIES BY CONDITIONING
Example 8.18
Example 8.19
Example 8.20
Example 8.21
FIGURE 8.4: Cumulative distribution function of insurance payout has jump discontinuity at x = 0. The distribution has both discrete and continuous components.
8.5 CONDITIONAL VARIANCE
CONDITIONAL VARIANCE OF Y GIVEN X = x
Example 8.22
PROPERTIES OF CONDITIONAL VARIANCE
LAW OF TOTAL VARIANCE
Example 8.23
R: SIMULATION OF TWO-STAGE UNIFORM EXPERIMENT
Example 8.24
FIGURE 8.5: Number of exams versus number of A’s. Correlation is 0.775.
Example 8.25 Random sums continued.
R: TOTAL SPENDING AT ALICE’S RESTAURANT
8.6 SUMMARY
EXERCISES
Conditional distributions
Conditional expectation
Computing probabilities with conditioning
Conditional variance
Simulation and R
9 LIMITS
FIGURE 9.1: Mathematician John Kerrich tossed 10,000 coins when he was interned in a prisoner of war camp during World War II. His results illustrate the law of large numbers.
THE “LAW OF AVERAGES” AND A RUN OF BLACK AT THE CASINO
9.1 WEAK LAW OF LARGE NUMBERS
FIGURE 9.2: An illustration of the weak law of large numbers for Bernoulli coin flips with ε = 0.01.
R: WEAK LAW OF LARGE NUMBERS
9.1.1 Markov and Chebyshev Inequalities
Theorem 9.1. Markov’s inequality.
Proof.
Example 9.1
Corollary 9.2.
Theorem 9.3. Chebyshev’s inequality.
Proof.
Example 9.2
Example 9.3
Example 9.4
Theorem 9.4. Weak law of large numbers.
Proof.
9.2 STRONG LAW OF LARGE NUMBERS
Theorem 9.5. Strong law of large numbers.
FIGURE 9.3: Nine realizations of 1000 coin flips.
FIGURE 9.4: Sequences of averages (n = 50,000) for the Cauchy distribution whose expectation does not exist. Observe the erratic behavior for some sequences.
Example 9.5
Example 9.6 Estimating the area of the United States.
FIGURE 9.5: Using probability and random numbers to estimate the area of the United States.
9.3 MONTE CARLO INTEGRATION
MONTE CARLO INTEGRATION
Example 9.7
R: MONTE CARLO INTEGRATION
Example 9.8
R: MONTE CARLO INTEGRATION
Example 9.9
R: MONTE CARLO SUMMATION
Example 9.10
R: MONTE CARLO INTEGRATION
9.4 CENTRAL LIMIT THEOREM
CENTRAL LIMIT THEOREM
FIGURE 9.6: Histogram of (Sn/n − μ)/(σ/√n) from an underlying sequence of n = 1000 Bernoulli trials. Density curve is standard normal.
R: SIMULATION EXPERIMENT
FIGURE 9.7: The simulated distribution of (Sn/n − μ)/(σ/√n) for four population distributions (n = 1000).
EQUIVALENT EXPRESSIONS FOR THE CENTRAL LIMIT THEOREM
Example 9.11
Example 9.12
TABLE 9.1: Grade distribution for AP exams.
Example 9.13 Random walk.
FIGURE 9.8: Four random walk paths of 10,000 steps. The horizontal axis represents the number of steps n. The vertical axis is position.
R: RANDOM WALK DISTANCE FROM ORIGIN
Example 9.14
R: SUM OF SIX DICE ARE CLOSE TO NORMAL
FIGURE 9.9: The normalized sum of just six dice throws comes close to the normal distribution.
9.4.1 Central Limit Theorem and Monte Carlo
TABLE 9.2: Monte Carlo approximation of the mean of a uniform (0,1) distribution. Compare the precision for n = 10¹, 10³, 10⁵, 10⁷. Each simulation is repeated 12 times.
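A rough sketch of how Table 9.2 could be reproduced; the output format below is illustrative, not the book's:
# Monte Carlo estimates of the mean of Unif(0,1), 12 repetitions per sample size
for (n in c(10, 10^3, 10^5, 10^7)) {
  estimates <- replicate(12, mean(runif(n)))
  cat("n =", n, " range of 12 estimates:", range(estimates), "\n")
}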
9.5 MOMENT-GENERATING FUNCTIONS
MOMENT-GENERATING FUNCTION
Example 9.15 Poisson distribution.
Example 9.16
Example 9.17 Standard normal distribution.
PROPERTIES OF MOMENT GENERATING FUNCTIONS
Example 9.18 Sum of independent Poissons is Poisson.
Example 9.19 Sum of i.i.d. exponentials is gamma.
Theorem 9.7. Continuity theorem.
9.5.1 Proof of Central Limit Theorem
Proof.
9.6 SUMMARY
EXERCISES
Law of Large Numbers
Central Limit Theorem
Moment-Generating Functions
Simulation and R
R: STRONG LAW OF LARGE NUMBERS
10 ADDITIONAL TOPICS
10.1 BIVARIATE NORMAL DISTRIBUTION
BIVARIATE STANDARD NORMAL DISTRIBUTION
FIGURE 10.1: Bivariate standard normal distribution.
FIGURE 10.2: Contour plots of standard bivariate normal densities with (a) ρ = − 0.75, (b) ρ = 0,(c) ρ = 0.5, and (d) ρ = 0.9.
FIGURE 10.3: Distribution of arrowtooth flounder in Alaskan fisheries. Ellipses representing 30% probability contours of bivariate normal distribution fit to EBS survey CPUE data for arrowtooth flounder for the five coldest (black; 1994, 1999, 2008–2010) and warmest (gray; 1996, 1998, 2003–2005) years from 1982 to 2010. Source: Hollowed et al. (2011)
BIVARIATE NORMAL DENSITY
Example 10.1 Fathers and sons.
FIGURE 10.4: Galton’s height data for fathers and sons are well fit by a bivariate normal distribution with parameters (μF, μS, σ²F, σ²S, ρ) = (69, 70, 22, 22, 0.5).
PROPERTIES OF BIVARIATE STANDARD NORMAL DISTRIBUTION
R: SIMULATING BIVARIATE NORMAL RANDOM VARIABLES
FIGURE 10.5: Plot of 1000 observations from bivariate standard normal distribution with ρ = − 0.75.
BIVARIATE NORMAL: CONDITIONAL DISTRIBUTION OF Y GIVEN X = x
Example 10.2 Fathers and sons continued.
10.2 TRANSFORMATIONS OF TWO RANDOM VARIABLES
JOINT DENSITY OF V AND W
Example 10.3
Example 10.4
Example 10.5 Bivariate standard normal density.
10.3 METHOD OF MOMENTS
Example 10.6
TABLE 10.1: Bacteria counts at a water purification system.
FIGURE 10.6: Negative binomial distribution with parameters r = 1.77 and p = 0.143 fits water sample data.
10.4 RANDOM WALK ON GRAPHS
FIGURE 10.7: Graph on four vertices.
Example 10.7 Cycle, complete graph.
FIGURE 10.8: Left: cycle graph on k = 9 vertices. Right: complete graph on k = 5 vertices.
Example 10.8 Card shuffling by random transpositions.
Example 10.9 Hypercube.
FIGURE 10.9: The k-hypercube graph for k = 2 (left) and k = 3 (right).
10.4.1 Long-Term Behavior
LIMITING DISTRIBUTION
TABLE 10.2: Simple random walk for the cycle graph on nine vertices after n steps. Simulation of Xn for n = 5, 10, 50, 100.
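A compact sketch of this simulation for the nine-vertex cycle; the starting vertex, step count, and number of replications below are illustrative choices:
# Simple random walk on the cycle graph with nine vertices: from each vertex the
# walk moves to one of its two neighbors with probability 1/2
walk <- function(steps, k = 9) {
  x <- 1
  for (i in seq_len(steps)) x <- ((x - 1 + sample(c(-1, 1), 1)) %% k) + 1
  x
}
table(replicate(5000, walk(50))) / 5000   # close to uniform (1/9 per vertex)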
LIMITING DISTRIBUTION FOR SIMPLE RANDOM WALK ON A GRAPH
R: RANDOM WALK ON A GRAPH
10.5 RANDOM WALKS ON WEIGHTED GRAPHS AND MARKOV CHAINS
Example 10.10
R: WEATHER MARKOV CHAIN
FIGURE 10.10: Weighted graph.
Example 10.11 PageRank.
10.5.1 Stationary Distribution
STATIONARY DISTRIBUTION
Example 10.12
STATIONARY, LIMITING DISTRIBUTION FOR RANDOM WALK ON WEIGHTED GRAPHS
Theorem 10.2.
Proof.
Example 10.13 Two-state Markov chain and Alexander Pushkin.
FIGURE 10.11: Weighted graph for the general two-state Markov chain.
10.6 FROM MARKOV CHAIN TO MARKOV CHAIN MONTE CARLO
Theorem 10.3. (MCMC: Metropolis–Hastings algorithm).
R: MCMC—A TOY EXAMPLE
Example 10.14
Example 10.15 Cryptography
DECODING THE MESSAGE
R: SIMULATION OF BIVARIATE STANDARD NORMAL DISTRIBUTION
FIGURE 10.12: Gibbs sampler simulation of bivariate standard normal with ρ = −0.5.
Example 10.16
R: GIBBS SIMULATION OF TRIVARIATE DISTRIBUTION
FIGURE 10.13: Distribution of the marginal distribution of X from 10,000 runs of the Gibbs sampler. Each sampler is run for 500 iterations.
10.7 SUMMARY
EXERCISES
Bivariate normal distribution
Transformations of random variables
Method of Moments
Random walk on graphs and Markov chains
FIGURE 10.14: Lollipop graph on nine vertices.
FIGURE 10.15: Weighted graph.