Understanding Bivariate Normal Distributions and Their Properties
THEOREM 5.36. Suppose that $X_1$ and $X_2$ have the joint distribution whose p.d.f. is given by Eq. (5.15). Then there exist independent standard normal random variables $Z_1$ and $Z_2$ such that Eqs. (5.14) hold. Also, the mean of $X_i$ is $\mu_i$ and the variance of $X_i$ is $\sigma_i^2$ for $i = 1, 2$. Furthermore, the correlation between $X_1$ and $X_2$ is $\rho$. Finally, the marginal distribution of $X_i$ is the normal distribution with mean $\mu_i$ and variance $\sigma_i^2$ for $i = 1, 2$.
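Eqs. (5.14) and (5.15) are not reproduced on this page, so the following R sketch assumes the usual construction $X_1 = \sigma_1 Z_1 + \mu_1$, $X_2 = \sigma_2\bigl(\rho Z_1 + (1-\rho^2)^{1/2} Z_2\bigr) + \mu_2$ and uses hypothetical parameter values; it simply checks the stated means, variances, and correlation by simulation.

```r
# Sketch: generate bivariate normal pairs from independent standard normals,
# assuming Eqs. (5.14) take the usual form; all parameter values are hypothetical.
set.seed(1)
n   <- 1e5
mu1 <- 2;  mu2 <- -1
sd1 <- 3;  sd2 <- 0.5
rho <- 0.6

z1 <- rnorm(n)                       # independent standard normals
z2 <- rnorm(n)
x1 <- sd1 * z1 + mu1                 # assumed form of Eq. (5.14)
x2 <- sd2 * (rho * z1 + sqrt(1 - rho^2) * z2) + mu2

c(mean(x1), mean(x2))                # approximately mu1, mu2
c(var(x1), var(x2))                  # approximately sd1^2, sd2^2
cor(x1, x2)                          # approximately rho
```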
DEFINITION 5.15 Bivariate Normal Distributions. When the joint p.d.f. of two random variables $X_1$ and $X_2$ is of the form in Eq. (5.15), it is said that $X_1$ and $X_2$ have the bivariate normal distribution with means $\mu_1$ and $\mu_2$, variances $\sigma_1^2$ and $\sigma_2^2$, and correlation $\rho$.

Properties of Bivariate Normal Distributions

THEOREM 5.37 Independence and Correlation. Two random variables $X_1$ and $X_2$ that have a bivariate normal distribution are independent if and only if they are uncorrelated.

THEOREM 5.38 Conditional Distributions. Let $X_1$ and $X_2$ have the bivariate normal distribution whose p.d.f. is Eq. (5.15). The conditional distribution of $X_2$ given that $X_1 = x_1$ is the normal distribution with mean and variance given by
$$E(X_2 \mid x_1) = \mu_2 + \rho\sigma_2\,\frac{x_1 - \mu_1}{\sigma_1}, \qquad \operatorname{Var}(X_2 \mid x_1) = (1 - \rho^2)\sigma_2^2.$$

REMARK 5.1 Conditional Distribution of $X_1$ given that $X_2 = x_2$.
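As a quick empirical check of Theorem 5.38 (an illustrative sketch, not one of the course's R examples), the code below reuses the same assumed construction and hypothetical parameters, keeps only the simulated pairs whose $X_1$ value falls near a chosen $x_1$, and compares the conditional mean and variance of $X_2$ in that slice with the formulas above.

```r
# Sketch (hypothetical parameters): Theorem 5.38's conditional mean and variance
# of X2 given X1 = x1, checked against simulated pairs with X1 near x1.
set.seed(2)
n   <- 1e6
mu1 <- 2;  mu2 <- -1
sd1 <- 3;  sd2 <- 0.5
rho <- 0.6
x0  <- 4                                   # conditioning value x1

z1 <- rnorm(n); z2 <- rnorm(n)
x1 <- sd1 * z1 + mu1                       # assumed construction, as before
x2 <- sd2 * (rho * z1 + sqrt(1 - rho^2) * z2) + mu2

cond_mean <- mu2 + rho * sd2 * (x0 - mu1) / sd1    # E(X2 | x1)
cond_var  <- (1 - rho^2) * sd2^2                   # Var(X2 | x1)

near <- abs(x1 - x0) < 0.05                        # pairs with X1 close to x0
c(cond_mean, mean(x2[near]))                       # should be close
c(cond_var,  var(x2[near]))                        # should be close
```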
EXAMPLE 5.26 Predicting a Person's Weight.

EXAMPLE 5.27 Determining a Marginal Distribution.

Linear Combinations

THEOREM 5.39 Linear Combination of Bivariate Normals. Suppose that two random variables $X_1$ and $X_2$ have a bivariate normal distribution, for which the p.d.f. is specified by Eq. (5.15). Let $Y = a_1 X_1 + a_2 X_2 + b$, where $a_1$, $a_2$, and $b$ are arbitrary given constants. Then $Y$ has the normal distribution with mean $a_1\mu_1 + a_2\mu_2 + b$ and variance $a_1^2\sigma_1^2 + a_2^2\sigma_2^2 + 2 a_1 a_2 \rho \sigma_1 \sigma_2$.
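A minimal simulation check of Theorem 5.39, again with the assumed construction and hypothetical constants $a_1$, $a_2$, $b$:

```r
# Sketch: compare the theoretical mean and variance of Y = a1*X1 + a2*X2 + b
# from Theorem 5.39 with simulated values (hypothetical parameters).
set.seed(3)
n   <- 1e5
mu1 <- 2;  mu2 <- -1
sd1 <- 3;  sd2 <- 0.5
rho <- 0.6
a1  <- 2;  a2 <- -3;  b <- 1

z1 <- rnorm(n); z2 <- rnorm(n)
x1 <- sd1 * z1 + mu1
x2 <- sd2 * (rho * z1 + sqrt(1 - rho^2) * z2) + mu2
y  <- a1 * x1 + a2 * x2 + b

c(a1 * mu1 + a2 * mu2 + b, mean(y))        # theoretical vs. simulated mean
c(a1^2 * sd1^2 + a2^2 * sd2^2 + 2 * a1 * a2 * rho * sd1 * sd2,
  var(y))                                  # theoretical vs. simulated variance
```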
6 Large Random Samples

6.1 The Law of Large Numbers

Several Examples

See examples in R.

The Markov and Chebyshev Inequalities

THEOREM 6.1 Markov Inequality. Suppose that $X$ is a random variable such that $\Pr(X \ge 0) = 1$. Then for every real number $t > 0$,
$$\Pr(X \ge t) \le \frac{E(X)}{t}.$$

THEOREM 6.2 Chebyshev Inequality. Let $X$ be a random variable for which $\operatorname{Var}(X)$ exists. Then for every number $t > 0$,
$$\Pr(|X - E(X)| \ge t) \le \frac{\operatorname{Var}(X)}{t^2}.$$
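Both inequalities are easy to check numerically. The sketch below uses an Exponential(1) sample (a hypothetical choice; any nonnegative distribution with finite variance would do) and compares the empirical tail probabilities with the Markov and Chebyshev bounds.

```r
# Sketch: empirical check that the Markov and Chebyshev bounds hold for a
# nonnegative example distribution (Exponential with rate 1, chosen for illustration).
set.seed(4)
x <- rexp(1e6, rate = 1)       # Pr(X >= 0) = 1, E(X) = 1, Var(X) = 1
t <- 3

mean(x >= t)                   # Pr(X >= t), roughly exp(-3) ~ 0.05
mean(x) / t                    # Markov bound E(X)/t ~ 0.33

mean(abs(x - mean(x)) >= t)    # Pr(|X - E(X)| >= t)
var(x) / t^2                   # Chebyshev bound Var(X)/t^2 ~ 0.11
```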
Properties of the Sample Mean

THEOREM 6.3 Mean and Variance of the Sample Mean. Let $X_1, \ldots, X_n$ be a random sample from a distribution with mean $\mu$ and variance $\sigma^2$. Let $\overline{X}_n$ be the sample mean. Then $E(\overline{X}_n) = \mu$ and $\operatorname{Var}(\overline{X}_n) = \sigma^2/n$.

The Law of Large Numbers

DEFINITION 6.1 Convergence in Probability. A sequence $Z_1, Z_2, \ldots$ of random variables converges to $b$ in probability if for every number $\varepsilon > 0$,
$$\lim_{n\to\infty} \Pr(|Z_n - b| < \varepsilon) = 1.$$
This property is denoted by $Z_n \xrightarrow{p} b$, and is sometimes stated simply as $Z_n$ converges to $b$ in probability.

THEOREM 6.4 Law of Large Numbers. Suppose that $X_1, \ldots, X_n$ form a random sample from a distribution for which the mean is $\mu$ and for which the variance is finite. Let $\overline{X}_n$ denote the sample mean. Then $\overline{X}_n \xrightarrow{p} \mu$.
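A short R illustration of Theorem 6.4 (an illustrative sketch, not the course's own example): sample means of Uniform(0, 1) draws settle near $\mu = 0.5$ as $n$ grows.

```r
# Sketch: the sample mean of Uniform(0, 1) draws approaching mu = 0.5
# as the sample size grows.
set.seed(5)
for (n in c(10, 100, 1000, 10000, 100000)) {
  xbar <- mean(runif(n))       # sample mean of n draws
  cat(sprintf("n = %6d   sample mean = %.4f\n", n, xbar))
}
```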
REMARK 6.1 Weak Laws and Strong Laws.

6.2 The Central Limit Theorem

Examples

See examples in R.

Statement of the Theorem

THEOREM 6.5 Central Limit Theorem (Lindeberg and Lévy). If the random variables $X_1, \ldots, X_n$ form a random sample of size $n$ from a given distribution with mean $\mu$ and variance $\sigma^2$ ($0 < \sigma^2 < \infty$), then for each fixed number $x$,
$$\lim_{n\to\infty} \Pr\!\left(\frac{\overline{X}_n - \mu}{\sigma/n^{1/2}} \le x\right) = \Phi(x),$$
where $\Phi$ denotes the c.d.f. of the standard normal distribution.
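The following sketch illustrates the theorem with a deliberately skewed choice of distribution, Exponential(1) (so $\mu = \sigma = 1$): even at $n = 30$ the empirical c.d.f. of the standardized sample mean is already close to $\Phi$.

```r
# Sketch: standardized sample means of Exponential(1) draws (mu = sigma = 1)
# compared with the standard normal c.d.f.
set.seed(6)
n    <- 30
reps <- 1e5
xbar <- replicate(reps, mean(rexp(n, rate = 1)))
z    <- (xbar - 1) / (1 / sqrt(n))      # (Xbar_n - mu) / (sigma / n^(1/2))

for (x in -2:2)
  cat(sprintf("x = %2d   empirical %.3f   Phi(x) %.3f\n",
              x, mean(z <= x), pnorm(x)))
```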
EXAMPLE 6.1 Determining a Simulation Size. An environmental engineer believes that there are two contaminants in a water supply, arsenic and lead. The actual concentrations of the two contaminants are independent random variables $X$ and $Y$, measured in the same units. The engineer is interested in what proportion of the contamination is lead on average. That is, the engineer wants to know the mean of $R = Y/(X + Y)$. We suppose that it is a simple matter to generate as many independent pseudo-random numbers with the distributions of $X$ and $Y$ as we desire. A common way to obtain an approximation to $E[Y/(X + Y)]$ would be the following: if we sample $n$ pairs $(X_1, Y_1), \ldots, (X_n, Y_n)$ and compute $R_i = Y_i/(X_i + Y_i)$ for $i = 1, \ldots, n$, then the sample mean $\overline{R}_n = \frac{1}{n}\sum_{i=1}^n R_i$ is a sensible approximation to $E(R)$. Now we want to decide how large $n$ should be.
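The distributions of $X$ and $Y$ are not specified on this page, so the sketch below uses lognormal concentrations purely as a hypothetical stand-in. It runs a small pilot simulation to estimate $\operatorname{Var}(R)$ and then sizes $n$ so that $\Pr(|\overline{R}_n - E(R)| \ge 0.01)$ is at most $0.01$, once with a CLT-based bound and once with the cruder Chebyshev bound.

```r
# Sketch: choosing the simulation size n for approximating E(R), R = Y/(X + Y).
# The lognormal distributions for X and Y are hypothetical stand-ins.
set.seed(7)
r_draw <- function(m) {                 # one batch of m simulated values of R
  x <- rlnorm(m, meanlog = 0, sdlog = 1)
  y <- rlnorm(m, meanlog = 0, sdlog = 1)
  y / (x + y)
}

pilot  <- r_draw(1e4)                   # pilot run to estimate Var(R)
eps    <- 0.01                          # desired accuracy
conf   <- 0.99                          # desired confidence
n_clt  <- ceiling((qnorm((1 + conf) / 2) * sd(pilot) / eps)^2)   # CLT-based size
n_cheb <- ceiling(var(pilot) / ((1 - conf) * eps^2))             # Chebyshev-based size
c(n_clt, n_cheb)

mean(r_draw(n_clt))                     # final approximation to E(R)
```

The CLT-based size is far smaller than the Chebyshev-based one, which is the usual reason for invoking the central limit theorem when sizing a simulation like this.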
A first extension: Liapounov's theorem

We will attempt to drop the "identically distributed" assumption.

THEOREM 6.6 Suppose that the random variables $X_1, X_2, \ldots$ are independent and that $E(|X_i - \mu_i|^3) < \infty$ for $i = 1, 2, \ldots$. Also, suppose that
$$\lim_{n\to\infty} \frac{\sum_{i=1}^n E(|X_i - \mu_i|^3)}{\left(\sum_{i=1}^n \sigma_i^2\right)^{3/2}} = 0.$$
Let
$$Y_n = \frac{\sum_{i=1}^n X_i - \sum_{i=1}^n \mu_i}{\left(\sum_{i=1}^n \sigma_i^2\right)^{1/2}}.$$
Then for each fixed number $x$,
$$\lim_{n\to\infty} \Pr(Y_n \le x) = \Phi(x).$$
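A sketch of the theorem in action with summands that are independent but not identically distributed: $X_i \sim \text{Bernoulli}(p_i)$ with varying $p_i$, a hypothetical choice that satisfies Liapounov's condition because the $p_i$ are bounded away from 0 and 1.

```r
# Sketch: Liapounov's theorem with independent, non-identically distributed
# summands X_i ~ Bernoulli(p_i) (hypothetical p_i drawn once and held fixed).
set.seed(8)
n  <- 2000
p  <- runif(n, 0.2, 0.8)                # varying success probabilities
mu <- p                                 # E(X_i)
s2 <- p * (1 - p)                       # Var(X_i)

reps <- 1e4
yn <- replicate(reps, {
  x <- rbinom(n, size = 1, prob = p)    # independent, not identically distributed
  (sum(x) - sum(mu)) / sqrt(sum(s2))    # Y_n from Theorem 6.6
})

for (x0 in -2:2)
  cat(sprintf("x = %2d   empirical %.3f   Phi(x) %.3f\n",
              x0, mean(yn <= x0), pnorm(x0)))
```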