Expectation, aka Expected Value (Ch 4.1)
$E[X] = \sum_{x \in S} x\,P(X = x)$, and $E[X + Y] = E[X] + E[Y]$. The sum is taken over all values of $X$.

Expectation of a Function of a Random Variable (Ch 4.2)
Method 1: Let $X$ be a random variable, and let $Y = g(X)$. If the probability distribution of $Y$ is known, then the expectation of $Y$ is
$E[Y] = \sum_{y \in S_Y} y\,P(Y = y)$
Note: we need to find the PMF $P(Y = y)$ before finding $E[Y]$ when using this method.
Method 2: Let $X$ be a random variable with PMF $P(X = x)$, $x \in S$. Let $g(X)$ be a function of $X$. Then the expectation of $Y = g(X)$ is
$E[Y] = E[g(X)] = \sum_{x \in S} g(x)\,P(X = x)$

Variance, a measure of variability or spread (Ch 4.6)
Let $X$ be a random variable with mean $E[X] = \mu < \infty$. Then the variance of $X$ is
$V[X] = E[(X - \mu)^2] = \sum_{x \in S_X} (x - \mu)^2\,P(X = x) = E[X^2] - (E[X])^2$

Standard Deviation, a measure of spread (Ch 4.6)
Let $X$ be a random variable with variance $V[X]$. Then the standard deviation of $X$ is
$SD[X] = \sqrt{V[X]}$

Properties of Variance and Standard Deviation (Ch 4.6)
Let $X$ be a random variable where $E[X]$ and $V[X]$ exist. Then for constants $a$ and $b$,
$E[aX + b] = aE[X] + b$, $\quad V[aX + b] = a^2 V[X]$, $\quad SD[aX + b] = |a| \cdot SD[X]$

General Formula for Variance of a Sum (Ch 4.7)
For RVs $X$ and $Y$ with finite variance,
$V[X + Y] = V[X] + V[Y] + 2\,Cov(X, Y)$
and
$V[X - Y] = V[X] + V[Y] - 2\,Cov(X, Y)$
If $X$ and $Y$ are uncorrelated (in particular, if they are independent),
$V[X + Y] = V[X] + V[Y]$

Expectation and Variance of a Sum of INDEPENDENT Random Variables (Ch 4.7)
If $X$ and $Y$ are uncorrelated, i.e. $Cov(X, Y) = 0$, then
$V[X \pm Y] = V[X] + V[Y]$
Let $X_1, \ldots, X_n$ be $n$ independent random variables. Then
$E[X_1 + \cdots + X_n] = E[X_1] + \cdots + E[X_n]$
and
$V[X_1 + \cdots + X_n] = V[X_1] + \cdots + V[X_n]$

General Formula for Variance of a Linear Combination, aka Sum (Ch 4.7)
For RVs $X$ and $Y$ with finite variance,
$V[aX \pm bY] = a^2 V[X] + b^2 V[Y] \pm 2ab\,Cov(X, Y)$
In general, let $X_1, \ldots, X_n$ be random variables. Then
$V[X_1 + \cdots + X_n] = \sum_{i=1}^{n} \sum_{j=1}^{n} Cov(X_i, X_j) = \sum_{i=1}^{n} V[X_i] + \sum_{i \ne j} Cov(X_i, X_j)$
In particular,
$V[X_1 + X_2 + X_3] = V[X_1] + V[X_2] + V[X_3] + 2\,Cov(X_1, X_2) + 2\,Cov(X_1, X_3) + 2\,Cov(X_2, X_3)$

Covariance (Ch 4.7)
For random variables $X$ and $Y$, with respective means $E[X] = \mu_X$ and $E[Y] = \mu_Y$, the covariance between $X$ and $Y$ is
$Cov(X, Y) = E[(X - \mu_X)(Y - \mu_Y)] = E[XY] - \mu_X \mu_Y = E[XY] - E[X]E[Y]$

Correlation (Ch 4.7)
The correlation between $X$ and $Y$ is
$Corr(X, Y) = \dfrac{Cov(X, Y)}{SD[X]\,SD[Y]} = \dfrac{E[(X - \mu_X)(Y - \mu_Y)]}{\sqrt{V[X]\,V[Y]}}$

Joint Probability Mass Function, aka Joint Distribution (Ch 4.3)
$p_{X,Y}(x, y) = P(X = x, Y = y)$, $\quad x \in S_X$, $y \in S_Y$

Marginal Distributions (Ch 4.3)
If $X$ takes values in a set $S_X$ and $Y$ takes values in a set $S_Y$, then the marginal distribution of $X$ is
$p_X(x) = P(X = x) = \sum_{y \in S_Y} P(X = x, Y = y)$
and the marginal distribution of $Y$ is
$p_Y(y) = P(Y = y) = \sum_{x \in S_X} P(X = x, Y = y)$

Table of Joint PMF and Marginal PMF
[Table not reproduced: joint PMF values $p_{X,Y}(x, y)$, with row and column totals giving the marginal PMFs $p_X(x)$ and $p_Y(y)$.]

Conditional Probability Mass Function (Ch 4.8)
If $X$ and $Y$ are two discrete random variables, then the conditional probability mass function of $Y$ given $X = x$ is
$P(Y = y \mid X = x) = \dfrac{P(Y = y, X = x)}{P(X = x)} = \dfrac{p_{X,Y}(x, y)}{p_X(x)}$
when $P(X = x) > 0$.

Conditional Probability Mass Function for Independent Random Variables (Ch 4.8)
When $X$ and $Y$ are independent, the conditional probability of $Y$ given $X = x$ is
$P(Y = y \mid X = x) = \dfrac{P(Y = y, X = x)}{P(X = x)} = P(Y = y)$
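The formulas above can be checked numerically. Below is a minimal Python sketch, assuming a small made-up joint PMF (the table values are illustrative only, not from the course): it computes the marginal of $X$, the expectations, variances, covariance, and correlation directly from the joint PMF, and confirms that $V[X+Y] = V[X] + V[Y] + 2\,Cov(X, Y)$.

```python
# Minimal sketch (plain Python, no external libraries): verify the expectation,
# variance, covariance, and variance-of-a-sum formulas on a small, made-up
# joint PMF. The probabilities below are illustrative and sum to 1.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.25, (1, 1): 0.15,
    (2, 0): 0.05, (2, 1): 0.25,
}

def E(f):
    """Expectation of f(X, Y) under the joint PMF: sum of f(x, y) * p(x, y)."""
    return sum(f(x, y) * p for (x, y), p in joint.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
VX = E(lambda x, y: x**2) - EX**2               # V[X] = E[X^2] - (E[X])^2
VY = E(lambda x, y: y**2) - EY**2
cov = E(lambda x, y: x * y) - EX * EY           # Cov(X, Y) = E[XY] - E[X]E[Y]
corr = cov / (VX**0.5 * VY**0.5)                # Corr = Cov / (SD[X] SD[Y])

# Variance of the sum computed directly from the distribution of X + Y ...
V_sum_direct = E(lambda x, y: (x + y)**2) - E(lambda x, y: x + y)**2
# ... matches V[X] + V[Y] + 2 Cov(X, Y).
assert abs(V_sum_direct - (VX + VY + 2 * cov)) < 1e-12

# Marginal PMF of X: sum the joint PMF over y for each value of x.
pX = {x: sum(p for (xx, y), p in joint.items() if xx == x) for x in (0, 1, 2)}
print(EX, EY, VX, VY, cov, corr, pX)
```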
Conditional Expectation and Variance of $Y$ Given $X = x$
For discrete random variables $X$ and $Y$, the conditional expectation of $Y$ given $X = x$ is
$E[Y \mid X = x] = \sum_{y \in S_Y} y\,P(Y = y \mid X = x)$
and the conditional variance of $Y$ given $X = x$ is
$V[Y \mid X = x] = E[(Y - E[Y \mid X = x])^2 \mid X = x] = E[Y^2 \mid X = x] - (E[Y \mid X = x])^2$

Moments (Ch 5.2)
Let $X$ be a random variable. The $k$-th moment of $X$ is defined as
$E[X^k] = \sum_{x \in S_X} x^k\,P(X = x)$, where $k = 1, 2, \ldots$
When $k = 1$, the expectation $E[X]$ is the first moment.

Moment Generating Functions (MGF) (Ch 5.2)
Let $X$ be a random variable. The MGF of $X$ is the real-valued function
$M_X(t) = E[e^{tX}] = \sum_{x} e^{tx} p_X(x) = \sum_{x} e^{tx} P(X = x)$
Note that the MGF is a function of $t$ and is determined by the PMF $P(X = x)$.

MGF for the Geometric Distribution
Let $X \sim Geom(p)$. Then the MGF of $X$ is
$M_X(t) = E[e^{tX}] = \sum_{k=1}^{\infty} e^{tk} (1 - p)^{k-1} p = \dfrac{p\,e^t}{1 - e^t(1 - p)}$

MGF for the Binomial Distribution
Let $X \sim Binom(n, p)$. Then the MGF of $X$ is
$M_X(t) = E[e^{tX}] = \sum_{k=0}^{n} e^{tk} \binom{n}{k} p^k (1 - p)^{n-k} = (p\,e^t + 1 - p)^n$

MGF for the Poisson Distribution
Let $X \sim Pois(\lambda)$. Then the MGF of $X$ is
$M_X(t) = E[e^{tX}] = \sum_{k=0}^{\infty} e^{tk} \dfrac{e^{-\lambda}\lambda^k}{k!} = e^{\lambda(e^t - 1)}$

Properties of Moment Generating Functions (Ch 5.2)
Let $M_X(t)$ be the MGF of $X$ where the $k$-th derivative exists. Then:
1. $E[X^k] = M_X^{(k)}(0)$, for $k = 1, 2, \ldots$
2. If $X$ and $Y$ are independent random variables, then the MGF of their sum is the product of their MGFs:
$M_{X+Y}(t) = E[e^{t(X+Y)}] = E[e^{tX}]E[e^{tY}] = M_X(t)\,M_Y(t)$
3. Let $X$ be a random variable with MGF $M_X(t)$ and constants $a \ne 0$ and $b$. Then
$M_{aX+b}(t) = E[e^{t(aX+b)}] = e^{tb} E[e^{atX}] = e^{tb} M_X(at)$
4. MGFs uniquely determine the underlying probability distribution. That is, if two random variables have the same MGF, then they have the same probability distribution.
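As a concrete check of the Poisson MGF and of property 1, here is a minimal Python sketch; the rate $\lambda = 2.5$ and the truncation point $N$ are illustrative choices, not values from the notes. It builds $M_X(t)$ directly from the Poisson PMF, compares it against the closed form $e^{\lambda(e^t - 1)}$, and confirms that the derivative of the MGF at $t = 0$ recovers $E[X]$.

```python
# Minimal sketch (plain Python): numerically check the Poisson MGF formula and
# MGF property 1, E[X] = M_X'(0). The rate lam and truncation N are illustrative.
from math import exp, factorial

lam, N = 2.5, 60             # Poisson rate; first N terms approximate the infinite sums

def pmf(k):
    """Poisson PMF: P(X = k) = e^{-lam} * lam^k / k!"""
    return exp(-lam) * lam**k / factorial(k)

def mgf_from_pmf(t):
    """M_X(t) = E[e^{tX}] = sum over k of e^{tk} P(X = k), truncated at N terms."""
    return sum(exp(t * k) * pmf(k) for k in range(N))

def mgf_closed_form(t):
    """Closed form from the cheat sheet: M_X(t) = exp(lam * (e^t - 1))."""
    return exp(lam * (exp(t) - 1.0))

# The truncated series and the closed form agree up to truncation/rounding error.
for t in (-0.5, 0.0, 0.5, 1.0):
    assert abs(mgf_from_pmf(t) - mgf_closed_form(t)) < 1e-9

# Property 1: the first derivative of the MGF at t = 0 gives the first moment E[X].
h = 1e-5
deriv_at_0 = (mgf_closed_form(h) - mgf_closed_form(-h)) / (2 * h)   # central difference
E_X = sum(k * pmf(k) for k in range(N))                             # E[X] from the PMF
assert abs(deriv_at_0 - E_X) < 1e-6
print(deriv_at_0, E_X)       # both approximately lam = 2.5
```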