

  1. 18.175 Lecture 5: More integration and expectation. Scott Sheffield, MIT.

  2. Outline
     - Integration
     - Expectation

  3. Outline
     - Integration
     - Expectation

  4. Recall Lebesgue integration
     - Lebesgue: If you can measure, you can integrate.
     - In more words: if (Ω, F) is a measure space with a measure µ with µ(Ω) < ∞, and f : Ω → R is F-measurable, then we can define ∫ f dµ (for non-negative f; also if both f ∨ 0 and −(f ∧ 0) have finite integrals...).
     - Idea: define the integral, then verify linearity and positivity (a.e. non-negative functions have non-negative integrals) in 4 cases:
       - f takes only finitely many values.
       - f is bounded (hint: reduce to the previous case by rounding down or up to the nearest multiple of ε, for ε → 0).
       - f is non-negative (hint: reduce to the previous case by taking f ∧ N for N → ∞).
       - f is any measurable function (hint: treat positive and negative parts separately; the difference makes sense if both integrals are finite).
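The bootstrap above (simple → bounded → non-negative) can be mimicked numerically. Below is a minimal sketch, assuming Lebesgue measure on [0, 1] approximated by a fine grid; the grid size, the function f, and the choices of ε and N are illustrative and not from the slides.

```python
import numpy as np

# Toy setup: Omega = [0, 1] with Lebesgue measure, approximated by a fine grid.
# This only illustrates the bootstrap on the slide (simple -> bounded -> non-negative);
# the grid, eps, and N below are arbitrary choices, not part of the lecture.
x = np.linspace(0, 1, 1_000_000, endpoint=False)
dx = 1.0 / len(x)                        # measure of each grid cell

f = lambda t: 1.0 / np.sqrt(t + 1e-12)   # non-negative, unbounded near 0

def integral_simple(values):
    """Integral of a 'simple' function given by its values on the grid cells."""
    return np.sum(values) * dx

def integral_bounded(values, eps):
    """Case 2: round a bounded f down to the nearest multiple of eps (a simple function)."""
    return integral_simple(np.floor(values / eps) * eps)

def integral_nonnegative(values, eps, N):
    """Case 3: truncate a non-negative f at level N, then use the bounded case."""
    return integral_bounded(np.minimum(values, N), eps)

vals = f(x)
for N in [10, 100, 1000]:
    # increases toward 2, the value of the integral of t^{-1/2} over [0, 1]
    print(N, integral_nonnegative(vals, eps=1e-3, N=N))
```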

  5. Lebesgue integration
     - Theorem: if f and g are integrable, then:
       - If f ≥ 0 a.s. then ∫ f dµ ≥ 0.
       - For a, b ∈ R, we have ∫ (af + bg) dµ = a ∫ f dµ + b ∫ g dµ.
       - If g ≤ f a.s. then ∫ g dµ ≤ ∫ f dµ.
       - If g = f a.e. then ∫ g dµ = ∫ f dµ.
       - |∫ f dµ| ≤ ∫ |f| dµ.
     - When (Ω, F, µ) = (R^d, R^d, λ), write ∫_E f(x) dx = ∫ 1_E f dλ.
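The linearity and triangle-inequality properties are easy to sanity-check numerically. A small sketch, with µ taken to be Uniform[0, 1] and the functions f, g and scalars a, b chosen arbitrarily for illustration:

```python
import numpy as np

# Quick numerical check of linearity and |∫ f dµ| ≤ ∫ |f| dµ for a probability
# measure µ, approximated here by averaging over many samples (toy setup, not from the slides).
rng = np.random.default_rng(0)
omega = rng.uniform(0, 1, size=1_000_000)   # samples from µ = Uniform[0, 1]

f = np.sin(2 * np.pi * omega)
g = omega ** 2
a, b = 3.0, -2.0

integral = lambda h: h.mean()               # ∫ h dµ approximated by a sample average

print(np.isclose(integral(a * f + b * g), a * integral(f) + b * integral(g)))   # linearity
print(abs(integral(f)) <= integral(np.abs(f)))                                   # triangle inequality
```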

  6. Outline
     - Integration
     - Expectation

  7. Outline
     - Integration
     - Expectation

  8. Expectation
     - Given a probability space (Ω, F, P) and a random variable X, we write EX = ∫ X dP. Always defined if X ≥ 0, or if the integrals of max{X, 0} and min{X, 0} are separately finite.
     - EX^k is called the kth moment of X. Also, if m = EX, then E(X − m)^2 is called the variance of X.
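A quick empirical illustration of EX, the kth moment, and the variance, using Monte Carlo averages in place of ∫ X dP; the exponential distribution here is an arbitrary choice, not something from the slides:

```python
import numpy as np

# Empirical illustration of EX, the kth moment EX^k, and Var(X) = E(X - m)^2.
rng = np.random.default_rng(1)
X = rng.exponential(scale=2.0, size=1_000_000)   # EX = 2, Var X = 4 for this choice

m = X.mean()                      # EX
second_moment = (X ** 2).mean()   # EX^2, the k = 2 moment
variance = ((X - m) ** 2).mean()  # E(X - m)^2

# The last two columns agree: E(X - m)^2 = EX^2 - (EX)^2.
print(m, second_moment, variance, second_moment - m ** 2)
```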

  9. Properties of expectation/integration
     - Jensen's inequality: If µ is a probability measure and φ : R → R is convex, then φ(∫ f dµ) ≤ ∫ φ(f) dµ. If X is a random variable, then Eφ(X) ≥ φ(EX).
     - Main idea of proof: approximate φ from below by a linear function L that agrees with φ at EX.
     - Applications: utility, hedge fund payout functions.
     - Hölder's inequality: Write ‖f‖_p = (∫ |f|^p dµ)^{1/p} for 1 ≤ p < ∞. If 1/p + 1/q = 1, then ∫ |fg| dµ ≤ ‖f‖_p ‖g‖_q.
     - Main idea of proof: rescale so that ‖f‖_p = ‖g‖_q = 1. Use some basic calculus to check that for any positive x and y we have xy ≤ x^p/p + y^q/q. Write x = |f|, y = |g| and integrate to get ∫ |fg| dµ ≤ 1/p + 1/q = 1 = ‖f‖_p ‖g‖_q.
     - Cauchy-Schwarz inequality: Special case p = q = 2. Gives ∫ |fg| dµ ≤ ‖f‖_2 ‖g‖_2. Says that the dot product of two vectors is at most the product of the vector lengths.
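These inequalities can also be checked on samples. A sketch, with the distribution, the exponents p and q, and the functions f, g all chosen arbitrarily for illustration:

```python
import numpy as np

# Numerical sanity checks of Jensen, Hölder, and Cauchy-Schwarz on samples
# from a probability measure (all specific choices below are illustrative).
rng = np.random.default_rng(2)
X = rng.gamma(shape=2.0, scale=1.0, size=1_000_000)
E = lambda h: h.mean()

# Jensen with the convex function phi(x) = x^2: E phi(X) >= phi(EX).
print(E(X ** 2) >= E(X) ** 2)

# Hölder with p = 3, q = 3/2 (so 1/p + 1/q = 1): E|fg| <= ||f||_p ||g||_q.
p, q = 3.0, 1.5
f, g = np.sin(X), np.exp(-X)
print(E(np.abs(f * g)) <= E(np.abs(f) ** p) ** (1 / p) * E(np.abs(g) ** q) ** (1 / q))

# Cauchy-Schwarz is the special case p = q = 2.
print(E(np.abs(f * g)) <= np.sqrt(E(f ** 2) * E(g ** 2)))
```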

  10. Bounded convergence theorem
     - Bounded convergence theorem: Consider a probability measure µ and suppose |f_n| ≤ M a.s. for all n and some fixed M > 0, and that f_n → f in probability (i.e., lim_{n→∞} µ{x : |f_n(x) − f(x)| > ε} = 0 for all ε > 0). Then ∫ f dµ = lim_{n→∞} ∫ f_n dµ.
     - (Build a counterexample for an infinite measure space using wide and short rectangles?...)
     - Main idea of proof: for any ε, δ we can take n large enough so that ∫ |f_n − f| dµ < Mδ + ε.
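A small numerical sketch of the statement, together with the "wide and short rectangles" counterexample hinted at on the slide; the specific sequence f_n below is my own choice:

```python
import numpy as np

# Bounded convergence on ([0, 1], Lebesgue): f_n bounded by a fixed M, f_n -> f,
# and the integrals converge. The particular f_n is illustrative only.
x = np.linspace(0, 1, 1_000_000, endpoint=False)
dx = 1.0 / len(x)
integral = lambda h: h.sum() * dx

f = x                                    # limit function
for n in [1, 10, 100, 1000]:
    f_n = x + np.sin(n * x) / n          # |f_n| <= 2 =: M, and f_n -> f
    print(n, integral(f_n))              # converges to 0.5, the integral of f over [0, 1]

# Counterexample with infinite measure (wide, short rectangles): on R with Lebesgue
# measure, take f_n = (1/n) * 1_[0, n]. Then f_n -> 0 pointwise and |f_n| <= 1,
# but the integral of f_n equals 1 for every n.
```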

  11. Fatou's lemma
     - Fatou's lemma: If f_n ≥ 0, then lim inf_{n→∞} ∫ f_n dµ ≥ ∫ (lim inf_{n→∞} f_n) dµ.
     - (Counterexample for the opposite-direction inequality using thin and tall rectangles?)
     - Main idea of proof: first reduce to the case that the f_n are increasing by writing g_n(x) = inf_{m≥n} f_m(x) and observing that g_n(x) ↑ g(x) = lim inf_{n→∞} f_n(x). Then truncate, use bounded convergence, and take limits.
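The "thin and tall rectangles" counterexample mentioned on the slide can be made concrete on ([0, 1], Lebesgue); the grid discretization below is just for illustration:

```python
import numpy as np

# f_n = n * 1_[0, 1/n] -> 0 a.e., yet the integral of f_n equals 1 for every n,
# so liminf of the integrals is 1 while the integral of the liminf is 0:
# Fatou's inequality can be strict, and the opposite direction fails.
x = np.linspace(0, 1, 1_000_000, endpoint=False)
dx = 1.0 / len(x)
integral = lambda h: h.sum() * dx

for n in [10, 100, 1000]:
    f_n = np.where(x < 1.0 / n, float(n), 0.0)
    print(n, integral(f_n))   # stays at (approximately) 1 while f_n -> 0 a.e.
```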

  12. More integral properties
     - Monotone convergence: If f_n ≥ 0 and f_n ↑ f, then ∫ f_n dµ ↑ ∫ f dµ.
     - Main idea of proof: one direction is obvious; Fatou gives the other.
     - Dominated convergence: If f_n → f a.e. and |f_n| ≤ g for all n with g integrable, then ∫ f_n dµ → ∫ f dµ.
     - Main idea of proof: Fatou for the functions g + f_n ≥ 0 gives one side; Fatou for g − f_n ≥ 0 gives the other.
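A sketch of dominated convergence in the same discretized setting; the particular f_n and dominating function g are illustrative choices, not from the slides:

```python
import numpy as np

# Dominated convergence on ([0, 1], Lebesgue): f_n -> f pointwise, |f_n| <= g with g
# integrable, so the integrals of f_n converge to the integral of f.
x = np.linspace(0, 1, 1_000_000, endpoint=False)[1:]   # drop t = 0 to avoid division by zero
dx = 1.0 / 1_000_000
integral = lambda h: h.sum() * dx

g = 1.0 / np.sqrt(x)                     # integrable dominating function; its integral is 2
for n in [1, 10, 100, 1000]:
    f_n = np.cos(x / n) / np.sqrt(x)     # |f_n| <= g, and f_n -> f = 1/sqrt(x) pointwise
    print(n, integral(f_n))              # converges to approximately 2, the integral of f
```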

  13. MIT OpenCourseWare, http://ocw.mit.edu. 18.175 Theory of Probability, Spring 2014. For information about citing these materials or our Terms of Use, visit http://ocw.mit.edu/terms.

