18.175 Lecture 26: More on martingales
Scott Sheffield, MIT

  1. 18.175 Lecture 26: More on martingales. Scott Sheffield, MIT.

  2. Outline: conditional expectation; regular conditional probabilities; martingales; arcsin law and other SRW stories.

  3. Recall: conditional expectation. Say we're given a probability space (Ω, F_0, P), a σ-field F ⊂ F_0, and a random variable X measurable w.r.t. F_0 with E|X| < ∞. The conditional expectation of X given F is a new random variable, which we can denote by Y = E(X|F). We require that Y is F-measurable and that for all A in F we have ∫_A X dP = ∫_A Y dP. Any Y satisfying these properties is called a version of E(X|F). Theorem: up to redefinition on a measure-zero set, the random variable E(X|F) exists and is unique. This follows from the Radon-Nikodym theorem.
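
To make the definition concrete: a minimal Python sketch (an illustration added here, not from the lecture) of E(X|F) when F is generated by a finite partition of Ω. On each cell of the partition, E(X|F) is the probability-weighted average of X over that cell, and the defining identity ∫_A X dP = ∫_A Y dP can be checked cell by cell. The two-coin-flip setup is an arbitrary choice.

```python
from itertools import product

# Omega = outcomes of two fair coin flips; X = number of heads.
omega = list(product("HT", repeat=2))
p = {w: 0.25 for w in omega}
X = {w: w.count("H") for w in omega}

# F = sigma-field generated by the first flip: cells {H*} and {T*}.
cells = [[w for w in omega if w[0] == c] for c in "HT"]

def cond_exp(X, cells, p):
    """Return Y = E(X|F) as a dict on Omega; Y is constant on each cell."""
    Y = {}
    for cell in cells:
        avg = sum(X[w] * p[w] for w in cell) / sum(p[w] for w in cell)
        for w in cell:
            Y[w] = avg
    return Y

Y = cond_exp(X, cells, p)
# Defining property: the integrals of X and Y agree on every A in F.
for cell in cells:
    assert abs(sum((X[w] - Y[w]) * p[w] for w in cell)) < 1e-12
print(Y)  # 1.5 on the cell starting with H, 0.5 on the cell starting with T
```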

  4. Conditional expectation observations. Linearity: E(aX + Y|F) = aE(X|F) + E(Y|F). If X ≤ Y then E(X|F) ≤ E(Y|F). If X_n ≥ 0 and X_n ↑ X with EX < ∞, then E(X_n|F) ↑ E(X|F) (by dominated convergence). If F_1 ⊂ F_2 then E(E(X|F_1)|F_2) = E(X|F_1) and E(E(X|F_2)|F_1) = E(X|F_1). The second identity is kind of interesting: it says that, after I learn F_1, my best guess of what my best guess for X will be after learning F_2 is simply my current best guess for X. Deduce that E(X|F_i) is a martingale if F_i is an increasing sequence of σ-algebras and E|X| < ∞.
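
The tower property E(E(X|F_2)|F_1) = E(X|F_1) can be checked numerically. A small sketch under illustrative assumptions (three fair coin flips, F_k generated by the first k flips, an arbitrary integrable X):

```python
from itertools import product

omega = list(product((0, 1), repeat=3))   # three fair coin flips
p = {w: 1 / 8 for w in omega}
X = {w: sum(w) ** 2 for w in omega}       # any integrable X works here

def prefix_cells(k):
    """Partition of Omega generated by the first k flips."""
    return [[w for w in omega if w[:k] == pre]
            for pre in product((0, 1), repeat=k)]

def cond_exp(X, cells):
    Y = {}
    for cell in cells:
        avg = sum(X[w] * p[w] for w in cell) / sum(p[w] for w in cell)
        for w in cell:
            Y[w] = avg
    return Y

F1, F2 = prefix_cells(1), prefix_cells(2)
lhs = cond_exp(cond_exp(X, F2), F1)       # E(E(X|F_2)|F_1)
rhs = cond_exp(X, F1)                     # E(X|F_1)
assert all(abs(lhs[w] - rhs[w]) < 1e-12 for w in omega)
```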

  5. Regular conditional probability. Consider a probability space (Ω, F, P), a measurable map X : (Ω, F) → (S, S), and a σ-field G ⊂ F. Then µ : Ω × S → [0, 1] is a regular conditional distribution for X given G if: for each A, ω → µ(ω, A) is a version of P(X ∈ A|G); and for a.e. ω, A → µ(ω, A) is a probability measure on (S, S). Theorem: regular conditional probabilities exist if (S, S) is nice.

  6. Martingales. Let F_n be an increasing sequence of σ-fields (called a filtration). A sequence X_n is adapted to F_n if X_n ∈ F_n for all n. If X_n is an adapted sequence (with E|X_n| < ∞) then it is called a martingale if E(X_{n+1}|F_n) = X_n for all n. It's a supermartingale (resp., submartingale) if the same thing holds with = replaced by ≤ (resp., ≥).
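
As a sanity check on the definition, the following sketch (illustrative, not from the slides) verifies by exact enumeration that simple random walk, the sum of i.i.d. fair ±1 steps started at 0, is a martingale: on each atom of F_n, i.e., each length-n prefix of steps, the conditional mean of S_{n+1} equals S_n.

```python
from itertools import product

n = 4
for prefix in product((-1, 1), repeat=n):   # one atom of F_n per prefix
    s_n = sum(prefix)
    # Average S_{n+1} over the two equally likely next steps.
    mean_next = sum(s_n + step for step in (-1, 1)) / 2
    assert mean_next == s_n                 # E(S_{n+1}|F_n) = S_n on this atom
```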

  7. Martingale observations. Claim: if X_n is a supermartingale then for n > m we have E(X_n|F_m) ≤ X_m. Proof idea: this follows by definition if n = m + 1; take n = m + k and use induction on k. A similar result holds for submartingales. Also, if X_n is a martingale and n > m then E(X_n|F_m) = X_m. Claim: if X_n is a martingale w.r.t. F_n and φ is convex with E|φ(X_n)| < ∞ then φ(X_n) is a submartingale. Proof idea: immediate from Jensen's inequality and the martingale definition. Example: take φ(x) = max{x, 0}.

  8. Predictable sequence. Call H_n predictable if each H_n is F_{n−1} measurable. Maybe H_n represents the number of shares of an asset an investor holds at the nth stage. Write (H·X)_n = Σ_{m=1}^{n} H_m(X_m − X_{m−1}). Observe: if X_n is a supermartingale and the H_n ≥ 0 are bounded, then (H·X)_n is a supermartingale. Example: take H_n = 1_{N≥n} for a stopping time N.
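
A minimal sketch of the discrete integral (H·X)_n with the strategy H_n = 1_{N≥n} ("hold one share until the stopping time N"), for which (H·X)_n = X_{min(N,n)} − X_0, the stopped walk. The random-walk setup and the stopping time (first hit of level 3, within a 50-step horizon) are illustrative choices.

```python
import random

def h_dot_x(steps, N):
    """Partial sums (H.X)_n for the strategy H_n = 1_{N >= n}."""
    out, total = [], 0
    for n, dx in enumerate(steps, start=1):
        if N >= n:                # hold one share while n <= N
            total += dx
        out.append(total)
    return out

random.seed(0)
steps = [random.choice((-1, 1)) for _ in range(50)]

# N = first n with X_n = 3 (infinity if the level is not hit in the horizon).
# Note {N >= n} depends only on the first n-1 steps, so H is predictable.
x, N = 0, float("inf")
for n, dx in enumerate(steps, start=1):
    x += dx
    if x == 3:
        N = n
        break

vals = h_dot_x(steps, N)
stop = len(steps) if N == float("inf") else min(N, len(steps))
assert vals[-1] == sum(steps[:stop])      # (H.X)_n = X_{min(N,n)} - X_0
```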

  9. Two big results. Optional stopping theorem: you can't make money in expectation by timing the sale of an asset whose price is a non-negative martingale. Proof: just a special case of the statement about (H·X). Martingale convergence: a non-negative martingale almost surely has a limit. Idea of proof: count upcrossings (times the martingale crosses a fixed interval) and devise a gambling strategy that makes lots of money if the number of these is not a.s. finite.
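
To see the upcrossing idea in action, here is an illustrative simulation (not from the slides): simple random walk started at 5 and absorbed at 0 is a non-negative martingale, and we count its upcrossings of the interval [1, 2]. Martingale convergence says this count is a.s. finite; all parameters are arbitrary choices.

```python
import random

def upcrossings(path, a, b):
    """Number of times the path moves from <= a to >= b."""
    count, below = 0, False
    for x in path:
        if x <= a:
            below = True
        elif x >= b and below:
            count, below = count + 1, False
    return count

random.seed(1)
x, path = 5, [5]
while x > 0 and len(path) < 10**6:        # absorbed at 0; cap for safety
    x += random.choice((-1, 1))
    path.append(x)
print(upcrossings(path, 1, 2))            # a.s. finite by martingale convergence
```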

  10. Problems. How many primary candidates ever get above twenty percent in expected probability of victory? (Asked by Aldous.) Compute the probability that the conditional probability reaches a before b.

  11. Wald. Wald's equation: let X_i be i.i.d. with E|X_i| < ∞, and write S_n = X_1 + … + X_n. If N is a stopping time with EN < ∞ then ES_N = EX_1 · EN. Wald's second equation: let X_i be i.i.d. with EX_i = 0 and EX_i² = σ² < ∞. If N is a stopping time with EN < ∞ then ES_N² = σ²EN.
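
A Monte Carlo sanity check of Wald's equation ES_N = EX_1 · EN, with illustrative choices: biased ±1 steps with P(+1) = 0.6 (so EX_1 = 0.2) and N the first exit time of (−5, 5).

```python
import random

random.seed(2)
trials, s_sum, n_sum = 10**4, 0, 0
for _ in range(trials):
    s, n = 0, 0
    while abs(s) < 5:                     # N = first exit time of (-5, 5)
        s += 1 if random.random() < 0.6 else -1
        n += 1
    s_sum += s
    n_sum += n

ex1 = 0.6 - 0.4                           # EX_1 = 0.2
print(s_sum / trials, ex1 * n_sum / trials)   # the two should be close
```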

  12. Wald applications to SRW. Let S_0 = a ∈ ℤ, and at each time step S_j independently changes by ±1 according to a fair coin toss. Fix A ∈ ℤ with 0 < a < A and let N = inf{k : S_k ∈ {0, A}}. What is ES_N? What is EN?
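
The standard answers, which the optional stopping and Wald machinery yields, are ES_N = a (so P(S_N = A) = a/A), and, from Wald's second equation applied to the increments, EN = E(S_N − a)² = a(A − a). A Monte Carlo check with illustrative values a = 3, A = 10:

```python
import random

random.seed(3)
a, A, trials = 3, 10, 10**4
s_sum = n_sum = 0
for _ in range(trials):
    s, n = a, 0
    while s not in (0, A):                # N = first hit of {0, A}
        s += random.choice((-1, 1))
        n += 1
    s_sum += s
    n_sum += n

print(s_sum / trials, a)                  # ES_N should be close to a = 3
print(n_sum / trials, a * (A - a))        # EN should be close to 21
```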

  13. Reflection principle. How many walks from (0, x) to (n, y), with x, y > 0, never cross the horizontal axis? Try counting the walks that do cross it by giving a bijection to walks from (0, −x) to (n, y).
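
Reading "never cross" as "stay strictly positive," the bijection (reflect the initial segment up to the first visit to 0) gives the count C(n, (n+y−x)/2) − C(n, (n+y+x)/2). A brute-force check of this formula, with illustrative n, x, y:

```python
from itertools import product
from math import comb

def count_positive(n, x, y):
    """Count +/-1 walks from height x to height y in n steps staying > 0."""
    count = 0
    for steps in product((-1, 1), repeat=n):
        pos, ok = x, True
        for s in steps:
            pos += s
            ok = ok and pos > 0
        if ok and pos == y:
            count += 1
    return count

n, x, y = 8, 2, 4
formula = comb(n, (n + y - x) // 2) - comb(n, (n + y + x) // 2)
assert count_positive(n, x, y) == formula
print(formula)   # 48 walks from (0, 2) to (8, 4) never touching 0
```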

  14. Ballot theorem. Suppose that in an election candidate A gets α votes and candidate B gets β < α votes. What's the probability that A is ahead throughout the counting? Answer: (α − β)/(α + β). This can be proved using the reflection principle.
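
An exact enumeration check of the (α − β)/(α + β) answer over all distinct counting orders, with small illustrative vote counts:

```python
from fractions import Fraction
from itertools import permutations

def lead_fraction(alpha, beta):
    """Fraction of counting orders in which A stays strictly ahead."""
    orders = set(permutations("A" * alpha + "B" * beta))
    good = 0
    for order in orders:
        tally, ahead = 0, True
        for v in order:
            tally += 1 if v == "A" else -1
            ahead = ahead and tally > 0
        good += ahead
    return Fraction(good, len(orders))

alpha, beta = 4, 2
assert lead_fraction(alpha, beta) == Fraction(alpha - beta, alpha + beta)
```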

  15. Arcsin theorem. Theorem for the last hitting time of zero. Theorem for the amount of positive time.
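
A simulation sketch of the second statement: for large n, the fraction of time the walk spends positive is approximately arcsine-distributed, with CDF (2/π) arcsin(√x). The discrete convention for "positive time" below (above 0, or just returned to 0 from above) and all parameters are illustrative choices.

```python
import math
import random

random.seed(4)
n, trials = 2000, 2000
fracs = []
for _ in range(trials):
    s, pos_time = 0, 0
    for _ in range(n):
        prev, s = s, s + random.choice((-1, 1))
        if s > 0 or (s == 0 and prev > 0):
            pos_time += 1
    fracs.append(pos_time / n)

# Compare the empirical CDF with (2/pi) * asin(sqrt(x)) at a few points.
for x in (0.1, 0.25, 0.5, 0.75, 0.9):
    emp = sum(f <= x for f in fracs) / trials
    print(x, round(emp, 3), round(2 / math.pi * math.asin(math.sqrt(x)), 3))
```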

  16. MIT OpenCourseWare, http://ocw.mit.edu. 18.175 Theory of Probability, Spring 2014. For information about citing these materials or our Terms of Use, visit http://ocw.mit.edu/terms.
