  1. Hypothesis Testing
     Saravanan Vijayakumaran (sarva@ee.iitb.ac.in)
     Department of Electrical Engineering, Indian Institute of Technology Bombay
     March 13, 2013

  2. What is a Hypothesis?
     One situation among a set of possible situations.
     Example (Radar): EM waves are transmitted and the reflections observed.
     • Null hypothesis: plane absent
     • Alternative hypothesis: plane present
     For a given set of observations, either hypothesis may be true.

  3. What is Hypothesis Testing?
     • A statistical framework for deciding which hypothesis is true
     • Under each hypothesis the observations are assumed to have a known distribution
     • Consider the case of two hypotheses (binary hypothesis testing):
         H_0 : Y ~ P_0
         H_1 : Y ~ P_1
       where Y is the random observation vector belonging to the observation set Γ ⊆ R^n for n ∈ N
     • The hypotheses are assumed to occur with given prior probabilities
         Pr(H_0 is true) = π_0,  Pr(H_1 is true) = π_1
       where π_0 + π_1 = 1
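The probabilistic model on this slide can be simulated directly: draw the true hypothesis according to the priors, then draw the observation from that hypothesis's distribution. A minimal sketch, assuming Gaussian observation distributions as in the location-testing example that follows (the function name and parameters are illustrative, not from the slides):

```python
import random

def sample_observation(pi0, mu0, mu1, sigma):
    # Draw the true hypothesis with priors (pi0, 1 - pi0),
    # then draw Y from the corresponding Gaussian N(mu_h, sigma^2).
    h = 0 if random.random() < pi0 else 1
    y = random.gauss(mu0 if h == 0 else mu1, sigma)
    return h, y
```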

  4. Location Testing with Gaussian Error
     • Let the observation set Γ = R and µ > 0:
         H_0 : Y ~ N(−µ, σ²)
         H_1 : Y ~ N(+µ, σ²)
       [Figure: densities p_0(y) and p_1(y), centered at −µ and µ]
     • Any point in Γ can be generated under both H_0 and H_1
     • What is a good decision rule for this hypothesis testing problem which takes the prior probabilities into account?

  5. What is a Decision Rule?
     • A decision rule for binary hypothesis testing is a partition of Γ into Γ_0 and Γ_1 such that
         δ(y) = 0 if y ∈ Γ_0
         δ(y) = 1 if y ∈ Γ_1
       We decide H_i is true when δ(y) = i, for i ∈ {0, 1}
     • For the location testing with Gaussian error problem, one possible decision rule is
         Γ_0 = (−∞, 0],  Γ_1 = (0, ∞)
       and another possible decision rule is
         Γ_0 = (−∞, −100) ∪ (−50, 0),  Γ_1 = [−100, −50] ∪ [0, ∞)
     • Given that partitions of the observation set define decision rules, what is the optimal partition?
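A decision rule is just a function from an observation to a hypothesis index. The two example partitions above can be sketched as follows (function names are my own):

```python
def delta_simple(y):
    # Gamma_0 = (-inf, 0], Gamma_1 = (0, inf)
    return 0 if y <= 0 else 1

def delta_odd(y):
    # Gamma_0 = (-inf, -100) U (-50, 0)
    # Gamma_1 = [-100, -50] U [0, inf)
    if y < -100 or (-50 < y < 0):
        return 0
    return 1
```

Both are valid decision rules because each partitions the real line; the slides' point is that validity says nothing about quality, which is what the optimality discussion next addresses.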

  6. Which is the Optimal Decision Rule?
     • Minimizing the probability of decision error gives the optimal decision rule
     • For the binary hypothesis testing problem of H_0 versus H_1, the conditional decision error probability given that H_i is true is
         P_e|i = Pr[deciding H_{1−i} is true | H_i is true]
               = Pr[Y ∈ Γ_{1−i} | H_i]
               = 1 − Pr[Y ∈ Γ_i | H_i]
               = 1 − P_c|i
     • Probability of decision error: P_e = π_0 P_e|0 + π_1 P_e|1
     • Probability of correct decision: P_c = π_0 P_c|0 + π_1 P_c|1 = 1 − P_e

  7. Which is the Optimal Decision Rule?
     • Maximizing the probability of correct decision will minimize the probability of decision error
     • The probability of correct decision is
         P_c = π_0 P_c|0 + π_1 P_c|1
             = π_0 ∫_{Γ_0} p_0(y) dy + π_1 ∫_{Γ_1} p_1(y) dy
     • If a point y in Γ belongs to Γ_i, its contribution to P_c is proportional to π_i p_i(y)
     • To maximize P_c, we choose the partition {Γ_0, Γ_1} as
         Γ_0 = {y ∈ Γ | π_0 p_0(y) ≥ π_1 p_1(y)}
         Γ_1 = {y ∈ Γ | π_1 p_1(y) > π_0 p_0(y)}
     • The points y for which π_0 p_0(y) = π_1 p_1(y) can be placed in either Γ_0 or Γ_1 (the optimal decision rule is not unique)
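The optimal partition above says: decide H_0 exactly when π_0 p_0(y) ≥ π_1 p_1(y). A minimal sketch for the Gaussian case of this lecture (function names and the specific parameterization are illustrative):

```python
import math

def gaussian_pdf(y, mu, sigma):
    # Density of N(mu, sigma^2) at y
    return math.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def delta_opt(y, pi0, pi1, mu0, mu1, sigma):
    # Decide H_0 when pi_0 p_0(y) >= pi_1 p_1(y), else H_1
    if pi0 * gaussian_pdf(y, mu0, sigma) >= pi1 * gaussian_pdf(y, mu1, sigma):
        return 0
    return 1
```

For equal priors and mu0 < mu1 this reduces to thresholding at the midpoint (mu0 + mu1)/2, matching the next slides.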

  8. Location Testing with Gaussian Error
     • Let µ_1 > µ_0 and π_0 = π_1 = 1/2:
         H_0 : Y = µ_0 + Z
         H_1 : Y = µ_1 + Z
       where Z ~ N(0, σ²)
       [Figure: densities p_0(y) and p_1(y), centered at µ_0 and µ_1]
         p_0(y) = (1/√(2πσ²)) e^{−(y−µ_0)²/(2σ²)}
         p_1(y) = (1/√(2πσ²)) e^{−(y−µ_1)²/(2σ²)}

  9. Location Testing with Gaussian Error
     • The optimal decision rule is given by the partition {Γ_0, Γ_1}:
         Γ_0 = {y ∈ Γ | π_0 p_0(y) ≥ π_1 p_1(y)}
         Γ_1 = {y ∈ Γ | π_1 p_1(y) > π_0 p_0(y)}
     • For π_0 = π_1 = 1/2:
         Γ_0 = {y ∈ Γ | y ≤ (µ_1 + µ_0)/2}
         Γ_1 = {y ∈ Γ | y > (µ_1 + µ_0)/2}

  10. Location Testing with Gaussian Error
     [Figure: densities under H_0 and H_1 with shaded error regions P_e|0 and P_e|1 on either side of the threshold (µ_0 + µ_1)/2]
         P_e|0 = Pr[Y > (µ_0 + µ_1)/2 | H_0] = Q((µ_1 − µ_0)/(2σ))
         P_e|1 = Pr[Y ≤ (µ_0 + µ_1)/2 | H_1] = Φ((µ_0 − µ_1)/(2σ)) = Q((µ_1 − µ_0)/(2σ))
         P_e = π_0 P_e|0 + π_1 P_e|1 = Q((µ_1 − µ_0)/(2σ))
     This P_e is for π_0 = π_1 = 1/2
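The error probability on this slide can be evaluated numerically. Python's standard library has no Q-function, but Q(x) = 1 − Φ(x) = (1/2) erfc(x/√2), and math.erfc is available; a small sketch:

```python
import math

def Q(x):
    # Gaussian tail probability: Q(x) = 1 - Phi(x) = 0.5 * erfc(x / sqrt(2))
    return 0.5 * math.erfc(x / math.sqrt(2))

def error_prob_equal_priors(mu0, mu1, sigma):
    # P_e = Q((mu1 - mu0) / (2 sigma)) for pi_0 = pi_1 = 1/2
    return Q((mu1 - mu0) / (2 * sigma))
```

As expected, P_e decreases as the means separate or the noise shrinks, and approaches 1/2 as the two hypotheses become indistinguishable.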

  11. Location Testing with Gaussian Error
     • Suppose π_0 ≠ π_1
     • The optimal decision rule is still given by the partition {Γ_0, Γ_1}:
         Γ_0 = {y ∈ Γ | π_0 p_0(y) ≥ π_1 p_1(y)}
         Γ_1 = {y ∈ Γ | π_1 p_1(y) > π_0 p_0(y)}
     • The partitions specialized to this problem are
         Γ_0 = {y ∈ Γ | y ≤ (µ_1 + µ_0)/2 + (σ²/(µ_1 − µ_0)) log(π_0/π_1)}
         Γ_1 = {y ∈ Γ | y > (µ_1 + µ_0)/2 + (σ²/(µ_1 − µ_0)) log(π_0/π_1)}

  12. Location Testing with Gaussian Error
     Suppose π_0 = 0.6 and π_1 = 0.4:
         τ = (µ_1 + µ_0)/2 + (σ²/(µ_1 − µ_0)) log(π_0/π_1)
           = (µ_1 + µ_0)/2 + 0.4054 σ²/(µ_1 − µ_0)
     [Figure: error regions P_e|0 and P_e|1 with the threshold τ between µ_0 and µ_1, shifted toward µ_1]

  13. Location Testing with Gaussian Error
     Suppose π_0 = 0.4 and π_1 = 0.6:
         τ = (µ_1 + µ_0)/2 + (σ²/(µ_1 − µ_0)) log(π_0/π_1)
           = (µ_1 + µ_0)/2 − 0.4054 σ²/(µ_1 − µ_0)
     [Figure: error regions P_e|0 and P_e|1 with the threshold τ between µ_0 and µ_1, shifted toward µ_0]
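The threshold formula shared by the last three slides is easy to check numerically: for equal priors it is the midpoint, and it shifts toward the less likely hypothesis's mean as the priors tilt. A small sketch (the function name is my own):

```python
import math

def threshold(pi0, pi1, mu0, mu1, sigma):
    # tau = (mu1 + mu0)/2 + (sigma^2 / (mu1 - mu0)) * log(pi0 / pi1)
    return (mu1 + mu0) / 2 + (sigma ** 2 / (mu1 - mu0)) * math.log(pi0 / pi1)
```

With mu0 = 0, mu1 = 2, sigma = 1: priors (0.6, 0.4) push tau above the midpoint 1 (the decide-H_0 region grows), and priors (0.4, 0.6) push it below, matching the two figures.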

  14. M-ary Hypothesis Testing
     • M hypotheses with prior probabilities π_i, i = 1, ..., M:
         H_1 : Y ~ P_1
         H_2 : Y ~ P_2
         ⋮
         H_M : Y ~ P_M
     • A decision rule for M-ary hypothesis testing is a partition of Γ into M disjoint regions {Γ_i | i = 1, ..., M} such that
         δ(y) = i if y ∈ Γ_i
       We decide H_i is true when δ(y) = i, for i ∈ {1, ..., M}
     • The minimum probability of error rule is
         δ_MPE(y) = argmax_{1 ≤ i ≤ M} π_i p_i(y)
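The MPE rule generalizes the binary comparison to an argmax over all M weighted likelihoods. A minimal sketch for M Gaussian hypotheses with a common variance (0-based indices and the parameterization are my own choices, not from the slides):

```python
import math

def gaussian_pdf(y, mu, sigma):
    # Density of N(mu, sigma^2) at y
    return math.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def delta_mpe(y, priors, mus, sigma):
    # argmax over i of pi_i * p_i(y); returns a 0-based hypothesis index
    scores = [pi * gaussian_pdf(y, mu, sigma) for pi, mu in zip(priors, mus)]
    return max(range(len(scores)), key=scores.__getitem__)
```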

  15. Maximum A Posteriori Decision Rule
     • The a posteriori probability of H_i being true given observation y is
         P[H_i is true | y] = π_i p_i(y) / p(y)
     • The MAP decision rule is given by
         δ_MAP(y) = argmax_{1 ≤ i ≤ M} P[H_i is true | y] = δ_MPE(y)
       MAP decision rule = MPE decision rule

  16. Maximum Likelihood Decision Rule
     • The ML decision rule is given by
         δ_ML(y) = argmax_{1 ≤ i ≤ M} p_i(y)
     • If the M hypotheses are equally likely, π_i = 1/M
     • The MPE decision rule is then given by
         δ_MPE(y) = argmax_{1 ≤ i ≤ M} π_i p_i(y) = δ_ML(y)
       For equal priors, ML decision rule = MPE decision rule
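The equivalence on this slide follows because scaling every score by the same constant 1/M cannot change the argmax. A small sketch that checks it numerically for a Gaussian example (names and parameters are illustrative):

```python
import math

def gaussian_pdf(y, mu, sigma):
    return math.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def delta_ml(y, mus, sigma):
    # argmax over i of p_i(y): ignores priors entirely
    scores = [gaussian_pdf(y, mu, sigma) for mu in mus]
    return max(range(len(scores)), key=scores.__getitem__)

def delta_mpe(y, priors, mus, sigma):
    # argmax over i of pi_i * p_i(y)
    scores = [pi * gaussian_pdf(y, mu, sigma) for pi, mu in zip(priors, mus)]
    return max(range(len(scores)), key=scores.__getitem__)
```

With equal priors the two rules agree at every observation; with unequal priors they can disagree near the decision boundaries.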

  17. Questions?
