
Alternative Statistical Models. Surajit Ray, SAMSI, June 3 2005 (PowerPoint PPT presentation).

Outline: recap of ordinary least-squares (OLS) estimation, violations of the standard statistical assumptions, analysis of the residual plot, weighted least-squares (WLS), generalized least-squares (GLS), and concluding comments.


  1. Alternative Statistical Models. Surajit Ray, SAMSI, June 3 2005.

  2. Outline
     ■ Recap of (ordinary) least-squares: OLS estimation
     ■ Violations of statistical assumptions; analysis of the residual plot
     ■ Weighted least-squares (WLS)
     ■ Generalized least-squares (GLS)
     ■ Concluding comments

  3. Recap of (ordinary) least-squares
     Model for data (t_j, y_j), j = 1, ..., n:

         y_j = y(t_j; q) + ε_j

     ■ y(t_j; q) is the deterministic model, with parameters q.
     ■ ε_j are random errors.

     Goal of the inverse problem: estimate q.

     Standard statistical assumptions for the model:
     1. y(t_j; q) is the correct model ⇒ the mean of ε_j is 0 for all j.
     2. The variance of ε_j is constant for all j, equal to σ².
     3. The error terms ε_j, ε_k are independent for j ≠ k.
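To make the later slides concrete, here is a minimal Python sketch of this data model. The damped-oscillation form for y(t; q) with q = (C, K), the parameter values, and the noise level are illustrative assumptions of this running example, not taken from the slides.

```python
import numpy as np

def y_model(t, q):
    # Hypothetical deterministic model y(t; q) with q = (C, K):
    # exponentially damped oscillation (illustrative assumption only).
    C, K = q
    return np.exp(-C * t) * np.cos(K * t)

rng = np.random.default_rng(0)
n = 1000
t = np.linspace(0.0, 2.0, n)               # observation times t_1, ..., t_n
q_true = np.array([1.5, 2 * np.pi * 57])   # "true" q; K ~ 57 Hz in rad/s
sigma = 0.1                                # constant error std. dev. (assumption 2)
y_obs = y_model(t, q_true) + rng.normal(0.0, sigma, n)   # y_j = y(t_j; q) + eps_j
```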

  4. OLS estimation
     ■ Minimize

         J(q) = ∑_{i=1}^{n} |y_i − y(t_i; q)|²    (1)

       in q, to give q̂_ols.
     ■ Estimate σ² by

         σ̂²_ols = J(q̂_ols) / (n − p),

       where p = dim(q).
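A sketch of this step for the running example, using scipy.optimize.least_squares to minimize (1) and then forming the σ² estimate. The starting guess is an assumption of the sketch; oscillatory fits like this one are sensitive to initialization.

```python
from scipy.optimize import least_squares

def residuals(q, t, y_obs):
    # y_i - y(t_i; q); least_squares minimizes the sum of their squares, i.e. J(q).
    return y_obs - y_model(t, q)

p = 2                                        # p = dim(q)
fit = least_squares(residuals, x0=[1.2, 357.0], args=(t, y_obs))
q_ols = fit.x                                # q_hat_ols
sigma2_ols = np.sum(fit.fun**2) / (n - p)    # sigma^2_hat = J(q_hat_ols) / (n - p)
```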

  5. OLS Estimation
     ■ q̂_ols converges to q as n increases.
     ■ q̂_ols makes efficient use of the data, i.e., has small standard error.
     ■ Approximate s.e.(q̂_ols,k) = square root of the (k, k) element of

         Cov(q̂_ols) = σ̂²_ols (XᵀX)⁻¹,

       where X_{r,c} = ∂y(t_r; q)/∂q_c, evaluated at q̂_ols.

     For example, if q = (C, K)ᵀ, then X is the n × 2 matrix with rows

         [ ∂y(t_j; q)/∂C   ∂y(t_j; q)/∂K ],   j = 1, ..., n.
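This covariance formula can be evaluated numerically. The sketch below approximates the sensitivity matrix X by central finite differences rather than the analytic partial derivatives written on the slide.

```python
def sensitivity_matrix(q, t, h=1e-6):
    # X[r, c] = d y(t_r; q) / d q_c, approximated by central differences.
    X = np.empty((len(t), len(q)))
    for c in range(len(q)):
        dq = np.zeros(len(q))
        dq[c] = h
        X[:, c] = (y_model(t, q + dq) - y_model(t, q - dq)) / (2 * h)
    return X

X = sensitivity_matrix(q_ols, t)
cov_q = sigma2_ols * np.linalg.inv(X.T @ X)   # Cov(q_hat_ols)
se_q = np.sqrt(np.diag(cov_q))                # approximate standard errors
```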

  6. Violations of statistical assumptions
     Compute the residuals, r_j = y_j − y(t_j; q̂_ols), and plot them against t_j.

     [Figure 1: Residual plot for the (damped) spring-mass-dashpot model,
     fitted using OLS. Panel title: "Residual vs. Time (Frequency 57 Hz −
     Damped)"; x-axis: time t_j (0 to 2); y-axis: residual r_j (−0.8 to 0.6).]
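A sketch of how a plot like Figure 1 would be produced for the running example (matplotlib assumed available):

```python
import matplotlib.pyplot as plt

r = y_obs - y_model(t, q_ols)   # residuals r_j
plt.plot(t, r, ".")
plt.xlabel("Time, t_j")
plt.ylabel("Residual, r_j")
plt.title("Residual vs. Time")
plt.show()
```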

  7. Analysis of Residual plot
     1. Do we have the correct deterministic model?
     2. Is the variance of ε_j constant across the time range? No!
     3. Are the errors independent? No!

     Implications: q̂_ols is no longer a good estimator for q.

     Assuming the answer to #1 is "Yes", how can we change our statistical
     model assumptions to better model reality?
     ■ Transform the data (e.g., log transform)?
     ■ Explicitly model nonconstant variance and correlations between measurements.
     ■ Incorporate this into the estimation method.

  8. Weighted least-squares
     (a) Deal with nonconstant variance. Assume

         Var(ε_j) = σ² / w_j,   j = 1, ..., n,

     for known w_j. A large w_j ⇔ observation (t_j, y_j) is of high quality.

     Instead of OLS, minimize

         J̃(q) = ∑_{i=1}^{n} w_i |y_i − y(t_i; q)|²

     in q.
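With known weights, J̃(q) can be minimized by scaling each residual by √w_i, since ∑ w_i r_i² = ∑ (√w_i r_i)². The uniform placeholder weights below are an assumption of the sketch.

```python
def weighted_residuals(q, t, y_obs, w):
    # sqrt(w_i) * (y_i - y(t_i; q)); the summed squares equal J~(q).
    return np.sqrt(w) * (y_obs - y_model(t, q))

w = np.ones(n)   # placeholder; real weights reflect data quality
fit_wls = least_squares(weighted_residuals, x0=q_ols, args=(t, y_obs, w))
q_wls = fit_wls.x
```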

  9. Weighted Least Squares
     In practice, we don't know the w_j:
     ■ Estimate Var(ε_j) from repeated measurements at time t_j:
       w_j = σ̂² / σ̂²_j.
     ■ If the error is larger for larger |y_j|, let w_j⁻¹ = y_j².
     ■ Alternatively, assume that w_j⁻¹ = y²(t_j; q).
     ■ Assume some other model for w_j, e.g.,
       w_j⁻¹ = y^{2θ}(t_j; q),
       where θ is to be estimated from the data.
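As one example from this list, here is a sketch of the model w_j⁻¹ = y^{2θ}(t_j; q), evaluated at the current estimate. The floor on |y| is a practical guard against near-zero model predictions, not something on the slide.

```python
theta = 1.0                                           # illustrative exponent
y_fit = y_model(t, q_ols)
w = np.maximum(np.abs(y_fit), 1e-3) ** (-2 * theta)   # w_j = y^{-2*theta}(t_j; q)
```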

  10. Weighted Least Squares
      Deal with correlated observations and nonconstant variance. Let
      ε = (ε_1, ε_2, ..., ε_n)ᵀ and assume

          Cov(ε) = σ² V,

      for a known matrix V. Let a = (a_1, a_2, ..., a_n)ᵀ denote the data
      vector, y(q) = (y(t_1; q), ..., y(t_n; q))ᵀ, and W = V⁻¹.

  11. Weighted Least Squares
      The weighted least-squares (WLS) estimator minimizes

          J_wls(q) = {a − y(q)}ᵀ W {a − y(q)}    (2)

      in q, to give q̂_wls. If the above covariance model holds (together
      with assumption 1), then q̂_wls has good properties.
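A sketch of the matrix-form objective (2), minimized with a generic optimizer. The identity V here is only a placeholder for a known covariance model, and in the running example the data vector a is y_obs.

```python
from scipy.optimize import minimize

def J_wls(q, a, t, W):
    # {a - y(q)}^T W {a - y(q)}, equation (2).
    r = a - y_model(t, q)
    return r @ W @ r

V = np.eye(n)            # placeholder for the known covariance model
W = np.linalg.inv(V)
res = minimize(J_wls, x0=q_ols, args=(y_obs, t, W), method="Nelder-Mead")
q_wls = res.x            # q_hat_wls
```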

  12. Generalized least-squares (GLS)
      Estimate V from the data too! Model Cov(ε) = σ² V in two stages, based
      on additional parameters θ, α to be estimated from the data.

      (a) Model Var(ε_j). For example,

          Var(ε_j) = σ² y^{2θ}(t_j; q),

      where θ is a scalar. Define the diagonal matrices

          G(q, θ) = diag{ y^{2θ}(t_1; q), y^{2θ}(t_2; q), ..., y^{2θ}(t_n; q) }

      and

          {G(q, θ)}^{1/2} = diag{ y^{θ}(t_1; q), y^{θ}(t_2; q), ..., y^{θ}(t_n; q) }.
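A sketch of {G(q, θ)}^{1/2} for the running example. The absolute value and floor are practical guards for real-valued powers of a model that can cross zero, not part of the slide's definition.

```python
def G_half(q, theta, t):
    # {G(q, theta)}^{1/2} = diag{ y^theta(t_j; q) }; abs/floor added as a guard.
    return np.diag(np.maximum(np.abs(y_model(t, q)), 1e-3) ** theta)

G = G_half(q_ols, 1.0, t) @ G_half(q_ols, 1.0, t)   # G = G^{1/2} G^{1/2}
```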

  13. GLS
      (b) Model Corr(ε_j, ε_k) for j ≠ k. For example,

          Corr(ε_j, ε_k) = α^{|j−k|},

      where α is a scalar and |α| < 1. Organize into a matrix:

          Corr(ε) = Γ(α) =
              | 1         α         α²        ···  α^{n−1} |
              | α         1         α         ···  α^{n−2} |
              | ⋮         ⋮         ⋮         ⋱    ⋮       |
              | α^{n−1}   α^{n−2}   α^{n−3}   ···  1       |

      Put the pieces together and write

          Cov(ε) = σ² V(q, θ, α) = σ² {G(q, θ)}^{1/2} Γ(α) {G(q, θ)}^{1/2}.
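Putting (a) and (b) together, a sketch of V(q, θ, α) built from Γ(α) and the G_half helper above:

```python
def covariance_V(q, theta, alpha, t):
    idx = np.arange(len(t))
    Gamma = alpha ** np.abs(idx[:, None] - idx[None, :])   # Gamma(alpha)[j,k] = alpha^|j-k|
    Gh = G_half(q, theta, t)
    return Gh @ Gamma @ Gh                                 # {G}^{1/2} Gamma {G}^{1/2}
```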

  14. GLS
      A typical GLS algorithm follows three steps:

      (i)   Estimate q with OLS. Set q̂_gls = q̂_ols.
      (ii)  Estimate (somehow) (θ̂, α̂). Plug these into the model for V to
            form the estimated weight matrix,

                Ŵ = {V(q̂_gls, θ̂, α̂)}⁻¹.

      (iii) Minimize the (approximate) WLS objective function (2) in q, namely

                J̃_wls(q) = {a − y(q)}ᵀ Ŵ {a − y(q)}.

            Set the minimizing value equal to q̂_gls.

      Return to step (ii); iterate until "convergence". A code sketch of this
      loop follows below.
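A sketch of the three-step iteration for the running example. Step (ii)'s "somehow" is problem-specific, so (θ̂, α̂) are simply held fixed here for illustration.

```python
theta_hat, alpha_hat = 1.0, 0.5   # placeholder estimates for step (ii)

q_gls = q_ols.copy()              # (i) initialize with the OLS estimate
for _ in range(10):
    V_hat = covariance_V(q_gls, theta_hat, alpha_hat, t)   # (ii) estimated V
    W_hat = np.linalg.inv(V_hat)                           # weight matrix W_hat
    res = minimize(J_wls, x0=q_gls, args=(y_obs, t, W_hat),
                   method="Nelder-Mead")                   # (iii) minimize (2)
    if np.linalg.norm(res.x - q_gls) < 1e-8:               # crude convergence test
        q_gls = res.x
        break
    q_gls = res.x                                          # return to step (ii)
```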
