Restricted Likelihood Ratio Tests for Functional Effects in the Functional Linear Model




1. Restricted Likelihood Ratio Tests for Functional Effects in the Functional Linear Model, by Swihart, Goldsmith, and Crainiceanu (2014). September 1, 2017.

2. Modeling Framework

Consider data $\{Y_i, X_i, W_i\}$ modeled with a Functional Linear Model (FLM):

$E[Y_i] = \alpha + X_i \beta + \int W_i(s)\, \gamma(s)\, ds$   (1)

- $Y_i$: continuous, scalar response for the $i$-th subject
- $X_i$: non-functional covariates with coefficients $\beta$
- $W_i(s)$: functional predictor for $s \in [0, 1]$
- $\alpha$: mean parameter
- $\gamma(s)$: coefficient function of interest

Objective: determine whether a functional predictor should be included in the FLM.
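
To make model (1) concrete, here is a minimal simulation sketch in Python/NumPy; the curve shapes, the true coefficient function, the noise levels, and the sample sizes are arbitrary illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_grid = 200, 101                  # subjects, grid points on [0, 1]
s = np.linspace(0.0, 1.0, n_grid)
delta = s[1] - s[0]                   # grid spacing

# Functional predictors W_i(s): smooth random curves (purely illustrative)
W = np.array([np.sin(2 * np.pi * (s + rng.uniform())) + rng.normal(0, 0.2, n_grid)
              for _ in range(n)])

X = rng.normal(size=n)                # one scalar covariate
alpha, beta = 1.0, 0.5
gamma_true = np.cos(2 * np.pi * s)    # an assumed true coefficient function gamma(s)

# Riemann-sum approximation of the integral term in model (1)
functional_term = (W * gamma_true).sum(axis=1) * delta
Y = alpha + X * beta + functional_term + rng.normal(0, 0.5, n)
```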

3. Hypothesis Test 1: Test of Functional Form

$H_0: E(Y_i) = \alpha + X_i \beta + \bar{W}_i \beta_W$   (2)
$H_A: E(Y_i) = \alpha + X_i \beta + \int W_i(s)\, \gamma(s)\, ds$

Does $\gamma(s)$ need to be written as a function, or is the mean sufficient? Equivalently, $H_0: \gamma(s) = c$ for some $c \in \mathbb{R}$ vs. $H_A: \gamma(s) \neq c$ for all $c \in \mathbb{R}$.

4. Hypothesis Test 2: Test of Inclusion

$H_0: E(Y_i) = \alpha + X_i \beta + \int W_{i1}(s)\, \gamma_1(s)\, ds$   (3)
$H_A: E(Y_i) = \alpha + X_i \beta + \int W_{i1}(s)\, \gamma_1(s)\, ds + \int W_{i2}(s)\, \gamma_2(s)\, ds$

Is the second functional predictor necessary? Equivalently, $H_0: \gamma_2(s) = 0$ vs. $H_A: \gamma_2(s) \neq 0$.

5. Outline of Testing Procedure

The authors propose two testing procedures. The basic procedure is:

Step 1. Decompose the functional predictor $W_i(s)$ using functional principal components (FPC).
Step 2. Express the coefficient function $\gamma(s)$ using basis functions.
Step 3. Rewrite the FLM as a linear model or linear mixed model.
Step 4. Test $H_0$ under the linear (mixed) model using a (restricted) likelihood ratio test.

6. Method 1: Functional Principal Components Regression (FPCR)

Step 1. Use FPC to decompose the functional predictor $W_i(s)$:

$\mathrm{cov}[W_i(s), W_i(s')] = \sum_{k=1}^{\infty} \lambda_k \psi_k(s) \psi_k(s')$

By a truncated Karhunen-Loève approximation,

$W_i(s) = \mu(s) + \sum_{k=1}^{K_w} c_{ik} \psi_k(s)$

- $\lambda_k$: $k$-th eigenvalue (non-increasing in $k$)
- $\psi_k$: $k$-th eigenfunction
- $\mu(s)$: mean function
- $c_{ik} = \int \{W_i(s) - \mu(s)\}\, \psi_k(s)\, ds$: $k$-th score for the $i$-th subject
- $K_w$: truncation parameter
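
For curves observed on a regular grid, Step 1 can be carried out by an eigendecomposition of the sample covariance matrix. The sketch below continues from the simulation snippet above (reusing `W`, `s`, `delta`); the truncation `K_w = 4` is an illustrative choice, not a recommendation.

```python
import numpy as np

# Continues from the simulation sketch: W is (n, n_grid), observed on grid s with spacing delta.
mu = W.mean(axis=0)                       # pointwise estimate of the mean function mu(s)
W_centered = W - mu

# Sample covariance surface cov[W(s), W(s')] evaluated on the grid
cov_surface = W_centered.T @ W_centered / (W.shape[0] - 1)

# Eigendecomposition; rescale so the eigenfunctions have unit L2 norm on [0, 1]
eigvals, eigvecs = np.linalg.eigh(cov_surface)
order = np.argsort(eigvals)[::-1]         # sort eigenvalues in non-increasing order
lam = eigvals[order] * delta              # lambda_k (operator eigenvalues)
psi = eigvecs[:, order] / np.sqrt(delta)  # psi_k(s) in the columns

K_w = 4                                   # truncation parameter (illustrative)
scores = W_centered @ psi[:, :K_w] * delta    # c_ik = int {W_i(s) - mu(s)} psi_k(s) ds
```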

7. Method 1: Functional Principal Components Regression (FPCR)

Deviating from standard FPCR, subtract the subject-specific predictor means:

$W_i(s) - \bar{W}_i = \sum_{k=1}^{K_g} c^*_{ik} \psi_k(s)$

- $\bar{W}_i$: predictor mean for the $i$-th subject, defined as $\bar{W}_i = \int W_i(s)\, ds$
- $c^*_{ik} = \int \{W_i(s) - \bar{W}_i\}\, \psi_k(s)\, ds$

This is important for re-formulating the linear model to test $H_0$.
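
Continuing the same numerical sketch, the subject-demeaned scores $c^*_{ik}$ are obtained by replacing $\mu(s)$ with each subject's own mean $\bar{W}_i$:

```python
import numpy as np

# Continues from the FPCA sketch: W, s, delta, psi, K_w as defined there.
W_bar = W.sum(axis=1) * delta                     # subject-specific means, int W_i(s) ds
W_demeaned = W - W_bar[:, None]

# Scores of the subject-demeaned curves with respect to the same eigenfunctions
scores_star = W_demeaned @ psi[:, :K_w] * delta   # c*_ik
```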

8. Method 1: Functional Principal Components Regression (FPCR)

Step 2. Express $\gamma(s)$ using a basis set $\phi(s)$.

Define the PC basis $\phi(s) = \{\phi_1(s), \ldots, \phi_{K_g}(s)\}$, and let

$\gamma(s) = \phi(s)\, \gamma = \sum_{k=1}^{K_g} \gamma_k \phi_k(s)$

where $\gamma = \{\gamma_1, \ldots, \gamma_{K_g}\}$ are fixed basis coefficients.

9. Method 1: Functional Principal Components Regression (FPCR)

Step 3. Rewrite the FLM as a linear model.

$E[Y_i] = \alpha + X_i \beta + \int W_i(s)\, \gamma(s)\, ds$
$\quad\;\; = \alpha + X_i \beta + \bar{W}_i \gamma_0 + \int c^{*T}_i \psi^T(s)\, \phi(s)\, \gamma\, ds$
$\quad\;\; = \alpha + X_i \beta + \bar{W}_i \gamma_0 + c^{*T}_i \gamma$

Let $\beta^T = [\alpha,\ \beta,\ \gamma_0,\ \gamma]$ and $X_{[i,]} = [1,\ X_i,\ \bar{W}_i,\ c^{*T}_i]$. Then

$E(Y) = X\beta$
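
Continuing the sketch, the resulting FPCR model is an ordinary linear model, so it can be fit by least squares on a design matrix with rows $[1,\ X_i,\ \bar{W}_i,\ c^{*T}_i]$:

```python
import numpy as np

# Continues from the earlier sketches: X, Y, W_bar, scores_star as defined there.
design = np.column_stack([np.ones(X.shape[0]), X, W_bar, scores_star])

# Ordinary least-squares fit of E(Y) = X beta
coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
```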

10. Method 1: Functional Principal Components Regression (FPCR)

Step 4. Test under the linear model $E[Y_i] = \alpha + X_i \beta + \bar{W}_i \gamma_0 + c^{*T}_i \gamma$.

1. Test of functional form
$H_0: E(Y_i) = \alpha + X_i \beta + \bar{W}_i \beta_W \;\Leftrightarrow\; \gamma = 0$
$H_A: E(Y_i) = \alpha + X_i \beta + \int W_i(s)\, \gamma(s)\, ds \;\Leftrightarrow\; \gamma \neq 0$

2. Test of inclusion
$H_0: E(Y_i) = \alpha + X_i \beta + \int W_{i1}(s)\, \gamma_1(s)\, ds \;\Leftrightarrow\; \gamma_0 = 0 \text{ and } \gamma = 0$
$H_A: E(Y_i) = \alpha + X_i \beta + \int W_{i1}(s)\, \gamma_1(s)\, ds + \int W_{i2}(s)\, \gamma_2(s)\, ds \;\Leftrightarrow\; \gamma_0 \neq 0 \text{ or } \gamma \neq 0$
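
Because both hypotheses correspond to nested Gaussian linear models, the test of functional form reduces to a standard likelihood-ratio test of fixed effects. The sketch below, continuing from the design matrix built above, compares the full and reduced ordinary-least-squares fits and uses the usual large-sample chi-square reference distribution.

```python
import numpy as np
from scipy import stats

def gaussian_loglik(y, X):
    """Maximised Gaussian log-likelihood of a linear model fit by OLS (sigma^2 profiled out)."""
    n = y.shape[0]
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    sigma2 = resid @ resid / n
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)

# Test of functional form: the reduced model drops the score columns (gamma = 0)
design_reduced = np.column_stack([np.ones(X.shape[0]), X, W_bar])

lrt = 2 * (gaussian_loglik(Y, design) - gaussian_loglik(Y, design_reduced))
df = design.shape[1] - design_reduced.shape[1]    # number of constrained coefficients
p_value = stats.chi2.sf(lrt, df)
```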

11. Method 2: Penalized Functional Regression (PFR)

Step 1. Use FPC to decompose the functional predictor $W_i(s)$:

$\mathrm{cov}[W_i(s), W_i(s')] = \sum_{k=1}^{\infty} \lambda_k \psi_k(s) \psi_k(s')$

By a truncated Karhunen-Loève approximation,

$W_i(s) = \mu(s) + \sum_{k=1}^{K_w} c_{ik} \psi_k(s)$

- $\lambda_k$: $k$-th eigenvalue (non-increasing in $k$)
- $\psi_k$: $k$-th eigenfunction
- $\mu(s)$: smooth mean function
- $c_{ik} = \int \{W_i(s) - \mu(s)\}\, \psi_k(s)\, ds$: $k$-th score for the $i$-th subject
- $K_w$: truncation parameter

12. Method 2: Penalized Functional Regression (PFR)

Step 2. Express $\gamma(s)$ using a basis $\phi(s)$.

Define the B-spline basis $\phi(s) = \{\phi_1(s) = 1, \phi_2(s), \ldots, \phi_{K_g}(s)\}$, and let

$\gamma(s) = \phi(s)\, g = \gamma_0 + \sum_{k=1}^{K_g} g_k \phi_k(s)$

where $g = \{\gamma_0, g_1, \ldots, g_{K_g}\}$ are basis coefficients, $\gamma_0$ is fixed, and the $g_k$ are random (in the mixed-model sense). Use a modified first-order random walk prior for the $g_k$, with $g_1 \sim N(0, \sigma^2_g)$ and $g_l \sim N(g_{l-1}, \sigma^2_g)$.
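
As a small illustration of what this prior implies, the random-walk coefficients can be written as cumulative sums of independent normals, $g = \sigma_g L z$ with $L$ lower triangular of ones, so that $\mathrm{Cov}(g) = \sigma^2_g L L^T$; this product plays the role of the penalty matrix $D$ that appears two slides later. The size `K` and `sigma_g` below are illustrative choices.

```python
import numpy as np

# First-order random walk prior: g_1 ~ N(0, sigma_g^2), g_l ~ N(g_{l-1}, sigma_g^2).
# Writing g = sigma_g * L z with z ~ N(0, I) and L lower triangular of ones
# gives Cov(g) = sigma_g^2 * D with D = L L^T.
K = 10                                   # number of random-walk coefficients (illustrative)
L = np.tril(np.ones((K, K)))
D = L @ L.T                              # D[i, j] = min(i, j) + 1 (0-indexed)

rng = np.random.default_rng(1)
sigma_g = 0.3
g = sigma_g * L @ rng.standard_normal(K)   # one draw of the random coefficients
```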

13. Method 2: Penalized Functional Regression (PFR)

Step 3. Rewrite the FLM as a linear mixed model.

$E[Y_i] = \alpha + X_i \beta + \int W_i(s)\, \gamma(s)\, ds$
$\quad\;\; = \alpha + X_i \beta + a + \int c^{*T}_i \psi^T(s)\, \phi(s)\, g\, ds$
$\quad\;\; = \alpha + X_i \beta + a + c^{*T}_i M g$

where $M_{[m,n]} = \int \psi_m(s)\, \phi_n(s)\, ds$ and $a = \int \mu(s)\, \gamma(s)\, ds$.
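
Continuing the numerical sketch, $M$ can be approximated by quadrature on the grid. The B-spline basis below (cubic, equally spaced knots) is a hypothetical stand-in for $\phi(s)$; the degree, knot placement, and number of splines are illustrative choices, not the paper's.

```python
import numpy as np
from scipy.interpolate import BSpline

# Continues from the FPCA sketch: s, delta, psi, K_w as defined there.
degree, n_splines = 3, 9
interior = np.linspace(0.0, 1.0, n_splines - degree + 1)
knots = np.concatenate([np.zeros(degree), interior, np.ones(degree)])
spline_cols = BSpline(knots, np.eye(n_splines), degree)(s)    # (n_grid, n_splines)
phi = np.column_stack([np.ones_like(s), spline_cols])         # prepend phi_1(s) = 1

# M[m, n] = int psi_m(s) phi_n(s) ds, approximated on the grid
M = psi[:, :K_w].T @ phi * delta
```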

14. Method 2: Penalized Functional Regression (PFR)

In matrix notation, let

$\beta^T = [\alpha + a,\ \beta,\ \gamma_0]$
$X_{[i,]} = [1,\ X_i,\ (c^{*T}_i M)_{[1]}]$
$Z_{[i,]} = (c^{*T}_i M)_{[2:K_g]}$
$u^T = \{g_k\}_{k=1}^{K_g}$

Then

$E[Y \mid X, u] = X\beta + Zu, \qquad u \sim N(0, \sigma^2_g D)$

where $D$ is the penalty matrix induced by the random walk prior.
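
Continuing the sketch, the fixed-effect and random-effect design matrices of this slide can be assembled directly from the demeaned scores and $M$:

```python
import numpy as np

# Continues from the earlier sketches: X, scores_star, M as defined there.
CM = scores_star @ M                                             # rows are c*_i^T M
X_design = np.column_stack([np.ones(X.shape[0]), X, CM[:, 0]])   # [1, X_i, (c*_i^T M)_[1]]
Z = CM[:, 1:]                                                    # (c*_i^T M)_[2:K_g]
```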

15. Method 2: Penalized Functional Regression (PFR)

Step 4. Test under the linear mixed model $E[Y \mid X, u] = X\beta + Zu$, $u \sim N(0, \sigma^2_g D)$, with $\beta^T = [\alpha + a,\ \beta,\ \gamma_0]$ as on the previous slide.

1. Test of functional form
$H_0: E(Y_i) = \alpha + X_i \beta + \bar{W}_i \beta_W \;\Leftrightarrow\; \sigma^2_g = 0$
$H_A: E(Y_i) = \alpha + X_i \beta + \int W_i(s)\, \gamma(s)\, ds \;\Leftrightarrow\; \sigma^2_g \neq 0$

2. Test of inclusion
$H_0: E(Y_i) = \alpha + X_i \beta + \int W_{i1}(s)\, \gamma_1(s)\, ds \;\Leftrightarrow\; \gamma_0 = 0 \text{ and } \sigma^2_g = 0$
$H_A: E(Y_i) = \alpha + X_i \beta + \int W_{i1}(s)\, \gamma_1(s)\, ds + \int W_{i2}(s)\, \gamma_2(s)\, ds \;\Leftrightarrow\; \gamma_0 \neq 0 \text{ or } \sigma^2_g \neq 0$
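
A rough numerical sketch of the test of functional form under this mixed model, continuing from the design matrices above: it profiles the REML criterion over the variance ratio $\lambda = \sigma^2_g / \sigma^2_e$ on a grid and approximates the non-standard null distribution of the RLRT by a simple parametric bootstrap under the fitted null model. This is only an illustration; the paper relies on the restricted-likelihood-ratio-test results of Crainiceanu and Ruppert (2004) and Greven et al. (2008) rather than this bootstrap.

```python
import numpy as np

def reml_profile(lam, y, X, Zt):
    """Profiled REML log-likelihood (up to a constant) at the variance ratio lam = sigma_g^2 / sigma_e^2."""
    n, p = X.shape
    V = np.eye(n) + lam * Zt @ Zt.T
    Vinv = np.linalg.inv(V)
    XtVX = X.T @ Vinv @ X
    b = np.linalg.solve(XtVX, X.T @ Vinv @ y)
    r = y - X @ b
    _, logdetV = np.linalg.slogdet(V)
    _, logdetXVX = np.linalg.slogdet(XtVX)
    return -0.5 * (logdetV + logdetXVX + (n - p) * np.log(r @ Vinv @ r))

def rlrt_statistic(y, X, Zt, lam_grid):
    """RLRT statistic for H0: sigma_g^2 = 0, maximising REML over a grid of lam >= 0."""
    ll = np.array([reml_profile(lam, y, X, Zt) for lam in lam_grid])
    return 2.0 * (ll.max() - reml_profile(0.0, y, X, Zt))

# Reparameterise u = L_rw b with b ~ N(0, sigma_g^2 I), where L_rw L_rw^T = D
q = Z.shape[1]
L_rw = np.tril(np.ones((q, q)))
Zt = Z @ L_rw

lam_grid = np.concatenate([[0.0], np.logspace(-4, 4, 60)])
observed = rlrt_statistic(Y, X_design, Zt, lam_grid)

# The null distribution of the RLRT is non-standard (a point mass at zero plus a
# continuous part); approximate it here by a parametric bootstrap under the
# fitted null model.
beta0, *_ = np.linalg.lstsq(X_design, Y, rcond=None)
sigma0 = np.std(Y - X_design @ beta0, ddof=X_design.shape[1])
rng = np.random.default_rng(2)
null_draws = np.array([
    rlrt_statistic(X_design @ beta0 + rng.normal(0, sigma0, Y.shape[0]),
                   X_design, Zt, lam_grid)
    for _ in range(200)
])
p_value = (null_draws >= observed).mean()
```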

16. Comparison of Approaches

Method 1: Functional Principal Components Regression (FPCR)
- Testing is done with standard likelihood-ratio tests of fixed effects.
- Selection of $K_g$ controls the smoothness of $\gamma(s)$ and is very important; the authors suggest using cross-validation.
- Overall, simpler and more straightforward to implement than PFR.

Method 2: Penalized Functional Regression (PFR)
- Testing is done with a non-standard likelihood-ratio test for random and fixed effects (Crainiceanu and Ruppert (2004); Greven et al. (2008)).
- Smoothness of $\gamma(s)$ is induced through the mixed-model framework for $g$, for a sufficiently large number of PCs.
- Overall, more flexible than FPCR but more complex to implement.
