SLIDE 1

Restricted Likelihood Ratio Tests for Functional Effects in the Functional Linear Model

by Swihart, Goldsmith, and Crainiceanu (2014) September 1, 2017

SLIDE 2

Modeling Framework

Consider {Yi, Xi, Wi} modeled with a Functional Linear Model (FLM):

E[Yi] = α + Xiβ + ∫ Wi(s)γ(s) ds    (1)

Yi: continuous, scalar response for the i-th subject
Xi: non-functional covariates with coefficients β
Wi(s): functional predictor for s ∈ [0, 1]
α: mean parameter
γ(s): coefficient function of interest

Objective: determine whether a functional predictor should be included in the FLM.

SLIDE 3

Hypothesis Test

1. Test of functional form

H0 : E(Yi) = α + Xiβ + W̄iβW
HA : E(Yi) = α + Xiβ + ∫ Wi(s)γ(s) ds    (2)

Does γ(s) need to be written as a function, or is the mean sufficient? Equivalently, H0 : γ(s) = c for some c ∈ ℝ vs HA : γ(s) ≠ c for all c ∈ ℝ.

SLIDE 4

Hypothesis Test

2. Test of inclusion

H0 : E(Yi) = α + Xiβ + ∫ Wi1(s)γ1(s) ds
HA : E(Yi) = α + Xiβ + ∫ Wi1(s)γ1(s) ds + ∫ Wi2(s)γ2(s) ds    (3)

Is the second functional predictor necessary? Equivalently, H0 : γ2(s) = 0 vs HA : γ2(s) ≠ 0.

SLIDE 5

Outline of Testing Procedure

The authors propose two testing procedures. The basic procedure is:

Step 1. Decompose the functional predictor Wi(s) using functional principal components (FPC).
Step 2. Express the coefficient function γ(s) using basis functions.
Step 3. Rewrite the FLM as a linear model or linear mixed model.
Step 4. Test H0 under the linear (mixed) model using a (restricted) likelihood ratio test.

SLIDE 6

Method 1: Functional Principal Components Reg. (FPCR)

Step 1. Use FPC to decompose the functional predictor Wi(s):

cov[Wi(s), Wi(s′)] = Σ_{k=1}^{∞} λk ψk(s) ψk(s′)

By a truncated Karhunen–Loève approximation:

Wi(s) ≈ µ(s) + Σ_{k=1}^{Kw} cik ψk(s)

λk: k-th eigenvalue (ordered non-increasing)
ψk: k-th eigenfunction
µ(s): mean function
cik = ∫ {Wi(s) − µ(s)} ψk(s) ds: k-th score for the i-th subject
Kw: truncation parameter
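On a grid, Step 1 reduces to an eigendecomposition of the sample covariance matrix, with the grid spacing entering as a quadrature weight. A minimal numerical sketch (synthetic curves; all variable names are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, S = 200, 101                      # subjects, grid points on [0, 1]
s = np.linspace(0, 1, S)
ds = s[1] - s[0]

# Synthetic functional predictors: two smooth components plus noise
W = (rng.standard_normal((n, 1)) * np.sin(2 * np.pi * s)
     + rng.standard_normal((n, 1)) * np.cos(2 * np.pi * s)
     + 0.05 * rng.standard_normal((n, S)))

mu = W.mean(axis=0)                  # estimate of the mean function mu(s)
Wc = W - mu
C = (Wc.T @ Wc) / n                  # sample covariance surface cov[W(s), W(s')]

# Eigendecomposition; the ds rescaling makes this approximate the integral
# operator, so eigenfunctions come out orthonormal in L2[0, 1]
evals, evecs = np.linalg.eigh(C * ds)
order = np.argsort(evals)[::-1]      # sort eigenvalues non-increasing
lam = evals[order]
psi = evecs[:, order] / np.sqrt(ds)  # columns: psi_k(s) on the grid

Kw = 2                               # truncation parameter
# Scores c_ik = int {W_i(s) - mu(s)} psi_k(s) ds  (Riemann sum)
c = Wc @ psi[:, :Kw] * ds

# Truncated Karhunen-Loeve reconstruction
W_hat = mu + c @ psi[:, :Kw].T
```

With two dominant components, Kw = 2 recovers the curves up to the noise floor.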

SLIDE 7

Method 1: Functional Principal Components Reg. (FPCR)

Deviating from standard FPCR, subtract the subject-specific predictor means:

Wi(s) − W̄i ≈ Σ_{k=1}^{Kg} c*ik ψk(s)

W̄i: predictor mean for the i-th subject, defined as W̄i = ∫ Wi(s) ds
c*ik = ∫ {Wi(s) − W̄i} ψk(s) ds

This is important for re-formulating the linear model to test H0.
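On gridded curves the subject-specific demeaning is a one-line operation; a small self-contained sketch (synthetic data, hypothetical names) checks the defining property ∫ {Wi(s) − W̄i} ds = 0:

```python
import numpy as np

rng = np.random.default_rng(1)
n, S = 50, 101
s = np.linspace(0, 1, S)

# Synthetic curves: random sine amplitude plus a random subject-level shift
W = (rng.standard_normal((n, 1)) * np.sin(2 * np.pi * s)
     + rng.standard_normal((n, 1)))

# Subject-specific predictor mean W_bar_i = int W_i(s) ds, approximated
# by the grid average on an equally spaced grid over [0, 1]
W_bar = W.mean(axis=1)

# Demeaned curves, from which the scores c*_ik are computed
W_dm = W - W_bar[:, None]

# By construction, int {W_i(s) - W_bar_i} ds = 0 for every subject
resid_means = W_dm.mean(axis=1)
```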

SLIDE 8

Method 1: Functional Principal Components Reg. (FPCR)

Step 2. Express γ(s) using the basis set φ(s). Define the PC basis φ(s) = {φ1(s), …, φKg(s)}, and let

γ(s) = φ(s)γ = Σ_{k=1}^{Kg} γk φk(s)

where γ = {γ1, …, γKg} are fixed basis coefficients.

SLIDE 9

Method 1: Functional Principal Components Reg. (FPCR)

Step 3. Rewrite the FLM as a linear model:

E[Yi] = α + Xiβ + ∫ Wi(s)γ(s) ds
      = α + Xiβ + W̄iγ0 + ∫ c*iᵀ ψᵀ(s)φ(s)γ ds
      = α + Xiβ + W̄iγ0 + c*iᵀγ

Let βᵀ = [α, β, γ0, γᵀ] and X[i,] = [1, Xi, W̄i, c*iᵀ]. Then

E(Y) = Xβ
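Steps 1–3 thus reduce the FLM to an ordinary linear model with design row [1, Xi, W̄i, c*iᵀ]. A minimal sketch fitting it by least squares on synthetic data (all names hypothetical; γ(s) = 2 sin(2πs) is assumed purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n, S, Kg = 300, 101, 3
s = np.linspace(0, 1, S)

# Synthetic scalar covariate and functional predictor on the grid
Xi = rng.standard_normal(n)
W = (rng.standard_normal((n, 1)) * np.sin(2 * np.pi * s)
     + rng.standard_normal((n, 1))
     + 0.1 * rng.standard_normal((n, S)))

W_bar = W.mean(axis=1)                  # grid approximation of int W_i(s) ds
Wc = W - W_bar[:, None]

# FPCA of the demeaned curves via SVD to get scores c*_i (Step 1)
_, _, Vt = np.linalg.svd(Wc, full_matrices=False)
psi = Vt[:Kg] * np.sqrt(S)              # approx. orthonormal in L2[0, 1]
c_star = Wc @ psi.T / S                 # c*_ik = int (W_i - W_bar_i) psi_k ds

# Simulated truth: alpha = 1, beta = 0.5, gamma_0 = 0.3, gamma(s) = 2 sin(2*pi*s)
y = (1.0 + 0.5 * Xi + 0.3 * W_bar
     + 2.0 * (Wc @ np.sin(2 * np.pi * s)) / S
     + 0.1 * rng.standard_normal(n))

# Design row [1, X_i, W_bar_i, c*_i^T]; coefficients [alpha, beta, gamma_0, gamma]
X = np.column_stack([np.ones(n), Xi, W_bar, c_star])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```

The fitted coefficients on Xi and W̄i should land close to the simulated 0.5 and 0.3.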

SLIDE 10

Method 1: Functional Principal Components Reg. (FPCR)

Step 4. Test under the linear model E[Yi] = α + Xiβ + W̄iγ0 + c*iᵀγ.

1. Test of functional form
H0 : E(Yi) = α + Xiβ + W̄iβW ⇔ γ = 0
HA : E(Yi) = α + Xiβ + ∫ Wi(s)γ(s) ds ⇔ γ ≠ 0

2. Test of inclusion
H0 : E(Yi) = α + Xiβ + ∫ Wi1(s)γ1(s) ds ⇔ γ0 = 0 and γ = 0
HA : E(Yi) = α + Xiβ + ∫ Wi1(s)γ1(s) ds + ∫ Wi2(s)γ2(s) ds ⇔ γ0 ≠ 0 or γ ≠ 0

(In the test of inclusion, γ0 and γ denote the coefficients attached to the second functional predictor Wi2.)
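Because both hypotheses are statements about fixed effects in an ordinary linear model, the test of functional form is a standard likelihood-ratio test of nested least-squares fits. A minimal sketch, simulating under H0 (synthetic design pieces; hypothetical names):

```python
import numpy as np

rng = np.random.default_rng(3)
n, Kg = 400, 3

# Stand-in design pieces: scalar covariate, subject mean, FPC scores
Xi = rng.standard_normal(n)
W_bar = rng.standard_normal(n)
c_star = rng.standard_normal((n, Kg))

# Simulate under H0: gamma = 0 (no functional shape beyond the mean)
y = 1.0 + 0.5 * Xi + 0.3 * W_bar + rng.standard_normal(n)

X0 = np.column_stack([np.ones(n), Xi, W_bar])   # null model: gamma = 0
X1 = np.column_stack([X0, c_star])              # full model

def rss(X, y):
    """Residual sum of squares from a least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(((y - X @ beta) ** 2).sum())

# Likelihood-ratio statistic for Gaussian errors: n * log(RSS0 / RSS1);
# asymptotically chi-squared with Kg degrees of freedom under H0
lrt = n * np.log(rss(X0, y) / rss(X1, y))
```

Since X1 nests X0, the statistic is nonnegative by construction; under H0 it should be a modest value on the χ²(Kg) scale.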

SLIDE 11

Method 2: Penalized Functional Regression (PFR)

Step 1. Use FPC to decompose the functional predictor Wi(s):

cov[Wi(s), Wi(s′)] = Σ_{k=1}^{∞} λk ψk(s) ψk(s′)

By a truncated Karhunen–Loève approximation:

Wi(s) ≈ µ(s) + Σ_{k=1}^{Kw} cik ψk(s)

λk: k-th eigenvalue (ordered non-increasing)
ψk: k-th eigenfunction
µ(s): smooth mean function
cik = ∫ {Wi(s) − µ(s)} ψk(s) ds: k-th score for the i-th subject
Kw: truncation parameter

SLIDE 12

Method 2: Penalized Functional Regression (PFR)

Step 2. Express γ(s) using the basis φ(s). Define the B-spline basis φ(s) = {φ1(s) = 1, φ2(s), …, φKg(s)}, and let

γ(s) = φ(s)g = γ0 + Σ_{k=1}^{Kg} gk φk(s)

where g = {γ0, g1, …, gKg} are basis coefficients, γ0 is fixed, and the gk are random effects in the mixed-model formulation. Use a modified first-order random-walk prior for the gk, where g1 ∼ N(0, σg²) and gl ∼ N(gl−1, σg²).
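The random-walk prior makes the increments of g independent, so the induced covariance σg²D can be obtained by inverting the first-difference matrix; equivalently, Cov(gi, gj) = σg² min(i, j), as for a random walk started at zero. A minimal sketch (the dimension K is an arbitrary illustrative choice):

```python
import numpy as np

K = 5  # number of random coefficients g_1..g_K (illustrative)

# Increment matrix L: (L g)_1 = g_1 and (L g)_l = g_l - g_{l-1} for l > 1,
# so under the prior L g is a vector of iid N(0, sigma_g^2) increments
L = np.eye(K) - np.diag(np.ones(K - 1), -1)

# g = L^{-1} (increments), hence Cov(g) = sigma_g^2 * (L^T L)^{-1} = sigma_g^2 * D
D = np.linalg.inv(L.T @ L)

# Sanity check: for a random walk, Cov(g_i, g_j) = sigma_g^2 * min(i, j)
D_expected = np.minimum.outer(np.arange(1, K + 1), np.arange(1, K + 1))
```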

SLIDE 13

Method 2: Penalized Functional Regression (PFR)

Step 3. Rewrite the FLM as a linear mixed model:

E[Yi] = α + Xiβ + ∫ Wi(s)γ(s) ds
      = α + Xiβ + a + ∫ c*iᵀ ψᵀ(s)φ(s)g ds
      = α + Xiβ + a + c*iᵀ M g

where M[m,n] = ∫ ψm(s)φn(s) ds and a = ∫ µ(s)γ(s) ds.

SLIDE 14

Method 2: Penalized Functional Regression (PFR)

In matrix notation, let

βᵀ = [α + a, β, γ0]
X[i,] = [1, Xi, (c*iᵀM)[1]]
Z[i,] = (c*iᵀM)[2:Kg]
uᵀ = {gk}_{k=1}^{Kg}

Then

E[Y | X, u] = Xβ + Zu,  u ∼ N(0, σg²D)

where D is the penalty matrix induced by the random walk prior.
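The split into fixed and random parts can be written down directly: the first column of c*ᵀM pairs with the fixed γ0, and the remaining columns pair with the random g. A shape-level sketch with stand-in matrices (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)
n, Kw, Kg = 100, 6, 4

c_star = rng.standard_normal((n, Kw))   # FPC scores (stand-in values)
M = rng.standard_normal((Kw, Kg))       # stand-in for M[m, n] = int psi_m phi_n ds
Xi = rng.standard_normal(n)

CM = c_star @ M                         # rows are c*_i^T M

# Fixed-effects design: intercept (absorbing alpha + a), X_i, first column of c* M
X = np.column_stack([np.ones(n), Xi, CM[:, 0]])
# Random-effects design: remaining columns, paired with the random g coefficients
Z = CM[:, 1:]
```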

SLIDE 15

Method 2: Penalized Functional Regression (PFR)

Step 4. Test under the linear mixed model E[Y | X, u] = Xβ + Zu, u ∼ N(0, σg²D), with βᵀ = [α + a, β, γ0].

1. Test of functional form
H0 : E(Yi) = α + Xiβ + W̄iβW ⇔ σg² = 0
HA : E(Yi) = α + Xiβ + ∫ Wi(s)γ(s) ds ⇔ σg² > 0

2. Test of inclusion
H0 : E(Yi) = α + Xiβ + ∫ Wi1(s)γ1(s) ds ⇔ γ0 = 0 and σg² = 0
HA : E(Yi) = α + Xiβ + ∫ Wi1(s)γ1(s) ds + ∫ Wi2(s)γ2(s) ds ⇔ γ0 ≠ 0 or σg² > 0
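Computing the restricted likelihood ratio statistic amounts to maximizing the restricted likelihood over the variance ratio λ = σg²/σε² and comparing to λ = 0. A minimal grid-search sketch (synthetic data; hypothetical dimensions). The null distribution of this statistic is non-standard (a point mass at zero mixed with a continuous part), so in practice p-values come from simulation as in Crainiceanu and Ruppert (2004) and Greven et al. (2008), not from a χ² table:

```python
import numpy as np

rng = np.random.default_rng(5)
n, p, q = 150, 2, 5

X = np.column_stack([np.ones(n), rng.standard_normal(n)])   # fixed effects
Z = rng.standard_normal((n, q))                             # random-effect design
# Random-walk penalty matrix: Cov(g_i, g_j) proportional to min(i, j)
D = np.minimum.outer(np.arange(1, q + 1), np.arange(1, q + 1)).astype(float)

# Simulate under the alternative: u ~ N(0, sigma_g^2 * D) with sigma_g = 0.8
u = np.linalg.cholesky(D) @ rng.standard_normal(q) * 0.8
y = X @ np.array([1.0, 0.5]) + Z @ u + rng.standard_normal(n)

def restricted_loglik(lam):
    """Profiled REML criterion (up to constants) at ratio lam = sigma_g^2 / sigma_e^2."""
    V = np.eye(n) + lam * Z @ D @ Z.T
    Vi = np.linalg.inv(V)
    XtViX = X.T @ Vi @ X
    P = Vi - Vi @ X @ np.linalg.inv(XtViX) @ X.T @ Vi
    _, ldV = np.linalg.slogdet(V)
    _, ldX = np.linalg.slogdet(XtViX)
    return -0.5 * (ldV + ldX + (n - p) * np.log(y @ P @ y))

# RLRT statistic for H0: lam = 0 vs HA: lam > 0, maximizing over a coarse grid
grid = np.concatenate([[0.0], np.logspace(-3, 2, 60)])
ll = np.array([restricted_loglik(l) for l in grid])
rlrt = 2.0 * (ll.max() - ll[0])
```

Because the grid contains λ = 0, the statistic is nonnegative by construction; a finer grid or a one-dimensional optimizer would sharpen the maximization.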

SLIDE 16

Comparison of Approaches

Method 1: Functional Principal Components Regression (FPCR)
  • Testing is done with standard likelihood-ratio tests of fixed effects.
  • Selection of Kg controls the smoothness of γ(s) and is very important; the authors suggest using cross-validation.
  • Overall, simpler and more straightforward than PFR to implement.

Method 2: Penalized Functional Regression (PFR)
  • Testing is done with a non-standard likelihood-ratio test for random and fixed effects (Crainiceanu and Ruppert, 2004; Greven et al., 2008).
  • Smoothness of γ(s) is induced through the mixed-model framework for g, for a sufficiently large number of PCs.
  • Overall, more flexible than FPCR but more complex to implement.
