Additional Topics on Linear Regression
Ping Yu
School of Economics and Finance The University of Hong Kong
Ping Yu (HKU) Additional Topics 1 / 49
1 Tests for Functional Form Misspecification
2 Nonlinear Least Squares
3 Omitted and Irrelevant Variables
4 Model Selection
5 Generalized Least Squares
6 Testing for Heteroskedasticity
7 Regression Intervals and Forecast Intervals
Tests for Functional Form Misspecification
A popular test for functional form misspecification is Ramsey's RESET. Start from the linear model
$$y_i = x_i'\beta + u_i,$$
and let $\hat y_i = x_i'\hat\beta$ be the OLS fitted values. Collect the powers $\tilde z_i = (\hat y_i^2, \dots, \hat y_i^m)'$ and estimate the auxiliary regression
$$y_i = x_i'\tilde\beta + \tilde z_i'\tilde\gamma + \tilde u_i.$$
Under correct specification of the conditional mean, $\gamma = 0$, so evidence that $\tilde\gamma \neq 0$ signals neglected nonlinearity.
Test $H_0\colon \gamma = 0$ with the Wald statistic $W_n = n\,\tilde\gamma'\hat V_{\tilde\gamma}^{-1}\tilde\gamma$. Under $H_0$, $W_n \xrightarrow{d} \chi^2_{m-1}$, so we reject when $W_n$ exceeds the $\chi^2_{m-1}$ critical value. In practice a small $m$ (say $m = 2$, $3$ or $4$) is used; rejection indicates misspecification but does not point to a particular alternative.
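The RESET recipe above can be sketched in a few lines. This is an illustrative simulation, not code from the slides: the data-generating process, the choice $m = 3$, and the homoskedastic Wald form of the statistic are all assumptions for the sketch.

```python
import numpy as np

# Illustrative RESET test: regress y on x, add powers of the fitted
# values, and test their joint significance (assumed DGP and m = 3).
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + 0.5 * x**2 + rng.normal(size=n)  # true model is quadratic

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
yhat = X @ beta

# Auxiliary regression: y on (1, x, yhat^2, yhat^3), i.e. m = 3
Z = np.column_stack([X, yhat**2, yhat**3])
b = np.linalg.lstsq(Z, y, rcond=None)[0]
resid = y - Z @ b

# Homoskedastic Wald statistic for the two added regressors
u0 = y - yhat                       # restricted (linear-model) residuals
ssr0, ssr1 = u0 @ u0, resid @ resid
W = n * (ssr0 - ssr1) / ssr1        # compare with chi^2_2 critical values
print(W > 5.99)                     # rejects at 5% for this nonlinear DGP
```

With the quadratic term present in the DGP, the statistic far exceeds the 5% critical value of $\chi^2_2$.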
Nonlinear Least Squares
Example (Box-Cox transformation):
$$x^{(\lambda)} = \begin{cases} (x^\lambda - 1)/\lambda, & \lambda \neq 0,\\ \ln x, & \lambda = 0,\end{cases}$$
which nests the linear specification ($\lambda = 1$) and the logarithmic one ($\lambda \to 0$), letting the data choose the functional form.
[Figure: the Box-Cox transformation $x^{(\lambda)}$ plotted for several values of $\lambda$]
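A minimal implementation of the transformation, with the $\lambda = 0$ case handled as the log limit (the sample values are illustrative):

```python
import numpy as np

def box_cox(x, lam):
    """Box-Cox transform (x**lam - 1)/lam, with the log limit at lam = 0."""
    x = np.asarray(x, dtype=float)
    if lam == 0.0:
        return np.log(x)
    return (x**lam - 1.0) / lam

x = np.array([0.5, 1.0, 2.0, 4.0])
print(box_cox(x, 1.0))   # linear case: x - 1
print(box_cox(x, 0.0))   # logarithmic case: ln(x)
```

For small $\lambda$ the transform is close to $\ln x$, which is why the log specification is the continuous limit at $\lambda = 0$.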
Examples of nonlinear regression functions $m(x|\theta)$:
$m(x|\theta) = \theta_1' x_1 + \theta_2' x_1\, G\!\left((x_2 - \theta_3)/\theta_4\right)$: Smooth Transition, where $G$ is a smooth CDF-type transition function;
$m(x|\theta) = (\theta_1' x_1)\,1(x_2 \le \theta_3) + (\theta_2' x_1)\,1(x_2 > \theta_3)$: Threshold Regression.
[Figure: examples of nonlinear regression functions $m(x|\theta)$]
The nonlinear least squares (NLLS) estimator solves
$$\hat\theta = \arg\min_\theta \sum_{i=1}^n \left(y_i - m(x_i|\theta)\right)^2,$$
with first-order conditions $\sum_{i=1}^n m_\theta(x_i|\hat\theta)\,\hat u_i = 0$, where $m_\theta(x|\theta) = \frac{\partial}{\partial\theta} m(x|\theta)$ and $\hat u_i = y_i - m(x_i|\hat\theta)$.
Writing $m_{\theta i} = m_\theta(x_i|\theta_0)$, under regularity conditions
$$\sqrt{n}\,(\hat\theta - \theta_0) \xrightarrow{d} N(0, V), \qquad V = \left(E[m_{\theta i} m_{\theta i}']\right)^{-1} E[m_{\theta i} m_{\theta i}' u_i^2] \left(E[m_{\theta i} m_{\theta i}']\right)^{-1},$$
which is estimated by
$$\hat V = \left(\sum_{i=1}^n \hat m_{\theta i} \hat m_{\theta i}'\right)^{-1} \left(\sum_{i=1}^n \hat m_{\theta i} \hat m_{\theta i}' \hat u_i^2\right) \left(\sum_{i=1}^n \hat m_{\theta i} \hat m_{\theta i}'\right)^{-1}, \qquad \hat m_{\theta i} = m_\theta(x_i|\hat\theta).$$
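The minimization can be carried out with Gauss-Newton iterations, which repeatedly linearize $m(x|\theta)$ around the current estimate. This sketch uses an assumed model $m(x|\theta) = \theta_1 e^{\theta_2 x}$ and assumed starting values; the slides do not prescribe a particular algorithm.

```python
import numpy as np

# NLLS by Gauss-Newton for the illustrative model m(x|theta) = t1 * exp(t2 * x).
rng = np.random.default_rng(1)
n = 400
x = rng.uniform(0, 2, size=n)
theta_true = np.array([1.5, 0.8])
y = theta_true[0] * np.exp(theta_true[1] * x) + 0.1 * rng.normal(size=n)

def m(theta):
    return theta[0] * np.exp(theta[1] * x)

def m_theta(theta):
    # Jacobian columns: dm/dt1 and dm/dt2
    e = np.exp(theta[1] * x)
    return np.column_stack([e, theta[0] * x * e])

theta = np.array([1.0, 0.5])             # assumed starting values
for _ in range(50):                      # Gauss-Newton iterations
    u = y - m(theta)
    J = m_theta(theta)
    step = np.linalg.lstsq(J, u, rcond=None)[0]   # (J'J)^{-1} J'u
    theta = theta + step
    if np.max(np.abs(step)) < 1e-10:
        break

print(theta)   # should be close to (1.5, 0.8)
```

Each step solves a linear least squares problem in the Jacobian, mirroring the first-order conditions above.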
Omitted and Irrelevant Variables
Compare the long and short regressions
$$y_i = x_{1i}'\beta_1 + x_{2i}'\beta_2 + u_i, \quad E[x_i u_i] = 0,$$
$$y_i = x_{1i}'\gamma_1 + v_i, \quad E[x_{1i} v_i] = 0.$$
Substituting $y_i = x_{1i}'\beta_1 + x_{2i}'\beta_2 + u_i$ into the definition of $\gamma_1$ gives
$$\gamma_1 = \beta_1 + \Gamma\beta_2,$$
where $\Gamma = E[x_{1i}x_{1i}']^{-1} E[x_{1i}x_{2i}']$ is the coefficient from a regression of $x_{2i}$ on $x_{1i}$. So $\gamma_1 = \beta_1$ only if $\Gamma = 0$ or $\beta_2 = 0$; otherwise omitting $x_{2i}$ biases the coefficient on $x_{1i}$.
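The formula $\gamma_1 = \beta_1 + \Gamma\beta_2$ is easy to verify numerically. The DGP below (correlated scalar regressors, $\beta_1 = 1$, $\beta_2 = 2$) is an assumption for the sketch, with a large sample standing in for population moments.

```python
import numpy as np

# Check of the omitted-variables formula gamma1 = beta1 + Gamma * beta2.
rng = np.random.default_rng(2)
n = 200_000
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)     # x2 correlated with x1
beta1, beta2 = 1.0, 2.0
y = beta1 * x1 + beta2 * x2 + rng.normal(size=n)

# Short regression of y on x1 (all variables are mean zero, so no intercept)
gamma1 = (x1 @ y) / (x1 @ x1)

# Gamma: coefficient from regressing x2 on x1
Gamma = (x1 @ x2) / (x1 @ x1)

print(gamma1, beta1 + Gamma * beta2)   # agree up to sampling error
```

Here $\Gamma \approx 0.6$, so the short-regression coefficient is close to $1 + 0.6 \times 2 = 2.2$ rather than $\beta_1 = 1$.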
Conversely, suppose $\beta_2 = 0$, so $x_{2i}$ is irrelevant. The short-regression estimator, $\hat\beta_1 = (X_1'X_1)^{-1}X_1'y$, is consistent with asymptotic variance $Q_{11}^{-1}\sigma^2$, where $Q_{11} = E[x_{1i}x_{1i}']$, while estimating $\beta_1$ from the long regression yields asymptotic variance $Q_{11.2}^{-1}\sigma^2$, with $Q_{11.2} = Q_{11} - Q_{12}Q_{22}^{-1}Q_{21}$.
If $Q_{12} = E[x_{1i}x_{2i}'] = 0$ (so the variables are orthogonal) then the two estimators have the same asymptotic variance. Otherwise, $Q_{12}Q_{22}^{-1}Q_{21} > 0$, so $Q_{11} > Q_{11.2}$ and $Q_{11}^{-1}\sigma^2 < Q_{11.2}^{-1}\sigma^2$: including irrelevant variables that are correlated with $x_{1i}$ inflates the variance of $\hat\beta_1$.
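A small Monte Carlo illustrates the variance comparison. The DGP (correlation 0.9 between the regressors, $\beta_2 = 0$) and the number of replications are assumptions for the sketch.

```python
import numpy as np

# Monte Carlo: including an irrelevant but correlated regressor inflates
# the sampling variance of the estimator of beta1.
rng = np.random.default_rng(3)
n, reps = 200, 2000
short_est, long_est = [], []
for _ in range(reps):
    x1 = rng.normal(size=n)
    x2 = 0.9 * x1 + np.sqrt(1 - 0.81) * rng.normal(size=n)  # corr(x1, x2) = 0.9
    y = 1.0 * x1 + rng.normal(size=n)                        # beta2 = 0
    short_est.append((x1 @ y) / (x1 @ x1))                   # short regression
    X = np.column_stack([x1, x2])
    long_est.append(np.linalg.lstsq(X, y, rcond=None)[0][0]) # long regression
print(np.var(short_est) < np.var(long_est))  # short regression is more precise
```

With correlation 0.9, the variance ratio is roughly $1/(1-0.9^2) \approx 5$, matching the $Q_{11.2}$ formula above.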
Model Selection
Consider two nested models,
$$\mathcal{M}_1\colon y_i = x_{1i}'\beta_1 + u_i \qquad \text{and} \qquad \mathcal{M}_2\colon y_i = x_{1i}'\beta_1 + x_{2i}'\beta_2 + u_i,$$
with OLS estimators $\hat\beta_1$ and $\hat\beta_2$, etc., and conditionally homoskedastic errors, $E[u_i^2|x_{1i},x_{2i}] = \sigma^2$.
Selection by testing: choose $\mathcal{M}_2$ when the Wald statistic for $H_0\colon \beta_2 = 0$ satisfies $W_n > c_\alpha$, where $c_\alpha$ is the level-$\alpha$ critical value of the $\chi^2_{k_2}$ distribution and $k_2 = \dim(x_{2i})$.
The Akaike information criterion (AIC) for model $m$ is
$$AIC_m = \ln\hat\sigma_m^2 + \frac{2k_m}{n},$$
where $\hat\sigma_m^2$ is the variance estimate for model $m$ and $k_m$ is the number of coefficients; this is roughly $-2\ln(\text{likelihood})/n$ plus the penalty $2k_m/n$ (neglecting the constant).
AIC selects $\mathcal{M}_2$ when $AIC_2 < AIC_1$, i.e., when
$$\ln\hat\sigma_1^2 + \frac{2k_1}{n} > \ln\hat\sigma_2^2 + \frac{2(k_1 + k_2)}{n},$$
equivalently when $LR_n \equiv n\left(\ln\hat\sigma_1^2 - \ln\hat\sigma_2^2\right) > 2k_2$. Since $LR_n \xrightarrow{d} \chi^2_{k_2}$ when $\beta_2 = 0$ and $P(\chi^2_{k_2} < 2k_2) < 1$, AIC selects the larger model with positive probability even asymptotically: AIC tends to overfit.
The Bayesian information criterion (BIC) for model $m$ replaces the AIC penalty with a heavier one:
$$BIC_m = \ln\hat\sigma_m^2 + \frac{k_m \ln n}{n}.$$
Because $\ln n \to \infty$, the penalty eventually dominates the gain in fit from irrelevant regressors, so BIC is model-selection consistent: it picks the true model with probability approaching one, while AIC does not.
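The two criteria are simple to compute from fitted residual variances. The DGP below (an irrelevant regressor in the candidate model) is an assumption for the sketch; the formulas are the ones above.

```python
import numpy as np

# AIC = ln(sigma2_hat) + 2k/n and BIC = ln(sigma2_hat) + k*ln(n)/n
# for two nested models, where x2 is irrelevant in the true model.
rng = np.random.default_rng(4)
n = 300
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

def sigma2_hat(X):
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    u = y - X @ b
    return (u @ u) / n

X1 = np.column_stack([np.ones(n), x1])
X2 = np.column_stack([np.ones(n), x1, x2])
s1, s2 = sigma2_hat(X1), sigma2_hat(X2)
k1, k2 = X1.shape[1], X2.shape[1]

aic = [np.log(s1) + 2 * k1 / n, np.log(s2) + 2 * k2 / n]
bic = [np.log(s1) + k1 * np.log(n) / n, np.log(s2) + k2 * np.log(n) / n]
print(aic, bic)   # the smaller value indicates the preferred model
```

Adding a regressor can never raise the fitted $\hat\sigma^2$, so the penalty terms do all the work; since $\ln n > 2$ here, BIC penalizes the extra regressor more heavily than AIC.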
Generalized Least Squares
Consider the linear regression $y_i = x_i'\beta + u_i$ with heteroskedastic errors: the variances $\sigma_1^2, \dots, \sigma_n^2$ differ across observations, with $\sigma_i^2 = \sigma^2(x_i) = E[u_i^2|x_i]$.
The generalized least squares (GLS) estimator minimizes the weighted criterion $\sum_{i=1}^n (y_i - x_i'\beta)^2/\sigma_i^2$, which yields
$$\tilde\beta = \left(\sum_{i=1}^n \sigma_i^{-2} x_i x_i'\right)^{-1} \sum_{i=1}^n \sigma_i^{-2} x_i y_i.$$
GLS is infeasible because $\sigma_1^2, \dots, \sigma_n^2$ are unknown. A simple parametric model for the skedastic function is
$$\sigma_i^2 = \alpha_0 + z_{1i}'\alpha_1 = z_i'\alpha, \qquad z_i = (1, z_{1i}')',$$
where $z_{1i}$ contains functions of $x_i$. Since $E[u_i^2|x_i] = \sigma_i^2$, we can write
$$u_i^2 = \alpha_0 + z_{1i}'\alpha_1 + \xi_i, \qquad E[\xi_i|x_i] = 0,$$
so $\alpha$ is identified by a regression of $u_i^2$ on $z_i$.
Since $u_i^2$ is unobserved, replace it with the squared OLS residual $\hat u_i^2$ and estimate
$$\hat\alpha = \left(\sum_{i=1}^n z_i z_i'\right)^{-1} \sum_{i=1}^n z_i \hat u_i^2.$$
The substitution is asymptotically negligible: $\hat u_i^2 - u_i^2 = -2u_i x_i'(\hat\beta - \beta) + \left(x_i'(\hat\beta - \beta)\right)^2$, and both terms vanish fast enough after averaging that
$$\sqrt{n}\,(\hat\alpha - \alpha) \xrightarrow{d} N(0, V_\alpha), \qquad V_\alpha = E[z_i z_i']^{-1} E[z_i z_i' \xi_i^2]\, E[z_i z_i']^{-1}.$$
Estimate the skedastic function $\sigma_i^2 = z_i'\alpha$ by $\tilde\sigma_i^2 = z_i'\hat\alpha$, assuming $\tilde\sigma_i^2 > 0$ for all $i$. Then set the weights to $\tilde\sigma_1^{-2}, \dots, \tilde\sigma_n^{-2}$.
If $\tilde\sigma_i^2 < 0$, or $\tilde\sigma_i^2 \approx 0$ for some $i$, use a trimming rule
$$\bar\sigma_i^2 = \max\left\{\tilde\sigma_i^2,\ \sigma_0^2\right\}$$
for some fixed $\sigma_0^2 > 0$.
The feasible GLS (FGLS) estimator replaces the unknown $\sigma_i^2$ with $\tilde\sigma_i^2$:
$$\tilde\beta = \left(\sum_{i=1}^n \tilde\sigma_i^{-2} x_i x_i'\right)^{-1} \sum_{i=1}^n \tilde\sigma_i^{-2} x_i y_i.$$
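The two-step recipe (skedastic regression, then weighted least squares with trimmed weights) can be sketched as follows. The DGP, the specification $z_i = (1, x_i^2)'$, and the trimming floor $\sigma_0^2 = 0.05$ are assumptions for the sketch.

```python
import numpy as np

# Feasible GLS: OLS residuals -> skedastic regression -> weighted LS.
rng = np.random.default_rng(5)
n = 1000
x = rng.normal(size=n)
sigma2 = 0.5 + 1.5 * x**2                    # true skedastic function
y = 1.0 + 2.0 * x + np.sqrt(sigma2) * rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
uhat2 = (y - X @ beta_ols)**2

# Step 1: skedastic regression of uhat^2 on z = (1, x^2)
Z = np.column_stack([np.ones(n), x**2])
alpha = np.linalg.lstsq(Z, uhat2, rcond=None)[0]
sig2_tilde = Z @ alpha
sig2_bar = np.maximum(sig2_tilde, 0.05)      # trimming rule keeps weights positive

# Step 2: weighted least squares with weights 1/sig2_bar
w = 1.0 / sig2_bar
XtWX = X.T @ (w[:, None] * X)
XtWy = X.T @ (w * y)
beta_fgls = np.linalg.solve(XtWX, XtWy)
print(beta_ols, beta_fgls)                   # both near (1, 2)
```

Both estimators are consistent here; the payoff from FGLS is a smaller asymptotic variance when the skedastic model is a good approximation.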
For GLS with known $\sigma_i^2$, we have shown $\sqrt{n}(\tilde\beta - \beta) \xrightarrow{d} N(0, V)$ with $V = \left(E[\sigma_i^{-2} x_i x_i']\right)^{-1}$. When the skedastic model is correctly specified, $\sigma_i^2 = z_i'\alpha$, FGLS is asymptotically equivalent to GLS, and $V$ can be estimated by
$$\hat V = n\left(\sum_{i=1}^n \tilde\sigma_i^{-2} x_i x_i'\right)^{-1}.$$
A version that remains valid when the skedastic model is misspecified is
$$\hat V = n\left(\sum_{i=1}^n \tilde\sigma_i^{-2} x_i x_i'\right)^{-1} \left(\sum_{i=1}^n \tilde\sigma_i^{-4} \hat u_i^2\, x_i x_i'\right) \left(\sum_{i=1}^n \tilde\sigma_i^{-2} x_i x_i'\right)^{-1},$$
based on the estimated weights $\tilde\sigma_1^{-2}, \dots, \tilde\sigma_n^{-2}$.
Testing for Heteroskedasticity
The null of homoskedasticity is $H_0\colon \sigma^2(x) = \sigma^2$, i.e., the conditional variance $E[u_i^2|x_i]$ does not depend on $x_i$. In terms of the skedastic regression $\sigma_i^2 = \alpha_0 + z_{1i}'\alpha_1$, the null is $H_0\colon \alpha_1 = 0$, a restriction of dimension $q = \dim(z_{1i})$, so the tests are compared with $\chi^2_q$.
A test can be based on the estimated skedastic regression of $\hat u_i^2$ on $z_i$, testing $\alpha_1 = 0$ with a Wald-type statistic. Under $H_0$ and conditional normality, $Var(u_i^2|x_i) = 2\sigma^4$, and the statistic simplifies to the Breusch-Pagan form, asymptotically $\chi^2_q$, which can be computed as $nR^2$ from the regression of $\hat u_i^2$ on $z_i$. A version robust to nonnormality replaces $2\hat\sigma^4$ with the direct variance estimate $\frac{1}{n}\sum_{i=1}^n (\hat u_i^2 - \hat\sigma^2)^2$.
White's test takes $z_{1i}$ to be the non-redundant squares and cross products of the elements of $x_i$; with the intercept, $z_i = (1, z_{1i}')'$ as above. The statistic has the quadratic form
$$W_n = n\,\hat\alpha_1'\, \hat D_n^{-1}\, \hat\alpha_1 \xrightarrow{d} \chi^2_q \quad \text{under } H_0,$$
where $\hat D_n$ is a consistent estimate of the asymptotic variance of $\hat\alpha_1$ built from $\sum_{i=1}^n z_{1i} z_{1i}'$ and the $\hat u_i^2$.
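An $nR^2$-type version of these tests is straightforward to compute. This sketch uses an assumed heteroskedastic DGP and the assumed specification $z_i = (1, x_i, x_i^2)'$, so $q = 2$.

```python
import numpy as np

# nR^2 heteroskedasticity test: regress squared OLS residuals on
# z = (1, x, x^2) and compare n*R^2 with chi^2_2 critical values.
rng = np.random.default_rng(6)
n = 800
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + (0.5 + np.abs(x)) * rng.normal(size=n)  # heteroskedastic

X = np.column_stack([np.ones(n), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
u2 = (y - X @ b)**2

Z = np.column_stack([np.ones(n), x, x**2])
a = np.linalg.lstsq(Z, u2, rcond=None)[0]
fitted = Z @ a
R2 = 1.0 - np.sum((u2 - fitted)**2) / np.sum((u2 - u2.mean())**2)
stat = n * R2        # compare with the chi^2_2 critical value (q = 2)
print(stat > 5.99)   # rejects at 5% for this heteroskedastic DGP
```

Under homoskedasticity the $R^2$ of this auxiliary regression converges to zero, so $nR^2$ stays moderate; here the variance depends strongly on $x$, and the statistic is far in the rejection region.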
A convenient special case regresses $\hat u_i^2$ on $(1, \hat y_i, \hat y_i^2)'$, where $\hat y_i = x_i'\hat\beta$ are the fitted values; the resulting $nR^2$ statistic for the two slope coefficients has a limiting $\chi^2_2$ distribution under $H_0$.
Regression Intervals and Forecast Intervals
Note: for a genuine forecast, the point $x$ cannot be the same as any $x_i$ observed. Why?
For a new observation with regressor value $x$, the forecast is $\hat y = x'\hat\beta$ and the forecast error is $\hat e_i = y_i - x'\hat\beta = u_i - x'(\hat\beta - \beta)$, so
$$E[\hat e_i^2 | x_i = x] = \sigma^2 + x'\,\mathrm{Var}(\hat\beta)\,x.$$
With $\hat s(x)^2 = \hat\sigma^2 + x'\hat V_{\hat\beta}\,x$, an approximate 95% forecast interval for $y_i$ is
$$x'\hat\beta \pm 2\,\hat s(x).$$
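The forecast-interval formula can be sketched directly. The DGP, the forecast point $x_0 = (1, 0.5)'$, and the homoskedastic variance estimate $\hat V_{\hat\beta} = \hat\sigma^2 (X'X)^{-1}$ are assumptions for the sketch.

```python
import numpy as np

# Forecast interval x0'beta-hat +/- 2*s(x0), with s(x)^2 = sigma2 + x'V x.
rng = np.random.default_rng(7)
n = 500
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - X @ beta
sigma2 = (u @ u) / (n - 2)
V = sigma2 * np.linalg.inv(X.T @ X)           # estimated Var(beta-hat)

x0 = np.array([1.0, 0.5])                     # forecast point: intercept, x = 0.5
yhat = x0 @ beta
s = np.sqrt(sigma2 + x0 @ V @ x0)             # forecast standard error s(x0)
interval = (yhat - 2 * s, yhat + 2 * s)
print(yhat, interval)
```

The estimation term $x_0'\hat V x_0$ shrinks at rate $1/n$, but the $\hat\sigma^2$ term does not: even with a huge sample, the forecast interval stays wide because the new $u_i$ is unpredictable.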