Projection
Ping Yu
School of Economics and Finance The University of Hong Kong
Ping Yu (HKU) Projection 1 / 42
Outline

1. Hilbert Space and Projection Theorem
2. Projection in the L2 Space
3. Projection in R^n
   - Projection Matrices
4. Partitioned Fit and Residual Regression
Hilbert Space and Projection Theorem
A metric space (H, d) is complete if every Cauchy sequence in H converges in H, where d is a metric on H; a sequence {x_n} is Cauchy if for every ε > 0 there exists an N such that for all natural numbers m, n > N, d(x_m, x_n) < ε.
If ⟨x, y⟩ = 0, we say x is orthogonal to y and denote it as x ⊥ y; in that case ‖x + y‖² = ‖x‖² + ‖y‖².
The projection of y onto a closed linear subspace M solves min_{h∈M} ‖y − h‖².
we call P an orthogonal projector.
Let M1 ⊆ M2 be closed linear subspaces with projections Π1 and Π2. Decompose y = Π2(y) + Π2⊥(y). Then

Π1(y) = Π1(Π2(y)) + Π1(Π2⊥(y)) = Π1(Π2(y)),

since ⟨Π2⊥(y), x⟩ = 0 for every x ∈ M2, hence for every x ∈ M1, so Π1(Π2⊥(y)) = 0.
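As a numerical illustration of the law of iterated projections, the following sketch (with assumed toy data) projects a vector onto a large subspace and then onto a nested subspace, and checks that this equals projecting onto the small subspace directly.

```python
import numpy as np

# Law of iterated projections, checked numerically on toy data:
# with M1 = span(X1) contained in M2 = span([X1, X2]),
# projecting y onto M2 and then onto M1 equals projecting y onto M1 directly.
rng = np.random.default_rng(0)
n = 50
X1 = rng.standard_normal((n, 2))
X2 = rng.standard_normal((n, 3))
y = rng.standard_normal(n)

def proj(X, v):
    """Orthogonal projection of v onto the column span of X."""
    return X @ np.linalg.solve(X.T @ X, X.T @ v)

M2_basis = np.hstack([X1, X2])           # M1 = span(X1) is a subspace of M2
direct = proj(X1, y)                     # Pi_1(y)
iterated = proj(X1, proj(M2_basis, y))   # Pi_1(Pi_2(y))
print(np.allclose(direct, iterated))     # True
```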
Projection in the L2 Space
min_{h∈span(x)} E[(y − h)²] = min_{β∈R^k} E[(y − x′β)²],

where span(x) = {x′β : β ∈ R^k}.
β* = arg min_{β∈R^k} E[(y − x′β)²], so by the first-order condition² E[x(y − x′β*)] = 0 and β* = E[xx′]⁻¹E[xy].

²∂(a′x)/∂x = ∂(x′a)/∂x = a.
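A Monte Carlo sketch of the projection coefficient, with an assumed toy model: replacing the expectations in E[xx′]⁻¹E[xy] by large-sample averages should approximately recover the coefficient of a linear model whose error is orthogonal to the regressors.

```python
import numpy as np

# Assumed toy model: y = x'beta + e with E[xe] = 0.
# The projection coefficient E[xx']^{-1} E[xy], approximated by sample
# averages, should be close to beta for a large sample.
rng = np.random.default_rng(1)
n = 200_000
x = np.column_stack([np.ones(n), rng.standard_normal(n)])
beta_true = np.array([1.0, 2.0])
y = x @ beta_true + rng.standard_normal(n)

Exx = x.T @ x / n                 # approximates E[xx']
Exy = x.T @ y / n                 # approximates E[xy]
beta_star = np.linalg.solve(Exx, Exy)
print(np.round(beta_star, 2))
```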
The projection Π(y) = arg min_{h∈M} E[(y − h)²] has error y − Π(y) orthogonal to M. In particular, projecting onto the constants, E[y] = arg min_{ε∈R} E[(y − ε)²].
min_{β∈R^k} E[(y − x′β)²] = min_{β∈R^k} { E[y²] − 2β′E[xy] + β′E[xx′]β }.
Projection in R^n
b = arg min_{β∈R^k} SSR(β) = arg min_{β∈R^k} ∑_{i=1}^n (y_i − x_i′β)²,

where

SSR(β) = ∑_{i=1}^n (y_i − x_i′β)² = ∑_{i=1}^n y_i² − 2β′ ∑_{i=1}^n x_i y_i + β′ ( ∑_{i=1}^n x_i x_i′ ) β.
The first-order condition³ gives the normal equations

∑_{i=1}^n x_i y_i = ( ∑_{i=1}^n x_i x_i′ ) b.

³∂(a′x)/∂x = ∂(x′a)/∂x = a, and ∂(x′Ax)/∂x = (A + A′)x.
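In matrix form the normal equations are X′y = X′Xb, and solving them reproduces the least-squares solution; a minimal numerical check with assumed random data:

```python
import numpy as np

# The normal equations X'y = X'X b, solved directly, agree with
# numpy's least-squares routine on the same (random toy) data.
rng = np.random.default_rng(2)
n, k = 100, 3
X = rng.standard_normal((n, k))
y = rng.standard_normal(n)

b_normal = np.linalg.solve(X.T @ X, X.T @ y)      # from the normal equations
b_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)   # direct least squares
print(np.allclose(b_normal, b_lstsq))             # True
```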
Solving the normal equations, b = ( ∑_{i=1}^n x_i x_i′ )⁻¹ ∑_{i=1}^n x_i y_i, i.e., b = ( (1/n) ∑_{i=1}^n x_i x_i′ )⁻¹ ( (1/n) ∑_{i=1}^n x_i y_i ), the sample analog of β* = E[xx′]⁻¹E[xy].
With M = span(X)⁴ and the Euclidean inner product⁵ on R^n,

min_{h∈M} ‖y − h‖² = min_{β∈R^k} ‖y − Xβ‖² = min_{β∈R^k} ∑_{i=1}^n (y_i − x_i′β)².

⁴span(X) = {Xβ : β ∈ R^k} is called the column space or range space of X.
⁵Recall that for x = (x_1, …, x_n) and z = (z_1, …, z_n), the Euclidean inner product of x and z is ⟨x, z⟩ = ∑_{i=1}^n x_i z_i, so ‖x‖² = ⟨x, x⟩ = ∑_{i=1}^n x_i².
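The geometric content of least squares is that Xb is the point of span(X) closest to y, so the residual is orthogonal to every column of X; a quick check with assumed random data:

```python
import numpy as np

# The least-squares residual y - Xb is orthogonal to span(X):
# X'(y - Xb) = 0 by the normal equations.
rng = np.random.default_rng(3)
X = rng.standard_normal((60, 4))
y = rng.standard_normal(60)

b = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ b
print(np.allclose(X.T @ resid, 0))  # True
```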
min_{β∈R^k} ‖y − Xβ‖²_W, where ‖v‖²_W = v′Wv for a positive definite weight matrix W.
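A sketch of this weighted criterion, under the assumption of a diagonal weight matrix W: minimizing ‖y − Xβ‖²_W gives b = (X′WX)⁻¹X′Wy, which coincides with ordinary least squares after transforming the data by W^{1/2}.

```python
import numpy as np

# Weighted least squares with an assumed diagonal W:
# b = (X'WX)^{-1} X'Wy equals OLS on (W^{1/2} X, W^{1/2} y).
rng = np.random.default_rng(4)
n, k = 80, 3
X = rng.standard_normal((n, k))
y = rng.standard_normal(n)
w = rng.uniform(0.5, 2.0, size=n)          # diagonal of W, all positive

b_wls = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
sqrt_w = np.sqrt(w)
b_trans, *_ = np.linalg.lstsq(sqrt_w[:, None] * X, sqrt_w * y, rcond=None)
print(np.allclose(b_wls, b_trans))  # True
```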
Projection Matrices
P_X = X(X′X)⁻¹X′ and M_X = I_n − P_X = I_n − X(X′X)⁻¹X′.

Both are symmetric and idempotent⁶, with P_X X = X, M_X X = 0, tr(P_X) = k, and tr(M_X) = n − k.⁷

⁶A square matrix A is idempotent if A² = AA = A.
⁷The trace of a square matrix is the sum of its diagonal elements; tr(A + B) = tr(A) + tr(B) and tr(AB) = tr(BA).
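These identities are easy to verify numerically; a minimal sketch with assumed random data (n = 30, k = 4):

```python
import numpy as np

# Verify the projection-matrix identities: P = X(X'X)^{-1}X', M = I - P
# are idempotent, PX = X, MX = 0, tr(P) = k, tr(M) = n - k.
rng = np.random.default_rng(5)
n, k = 30, 4
X = rng.standard_normal((n, k))
P = X @ np.linalg.solve(X.T @ X, X.T)
M = np.eye(n) - P

print(np.allclose(P @ P, P), np.allclose(M @ M, M))   # True True (idempotent)
print(np.allclose(P @ X, X), np.allclose(M @ X, 0))   # True True
print(round(np.trace(P)), round(np.trace(M)))         # 4 26
```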
Partitioned Fit and Residual Regression
b = (b1′, b2′)′, where we partition X = (X1, X2) conformably.
b1 = (X1⊥2′ X1⊥2)⁻¹ X1⊥2′ y⊥2 = (X1′M2X1)⁻¹ X1′M2y,

where X1⊥2 ≡ M2X1 and y⊥2 ≡ M2y.
Define P1⊥2 ≡ X1⊥2 (X1⊥2′ X1⊥2)⁻¹ X1⊥2′; then P1⊥2 y = P1⊥2(Π(y)).
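The Frisch-Waugh-Lovell result above can be checked numerically: the X1-block of the full OLS coefficient equals the coefficient from regressing M2y on M2X1. A sketch with assumed random data:

```python
import numpy as np

# FWL check: the X1-block of OLS on (X1, X2) equals
# (X1' M2 X1)^{-1} X1' M2 y from the residualized regression.
rng = np.random.default_rng(6)
n, k1, k2 = 100, 2, 3
X1 = rng.standard_normal((n, k1))
X2 = rng.standard_normal((n, k2))
y = rng.standard_normal(n)

X = np.hstack([X1, X2])
b_full = np.linalg.solve(X.T @ X, X.T @ y)
M2 = np.eye(n) - X2 @ np.linalg.solve(X2.T @ X2, X2.T)
b1 = np.linalg.solve(X1.T @ M2 @ X1, X1.T @ M2 @ y)
print(np.allclose(b_full[:k1], b1))  # True
```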
b1 = [X1′(I − P2)X1]⁻¹ X1′(I − P2)y,

where [X1′(I − P2)X1]⁻¹ is the leading term and X1′(I − P2)y the trailing term. By contrast, regressing y on X1 alone yields (X1′X1)⁻¹X1′y = (X1′X1)⁻¹X1′Π1(y).
Partitioned Fit and Residual Regression
Ping Yu (HKU) Projection 38 / 42
Partitioned Fit and Residual Regression
y⊥2 ≡ y − X2(X2′X2)⁻¹X2′y = M2y,
X1⊥2 ≡ X1 − X2(X2′X2)⁻¹X2′X1 = M2X1.
b1 = (X1′M2X1)⁻¹ X1′M2y
   = [X1′X1 − X1′X2(X2′X2)⁻¹X2′X1]⁻¹ [X1′y − X1′X2(X2′X2)⁻¹X2′y].
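The explicit partitioned-fit formula can be verified directly against the full regression; a sketch with assumed random data:

```python
import numpy as np

# Check the explicit partitioned formula:
# b1 = [X1'X1 - X1'X2 (X2'X2)^{-1} X2'X1]^{-1}
#      [X1'y  - X1'X2 (X2'X2)^{-1} X2'y].
rng = np.random.default_rng(7)
n, k1, k2 = 100, 2, 3
X1 = rng.standard_normal((n, k1))
X2 = rng.standard_normal((n, k2))
y = rng.standard_normal(n)

S22 = np.linalg.inv(X2.T @ X2)
A11_tilde = X1.T @ X1 - X1.T @ X2 @ S22 @ X2.T @ X1
b1 = np.linalg.solve(A11_tilde, X1.T @ y - X1.T @ X2 @ S22 @ X2.T @ y)

X = np.hstack([X1, X2])
b_full = np.linalg.solve(X.T @ X, X.T @ y)
print(np.allclose(b1, b_full[:k1]))  # True
```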
In block form, the normal equations X′Xb = X′y read

( X1′X1  X1′X2 ) ( b1 )   ( X1′y )
( X2′X1  X2′X2 ) ( b2 ) = ( X2′y ).

By the partitioned-inverse formula, with Ã11 ≡ X1′X1 − X1′X2(X2′X2)⁻¹X2′X1,

b1 = Ã11⁻¹ [ X1′y − X1′X2(X2′X2)⁻¹X2′y ].

Recall that for A = ( A11 A12 ; A21 A22 ) with Ã11 ≡ A11 − A12 A22⁻¹ A21,

A⁻¹ = (  Ã11⁻¹                 −Ã11⁻¹ A12 A22⁻¹
        −A22⁻¹ A21 Ã11⁻¹        A22⁻¹ + A22⁻¹ A21 Ã11⁻¹ A12 A22⁻¹ ).
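The partitioned-inverse formula itself is easy to verify numerically; a sketch on an assumed random positive definite block matrix:

```python
import numpy as np

# Verify the partitioned-inverse formula for A = [[A11, A12], [A21, A22]]
# with A11_tilde = A11 - A12 A22^{-1} A21.
rng = np.random.default_rng(8)
k1, k2 = 2, 3
B = rng.standard_normal((k1 + k2, k1 + k2))
A = B @ B.T + np.eye(k1 + k2)          # positive definite, hence invertible
A11, A12 = A[:k1, :k1], A[:k1, k1:]
A21, A22 = A[k1:, :k1], A[k1:, k1:]

A22i = np.linalg.inv(A22)
T = np.linalg.inv(A11 - A12 @ A22i @ A21)   # A11_tilde^{-1}
Ainv_blocked = np.block([
    [T,                -T @ A12 @ A22i],
    [-A22i @ A21 @ T,   A22i + A22i @ A21 @ T @ A12 @ A22i],
])
print(np.allclose(Ainv_blocked, np.linalg.inv(A)))  # True
```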
To show b1 = (X1⊥2′X1⊥2)⁻¹ X1⊥2′ y⊥2, we need only show that X1′M2y = X1′M2X1 b1. Premultiplying both sides of y = X1b1 + X2b2 + ê by X1′M2, we have

X1′M2y = X1′M2X1b1 + X1′M2X2b2 + X1′M2ê = X1′M2X1b1 + X1′M2ê = X1′M2X1b1 + X1′ê = X1′M2X1b1,

since M2X2 = 0, M2ê = ê (because X2′ê = 0), and X1′ê = 0.
Projection along a Subspace