Statistical modeling and analysis of neural data (NEU 560), Fall 2020


SLIDE 1

Jonathan Pillow Princeton University

Statistical modeling and analysis of neural data (NEU 560), Fall 2020 Lecture 6: PCA part 2

SLIDE 2

Quick recap of PCA

the data ($N \times d$, one data vector per row):

$$X = \begin{bmatrix} \text{---}\ \vec{x}_1\ \text{---} \\ \text{---}\ \vec{x}_2\ \text{---} \\ \vdots \\ \text{---}\ \vec{x}_N\ \text{---} \end{bmatrix}$$

2nd moment matrix: $X^\top X$; SVD: $X = USV^\top$
first $k$ PCs: first $k$ columns of $V$
sum of squares of data within subspace: $\sum_{i=1}^{k} s_i^2$

SLIDE 3

Quick recap of PCA

the data ($N \times d$, one data vector per row):

$$X = \begin{bmatrix} \text{---}\ \vec{x}_1\ \text{---} \\ \text{---}\ \vec{x}_2\ \text{---} \\ \vdots \\ \text{---}\ \vec{x}_N\ \text{---} \end{bmatrix}$$

2nd moment matrix: $X^\top X$; SVD: $X = USV^\top$
first $k$ PCs: first $k$ columns of $V$
fraction of sum of squares: $\left(\sum_{i=1}^{k} s_i^2\right) \big/ \left(\sum_{i=1}^{d} s_i^2\right)$

SLIDE 4

Quick recap of PCA

the data ($N \times d$, one data vector per row):

$$X = \begin{bmatrix} \text{---}\ \vec{x}_1\ \text{---} \\ \text{---}\ \vec{x}_2\ \text{---} \\ \vdots \\ \text{---}\ \vec{x}_N\ \text{---} \end{bmatrix}$$

2nd moment matrix: $X^\top X$; SVD: $X = USV^\top$
first $k$ PCs: first $k$ columns of $V$
sum of squares of all data: $\sum_{i=1}^{d} s_i^2 = \|X\|_F^2$
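The recap above can be checked numerically. A minimal sketch (variable names are illustrative, not from the slides): take the SVD of the data matrix and confirm that the sum of squares within the top-$k$ subspace equals $\sum_{i \le k} s_i^2$ and that the total sum of squares equals $\sum_i s_i^2$.

```python
import numpy as np

# Illustrative setup: X is N x d, one data vector per row
rng = np.random.default_rng(0)
N, d, k = 100, 5, 2
X = rng.standard_normal((N, d)) @ rng.standard_normal((d, d))

# SVD of the data matrix: X = U S V^T
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# First k PCs: first k columns of V (rows of V^T)
B = Vt[:k].T                       # d x k orthonormal basis

# Sum of squares of the data within the subspace = s_1^2 + ... + s_k^2
ss_within = np.sum((X @ B) ** 2)

# Sum of squares of all data = s_1^2 + ... + s_d^2 = ||X||_F^2
ss_total = np.sum(X ** 2)

# Fraction of sum of squares captured by the first k PCs
frac = ss_within / ss_total
```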

SLIDE 5

Discussion Questions

$$X = \begin{bmatrix} \text{---}\ \vec{x}_1\ \text{---} \\ \text{---}\ \vec{x}_2\ \text{---} \\ \vdots \\ \text{---}\ \vec{x}_N\ \text{---} \end{bmatrix}$$

SLIDE 6

Discussion Questions

$$X = \begin{bmatrix} \text{---}\ \vec{x}_1\ \text{---} \\ \text{---}\ \vec{x}_2\ \text{---} \\ \vdots \\ \text{---}\ \vec{x}_N\ \text{---} \end{bmatrix}$$

1. Let [...] where [...]. What is [...]? (equations on the slide were not captured)

SLIDE 7

Discussion Questions

$$X = \begin{bmatrix} \text{---}\ \vec{x}_1\ \text{---} \\ \text{---}\ \vec{x}_2\ \text{---} \\ \vdots \\ \text{---}\ \vec{x}_N\ \text{---} \end{bmatrix}$$

1. Let [...] where [...]. What is [...]?
2. Let [...] be the SVD of $X$. What is the relationship between $U$, $S$ and $P$, $Q$, $V$? (equations on the slide were not captured)
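The equations in these questions did not survive extraction, so the exact setup is unknown. As a related numerical illustration (names and setup chosen here, not taken from the slides), one standard relationship worth checking is between the SVD of $X$ and the eigendecomposition of the second-moment matrix $X^\top X$:

```python
import numpy as np

# Illustrative data; the slides' actual definitions are not available
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 4))

# SVD of X and eigendecomposition of X^T X
U, s, Vt = np.linalg.svd(X, full_matrices=False)
C = X.T @ X
evals, Q = np.linalg.eigh(C)           # eigh returns ascending order
evals, Q = evals[::-1], Q[:, ::-1]     # sort descending to match the SVD

# Eigenvalues of X^T X are the squared singular values of X,
# and its eigenvectors are the right singular vectors (up to sign)
assert np.allclose(evals, s ** 2)
assert np.allclose(np.abs(Q), np.abs(Vt.T), atol=1e-6)
```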

SLIDE 8

Discussion Questions

answers on white-board (see end of slides)

SLIDE 9

PCA is equivalent to fitting an ellipse to your data

[figure: data scatter; axes: dimension 1, dimension 2]

SLIDE 10

PCA is equivalent to fitting an ellipse to your data

[figure: data scatter with 1st PC direction; axes: dimension 1, dimension 2]

SLIDE 11

PCA is equivalent to fitting an ellipse to your data

[figure: fitted ellipse showing 1st PC axis; axes: dimension 1, dimension 2]

SLIDE 12

PCA is equivalent to fitting an ellipse to your data

[figure: fitted ellipse with 1st and 2nd PC axes; axes: dimension 1, dimension 2]

SLIDE 13

PCA is equivalent to fitting an ellipse to your data

[figure: fitted ellipse with 1st and 2nd PC axes and their lengths; axes: dimension 1, dimension 2]

  • PCs are major axes of ellipse(oid)
  • singular values specify lengths of axes
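The ellipse picture can be made concrete. A sketch (all names illustrative): draw correlated 2D data, take the SVD of the centered data matrix, and note that the right singular vectors are the ellipse axes while $s_i/\sqrt{N}$ gives the standard deviation of the data along PC $i$, i.e. the axis "radius".

```python
import numpy as np

# Illustrative correlated 2D data
rng = np.random.default_rng(2)
N = 10000
A = np.array([[3.0, 0.0], [1.0, 0.5]])
X = rng.standard_normal((N, 2)) @ A.T
Xc = X - X.mean(axis=0)                # center the data

# SVD: rows of Vt are the ellipse(oid) axes (the PCs)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Singular values set the axis lengths: std along PC i is s_i / sqrt(N)
std_along_pcs = s / np.sqrt(N)

# The projection onto each PC has exactly this standard deviation
proj = Xc @ Vt.T
assert np.allclose(proj.std(axis=0), std_along_pcs, rtol=1e-6)
```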
SLIDE 14

what is the dominant eigenvector of ?

[figure: data scatter; axes: dim 1, dim 2]

SLIDE 15

what is the dominant eigenvector of ?

[figure: data scatter with dominant eigenvector indicated; axes: dim 1, dim 2]
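The point of this pair of slides can be checked numerically (a sketch under assumed parameters, not the slides' own example): when data sit far from the origin and are left uncentered, the dominant eigenvector of the second-moment matrix $X^\top X$ points toward the data mean rather than along the direction of greatest spread.

```python
import numpy as np

# Illustrative data: mean far from origin, greatest spread along dim 2
rng = np.random.default_rng(3)
N = 5000
mean = np.array([10.0, 10.0])
X = mean + rng.standard_normal((N, 2)) * np.array([0.5, 2.0])

# Dominant eigenvector of the UNCENTERED second-moment matrix
w, V = np.linalg.eigh(X.T @ X)
v1 = V[:, -1]                          # eigenvector with largest eigenvalue

# It is nearly parallel to the mean direction (up to sign)
mean_dir = mean / np.linalg.norm(mean)
assert abs(v1 @ mean_dir) > 0.97
```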

SLIDE 16

[figure: uncentered data scatter; axes: dim 1, dim 2]

Centering the data

SLIDE 17

Centering the data

[figure: data with 1st PC; axes: dim 1, dim 2]

SLIDE 18

Centering the data

[figure: centered data with 1st PC; axes: dim 1, dim 2]

now it’s a covariance!

  • In practice, we almost always do PCA on centered data!
  • C = np.cov(X, rowvar=False)  # rows of X are observations
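A minimal sketch of the centering step (illustrative data; note that with observations in rows, `np.cov` needs `rowvar=False`, otherwise it treats each row as a variable):

```python
import numpy as np

# Illustrative uncentered data, N observations x d dimensions
rng = np.random.default_rng(4)
X = rng.standard_normal((200, 3)) + 5.0

# Center each dimension
Xc = X - X.mean(axis=0)

# Covariance via np.cov; rowvar=False because rows are observations
C = np.cov(X, rowvar=False)

# After centering, the scaled second-moment matrix IS the covariance
assert np.allclose(C, Xc.T @ Xc / (X.shape[0] - 1))
```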
SLIDE 19

Projecting onto the PCs

[figure: scatter of PC-1 projection vs PC-2 projection]

  • visualize low-dimensional projection that captures most variance
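Projecting onto the PCs is a single matrix multiply. A sketch with illustrative names: multiply the centered data by the first two right singular vectors to get the 2D visualization coordinates.

```python
import numpy as np

# Illustrative high-dimensional data
rng = np.random.default_rng(5)
X = rng.standard_normal((300, 10)) @ rng.standard_normal((10, 10))
Xc = X - X.mean(axis=0)

# SVD of the centered data
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project onto the first two PCs: columns are the PC-1 and PC-2 projections
Z = Xc @ Vt[:2].T                      # N x 2, ready to scatter-plot

# Each projection's sum of squares equals the squared singular value
assert np.allclose(np.sum(Z ** 2, axis=0), s[:2] ** 2)
```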
SLIDE 20

Full derivation of PCA: see notes

Two equivalent formulations:

1. find the subspace that preserves maximal sum-of-squares:

$$\hat{B}_{\mathrm{pca}} = \arg\max_{B} \|XB\|_F^2 \quad \text{such that } B^\top B = I$$

SLIDE 21

Full derivation of PCA: see notes

Two equivalent formulations:

1. find the subspace that preserves maximal sum-of-squares:

$$\hat{B}_{\mathrm{pca}} = \arg\max_{B} \|XB\|_F^2 \quad \text{such that } B^\top B = I$$

2. minimize the sum-of-squares of the orthogonal component:

$$\hat{B}_{\mathrm{pca}} = \arg\min_{B} \|X - XBB^\top\|_F^2 \quad \text{such that } B^\top B = I$$

($XBB^\top$ is the reconstruction of $X$ in the subspace spanned by $B$)
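The equivalence of the two formulations can be checked numerically (a sketch, not a proof; names are illustrative): with $B$ set to the first $k$ right singular vectors, the captured and residual sums of squares split the total exactly, and any other orthonormal $B$ does no better on either objective.

```python
import numpy as np

# Illustrative data matrix
rng = np.random.default_rng(6)
X = rng.standard_normal((80, 6))
k = 2

# PCA solution: first k right singular vectors
U, s, Vt = np.linalg.svd(X, full_matrices=False)
B = Vt[:k].T

captured = np.sum((X @ B) ** 2)             # maximized objective
residual = np.sum((X - X @ B @ B.T) ** 2)   # minimized objective

# Pythagorean split: ||X||^2 = ||X B B^T||^2 + ||X - X B B^T||^2
assert np.isclose(captured + residual, np.sum(X ** 2))

# A random orthonormal basis does no better on either objective
Q, _ = np.linalg.qr(rng.standard_normal((6, k)))
assert np.sum((X @ Q) ** 2) <= captured + 1e-9
assert np.sum((X - X @ Q @ Q.T) ** 2) >= residual - 1e-9
```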

SLIDE 22
SLIDE 23
SLIDE 24
SLIDE 25

(no text captured on these slides)