Dimensionality Reduction
Lecture: Face Recognition and Feature Reduction
Juan Carlos Niebles and Ranjay Krishna
Stanford Vision and Learning Lab, Stanford University
31-Oct-2019
CS 131 Roadmap:
– Pixels: convolutions, edges, descriptors
– Resizing, segmentation, clustering, recognition, detection, machine learning
– Motion, tracking
– Neural networks, convolutional neural networks
– Dimensionality reduction
– In 1 dimension, we must go a distance of 5/5000 = 0.001 on average to capture the 5 nearest neighbors.
– In 2 dimensions, we must go a distance of (0.001)^(1/2) ≈ 0.032 to get a square that contains 0.001 of the volume.
– In d dimensions, we must go (0.001)^(1/d).
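The growth of that edge length with dimension can be checked directly; a minimal sketch (the helper name `edge_length` is ours, not from the slides):

```python
# Edge length of a hypercube that contains a given fraction of the
# unit hypercube's volume, as a function of the dimension d.
def edge_length(fraction, d):
    return fraction ** (1.0 / d)

for d in [1, 2, 3, 10, 100]:
    print(d, round(edge_length(0.001, d), 3))
```

In 100 dimensions the neighborhood must extend about 0.93 along each axis, i.e. it covers almost the whole range of every coordinate, which is the curse of dimensionality in action.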
– U, S, VT = numpy.linalg.svd(A) (note: NumPy returns S as a 1-D array of singular values rather than a diagonal matrix, and the third output is already the transpose Vᵀ)
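A minimal sketch of the call and of reconstructing A from the three factors (the example matrix is ours):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# NumPy returns the singular values S as a 1-D array (not a matrix),
# and the third output is already V transposed.
U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Reconstruct A as U @ diag(S) @ Vt.
A_rec = U @ np.diag(S) @ Vt
print(np.allclose(A, A_rec))  # True
```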
– In general, if A is m x n, then U will be m x m, Σ will be m x n, and VT will be n x n.
– (Note the dimensions work out to produce m x n after multiplication.)
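Those shapes can be verified with the full (non-economy) SVD; a small sketch with an arbitrary 4 x 3 matrix:

```python
import numpy as np

m, n = 4, 3
A = np.arange(12, dtype=float).reshape(m, n)

# Full SVD: U is m x m, Vt is n x n; NumPy stores Sigma's diagonal
# as a 1-D array of min(m, n) singular values.
U, S, Vt = np.linalg.svd(A, full_matrices=True)
print(U.shape, S.shape, Vt.shape)   # (4, 4) (3,) (3, 3)

# Embed S into an m x n Sigma so the product comes out m x n.
Sigma = np.zeros((m, n))
Sigma[:n, :n] = np.diag(S)
print((U @ Sigma @ Vt).shape)       # (4, 3)
```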
– Geometric rotation may not be an applicable concept, depending on the matrix, so we call them "unitary" matrices: each column is a unit vector, and the columns are mutually orthogonal.
– The number of nonzero singular values in Σ equals the rank of A.
– The algorithm always sorts the singular values from largest to smallest.
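Both properties are easy to confirm numerically; a sketch with a deliberately rank-deficient matrix (its third row is the sum of the first two, so the rank is 2):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])  # row 3 = row 1 + row 2 -> rank 2

# compute_uv=False returns only the singular values.
S = np.linalg.svd(A, compute_uv=False)

# Singular values come back sorted from largest to smallest ...
print(np.all(S[:-1] >= S[1:]))                           # True
# ... and the number of numerically nonzero ones equals rank(A).
print(int(np.sum(S > 1e-10)), np.linalg.matrix_rank(A))  # 2 2
```

In floating point, "nonzero" needs a tolerance; `matrix_rank` applies a similar threshold internally.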
– x = random unit vector
– while x hasn't converged:
  – x = A·x
  – normalize x
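The loop body was lost in extraction; the steps above are a reconstruction of the standard power iteration. A minimal runnable sketch, assuming a square (here symmetric) matrix A, with the eigenvalue recovered from the Rayleigh quotient:

```python
import numpy as np

def power_iteration(A, num_iters=1000):
    """Dominant eigenvector of a square matrix A via power iteration."""
    rng = np.random.default_rng(0)
    x = rng.standard_normal(A.shape[0])
    x /= np.linalg.norm(x)           # random unit vector
    for _ in range(num_iters):
        x = A @ x                    # multiply by A ...
        x /= np.linalg.norm(x)       # ... and renormalize
    eigval = x @ A @ x               # Rayleigh quotient
    return eigval, x

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
val, vec = power_iteration(A)
print(round(val, 4))  # 3.618, the larger eigenvalue (5 + sqrt(5)) / 2
```

Convergence is geometric in the ratio of the two largest eigenvalue magnitudes, so a fixed iteration count is only a sketch; real code would test for convergence of x.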
– To find U: take the eigenvectors of AAT (this matrix is always square).
– To find V: take the eigenvectors of ATA (this matrix is always square).
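A quick numerical check of this connection: the eigenvalues of AᵀA are the squared singular values of A (the eigenvectors match the columns of V up to sign, so the sketch compares only the eigenvalues):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Eigen-decompose the square matrix A^T A.
eigvals, eigvecs = np.linalg.eigh(A.T @ A)

# eigh returns eigenvalues in ascending order; flip to match SVD's ordering.
eigvals = eigvals[::-1]

# Eigenvalues of A^T A = squared singular values of A.
print(np.allclose(eigvals, S ** 2))  # True
```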
– Python's np.linalg.svd() has options to efficiently compute only what you need (e.g., full_matrices=False to skip the full orthogonal bases, or compute_uv=False to return only the singular values), if performance becomes an issue.
A detailed geometric explanation of SVD is here: http://www.ams.org/samplings/feature-column/fcarc-svd
– Example: a 2-dimensional data set
  – x: number of hours studied for a subject
  – y: marks obtained in that subject
– Suppose the covariance value is 104.53 – what does this value mean?
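The point of the question is that only the sign of a covariance is directly interpretable; the magnitude depends on the units of x and y. A sketch with made-up hours/marks data (the numbers are ours, chosen only for illustration):

```python
import numpy as np

# Hypothetical data: hours studied vs. marks obtained.
hours = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
marks = np.array([50.0, 55.0, 65.0, 70.0, 85.0])

# Sample covariance (np.cov normalizes by 1/(n-1) by default).
cov = np.cov(hours, marks)[0, 1]
print(cov > 0)  # True: the two variables increase together

# The magnitude of cov is unit-dependent; the correlation coefficient
# rescales it into [-1, 1] and is unit-free.
corr = np.corrcoef(hours, marks)[0, 1]
print(0 < corr <= 1)  # True
```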
(Figure: a 1D subspace in 2D)
– The answer is to look at the correlation between the points.
– The tool for doing this is called Principal Component Analysis (PCA).
Slide inspired by N. Vasconcelos
(Figures: a 1D subspace in 2D; a 2D subspace in 3D)
– Σ = UΛUᵀ = (UΛ^(1/2))(UΛ^(1/2))ᵀ
– Principal components φᵢ are the eigenvectors of Σ.
– Principal lengths λᵢ are the eigenvalues of Σ.
– Not flat if λ₁ ≈ λ₂; flat if λ₁ >> λ₂.
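Both claims can be checked on a toy covariance matrix; a sketch in which one variance dominates the other, so the Gaussian is "flat" along the second principal direction (the example Σ is ours):

```python
import numpy as np

# Covariance with one dominant direction (a "flat" Gaussian).
Sigma = np.array([[10.0, 0.0],
                  [0.0, 0.5]])

# Principal components phi_i = eigenvectors, principal lengths
# lambda_i = eigenvalues; eigh returns eigenvalues ascending.
L, U = np.linalg.eigh(Sigma)
lambdas = L[::-1]                    # sort descending: lambda_1 >= lambda_2
print(lambdas[0] / lambdas[1] > 10)  # True: lambda_1 >> lambda_2, so flat

# Sanity check of the decomposition: Sigma = U Lambda U^T.
print(np.allclose(U @ np.diag(L) @ U.T, Sigma))  # True
```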
– Any real n x m matrix A (n > m) can be decomposed as A = MΠNᵀ,
– where M is an (n x m) column-orthonormal matrix of left singular vectors (the columns of M),
– Π is an (m x m) diagonal matrix of singular values,
– and Nᵀ is an (m x m) row-orthonormal matrix whose rows are the right singular vectors (the columns of N).
– The eigenvectors of Σ are the columns of N.
– The eigenvalues of Σ are λᵢ = πᵢ²/n, the squared singular values divided by the number of data points.
– Create the centered data matrix.
– Compute its SVD.
– The principal components are the columns of N, and the eigenvalues are λᵢ = πᵢ²/n.
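The three steps above can be sketched end to end, cross-checking the SVD-derived eigenvalues πᵢ²/n against a direct eigendecomposition of the covariance Σ = (1/n)XcᵀXc (the data is random and the variable names M, Pi, Nt mirror the slides' notation):

```python
import numpy as np

rng = np.random.default_rng(0)
# Random data with very different variances along the three axes.
X = rng.standard_normal((100, 3)) * np.array([3.0, 1.0, 0.2])
n = X.shape[0]

# 1. Create the centered data matrix.
Xc = X - X.mean(axis=0)

# 2. Compute its SVD: Xc = M Pi N^T.
M, Pi, Nt = np.linalg.svd(Xc, full_matrices=False)

# 3. Principal components are the columns of N = Nt.T;
#    eigenvalues are pi_i^2 / n.
eigvals_svd = Pi ** 2 / n

# Cross-check against the eigenvalues of Sigma = (1/n) Xc^T Xc.
eigvals_cov = np.linalg.eigvalsh(Xc.T @ Xc / n)[::-1]
print(np.allclose(eigvals_svd, eigvals_cov))  # True
```

Working from the SVD of Xc rather than forming Σ explicitly is also numerically preferable when the number of features is large.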