E9 205 Machine Learning for Signal Processing - - PowerPoint PPT Presentation



SLIDE 1

E9 205 Machine Learning for Signal Processing

26-08-2019

Supervised Dimensionality Reduction · Decision Theory · Probability Distributions

SLIDE 2

Advantages and Disadvantages of PCA

❖ Simple linear transform.

❖ Eigendecomposition of the data covariance matrix is straightforward.

❖ PCA for high-dimensional data?

❖ Variance maximization may not be the ideal loss function for dimensionality reduction.

❖ If the data contains discrete class labels, we can do better than PCA by maximizing class separation.
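The first two bullets can be sketched in a few lines of NumPy: center the data, eigendecompose the sample covariance, and project onto the top eigenvectors. The data here is synthetic and the variable names are illustrative, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # correlated 5-D data

Xc = X - X.mean(axis=0)               # center the data
C = np.cov(Xc, rowvar=False)          # 5 x 5 data covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)  # eigendecomposition (ascending eigenvalues)
order = np.argsort(eigvals)[::-1]     # sort directions by decreasing variance
W = eigvecs[:, order[:2]]             # top-2 principal directions

Z = Xc @ W                            # simple linear transform to 2 dimensions
print(Z.shape)  # (200, 2)
```

Note that nothing in this computation looks at class labels, which is exactly the limitation the last bullet points to.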

SLIDE 3

Need for Supervised Dimensionality Reduction

SLIDE 4

Linear Discriminant Analysis

SLIDE 5

Without the Within Class Factor

SLIDE 6

Linear Discriminant Analysis

Find a linear transform with a criterion that maximizes class separation:

  • Maximize the between-class distance in the projected space while minimizing the within-class covariance.

❖ Generalized eigenvalue problem.

❖ Eigenvectors of S_W⁻¹ S_B (inverse within-class scatter times between-class scatter).

PRML - C. Bishop (Sec. 4.1.4, Sec. 4.1.6)
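A minimal sketch of the multi-class recipe from Bishop Sec. 4.1.6: build the within-class scatter S_W and between-class scatter S_B, then take the top eigenvectors of S_W⁻¹ S_B. The function name and the three-class toy data are my own for illustration.

```python
import numpy as np

def lda_directions(X, y, n_components):
    """Top LDA directions: eigenvectors of S_W^{-1} S_B."""
    d = X.shape[1]
    mean_total = X.mean(axis=0)
    S_W = np.zeros((d, d))
    S_B = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_W += (Xc - mc).T @ (Xc - mc)        # within-class scatter
        diff = (mc - mean_total)[:, None]
        S_B += len(Xc) * (diff @ diff.T)      # between-class scatter
    # generalized eigenvalue problem S_B w = lambda S_W w,
    # solved here as an ordinary eigenproblem on S_W^{-1} S_B
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(S_W, S_B))
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs[:, order[:n_components]].real

# usage: three Gaussian classes in 4-D, reduced to 2 LDA dimensions
rng = np.random.default_rng(0)
means = ([0, 0, 0, 0], [3, 0, 0, 0], [0, 3, 0, 0])
X = np.vstack([rng.normal(loc=m, size=(100, 4)) for m in means])
y = np.repeat([0, 1, 2], 100)
W = lda_directions(X, y, n_components=2)
print(W.shape)  # (4, 2)
```

With K classes, S_B has rank at most K−1, so at most K−1 useful directions exist (here 2).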

SLIDE 7

Linear Discriminant Analysis

Projecting onto the line joining the class means vs. the Fisher discriminant. PRML - C. Bishop (Sec. 4.1.4, Sec. 4.1.6)
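The contrast on this slide can be checked numerically. For two classes, projecting onto the mean difference uses w = m₂ − m₁, while the Fisher discriminant uses w ∝ S_W⁻¹(m₂ − m₁); the latter maximizes the Fisher ratio by construction. The covariance and means below are synthetic choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
# two classes sharing a strongly correlated covariance, with the mean
# difference deliberately NOT aligned to a covariance eigenvector
cov = np.array([[3.0, 2.5],
                [2.5, 3.0]])
X1 = rng.multivariate_normal([0.0, 0.0], cov, size=300)
X2 = rng.multivariate_normal([2.0, 0.0], cov, size=300)

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)  # within-class scatter

w_means = m2 - m1                          # project onto the line joining the means
w_fisher = np.linalg.solve(S_W, m2 - m1)   # Fisher: w ∝ S_W^{-1} (m2 - m1)

def fisher_ratio(w):
    # between-class separation over within-class spread along w
    return float((w @ (m2 - m1)) ** 2 / (w @ S_W @ w))

print(fisher_ratio(w_fisher) >= fisher_ratio(w_means))  # True by construction
```

Because the within-class covariance is anisotropic, the Fisher direction tilts away from the mean-difference line and separates the projected classes better.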

SLIDE 8

PCA versus LDA

(Figure: PCA and LDA projection directions compared.) PRML - C. Bishop (Sec. 4.1.4, Sec. 4.1.6)

SLIDE 9

PCA versus LDA
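The PCA-versus-LDA contrast is easiest to see on data where the highest-variance direction carries no label information. In this hypothetical setup, classes are separated along the first axis while a label-irrelevant second axis dominates the variance: PCA follows the noisy axis, LDA follows the class axis.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
y = np.repeat([0, 1], n)
X = np.column_stack([
    # axis 0: class-informative, low variance
    np.concatenate([rng.normal(-1, 0.3, n), rng.normal(1, 0.3, n)]),
    # axis 1: label-irrelevant, high variance
    rng.normal(0, 5.0, 2 * n),
])

# PCA direction: top eigenvector of the data covariance (labels ignored)
Xc = X - X.mean(axis=0)
_, V = np.linalg.eigh(np.cov(Xc, rowvar=False))
w_pca = V[:, -1]

# two-class LDA direction: S_W^{-1} (m1 - m0)
m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
S_W = ((X[y == 0] - m0).T @ (X[y == 0] - m0)
       + (X[y == 1] - m1).T @ (X[y == 1] - m1))
w_lda = np.linalg.solve(S_W, m1 - m0)

print(abs(w_pca[1]) > abs(w_pca[0]))  # True: PCA picks the high-variance axis
print(abs(w_lda[0]) > abs(w_lda[1]))  # True: LDA picks the class-separating axis
```

This is the core of the slide's comparison: PCA is unsupervised and maximizes variance, while LDA uses the labels to maximize class separation.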