Metaheuristics applied to the feature selection problem



  1. Metaheuristics applied to the feature selection problem
     Tom Fredrik B. Klaussen, Department of Mathematics, University of Oslo
     Master presentation, 28 June 2006
     Outline: Introduction, Problem, Algorithms, Case study: Fontainebleu, Conclusion

  2. What we will see
     I will present two new methods for the Feature Selection Problem (FSP).
     I will present a number of methods adapted from other scientific fields.
     I will present a visualization algorithm for the FSP.


  5. Motivation
     Segmentation: we want to separate a set of data into distinct subregions.
     (Figure: ground truth for the Fontainebleu dataset.)
     Curse of dimensionality: it is natural to add more data to get better separation, but results often deteriorate.
     Solution: try to use the "best" subset of the data.


  8. Feature selection
     The Feature Selection Problem is:
     - an image analysis problem
     - a parameter estimation problem
     - a discrete optimization problem


  13. Problem formulation

     Standard formulation:

       \max_c \, p(\mathbf{x} \mid \omega_c) = \frac{1}{(2\pi)^{d/2} |\hat{\Sigma}_c|^{1/2}} \exp\!\left( -\frac{1}{2} (\mathbf{x} - \hat{\mu}_c)^T \hat{\Sigma}_c^{-1} (\mathbf{x} - \hat{\mu}_c) \right)

     Cholesky factorization: L_c L_c^T = \hat{\Sigma}_c

     Equivalent, more easily computed formulation:

       \max_c \, p^*(\mathbf{x} \mid \omega_c) = -\frac{1}{2} \left\| L_c^{-1} (\mathbf{x} - \hat{\mu}_c) \right\|^2 - \sum_{i=1}^{N} \log\!\left( (L_c)_{i,i} \right)
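As a sketch of how the Cholesky-based formulation avoids an explicit matrix inverse, the following NumPy snippet (hypothetical helper name `gaussian_log_score`) evaluates the per-class score by solving a triangular system; the class-independent constant -d/2 log(2π) is dropped, as in the slide's p*:

```python
import numpy as np

def gaussian_log_score(x, mu, cov):
    """Gaussian log-density at x, up to the constant -d/2 * log(2*pi).

    Uses the Cholesky factor L with L @ L.T == cov:
      -0.5 * ||L^{-1}(x - mu)||^2  - sum_i log(L_ii)
    since log|cov| = 2 * sum_i log(L_ii). No explicit inverse of cov."""
    L = np.linalg.cholesky(cov)
    # Solve L z = (x - mu) instead of forming cov^{-1} explicitly.
    z = np.linalg.solve(L, x - mu)
    return -0.5 * z @ z - np.log(np.diag(L)).sum()

# Classification then picks the class c maximizing the score, e.g.:
# c_hat = max(classes, key=lambda c: gaussian_log_score(x, mu[c], cov[c]))
```

Solving the triangular system is both cheaper and numerically safer than inverting a near-singular covariance matrix, which is exactly the instability the next slide's regularization addresses.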


  16. Regularization

     Motivation: inverting nearly singular matrices is numerically very unstable. How can we get more stability? Regularization. Side effect: it also stabilizes the parameter estimation in its own right.

     Choice of regularizer:

       \hat{\Sigma}_c(\alpha) = \alpha \hat{\Sigma}_c + (1 - \alpha) \hat{\Sigma}    (1)
       \hat{\Sigma}(\lambda) = \lambda \hat{\Sigma} + (1 - \lambda) \hat{\sigma}^2 I    (2)

     which combined yields:

       \hat{\Sigma}_c(\alpha, \lambda) = \alpha \hat{\Sigma}_c + (1 - \alpha) \hat{\Sigma}(\lambda)    (3)
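A minimal sketch of the combined regularizer (3) in NumPy. The function name is illustrative, and taking σ̂² as the average diagonal variance of the pooled covariance is an assumption (the slide does not specify how σ̂² is estimated):

```python
import numpy as np

def regularized_cov(cov_c, cov_pooled, alpha, lam):
    """Two-stage covariance shrinkage, following eqs. (1)-(3):
    first shrink the pooled covariance toward a spherical sigma^2 * I,
    then shrink the class covariance toward that regularized pooled estimate."""
    d = cov_pooled.shape[0]
    sigma2 = np.trace(cov_pooled) / d                              # assumed sigma^2 estimate
    cov_lam = lam * cov_pooled + (1 - lam) * sigma2 * np.eye(d)    # eq. (2)
    return alpha * cov_c + (1 - alpha) * cov_lam                   # eqs. (1) and (3)
```

With alpha = 1 this reduces to the unregularized class covariance; with alpha = 0, lam = 0 it is fully spherical. Intermediate values interpolate, keeping the estimate invertible even when the per-class sample covariance is singular.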


  20. Motivation for metaheuristics

     Running time of exact algorithms: exact algorithms are available for the Feature Selection Problem, but their running time is exponential.

     Local search: local search is an easy way to obtain relatively good solutions, but it cannot escape a local optimum.

     Metaheuristics: metaheuristics are a class of algorithms that use knowledge of the problem topology to move from one place in the search space to another in a hopefully intelligent manner. Common to them all is that they use more or less local information to decide where to go next. This use of local information, however carefully devised and often effective, cannot in general guarantee that we find the best solution to our problem. In fact, we can only hope to end up in the vicinity of a good solution.
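To make the local-search baseline concrete, here is a plain bit-flip hill climber for the FSP. The function name, the caller-supplied `score` function (e.g. classification accuracy of a feature subset), and the single-flip neighborhood are illustrative assumptions, not the thesis's specific algorithms:

```python
import random

def local_search(n_features, score, max_iters=1000, seed=0):
    """Bit-flip hill climbing for feature selection.

    Start from a random feature subset; at each step, flip the single
    feature (in or out) that improves `score` the most; stop when no
    flip improves, i.e. at a local optimum.
    `score` maps a frozenset of feature indices to a number to maximize."""
    rng = random.Random(seed)
    current = frozenset(i for i in range(n_features) if rng.random() < 0.5)
    best = score(current)
    for _ in range(max_iters):
        neighbors = [current ^ {i} for i in range(n_features)]  # all single flips
        cand = max(neighbors, key=score)
        cand_score = score(cand)
        if cand_score <= best:        # no improving neighbor: local optimum
            return current, best
        current, best = cand, cand_score
    return current, best
```

This illustrates exactly the limitation the slide describes: the climber terminates at the first subset with no improving single flip, which need not be the global optimum; metaheuristics add mechanisms (randomized restarts, tabu memory, populations, temperature schedules) to escape such traps.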
