Machine learning: Boltzmann Machines
Dima Kochkov, Department of Physics, University of Illinois at Urbana-Champaign


  1. Machine learning: Boltzmann Machines
     Dima Kochkov
     Department of Physics, University of Illinois at Urbana-Champaign
     Algorithm interest meeting, 2016

  2. Outline
     1 Machine learning: Motivation; Hall of fame
     2 Machine learning basics: General framework; Neural networks
     3 Energy based models: Hopfield Nets; Boltzmann machines
     4 Algorithmic improvements: Markov Chain Monte Carlo; Mean Field approximation; Restricted Boltzmann Machines
     5 Examples: Matching probability distribution; Image features extraction

  4. Motivation for machine learning
     Machine learning is a problem-solving approach that comes in handy when an algorithmic solution is hard to obtain.
     Pros:
     - requires minimal prior knowledge
     - the solution can adapt to a new environment
     Cons:
     - inefficient use of hardware
     - weaker guarantees of correctness
     - requires big datasets for complex problems

  6. Examples of successful applications
     Just to name a few:
     - Speech recognition: Siri, OK Google
     - Image recognition: ImageNet
     - Fraud detection
     - Recommendation systems: the Netflix competition
     - Games: AlphaGo
     - Funky stuff like self-driving cars, robotics, etc.

  8. An optimization point of view
     The solution to the problem is given in the variational form of a black box with a gazillion knobs. The algorithm tunes those knobs to obtain a better solution, and it is data driven.
     - supervised learning (SVM, backpropagation, decision trees, etc.)
     - unsupervised learning (k-means, EM, Boltzmann Machines)

  10. Artificial Neural Networks
      Artificial Neural Networks are a class of models that consist of a set of connected units (neurons). Most of the time one can define the following properties of a neuron:
      - input values x_i (e.g. vector<bool>, vector<double>)
      - an output value f(input, links), usually of the form f(Σ_i w_i x_i), for example:
        f = tanh(Σ_i w_i x_i)
        f = 1 / (1 + e^{-Σ_i w_i x_i})
        f = max(0, Σ_i w_i x_i)
      The activity pattern evolves according to a rule specific to the network. I will use s_i to represent the i-th neuron, or v_i and h_i.
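The three activation functions above can be sketched in NumPy (a minimal illustration; `w` and `x` are assumed to be plain 1-D weight and input arrays):

```python
import numpy as np

def tanh_unit(w, x):
    """tanh activation: f = tanh(sum_i w_i x_i)."""
    return np.tanh(np.dot(w, x))

def logistic_unit(w, x):
    """Logistic (sigmoid) activation: f = 1 / (1 + exp(-sum_i w_i x_i))."""
    return 1.0 / (1.0 + np.exp(-np.dot(w, x)))

def relu_unit(w, x):
    """Rectified linear activation: f = max(0, sum_i w_i x_i)."""
    return max(0.0, np.dot(w, x))
```

All three map the same weighted input sum through a different nonlinearity; only the logistic one is bounded to (0, 1), which is why it reappears later as the conditional "on" probability of a binary unit.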

  12. Memory storage
      One of the simplest energy-based models is the Hopfield Net:
      - binary units s_i ∈ {0, 1}
      - symmetric weights w_{i,j} = w_{j,i}
      - features a global energy function E
      - energy minima correspond to memories

      E = -Σ_i b_i s_i - Σ_{i,j} w_{i,j} s_i s_j    (1)

      Figure: Hopfield Net
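A minimal sketch of (1) and the standard asynchronous update rule, assuming the pair sum in (1) runs over distinct pairs (hence the 0.5 factor for a symmetric matrix) and that the weight matrix has a zero diagonal:

```python
import numpy as np

def hopfield_energy(s, b, w):
    """E = -sum_i b_i s_i - sum_{i<j} w_ij s_i s_j, with w symmetric, zero diagonal."""
    return -np.dot(b, s) - 0.5 * s @ w @ s

def update_unit(s, b, w, i):
    """Asynchronous update: unit i turns on iff doing so lowers the energy,
    i.e. iff its total input b_i + sum_j w_ij s_j is positive."""
    s = s.copy()
    s[i] = 1 if (b[i] + w[i] @ s) > 0 else 0
    return s
```

Repeatedly applying `update_unit` to randomly chosen units never increases E, so the network settles into one of the energy minima, i.e. recalls a stored memory.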

  14. Basic Boltzmann machine
      Ingredients for the Boltzmann machine: a Hopfield net plus hidden units, together with the Gibbs probability distribution

      P = e^{-E/T} / Z,    E = -Σ_i b_i s_i - Σ_{i,j} w_{i,j} s_i s_j    (2)

      Figure: Boltzmann machine
      The units s can be either visible (v) or hidden (h) units of the model. This is a generative model with a potential for data interpretation.
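The Gibbs distribution (2) can be written out directly for a toy model by brute-force enumeration of all binary states; this is only a sketch to make Z concrete, since the sum over states is exponential in the number of units:

```python
import numpy as np
from itertools import product

def energy(s, b, w):
    """E(s) = -sum_i b_i s_i - sum_{i<j} w_ij s_i s_j (w symmetric, zero diagonal)."""
    return -np.dot(b, s) - 0.5 * s @ w @ s

def gibbs_probability(s, b, w, T=1.0):
    """P(s) = exp(-E(s)/T) / Z, with Z summed over all 2^n binary states."""
    states = [np.array(c) for c in product([0, 1], repeat=len(s))]
    Z = sum(np.exp(-energy(c, b, w) / T) for c in states)
    return np.exp(-energy(np.asarray(s), b, w) / T) / Z
```

The exponential cost of Z is exactly why the later slides turn to Markov Chain Monte Carlo instead of computing expectations exactly.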

  15. How does a BM "interpret" the data?
      States of the hidden units correspond to interpretations of the data:
      - low-energy states of the hidden units, given the visible units, correspond to "good" interpretations
      - common structure in the data allows low-energy interpretations

      Figure: Data interpretation

  16. How do we learn?
      Learning objective: we want to minimize the "distance" between the probability distribution our Boltzmann machine generates and the distribution from which the data was drawn, i.e. maximize the log-likelihood

      L = Σ_{i ∈ data} log P(v = d_i)    (3)

      ∂L/∂w_{α,β} = Σ_{i ∈ data} [ Σ_{h'} P(h' | v = d_i) ∂(-E(v = d_i, h = h'))/∂w_{α,β} - Σ_{v',h'} P(v', h') ∂(-E(v = v', h = h'))/∂w_{α,β} ]    (4)

      Since ∂(-E)/∂w_{α,β} = s_α s_β, this is a difference of clamped and free expectations:

      ∂L/∂w_{α,β} = Σ_{i ∈ data} [ ⟨s_α s_β⟩|_{v=d_i} - ⟨s_α s_β⟩ ]    (5)

      Averaging over the dataset:

      ∂L/∂w_{α,β} = ⟨s_α s_β⟩_data - ⟨s_α s_β⟩    (6)

  17. Algorithm
      To train a Boltzmann machine on a given dataset we:
      - fix the visible units to the values of a data instance and compute ⟨s_α s_β⟩ (positive phase)
      - set the visible units free and again compute ⟨s_α s_β⟩ (negative phase)
      - after processing a batch of data, update the parameters

      Potential issues: we can't efficiently compute ⟨s_α s_β⟩ exactly.
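The batch update implied by (6) can be sketched as follows. This is a skeleton, not the full trainer: the positive- and negative-phase correlation estimators are injected as callables (`positive_phase`, `negative_phase` are hypothetical names), since in practice they would be whatever MCMC or approximate estimator one chooses:

```python
import numpy as np

def train_batch(w, batch, positive_phase, negative_phase, lr=0.01):
    """One Boltzmann-machine weight update over a data batch.

    positive_phase(w, d) -> estimate of <s_a s_b> with visibles clamped to d
    negative_phase(w)    -> estimate of <s_a s_b> with visibles free
    Implements dw = lr * (<s_a s_b>_data - <s_a s_b>_free), eq. (6).
    """
    pos = np.mean([positive_phase(w, d) for d in batch], axis=0)
    neg = negative_phase(w)
    return w + lr * (pos - neg)
```

The positive phase averages clamped correlations over the batch; the negative phase needs only one estimate per update, since the model distribution does not depend on the data instance.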

  19. MCMC
      Markov Chain Monte Carlo:

      ⟨s_α s_β⟩ = Σ_s s_α s_β p(s) = Σ_s s_α s_β e^{-E(s)} / Z    (7)

      - clamp the data on the visible neurons and sample s_α s_β from the Markov chain (positive phase)
      - set the visible units free and again sample s_α s_β (negative phase)
      - repeat for the dataset, then update the weights

      Potential issues: the Markov chain might take a very long time to equilibrate, and how do we know if we have a good estimate?
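A minimal Gibbs-sampling estimator for (7), assuming 0/1 units, a symmetric weight matrix with zero diagonal, and temperature T = 1 by default; for these units the conditional "on" probability given the rest of the network is the logistic function of the total input:

```python
import numpy as np

def gibbs_step(s, b, w, T=1.0, rng=None):
    """One full sweep of Gibbs sampling: resample each unit given the others."""
    if rng is None:
        rng = np.random.default_rng()
    s = s.copy()
    for i in range(len(s)):
        p_on = 1.0 / (1.0 + np.exp(-(b[i] + w[i] @ s) / T))
        s[i] = 1 if rng.random() < p_on else 0
    return s

def estimate_correlations(s0, b, w, n_burn=100, n_samples=500):
    """Monte Carlo estimate of <s_a s_b>: burn in, then average outer products."""
    rng = np.random.default_rng(0)
    s = s0.copy()
    for _ in range(n_burn):          # equilibrate the chain first
        s = gibbs_step(s, b, w, rng=rng)
    acc = np.zeros((len(s), len(s)))
    for _ in range(n_samples):
        s = gibbs_step(s, b, w, rng=rng)
        acc += np.outer(s, s)
    return acc / n_samples
```

The positive phase runs the same sampler with the visible units held fixed at a data vector (only hidden units are resampled); the negative phase resamples everything.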

  20. Better MCMC
      We can use a clever trick to get a warm start: keep a set of equilibrated Markov chains with fixed and free visible units.
      - equilibrated chains with clamped visible units ("particles") are used to evaluate ⟨s_α s_β⟩_data
      - equilibrated chains with free visible units ("fantasy particles") are used to evaluate ⟨s_α s_β⟩

      Status: still too slow for most applications, and in theory it should work well only for full-batch learning, but it is much better than the previously described method.
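The fantasy-particle trick can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the Gibbs sweep is injected as a callable, and the key point is only that the chain states persist across parameter updates, so a few sweeps suffice because the weights change slowly between updates:

```python
import numpy as np

def persistent_negative_phase(fantasy, b, w, gibbs_step, n_steps=1):
    """Negative-phase estimate of <s_a s_b> from persistent 'fantasy particles'.

    fantasy: list of chain states carried over from the previous update;
    gibbs_step(s, b, w) -> next state of one chain (one full Gibbs sweep).
    Each chain is advanced only n_steps sweeps (warm start), then its state
    is stored back so it persists to the next parameter update.
    """
    corr = np.zeros((len(b), len(b)))
    for k, s in enumerate(fantasy):
        for _ in range(n_steps):
            s = gibbs_step(s, b, w)
        fantasy[k] = s                      # chain persists across updates
        corr += np.outer(s, s)
    return corr / len(fantasy), fantasy
```

Running several chains in parallel and averaging their outer products trades a long equilibration per update for a small amount of extra memory.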
