

  1. Latent Force Models
Neil D. Lawrence (work with Magnus Rattray, Mauricio Álvarez, Pei Gao, Antti Honkela, David Luengo, Guido Sanguinetti, Michalis Titsias, Jennifer Withers)
University of Sheffield / University of Edinburgh
Bayes 250 Workshop, 6th September 2011

  2. Outline
◮ Motivation and Review
◮ Motion Capture Example


  4. Styles of Machine Learning
Background: interpolation is easy, extrapolation is hard.
◮ Urs Hölzle keynote talk at NIPS 2005.
◮ Emphasis on massive data sets.
◮ Let the data do the work: more data, less extrapolation.
◮ Alternative paradigm:
  ◮ Very scarce data: computational biology, human motion.
  ◮ How to generalize from scarce data?
  ◮ Need to include more assumptions about the data (e.g. invariances).

  5. General Approach
Broadly speaking, two approaches to modeling:
◮ Data modeling (weakly mechanistic): let the data "speak"; data driven; adaptive models; e.g. digit recognition.
◮ Mechanistic modeling (strongly mechanistic): impose physical laws; knowledge driven; differential equations; e.g. climate and weather models.

  16. Weakly Mechanistic vs Strongly Mechanistic
◮ Underlying data-modeling techniques are weakly mechanistic principles (e.g. smoothness).
◮ In physics, the models are typically strongly mechanistic.
◮ In principle we expect a range of models which vary in the strength of their mechanistic assumptions.
◮ This work is one part of that spectrum: add further mechanistic ideas to weakly mechanistic models.

  17. Dimensionality Reduction
◮ Linear relationship between the data, $X \in \Re^{n \times p}$, and a reduced-dimensional representation, $F \in \Re^{n \times q}$, where $q \ll p$:
  $$X = FW + \epsilon, \qquad \epsilon \sim \mathcal{N}(0, \Sigma).$$
◮ Integrate out $F$, optimize with respect to $W$.
◮ For a Gaussian prior, $F \sim \mathcal{N}(0, I)$:
  ◮ with $\Sigma = \sigma^2 I$ we have probabilistic PCA (Tipping and Bishop, 1999; Roweis, 1998);
  ◮ with $\Sigma$ constrained to be diagonal, we have factor analysis.
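A minimal NumPy sketch of the probabilistic PCA case above, assuming the closed-form maximum-likelihood estimates of Tipping and Bishop (1999); note that the code's `W` has shape (p, q), i.e. the transpose of the $W$ in $X = FW$:

```python
import numpy as np

def ppca_ml(X, q):
    """Maximum-likelihood probabilistic PCA (Tipping & Bishop, 1999).

    X: (n, p) data matrix; q: latent dimension, q << p.
    Returns W of shape (p, q) (transpose of the slide's W) and sigma2.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)                   # centre the data
    S = Xc.T @ Xc / n                         # sample covariance, (p, p)
    eigvals, eigvecs = np.linalg.eigh(S)      # ascending eigenvalues
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # sort descending
    sigma2 = eigvals[q:].mean()               # ML noise: mean of discarded eigenvalues
    W = eigvecs[:, :q] * np.sqrt(np.maximum(eigvals[:q] - sigma2, 0.0))
    return W, sigma2
```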

  18. Dimensionality Reduction: Temporal Data
◮ Deal with temporal data through a temporal latent prior.
◮ Independent Gauss-Markov priors over each $f_i(t)$ lead to the Rauch-Tung-Striebel (RTS) smoother (Kalman filter).
◮ More generally, consider a Gaussian process (GP) prior,
  $$p(F \mid t) = \prod_{i=1}^{q} \mathcal{N}\left(f_{:,i} \mid 0, K_{f_{:,i}, f_{:,i}}\right).$$
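A sketch of this prior in NumPy: with independent GPs over the $q$ latent functions, the covariance of the stacked vector $\mathrm{vec}(F)$ is block diagonal, one block per function. The exponentiated quadratic kernel here anticipates the later slides; lengthscales are arbitrary:

```python
import numpy as np
from scipy.linalg import block_diag

def eq_kernel(t, alpha=1.0, ell=1.0):
    """Exponentiated quadratic covariance evaluated on a time grid."""
    return alpha * np.exp(-(t[:, None] - t[None, :]) ** 2 / (2.0 * ell ** 2))

t = np.linspace(0.0, 10.0, 100)
ells = [0.5, 2.0]                               # one lengthscale per latent function (q = 2)
K_F = block_diag(*[eq_kernel(t, ell=l) for l in ells])  # covariance of vec(F)
rng = np.random.default_rng(0)
L = np.linalg.cholesky(K_F + 1e-8 * np.eye(K_F.shape[0]))
F = (L @ rng.standard_normal(K_F.shape[0])).reshape(len(ells), -1).T  # (n, q) draw from p(F | t)
```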

  19. Joint Gaussian Process
◮ Given the covariance functions for $\{f_i(t)\}$ we have an implied covariance function across all $\{x_i(t)\}$ (ML: semi-parametric latent factor model (Teh et al., 2005); geostatistics: linear model of coregionalization).
◮ The Rauch-Tung-Striebel smoother has been preferred:
  ◮ linear computational complexity in $n$.
◮ Advances in sparse approximations have made the general GP framework practical (Titsias, 2009; Snelson and Ghahramani, 2006; Quiñonero Candela and Rasmussen, 2005).
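A sketch of the implied joint covariance for noise-free outputs $x = W^\top f$ under independent latent GPs, i.e. the linear model of coregionalization: $\mathrm{cov}(x_j(t), x_k(t')) = \sum_i w_{ij} w_{ik} k_i(t, t')$. The weights and lengthscales below are placeholders:

```python
import numpy as np

def eq_kernel(t, ell):
    return np.exp(-(t[:, None] - t[None, :]) ** 2 / (2.0 * ell ** 2))

t = np.linspace(0.0, 5.0, 50)
W = np.array([[1.0, 0.5, -0.3],
              [0.2, -1.0, 0.8]])           # (q, p): row i mixes latent function i into outputs
ells = [0.5, 2.0]                           # one kernel per latent function

# Joint covariance over the stacked outputs [x_1; ...; x_p]: block (j, k)
# equals sum_i W[i, j] * W[i, k] * k_i(t, t').
K_x = sum(np.kron(np.outer(W[i], W[i]), eq_kernel(t, ells[i])) for i in range(len(ells)))
```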

  20. Gaussian Process: Exponentiated Quadratic Covariance
◮ Take, for example, the exponentiated quadratic form for the covariance,
  $$k(t, t') = \alpha \exp\left(-\frac{\|t - t'\|^2}{2\ell^2}\right).$$
◮ Gaussian process prior over the latent functions.
[Figure: the exponentiated quadratic covariance matrix evaluated on a grid of 25 time points.]
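To reproduce the covariance matrix pictured on the slide, one can evaluate the kernel on a grid of 25 time points and display it; the values of $\alpha$ and $\ell$ below are guesses:

```python
import numpy as np
import matplotlib.pyplot as plt

t = np.arange(1, 26, dtype=float)        # 25 time points, as in the slide's figure
alpha, ell = 1.0, 5.0                    # illustrative hyperparameters
K = alpha * np.exp(-(t[:, None] - t[None, :]) ** 2 / (2.0 * ell ** 2))

plt.imshow(K, cmap="RdBu_r", vmin=-1, vmax=1)  # colour scale matching the slide's -1..1 bar
plt.colorbar()
plt.show()
```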

  21. Mechanical Analogy
Back to Mechanistic Models!
◮ These models rely on the latent variables to provide the dynamic information.
◮ We now introduce a further dynamical system with a mechanistic inspiration.
◮ Physical interpretation:
  ◮ the latent functions $f_i(t)$ are $q$ forces;
  ◮ we observe the displacements of $p$ springs under those forces;
  ◮ interpret the system as the force-balance equation, $XD = FS + \epsilon$;
  ◮ forces act, e.g., through levers: a matrix of sensitivities, $S \in \Re^{q \times p}$;
  ◮ diagonal matrix of spring constants, $D \in \Re^{p \times p}$.
◮ Original system: $W = SD^{-1}$ (a numeric check of this identity is sketched below).
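A tiny, purely illustrative numeric check that the force-balance relation $XD = FS$ recovers the earlier linear map $W = SD^{-1}$ (noise omitted; all matrices are random placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 50, 6, 2
F = rng.standard_normal((n, q))          # latent forces
S = rng.standard_normal((q, p))          # sensitivities
D = np.diag(rng.uniform(0.5, 2.0, p))    # spring constants (diagonal)

X = F @ S @ np.linalg.inv(D)             # solve X D = F S for X (noise-free)
W = S @ np.linalg.inv(D)                 # the implied mixing matrix W = S D^{-1}
assert np.allclose(X, F @ W)             # matches the original system X = F W
```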

  22. Extend Model
◮ Add a damper and give the system mass:
  $$FS = \ddot{X}M + \dot{X}C + XD + \epsilon.$$
◮ We now have a second-order mechanical system.
◮ It will exhibit inertia and resonance.
◮ There are many other systems that can also be represented by differential equations.
◮ When being forced by latent function(s), $\{f_i(t)\}_{i=1}^{q}$, we call this a latent force model.

  23. Physical Analogy

  24. Gaussian Process Priors and Latent Force Models: Driven Harmonic Oscillator
◮ For a Gaussian process prior we can compute the covariance matrices for the output displacements.
◮ For one displacement the model is
  $$m_k \ddot{x}_k(t) + c_k \dot{x}_k(t) + d_k x_k(t) = b_k + \sum_{i=1}^{q} s_{ik} f_i(t), \qquad (1)$$
  where $m_k$ is the $k$th diagonal element of $M$, and similarly for $c_k$ and $d_k$; $s_{ik}$ is the $i,k$th element of $S$.
◮ Model the latent forces as $q$ independent GPs with exponentiated quadratic covariances,
  $$k_{f_i f_l}(t, t') = \exp\left(-\frac{(t - t')^2}{2\ell_i^2}\right)\delta_{il}.$$
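A minimal numerical sketch of the driven oscillator (1) for a single output and a single force, with illustrative parameter values; the force here is a fixed sinusoid rather than a GP draw, just to expose the inertia and resonance of the system:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters for one output k of equation (1):
m, c, d, b, s = 1.0, 0.4, 4.0, 0.0, 1.0   # mass, damper, spring, basal level, sensitivity
f = lambda t: np.sin(2.0 * t)             # known driving force; omega = 2 = sqrt(d/m), near resonance

def rhs(t, y):
    x, xdot = y
    xddot = (b + s * f(t) - c * xdot - d * x) / m
    return [xdot, xddot]

sol = solve_ivp(rhs, (0.0, 20.0), [0.0, 0.0], dense_output=True)
t = np.linspace(0.0, 20.0, 200)
x = sol.sol(t)[0]                          # displacement x_k(t); amplitude builds up near resonance
```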

  25. Covariance for ODE Model
◮ With an exponentiated quadratic covariance function for $f(t)$, the solution is
  $$x_j(t) = \sum_{i=1}^{q} \frac{s_{ji}}{m_j \omega_j} \exp(-\alpha_j t) \int_0^t f_i(\tau) \exp(\alpha_j \tau) \sin(\omega_j (t - \tau)) \, d\tau.$$
[Figure: joint distribution for x_1(t), x_2(t), x_3(t) and f(t); damping ratios ζ_1 = 0.125, ζ_2 = 2, ζ_3 = 1.]
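A sketch of evaluating this convolution by trapezoidal quadrature for a single force, assuming the underdamped case (real $\omega_j$); here $\alpha_j = c_j / (2 m_j)$ and $\omega_j = \sqrt{d_j/m_j - \alpha_j^2}$ are the standard damped-oscillator quantities, which the slide leaves implicit:

```python
import numpy as np

def lfm_output(t_grid, f_vals, m=1.0, c=0.4, d=4.0, s=1.0):
    """Evaluate x(t) = (s / (m * omega)) * int_0^t f(tau) e^{-alpha (t - tau)}
    sin(omega (t - tau)) dtau on a grid, by trapezoidal quadrature.
    Assumes the underdamped case, d/m > alpha^2, so omega is real."""
    alpha = c / (2.0 * m)
    omega = np.sqrt(d / m - alpha ** 2)
    x = np.zeros_like(t_grid)
    for k, t in enumerate(t_grid):
        tau = t_grid[: k + 1]
        g = np.exp(-alpha * (t - tau)) * np.sin(omega * (t - tau))  # Green's function
        x[k] = s / (m * omega) * np.trapz(f_vals[: k + 1] * g, tau)
    return x

t = np.linspace(0.0, 20.0, 400)
x = lfm_output(t, np.sin(2.0 * t))   # same sinusoidal test force as before
```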

  26. Covariance for ODE Model
◮ Analogy:
  $$x = \sum_i e_i^\top f_i, \quad f_i \sim \mathcal{N}(0, \Sigma_i) \;\Rightarrow\; x \sim \mathcal{N}\Big(0, \sum_i e_i^\top \Sigma_i e_i\Big).$$
[Figure: joint distribution for x_1(t), x_2(t), x_3(t) and f(t); damping ratios ζ_1 = 0.125, ζ_2 = 2, ζ_3 = 1.]
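A quick Monte Carlo sanity check of the analogy, with arbitrary small matrices: a fixed linear functional of independent Gaussians is Gaussian, with variance given by the sum of quadratic forms:

```python
import numpy as np

rng = np.random.default_rng(1)
dims = [3, 4]
Sigmas, es = [], []
for dim in dims:
    A = rng.standard_normal((dim, dim))
    Sigmas.append(A @ A.T)                   # random PSD covariance Sigma_i
    es.append(rng.standard_normal(dim))      # fixed weight vector e_i

# x = sum_i e_i^T f_i with f_i ~ N(0, Sigma_i), all independent
samples = sum(e @ rng.multivariate_normal(np.zeros(len(e)), S, size=100_000).T
              for e, S in zip(es, Sigmas))
var_mc = samples.var()
var_theory = sum(e @ S @ e for e, S in zip(es, Sigmas))
print(var_mc, var_theory)                    # these should agree closely
```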


  28. Joint Sampling of x(t) and f(t)
◮ lfmSample
[Figure: joint samples from the ODE covariance. Black: f(t); red: x_1(t) (underdamped); green: x_2(t) (overdamped); blue: x_3(t) (critically damped).]
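lfmSample refers to a demo in the authors' MATLAB toolbox; below is a rough Python stand-in that produces qualitatively similar joint draws: sample f(t) from an exponentiated quadratic GP, then push it through the oscillator ODE for the three damping regimes. Parameter values are illustrative:

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(2)
t = np.linspace(0.0, 20.0, 300)

# Draw the latent force from a GP with exponentiated quadratic covariance.
K = np.exp(-(t[:, None] - t[None, :]) ** 2 / (2.0 * 1.0 ** 2))
f = np.linalg.cholesky(K + 1e-8 * np.eye(len(t))) @ rng.standard_normal(len(t))

def simulate(zeta, m=1.0, d=1.0):
    """Integrate m x'' + c x' + d x = f(t), with damping ratio zeta = c / (2 sqrt(m d))."""
    c = 2.0 * zeta * np.sqrt(m * d)
    force = lambda s: np.interp(s, t, f)      # linear interpolation of the GP draw
    rhs = lambda s, y: [y[1], (force(s) - c * y[1] - d * y[0]) / m]
    return solve_ivp(rhs, (t[0], t[-1]), [0.0, 0.0], t_eval=t).y[0]

# Underdamped, overdamped, and critically damped outputs sharing one force:
x_under, x_over, x_crit = simulate(0.125), simulate(2.0), simulate(1.0)
```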
