
Efficient learning of smooth probability functions from Bernoulli tests with guarantees. Paul Rolland, paul.rolland@epfl.ch, Laboratory for Information and Inference Systems (LIONS), École Polytechnique Fédérale de Lausanne (EPFL)


  1. Efficient learning of smooth probability functions from Bernoulli tests with guarantees. Paul Rolland, paul.rolland@epfl.ch, Laboratory for Information and Inference Systems (LIONS), École Polytechnique Fédérale de Lausanne (EPFL), Switzerland. July 2019. Joint work with Ali Kavis, Alexander Immer, Adish Singla, and Volkan Cevher @ LIONS.

  2. Introduction
  • Setup: f : X → [0, 1], X ⊂ R^d compact.
  • Observations:
    ⊲ Static setting: y_i ~ Bernoulli(f(x_i))
    ⊲ Dynamic setting: y_i ~ Bernoulli(A_i f(x_i) + B_i), with 0 ≤ A_i + B_i ≤ 1
  • Goal: approximate f over X from the observation set S = {(x_i, y_i)}_{i=1,...,n}
  • A regularity assumption on f is needed.
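Both observation models above can be simulated directly. A minimal NumPy sketch, where the target f, the domain X = [0, 1] (d = 1), and the ranges for A_i, B_i are hypothetical choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target probability function on X = [0, 1], valued in [0.1, 0.9].
f = lambda x: 0.5 + 0.4 * np.sin(2 * np.pi * x)

n = 500
x = rng.uniform(0.0, 1.0, size=n)          # query points x_i in X

# Static setting: y_i ~ Bernoulli(f(x_i)).
y_static = rng.binomial(1, f(x))

# Dynamic setting: y_i ~ Bernoulli(A_i f(x_i) + B_i), with 0 <= A_i + B_i <= 1.
A = rng.uniform(0.5, 1.0, size=n)
B = rng.uniform(0.0, 1.0, size=n) * (1.0 - A)   # enforces A_i + B_i <= 1
y_dynamic = rng.binomial(1, A * f(x) + B)
```

The constraint 0 ≤ A_i + B_i ≤ 1 guarantees that A_i f(x_i) + B_i is a valid Bernoulli parameter for any f valued in [0, 1].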

  3. Logistic Gaussian Process
  • Regularity assumption: f(x) = σ(h(x)), h ~ GP(μ, κ), where σ(x) = 1 / (1 + e^{−x}).
  • Observations: y_i ~ Bernoulli(σ(h(x_i)))
  [Figures: a sample from the GP prior; a sample from the LGP prior]
  • Issues:
    ⊲ No analytically tractable posterior
    ⊲ Requires costly Bayesian computations
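For intuition, a sample from the LGP prior can be drawn by sampling h from a GP and squashing it through σ. A sketch under illustrative assumptions (zero mean, squared-exponential kernel with lengthscale 0.1 on a grid over [0, 1]):

```python
import numpy as np

rng = np.random.default_rng(1)

# Grid over X = [0, 1]; zero-mean GP with a squared-exponential kernel
# (the lengthscale 0.1 is an illustrative assumption).
xs = np.linspace(0.0, 1.0, 100)
ell = 0.1
K = np.exp(-0.5 * (xs[:, None] - xs[None, :]) ** 2 / ell ** 2)

# Sample h ~ GP(0, K) via a Cholesky factor (jitter added for numerical stability).
L_chol = np.linalg.cholesky(K + 1e-6 * np.eye(len(xs)))
h = L_chol @ rng.standard_normal(len(xs))

# Squash through the logistic link: sigma(h) is a sample from the LGP prior, in (0, 1).
lgp_sample = 1.0 / (1.0 + np.exp(-h))
```

Sampling the prior is cheap; the difficulty the slide points to is the posterior over h given Bernoulli observations, which has no closed form and requires approximate inference (e.g. Laplace or MCMC).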

  4. Smooth Beta Processes: Static setting
  • Regularity assumption: f is L-Lipschitz continuous, i.e., |f(x) − f(x′)| ≤ L‖x − x′‖_2 for all x, x′ ∈ X
  • Observations: y_i ~ Bernoulli(f(x_i))
  • Prior: p(y | x) = Beta(α(x), β(x))
  • Update of f̃(x | X) after observing X = {(x_1, y_1), ..., (x_n, y_n)}¹:
    p(y | X, x) = Beta( α(x) + Σ_{i=1}^n δ_{y_i=1} κ(x, x_i),  β(x) + Σ_{i=1}^n δ_{y_i=0} κ(x, x_i) )

  Theorem (Informal – Convergence of the static Beta process)
  Using the kernel κ(x, x′) = δ_{‖x − x′‖_2 ≤ Δ_{n,L}}, where Δ_{n,L} = L^{−2/(d+2)} n^{−1/(d+2)},
    sup_{x ∈ X} E_X[ (f̃(x | X) − f(x))² ] = O( L^{2d/(d+2)} n^{−2/(d+2)} ).

  ¹ "Continuous Correlated Beta Processes", Goetschalckx et al.
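With the indicator kernel from the theorem, the static update simply counts successes and failures among observations within distance Δ_{n,L} of the query point. A minimal sketch, where the uniform Beta(1, 1) prior and the synthetic target are illustrative assumptions:

```python
import numpy as np

def sbp_posterior(x_query, X_obs, y_obs, L, alpha0=1.0, beta0=1.0):
    """Smooth Beta Process update (static setting) with the indicator kernel
    kappa(x, x') = delta_{||x - x'||_2 <= Delta_{n,L}}.
    alpha0, beta0 give a Beta(alpha0, beta0) prior at every x (assumed uniform here)."""
    n, d = X_obs.shape
    delta = L ** (-2.0 / (d + 2)) * n ** (-1.0 / (d + 2))   # bandwidth from the theorem
    dist = np.linalg.norm(x_query[None, :] - X_obs, axis=1)
    near = dist <= delta
    alpha = alpha0 + np.sum(y_obs[near] == 1)    # kernel-weighted success count
    beta = beta0 + np.sum(y_obs[near] == 0)      # kernel-weighted failure count
    return alpha, beta, alpha / (alpha + beta)   # posterior params and mean estimate

# Usage sketch on synthetic 1-d data (f and its Lipschitz constant are assumptions):
rng = np.random.default_rng(0)
f = lambda x: 0.5 + 0.4 * np.sin(2 * np.pi * x)
X_obs = rng.uniform(0.0, 1.0, size=(2000, 1))
y_obs = rng.binomial(1, f(X_obs[:, 0]))
a, b, f_hat = sbp_posterior(np.array([0.25]), X_obs, y_obs, L=0.4 * 2 * np.pi)
# f(0.25) = 0.9; f_hat should be close to it for large n.
```

The update is a closed-form Beta posterior at every query point, which is the computational advantage over the logistic GP: no approximate Bayesian inference is needed.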

  5. Smooth Beta Processes: Dynamic setting
  • Regularity assumption: f is L-Lipschitz continuous, i.e., |f(x) − f(x′)| ≤ L‖x − x′‖_2 for all x, x′ ∈ X
  • Observations: y_i ~ Bernoulli(A_i f(x_i) + B_i), with 0 ≤ A_i + B_i ≤ 1.
  • Prior: p(y | x) = Beta(α(x), β(x))
  • Update of f̃(x | X) after observing X = {(x_1, y_1), ..., (x_n, y_n)}:
    p(y | X, x) = Σ_{i=1}^n C_i^n Beta( α(x) + i, β(x) + n − i ),
    where the weights {C_i^n}_{i=1,...,n} depend on {A_i, B_i}_{i=1,...,n} and a kernel κ.

  Theorem (Informal – Convergence of the dynamic Beta process)
  Using the kernel κ(x, x′) = δ_{‖x − x′‖_2 ≤ Δ_{n,L}}, where Δ_{n,L} = L^{−2/(d+2)} n^{−1/(d+2)}, and under the assumption A_i + B_i = 1,
    sup_{x ∈ X} E_X[ (f̃(x | X) − f(x))² ] = O( L^{2d/(d+2)} n^{−2/(d+2)} ).
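The dynamic posterior is a mixture of Beta distributions. The recursion computing the weights C_i^n from {A_i, B_i} and κ is in the paper and is not reproduced here; but given such weights, the point estimate is the weighted average of the component posterior means. A minimal sketch under that assumption:

```python
import numpy as np

def beta_mixture_mean(C, alpha0, beta0, n):
    """Posterior mean of the mixture sum_i C_i^n Beta(alpha0 + i, beta0 + n - i).
    C must be nonnegative and sum to 1; computing it from {A_i, B_i} and the
    kernel follows the paper's recursion, which is not reproduced here."""
    i = np.arange(1, n + 1)
    comp_means = (alpha0 + i) / (alpha0 + beta0 + n)   # mean of each Beta component
    return float(np.sum(C * comp_means))

# Usage: hypothetical uniform weights over n = 4 components, Beta(1, 1) prior.
C = np.full(4, 0.25)
m = beta_mixture_mean(C, 1.0, 1.0, 4)   # average of (1 + i) / 6 over i = 1..4
```

Because each component is conjugate, the mixture mean is again closed-form, so the dynamic setting retains the fast posterior updates of the static case.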

  6. Numerical results in the dynamic setting

  7. Benefits of SBP
  • Fast computation of the posterior update
  • Can include contextual features that directly influence success probabilities
  • Simple to implement

  8. For more details, welcome to our poster #233!
