Inferring Inference


  1. Inferring Inference. Xaq Pitkow, Rajkumar Vasudeva Raju. Part of the MICrONS project with Tolias, Bethge, Patel, Zemel, Urtasun, Xu, Siapas, Paninski, Baraniuk, Reid, Seung. NICE workshop 2017.

  2. Hypothesis: the brain approximates probabilistic inference over a probabilistic graphical model using a message-passing algorithm implicit in population dynamics. [Diagram: world ↔ brain, match.]

  3. What algorithms can we learn from the brain?
     • Architectures? cortex, hippocampus, cerebellum, basal ganglia, …
     • Transformations? nonlinear dynamics from population responses
     • Learning rules? short- and long-term plasticity

  4. Principles and details:
     • Probabilistic → graphical models
     • Nonlinear → message-passing inference
     • Distributed → multiplexed across neurons

  5. Events in the world can cause many neural responses. Neural responses can be caused by many events. So neural computation is inevitably statistical. This provides us with mathematical predictions. [Diagram: world ↔ brain.]

  6. Why does it matter whether processing is linear or nonlinear? If all computation were linear, we wouldn't need a brain. [Figure: apples vs. oranges, linearly separable vs. nonlinearly separable.]

  7. Two sources of nonlinearities. (1) Relationships between latent variables, e.g. Image = Light × Reflectance (I = L × R). (2) Relationships between uncertainties: posteriors generally have nonlinear dependencies even for the simplest variables. Product rule: p(x, y) = p(x) ∙ p(y | x). Sum rule (log domain): L(x) = log Σ_y exp L(x, y).
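
To make the log-domain rules concrete, here is a minimal numeric sketch (not from the slides; the joint table is made up and scipy's logsumexp is used for illustration). The sum rule requires a log-sum-exp, which is nonlinear, while the product rule is additive in the log domain:

```python
import numpy as np
from scipy.special import logsumexp

# Joint log-probability table L(x, y) for two binary variables (made-up numbers).
L_xy = np.log(np.array([[0.3, 0.1],
                        [0.2, 0.4]]))  # rows index x, columns index y

# Sum rule in the log domain: L(x) = log sum_y exp L(x, y)  (nonlinear)
L_x = logsumexp(L_xy, axis=1)
print(np.exp(L_x))  # marginal p(x) = [0.4, 0.6]

# Product rule in the log domain: L(x, y) = L(x) + L(y | x)  (additive)
L_y_given_x = L_xy - L_x[:, None]
assert np.allclose(L_xy, L_x[:, None] + L_y_given_x)
```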

  8. Probabilistic graphical models: simplify the joint distribution p(x | r) by specifying how variables interact: p(x | r) ∝ ∏_α ψ_α(x_α).

  9. [Factor graph] Variables x1, x2, x3 connected to a factor ψ123.

  10. Example: pairwise Markov random field over x1, x2, x3 with local potentials J1, J2, J3 and pairwise couplings J12, J23.
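
A brute-force sketch of this pairwise MRF (the coupling values are made-up, not from the slides). It evaluates the factorization from slide 8 for the three-variable chain and computes exact marginals by enumerating all states:

```python
import itertools
import numpy as np

# Hypothetical couplings for the three-variable chain on the slide.
J_single = {1: 0.5, 2: -0.2, 3: 0.1}     # local fields J1, J2, J3
J_pair = {(1, 2): 0.8, (2, 3): -0.6}     # pairwise couplings J12, J23

def log_potential(x):
    """Unnormalized log-probability of a configuration x = {i: ±1}."""
    lp = sum(J_single[i] * x[i] for i in J_single)
    lp += sum(J_pair[(i, j)] * x[i] * x[j] for (i, j) in J_pair)
    return lp

# Brute-force marginals by enumerating all 2^3 spin configurations.
states = [dict(zip((1, 2, 3), s)) for s in itertools.product([-1, 1], repeat=3)]
w = np.array([np.exp(log_potential(x)) for x in states])
p = w / w.sum()
for i in (1, 2, 3):
    m = sum(pi for pi, x in zip(p, states) if x[i] == 1)
    print(f"p(x{i} = +1) = {m:.3f}")
```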

  11. Approximate inference by message-passing:
     • Localize information so it is actionable
     • Summarize statistics relevant for targets
     • Send that information along the graph
     • Iteratively update factors with new information
     [Equation: general message-passing update of posterior parameters from interactions with neighbors.]

  12. Example message-passing algorithms:
     • Mean-field (assumes variables are independent)
     • Belief propagation (assumes a tree graph)
     • Expectation propagation (updates a parametric posterior)
     • …
     • The brain's clever tricks?
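
As a concrete instance, here is a minimal sum-product belief propagation pass on the three-variable chain of slide 10, reusing the illustrative couplings from the sketch above. On a tree like this chain the resulting beliefs match the brute-force marginals:

```python
import numpy as np

# Sum-product belief propagation on the binary chain x1 - x2 - x3.
vals = np.array([-1, 1])
J1, J2, J3, J12, J23 = 0.5, -0.2, 0.1, 0.8, -0.6

phi = {1: np.exp(J1 * vals), 2: np.exp(J2 * vals), 3: np.exp(J3 * vals)}
psi12 = np.exp(J12 * np.outer(vals, vals))   # pairwise potential on (x1, x2)
psi23 = np.exp(J23 * np.outer(vals, vals))   # pairwise potential on (x2, x3)

# Forward and backward messages along the chain.
m1_to_2 = psi12.T @ phi[1]                   # sum_x1 psi12(x1,x2) phi1(x1)
m3_to_2 = psi23 @ phi[3]                     # sum_x3 psi23(x2,x3) phi3(x3)
m2_to_1 = psi12 @ (phi[2] * m3_to_2)
m2_to_3 = psi23.T @ (phi[2] * m1_to_2)

def belief(phi_i, *messages):
    b = phi_i * np.prod(messages, axis=0)
    return b / b.sum()

print("p(x1):", belief(phi[1], m2_to_1))
print("p(x2):", belief(phi[2], m1_to_2, m3_to_2))
print("p(x3):", belief(phi[3], m2_to_3))
```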

  13. Spatial representation of uncertainty (e.g. Probabilistic Population Codes, PPCs). [Figure: posterior p(x | r) alongside neural response r_i vs. neuron index i.] For a linear PPC the posterior has mean µ = (a · r)/(b · r) and variance σ² = 1/(b · r). A pattern of activity represents a probability distribution; more spikes generally means more certainty. Ma, Beck, Latham, Pouget 2006, etc.
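
A sketch of this linear PPC readout, assuming Gaussian tuning curves of unit width and Poisson spiking (the weights a and b and all numbers are illustrative, not fitted values from the work cited):

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 50
pref = np.linspace(-10, 10, n_neurons)   # preferred stimuli
a = pref                                 # assumed linear readout weights
b = np.ones(n_neurons)                   # assumed certainty weights

# Poisson population response to a stimulus x = 2 with gain 5.
x_true, gain = 2.0, 5.0
tuning = gain * np.exp(-0.5 * (pref - x_true) ** 2)
r = rng.poisson(tuning)

mu = (a @ r) / (b @ r)      # posterior mean mu = (a . r) / (b . r)
sigma2 = 1.0 / (b @ r)      # posterior variance: more spikes, more certainty
print(f"mu = {mu:.2f}, sigma^2 = {sigma2:.4f}")
```

With b set to all ones, the variance is just the reciprocal of the total spike count, which is one simple way the "more spikes means more certainty" statement can hold.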

  14. [Diagram] Message-passing updates ↔ (embedding) ↔ neural dynamics.

  15. [Diagram: the pairwise MRF (x1, x2, x3 with J1, J2, J3 and J12, J23) mapped onto neural populations: singleton populations r1, r2, r3 and pairwise populations r12, r23, with linear and nonlinear connections between them.]

  16. [Diagram, continued: singleton populations and linear connections highlighted against nonlinear connections between pairwise populations.]
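
One toy way to read these diagrams (a sketch, not the actual embedding used in the project): carry each log-message in a population variable m and let leaky dynamics tau dm/dt = -m + F(m) relax it to the belief propagation fixed point, so the message-passing update becomes implicit in the dynamics. Couplings are the same illustrative values as above:

```python
import numpy as np
from scipy.special import logsumexp

vals = np.array([-1.0, 1.0])
J1, J2, J3, J12, J23 = 0.5, -0.2, 0.1, 0.8, -0.6
lphi = {1: J1 * vals, 2: J2 * vals, 3: J3 * vals}
lpsi12 = J12 * np.outer(vals, vals)
lpsi23 = J23 * np.outer(vals, vals)

def F(m):
    """BP fixed-point map in the log domain; rows of m are the four chain messages."""
    m12, m32, _, _ = m
    return np.stack([
        logsumexp(lpsi12 + lphi[1][:, None], axis=0),            # x1 -> x2
        logsumexp(lpsi23.T + lphi[3][:, None], axis=0),          # x3 -> x2
        logsumexp(lpsi12.T + (lphi[2] + m32)[:, None], axis=0),  # x2 -> x1
        logsumexp(lpsi23 + (lphi[2] + m12)[:, None], axis=0),    # x2 -> x3
    ])

m = np.zeros((4, 2))
tau, dt = 1.0, 0.1
for _ in range(200):                  # leaky integration to the fixed point
    m += (dt / tau) * (-m + F(m))

b1 = lphi[1] + m[2]                   # log-belief for x1
print("p(x1):", np.exp(b1) / np.exp(b1).sum())
```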

  17. Neural activity

  18. Neural activity

  19. Neural activity → Neural encoding → Information encoded

  20. Neural activity → Neural encoding → Information encoded

  21. Neural interactions → Neural encoding → Information interactions

  22. Neural interactions → Neural encoding → Information interactions → Probability distributions

  23. Neural interactions → Neural encoding → Information interactions. Example: orientation

  24. Network activity can implicitly perform inference. [Figure: neural activity r (max to min) over time for N neurons; inferred vs. true parameters (mean, variance) for N params = 1 and N params = 5, no noise.] Raju and Pitkow 2016.

  25. Inferring inference from a simulated brain. [Pipeline: the simulated brain infers b over time and encodes it in r over time; we decode, fit within a family of message-passing algorithms, and recover the message-passing parameters and interactions.]
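
A sketch of the "fit within family" step under strong assumptions: suppose the decoded posterior parameters b(t) follow a message-passing update that is linear in a set of candidate (here, quadratic) interaction features, so b(t+1) ≈ G @ features(b(t)); then the coefficients G can be fit by least squares. Everything here (the features, G, and the i.i.d. stand-in for decoded trajectories) is hypothetical, not the actual procedure of Raju and Pitkow 2016:

```python
import numpy as np

rng = np.random.default_rng(1)

def features(b):
    """Candidate message-passing terms: constant, linear, and pairwise-product."""
    quad = np.outer(b, b)[np.triu_indices(len(b))]
    return np.concatenate([[1.0], b, quad])

# Stand-in for decoded parameter pairs (b(t), b(t+1)); drawn i.i.d. for simplicity.
T, K = 500, 3
B_t = rng.normal(size=(T, K))
G_true = rng.normal(scale=0.1, size=(K, len(features(B_t[0]))))
B_next = np.stack([G_true @ features(b) for b in B_t])
B_next += 0.01 * rng.normal(size=(T, K))   # decoding noise

# Least-squares recovery of the update coefficients.
X = np.stack([features(b) for b in B_t])
G_fit = np.linalg.lstsq(X, B_next, rcond=None)[0].T
print("max |G_fit - G_true| =", np.abs(G_fit - G_true).max())
```

On real trajectories the feature matrix can be rank-deficient along unexplored directions, leaving G identifiable only up to an equivalence family, which is the degeneracy slide 27 describes.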

  26. Recovery results for the simulated brain. [Figure: learnt vs. true message-passing parameters G_αβγ and interactions J_ij.]

  27. Analysis reveals a degenerate family of equivalent algorithms. [Figure: mean squared error as a function of distance toward two local minima, showing degenerate valleys 1 and 2 and the global minimum.]

  28. From simulated neural data we have recovered:
     • how variables are encoded → representation
     • which variables interact → graphical model
     • how they interact, and how the interactions are used → message-passing algorithm

  29. Applying message-passing to novel tasks: brain → message-passing nonlinearity → apply to a new graphical model structure, OR relax to a novel neural network.

  30. Next up: applying these methods to real brains. Stimulus: orientation field. Recordings: V1 responses* (*not to the same stimulus; recordings from the Tolias lab).

  31. Mementos:
     • Neurons can perform inference implicitly in a graphical model distributed across a population.
     • New method to discover message-passing algorithms by modeling transformations of decoded task variables.
     [Diagram: world ↔ brain, match.]

  32. Acknowledgements (xaqlab.com). Funding: [logos]. Collaborators: Alex Pouget, Jeff Beck, Dora Angelaki, Andreas Tolias, Jacob Reimer, Rajkumar Vasudeva Raju, Fabian Sinz, Alex Ecker, Kaushik Lakshminarasimhan, Ankit Patel, Qianli Yang, Emin Orhan, Aram Giahi-Saravani, KiJung Yoon, James Bridgewater, Zhengwei Wu, Saurabh Daptardar.
