Information Flow in Neural Circuits




  1. How else can we define Information Flow in Neural Circuits? Praveen Venkatesh, with Sanghamitra Dutta and Pulkit Grover. Dept. of Electrical & Computer Engineering, Carnegie Mellon University. “How should we define Information Flow in Neural Circuits?”, ISIT 2019; “Information Flow in Computational Systems”, IEEE Trans. IT 2020. (https://praveenv253.github.io/publications)

  2. Acknowledgments. Advisor: Pulkit Grover (CMU). Neuroscientists: Marlene Behrmann (CMU), Rob Kass (CMU). Clinicians: Mark Richardson (MGH/Harvard), Vasily Kokkinos (MGH/Harvard). Theorists: Bobak Nazer (BU), Todd Coleman (UCSD), Venkatesh Saligrama (BU). Labmates & Colleagues: Sanghamitra Dutta (CMU), Aditya Gangrade (BU), Haewon Jeong (CMU), Ashwati Krishnan (CMU), Alireza Chamanzar (CMU). Fellowships: CMLH Fellowship in Digital Health; Dowd Fellowship; Henry L. Hillman Presidential Fellowship; CIT Dean’s Fellowship. Grants: Center for Machine Learning and Health; Chuck Noll Foundation for Brain Injury Research.

  3. Why study information flow? Thousands of papers estimate information flow in the brain, and there is considerable controversy around it. Our main insight: the controversy exists because “information flow” is not formally defined. (Venkatesh & Grover, SfN ’15, Allerton ’15; Venkatesh & Grover, Cosyne ’19; Venkatesh, Dutta & Grover, IEEE Trans. IT ’20)

  4. Outline: How else can we define Information Flow? 1. Why study information flow? 2. ISIT ’19 recap: how should we define information flow? (Shortcoming: “M-information orphans”; a potential fix based on pruning.) 3. Information flow from a causality perspective (a counterexample to pruning; an introduction to causality and counterfactual causal influence). 4. CCI is the intuitive definition, but it is not observational (an alternative observational definition; a comparison of definitions). 5. Conclusion.

  5. Information flow for neuroscientific inferences. Information “flows” between brain regions (Regions A, B and C); the information is often about a stimulus; and there can be feedback. (Almeida et al., Cortex, 2013) Goal: find a definition of information flow that lets us track information paths.

  6. A computational model of the brain. “Brain areas” (Regions A, B and C) are unrolled over time into nodes A_t, B_t, C_t for t = 0, 1, 2, with feedback communication between them. Y(E_0) denotes the transmission on edge E_0 at time t = 0; downstream transmissions are computed from upstream ones, e.g. Y(E_1) = g_{A_1}(Y(E_0), …). Transmissions on edges are measured. The message M (a.k.a. the stimulus) arrives at, and only at, t = 0. (Thompson, 1980: VLSI; Ahlswede et al., 2000: Network Info Theory; Peters et al., 2017: Causality)

  7. A computational model of the brain (continued): the same model as on the previous slide. Goal: define information flow and track information paths.

  8. How should we define Information Flow? (ISIT ’19) Definition [M-Information Flow] (ISIT ’19, Trans. IT ’20): we say that an edge E_t has M-information flow if ∃ ℰ′_t ⊆ ℰ_t such that I(M; Y(E_t) | Y(ℰ′_t)) > 0. Why not a simpler definition, such as I(M; Y(E_t)) > 0? Counterexample: let M, W ~ i.i.d. Ber(1/2) and let an edge carry M ⊕ W (= M xor W); then I(M; M ⊕ W) = 0, even though the edge plainly carries information about M.

  9. How should we define Information Flow? (ISIT ’19) Using the definition above, conditioning on W reveals the dependence between M ⊕ W and M: I(M; M ⊕ W | W) > 0.
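These two mutual-information facts can be checked numerically. A minimal sketch in Python (writing M for the message and W for the independent noise bit; the helper `mutual_information` is mine, not from the paper), enumerating the four equiprobable outcomes:

```python
from collections import defaultdict
from math import log2

def mutual_information(pairs):
    """I(X; Y) in bits, from a list of equiprobable (x, y) outcomes."""
    n = len(pairs)
    pxy, px, py = defaultdict(float), defaultdict(float), defaultdict(float)
    for x, y in pairs:
        pxy[(x, y)] += 1 / n
        px[x] += 1 / n
        py[y] += 1 / n
    return sum(p * log2(p / (px[x] * py[y])) for (x, y), p in pxy.items())

# The four equiprobable outcomes of (M, W), with M, W ~ iid Ber(1/2).
outcomes = [(m, w) for m in (0, 1) for w in (0, 1)]

# Unconditionally, M xor W looks independent of M: I(M; M xor W) = 0.
mi = mutual_information([(m, m ^ w) for m, w in outcomes])

# Conditioning on W reveals the dependence: I(M; M xor W | W) = 1 bit
# (average the conditional MI over the two equiprobable values of W).
cond_mi = sum(0.5 * mutual_information([(m, m ^ w) for m, w in outcomes if w == wv])
              for wv in (0, 1))

print(mi, cond_mi)  # 0.0 1.0
```

Exact enumeration suffices here because all outcomes are equiprobable; no sampling is needed.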

  10. How should we define Information Flow? (ISIT ’19) Definition [M-Information Flow] (ISIT ’19, Trans. IT ’20): an edge E_t has M-information flow if ∃ ℰ′_t ⊆ ℰ_t such that I(M; Y(E_t) | Y(ℰ′_t)) > 0.

  11. The Existence of Information Paths. Theorem [Existence of Information Paths]: if the transmissions of an “output” node V_t^op depend on M, then there exists a path from the input node V_0^ip to V_t^op, every edge of which has M-information flow. (Venkatesh, Dutta & Grover, ISIT 2019; IEEE Trans. IT 2020)

  12. But! This definition gives rise to “orphans”. Recall: an edge E_t has M-info flow if ∃ ℰ′_t ⊆ ℰ_t s.t. I(M; Y(E_t) | Y(ℰ′_t)) > 0. We would like conservation: M-info flowing out of a node should imply that M-info flows into the node. The XOR example breaks this: I(M; M ⊕ W | W) > 0, but also I(M; W | M ⊕ W) > 0, so the edge carrying W has M-info flow even though W is independent of M. The node C_1 is an orphan: M-info flows out of it but not into it.
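The orphan phenomenon itself is easy to verify: conditioned on the transmission M ⊕ W, the independent noise W carries a full bit about the message M. A small self-contained check (the helper name `cond_mi` is illustrative, not from the paper):

```python
from math import log2
from itertools import product

def cond_mi(triples):
    """I(X; Y | Z) in bits, from a list of equiprobable (x, y, z) outcomes."""
    n = len(triples)
    def dist(key):
        d = {}
        for t in triples:
            d[key(t)] = d.get(key(t), 0.0) + 1.0 / n
        return d
    pxyz = dist(lambda t: t)
    pxz = dist(lambda t: (t[0], t[2]))
    pyz = dist(lambda t: (t[1], t[2]))
    pz = dist(lambda t: t[2])
    return sum(q * log2(q * pz[z] / (pxz[(x, z)] * pyz[(y, z)]))
               for (x, y, z), q in pxyz.items())

# Outcomes (M, W, M xor W) with M, W ~ iid Ber(1/2):
triples = [(m, w, m ^ w) for m, w in product((0, 1), repeat=2)]

# W alone is independent of M, yet conditioned on M xor W it reveals M:
print(cond_mi(triples))  # 1.0
```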

  13. Pruning as a solution to orphans. Algorithm: prune edges that do not lead back to the input node V_0^ip, using depth-first search. (Venkatesh, Dutta & Grover, IEEE Trans. IT 2020, to appear)


  15. Pruning as a solution to orphans (continued). The pruning removes orphans such as C_1 and, hopefully, also prunes out edges like the one carrying W, whose transmissions are not “computed” from M. (Venkatesh, Dutta & Grover, IEEE Trans. IT 2020, to appear)
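The pruning pass can be sketched as plain graph reachability. This is an illustrative reconstruction of the idea, not the paper's algorithm verbatim: keep only those M-info-flow edges whose transmissions trace back to the input node, found by depth-first search.

```python
def prune(flow_edges, input_node):
    """Keep only the flow edges reachable from input_node along flow
    edges, i.e. edges whose transmissions trace back to the input.
    (Illustrative sketch, not the paper's algorithm verbatim.)"""
    out_edges = {}
    for u, v in flow_edges:
        out_edges.setdefault(u, []).append(v)
    kept, stack, seen = set(), [input_node], {input_node}
    while stack:
        u = stack.pop()
        for v in out_edges.get(u, []):
            kept.add((u, v))
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return kept

# Toy network (node names are made up): the ("C0", "C1") edge carries
# the noise W, which is never computed from the input, so it is pruned.
edges = {("A0", "A1"), ("A1", "B2"), ("C0", "C1")}
print(sorted(prune(edges, "A0")))  # [('A0', 'A1'), ('A1', 'B2')]
```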

  16. Outline (recap). Next up: information flow from a causality perspective, covering a counterexample to pruning and an introduction to causality and counterfactual causal influence.

  17. A counterexample to pruning (this work). Let M, W ~ i.i.d. Ber(1/2), with M ⊥ W. The edge (A_1, A_2) has no M-info flow: I(M; M ⊕ W) = 0 and I(M; M ⊕ W | [M, W]) = 0. But the orphan removed by pruning transmits M ⊕ W! The only information path from A_0 to B_3 passes through an edge carrying W. A_2 is an M-information orphan.

  18. What makes this a counterexample? We disliked orphans for two reasons: (1) there is no “conservation” of info flow at an orphan; (2) W is not “computed” from M. But is W really all that different from M ⊕ W in these examples? Writing Z for M ⊕ W (with M, W ~ i.i.d. Ber(1/2), M ⊥ W), the joint distribution p(m, w, z) = (1/4) · 1[z = m ⊕ w] = (1/4) · 1[z ⊕ w = m] is symmetric in M ⊕ W and W! To differentiate them, we need to go beyond the joint distribution.
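The symmetry claim can be verified exhaustively. A minimal sketch: build the joint distribution p(m, w, z) (writing M for the message, W for the noise bit, Z = M ⊕ W) and check that it is invariant under swapping the roles of W and Z:

```python
from fractions import Fraction
from itertools import product

# Joint distribution p(m, w, z) = (1/4) * 1[z = m xor w],
# for M, W ~ iid Ber(1/2) and Z = M xor W.
p = {(m, w, z): Fraction(1, 4) if z == m ^ w else Fraction(0)
     for m, w, z in product((0, 1), repeat=3)}

# Swap the roles of W and Z: the distribution is unchanged, so no
# purely observational (distribution-level) quantity can tell them apart.
swapped = {(m, z, w): q for (m, w, z), q in p.items()}
print(p == swapped)  # True
```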

  19. A brief introduction to causality. A Structural Causal Model (Peters, Janzing & Schölkopf, “Elements of Causal Inference”, 2017) is a directed acyclic graph with functional relationships Y_i = g_i(Pa(Y_i), X_i), where Pa(Y_i) are the parents of Y_i and X_i is exogenous noise; it supports interventions such as Y_0 := y, e.g. asking for the resulting distribution p(Y_5). In the Computational System Model (Venkatesh, Dutta & Grover, ISIT 2019; Trans. IT 2020), transmissions obey the same form, e.g. Y_3 = g_{A_1}(Y_1, Y_2, X_{A_1}). The computational system is therefore also a structural causal model.

  20. Counterfactual Causal Influence (of M). For a particular realization of all random variables, ask: what would have happened to the downstream variables if M had taken a different value, keeping all other sources of randomness fixed? We can now differentiate: M ⊕ W is counterfactually causally influenced by M, but W is not.

  21. Counterfactual Causal Influence (of M). Subtleties when defining info flow using CCI: the transmission may be constant over some message values m ∈ ℳ (e.g. Y(E_t) = 1[M ≥ 0]), and it may be constant over all m for some realizations of the noise W (e.g. Y(E_t) = M · W, which does not respond to M whenever W = 0). We can now differentiate: M ⊕ W is CCI’d by M, but W is not.

  22. Defining info flow using M-CCI. Definition [M-Counterfactual Causal Influence]: we say that an edge E_t is counterfactually causally influenced by M if ∃ m, m′ ∈ ℳ and a noise realization x ∈ 𝒳 such that Y(E_t)|_{m,x} ≠ Y(E_t)|_{m′,x}. Comparison: M-information flow vs. M-counterfactual causal influence.
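The CCI definition can be illustrated on the XOR example. A hedged sketch (writing M for the message and W for the noise bit; function names are mine, not the paper's): fix the exogenous randomness W, flip M, and check which transmissions change.

```python
def transmissions(m, w):
    """Transmissions in the toy circuit: one edge carries M xor W,
    another carries the noise W alone. (Names are illustrative.)"""
    return {"xor_edge": m ^ w, "w_edge": w}

def has_cci(edge):
    """An edge is counterfactually causally influenced by M if, for some
    fixed realization of the noise W, changing M changes its transmission."""
    return any(transmissions(0, w)[edge] != transmissions(1, w)[edge]
               for w in (0, 1))

print(has_cci("xor_edge"))  # True:  M xor W responds to flipping M
print(has_cci("w_edge"))    # False: W is unaffected by M
```

This counterfactual check goes beyond the joint distribution, which is exactly what the symmetry argument on the previous slide shows is necessary.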
