Graphical Models for Neuroscience, Part I (Giuseppe Vinci)


  1. Graphical Models for Neuroscience, Part I. Giuseppe Vinci, Department of Statistics, Rice University.

  2. Outline: 1. Neural connectivity. 2. Graphical models. 3. Gaussian graphical models. 4. Functional connectivity in macaque V4.

  3. Neural connectivity. How does the brain work? Through the concerted actions of many connected neurons. We need to know these connections to: • Understand the causes of brain disorders, e.g. Parkinson's and Alzheimer's disease. • Interface prosthetic devices with the brain.

  4. Neural connectivity: two different perspectives. • Structural connectivity = physical connections between neurons. • Functional connectivity = statistical dependence among neurons' activity. → It helps us understand brain function under differing experimental conditions. → It must be inferred from (in vivo) experimental recordings, i.e. we need to observe how neurons behave.

  5. Functional connectivity = dependence among neurons' activity. Dependence between brain areas (macroscopic scale): fMRI, EEG, ECoG, LFP.

  6. Functional connectivity = dependence among neurons' activity. Dependence between individual neurons (microscopic scale). [video link]

  7. Dependence. • Let $X = (X_1, \ldots, X_d) \sim P$ be a vector of neural signals. • How can we describe the dependence structure of X? • Analytically. • Contour density plots, spectral analysis. • Graphs.

  8. Dependence graph. A graph is $G = \{N, E\}$, where $N = \{1, \ldots, d\}$ is the set of nodes and $E = [E_{ij}]$ is the set of edges, with $E_{ij} \in \{0, 1\}$.
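To make the notation concrete, here is a minimal Python sketch (not from the slides; the edge list is hypothetical) that stores the edge set E as a symmetric 0/1 adjacency matrix:

```python
import numpy as np

# A graph G = {N, E} on d nodes, with E stored as a symmetric 0/1 adjacency matrix.
d = 4
E = np.zeros((d, d), dtype=int)
edges = [(0, 1), (1, 2), (2, 3)]   # hypothetical edge list (0-based node labels)
for i, j in edges:
    E[i, j] = E[j, i] = 1          # E_ij = 1 iff there is an edge between i and j
print(E)
```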

  9. Markov random field. • A random vector X forms a Markov random field with respect to a graph G if it satisfies the Markov properties: 1. Pairwise: no edge between i and j ⇔ X_i and X_j are independent conditionally on all other variables. 2. Local: X_i is independent of its non-neighbors conditionally on its neighbors. 3. Global: two sets of variables are independent conditionally on a separating set of nodes.

  10. Gaussian graphical model. • A Gaussian graphical model is a Markov random field. • The p.d.f. of $X \sim N(0, \Theta^{-1})$, with $\Theta = [\theta_{ij}] \succ 0$, is
  $$p_\Theta(x) = (2\pi)^{-d/2} \sqrt{\det \Theta} \, \exp\left(-\tfrac{1}{2} x' \Theta x\right) = \exp\left(-\tfrac{1}{2} \sum_{i,j=1}^{d} \theta_{ij} x_i x_j + \text{const.}\right)$$
  • $\theta_{ij} = 0$ ⇔ no edge between $X_i$ and $X_j$ ⇔ conditional independence. • Note that the partial correlation $\rho_{ij} = -\theta_{ij} / \sqrt{\theta_{ii} \theta_{jj}} \in [-1, 1]$ is a more interpretable measure of conditional dependence.
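As an illustration of the last two bullets, this sketch (the 3×3 precision matrix is hand-picked for the example, not from the slides) computes partial correlations from Θ; the zero off-diagonal entry yields a zero partial correlation, i.e. a missing edge:

```python
import numpy as np

def partial_correlations(Theta):
    """Partial correlations rho_ij = -theta_ij / sqrt(theta_ii * theta_jj)."""
    s = np.sqrt(np.diag(Theta))
    rho = -Theta / np.outer(s, s)
    np.fill_diagonal(rho, 1.0)     # convention: rho_ii = 1
    return rho

# Hypothetical precision matrix: theta_13 = 0, so X_1 and X_3 are
# conditionally independent given X_2 (no edge between nodes 1 and 3).
Theta = np.array([[ 2.0, -0.8,  0.0],
                  [-0.8,  2.0, -0.8],
                  [ 0.0, -0.8,  2.0]])
print(partial_correlations(Theta))
```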

  11. Gaussian graphical model. [Figure: example of a Gaussian graphical model.]

  12. GGM - Estimation. • We need to estimate Θ. • Suppose we collect the data vectors Data = {X_1, ..., X_n} from n repeated experiments. • To estimate Θ, we start by defining the likelihood function $L(\Theta; \text{Data}) = \prod_{r=1}^{n} p_\Theta(X_r)$.
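For concreteness, a minimal sketch of the corresponding log-likelihood, using the identity $\sum_r X_r' \Theta X_r = n \, \text{tr}(S \Theta)$ with $S = \frac{1}{n} \sum_r X_r X_r'$ (the function name and the n × d data layout are my own conventions):

```python
import numpy as np

def gaussian_loglik(Theta, X):
    """Log-likelihood of mean-zero Gaussian data X (n x d) under precision Theta."""
    n, d = X.shape
    S = X.T @ X / n                          # sample covariance (mean assumed zero)
    sign, logdet = np.linalg.slogdet(Theta)
    assert sign > 0, "Theta must be positive definite"
    return 0.5 * n * (logdet - np.trace(S @ Theta)) - 0.5 * n * d * np.log(2 * np.pi)
```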

  13. GGM - MLE. $\hat{\Theta}_{\text{MLE}} = \arg\max_{\Theta \succ 0} L(\Theta; \text{Data})$. • Simple, but it can have undesirable statistical properties: • Major problem: the number of parameters in Θ is $d + \binom{d}{2} = d(d+1)/2$, and the sample size may be small in some applications. E.g. d = 100 ⇒ 5050 parameters, with perhaps only n = 200 observations. • Minor problem: $\hat{\Theta}_{\text{MLE}}$ does not give exact zeros.
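When n > d, the MLE has the closed form $\hat{\Theta}_{\text{MLE}} = S^{-1}$, the inverse of the sample covariance. A toy sketch with synthetic data (dimensions chosen arbitrarily) illustrates the minor problem: no off-diagonal entry comes out exactly zero, so the estimated graph is always fully connected:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 200
X = rng.standard_normal((n, d))       # toy data with truly independent coordinates
S = X.T @ X / n                       # sample covariance
Theta_mle = np.linalg.inv(S)          # MLE of the precision matrix (needs n > d)

# Even though the true graph has no edges, no estimate is exactly zero:
print(np.min(np.abs(Theta_mle[np.triu_indices(d, k=1)])))
```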

  14. GGM - Penalized MLE. • Penalized MLE: $\hat{\Theta}(\lambda) = \arg\max_{\Theta \succ 0} \log L(\Theta; \text{Data}) - P_\lambda(\Theta)$, where $P_\lambda \geq 0$ and λ is chosen in some way (cross-validation, BIC, stability selection). E.g. $P_\lambda(\Theta) = \lambda \|\Theta\|_q^q$ (usually excluding the diagonal elements): • q = 0 penalizes the number of edges. • q = 1 is the graphical lasso [1]. • q = 2 is ridge regularization. [1] Yuan and Lin (2007); Friedman et al. (2008); Rothman et al. (2008); Fan et al. (2009); Ravikumar et al. (2011); Mazumder and Hastie (2012); Hsieh et al. (2014).
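A sketch of the q = 1 case using scikit-learn's GraphicalLassoCV, which implements the graphical lasso and selects λ by cross-validation (one of the options mentioned above); the data matrix here is a random placeholder for real recordings:

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))            # placeholder n x d data matrix

model = GraphicalLassoCV().fit(X)             # lambda chosen by cross-validation
Theta_hat = model.precision_                  # sparse precision estimate
adjacency = (np.abs(Theta_hat) > 1e-8).astype(int)
np.fill_diagonal(adjacency, 0)                # graph: edge iff theta_ij != 0
print("estimated edges:", adjacency.sum() // 2)
```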

  15. GGM - Bayesian regularization. • The penalized MLE can be seen as the maximum of the posterior distribution $\pi(\Theta \mid \text{Data}) \propto L(\Theta; \text{Data}) \times \pi(\Theta \mid \lambda)$ (likelihood × prior), where $\pi(\Theta \mid \lambda) \propto e^{-P_\lambda(\Theta)} \, I(\Theta \succ 0)$. • Note that we can also compute posterior means instead of the maximum. • The parameter λ can be selected by imposing a hyperprior $\pi(\lambda)$.
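To connect the two views, here is a small sketch (my own helper, not from the slides) of the unnormalized log posterior under the ℓ1 penalty, i.e. a Laplace-type prior on Θ; maximizing it over Θ ≻ 0 recovers the graphical lasso as a MAP estimate:

```python
import numpy as np

def log_posterior(Theta, X, lam):
    """Unnormalized log posterior: log L(Theta; Data) - lam * ||Theta||_1."""
    n, d = X.shape
    S = X.T @ X / n
    sign, logdet = np.linalg.slogdet(Theta)
    if sign <= 0:
        return -np.inf                        # the prior enforces Theta positive definite
    loglik = 0.5 * n * (logdet - np.trace(S @ Theta))
    penalty = lam * np.abs(Theta).sum()       # for simplicity, the diagonal is penalized too
    return loglik - penalty
```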

  16. What if the data are not Gaussian? • Transform the data: e.g. the square root of a spike count can be approximately Gaussian. • Use a hierarchical model: $Y \sim P_X$, with $X \sim N(0, \Theta^{-1})$. • Use another graphical model: e.g. Poisson graphical models.
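A quick numerical illustration of the first bullet (synthetic Poisson counts; the firing rate is arbitrary): for moderately large rates, square-rooted counts have an approximately constant variance of about 1/4, which makes a Gaussian working model more plausible:

```python
import numpy as np

rng = np.random.default_rng(0)
counts = rng.poisson(lam=20.0, size=100000)   # hypothetical spike counts
X = np.sqrt(counts)                           # variance-stabilizing transform

# By the delta method, Var(sqrt(Poisson(lam))) ~ 1/4 for large lam:
print(X.var())                                # close to 0.25
```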

  17. Example: macaque V4 (Vinci et al. 2018). Square-root-transformed spike counts (500 ms bins) of d = 100 neurons, with n = 126 for each condition.
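Putting the pieces together, a hypothetical end-to-end pipeline in the spirit of this example (synthetic counts stand in for the real recordings; the rate, threshold, and centering step are my own choices, not details of the published analysis):

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)
counts = rng.poisson(lam=15.0, size=(126, 100))   # stand-in for n = 126 trials, d = 100 neurons

X = np.sqrt(counts)                               # square-root transform of spike counts
X = X - X.mean(axis=0)                            # center, since the GGM assumes mean zero
model = GraphicalLassoCV().fit(X)                 # graphical lasso, lambda by CV
graph = (np.abs(model.precision_) > 1e-8).astype(int)
np.fill_diagonal(graph, 0)                        # estimated functional connectivity graph
print("estimated edges:", graph.sum() // 2)
```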
