  1. Cholinergic Modulation of the Hippocampus. Computational Models of Neural Systems, Lecture 2.5. David S. Touretzky, September 2007.

  2. A Theory of Hippocampus
  ● Suppose CA1 is a hetero-associator that learns:
  – to mimic EC patterns, and
  – to map CA3 patterns to learned EC patterns.
  ● Imagine a partial/noisy pattern in EC triggering a partial/noisy response in CA3, which is cleaned up by auto-association in the CA3 recurrent collaterals.
  ● CA1 could use the EC response to call up the complete, correct EC pattern.
  ● But what happens if recall isn't turned off during learning?
  Computational Models of Neural Systems, 09/20/07

  3. Acetylcholine Effects (1)
  Acetylcholine (ACh) has a variety of effects on the hippocampus:
  ● Suppresses synaptic transmission in CA1:
  – mostly at the Schaffer collaterals in stratum radiatum,
  – less so for perforant path input in stratum lacunosum-moleculare.
  [Figure: patch clamp recording]

  4. Effect of Carbachol
  ● Carbachol is a cholinergic agonist, so it can be used to test the effects of ACh.
  ● It activates only metabotropic ACh receptors.
  ● Brain slice recording experiments show that carbachol suppresses synaptic transmission in CA1.
  [Figure: Experiment 1 shows 46.0% and 90.7% suppression; Experiment 2 shows 54.6% and 87.6% suppression.]

  5. Effect of Atropine
  ● Atropine affects muscarinic-type ACh receptors, not nicotinic type.
  ● It blocks the suppression of synaptic transmission by carbachol.
  ● Therefore, cholinergic suppression in s. rad. and s. l.-m. is mediated by muscarinic ACh receptors.
  [Figure: responses labeled “same” with and without atropine.]

  6. Summary of Blockade Experiments

  7. Acetylcholine Effects (2)
  Acetylcholine also:
  ● Reduces neuronal adaptation in CA1 by suppressing voltage- and Ca2+-dependent potassium currents.
  – This keeps the cells excitable.
  ● Enhances synaptic modification in CA1, possibly by affecting NMDA currents.
  ● Activates inhibitory interneurons, but decreases inhibitory synaptic transmission.

  8. Hasselmo's Model: Block Diagram
  [Block diagram: EC, perforant path (pp), Schaffer collaterals (Sch), CA3, CA1, and the medial septum supplying ACh via the fimbria/fornix.]

  9. Hasselmo's Model

  10. Initial CA1 Activation Function

  a_i(t) = Σ_{j=1..n} (L_ij − H^EC_ij) · g(a^EC_j(t))
         + Σ_{k=1..n} (R_ik − H^CA3_ik) · g(a^CA3_k(t))
         − Σ_{l=1..n} H^CA1_il · g(a^CA1_l(t))

  a_i(t) is the activation of unit i at time t.
  g(x) is a threshold function: g(x) = max(x − 0.4, 0).
  L_ij is the feedforward synaptic strength for s. lacunosum (EC input).
  R_ik is the feedforward synaptic strength for s. radiatum (CA3 input).
  H^xx is feedforward or feedback inhibition of CA1 from layer xx.
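  Read as vector operations, the activation rule above can be sketched in NumPy. The array names are my own shorthand for the weight matrices on the slide; only the 0.4 output threshold comes from the lecture:

  ```python
  import numpy as np

  def g(x, theta=0.4):
      """Threshold output function: g(x) = max(x - theta, 0)."""
      return np.maximum(x - theta, 0.0)

  def ca1_activation(a_EC, a_CA3, a_CA1, L, R, H_EC, H_CA3, H_CA1):
      """One step of the (unmodulated) CA1 activation equation.
      L: EC->CA1 weights (s. l-m); R: CA3->CA1 weights (s. rad);
      H_EC, H_CA3: feedforward inhibition; H_CA1: feedback inhibition."""
      return ((L - H_EC) @ g(a_EC)
              + (R - H_CA3) @ g(a_CA3)
              - H_CA1 @ g(a_CA1))
  ```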

  11. S. Radiatum Learning Rule
  ● Note: the only learning in this model is in R_ik, the weights on the CA3 → CA1 connections. Two factors:
  – linear potentiation when pre- and postsynaptic cells are simultaneously active,
  – exponential decay whenever the postsynaptic cell is active.

  R_ik(t+1) = R_ik(t) + η · [ (a^CA1_i(t) − Ω) · g(a^CA3_k(t)) − λ · a^CA1_i(t) · R_ik(t) ]

  Ω is the synaptic modification threshold for LTP to occur.
  η is the overall learning rate.
  λ is the synaptic decay rate.
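  A minimal Python sketch of this weight update; the parameter values are illustrative defaults, not settings from the lecture:

  ```python
  import numpy as np

  def update_weights(R, a_CA1, a_CA3, eta=0.1, omega=0.2, lam=0.04, theta=0.4):
      """R_ik(t+1) = R_ik(t) + eta*[(a_i - omega)*g(a_k) - lam*a_i*R_ik].
      Potentiation requires joint pre/post activity; decay is gated by
      postsynaptic activity alone."""
      g_pre = np.maximum(a_CA3 - theta, 0.0)   # presynaptic output g(a_k)
      hebb = np.outer(a_CA1 - omega, g_pre)    # linear potentiation term
      decay = lam * a_CA1[:, None] * R         # decay while postsynaptic cell is active
      return R + eta * (hebb - decay)
  ```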

  12. Learning Rule: Hebbian Facilitation Plus Synaptic Decay

                     Presynaptic = 0   Presynaptic = 1
  Postsynaptic = 0   no change         no change
  Postsynaptic = 1   ↓ (decay)         ↑ (facilitation)

  13. Exponential Weight Decay

  dx/dt = −λx
  x(t+1) = x(t) − λ·x(t) = x(t) · (1 − λ)
  x(t+n) = x(t) · (1 − λ)^n

  Example: λ = 0.04 gives x(t+1) = 0.96 · x(t).
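  A quick numerical check of the discretization, with λ = 0.04 as in the example:

  ```python
  # Iterating x <- (1 - lam) * x for n steps matches the
  # closed form x(t+n) = (1 - lam)**n * x(t).
  lam = 0.04
  x = 1.0
  for _ in range(10):
      x *= (1 - lam)              # one decay step: x(t+1) = 0.96 * x(t)
  closed_form = (1 - lam) ** 10
  ```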

  14. Control of Cholinergic Modulation
  ● Cholinergic modulation Ψ was controlled by the amount of activity in CA1:

  Ψ = [ 1 + exp( γ · ( Σ_{i=1..n} g(a^CA1_i) − θ ) ) ]^(−1)

  γ is a gain parameter.
  θ is a threshold value.
  ● This is an inverted (decreasing) sigmoid of CA1 activity, of the form 1/(1 + exp(x)):
  – With no CA1 activity, Ψ is close to 1.
  – With maximal CA1 activity, Ψ is close to 0.
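  In Python, the inverted sigmoid controlling Ψ looks like the following; the gain, threshold, and output-threshold values are my own illustrative choices, not the lecture's settings:

  ```python
  import numpy as np

  def cholinergic_level(a_CA1, gamma=5.0, theta=1.0, out_theta=0.4):
      """Psi = [1 + exp(gamma * (sum_i g(a_i) - theta))]^-1.
      Quiet CA1 (unfamiliar pattern) -> Psi near 1 (high ACh);
      strong CA1 activity (familiar pattern) -> Psi near 0."""
      total = np.maximum(a_CA1 - out_theta, 0.0).sum()
      return 1.0 / (1.0 + np.exp(gamma * (total - theta)))
  ```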

  15. ACh Modulation of Recall

  a_i(t) = Σ_{j=1..n} [ (1 − C_L·Ψ)·L_ij − (1 − C_H·Ψ)·H^EC_ij ] · g(a^EC_j(t))
         + Σ_{k=1..n} [ (1 − C_R·Ψ)·R_ik − (1 − C_H·Ψ)·H^CA3_ik ] · g(a^CA3_k(t))
         − Σ_{l=1..n} (1 − C_H·Ψ)·H^CA1_il · g(a^CA1_l(t))

  C_L, C_R, C_H are coefficients of ACh modulation.
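  As a sketch, each pathway's weights can be scaled by its suppression factor. The coefficient values below are illustrative assumptions; C_R is set largest because cholinergic suppression is strongest in s. radiatum:

  ```python
  import numpy as np

  def g(x, theta=0.4):
      return np.maximum(x - theta, 0.0)

  def ca1_activation_ach(a_EC, a_CA3, a_CA1, L, R, H_EC, H_CA3, H_CA1,
                         psi, C_L=0.4, C_R=0.9, C_H=0.6):
      """CA1 activation with each pathway scaled by (1 - C * psi):
      high ACh (psi near 1) suppresses transmission, most strongly
      where the coefficient C is largest."""
      sL, sR, sH = 1 - C_L * psi, 1 - C_R * psi, 1 - C_H * psi
      return ((sL * L - sH * H_EC) @ g(a_EC)
              + (sR * R - sH * H_CA3) @ g(a_CA3)
              - sH * H_CA1 @ g(a_CA1))
  ```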

  16. ACh Modulation of Learning

  R_ik(t+1) = R_ik(t) + η·[1 − C_η·(1 − Ψ)] · [ (a^CA1_i(t) − Ω) · g(a^CA3_k(t)) − λ·[1 − C_λ·(1 − Ψ)] · a^CA1_i(t) · R_ik(t) ]

  C_η, C_λ are coefficients of ACh modulation.
  Note: the output threshold θ in g(·) is also reduced by a factor (1 − C_θ·Ψ). This simulates ACh suppression of neuronal adaptation.
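  Under one reading of this slide, the learning and decay terms are scaled by factors of the form (1 − C·(1 − Ψ)), so both run at full strength when ACh is high, and the output threshold is reduced by (1 − C·Ψ). A sketch under those assumptions, with illustrative coefficient values:

  ```python
  import numpy as np

  def update_weights_ach(R, a_CA1, a_CA3, psi, eta=0.1, omega=0.2, lam=0.04,
                         theta=0.4, C_eta=0.6, C_lam=0.6, C_theta=0.6):
      """Learning and decay run at full strength when ACh is high
      (psi near 1); the output threshold drops with ACh, mimicking
      suppression of neuronal adaptation."""
      m_eta = 1 - C_eta * (1 - psi)        # learning-rate modulation
      m_lam = 1 - C_lam * (1 - psi)        # decay-rate modulation
      g_pre = np.maximum(a_CA3 - theta * (1 - C_theta * psi), 0.0)
      hebb = np.outer(a_CA1 - omega, g_pre)
      decay = lam * m_lam * a_CA1[:, None] * R
      return R + eta * m_eta * (hebb - decay)
  ```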

  17. What Do These Terms Look Like?
  [Figure: plots of the modulation terms (1 − C_η·(1 − Ψ)) and (1 − C_λ·(1 − Ψ)) as functions of Ψ.]

  18. Train / Test
  [Figure: recovery from weight decay caused by recall of pattern 2.]

  19.
  ● Weak suppression in s. rad. and none in s. l.-m. Result: unwanted learning causes memory interference.
  ● Strong suppression in s. rad. and also in s. l.-m. Result: retrieval fails.

  20. Larger Simulation. Learned 5 patterns. After learning, CA3 input is sufficient to recall the patterns.

  21. Memory Performance
  A is the CA1 output; B is the corresponding EC pattern (the teacher). For perfect memory, A = B.
  Recall that A·B = ‖A‖·‖B‖·cos θ, so the normalized dot product A·B / (‖A‖‖B‖) = cos θ, which equals 1 for perfect memory.
  C_i is some other training pattern (M patterns in all).

  Performance P = A·B / (‖A‖‖B‖) − (1/M)·Σ_{i=1..M} A·C_i / (‖A‖‖C_i‖)

  The subtracted term is the average overlap with all stored patterns.
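  The performance measure translates directly to code (the function name is my own):

  ```python
  import numpy as np

  def performance(A, B, Cs):
      """P = cos(A, B) - mean_i cos(A, C_i): similarity to the correct
      EC pattern minus average overlap with the stored patterns."""
      cos = lambda u, v: (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
      return cos(A, B) - np.mean([cos(A, C) for C in Cs])
  ```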

  22. C_L vs. C_R Parameter Space
  ● Performance is plotted on the z axis.
  ● The grey line shows C_L = C_R.
  ● The white line shows the dose-response curve from the carbachol experiment.

  23. Comparison with the Marr Model
  ● Distinguishing learning vs. recall:
  – Marr assumed recall would always use small subpatterns, perhaps one tenth the size of a full memory pattern: not enough activity to trigger learning.
  – Hasselmo assumes that unfamiliar patterns only weakly activate CA1, and that this leads to elevated ACh, which enhances learning.
  ● Input patterns:
  – Marr assumes inputs are sparse and random, so nearly orthogonal.
  – Hasselmo's simulations use small vectors, so there is substantial overlap between patterns; ACh modulation is used to suppress interference.

  24. A Model of Episodic Memory

  25. ACh Prevents Overlap with Previously Stored Memories from Interfering with Learning

  26. Simulation of ACh Effects
  [Figure: network with 10 input neurons, 2 inhibitory neurons, and 1 ACh signal.]

  27. Episodic Memory Simulation
  ● Each layer contains both Context and Item units.
  ● Train on a list of 5 patterns.
  ● During recall, supply only the context.
  ● An adaptation process causes recalled items to eventually fade so that another item can become active.

  28. “Consolidation”
  ● Train the model on a set of 6 patterns.
  ● During consolidation, use free recall to train slow-learning recurrent connections in EC layer IV.
  ● After training, a partial input pattern (not shown) recalls the full pattern in the cortex layer.
  [Figure panels labeled “poor” and “good”.]

  29. Summary
  ● Unwanted recall of old patterns can interfere with storing new ones.
  ● The hippocampus must have some way of preventing this interference.
  ● Cholinergic modulation in CA1 (and also CA3) affects both synaptic transmission and LTP.
  ● Acetylcholine may serve as the “novelty” signal:
  – Unfamiliar patterns → high ACh → learning.
  – Familiar patterns → low ACh → recall.
  ● CA1 might serve as a comparator of current EC input with reconstructed input from the CA3 projection, to determine pattern familiarity.
