

SLIDE 1

Cholinergic Modulation of the Hippocampus

Computational Models of Neural Systems

Lecture 2.5

David S. Touretzky September, 2007

SLIDE 2

09/20/07 Computational Models of Neural Systems 2

A Theory of Hippocampus

  • Suppose CA1 is a hetero-associator that learns:

– to mimic EC patterns, and
– to map CA3 patterns to learned EC patterns

  • Imagine a partial/noisy pattern in EC triggering a partial/noisy response in CA3, cleaned up by auto-association in CA3 recurrent collaterals.

  • CA1 could use the cleaned-up CA3 response to call up the complete, correct EC pattern.

What happens if recall isn't turned off during learning?
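The theory above can be sketched with toy binary associators (every pattern, size, and threshold here is invented for illustration, not taken from Hasselmo's model): a Hebbian auto-associator stands in for CA3 cleanup, and a Hebbian hetero-associator stands in for the mapping of CA3 patterns to learned EC patterns.

```python
# Toy Hebbian associators illustrating the cleanup-then-mapping idea.
def outer_hebb(pre, post):
    # One-shot Hebbian storage: w[i][j] = post[i] * pre[j]
    return [[p_i * q_j for q_j in pre] for p_i in post]

def recall(w, x, theta):
    # Each output unit fires if its summed input reaches threshold theta.
    return [1 if sum(row[j] * x[j] for j in range(len(x))) >= theta else 0
            for row in w]

ca3 = [1, 1, 0, 1, 0, 0]           # stored CA3 pattern (invented)
ec  = [0, 1, 1, 0]                 # corresponding EC pattern (invented)

W_auto   = outer_hebb(ca3, ca3)    # CA3 recurrent collaterals (auto-association)
W_hetero = outer_hebb(ca3, ec)     # CA3 -> CA1 mapping to learned EC patterns

cue       = [1, 0, 0, 1, 0, 0]     # partial/noisy CA3 response (one unit missing)
ca3_clean = recall(W_auto, cue, theta=2)          # cleanup completes the pattern
ec_out    = recall(W_hetero, ca3_clean, theta=2)  # call up the full EC pattern
```

With the partial cue, the recurrent weights restore the missing CA3 unit, and the hetero-associative weights then retrieve the complete EC pattern.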

SLIDE 3

Acetylcholine Effects (1)

Acetylcholine (ACh) has a variety of effects on HC:

  • Suppresses synaptic transmission in CA1:

– Mostly at Schaffer collaterals in stratum radiatum
– Less so for perforant path input in stratum lacunosum-moleculare

[Figure: patch clamp recording]

SLIDE 4

Effect of Carbachol

  • Carbachol is a cholinergic agonist.

  • Can use carbachol to test the effects of ACh.

  • It only activates metabotropic ACh receptors.

  • Brain slice recording experiments show that carbachol suppresses synaptic transmission in CA1.

[Figure: Experiments 1 and 2 — 46.0%, 90.7%, 54.6%, and 87.6% suppression]

SLIDE 5

Effect of Atropine

  • Atropine affects muscarinic-type ACh receptors, not nicotinic type.

  • Blocks the suppression of synaptic transmission by carbachol.

  • Therefore, cholinergic suppression in s. rad. and s. l.-m. is by muscarinic ACh receptors.

SLIDE 6

Summary of Blockade Experiments

SLIDE 7

Acetylcholine Effects (2)

Acetylcholine also:

  • Reduces neuronal adaptation in CA1 by suppressing voltage- and Ca2+-dependent potassium currents.

– This keeps the cells excitable.

  • Enhances synaptic modification in CA1

– possibly by affecting NMDA currents.

  • Activates inhibitory interneurons

– but decreases inhibitory synaptic transmission.

SLIDE 8

Hasselmo's Model: Block Diagram

[Block diagram: the medial septum sends ACh to the hippocampus via the fimbria/fornix; EC projects to CA3 and CA1 (pp = perforant path), and CA3 projects to CA1 (Sch = Schaffer collaterals)]

SLIDE 9

Hasselmo's Model

SLIDE 10

Initial CA1 Activation Function

ait = ∑

j=1 n

Lij − EC H ij⋅gEC a jt

∑

k=1 n

Rik − CA3 Hik⋅gCA3 akt

−∑

l=1 n

CA1 Hil⋅gCA1 alt

ait is activation of unit i at time t gx is a threshold function: maxx−0.4, 0 Lij is feedforward synaptic strength for s. lacunosum (EC input) Rij is feedforward synaptic strength for s. radiatum (CA3 input)

xx Hi _ is feedforward or feedback inhibition of CA1 from layer xx

g(x)
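A direct transcription of this activation rule in Python (the example network is a single CA1 unit with one unit per input layer; all weight and activity values are made up):

```python
def g(x, theta=0.4):
    # Threshold-linear output function from the slide: max(x - 0.4, 0).
    return max(x - theta, 0.0)

def ca1_activation(L, H_ec, R, H_ca3, H_ca1, a_ec, a_ca3, a_ca1):
    # a_i(t): EC and CA3 excitation net of inhibition, minus CA1 feedback
    # inhibition, with every input activity passed through g.
    return [
        sum((L[i][j] - H_ec[i][j]) * g(a_ec[j]) for j in range(len(a_ec)))
        + sum((R[i][k] - H_ca3[i][k]) * g(a_ca3[k]) for k in range(len(a_ca3)))
        - sum(H_ca1[i][l] * g(a_ca1[l]) for l in range(len(a_ca1)))
        for i in range(len(L))
    ]

# One CA1 unit, one unit per input layer (illustrative values):
a = ca1_activation(L=[[1.0]], H_ec=[[0.2]], R=[[0.5]], H_ca3=[[0.1]],
                   H_ca1=[[0.3]], a_ec=[1.0], a_ca3=[1.0], a_ca1=[0.0])
```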

SLIDE 11

S. Radiatum Learning Rule

  • Note: the only learning in this model is in R_{ik}, the weights on the CA3→CA1 connections. Two factors:

– Linear potentiation when pre- and post-synaptic cells are simultaneously active.
– Exponential decay whenever the postsynaptic cell is active.

R_{ik}(t+1) = R_{ik}(t) + \eta \left[ (a^{CA1}_i(t) - \Omega) \, g(a^{CA3}_k(t)) - (a^{CA1}_i(t) - \Omega) \, \lambda \, R_{ik}(t) \right]

Ω is the synaptic modification threshold for LTP to occur.
η is the overall learning rate.
λ is the synaptic decay rate.

SLIDE 12

Learning Rule: Hebbian Facilitation Plus Synaptic Decay

[Figure: presynaptic and postsynaptic activity traces illustrating weight facilitation and decay]

SLIDE 13

Exponential Weight Decay

dx/dt = -\lambda x
x(t+1) = x(t) - \lambda x(t) = x(t)(1 - \lambda)
x(t+n) = x(t)(1 - \lambda)^n

Example: λ = 0.04, so x(t+1) = 0.96 x(t).
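The closed form can be checked numerically: iterating the one-step rule n times reproduces (1 − λ)^n.

```python
lam = 0.04           # decay rate from the slide's example
x = 1.0
for _ in range(10):  # apply x(t+1) = (1 - lam) * x(t) ten times
    x *= 1 - lam
# x now equals (1 - lam) ** 10, the closed-form n-step decay
```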

SLIDE 14

Control of Cholinergic Modulation

  • Cholinergic modulation Ψ was controlled by the amount of activity in CA1.

  • This is an inverted sigmoid activation function of the form 1 − 1/(1 + exp(x)):

– With no CA1 activity, Ψ is close to 1.
– With maximal CA1 activity, Ψ is close to 0.

\Psi = \left[ 1 + \exp\left( \gamma \left( \sum_{i=1}^{n} g(a^{CA1}_i) - \varphi \right) \right) \right]^{-1}

γ is a gain parameter.
φ is a threshold value.
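A sketch of this control signal in Python (the γ and φ values are illustrative guesses; the 0.4 output threshold in g comes from the earlier activation slide):

```python
import math

def psi(ca1_acts, gamma=1.0, phi=5.0, theta=0.4):
    # Inverted sigmoid of total (thresholded) CA1 activity:
    # no activity -> psi near 1 (high ACh); high activity -> psi near 0.
    total = sum(max(a - theta, 0.0) for a in ca1_acts)
    return 1.0 / (1.0 + math.exp(gamma * (total - phi)))
```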

SLIDE 15

ACh Modulation of Recall

ait =

j=1 n

1−C LLij−1−C H Hij⋅gEC a jt

∑

k=1 n

1−C R Rij−1−C HHik⋅gCA3 akt

−∑

l=1 n

1−C HH il⋅gCA1 alt C L ,C R ,C H are coefficients of ACh modulation.

SLIDE 16

ACh Modulation of Learning

Rikt1 = ⋅ [1−C1−] ⋅

[CA1 ait − 1−C

⋅gCA3 ait − CA1 ait − 1−C ⋅Rikt]  Rikt C ,C are coefficients of ACh modulation. Note: output threshold  in g⋅ is also reduced by 1−C. This simulates ACh suppression of neuronal adapation.

SLIDE 17

What Do These Terms Look Like?

[Plots of the modulation terms Ψ, (1 − Ψ), C(1 − Ψ), and 1 − C(1 − Ψ)]

SLIDE 18

[Figure: Train and Test phases — recovery from weight decay caused by recall of pattern 2]
SLIDE 19

  • Weak suppression in s. rad. and none in s. l.-m. Result: unwanted learning causes memory interference.

  • Strong suppression in s. rad. and also in s. l.-m. Result: retrieval fails.

SLIDE 20

Larger Simulation

Learned 5 patterns. After learning, CA3 input is sufficient to recall the patterns.

SLIDE 21

Memory Performance

A is the CA1 output; B is the corresponding EC pattern (the teacher). For perfect memory, A = B.

Recall that A·B = ‖A‖ ‖B‖ cos θ.

Normalized dot product: A·B / (‖A‖ ‖B‖) = cos θ = 1 for perfect memory.

C_i is some other training pattern.

Performance: P = \frac{A \cdot B}{\|A\| \|B\|} - \frac{1}{M} \sum_{i=1}^{M} \frac{A \cdot C_i}{\|A\| \|C_i\|}

The second term is the average overlap with all stored patterns.
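A literal implementation of this measure (the patterns in the example are made up):

```python
import math

def norm_dot(a, b):
    # Normalized dot product = cos(theta) between the two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def performance(A, B, others):
    # Overlap with the teacher pattern minus the average overlap
    # with the other stored patterns.
    return norm_dot(A, B) - sum(norm_dot(A, C) for C in others) / len(others)

# Perfect recall of an orthogonal pattern set scores 1:
P = performance([1, 0, 1, 0], [1, 0, 1, 0], others=[[0, 1, 0, 1]])
```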

SLIDE 22

CL vs. CR Parameter Space

  • Performance is plotted on the z axis.

  • Grey line shows C_L = C_R.

  • White line shows the dose-response plot from the carbachol experiment.

SLIDE 23

Comparison with Marr Model

  • Distinguishing learning vs. recall:

– Marr assumed recall would always use small subpatterns, perhaps one tenth the size of a full memory pattern. Not enough activity to trigger learning.
– Hasselmo assumes that unfamiliar patterns only weakly activate CA1, and that leads to elevated ACh, which enhances learning.

  • Input patterns:

– Marr assumes inputs are sparse and random, so nearly orthogonal.
– Hasselmo's simulations use small vectors, so there is substantial overlap between patterns. Uses ACh modulation to suppress interference.

SLIDE 24

A Model of Episodic Memory

SLIDE 25

ACh Prevents Overlap w/Previously Stored Memories from Interfering with Learning

SLIDE 26

Simulation of ACh Effects

The network has 10 input neurons, 2 inhibitory neurons, and 1 ACh signal.

SLIDE 27

Episodic Memory Simulation

  • Each layer contains both Context and Item units.

  • Train on a list of 5 patterns.

  • During recall, supply only the context.

  • An adaptation process causes recalled items to eventually fade so that another item can become active.

SLIDE 28

“Consolidation”

Train model on a set of 6 patterns.

During consolidation, use free recall to train slow-learning recurrent connections in EC layer IV. After training, a partial input pattern (not shown) recalls the full pattern in the cortex layer.

[Figure: recall results labeled "poor" and "good"]

SLIDE 29

Summary

  • Unwanted recall of old patterns can interfere with storing new ones.

  • The hippocampus must have some way of preventing this interference.

  • Cholinergic modulation in CA1 (and also CA3) affects both synaptic transmission and LTP.

  • Acetylcholine may serve as the "novelty" signal:

– Unfamiliar patterns → high ACh → learning
– Familiar patterns → low ACh → recall

  • CA1 might serve as a comparator of current EC input with reconstructed input from the CA3 projection, to determine pattern familiarity.