Putting vision into context: Influence of behaviour and context on sensory processing - PowerPoint PPT Presentation



SLIDE 1

Putting vision into context:
Influence of behaviour and context on sensory processing

Sonja Hofer
Sensory Systems Module PhD course, 22/10/2019

SLIDE 2

Classical view of hierarchical feed-forward visual processing

SLIDE 3

Problems with the hierarchical feed-forward model

  • Most properties of the environment cannot be directly deduced from sensory input
  • Analyzing complex visual scenes requires a model of the world

SLIDE 4

Our model of the world shapes our perception

SLIDE 5

Our model of the world shapes our perception

SLIDE 6

Our model of the world shapes our perception

SLIDE 7

Effect of context on perception:

SLIDE 8

Effect of context on perception:

SLIDE 9

Integration of sensory and contextual ‘top-down’ signals

Diagram: the visual system integrates incoming visual information with expectations/beliefs, actions, knowledge, and behavioural relevance.

SLIDE 10

Integration of sensory and contextual ‘top-down’ signals

Diagram: visual pathway from the eye via the dLGN (thalamus) to V1 and on to higher visual areas and association areas (cortex), with top-down cortical inputs, higher-order thalamic inputs via the pulvinar, the superior colliculus, and neuromodulation.

SLIDE 11

Outline

  • Neuronal signals related to attention and reward expectation
  • Behavioural relevance & learning
  • Motor signals in sensory cortex
  • Bayesian inference and predictive coding

SLIDE 12

Modulation of sensory responses by attention

Spatial attention (Top-down)

Figure: attentional modulation of responses in V1, V2 and V4 (Buffalo et al., 2009)

SLIDE 13

Modulation of sensory responses by attention

Object-based attention

Curve-tracing task (Roelfsema et al., 1998)

SLIDE 14

Modulation of sensory responses by reward expectation

Attention or reward expectation?

Adapted curve-tracing task (Stănişor et al., 2013). Figure: normalized V1 response as a function of relative reward value.

SLIDE 15

Changes of sensory responses during learning

How do responses to visual stimuli change as they become behaviourally relevant to an animal?

SLIDE 16

Changes of sensory responses during learning

Visual discrimination task in virtual reality (Adil Khan)

Approach corridor → grating corridors → reward zone
Vertical grating: rewarded (drop of soya milk)
Angled grating (40°): non-rewarded

SLIDE 17

Trained mouse performing the task

Head-fixed mouse on a cylinder, running through a virtual corridor (only half of virtual reality visible)

SLIDE 18

Implantation of a chronic cranial window:

Access to the cortex for chronic recordings

Holtmaat et al., 2009

SLIDE 19

Two-photon calcium imaging of GCaMP calcium indicators

GCaMP6-expressing neurons in visual cortex (V1)
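The responses on the following slides are reported as ΔF/F, the change in fluorescence relative to a baseline F0. A minimal sketch of that normalization, assuming (as one common convention, not necessarily the pipeline used here) that F0 is estimated as a low percentile of the raw trace:

```python
import numpy as np

def delta_f_over_f(trace, baseline_percentile=10):
    """Normalize a raw fluorescence trace to dF/F.

    F0 is estimated as a low percentile of the trace; real pipelines
    often use a running percentile and neuropil correction instead.
    """
    f0 = np.percentile(trace, baseline_percentile)
    return (trace - f0) / f0

# Toy trace: baseline fluorescence ~1.0 with one calcium transient
trace = np.array([1.0, 1.0, 1.1, 3.0, 2.0, 1.2, 1.0, 1.0])
dff = delta_f_over_f(trace)   # peak of the transient is dF/F = 2.0
```

On this scale, a "50% ΔF/F" scale bar corresponds to a 50% increase in fluorescence over baseline.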

SLIDE 20

In vivo two-photon calcium imaging during the discrimination task

Video: trained mouse performing the task; neurons in visual cortex expressing GCaMP6; eye position (playback speed 2.5x)

SLIDE 21

Neuronal responses to task-relevant stimuli

Figure: example cell response to the grating corridors. Trial-by-trial and trial-averaged responses (ΔF/F) aligned to grating onset, for the rewarded (vertical) and non-rewarded (angled) gratings.

SLIDE 22

Neuronal responses to task-relevant stimuli

Figure: responses (ΔF/F) of four example cells to the vertical and angled gratings across training (days 1, 2, 5 and 6).

SLIDE 23

Relationship between behavioural and neuronal performance

Figure: behavioural discrimination (d′) and neuronal population discrimination across training sessions for individual mice (M2, M5, M7).

Poort, Khan et al., Neuron 2015
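Behavioural discrimination here is measured as d′, the separation between hit rate and false-alarm rate in z-units. A stdlib-only sketch (the session numbers below are hypothetical, not the paper's data):

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """d' = z(hit rate) - z(false-alarm rate); higher = better discrimination."""
    z = NormalDist().inv_cdf   # inverse of the standard normal CDF
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical late-training session: licks on 84% of rewarded (vertical)
# grating trials, but on only 16% of non-rewarded (angled) grating trials
d = d_prime(0.84, 0.16)   # roughly 2.0; chance performance gives d' = 0
```

In practice, rates of exactly 0 or 1 must be clipped before the z-transform, since the inverse CDF diverges there.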

SLIDE 24

Neuronal changes with learning

Figure: population selectivity over time, grouped by behavioural performance (d′) bins (−1.0 to −0.1, N=19 sessions; −0.1 to 0.8, N=8; 0.8 to 1.7, N=8; 1.7 to 2.7, N=26; 2.7 to 3.6, N=17; 3.6 to 4.5, N=15).

The visual cortex gets better at distinguishing the two task-relevant stimuli, tightly correlated with behavioural performance.

Learning may increase the salience of task-relevant visual information to better inform behavioural decisions.

Poort, Khan et al., Neuron 2015
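One common way to quantify selectivity between the two gratings is a d′-like index: the difference of trial-averaged responses normalized by the pooled standard deviation. This is an illustrative convention only, not necessarily the exact estimator used in the paper, and the trial values are made up:

```python
import numpy as np

def selectivity(resp_a, resp_b):
    """d'-like selectivity between trial responses to two stimuli:
    (mean_a - mean_b) / pooled standard deviation."""
    mu_a, mu_b = np.mean(resp_a), np.mean(resp_b)
    pooled_sd = np.sqrt(0.5 * (np.var(resp_a) + np.var(resp_b)))
    return (mu_a - mu_b) / pooled_sd

# Toy per-trial dF/F responses before and after learning (hypothetical):
# after learning, responses to the rewarded grating strengthen
early = selectivity([1.0, 1.2, 0.8, 1.1], [0.9, 1.1, 1.0, 1.2])
late = selectivity([2.0, 2.2, 1.8, 2.1], [0.9, 1.1, 1.0, 1.2])
```

A selectivity near zero means the two stimuli evoke indistinguishable responses; it grows as the response distributions separate.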

SLIDE 25

Switching between visual and olfactory discrimination task

VISUAL BLOCK: run-up → vertical grating (rewarded) or angled grating (non-rewarded)
OLFACTORY BLOCK: run-up → odour 1 or odour 2, with an irrelevant grating (vertical or angled) also shown

SLIDE 26

Switching between visual and olfactory discrimination task

Mice switch between a visual and an olfactory task (the same visual stimuli are shown but ignored).

Figure: population selectivity over time, visual blocks (visual stimulus relevant) vs. olfactory blocks (visual stimulus irrelevant), N=11 sessions.

Neurons in V1 are more selective when visual stimuli are relevant.

Poort, Khan et al., Neuron 2015

SLIDE 27

Modulation of sensory responses by task demands

Task-dependent changes in auditory cortex receptive fields (Fritz et al., 2003)

STRF: spectrotemporal receptive field. Figure: average change in the response field, passive listening vs. during the task.

Sensory response properties are not fixed but reflect behavioural demands!

SLIDE 28

Motor signals in sensory areas

Electrophysiological recordings in primary visual cortex of head-fixed, running mice (Niell & Stryker, 2010)

Visual responses in V1 are increased during locomotion

SLIDE 29

Motor signals in sensory areas

Circuit mechanisms of locomotion-related signals in visual cortex? (Lee et al., 2014)

MLR: mesencephalic locomotor region

SLIDE 30

Motor signals in sensory areas

Circuit mechanisms of locomotion-related signals in visual cortex? (Pakan et al., 2016)

Complex networks! → Modelling (Del Molino et al., 2017; Fu et al., 2014)

SLIDE 31

Motor signals in sensory areas

Origin of motor signals? Anterior cingulate cortex (+ secondary motor cortex)? (Leinweber et al., 2017)

SLIDE 32

Motor signals in sensory areas

Origin of motor signals? Thalamus?

Diagram: visual pathway from the eye via the LGN (lateral geniculate nucleus, thalamus) to V1 (primary visual cortex) and on to higher visual areas and association areas, alongside the pulvinar (lateral posterior nucleus, LP) and the superior colliculus. Image: LGN and pulvinar axons in V1 (scale bar 100 µm).

SLIDE 33

Imaging activity of thalamic projections in cortical areas

Expression of a calcium indicator in pulvinar (LP) or LGN; intrinsic signal imaging to determine the position of visual areas (V1, LM, AL, PM, AM); two-photon imaging of thalamic projections in V1. Image: pulvinar axons in V1, with example bouton traces (ΔF/F).

SLIDE 34

In vivo two-photon calcium imaging of thalamic axons and boutons in layer 1 of V1 (scale bar 15 µm; playback speed 5x)

SLIDE 35

Visuo-motor ‘task’: imaging activity of thalamic projections in V1

  • Trained to run through virtual corridor
  • Running uncoupled from visual flow

SLIDE 36

Visuo-motor signals in thalamic boutons in V1

Figure: example dLGN and LP/pulvinar bouton responses (ΔF/F) as a function of visual flow speed (VF) and running speed (RS), and the proportion of boutons highly informative about running speed vs. visual flow in dLGN and pulvinar.

Roth, Dahmen, Muir et al., Nature Neuroscience 2016

SLIDE 37

Motor signals in sensory areas

Motor signals seem to dominate neuronal activity across the cortical surface (Musall et al., bioRxiv 2018)

Widefield calcium imaging of cortical activity during a simple spatial discrimination task

SLIDE 38

Motor signals in sensory areas

Just gain control? No! Activity of excitatory cells in visual cortex is modulated in the dark and carries detailed running-speed information (Erisken et al., 2014; Saleem et al., 2013)

SLIDE 39

Motor signals in sensory areas

Motor signals as efference copy?

SLIDE 40

Integration of sensory and contextual ‘top-down’ signals

Diagram: the visual system integrates incoming visual information with expectations/beliefs, actions, knowledge, and behavioural relevance.

SLIDE 41

The importance of predictions for sensory perception

During eye or head movements, the motor system sends an efference copy of the motor command (information about the body’s own movement), from which predicted visual feedback is generated; a difference calculator compares the actual visual feedback with this prediction, and the result is the visual discrepancy.
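The difference calculator above can be sketched as a comparator: an efference copy of the motor command feeds a forward model whose output is subtracted from the actual visual feedback. The linear forward model below is a hypothetical stand-in for whatever visuo-motor mapping the animal has learned:

```python
def visual_discrepancy(motor_command, visual_feedback, forward_model):
    """Compare actual visual feedback against the feedback predicted
    from an efference copy of the motor command."""
    efference_copy = motor_command            # copy of the outgoing command
    predicted_feedback = forward_model(efference_copy)
    return visual_feedback - predicted_feedback

# Hypothetical learned mapping: running at speed v predicts visual flow v
forward_model = lambda v: v

self_generated = visual_discrepancy(10.0, 10.0, forward_model)  # 0.0: feedback as predicted
mismatch = visual_discrepancy(10.0, 14.0, forward_model)        # 4.0: prediction error
```

A non-zero output corresponds to the mismatch (prediction error) responses discussed on the predictive-coding slides.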

SLIDE 42

Predictive Coding and Bayesian Inference

Kant, Helmholtz, … Friston, Clark, Mumford, Olshausen

Hierarchical Bayesian inference: prior × sensory input (likelihood) → posterior (probability of the model given the data)
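The prior/likelihood/posterior relation, worked as a toy numeric example over two discrete hypotheses (the numbers are made up for illustration):

```python
def posterior(prior, likelihood):
    """Bayes' rule over discrete hypotheses:
    posterior is proportional to prior * likelihood, normalized to sum to 1."""
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Two interpretations of an ambiguous image: the prior (our model of the
# world) favours 'face', while the sensory evidence weakly favours 'vase'
prior = [0.8, 0.2]          # P(face), P(vase)
likelihood = [0.4, 0.6]     # P(input | face), P(input | vase)
post = posterior(prior, likelihood)   # prior still dominates: P(face) ~ 0.73
```

This is why the prior can dominate perception when sensory evidence is ambiguous, as in the earlier "our model of the world shapes our perception" examples.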

SLIDE 43

Predictive coding framework

Experimental evidence for predictive coding in cortical circuits:

  • A subset of neurons in V1 shows strong mismatch (prediction error) responses (Keller et al., 2012)
  • Mismatch responses are dependent on experience of visuo-motor coupling (Attinger et al., 2017)

SLIDE 44

Predictive coding framework

Potential circuit for mismatch computation in visual cortex: muscimol inactivation of the anterior cingulate cortex (ACC) weakens the mismatch response in V1 (Attinger et al., 2017; Leinweber et al., 2017)

SLIDE 45

Predictive coding framework

Potential circuit for mismatch computation in visual cortex: somatostatin (SOM) interneurons are most strongly driven by visual flow, and optogenetic manipulation of SOM neurons alters the mismatch response (consistent with the model, but not proof of it) (Attinger et al., 2017)

SLIDE 46

Predictive coding framework

Spatial prediction and prediction error signals in visual cortex

  • Some V1 neurons become selective to spatial location
  • Some V1 neurons start firing in expectation of visual stimuli

Fiser et al., 2016

SLIDE 47

Predictive coding framework

Spatial prediction and prediction error signals in visual cortex

Strong response in V1 when an expected visual stimulus is omitted

Fiser et al., 2016

SLIDE 48

Predictive Coding and Bayesian Inference

Kant, Helmholtz, … Friston, Clark, Mumford, Olshausen

Hierarchical Bayesian inference: prior × sensory input (likelihood) → posterior (probability of the model given the data)

SLIDE 49

Feed-back projections

  • What is the role of feed-back projections?
  • How does feed-back influence the target area?
  • How do cortical areas communicate? How dynamic is this communication?

What is computed where?

SLIDE 50

A causal measure of effective inter-areal connectivity

ChR2 in PV interneurons is used to silence V1 or LM (a higher visual area) during a 500 ms stationary visual stimulus, probing feed-forward and feed-back influences. Figure: spike rates of an example LM neuron and of the V1 population, laser vs. control, around stimulus onset.

SLIDE 51

Influence of feed-forward vs feed-back projections

Feed-forward (FF, V1 → LM): silencing V1, effect measured in LM. Feed-back (FB, LM → V1): silencing LM, effect measured in V1. Figure: distributions of the silencing effect (%) across cells for both directions.

SLIDE 52

Feed-back suppresses responses and increases selectivity

Figure: average V1 population response (spike rate, laser vs. control; Go stimulus, silencing at 60 ms) and response selectivity in V1 (Go − Nogo, 80–120 ms after stimulus onset) in naïve vs. trained mice, with and without LM silencing.

  • Feed-back influence is strongest when visual information is most relevant
  • Feed-back increases selectivity in V1 after learning by suppressing responses, consistent with the predictive coding framework

SLIDE 53

Summary

  • Sensory processing is highly dynamic, allowing animals to flexibly access and process sensory information according to their current perceptual and behavioural demands
  • It is still unclear to what degree top-down predictions influence or dominate sensory representations
  • Subcortical structures such as the superior colliculus, thalamus, cerebellum and the basal ganglia might also be important for shaping cortical information flow and integrating sensory and internal information
  • The sources of different internal signals are mostly still unknown, and we are only starting to determine the circuit mechanisms by which some of these signals are integrated with sensory information
  • “Sensory” cortical areas are strongly influenced by context and behaviour

SLIDE 54

Further reading

Khan A, Hofer SB. Contextual signals in visual cortex. Current Opinion in Neurobiology, 2018, 52:131.
Gilbert CD, Li W. Top-down influences on visual processing. Nature Reviews Neuroscience, 2013.
Maunsell JHR. Neuronal mechanisms of visual attention. Annu. Rev. Vis. Sci., 2015.
Keller GB, Mrsic-Flogel TD. Predictive processing: a canonical cortical computation. Neuron, 2018.
Olshausen BA. Perception as an inference problem. In: The Cognitive Neurosciences, MIT Press, 2013.