Neurodynamics of expression coding in the core face network



SLIDE 1

Neurodynamics of expression coding in the core face network

Yuanning Li, Michael J. Ward, Witold J. Lipski, R. Mark Richardson, and Avniel Singh Ghuman (Carnegie Mellon University and University of Pittsburgh)

SLIDE 2

Does neural activity in fusiform code for facial expression information?

  • Contradictory evidence and theories can be found in the literature about the coding in the fusiform face area (FFA).

Classical model (Haxby et al., 2000) vs. recently proposed model (Duchaine & Yovel, 2015):
  • FFA codes: invariant aspects of faces vs. general structural and shape information of faces
  • FFA contributes to expression recognition: No vs. Yes
  • Time of activity: not specified vs. ~170 ms after stim onset


SLIDE 5

Does neural activity in fusiform code for facial expression information?

  • Meta-analysis: 53 studies found on Neurosynth.org with whole-brain functional mapping and a comparison between emotions.
  • Only 14 of 53 report a significant contrast in the fusiform.


SLIDE 8

Research questions

  • Can facial expression information be decoded from the fusiform?
  • What are the spatiotemporal dynamics of such encoding in the fusiform?

Approach: intracranial EEG
  • 19 subjects, 29 electrodes recording directly from the human fusiform
  • a sensitive multivariate classification approach

SLIDE 10

Methods: intracranial EEG

  • 19 human epileptic patients
  • 29 fusiform electrodes selected
  • anatomical criterion: electrode located in the fusiform area
  • functional criterion: face sensitivity over other categories in the event-related potential (ERP) and broadband activity (BB)

[Figure: face-sensitive ERP and broadband (BB) responses, ~200 ms after stim onset; electrode locations shown on the left and right hemispheres]

SLIDE 12

Methods:

  • Cognitive task: gender discrimination
  • Stimuli: 40 individuals (20 male), 5 expressions (neutral, angry, happy, fear, sad)

  • Data analysis
  • sliding time window
  • multivariate pattern classification
  • consider both ERP and BB
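The sliding-window multivariate pattern classification above can be sketched as follows. The window and step sizes, sampling rate, and synthetic data are illustrative assumptions, not the authors' actual pipeline:

```python
# Minimal sketch of sliding-window multivariate classification (assumed
# parameters: 1 kHz sampling, 100 ms windows, 10 ms steps; synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_times = 200, 700                   # trials x time samples
X = rng.standard_normal((n_trials, n_times))   # single-electrode voltage traces
y = rng.integers(0, 2, n_trials)               # two expression labels (binary)

win, step = 100, 10                            # window and step in samples (assumed)
accs = []
for start in range(0, n_times - win + 1, step):
    feats = X[:, start:start + win]            # multivariate pattern = samples in window
    clf = LogisticRegression(max_iter=1000)
    accs.append(cross_val_score(clf, feats, y, cv=5).mean())
accs = np.array(accs)                          # one decoding accuracy per window
```

With labels unrelated to the data, the accuracies hover around chance (50%); with real iEEG data the curve would peak where expression information is present.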

SLIDE 13

Results: expression decoding

  • Mean binary expression classification accuracy across all fusiform electrodes
  • Peak accuracy of 52.34% at 190 ms after stim onset (p < 0.05, Bonferroni corrected)
[Figure: decoding accuracy (%) vs. time (ms); electrode locations shown on the left and right hemispheres]
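The Bonferroni-corrected significance criterion above can be sketched as follows; the number of windows and the p-values are synthetic assumptions for illustration:

```python
# Minimal sketch of Bonferroni correction across time windows (synthetic
# p-values; the window count is an assumption, not the authors' exact number).
import numpy as np

rng = np.random.default_rng(1)
n_windows = 61                              # assumed number of sliding windows
p_vals = rng.uniform(0.1, 1.0, n_windows)   # null-ish p-values (synthetic)
p_vals[19] = 1e-6                           # pretend the ~190 ms window is significant

alpha = 0.05
threshold = alpha / n_windows               # Bonferroni: divide alpha by number of tests
significant = p_vals < threshold            # boolean mask of significant windows
```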

SLIDE 14

Results: spatiotemporal dynamics

  • Picked the electrodes with significant facial expression decoding (permutation test): 17/29 electrodes show significant expression decoding.

[Figure: per-electrode decoding accuracy (%) vs. time (ms); electrode locations shown on the left and right hemispheres]
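The per-electrode permutation test above can be sketched as follows. The toy decoder and synthetic data are illustrative assumptions, not the authors' sliding-window classifier:

```python
# Minimal sketch of a label-permutation test: shuffle expression labels to
# build a null distribution of peak decoding accuracy (synthetic data).
import numpy as np

def peak_accuracy(X, y):
    # Stand-in for the sliding-window decoder: threshold each time point at
    # the midpoint of the two class means, then take the best time point.
    m0 = X[y == 0].mean(axis=0)
    m1 = X[y == 1].mean(axis=0)
    preds = X > (m0 + m1) / 2                      # trials x time, boolean
    correct = np.where(y[:, None] == 1, preds, ~preds)
    return correct.mean(axis=0).max()              # peak accuracy over time

rng = np.random.default_rng(2)
X = rng.standard_normal((120, 50))                 # trials x time points (synthetic)
y = rng.integers(0, 2, 120)

observed = peak_accuracy(X, y)
null = np.array([peak_accuracy(X, rng.permutation(y)) for _ in range(500)])
p_value = (np.sum(null >= observed) + 1) / (null.size + 1)  # conservative estimate
```

An electrode passes the test when its observed peak accuracy exceeds most of the null distribution.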

SLIDE 15

Research questions

  • Can facial expression information be decoded from the fusiform?
  • What are the spatiotemporal dynamics of such encoding in the fusiform?

Yes, fusiform activity encodes facial expressions

SLIDE 17

Results: spatiotemporal dynamics

  • No significant difference between the time courses of the left and right fusiform (left vs. right)

[Figure: decoding accuracy (%) vs. time (ms), left vs. right fusiform; electrode locations shown on the left and right hemispheres]

SLIDE 18

Results: spatiotemporal dynamics

  • Significant difference between the time courses of the posterior and anterior fusiform (posterior vs. anterior)

[Figure: decoding accuracy (%) vs. time (ms), posterior vs. anterior fusiform; *** marks the significant difference; electrode locations shown on the left and right hemispheres]

SLIDE 19

Results: spatiotemporal dynamics

  • Fusiform electrodes cluster into posterior and anterior clusters

[Figure: electrode y coordinate (mm) vs. peak decoding time (ms), posterior vs. anterior: an early posterior cluster and a late anterior cluster]

SLIDE 20

Research questions

  • Can facial expression information be decoded from the fusiform?
  • What are the spatiotemporal dynamics of such encoding in the fusiform?

Yes, bilateral fusiform activity encodes facial expressions

Posterior fusiform encodes expressions at the early stage.
Anterior fusiform encodes expressions at the late stage.

SLIDE 21

Discussion

  • Timing is an important factor in analyzing facial expression processing.
  • Early (100-200 ms): core processing, intrinsic coding of structural and general shape information.
  • Late (300-500 ms): reciprocal, more deliberative processing (Freiwald & Tsao, 2010).

Note: Ghuman et al. (2014) showed that the FFA encodes face category in the early stage and individual faces in the late stage.

SLIDE 22

Discussion

  • Spatial heterogeneity may explain the discrepancy in reports of expression encoding in the fusiform in the literature, esp. in fMRI studies.

[Figure: decoding accuracy (%) vs. time (ms), posterior vs. anterior fusiform; electrode locations shown on the left and right hemispheres]

SLIDE 23

Acknowledgments

Coauthors:

  • Dr. Avniel Singh Ghuman (UPMC, CNBC)
  • Dr. R. Mark Richardson (UPMC, CNBC)
  • Dr. Witold Lipski (UPMC)
  • Michael Ward (UPMC)

iEEG data collection and preprocessing:

  • EMU staff (UPMC Presbyterian)
  • Matthew Boring (CNUP, CNBC)
  • Ari Kappel (UPMC)

Institutions: Funding support:

SLIDE 24

Thank you!

SLIDE 25

Future directions

  • Identity × Expression
  • FFA encodes face individuation in the late stage (200-500 ms after stim. onset) (Ghuman et al., 2014)

[Figure: decoding accuracy (%) vs. time (ms) for identity and expression]

SLIDE 26

Future directions

  • What facial features underlie such spatiotemporal processing?

SLIDE 27

Methods: intracranial EEG

  • 19 human epileptic patients
  • 29 fusiform electrodes selected
  • anatomical criterion: electrode located in the fusiform area
  • functional criterion: face sensitivity over other categories in the event-related potential (ERP) and broadband activity (BB)

[Figure: face sensitivity (d') vs. time (ms); electrode locations shown on the left and right hemispheres]

SLIDE 28

Results: face sensitivity

  • Mean face sensitivity across all fusiform electrodes (face vs. non-face)

[Figure: face sensitivity (d') vs. time (ms), posterior vs. anterior fusiform; electrode locations shown on the left and right hemispheres]
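The face-sensitivity index d' plotted above can be sketched as follows; trial counts and response distributions are synthetic assumptions:

```python
# Minimal sketch of d' (d-prime): separation between face and non-face
# response distributions at each time point (synthetic data).
import numpy as np

rng = np.random.default_rng(3)
face = rng.normal(1.0, 1.0, (80, 60))       # face trials x time points
nonface = rng.normal(0.0, 1.0, (100, 60))   # non-face trials x time points

# d' = mean difference divided by the pooled standard deviation
pooled_sd = np.sqrt((face.var(axis=0, ddof=1) + nonface.var(axis=0, ddof=1)) / 2)
d_prime = (face.mean(axis=0) - nonface.mean(axis=0)) / pooled_sd
```

With a true mean separation of 1 and unit variance, d' fluctuates around 1 across time points; an electrode is face-sensitive where d' is reliably above 0.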

SLIDE 29

Results: face sensitivity

  • Mean face sensitivity across all fusiform electrodes (face vs. non-face), left vs. right

[Figure: face sensitivity (d') vs. time (ms), left vs. right fusiform; electrode locations shown on the left and right hemispheres]

SLIDE 30

Results: representational dissimilarity matrix (RDM)

early (50-250 ms) vs. late (250-450 ms)

[Figure: RDMs over the five expressions (AF, AN, HA, NE, SA) for bilateral, left, and right fusiform in the early and late windows; electrode locations shown on the left and right hemispheres]
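A classification-based RDM like the ones above can be sketched as follows. Treating pairwise decoding accuracy as dissimilarity is one common choice; the features and trial counts here are synthetic assumptions, not the actual data:

```python
# Minimal sketch of an RDM over five expressions: pairwise decoding accuracy
# as the dissimilarity between each pair of conditions (synthetic data).
import numpy as np
from itertools import combinations
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
labels = ["AF", "AN", "HA", "NE", "SA"]     # afraid, angry, happy, neutral, sad
X = rng.standard_normal((250, 40))          # trials x neural features
y = rng.integers(0, 5, 250)                 # expression index per trial

rdm = np.zeros((5, 5))
for i, j in combinations(range(5), 2):
    mask = np.isin(y, [i, j])               # trials of this expression pair
    acc = cross_val_score(LogisticRegression(max_iter=1000),
                          X[mask], y[mask] == j, cv=5).mean()
    rdm[i, j] = rdm[j, i] = acc             # higher accuracy = more dissimilar
```

Computing the RDM separately for early and late windows, and for each electrode subgroup, yields the matrix grids summarized in the figure.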

SLIDE 31

Results: representational dissimilarity matrix (RDM)

early (50-250 ms) vs. late (250-450 ms)

[Figure: RDMs over the five expressions (AF, AN, HA, NE, SA) for bilateral, anterior, and posterior fusiform in the early and late windows; electrode locations shown on the left and right hemispheres]