Asymmetries in brain organization for sign language. Karen Emmorey. PowerPoint PPT Presentation.



SLIDE 1

Asymmetries in brain organization for sign language Karen Emmorey San Diego State University

SLIDE 2

Overview

  • What determines asymmetries in the neural organization for language?
    – Is left-hemisphere dominance found for signed languages?
  • Asymmetries in neural organization specific to sign language
    – Facial expression perception
    – Spatial language
    – Sensory-motoric iconicity (sign vs. pantomime)
  • Anatomical asymmetries due to deafness or experience with sign language
    – Auditory cortices
    – Motor cortices

SLIDE 3

Some potential determinants of hemispheric asymmetries

  • Linguistic functions are left dominant
    – Sign languages exhibit the same linguistic structure as spoken languages (including phonology)
  • Spatial functions are right dominant
    – Sign languages depend on spatial contrasts at all linguistic levels
  • Rapid temporal processing is left dominant (P. Tallal)
    – Phonological transitions in sign are five times slower than in speech (200 ms vs. 40 ms)

SLIDE 4

Left hemisphere damage leads to sign language aphasia

[Figure: Rating Scale of Sign Characteristics (Melodic Line, Sign Finding, Articulatory Agility, Grammatical Form, Paraphasia in Running Sign, Sign Comprehension) for Left-Hemisphere Damaged Signers, Control Deaf Signers, and Right-Hemisphere Damaged Signers]

From Bellugi and Hickok (1995); Poizner et al. (1987)

SLIDE 5

BUT: Neville et al. (1998) PNAS

Left-lateralized activation for reading English; bilateral activation for viewing ASL

SLIDE 6

Comparing audio-visual speech and British Sign Language comprehension

[Figure: Watching a BSL signer vs. watching/listening to an English speaker]

MacSweeney et al. (2002) Brain

Bilateral activation for both sign and speech comprehension, but Left > Right

SLIDE 7

Auditory regions (including Wernicke’s area) are engaged during sign perception

[Figure: Regions of common activation for A-V English and BSL while watching a BSL signer]

MacSweeney et al. (2002) Brain

SLIDE 8

Sign Language Production: Picture Naming

[Figure: Left and right hemisphere activation for naming nouns, verbs, and prepositions; Broca’s area (left inferior frontal gyrus) highlighted]

Emmorey et al. (2002) NeuroImage; (2003) Neuropsychologia; (2004) Brain & Language

SLIDE 9

Left hemisphere activation for both right-handed and left-handed signing

Corina et al. (2003) Journal of Cognitive Neuroscience

SLIDE 10

The left hemisphere is dominant for both sign and speech

  • Classic left hemisphere language areas (e.g., Broca’s area and Wernicke’s area) are involved in sign language production and comprehension
  • Left lateralization is stronger for language production than comprehension for both speech & sign
  • Left hemisphere specialization for language is not tied to the properties of speech

SLIDE 11

The neural systems underlying facial expression recognition

Stephen McCullough, UCSD

SLIDE 12

Linguistic Facial Expressions

Grammatical markers:
  – t over PAPER: topic marker
  – q over PAPER: yes/no question marker
  – cond over RAIN, GO: conditional clause marker (“If it rains, we’ll leave.”)

Adverbial markers:
  – cs over NOW: “recently”
  – mm over WRITE: “easily”

Baker & Cokely, 1980

SLIDE 13

fMRI Study Design

  • Task: same / different judgments
  • Baseline task: same / different Gender
    – e.g., “Same”: Male / Male; “Different”: Female / Male
  • Experimental task: same / different Facial Expressions
    – e.g., Linguistic: MM / MM (“Same”); Emotional: Happy / Angry (“Different”)

McCullough, Emmorey, & Sereno (2005) Cognitive Brain Research

SLIDE 14

Stimuli: Face Only

Emotional Expressions: happy, sad, angry, surprised, disgust, fear

Linguistic Expressions: MM (“easily”), CS (“recently”), TH (“carelessly”), INT (“intense”), PUFF (“a lot”), PS (“smoothly”)

SLIDE 15

Stimuli: Face with Sign

  – Linguistic: MM + RUN (“run easily”)
  – Emotional: surprised + STUDY (“study (with surprise)”)
  – Baseline: neutral + DISCUSS (“discuss (no expression)”)

McCullough et al. (2005) Cognitive Brain Research

SLIDE 16
Deaf Subjects
  • 10 native ASL signers
  • 5 women, 5 men
  • Deaf from birth
  • Right-handed
  • Mean age = 29.4 years

Hearing Subjects
  • 10 non-signers
  • 5 women, 5 men
  • Normal hearing
  • Right-handed
  • Mean age = 24.2 years

McCullough et al. (2005) Cognitive Brain Research

SLIDE 17

Regions of Interest

  • Superior Temporal Sulcus (STS)
  • Fusiform Gyrus (FG)

[Figure: Right STS and right FG shown on left and right hemisphere views]

SLIDE 18

Hemisphere Laterality Index: Superior Temporal Sulcus

[Figure: Bar chart of laterality indices (positive = leftward bias, negative = rightward bias) for emotional vs. linguistic expressions in the Face Only and Face with Verb conditions, for deaf and hearing groups; * marks significant asymmetries]

McCullough et al. (2005) Cognitive Brain Research
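The laterality index plotted on these charts is not defined on the slide itself; a common convention in neuroimaging is LI = (L - R) / (L + R), where L and R are per-hemisphere activation measures (e.g., suprathreshold voxel counts, which is an assumption here, not something the slide states). A minimal sketch of that convention:

```python
def laterality_index(left: float, right: float) -> float:
    """Standard laterality index: (L - R) / (L + R).

    Returns +1 for fully left-lateralized activation, -1 for fully
    right-lateralized, and 0 for perfectly bilateral activation.
    The activation measure (voxel counts vs. beta weights) is an
    assumption; the slide does not specify it.
    """
    if left + right == 0:
        raise ValueError("no activation in either hemisphere")
    return (left - right) / (left + right)


# Hypothetical voxel counts for one region of interest:
print(laterality_index(300, 100))  # 0.5 -> leftward bias
print(laterality_index(100, 300))  # -0.5 -> rightward bias
```

On this convention, the bars above zero in the charts indicate left-hemisphere dominance for a condition and bars below zero indicate right-hemisphere dominance.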

SLIDE 19

Linguistic Facial Expressions: Face with Verb

[Figure: Activation in deaf and hearing groups for the linguistic facial expression condition vs. the gender baseline]

McCullough et al. (2005) Cognitive Brain Research

SLIDE 20

Hemisphere Laterality Index: Fusiform Gyrus

[Figure: Bar chart of laterality indices (positive = leftward bias, negative = rightward bias) for emotional vs. linguistic expressions in the Face Only and Face with Verb conditions, for deaf and hearing groups; * marks significant asymmetries]

McCullough et al. (2005) Cognitive Brain Research

SLIDE 21

Emotional Facial Expressions: Face Only

[Figure: Activation in deaf and hearing groups for the emotional facial expression condition vs. the gender baseline]

McCullough et al. (2005) Cognitive Brain Research

SLIDE 22

Summary: Facial Expression Perception

  • Greater left STS activation when linguistic facial expressions were in the obligatory verbal context
    – Left STS may integrate adverbial facial expressions with the manual verb
  • For signers, activation in STS was bilateral for emotional facial expressions
    – Emotional facial expressions play a linguistic role in narratives and in lexical emotion signs
  • For signers, neural activity within the fusiform gyrus was left-lateralized
    – Deaf signers have extensive experience analyzing local facial features (e.g., mouth configuration)

SLIDE 23

Neural systems underlying spatial language

SLIDE 24

Classifier constructions: The use of space to represent space

[Figure: HOUSE located here; BIKE located here]

“The bike is near the house.”

Emmorey (2003)

SLIDE 25

Positron Emission Tomography (PET) Studies of Spatial Language Production

  • Deaf native signers (congenitally deaf)
  • Hearing native signers (Deaf parents)
  • Hearing monolingual English speakers

University of Iowa: Thomas Grabowski, Hanna Damasio, Richard Hichwa, Laurie Ponto UCSD: Stephen McCullough

SLIDE 26

Stimuli

  – Locative classifier construction vs. lexical preposition (“next to”, “in”)
  – Figure object noun (“brush”, “paint brush”)

SLIDE 27

Locative classifier constructions vs. ASL prepositions

[Figure: Left hemisphere view]

Classifier constructions activated left inferotemporal (IT) cortex

Emmorey et al. (2002) NeuroImage

SLIDE 28

Locative Constructions vs. ASL Nouns

Emmorey et al. (2002; 2005) NeuroImage

[Figure: Hearing signers and deaf signers]

Left parietal activation for both groups; right parietal activation for both groups

SLIDE 29

English Prepositions vs. Nouns

Emmorey et al. (2005) NeuroImage, H. Damasio et al. (2001) NeuroImage

[Figure: Hearing signers and monolingual speakers]

Left parietal activation for both groups; right parietal activation only for ASL-English bilinguals

SLIDE 30

Summary: Spatial Language

  • Classifier constructions engage regions in left inferior temporal cortex implicated in the retrieval of names for concrete objects
    – Handshape encodes information about object type
  • The production of ASL locative classifier constructions uniquely engages regions within right parietal cortex
    – Signing space is used to represent spatial relationships
  • ASL-English bilinguals recruit right parietal cortex when producing English prepositions
    – Bimodal bilinguals may analyze spatial relationships for encoding in ASL, even when speaking English

SLIDE 31

Motor-iconicity and the neural systems underlying tool and action naming

SLIDE 32

Sensory-motoric iconicity: Signs for tools and tool-based actions

TURN-KEY SPRAY-PAINT TELEPHONE

SLIDE 33

Picture Naming Task

  – BRUSH-HAIR: handling verb (actions with an implement)
  – SLEEP: non-pantomimic verb (actions without an implement)
  – SCISSORS: “pantomimic” noun (tools)

SLIDE 34

Naming tools & tool-actions activates a premotor-parietal network for sign and speech

[Figure: Left premotor cortex and left inferior parietal lobule, left and right hemisphere views]

Emmorey et al. (2004) Brain & Language

SLIDE 35

Activation in Broca’s area

[Figure: Handling verbs, pantomimic nouns, non-pantomimic verbs]

Emmorey et al. (2004) Brain & Language

SLIDE 36

Handling verbs minus non-pantomimic verbs

[Figure: Left and right hemisphere views]

Subtraction revealed no significant differences in neural activation for the production of motorically iconic signs versus non-pantomimic signs

Emmorey et al. (2004) Brain & Language

SLIDE 37

Pantomime substitutions for ASL signs by an aphasic signer

Corina et al. (1992)

[Figure: ASL sign vs. WL’s pantomime]

SLIDE 38

Poor sign recognition versus good pantomime recognition

[Figure: Percent correct (50–100%) for sign vs. pantomime recognition, deaf controls vs. WL]

Corina et al. (1992)

SLIDE 39

Conclusions

  • Naming tools or tool-based actions engages a left premotor-parietal cortical network for both signers and speakers
    – Left premotor-parietal regions encode (embodied) tool-use semantics
  • The pantomimic iconicity of ASL handling classifier verbs has no effect on the neural systems that underlie their production
    – Sign and pantomime production depend upon partially segregated neural systems

SLIDE 40

A morphometric analysis of auditory brain regions in congenitally deaf adults

HG = Heschl’s Gyrus; PT = Planum Temporale

Hanna Damasio, John S. Allen, Joel Bruss, Natalie Schenker

SLIDE 41
25 Deaf Brains
  • Native ASL signers
  • 14 women, 11 men
  • Deaf from birth
  • Hearing loss: 21 profound; 3 severe; 1 moderate-severe
  • Right-handed
  • Mean age = 23.8 years

25 Hearing Brains
  • Native English speakers
  • 14 women, 11 men
  • Normal hearing
  • Right-handed
  • Mean age = 28.5 years

SLIDE 42

  • Auditory deprivation from birth results in a bilateral increase in grey-white matter ratios in primary auditory cortex
  • The increase in GW ratios is largely due to a reduction in white matter volume in deaf subjects
  • Thus, the degree of myelination and/or axonal growth within auditory cortices appears to depend upon sound input during development
  • Lack of auditory input from birth does not alter the total volume of grey matter within auditory cortices

Emmorey et al. (2003) PNAS
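The direction of the grey-white ratio effect follows directly from the arithmetic: if grey matter volume is unchanged while white matter volume shrinks, the ratio must rise. A toy sketch with hypothetical volumes (the actual regional volumes are not given on the slide):

```python
def gw_ratio(grey_mm3: float, white_mm3: float) -> float:
    """Grey/white matter volume ratio for a region of interest."""
    return grey_mm3 / white_mm3


# Hypothetical primary auditory cortex volumes (mm^3), for illustration only.
hearing = gw_ratio(grey_mm3=1400.0, white_mm3=1000.0)
deaf = gw_ratio(grey_mm3=1400.0, white_mm3=800.0)  # reduced white matter

# Same grey matter, less white matter -> higher grey/white ratio.
assert deaf > hearing
```

This is why the finding of a higher GW ratio plus unchanged grey matter volume points specifically at reduced white matter (myelination and/or axonal growth) in the deaf group.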

SLIDE 43

Both hearing and deaf individuals exhibit a leftward gray matter asymmetry in Heschl’s gyrus (pink), but no white matter asymmetry

[Figure: Right and left hemisphere views]

Emmorey et al. (2003) PNAS

SLIDE 44

[Figure: Right and left hemisphere views]

Both hearing and deaf individuals exhibit a leftward asymmetry in the planum temporale (blue). Thus, these leftward asymmetries are not likely to be related to experience with spoken language or to auditory processing in general.

Emmorey et al. (2003) PNAS

SLIDE 45

Conclusions

  • Language, not speech, is lateralized to the left hemisphere
  • Hemispheric asymmetries can be altered by functions unique to sign language
  • Signing and pantomime engage distinct neural systems
  • Anatomical asymmetries can be altered by experience

SLIDE 46

Collaborators: Hanna Damasio, Tom Grabowski, Steve McCullough, John S. Allen, Joel Bruss

Research supported by: National Institute on Deafness and Other Communication Disorders (R01 DC006708; R01 DC00201; P50 DC03189)