Asymmetries in brain organization for sign language
Karen Emmorey
San Diego State University
Overview
- What determines asymmetries in the neural organization for language?
  – Is left-hemisphere dominance found for signed languages?
- Asymmetries in neural organization specific to sign language
  – Facial expression perception
  – Spatial language
  – Sensory-motoric iconicity (sign vs. pantomime)
- Anatomical asymmetries due to deafness or experience with sign language
  – Auditory cortices
  – Motor cortices
Some potential determinants of hemispheric asymmetries
- Linguistic functions are left dominant
  – Sign languages exhibit the same linguistic structure as spoken languages (including phonology)
- Spatial functions are right dominant
  – Sign languages depend on spatial contrasts at all linguistic levels
- Rapid temporal processing is left dominant (P. Tallal)
  – Phonological transitions in sign are about five times slower than in speech (roughly 200 ms vs. 40 ms)
Left hemisphere damage leads to sign language aphasia
[Figure: Rating Scale of Sign Characteristics (Melodic Line, Articulatory Agility, Grammatical Form, Paraphasia in Running Sign, Sign Finding, Sign Comprehension) profiles for left-hemisphere damaged signers, control deaf signers, and right-hemisphere damaged signers]
From Bellugi and Hickok (1995); Poizner et al. (1987)
BUT: Neville et al. (1998) PNAS
- Left-lateralized activation for reading English
- Bilateral activation for viewing ASL
Comparing audio-visual speech and British Sign Language comprehension
[Figure: fMRI activation while watching a BSL signer vs. watching/listening to an English speaker]
MacSweeney et al. (2002) Brain
Bilateral activation for both sign and speech comprehension, but Left > Right
Auditory regions (including Wernicke’s area) are engaged during sign perception
[Figure: Watching a BSL signer; regions of common activation for audio-visual English and BSL]
MacSweeney et al. (2002) Brain
Picture Naming
[Figure: Left- and right-hemisphere activation for naming nouns, verbs, and prepositions; Broca's area (left inferior frontal gyrus) is engaged]
Emmorey et al. (2002) NeuroImage; (2003) Neuropsychologia; (2004) Brain & Language
Sign Language Production
Left-hemisphere activation for both right-handed and left-handed signing
Corina et al. (2003) Journal of Cognitive Neuroscience
The left hemisphere is dominant for both sign and speech
- Classic left-hemisphere language areas (e.g., Broca's area and Wernicke's area) are involved in sign language production and comprehension
- Left lateralization is stronger for language production than for comprehension, for both speech and sign
- Left-hemisphere specialization for language is not tied to the properties of speech
The neural systems underlying facial expression recognition
Stephen McCullough, UCSD
Linguistic Facial Expressions
– t PAPER: topic marker
– q PAPER: yes/no question marker
– cond RAIN, GO: conditional clause marker ("If it rains, we'll leave.")
Adverbial markers
– cs NOW: "recently"
– mm WRITE: "easily"
Baker & Cokely, 1980
fMRI Study Design
- Task: same/different judgments
- Baseline task: same/different judgments on gender (e.g., male-male = "same"; female-male = "different")
- Experimental task: same/different judgments on facial expressions (e.g., MM-MM = "same"; happy-angry = "different")
McCullough, Emmorey, & Sereno (2005) Cognitive Brain Research
Stimuli: Face Only
- Emotional expressions: happy, sad, angry, surprised, disgust, fear
- Linguistic expressions: MM ("easily"), CS ("recently"), TH ("carelessly"), INT ("intense"), PUFF ("a lot"), PS ("smoothly")

Stimuli: Face with Sign
- Linguistic: MM + RUN, "run easily"
- Emotional: surprised + STUDY, "study (with surprise)"
- Baseline: neutral + DISCUSS, "discuss (no expression)"
McCullough et al. (2005) Cognitive Brain Research
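As a concrete illustration of this same/different design, here is a minimal sketch of how trial pairs and their correct answers might be generated. The pairing logic, function name, and trial counts are assumptions for illustration, not the study's actual procedure.

```python
import random

# Expression labels from the stimulus lists above; the pairing
# procedure below is hypothetical, for illustration only.
LINGUISTIC = ["MM", "CS", "TH", "INT", "PUFF", "PS"]
EMOTIONAL = ["happy", "sad", "angry", "surprised", "disgust", "fear"]

def make_trials(expressions, n_trials=12, seed=0):
    """Build (first, second, correct_answer) tuples for a
    same/different judgment task."""
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        a, b = rng.choice(expressions), rng.choice(expressions)
        trials.append((a, b, "same" if a == b else "different"))
    return trials

for first, second, answer in make_trials(EMOTIONAL, n_trials=4):
    print(f"{first} vs. {second} -> {answer}")
```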
Deaf Subjects
- 10 native ASL signers
- 5 women, 5 men
- Deaf from birth
- Right-handed
- Mean age = 29.4 years

Hearing Subjects
- 10 non-signers
- 5 women, 5 men
- Normal hearing
- Right-handed
- Mean age = 24.2 years
McCullough et al. (2005) Cognitive Brain Research
Regions of Interest
[Figure: Regions of interest shown on the right hemisphere: Superior Temporal Sulcus (STS) and Fusiform Gyrus (FG)]
[Figure: Hemisphere laterality index in the Superior Temporal Sulcus for emotional and linguistic expressions, Face Only vs. Face with verb, in deaf and hearing subjects; bars indicate rightward vs. leftward bias, * = significant]
McCullough et al. (2005) Cognitive Brain Research
[Figure: Linguistic facial expressions (Face with verb) vs. gender baseline, deaf and hearing groups]
McCullough et al. (2005) Cognitive Brain Research
[Figure: Hemisphere laterality index in the Fusiform Gyrus for emotional and linguistic expressions, Face Only vs. Face with verb, in deaf and hearing subjects; bars indicate rightward vs. leftward bias, * = significant]
McCullough et al. (2005) Cognitive Brain Research
[Figure: Emotional facial expressions (Face Only) vs. gender baseline, deaf and hearing groups]
McCullough et al. (2005) Cognitive Brain Research
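A hemisphere laterality index of this kind is conventionally computed as (L - R) / (L + R) over some per-subject activation measure. The sketch below assumes suprathreshold voxel counts per ROI; the numbers are invented for illustration, not the study's data.

```python
import numpy as np

def laterality_index(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Per-subject laterality index, (L - R) / (L + R).
    Positive values indicate a leftward bias, negative a rightward bias."""
    return (left - right) / (left + right)

# Hypothetical suprathreshold voxel counts in an STS ROI for 4 subjects
# (invented values, not the study's data).
left_sts = np.array([120.0, 95.0, 140.0, 110.0])
right_sts = np.array([80.0, 100.0, 90.0, 70.0])

li = laterality_index(left_sts, right_sts)
print(li.round(2))         # per-subject indices
print(li.mean().round(2))  # group mean, e.g. for a deaf vs. hearing comparison
```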
Summary: Facial Expression Perception
- Greater left STS activation when linguistic facial expressions appeared in the obligatory verbal context
  – Left STS may integrate adverbial facial expressions with the manual verb
- For signers, activation in STS was bilateral for emotional facial expressions
  – Emotional facial expressions play a linguistic role in narratives and in lexical emotion signs
- For signers, neural activity within the fusiform gyrus was left-lateralized
  – Deaf signers have extensive experience analyzing local facial features (e.g., mouth configuration)
Neural systems underlying spatial language
Classifier constructions: The use of space to represent space
HOUSE (located here); BIKE (located here)
“The bike is near the house.”
Emmorey (2003)
Positron Emission Tomography (PET) Studies of Spatial Language Production
- Deaf native signers (congenitally deaf)
- Hearing native signers (Deaf parents)
- Hearing monolingual English speakers
University of Iowa: Thomas Grabowski, Hanna Damasio, Richard Hichwa, Laurie Ponto UCSD: Stephen McCullough
Stimuli
- Locative classifier construction vs. lexical preposition ("next to", "in")
- Figure object noun ("brush", "paint brush")
Locative classifier constructions vs. ASL prepositions
[Figure: left-hemisphere activation]
Classifier constructions activated left inferotemporal (IT) cortex
Emmorey et al. (2002) NeuroImage
Locative Constructions vs. ASL Nouns
Emmorey et al. (2002; 2005) NeuroImage
Hearing signers and deaf signers: left parietal activation for both groups; right parietal activation for both groups
English Prepositions vs. Nouns
Emmorey et al. (2005) NeuroImage, H. Damasio et al. (2001) NeuroImage
Hearing signers and monolingual speakers: left parietal activation for both groups; right parietal activation only for the ASL-English bilinguals
Summary: Spatial Language
- Classifier constructions engage regions in left inferior temporal cortex implicated in the retrieval of names for concrete objects
  – Handshape encodes information about object type
- The production of ASL locative classifier constructions uniquely engages regions within right parietal cortex
  – Signing space is used to represent spatial relationships
- ASL-English bilinguals recruit right parietal cortex when producing English prepositions
  – Bimodal bilinguals may analyze spatial relationships for encoding in ASL, even when speaking English
Motor-iconicity and the neural systems underlying tool and action naming
Sensory-motoric iconicity: Signs for tools and tool-based actions
TURN-KEY SPRAY-PAINT TELEPHONE
Picture Naming Task
- BRUSH-HAIR: handling verb (actions with an implement)
- SLEEP: non-pantomimic verb (actions without an implement)
- SCISSORS: "pantomimic" noun (tools)
Emmorey et al. (2004) Brain & Language
Naming tools & tool-actions activates a premotor-parietal network for sign and speech
[Figure: Left premotor cortex and left inferior parietal lobule activation, with activation in Broca's area, for handling verbs, pantomimic nouns, and non-pantomimic verbs]
Emmorey et al. (2004) Brain & Language
Handling verbs minus non-pantomimic verbs
[Figure: left- and right-hemisphere subtraction maps]
Subtraction revealed no significant differences in neural activation for the production of motorically iconic signs versus non-pantomimic signs
Emmorey et al. (2004) Brain & Language
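A subtraction contrast of this kind amounts to a paired comparison of per-subject activation across the two conditions. Below is a minimal sketch assuming per-subject ROI means; the values and the choice of a paired t-test are illustrative, not the study's actual analysis pipeline.

```python
import numpy as np
from scipy import stats

# Hypothetical per-subject mean activation (arbitrary units) in one ROI
# for the two naming conditions; invented values, not the study's data.
handling = np.array([2.1, 1.9, 2.3, 2.0, 2.2])        # handling verbs
non_pantomimic = np.array([2.0, 2.0, 2.2, 1.9, 2.3])  # non-pantomimic verbs

# Paired test on the condition difference; a non-significant result
# corresponds to "no reliable activation difference" in the subtraction.
t, p = stats.ttest_rel(handling, non_pantomimic)
print(f"t = {t:.2f}, p = {p:.3f}")
```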
Pantomime substitutions for ASL signs by an aphasic signer
Corina et al. (1992)
ASL WL’s Pantomime
Poor sign recognition versus good pantomime recognition
[Figure: Percent correct recognition (50-100%) for signs vs. pantomimes, comparing WL with deaf controls]
Corina et al. (1992)
Conclusions
- Naming tools or tool-based actions engages a left premotor-parietal cortical network for both signers and speakers
  – Left premotor-parietal regions encode (embodied) tool-use semantics
- The pantomimic iconicity of ASL handling classifier verbs has no effect on the neural systems that underlie their production
  – Sign and pantomime production depend upon partially segregated neural systems
A morphometric analysis of auditory brain regions in congenitally deaf adults
HG = Heschl's Gyrus; PT = Planum Temporale
Hanna Damasio, John S. Allen, Joel Bruss, Natalie Schenker
25 Deaf Brains
- Native ASL signers
- 14 women, 11 men
- Deaf from birth
- Hearing loss: 21 profound, 3 severe, 1 moderate-to-severe
- Right-handed
- Mean age = 23.8 years

25 Hearing Brains
- Native English speakers
- 14 women, 11 men
- Normal hearing
- Right-handed
- Mean age = 28.5 years
- Auditory deprivation from birth results in a bilateral increase in grey-white matter ratios in primary auditory cortex
- The increase in G/W ratios is largely due to a reduction in white matter volume in deaf subjects
- Thus, the degree of myelination and/or axonal growth within auditory cortices appears to depend upon sound input during development
- Lack of auditory input from birth does not alter the total volume of grey matter within auditory cortices
Emmorey et al. (2003) PNAS
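To make the grey-white ratio claim concrete, here is a minimal sketch of the computation, assuming per-subject grey and white matter volumes for an auditory ROI. The numbers are invented and only demonstrate how a white matter reduction with unchanged grey matter raises the ratio.

```python
import numpy as np

def grey_white_ratio(grey: np.ndarray, white: np.ndarray) -> np.ndarray:
    """Per-subject grey/white matter volume ratio for one ROI."""
    return grey / white

# Hypothetical Heschl's gyrus volumes in cm^3 (invented, not the study's
# data): grey matter is matched across groups, white matter is reduced
# in the deaf group.
deaf_grey,    deaf_white    = np.array([1.6, 1.5, 1.7]), np.array([0.9, 0.8, 1.0])
hearing_grey, hearing_white = np.array([1.6, 1.5, 1.7]), np.array([1.2, 1.1, 1.3])

print(grey_white_ratio(deaf_grey, deaf_white).mean().round(2))        # higher G/W ratio
print(grey_white_ratio(hearing_grey, hearing_white).mean().round(2))  # lower G/W ratio
```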
Both hearing and deaf individuals exhibit a leftward gray matter asymmetry in Heschl’s gyrus (pink), but no white matter asymmetry
Emmorey et al. (2003) PNAS
Both hearing and deaf individuals exhibit a leftward asymmetry in the Planum Temporale (blue)
Thus, these leftward asymmetries are not likely to be related to experience with spoken language, or to auditory processing in general
Emmorey et al. (2003) PNAS
Summary
- Language, not speech, is lateralized to the left hemisphere
- Hemispheric asymmetries can be altered by functions unique to sign language
- Signing and pantomime engage distinct neural systems
- Anatomical asymmetries can be altered by experience
Collaborators: Hanna Damasio, Tom Grabowski, Steve McCullough, John S. Allen, Joel Bruss
Research supported by: National Institute on Deafness and Other Communication Disorders (R01 DC006708; R01 DC00201; P50 DC03189)