
An introduction to computational psycholinguistics: Modeling human sentence processing

Shravan Vasishth
University of Potsdam, Germany
http://www.ling.uni-potsdam.de/~vasishth · vasishth@acm.org
September 2005, Bochum

ACT-R introduction

Why cognitive architectures? The inspiration is Newell’s idea of a unified theory of cognition. (Newell himself developed another architecture, SOAR, which we will not get into in this course.)

  • Aims towards a unified theory of mind as a computational process.
  • Provides a means for integrating assumptions about domain-specific behavior with empirically based research on human cognition (e.g., constraints on memory, eye movements).
  • Provides a framework for interpreting brain-imaging data.
  • Applications in HCI, education, and simulated cognitive agents for all kinds of hazardous operations.



The units of knowledge: Chunks

(CLEAR-ALL)
(CHUNK-TYPE addition-fact addend1 addend2 sum)
(CHUNK-TYPE integer value)
(ADD-DM
  (fact3+4 isa addition-fact addend1 three addend2 four sum seven)
  (three isa integer value 3)
  (four isa integer value 4))


Encoding sentences as linked chunks

Fact: The cat sits on the mat.

(Add-DM
  (fact001 isa proposition agent cat01 action sits_on object mat))

Fact: The black cat with five legs sits on the mat.

(cat01 isa cat legs 5 color black)
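The idea of linked chunks can be sketched in ordinary Python, treating each chunk as a record whose slot values may name other chunks. This is a rough analogy for illustration, not the ACT-R implementation; the dictionary `chunks` is invented, while the chunk names and slots follow the slide:

```python
# Chunks as plain records; slot values that name other chunks
# (here "cat01") create the links between chunks.
chunks = {
    "fact001": {"isa": "proposition", "agent": "cat01",
                "action": "sits_on", "object": "mat"},
    "cat01": {"isa": "cat", "legs": 5, "color": "black"},
}

# Following the link from the proposition to its agent chunk:
agent = chunks[chunks["fact001"]["agent"]]
print(agent["color"])  # -> black
```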



Exercise

(Chunk-Type proposition agent action object)
(Chunk-Type professor money-status age)
(Chunk-Type house kind price status)

Fact: The rich young professor buys a beautiful and expensive city house.

What chunks do we need to add to the declarative memory?


Productions: The unit of processing

  • Condition-action control structure.
  • Has a default execution time of 50 milliseconds.
  • Presents a serial bottleneck in an otherwise parallel system.
  • Symbolic realization of the flow of information from cortex to basal ganglia and back.
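The condition-action control structure and the serial bottleneck can be sketched in Python. This is a hypothetical simplification, not the ACT-R implementation: `run_cycle`, the toy production, and the state dictionary are all invented for illustration; only the 50 ms default firing time comes from the slide.

```python
DEFAULT_FIRING_TIME = 0.050  # seconds; ACT-R's default production execution time

def run_cycle(productions, state, clock):
    """Fire the first production whose condition matches the state.
    Only one production fires per cycle (the serial bottleneck),
    and each firing costs the default 50 ms."""
    for condition, action in productions:
        if condition(state):
            action(state)
            return clock + DEFAULT_FIRING_TIME
    return clock  # no production matched; clock unchanged

# Toy production: on seeing the determiner "a", predict a noun.
productions = [
    (lambda s: s.get("word") == "a",
     lambda s: s.update(prediction="noun")),
]
state = {"word": "a"}
clock = run_cycle(productions, state, clock=0.0)
print(state["prediction"], clock)  # -> noun 0.05
```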



The syntax of production rules

Syntax:

(p name          ; Production-name
   Conditions    ; Specification of buffer tests
==>              ; delimiter
   Actions       ; Specification of buffer transformations
)


An example of a production rule

(p predict-an-IP
   =retrieval>
      isa            determiner
      pronunciation  "a"
      ... [other relevant features of the determiner]
==>
   +retrieval>
      isa     noun-prediction
      number  singular    ;;; we call this a retrieval cue
   ...
)
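How a retrieval cue such as `+retrieval>` selects candidate chunks can be sketched as slot-value matching. The chunk names below are invented, and this is only the symbolic part of retrieval; real ACT-R additionally ranks the matching candidates by activation:

```python
# Declarative memory as a list of chunks (names are hypothetical).
dm = [
    {"name": "np1", "isa": "noun-prediction", "number": "singular"},
    {"name": "np2", "isa": "noun-prediction", "number": "plural"},
]

# The retrieval cue from the production above.
cue = {"isa": "noun-prediction", "number": "singular"}

# A chunk matches when every slot-value pair of the cue is satisfied.
matches = [c for c in dm
           if all(c.get(slot) == val for slot, val in cue.items())]
print([c["name"] for c in matches])  # -> ['np1']
```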



Buffers in ACT-R

  1. Goal buffer: represents one’s current state in a task, and preserves information across productions.
  2. Retrieval buffer: information retrieved from DM is stored here; locus of activation computations.
  3. Visual buffers
  4. Auditory buffers
  5. Manual buffers
  6. Vocal buffers


Chunk Activation: The subsymbolic level

A_i = B_i + Σ_j W_j × S_ji + Σ_k MP_k × Sim_kl + N(0, s)    (1)

  • A_i: the activation of a chunk i
  • B_i: base-level activation of i; reflects past usefulness
  • Σ_j W_j × S_ji: associative activation; relevance to the general context
  • Σ_k MP_k × Sim_kl: mismatch penalty
  • N(0, s): noise
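Equation (1) can be checked numerically with a small sketch. The function name and the sample parameter values are assumptions; note that similarities Sim_kl are typically non-positive, so the mismatch term lowers activation:

```python
import math
import random

def activation(B, W, S, MP, Sim, s=0.3, rng=None):
    """Equation (1): A_i = B_i + sum_j W_j*S_ji + sum_k MP_k*Sim_kl + N(0, s).
    W/S and MP/Sim are parallel lists; noise is drawn from N(0, s)."""
    rng = rng or random.Random(0)
    assoc = sum(w * sji for w, sji in zip(W, S))        # associative activation
    penalty = sum(mp * sim for mp, sim in zip(MP, Sim)) # mismatch penalty
    return B + assoc + penalty + rng.gauss(0, s)

# With noise switched off (s=0) the result is deterministic:
a = activation(B=0.5, W=[0.5, 0.5], S=[1.0, 0.3], MP=[1.0], Sim=[-0.5], s=0.0)
print(round(a, 3))  # -> 0.65
```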



Mapping activation to retrieval time

Retrieval Time_i = F × e^(−A_i)    (2)

Among the chunks that match a retrieval request, the one with the highest activation is retrieved, provided it exceeds a predefined threshold τ.
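A sketch of Equation (2) together with the highest-activation-above-threshold rule; the values of the latency factor F and the threshold τ below are illustrative, not ACT-R defaults:

```python
import math

F = 1.0    # latency factor (assumed value)
tau = 0.0  # retrieval threshold (assumed value)

def retrieve(activations):
    """Among matching chunks, retrieve the one with the highest activation,
    provided it exceeds tau; return (chunk, retrieval time) per Equation (2)."""
    chunk, A = max(activations.items(), key=lambda kv: kv[1])
    if A < tau:
        return None, None  # retrieval failure
    return chunk, F * math.exp(-A)

chunk, t = retrieve({"cat01": 0.8, "mat01": 0.2})
print(chunk, round(t, 3))  # -> cat01 0.449
```

Higher activation thus means faster retrieval: the time cost falls exponentially as A_i grows.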


Base-level activation and associative activation

B_i = ln( Σ_{j=1}^{n} t_j^(−d) )        A_i = B_i + Σ_j W_j S_ji    (3)

Chunks are retrieved by a content-addressed, associative retrieval process. Associative retrieval interference arises because the strength of association from a cue is reduced as a function of the number of items associated with the cue. This is captured by Equation 4, which reduces the maximum associative strength S by the log of the “fan” of item j, i.e., the number of items associated with j:

S_ji = S − ln(fan_j)    (4)




Base-level activation and associative activation

B_i = ln( Σ_{j=1}^{n} t_j^(−d) )        A_i = B_i + Σ_j W_j S_ji

[Figure: activation (y-axis, 0.5–1.5) plotted over retrievals, illustrating decay and interference.]
