JUDGMENT AND DECISION MAKING IN MILITARY CONTEXTS – MARC CANELLAS


SLIDE 1

UNCLASSIFIED – Marc Canellas – Dec. 6, 2016 MORS Experimental Techniques Special Meeting

MATHEMATICAL REPRESENTATIONS OF JUDGMENT AND DECISION MAKING IN MILITARY CONTEXTS

MARC CANELLAS*, KAREN FEIGH, PHD, AND RACHEL HAGA COGNITIVE ENGINEERING CENTER GEORGIA INSTITUTE OF TECHNOLOGY *PRESENTER, MARC.C.CANELLAS@GATECH.EDU

“SIMULATION OF MULTIPLE STRATEGIES WITHIN A DECISION PROCESS MODEL: A PATHWAY TO IMPROVED DECISION SUPPORT,” OFFICE OF NAVAL RESEARCH, COMMAND DECISION MAKING PROGRAM, #N00014-14-1-0136

SLIDE 2

  • 1. Military operators use naturalistic heuristics (simple decision algorithms, pattern recognition, etc.) based on their experiences and expertise to make quick and accurate decisions – especially when faced with limited time, information, resources, or cognition.
  • 2. Our new general linear model of judgment and decision making can mathematically and transparently represent these types of strategies (and more) while accounting for expertise and incomplete information.
  • 3. Leveraging the perspectives of naturalistic heuristics and the mathematics of the GLM will enable new ways to:
    • Model and simulate military operators
    • Develop prescriptive strategies for better performance
    • Design support tools and interfaces

OBJECTIVE: BETTER MODELING, SIMULATION, AND SUPPORT OF MILITARY OPERATORS

SLIDE 3

DEVELOPMENT, MODELING, AND SIMULATION OF NATURALISTIC HEURISTICS1

NATURALISTIC DECISION MAKING + FAST-AND-FRUGAL HEURISTICS

1Keller et al., 2010

SLIDE 4

“God does not play dice.” “Neither do people [in the wild]…”

  • People do not generate and compare option sets
  • People use prior experience to rapidly categorize situations
  • People rely on a synthesis of their experiences
  • People do not passively await the outcomes of their gambles and bets; they actively shape events

Cognitive Continuum1; Skills, Rules, and Knowledge-Based Behavior2; Naturalistic Decision Making3

NATURALISTIC DECISION MAKING

1Hammond et al., 1987; 2Rasmussen, 1983; 3Klein, 2008; Lipshitz et al., 2001

SLIDE 5

SOLDIERS AND COMMANDERS USE AND ARE TAUGHT NATURALISTIC DECISION MAKING

  • Multiple-Options Model: Evaluate 3 courses of action (COAs), compare each, then select the best.
  • Single-Option Model: Evaluate the first COA and use it if satisfactory (else reevaluate the situation and generate a new COA).

  • Based on: multiple-attribute utility analysis (multiple options) vs. recognition-primed decision making (single option)
  • Used by military: rarely fully implemented (multiple options) vs. a natural and comfortable strategy1 found in the 2014 US Army Field Manual3 (single option)
  • Research results: degrades under time pressure and is not fully representative (multiple options) vs. reduced planning time by 20%2 or 30%1 with better COAs1 (single option)
  • Limitations: requires advanced programming while still not implementing all aspects

1Ross et al., 2004: Fort Leavenworth Battle Command Battle Lab; 2Thunholm, 2003: Members of Swedish Staff Officer Program

SLIDE 6

  • Heuristics: Simple search, stopping, and decision rules
  • People use the bounds on rationality – simplicity, speed, and frugality – as a mechanism for simple, robust, and accurate decisions
  • Example: Fast-and-frugal trees
    • Binary predictors and outcomes
    • One exit at each level, two at the final level
    • Derived from data and qualitative methods1 or from simplifying random forests2
    • Accurate and robust3 in military, finance, and medical domains
    • Transparency, consistency, and communicability lead to more acceptance than actuarial methods4

FAST-AND-FRUGAL HEURISTICS

1Katsikopoulos et al., 2008; Martignon et al., 2008; 2Deng, 2014; 4Katsikopoulos et al., 2008; Elwyn et al., 2001; Green & Mehr, 1997;

[Figure: a fast-and-frugal tree with three questions, one exit at each level and two at the final level]
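A tree of this shape can be sketched in a few lines. This is a minimal sketch: exiting on a "Yes" answer at every level is an assumption, and the question names and exit labels are hypothetical placeholders, not taken from the slide.

```python
# A minimal fast-and-frugal tree walker. Exit-on-"Yes" at every level is an
# assumption; question names and exit labels are hypothetical placeholders.

def fft_classify(cues, tree):
    """tree: (question, exit_label) pairs for the non-final levels, then a
    (question, yes_label, no_label) triple for the final level."""
    *levels, final = tree
    for question, exit_label in levels:
        if cues[question]:              # "Yes" -> exit immediately
            return exit_label
    question, yes_label, no_label = final
    return yes_label if cues[question] else no_label

tree = [("q1", "exit-1"), ("q2", "exit-2"), ("q3", "exit-yes", "exit-no")]
answer = fft_classify({"q1": False, "q2": True, "q3": False}, tree)  # "exit-2"
```

Because each level offers an exit, the expected number of cues consulted is small – the property the slide credits for the trees' speed and frugality.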

SLIDE 7

SOLDIERS AND COMMANDERS USE AND ARE TAUGHT FAST-AND-FRUGAL HEURISTICS

1Bagwell, 2008 2Keller and Katsikopoulos, 2014

[Figure: fast-and-frugal tree – “Hostile act (continued)?” No → Nothing; Yes → Increase Level of Force / Use Force]

Escalation of Force1 in Vehicle Checkpoints2

Escalation of Force levels:
  • 1 – SHOUT: Warning to stop
  • 2 – SHOW: Weapon and intent to use it
  • 3 – SHOOT: Warning shot at ground/air
  • 4 – SHOOT: One shot placed in grill/tires
  • 5 – SHOOT: Destroy vehicle/eliminate threat
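The ladder above is simple enough to encode directly. A minimal sketch; the `escalate` helper is a hypothetical illustration of stepping up one rung at a time, not a procedure from the slide.

```python
# The escalation-of-force ladder from the slide as an ordered lookup;
# `escalate` is a hypothetical helper illustrating one-rung escalation.

EOF_LEVELS = {
    1: ("SHOUT", "Warning to stop"),
    2: ("SHOW", "Weapon and intent to use it"),
    3: ("SHOOT", "Warning shot at ground/air"),
    4: ("SHOOT", "One shot placed in grill/tires"),
    5: ("SHOOT", "Destroy vehicle/eliminate threat"),
}

def escalate(level):
    """Step up one level of force, capped at the top of the ladder."""
    return min(level + 1, max(EOF_LEVELS))
```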

SLIDE 8

Military operators often use simple decision algorithms, pattern recognition, etc. based on their experiences and expertise to make quick and accurate decisions – especially when faced with limited time, information, resources, or cognition.

MILITARY OPERATORS USE NATURALISTIC HEURISTICS

[Figure: Recognize → Modify → Generate → Evaluate → Act cycle, mapped onto the “Hostile act (continued)?” tree: No → Nothing; Yes → Increase Level of Force / Use Force]

SLIDE 9

A GENERAL LINEAR MODEL OF JUDGMENT AND DECISION MAKING

WHAT IF A LINEAR MODEL COULD REPRESENT THESE NATURALISTIC HEURISTICS?

SLIDE 10

  • Simulating naturalistic decision making has often required complex computations
    • Fuzzy logic1, episodic recognition memory2, system dynamics3
  • Designers and users prefer the simpler analytic models because they are easier to use, despite their difficulty accounting for environmental and psychological issues.4

SIMULATING NATURALISTIC DECISION MAKING – LIMITED BY COMPLEXITY

1 Ji et al., 2007; 2 Mueller, 2009; 3 Patterson et al., 2009; 4 Katsikopoulos and Fasolo, 2006; Katsikopoulos et al., 2008; 5Keller et al., 2010

There are many shared components with fast-and-frugal heuristics5, so can’t we use the computational models of the fast-and-frugal heuristics?

SLIDE 11

SIMULATING HEURISTIC DECISION MAKING – LIMITED BY LACK OF UNIFIED MODEL1

  • Strategies: Take-the-Best, Tallying, Weighted-Additive, Take Two, Equal-Weighting, Minimalist, DEBA
  • Components: Cue weights, utility functions, estimates of missing information, thresholds, cutoff values, cue directions
  • Environmental parameters: Incomplete information, time pressure, redundancy, variability, predictability, effort, dominance, distribution of weights
  • Context: Expertise and experience

1Two major books from the ABC Research Group which developed the fast-and-frugal heuristics research program: Gigerenzer et al., 1999; Todd et al., 2012


SLIDE 12

SIMPLE DECISION MODELS – CLASSES OF JUDGMENT AND DECISION MAKING PROBLEMS

  • Utility of any option Y is defined by the function:

V(Y) = 3a1(Y) + 2a2(Y) + a3(Y)

  • Two options with cue values:

a1(B) = 1; a2(B) = 0; a3(B) = 1 ⟶ B = {1,0,1}
a1(C) = 0; a2(C) = 0; a3(C) = 1 ⟶ C = {0,0,1}

Decision Making: Select the option with the highest utility.
argmax{V(B) = 4, V(C) = 1}: V(B) > V(C) → B

Judgment: Categorize an individual option.
Category = High if V ≥ 2, Low if V < 2: V(B) = 4 → High
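The worked example above can be checked numerically; this sketch just recomputes the utilities and the resulting decision and judgment (variable names are hypothetical).

```python
# Numeric check of the example: V(Y) = 3*a1(Y) + 2*a2(Y) + a3(Y).
WEIGHTS = (3, 2, 1)

def V(cues):
    """Weighted-additive utility of one option's cue values."""
    return sum(w * a for w, a in zip(WEIGHTS, cues))

options = {"B": (1, 0, 1), "C": (0, 0, 1)}
scores = {name: V(a) for name, a in options.items()}   # B: 4, C: 1

decision = max(scores, key=scores.get)             # decision making: pick B
judgment = "High" if scores["B"] >= 2 else "Low"   # judgment: categorize B
```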

SLIDE 13

TOWARD A GENERAL LINEAR MODEL

D_j = Σ_{k=1}^{n} w_k · U_k(a_{j,k})

Combining:
  • Processes of naturalistic decision making (Recognize, Modify, Generate, Evaluate, Act)
  • Simple linear models from decision theory
  • Diversity and component structure of fast-and-frugal heuristics (strategies, components, environmental parameters, context)

SLIDE 14

GENERAL LINEAR MODEL – BINARY FORM1

D_j = Σ_{k=1}^{n} w_k · H_k( d_k [ e_k + (a_{j,k} − e_k) z_{j,k} − c_k ] + Δ_k )

  • Criterion (C): Overall score of an option
  • Cue weight (w): Relative importance of the cue
  • Estimate of missing info (e): Assumed value of a missing piece of information
  • Incomplete information (z): 1 if known, 0 if unknown
  • Heaviside utility function (H): Converts cue values into 1’s or 0’s
  • Cue direction (d): Sign of the correlation between the cue values and the criterion
  • Cutoff value (c): Threshold comparison for binary cues
  • Threshold (Δ): The difference between cue values must be large enough to be meaningfully different

1Canellas and Feigh, 2016
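A minimal sketch of the binary form. The grouping of terms inside the Heaviside function follows my reconstruction of the slide's equation and is an assumption; the function names are hypothetical.

```python
# Sketch of the binary-form GLM: the grouping of terms inside the Heaviside
# function is an assumption from the reconstructed equation.

def heaviside(x):
    """H(x): 1 when the direction-adjusted evidence clears the cutoff."""
    return 1 if x > 0 else 0

def glm_score(a, z, w, e, c, d, delta):
    """Score one option j.
    a: cue values a_jk; z: 1 if cue known else 0; w: cue weights;
    e: estimates of missing info; c: cutoff values;
    d: cue directions (+1/-1); delta: thresholds."""
    score = 0
    for k in range(len(a)):
        value = e[k] + (a[k] - e[k]) * z[k]   # known value, else the estimate
        score += w[k] * heaviside(d[k] * (value - c[k]) + delta[k])
    return score
```

With the single-cue target-speed setting of the next slide (w = 1, d = 1, e = 0, c = 500, Δ = 50), `glm_score([600], [1], [1], [0], [500], [1], [50])` returns 1.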

SLIDE 15

EXAMPLE: CLASSIFYING TARGET BASED ON SPEED

Training: Classify targets as hostile (1) or non-hostile (0) based on their measured speed and the following guidance:

  • If the speed is greater than 500 kts, then hostile; else, non-hostile
  • If the speed is unknown, assume non-hostile
  • Speed measurements have an error of ±50 kts

Components:
  • Criterion (C): If 1 then hostile, if 0 then non-hostile
  • Cue weight (w = 1): Only one cue
  • Cue direction (d = 1): Higher cue (speed) values correlate with higher criterion (hostile) values
  • Cutoff value (c = 500)
  • Threshold (Δ = 50)
  • Estimate of missing info (e = 0): Set to a value less than 500 (non-hostile)
  • Incomplete information (z): 1 if known, 0 if unknown

Substituting these values into the binary form D_j = Σ_{k=1}^{n} w_k · H_k( d_k [ e_k + (a_{j,k} − e_k) z_{j,k} − c_k ] + Δ_k ) gives:

D_j = H( a_j z_j − 500 + 50 )
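Instantiating the model for this speed task gives a one-cue classifier. A sketch; the sign of the threshold term (+50) follows my reconstruction of the garbled equation and is an assumption, and `classify_target` is a hypothetical name.

```python
# One-cue instance of the reconstructed GLM for the speed task; the sign of
# the threshold term (+50) is an assumption from the reconstruction.

def classify_target(speed_kts=None):
    """Return 1 (hostile) or 0 (non-hostile); None means speed unknown."""
    z = 0 if speed_kts is None else 1       # incomplete-information indicator
    a = 0 if speed_kts is None else speed_kts
    # D = H(z*a - 500 + 50) with w = d = 1, e = 0, c = 500, delta = 50
    return 1 if z * a - 500 + 50 > 0 else 0
```

An unknown speed falls back to the estimate e = 0, so the target classifies as non-hostile, exactly as the training guidance requires.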

SLIDE 16

CAPABILITIES OF GENERAL LINEAR MODEL

Decision making and judgment strategies are obtained by setting the components of the GLM:

  • Tallying (unit weights): D_Tallying = Σ_{k=1}^{n} H_k( d_k [ e_k + (a_{j,k} − e_k) z_{j,k} − c_k ] + Δ_k )
  • Equal weighting (unit weights, no Heaviside): D_EqualWeighting = Σ_{k=1}^{n} [ e_k + (a_{j,k} − e_k) z_{j,k} ]
  • Take-the-Best (non-compensatory weights): D_Take-the-Best = Σ_{k=1}^{n} (1/4)^{k−1} · H_k( d_k [ e_k + (a_{j,k} − e_k) z_{j,k} − c_k ] + Δ_k )

It can also represent trees with up to 3 outputs per node and 3^(# cues) exits.

Matrix Form: D = H( [ E + (A − E) ⊙ Z − C ] ⊙ D + Δ ) W^T

The GLM is an outcome-equivalent model that makes the same judgments or decisions; it does not mirror the underlying processes.
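The appeal of the matrix form is that one vectorized expression scores many options at once. A sketch with NumPy, under the reconstructed symbol layout (an assumption); `glm_matrix` is a hypothetical name.

```python
import numpy as np

# Sketch of the matrix form: score m options (rows of A) on n cues in one
# vectorized expression; symbol layout is an assumption from the
# reconstructed equation.

def glm_matrix(A, Z, w, e, c, d, delta):
    """A: m x n cue values; Z: m x n known-indicators; w, e, c, d, delta:
    length-n component vectors. Returns one score per option."""
    inner = (e + (A - e) * Z - c) * d + delta
    return np.heaviside(inner, 0.0) @ w

# Three targets from the speed example: 600 kts, 300 kts, speed unknown.
A = np.array([[600.0], [300.0], [0.0]])
Z = np.array([[1.0], [1.0], [0.0]])
scores = glm_matrix(A, Z, w=np.array([1.0]), e=np.array([0.0]),
                    c=np.array([500.0]), d=np.array([1.0]),
                    delta=np.array([50.0]))
# scores -> [1., 0., 0.]: only the 600-kt target classifies as hostile
```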

SLIDE 17

REPRESENTING EXPERIENCE/EXPERTISE1

Model Calibration: The amount/type of experience should be proportional to the amount/type of data used to calculate the components: cue weights and cue order (w)3; cue directions (d)4; estimates of missing information (e)5; cutoff values (c); and thresholds (Δ)

Experience: Experts tend to use naturalistic heuristics (fewer, more important cues)2; novices tend to use normative and analytic strategies (more, less important cues)2

1Canellas and Feigh, 2016 2Gigerenzer and Gaissmaier, 2011; Todd et al., 2012; 3Czerlinski et al., 1999; 4Katsikopoulos et al, 2010; 5Garcia-Retamero and Rieskamp, 2009


SLIDE 18

REPRESENTING TIME PRESSURE & DEGRADED AND DENIED INFORMATION ENVIRONMENTS

1Canellas and Feigh, 2016; 2Rieskamp and Hoffrage, 1999, 2008; Maule, 1994; Payne et al., 1988; 3Garcia-Retamero and Rieskamp, 2008, 2009; 4Canellas and Feigh, 2015, 2016

High time pressure results in:

  • More use of heuristics and selective information search2
  • More incomplete information

How can operators be robust in degraded and denied information environments?

  • Well-calibrated estimates of missing information3
  • Focus on the balanced information (option-wise and cue-wise)4

SLIDE 19

Single unifying framework for representing, modeling, and simulating a wide range of judgment and decision making strategies:

  • 1. Transparent: Allows for specificity in model selection and description. Model the individual components of strategies (weights, estimates, utility functions) and the environment (incomplete information).
  • 2. Simple: Look-up and use. Not an algorithm, nor a specialty code.
  • 3. Applied: Can model any combination of components based on the domain – not restricted to established models.
  • 4. Representative: Can approximate “expertise” by changing how much prior information is used to set the components, specifically weights, estimates, cue directions, cutoffs, and thresholds.

BENEFITS OF THE GENERAL LINEAR MODEL

SLIDE 20

OPPORTUNITIES FOR THE GENERAL LINEAR MODEL

APPLICATIONS IN VEHICLE CHECKPOINT – IDENTIFICATION FRIEND OR FOE

SLIDE 21

DEVELOPING FAST-AND-FRUGAL TREES FOR CHECKPOINTS IN AFGHANISTAN1

Task: Classify incoming vehicle as hostile or non-hostile

ISAF-NATO Forces in Afghanistan 2004-2009

1Keller and Katsikopoulos, 2014; Keller et al., 2014

  • 1. Identify cues and categories, and their quantifications
  • 2. Construct and simulate FFTs
  • 3. Analyze and select the final FFT for implementation in the field

SLIDE 22

FAST-AND-FRUGAL TREE FOR CHECKPOINTS 1

Fast-and-frugal tree:

  • Are there multiple occupants? Yes → Non-hostile
  • Does the vehicle comply (slow/stop)? No → Hostile
  • Are there no further threat attributes? Yes → Non-hostile; No → Hostile

Development
  • Constructed prior to testing on 1053 incidents

Results
  • Civilian casualties reduced from 204 to 78 (−60%)
  • For attacks, only occupants were measured: never multiple occupants in an attack (agreement with FFT)
  • Average of 1.2 cues used
  • Sequential and deterministic information search

How can the GLM help address the “challenges” faced during development of FFTs?

1Keller and Katsikopoulos, 2014; Keller et al., 2014
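The checkpoint tree above can be sketched directly. The question order and exit labels follow my reading of the garbled slide, so treat them as an assumption; `checkpoint_fft` is a hypothetical name.

```python
# The checkpoint FFT as reconstructed above; question order and exit labels
# are an assumption from my reading of the slide.

def checkpoint_fft(multiple_occupants, complies, no_further_threats):
    """Classify an approaching vehicle as 'hostile' or 'non-hostile'."""
    if multiple_occupants:
        return "non-hostile"   # recorded attacks never had multiple occupants
    if not complies:
        return "hostile"       # single occupant, not slowing/stopping
    return "non-hostile" if no_further_threats else "hostile"
```

Because most vehicles exit at the first question, only a fraction of the cues are ever consulted, which is consistent with the slide's reported average of 1.2 cues used per classification.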

SLIDE 23

  • 1. IDENTIFY CUES, CATEGORIES, & STRATEGIES: QUAL/QUANT STRATEGY IDENTIFICATION
    • Identify cues and categories (actions) and their related constraints
    • Quantify the cues and categories
    • Determine the relevant ordering of cues

Challenges → GLM Solutions

  • Challenge – too many cues / difficult ordering: Need to down-select cues and find a relevant order
  • GLM solution – fit operators to the GLM: Construct representative tasks for use in human-subjects studies; then statistically fit the GLM

Candidate cues: occupants (single/multiple), visible weapons (visible or not), slowing down (yes/no), speed

SLIDE 24

2/3. SIMULATION OF FFTS AND FINAL SELECTION: FAST-SIMULATION/PERFORMANCE PREDICTIONS

Challenges → GLM Solutions

  • Challenge – computational resource constraints: Had to limit the number of FFTs and cases studied. GLM solution – matrix form: in a single matrix multiplication, a single FFT can classify m objects.
  • Challenge – artificiality of simulations / high similarity of final FFTs: Did not account for individuals’ “differential ability to perceive cues”; lots of “good-enough” FFTs. GLM solution – vary components: directly model “thresholds” and vary the training data to represent experience; measure robustness.

Process: Construct large numbers of FFTs → construct a simulation environment → signal-detection analysis (p(Hit) vs. p(FA))

SLIDE 25

  • 4. DESIGN TRAINING AND DECISION SUPPORT (NATURALISTIC HEURISTICS)

Build training and support structures based on the operators’ context and decision making strategies.

“I thought, therefore, I build.”
  • Methods: Brief literature review (if any); pontifications about what operators need
  • Result: Unusable/unused systems; failures

“I thought about what I saw, therefore, I build.”
  • Methods: Literature review of psychology and cognitive engineering. Applied methods1: cognitive work analysis, goal-directed interviews, and human-subjects studies.
  • Result: Usable, traceable systems that increase the performance of operators

1Militello et al., 2013; Vicente

SLIDE 26

CONCLUSION: TOWARD BETTER MODELING, SIMULATION, AND SUPPORT OF MILITARY OPERATORS

SLIDE 27

OBJECTIVE: BETTER MODELING, SIMULATION, AND SUPPORT OF MILITARY OPERATORS

  • 1. Military operators use naturalistic heuristics (simple decision algorithms, pattern recognition, etc.) based on their experiences and expertise to make quick and accurate decisions – especially when faced with limited time, information, resources, or cognition.
  • 2. Our new general linear model of judgment and decision making can mathematically and transparently represent these types of strategies (and more) while accounting for expertise and incomplete information.
  • 3. Leveraging the perspectives of naturalistic heuristics and the mathematics of the GLM will enable new ways to:
    • Model and simulate military operators
    • Develop prescriptive strategies for better performance
    • Design support tools and interfaces

SLIDE 28

REFERENCES

Bagwell, R. (2008). The threat assessment process. The Army Lawyer – Department of the Army Pamphlet, 5–17.
Canellas, M. C., & Feigh, K. M. (2016). Toward simple representative mathematical models of naturalistic decision making through fast-and-frugal heuristics. Journal of Cognitive Engineering and Decision Making, 10(3), 255–267. doi:10.1177/1555343416656103
Canellas, M. C., Feigh, K. M., & Chua, Z. K. (2015). Accuracy and effort of decision-making strategies with incomplete information: Implications for decision support system design. IEEE Transactions on Human-Machine Systems, 45(6), 686–701. doi:10.1109/THMS.2015.2420575
Czerlinski, J., Gigerenzer, G., & Goldstein, D. G. (1999). How good are simple heuristics?

SLIDE 29

Deng, H. (2014). Interpreting tree ensembles with inTrees. arXiv:1408.5456.
Elwyn, G., Edwards, A., Eccles, M., & Rovner, D. (2001). Decision analysis in patient care. The Lancet, 358(9281), 571–574.
Garcia-Retamero, R., & Rieskamp, J. (2008). Adaptive mechanisms for treating missing information: A simulation study. The Psychological Record, 58.
Garcia-Retamero, R., & Rieskamp, J. (2009). Do people treat missing information adaptively when making inferences? The Quarterly Journal of Experimental Psychology, 62(10), 1991–2013. doi:10.1080/17470210802602615
Gigerenzer, G., & Gaissmaier, W. (2011). Heuristic decision making. Annual Review of Psychology, 62, 451–482.
Gigerenzer, G., Todd, P. M., & the ABC Research Group. (1999). Simple heuristics that make us smart. New York: Oxford University Press.
Green, L., & Mehr, D. R. (1997). What alters physicians’ decisions to admit to the coronary care unit? The Journal of Family Practice, 45, 219–226.

SLIDE 30

Hammond, K. R., Hamm, R. M., Grassia, J., & Pearson, T. (1987). Direct comparison of the efficacy of intuitive and analytical cognition in expert judgment. IEEE Transactions on Systems, Man, and Cybernetics, 17, 753–770.
Ji, Y., Massanari, R. M., Ager, J., Yen, J., Miller, R. E., & Ying, H. (2007). A fuzzy logic-based computational recognition-primed decision model. Information Sciences, 177(20), 4338–4353.
Katsikopoulos, K. V., & Fasolo, B. (2006). New tools for decision analysts. IEEE Transactions on Systems, Man, and Cybernetics–Part A: Systems and Humans, 36(5), 960–967.
Katsikopoulos, K. V., Pachur, T., Machery, E., & Wallin, A. (2008). From Meehl to fast and frugal heuristics (and back): New insights into how to bridge the clinical–actuarial divide. Theory & Psychology, 18(4), 443–464.
Keller, N., Cokely, E. T., Katsikopoulos, K. V., & Wegwarth, O. (2010). Naturalistic heuristics for decision making. Journal of Cognitive Engineering and Decision Making, 4(3), 256–274.

SLIDE 31

Keller, N., Czienskowski, U., & Feufel, M. A. (2014). Tying up loose ends: A method for constructing and evaluating decision aids that meet blunt and sharp-end goals. Ergonomics, 57(8), 1127–1139.
Keller, N., & Katsikopoulos, K. (2016). On the role of psychological heuristics in operational research; and a demonstration in military stability operations. European Journal of Operational Research, 249(3), 1063–1073. doi:10.1016/j.ejor.2015.07.023
Klein, G. (2008). Naturalistic decision making. Human Factors, 50, 456–460.
Lipshitz, R., Klein, G., Orasanu, J., & Salas, E. (2001). Taking stock of naturalistic decision making. Journal of Behavioral Decision Making, 14(5), 331–352.
Martignon, L., Katsikopoulos, K. V., & Woike, J. K. (2008). Categorization with limited resources: A family of simple heuristics. Journal of Mathematical Psychology, 52(6), 352–361.

SLIDE 32

Maule, A. J. (1994). A componential investigation of the relation between structural modelling and cognitive accounts of human judgement. Acta Psychologica, 87, 199–216. doi:10.1016/0001-6918(94)90051-5
Mueller, S. T. (2009). A Bayesian recognitional decision model. Journal of Cognitive Engineering and Decision Making, 3(2), 111–130.
Patterson, R., Fournier, L., Pierce, B., Winterbottom, M., & Tripp, L. (2009). Modeling the dynamics of recognition-primed decision making. Proceedings of NDM9, the 9th International Conference on Naturalistic Decision Making. London, UK.
Payne, J. W., Bettman, J. R., & Johnson, E. J. (1988). Adaptive strategy selection in decision making. Journal of Experimental Psychology: Learning, Memory, and Cognition, 14(3), 534–552.
Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, 13(3), 257–266.

SLIDE 33

Rieskamp, J., & Hoffrage, U. (1999). When do people use simple heuristics, and how can we tell? In G. Gigerenzer & P. M. Todd (Eds.), Simple heuristics that make us smart (pp. 141–167). New York: Oxford University Press.
Rieskamp, J., & Hoffrage, U. (2008). Inferences under time pressure: How opportunity costs affect strategy selection. Acta Psychologica, 127(2), 258–276.
Ross, K. G., Klein, G. A., Thunholm, P., Schmitt, J. F., & Baxter, H. C. (2004). The recognition-primed decision model. Military Review.
Todd, P. M., Gigerenzer, G., & the ABC Research Group. (2012). Ecological rationality: Intelligence in the world. Oxford University Press.