Cognitive Bias and Critical Thinking in Open Source Intelligence (OSINT) - PowerPoint Presentation


SLIDE 1

Cognitive Bias and Critical Thinking in Open Source Intelligence (OSINT)

SLIDE 2

Benjamin Brown

Akamai Technologies

Security Architecture

  • Law Enforcement Engagement
  • Systems Safety
  • Threat Intelligence
  • Security Research
      • Novel Attack Vectors
      • Multilayered Attacks
SLIDE 3

Coming To Terms

  • Cognitive Biases
  • Open Source Intelligence
  • Intelligence Analysis
  • Metacognition
  • Critical Thinking
  • Frameworks for Structured Analysis
SLIDE 4

The terms may lay out the 'So', but we also need a 'So What'.

SLIDE 5

Why We Care

Actionable Intelligence

  • Accurate conclusions
  • Properly framed

Cognitive Biases (faulty heuristics)

  • Can lead to inaccurate conclusions

Metacognition and Critical Thinking

  • Recognize and correct for cognitive biases
  • Arrive at more accurate solutions more often
SLIDE 6

Why We Care

Bad Data → Biased Analysis → False Conclusions → Bad Intelligence

SLIDE 7

Why We Care

Think Bay of Pigs.

SLIDE 8

Open Source Intelligence (OSINT)

SLIDE 9

Defining OSINT

"Intelligence produced from publicly available information"

Army Techniques Publication (ATP) 2-22.9, “Open-Source Intelligence,” July 2012.

SLIDE 10

Defining OSINT

Typical Quality of Publicly Available Information:

You are relying on public information of variable quality to draw important conclusions that you intend to act upon. You had damn well better be able to recognize bad information and bad analysis, including your own!

SLIDE 11

Intelligence vs Information

  • Timely
  • Relevant
  • Actionable

VS

Information may look interesting, but is it really useful?

SLIDE 12

OSINT Sources

  • Search engines
  • Social networks
  • Communication services
  • E-commerce site profiles
  • Business / tax records
  • Media
SLIDE 13

OSINT Tools

  • Recon-ng
  • ExifTool
  • theHarvester
  • Maltego
  • Cree.py
  • fierce.pl
  • TAPIR
  • DNSRecon
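Output from these tools feeds straight into analysis, so its structure matters. As a hedged illustration, ExifTool can emit machine-readable JSON (`exiftool -j photo.jpg`); the sketch below parses a sample of that output and pulls out the fields of most OSINT interest. The sample record and its field values are invented for illustration.

```python
import json

# Invented sample of `exiftool -j` output for one image:
# a JSON array with one object per file, tag names as keys.
sample = """
[{
  "SourceFile": "photo.jpg",
  "Model": "Pixel 4a",
  "CreateDate": "2021:06:01 14:22:09",
  "GPSLatitude": "42.3601 N",
  "GPSLongitude": "71.0589 W"
}]
"""

records = json.loads(sample)
for rec in records:
    # Device, timestamp, and location are the classic OSINT pivots.
    interesting = {k: rec.get(k) for k in
                   ("Model", "CreateDate", "GPSLatitude", "GPSLongitude")}
    print(rec["SourceFile"], interesting)
```

In a real pipeline the `sample` string would be replaced by the tool's actual output, and missing tags (not every image carries GPS data) come back as `None` via `rec.get`.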
SLIDE 14

Defining Cognitive Bias

SLIDE 15

Defining Cognitive Bias

Note: Emotional, intrinsically cultural, spiritual, or faith-based biases are out of scope.
SLIDE 16

Defining Cognitive Bias

  • Patterns of subjective judgment.
  • Simplified information-processing strategies.
  • Subconscious mental procedures for processing information.

SLIDE 17

Defining Cognitive Bias: ‘Deduction’ of color / shade

The human brain does not see the colors as they are, but ‘deduces’ the shade of grey from other information presented.

SLIDE 18

Defining Cognitive Bias: ‘Deduction’ of color / shade

SLIDE 19

Example Types of Cognitive Bias

SLIDE 20

Select Biases

Confirmation Bias

  • Seek out evidence that confirms.
  • Self-fulfilling prophecies.
  • Avoid information supporting competing hypotheses.

SLIDE 21

Select Biases

Self-serving Bias

  • Self-enhancement
  • Self-preservation
  • Self-esteem
  • Social and/or career advancement

SLIDE 22

Select Biases

Echo Effect

  • Information repeated by source after source.

      • Media telephone
      • Sources obscured
      • Bandwagon effect
      • Promotes group-think

SLIDE 23

Select Biases

Representativeness

  • Focus on similarities / Neglect differences
SLIDE 24

Select Biases

Representativeness Cont.

  • Base Rate Neglect

Focusing on specific information and neglecting the importance of base rates.
SLIDE 25

Select Biases

  • Base Rate Neglect Cont.

  • Deadly Disease ‘X’
  • Afflicts 0.01% of the population
  • Cheap diagnostic test that finds the disease in 99.5% of those infected, with a false-positive rate of only 1.95%
  • Test = Positive (Oh God I’m Dying! 99.5%!!!) ...But Wait, There’s More...

SLIDE 26

Select Biases

  • Base Rate Neglect Cont.

99.5% likely to find it if you have it.

  • 1.95% false-positive rate
  • 1 million people tested, 100 of whom are infected (0.01%)
  • Test accurately identifies 99 of them as infected
  • BUT 19,500 uninfected people receive a false positive (1.95%)
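The arithmetic above can be checked directly. A quick sketch, using the numbers from the slide:

```python
# Base rate neglect: P(infected | positive) is far lower than the
# test's 99.5% sensitivity suggests, because the disease is rare.
population = 1_000_000
base_rate = 0.0001            # 0.01% of the population is infected
sensitivity = 0.995           # test finds 99.5% of the infected
false_positive_rate = 0.0195

infected = population * base_rate                                 # 100 people
true_positives = infected * sensitivity                           # ~99 people
false_positives = (population - infected) * false_positive_rate  # ~19,498 people

p_infected_given_positive = true_positives / (true_positives + false_positives)
print(f"{p_infected_given_positive:.1%}")  # roughly 0.5%
```

So even after a positive result, the chance of actually having the disease is about half a percent, because the 19,500-odd false positives swamp the 99 true positives.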

SLIDE 27

Select Biases

Availability Bias

  • “Anecdata” (first or second hand)
  • Topic’s trend-power
  • Censorship
  • Language(s) of collector / analyst
  • Маскировка (Maskirovka)

      • Dezinformatsiia

SLIDE 28

Synthesis

SLIDE 29

Reddit vs. Boston Bombers

SLIDE 30

Reddit vs. Boston Bombers

“Methodology”

  • Marathon Photos and Videos
      • ‘not looking at the race’
  • Warped police scanner snippets
  • Multiple ‘suspects’
      • Social media harassment / cyberstalking
  • Media outlets ran bad info from Reddit
SLIDE 31

Reddit vs. Boston Bombers

SLIDE 32

Reddit vs. Boston Bombers

Bias

  • Bandwagon effect
  • Echo effect
  • Availability bias
  • Self-serving bias
  • Confirmation bias

Mad Internet points yo, rolling in the karma.
SLIDE 33

Attack Attribution - APT

APT: Advanced Persistent Threat = OMFGWTFBBQ CHINA!!1one!!

SLIDE 34

Attack Attribution - APT

“Evidence”

  • Type of RAT (Remote Administration Tool)
  • Geo Loc of C2s (Command and Control)
  • ‘Shared similarities’ with past APTs
  • Tool author’s (not user’s) native language
  • “We’re expert researchers, just trust us”

(Show me your methodology!)

RAT: freely available source and compiled binaries. Geo-location services like MaxMind are notoriously unreliable.

SLIDE 35

Attack Attribution - APT

(Vendor) Bias

  • Confirmation bias
  • Self-serving bias
  • Representativeness
      • Base rate / Population

  • Availability Bias

Most reports come from vendors offering anti-APT services.
SLIDE 36

D0xing

SLIDE 37

White-Knight Syndrome

  • Suspected Amanda Todd bully
  • L337 Pro-Scientology hackers:
      • 59-year-old couple
  • LEO in Ferguson not involved
      • Family info used for ID fraud

SLIDE 38

DDoS FUD

SLIDE 39

DDoS FUD

  • Statistical methodology
      • Sample size
      • Length of collection time

  • Availability bias
  • Self-serving bias (vendors)
SLIDE 40

Newsweek vs The ‘Creator of Bitcoin’

SLIDE 41

Malaysia Airlines flight MH370 (Google Maps)

SLIDE 42

Groundwork For Remedy

SLIDE 43

Defining Metacognition

“Thinking about thinking”

SLIDE 44

Defining Critical Thinking

  • “What do I think I know?”
  • “How do I think I know it?”
  • “When would it not be true?”
SLIDE 45

The More You Know

Know about cognitive biases.

  • Difficult to affect that which you don't know you don't know.
  • Everyone has them; you are not immune or invincible.

If you are familiar with them, you are more likely to recognize them in yourself, your data, and others.

SLIDE 46

The More You Know

[META INTENSIFIES] Beware the Bias Bias / Bias Blind Spot

  • Belief that your bias is actually insight

SLIDE 47

The More You Know

  • Dr. Emily Pronin
    Princeton University - Dept. of Psychology

Subjects:

  • Report being less susceptible to bias
  • Infer bias in others holding contrary opinions

Pronin, Emily, et al. “The Bias Blind Spot: Perceptions of Bias in Self Versus Others,” PSPB, Vol. 28, No. 3, March 2002.

SLIDE 48

The More You Know

Actively seek out:

  • Peer review
  • Diverse, outside expertise
  • Alternative mindsets
SLIDE 49

Defining Cognitive Bias

“Similar to optical illusions... [it] remains compelling even when one is fully aware of its nature.”

Heuer, Richards J., Jr. Psychology of Intelligence Analysis. Washington D.C.: CSI, 1999.

SLIDE 50

Two Frameworks

SLIDE 51

Structures and Frameworks

Checklists For Analysts

  • Defining the problem
  • Generating hypotheses
  • Collecting information
  • Evaluating hypotheses
  • Selecting the most likely hypothesis
  • Ongoing monitoring of new information

Heuer, Richards J., Jr. “Psychology of Intelligence Analysis,” Washington D.C.: CSI, 1999.

SLIDE 52

Structures and Frameworks

Defining The Problem

  • Right question
  • Framing
  • Audience
  • Specificity

Heuer, Richards J., Jr. “Psychology of Intelligence Analysis,” Washington D.C.: CSI, 1999.

SLIDE 53

Structures and Frameworks

Generating Hypotheses

  • Identify all plausible hypotheses
  • Consult peers and outside experts
  • Do not yet screen out or reject
  • Consider actor deception or denial

Heuer, Richards J., Jr. “Psychology of Intelligence Analysis,” Washington D.C.: CSI, 1999.

SLIDE 54

Structures and Frameworks

Collecting Information

  • Don’t just focus on the most likely hypothesis
  • Suspend judgment
  • Notice / note info gaps

Heuer, Richards J., Jr. “Psychology of Intelligence Analysis,” Washington D.C.: CSI, 1999.

SLIDE 55

Structures and Frameworks

Evaluating Hypotheses

  • Develop arguments against each hypothesis
  • Check assumptions and their origins
  • Implement ACH frameworks (Analysis of Competing Hypotheses)

Heuer, Richards J., Jr. “Psychology of Intelligence Analysis,” Washington D.C.: CSI, 1999.
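The ACH step can be sketched as a tiny consistency matrix. In Heuer's method you score each hypothesis by the evidence inconsistent with it, then prefer the one with the least evidence against it. The hypotheses, evidence items, and ratings below are invented for illustration:

```python
# Minimal Analysis of Competing Hypotheses (ACH) sketch.
hypotheses = ["H1: criminal actor", "H2: state-sponsored", "H3: hacktivist"]

# evidence -> consistency rating per hypothesis:
# "C" consistent, "I" inconsistent, "N" neutral / not applicable
matrix = {
    "RAT is freely available":      ["C", "N", "C"],
    "C2 hosted on bulletproof VPS": ["C", "C", "I"],
    "No monetization observed":     ["I", "C", "C"],
}

def inconsistency_score(h_index: int) -> int:
    """Count how many pieces of evidence argue AGAINST a hypothesis."""
    return sum(1 for ratings in matrix.values() if ratings[h_index] == "I")

scores = {h: inconsistency_score(i) for i, h in enumerate(hypotheses)}
best = min(scores, key=scores.get)  # least evidence against, not most for
print(scores, "->", best)
```

The point of the inversion (counting evidence against rather than for) is that consistent evidence often fits several hypotheses at once, so it discriminates poorly; inconsistent evidence is what actually eliminates candidates.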

SLIDE 56

Structures and Frameworks

Selecting the Most Likely Hypothesis

  • Least evidence against
  • Reject, don’t confirm
  • Note alternate hypotheses

Heuer, Richards J., Jr. “Psychology of Intelligence Analysis,” Washington D.C.: CSI, 1999.

SLIDE 57

Structures and Frameworks

Ongoing Monitoring

  • Add data sources
  • Plug-in alternate hypotheses
  • Keep conclusions tentative

Heuer, Richards J., Jr. “Psychology of Intelligence Analysis,” Washington D.C.: CSI, 1999.

SLIDE 58

Structured Analytic Techniques for Improving Intelligence Analysis

SLIDE 59

Structures and Frameworks

Structured Analytic Techniques for Improving Intelligence Analysis

  • Diagnostic Techniques
  • Contrarian Techniques
  • Imaginative Thinking Techniques

US Government, “A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence Analysis,” March 2009.

SLIDE 60

Structures and Frameworks

Diagnostic Techniques

  • Key Assumptions Check
      • “What do we think we know? How do we think we know it?”

  • Quality of Information Check
  • Indicators or Signposts of Change
  • Analysis of Competing Hypotheses

US Government, “A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence Analysis,” March 2009.

SLIDE 61

Structures and Frameworks

Contrarian Techniques

  • Devil’s Advocacy
  • Team A / Team B
  • High-Impact / Low-Probability Analysis
  • “What If?” Analysis

Team A / Team B: using separate analytic teams to contrast competing hypotheses. High-Impact / Low-Probability Analysis: highlighting black swan events.

US Government, “A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence Analysis,” March 2009.

SLIDE 62

Structures and Frameworks

Imaginative Thinking Techniques

  • Brainstorming
  • Outside-In Thinking
  • Red Team Analysis
  • Alternative Futures Analysis

Outside-In Thinking: identifying the full range of basic forces, factors, and trends that would indirectly shape an issue. Red Team Analysis: modeling adversaries. Alternative Futures Analysis: systematically exploring multiple ways a situation can develop.

US Government, “A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence Analysis,” March 2009.

SLIDE 63

Structures and Frameworks

US Government, “A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence Analysis,” March 2009.

SLIDE 64

Application

SLIDE 65

Application - Report

  • Clearly Define
      • Goals
      • Sources
      • Audience
      • Limitations & Assumptions
      • ...

SLIDE 66

Application

METHODOLOGY

Verizon, Trend Micro, Prolexic, Recorded Future

SLIDE 67

Further Research

SLIDE 68

Further Research

  • Context
      • Chrononarcissism and historical context
      • Socio-cultural and economic context
  • Unknown unknowns and blind spots
  • Data originator biases / data supply-chain pressures
  • Baked-in bias in OSINT automation tools
  • Analysis of competing hypotheses (ACH)
SLIDE 69

Suggested Reading

  • “Psychology of Intelligence Analysis”, Richards J. Heuer, Jr.
  • “A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence Analysis”, US Government
  • “Judgment in Managerial Decision Making”, Max H. Bazerman and Don Moore

SLIDE 70

Contact Me: bbrowntalks@gmail.com
