Organizational Structure, Exploration, and Exploitation on the ELICIT Experimental Platform


SLIDE 1

Organizational Structure, Exploration, and Exploitation on the ELICIT Experimental Platform

Allan Friedman, Ethan Bernstein, David Lazer

Harvard ELICIT

SLIDE 2

Outline

  • Exploration & Exploitation
  • ELICIT overview
  • Extensions to ELICIT

– Capturing Exploration & Exploitation
– Experimental Design Tweaks

  • Experimental Summary
  • Preliminary Results

SLIDE 3

Exploration & Exploitation

  • Defined by James March (1991)

– Exploration: introduce new information
– Exploitation: use existing information

  • Ambidexterity – balancing the two

SLIDE 4

Ambidexterity & Agility

[Diagram: Agility (Robustness, Resilience) linked to Exploration and Exploitation via Adaptation and Innovation]

SLIDE 5

Exploration & Exploitation

  • Defined by James March (1991)

– Exploration: introduce new information
– Exploitation: use existing information

  • Ambidexterity – balancing the two
  • Structure and Exploration vs. Exploitation (Lazer & Friedman, 2007)

– Finding: Structural barriers can promote exploration
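This dynamic can be illustrated with a toy agent-based model in the spirit of Lazer & Friedman: agents on a network either imitate a better-performing neighbor (exploitation) or tweak their own solution (exploration). This is a loose sketch, not their actual NK-landscape simulation; the fitness function, network sizes, and parameters below are illustrative assumptions.

```python
import random

def simulate(neighbors, n_bits=15, steps=50, seed=0):
    """Toy exploration/exploitation dynamic on a fixed network.

    Each agent holds a candidate solution (a bitstring); fitness is the
    count of 1-bits, a deliberately simple stand-in for a problem
    landscape. Per step, an agent copies the best neighboring solution
    if it beats its own (exploitation); otherwise it flips one random
    bit and keeps the change only if it helps (exploration).
    Returns the population's mean fitness after `steps`.
    """
    rng = random.Random(seed)
    n = len(neighbors)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(n)]
    fit = lambda s: sum(s)
    for _ in range(steps):
        nxt = []
        for i, sol in enumerate(pop):
            best = max((pop[j] for j in neighbors[i]), key=fit)
            if fit(best) > fit(sol):            # exploitation: imitate
                nxt.append(list(best))
            else:                               # exploration: local search
                trial = list(sol)
                trial[rng.randrange(n_bits)] ^= 1
                nxt.append(trial if fit(trial) > fit(sol) else sol)
        pop = nxt
    return sum(fit(s) for s in pop) / n

# Two macrostructures over 20 agents: fully connected vs. a sparse ring.
full = [[j for j in range(20) if j != i] for i in range(20)]
ring = [[(i - 1) % 20, (i + 1) % 20] for i in range(20)]
```

With a rugged, multi-peaked fitness function the efficient fully connected network tends to converge quickly on a mediocre solution, while the sparse ring preserves solution diversity longer — the mechanism behind "structural barriers can promote exploration". The linear fitness used here keeps the sketch short.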

SLIDE 6

ELICIT Overview

  • Lab‐based “whodunnit” game
  • Computer mediated system
  • Subjects receive “factoids” and must work with each other to identify a terrorism plot

– Who
– What
– Where
– When

SLIDE 7

New Features to ELICIT

SLIDE 8

New Features to ELICIT

(well, at least new)

SLIDE 9

Search Functionality

  • Subjects have the option of seeking out new information for the organization

SLIDE 10

Myopic & Cooperative Search

SLIDE 11

Myopic & Cooperative Search

  • Red Herrings

SLIDE 12

Myopic & Cooperative Search

  • Red Herrings
  • No Silver Bullets

SLIDE 13

Myopic & Cooperative Search

  • Red Herrings
  • No Silver Bullets
  • Disintegrated Problems
  • Multiple Factoids Required

SLIDE 14

Progress Check

  • We periodically query the user for her best guess of the solution, and also allow her to enter it manually.

SLIDE 15

Experimental Design

  • Clear incentives for solving the problem
  • Control information flow through sharing patterns

– No websites

  • Enable annotation to allow theory sharing
  • Pretest to control for skill
  • Visible progress check inside the network

SLIDE 16

Visible Progress Check

  • Players may see their network neighbors’ most recent progress check updates.

– Strong tie / explicit theory sharing

SLIDE 17

Network Structures

  • Goal: individual nodes are identical

– Macrostructural variations

  • Question: How do structural impediments to information flow change the exploration/exploitation dynamic?

SLIDE 18

Caveman vs. Rewired Caveman

Motivated by Watts’ Small Worlds (1998)
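Both network treatments can be sketched in a few lines of plain Python: a connected caveman graph (cliques joined in a ring), and a rewired variant where a few edges become random long-range shortcuts. Cave counts, cave sizes, and shortcut numbers here are illustrative assumptions, not the networks actually used in the sessions.

```python
import random

def caveman_graph(n_caves, cave_size):
    """Connected caveman graph (after Watts): cliques ('caves') joined
    in a ring by rewiring one edge per cave toward the next cave."""
    n = n_caves * cave_size
    edges = set()
    for c in range(n_caves):
        cave = range(c * cave_size, (c + 1) * cave_size)
        edges.update((u, v) for u in cave for v in cave if u < v)
    for c in range(n_caves):
        first = c * cave_size
        edges.discard((first, first + 1))  # drop one in-cave edge...
        # ...and link this cave's first node to the next cave's first node
        edges.add(tuple(sorted((first, (first + cave_size) % n))))
    return n, edges

def rewire(n, edges, n_shortcuts, seed=0):
    """Rewired caveman: swap a few edges for random long-range
    shortcuts, shrinking path lengths while leaving most local
    clustering intact (the small-world effect)."""
    rng = random.Random(seed)
    edges = set(edges)
    for _ in range(n_shortcuts):
        u, v = rng.choice(sorted(edges))
        w = rng.randrange(n)
        shortcut = tuple(sorted((u, w)))
        if w != u and shortcut not in edges:  # keep the graph simple
            edges.remove((u, v))
            edges.add(shortcut)
    return edges

n, cave_edges = caveman_graph(4, 5)  # 20 nodes in 4 five-person caves
rewired_edges = rewire(n, cave_edges, n_shortcuts=3)
```

Both constructions preserve the total edge count, so any performance difference between treatments reflects topology rather than communication capacity.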

SLIDE 19

Degree‐Preserving Hierarchy

SLIDE 20

Simple Lattice

SLIDE 21

Summary of Experiments

  • 18 Experimental Sessions
  • 416 subjects
  • 70 rounds of 25 minutes
  • 58 rounds: Network × Factoid × Round Order
  • 12 rounds testing Progress Check & AV capacity
  • 1120 subject‐round records → Quantitative Analysis

SLIDE 22

Challenges with Data

  • Messy Data
  • What are the performance metrics?

– 1 solution vs. 4 sub‐problems
– Time
– Individual vs. group level

  • What to control for?
  • Challenges of understanding network‐level data

SLIDE 23

Preliminary Regression Summary

  • Rewired Cave performs somewhat better

– Other network treatments not significant

  • Large learning effect
  • Pretest is a strong predictor of performance
  • Turning off visible progress check has a large effect

SLIDE 24

Distribution of Fully Correct Answers

SLIDE 25

Distribution of Time of 1st Correct ID

SLIDE 26

Distribution of Average ID time

SLIDE 27

Ongoing Data Analysis

  • Multi‐level Models
  • Knowledge Sharing Patterns
  • Coming soon:

– Controlling for dependencies in WWWW
– Understanding the mezzo‐level effects of network neighbors
– Isomorphic network roles

SLIDE 28

Conclusions

  • Integrate Exploration & Exploitation into Agility research
  • Extend ELICIT to capture search in a complex space
  • Some evidence that structural barriers to communication can improve performance
  • Group experimental data requires careful statistical analysis.

SLIDE 29

Questions?


Thanks to the ELICIT team, especially Mary Ruddy and Szymon Letowski