Learning Context Effects in Triadic Closure


SLIDE 1

SINM 2020

Learning Context Effects in Triadic Closure

Kiran Tomlinson, research with Austin R. Benson

Slides: bit.ly/lcl-slides
Preprint: bit.ly/lcl-paper
Code: bit.ly/lcl-code
Data: bit.ly/lcl-data

SLIDES 2–6

What factors drive edge formation?

Preferential attachment
(Barabási & Albert, Science 1999)

Homophily
(McPherson et al., Annual Review of Sociology 2001)
(Papadopoulos et al., Nature 2012)

Fitness
(Bianconi & Barabási, Europhysics Letters 2001)
(Caldarelli et al., Physical Review Letters 2002)

Triadic closure
(Rapoport, Bulletin of Mathematical Biophysics 1953)
(Jin et al., Physical Review E 2001)

SLIDES 7–18

“Choosing to grow a graph”
(Overgoor et al., SINM ’19 & WWW ’19)
(Gupta & Porter, arXiv 2020)

Traditional discrete choice: a chooser selects an item from a choice set
(under-explored in sociology: Bruch & Feinberg, Annual Review of Sociology 2017)

In network growth, the chooser is a node and the choice set contains candidate target nodes.

Multinomial logit (MNL) (McFadden, 1973): preferences are linear in node features
(similarity, in-degree, fitness, …)

Key usage: infer the relative importance of edge formation mechanisms from data.
Timestamped edges → meaningful choice sets
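The MNL setup above can be sketched in a few lines. This is an illustrative implementation, not the authors' code, and the feature vectors and weights below are made up:

```python
import math

def mnl_probs(theta, X):
    """Multinomial logit: Pr(i | C) is proportional to exp(theta^T x_i),
    normalized over the candidates in X."""
    utils = [sum(t * x for t, x in zip(theta, xi)) for xi in X]
    m = max(utils)                      # subtract max for numerical stability
    exps = [math.exp(u - m) for u in utils]
    Z = sum(exps)
    return [e / Z for e in exps]

# Hypothetical example: two features per candidate node,
# e.g. (log in-degree, # shared neighbors)
theta = [1.0, 0.5]
X = [[0.0, 2.0], [1.0, 0.0], [2.0, 1.0]]
probs = mnl_probs(theta, X)
```

One quirk worth noting: in MNL, adding or removing candidates rescales all probabilities but never changes their ratios, and that independence assumption is exactly what context effects (next section) violate.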

SLIDES 19–30

The choice set affects preferences

Context effects
(Huber et al., Journal of Consumer Research 1982)
(Simonson & Tversky, Journal of Marketing Research 1992)

e.g., compromise effect (Simonson, Journal of Consumer Research 1989):
offered {$10, $15, $20}, people favor the middle option, $15; offered {$15, $20, $25}, they favor $20. The same item's appeal depends on the set it appears in.

In networks: e.g., how do preferences change when choosing from a popular group?

Our model: linear context logit (LCL)

Pr(i, C) = exp([θ + A x_C]^T x_i) / ∑_{j∈C} exp([θ + A x_C]^T x_j)

θ: base preferences
A: context effect matrix
x_C: mean features over the choice set
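A minimal sketch of the LCL probability in plain Python; the θ, A, and feature values here are invented for illustration. Setting A = 0 recovers MNL; a nonzero entry A[r][c] lets the choice set's mean value of feature c shift the weight on feature r:

```python
import math

def mean_features(X):
    """x_C: mean feature vector of the choice set."""
    n = len(X)
    return [sum(x[k] for x in X) / n for k in range(len(X[0]))]

def lcl_probs(theta, A, X):
    """Linear context logit: Pr(i, C) proportional to exp((theta + A x_C)^T x_i)."""
    xC = mean_features(X)
    d = len(theta)
    # context-adjusted preference vector theta + A x_C (column c acts on row r)
    w = [theta[r] + sum(A[r][c] * xC[c] for c in range(d)) for r in range(d)]
    utils = [sum(wk * xk for wk, xk in zip(w, xi)) for xi in X]
    m = max(utils)                      # subtract max for numerical stability
    exps = [math.exp(u - m) for u in utils]
    Z = sum(exps)
    return [e / Z for e in exps]

theta = [1.0, 0.5]
X = [[0.0, 2.0], [1.0, 0.0], [2.0, 1.0]]               # made-up feature vectors

p_mnl = lcl_probs(theta, [[0.0, 0.0], [0.0, 0.0]], X)  # A = 0: plain MNL
A_ctx = [[0.0, -1.5], [0.0, 0.0]]  # high mean of feature 2 suppresses feature 1's weight
p_ctx = lcl_probs(theta, A_ctx, X)
```

With A = 0 the third candidate is favored; with the context matrix turned on, the favorite flips to the first candidate, so the same base preferences yield different choices depending on the set's composition.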

SLIDES 31–46

Choosing to close triangles

Triadic closure offers small choice sets
→ tractable inference
→ varied choice sets

Our data: timestamped edges (including repeats)

Example: u has an edge to v, and v has edges to w1, w2, w3. When u closes a triangle:
chooser: u
choice set: {w1, w2, w3}
choice: w1

Node features
  • 1. in-degree of w
  • 2. # shared neighbors of u, w
  • 3. weight of edge w→u
  • 4. time since last edge into w
  • 5. time since last edge out of w
  • 6. time since last w→u edge
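One way to read this setup as code: scan the timestamped edges in order, and whenever a new edge's target is already a friend-of-a-friend of the source, record a (chooser, choice set, choice) instance. This sketch is our reading, not the paper's exact pipeline: it uses directed out-neighborhoods only and skips repeat edges as choice events (repeats would still feed the weight and recency features).

```python
def triadic_choice_instances(edges):
    """edges: list of (source, target, timestamp) sorted by timestamp.
    Returns (chooser, choice_set, choice) triples for edges that close a
    triangle, i.e. the target was a friend-of-a-friend of the source."""
    out_nbrs = {}                       # node -> set of current out-neighbors
    instances = []
    for u, w, _t in edges:
        nbrs = out_nbrs.get(u, set())
        fof = set()                     # friends-of-friends of u
        for v in nbrs:
            fof |= out_nbrs.get(v, set())
        fof -= nbrs                     # exclude existing neighbors (repeats)
        fof.discard(u)
        if w in fof:                    # this edge closes a triangle
            instances.append((u, frozenset(fof), w))
        out_nbrs.setdefault(u, set()).add(w)
    return instances

# Tiny example: u→v, v→w1, v→w2, then u→w1 closes a triangle,
# so u chose w1 from the choice set {w1, w2}
edges = [("u", "v", 0), ("v", "w1", 1), ("v", "w2", 2), ("u", "w1", 3)]
inst = triadic_choice_instances(edges)
```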

SLIDES 47–52

Context matters in triadic closure

Datasets (bit.ly/lcl-data)

email-enron
email-eu
email-w3c
wiki-talk
reddit-hyperlink
bitcoin-alpha
bitcoin-otc
mathoverflow
college-msg
facebook-wall
sms-a
sms-b
sms-c

Synthetic data: no context effects
Commenting network: linear context effects
Email network: nonlinear context effects?

SLIDES 53–65

LCL reveals interpretable context effects

Estimation: MLE to infer LCL

ℓ(θ, A; D) = ∑_{(i,C)∈D} [ (θ + A x_C)^T x_i − log ∑_{j∈C} exp((θ + A x_C)^T x_j) ]    (concave)

where D is the set of observed (choice, choice set) pairs.

Context effect matrix A, shown across L1 regularization levels
red: +, blue: −, white: 0 (column acts on row)
Node features (left-right, top-bottom):
1. in-degree
2. shared neighbors
3. reciprocal weight
4. send recency
5. receive recency
6. reciprocal recency

LCL negative log-likelihood (lower = better); likelihood-ratio test vs. MNL, significance threshold p < 0.001

“popularity matters less when choosing from close connections”
“close connections matter more when choosing from the popular”
“popularity matters less when your inbox is full of recent emails”

SLIDES 66–71

Other things in our paper

Kiran Tomlinson and Austin R. Benson
Learning Interpretable Feature Context Effects in Discrete Choice
arXiv:2009.03417, September 2020

  • LCL derivation from simple assumptions
  • More flexible model: decomposed LCL
  • LCL identifiability condition
  • Application to general choice data
  • Accounting for context improves prediction

bit.ly/lcl-paper

SLIDES 72–73

Concluding thoughts

Key takeaway: context effects matter in triadic closure

Challenges
  Features correlate
  Causal context effects?
  Handling nonlinearity?
  Global edge formation modes?
  Missing timestamps?

Acknowledgments: funding from NSF and ARO; thanks to Johan Ugander and Jan Overgoor

Slides: bit.ly/lcl-slides
Preprint: bit.ly/lcl-paper
Code: bit.ly/lcl-code
Data: bit.ly/lcl-data

Thank you!

More questions or ideas?
Email me: kt@cs.cornell.edu