Fidelity versus Interpretability Derek Bridge Insight Centre for - - PowerPoint PPT Presentation



SLIDE 1

Explaining Recommendations: Fidelity versus Interpretability

Derek Bridge Insight Centre for Data Analytics University College Cork, Ireland

SLIDE 2

Overview

  • Recommender Systems
  • Explaining Recommendations
  • Case Studies
  • Concluding Remarks
SLIDE 3

RECOMMENDER SYSTEMS

SLIDE 4

What is a Recommender System?

  • Software that helps users discover
    – new music and other media
    – cultural artefacts such as works of art and architecture
    – products and services
    – travel experiences
    – …
  • Recommendations must typically be
    – relevant to the user (‘personalized’) and the context-of-use (‘contextualized’)
    – diverse
    – serendipitous
    – …

Photo by Nickolai Kashirin (CC BY 2.0)

SLIDES 5-6

A Scenario

A hungry academic receives a recommendation for a place-to-eat, but it is not within walking distance, and it serves a fusion-style cuisine with which the academic is unfamiliar (high cost, high uncertainty). Her confidence in the recommendation might be improved by an explanation.

SLIDES 7-9

Types of Recommender System

  • Content-based
  • Collaborative
    – User-based nearest-neighbours
    – Item-based nearest-neighbours
    – Matrix factorization

[Diagram: Training set → Build Model; given the User & Context-of-use, the model produces a Recommendation]

SLIDE 10

Content-Based

[Figure: movies annotated with genre features: crime/drama; adventure/drama/fantasy; western; action/sci-fi; comedy/drama/romance]

SLIDES 11-12

User-Based Nearest-Neighbours

[Figure: a user-item ratings matrix; user-user similarity compares the rows]
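The row-wise similarity above can be sketched as follows; this is a minimal illustration with a made-up ratings matrix (0 = unrated) and plain cosine over co-rated items, one common choice among several:

```python
import numpy as np

# Toy user-item ratings matrix in the style of the slide; 0 means unrated.
R = np.array([[5, 4, 0, 4, 3],
              [4, 0, 5, 2, 5],
              [3, 3, 2, 0, 5],
              [4, 1, 0, 5, 0]], dtype=float)

def user_sim(u, v):
    """Cosine similarity between users u and v over co-rated items."""
    mask = (R[u] > 0) & (R[v] > 0)          # items both users rated
    if not mask.any():
        return 0.0
    a, b = R[u, mask], R[v, mask]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(user_sim(0, 1))
```

Real systems often mean-centre each user's ratings first (Pearson correlation) to correct for users who rate generously or harshly.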

SLIDES 13-14

Item-Based Nearest-Neighbours

[Figure: the same ratings matrix; item-item similarity compares the columns]
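The column-wise variant can be sketched the same way; again a toy matrix of my own, with a prediction step that averages the user's own ratings weighted by item-item similarity (one standard formulation, not necessarily the one in the talk):

```python
import numpy as np

# Toy user-item ratings matrix; 0 means unrated.
R = np.array([[5, 4, 0, 4],
              [4, 0, 5, 2],
              [3, 3, 2, 0],
              [4, 1, 0, 5]], dtype=float)

def item_sim(i, j):
    """Cosine similarity between item columns i and j over co-raters."""
    mask = (R[:, i] > 0) & (R[:, j] > 0)
    if not mask.any():
        return 0.0
    a, b = R[mask, i], R[mask, j]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def predict(u, i):
    """Predict user u's rating of item i from her ratings of similar items."""
    rated = [j for j in range(R.shape[1]) if R[u, j] > 0 and j != i]
    sims = np.array([item_sim(i, j) for j in rated])
    ratings = np.array([R[u, j] for j in rated])
    return float(sims @ ratings / sims.sum()) if sims.sum() else 0.0

print(predict(1, 1))   # user 1's predicted rating for item 1
```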

SLIDES 15-17

Matrix Factorization

[Figure: the n-users × o-movies ratings matrix is approximated by the product of an n × g matrix of user factors and a g × o matrix of movie factors, for g latent factors]

R ≈ P × Q
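The factorization can be learned by stochastic gradient descent on the observed entries; a minimal sketch with made-up ratings and hyperparameters of my choosing (the slides do not specify a training method):

```python
import numpy as np

# Learn P (n x g) and Q (g x o) so that P @ Q approximates the observed
# ratings (0 = unobserved); plain SGD on squared error with L2 regularization.
rng = np.random.default_rng(0)
R = np.array([[5, 4, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
n, o = R.shape
g = 2                                   # number of latent factors
P = 0.1 * rng.normal(size=(n, g))
Q = 0.1 * rng.normal(size=(g, o))
obs = [(u, i) for u in range(n) for i in range(o) if R[u, i] > 0]

for _ in range(2000):
    for u, i in obs:
        err = R[u, i] - P[u] @ Q[:, i]
        P[u] += 0.01 * (err * Q[:, i] - 0.02 * P[u])    # gradient step
        Q[:, i] += 0.01 * (err * P[u] - 0.02 * Q[:, i])

rmse = np.sqrt(np.mean([(R[u, i] - P[u] @ Q[:, i]) ** 2 for u, i in obs]))
print("training RMSE:", rmse)
```

A row of P summarises a user's tastes in the g latent factors; a column of Q summarises a movie, which is exactly why such models are hard to explain: the factors have no guaranteed human-readable meaning.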

SLIDE 18

Ever More Complex Models

  • Hybrids and Ensembles
  • Latent Feature Spaces
  • Multi-Objective Systems
  • Deep Models

SLIDE 19

Interpretable Models

  • Intelligible global descriptions of systems
    – e.g. decision trees
    – e.g. linear models (esp. sparse linear models)
  • Challenges
    – preserving accuracy
    – intelligibility, e.g. when there are many features or highly-engineered features
    – protecting Intellectual Property
  • Interpretable deep models
    – learn to associate semantic features with nodes in hidden layers

[Figure: the classic play-tennis decision tree, splitting on Outlook, Humidity and Wind]
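A sparse linear model of the kind the slide mentions can be sketched as follows; the data is synthetic and the lasso is fitted by hand-rolled proximal gradient (ISTA), purely as an illustration of why sparsity aids interpretability:

```python
import numpy as np

# Synthetic regression where only 2 of 10 features matter; an L1 penalty
# drives the irrelevant coefficients to zero, leaving a short explanation.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
true_w = np.array([2.0, -1.5] + [0.0] * 8)
y = X @ true_w + 0.01 * rng.normal(size=200)

def lasso_ista(X, y, lam=0.1, iters=2000):
    n, d = X.shape
    lr = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)   # step from Lipschitz bound
    w = np.zeros(d)
    for _ in range(iters):
        grad = X.T @ (X @ w - y) / n
        w = w - lr * grad
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # soft-threshold
    return w

w = lasso_ista(X, y)
print("learned coefficients:", np.round(w, 2))
```

Only two coefficients survive, so the model can be read as a two-clause explanation; this is the accuracy-versus-intelligibility trade-off in miniature.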

SLIDE 20

DARPA’s XAI Initiative

https://www.darpa.mil/program/explainable-artificial-intelligence

SLIDE 21

EXPLAINING RECOMMENDATIONS

SLIDES 22-25

Explanations are Relational

  • Recommendation only
    – “You might like Never Let Me Go”
  • Recommendation plus description
    – “You might like Never Let Me Go, a 2010 dystopian drama based on the 2005 novel of the same name…”
  • Recommendation plus explanation
    – “You liked Atonement, so you might also like Never Let Me Go”

SLIDES 26-32

Intermediaries in Explanations

[Diagram, after Vig et al., 2009: paths from the User to the Recommendation via intermediaries: the User likes Items that are similar to the Recommendation; the User is similar to Users who like it; the User likes Features that are present in it]

SLIDE 33

Explanation Dimensions

Explanations should be: Interpretable; Actionable; Cheap-to-compute; Sound and Complete (Fidelity); Ethical

SLIDES 34-35

Fidelity

  • Soundness: how truthful each element in an explanation is with respect to the underlying system
  • Completeness: the extent to which an explanation describes all of the underlying system

[Kulesza et al., 2013]

Higher fidelity brings increasing trust, fewer requests for clarification, and better understanding.

SLIDES 36-40

White-Box Explanations

[Diagram: Training set → Build Model; given the User & Context-of-use, the model outputs a Recommendation + “trace” data; Explanation Generation turns these into a Recommendation + Explanation]

Explanations generated this way are sound.

SLIDES 41-48

Black-Box Explanations

[Diagram: Training set → Build Model; given the User & Context-of-use, the model outputs only a Recommendation; Explanation Generation relies on other data and on queries to the model to produce a Recommendation + Explanation]

Model-agnostic, but probably not sound.

SLIDE 49

Why Explain?

  • Scrutability
  • Trust
  • Persuasion
  • Decision-support

SLIDE 50

CASE STUDIES

SLIDE 51

CASE STUDY A

White-Box Explanations of User-Based Nearest-Neighbours Recommendations

SLIDES 52-54

Explaining User-Based Nearest-Neighbours Recommendations

  • Difficulties
    – Often 50+ neighbours
    – Userids are meaningless: strangers!
    – Profiles are large and private
  • Good at persuading [Herlocker et al., 2000]
  • Less good for trust and decision-support [Bilgic & Mooney, 2005]

[Diagram: the User is similar to Users who like the Recommendation]

[Diagram: Training set → Build Model; User & Context-of-use → Recommendation + neighbours → Explanation Generation → Recommendation + Explanation]

SLIDE 55

CASE STUDY B

Item-Based Explanations for User-Based Nearest-Neighbours Recommendations

SLIDE 56

Item-Based Explanations

  • Good for trust and decision-support [Bilgic & Mooney, 2005]
  • Familiar, e.g. Amazon:
    – “Customers who bought Atonement also bought Never Let Me Go”

[Diagram: the User likes Items that are similar to the Recommendation]

SLIDES 57-60

Item-Based Explanations for User-Based Recommendations

  • Explanation partners: the user’s neighbours
  • Candidate items: the movies the user has in common with her partners
  • Association rules: rules that link candidates to the recommended item

[Bridge & Dunleavy, 2014; Kaminskas, Durão & Bridge, 2017]

[Diagram: Training set → Build Model; User & Context-of-use → Recommendation + neighbours; association rules mined from the training set feed Explanation Generation → Recommendation + Explanation]
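The rule-mining step can be sketched as follows; the toy data and the single-antecedent formulation are my illustration, not the cited papers' actual algorithm:

```python
# Find rules "candidate -> recommended item" with sufficient support and
# confidence, where candidates are movies the user shares with her neighbours.
transactions = [  # toy "users who liked these movies" data
    {"Atonement", "Never Let Me Go", "The Reader"},
    {"Atonement", "Never Let Me Go"},
    {"Atonement", "The Reader"},
    {"Never Let Me Go", "The Reader"},
]

def rules_for(rec, candidates, transactions, min_supp=0.25, min_conf=0.5):
    n = len(transactions)
    rules = []
    for c in candidates:
        with_c = [t for t in transactions if c in t]           # users who liked c
        both = sum(1 for t in with_c if rec in t)              # ...and rec too
        supp = both / n
        conf = both / len(with_c) if with_c else 0.0
        if supp >= min_supp and conf >= min_conf:
            rules.append((c, rec, supp, conf))
    return rules

print(rules_for("Never Let Me Go", {"Atonement", "The Reader"}, transactions))
```

A surviving rule reads directly as an explanation in the familiar Amazon style: “Users who liked Atonement also liked Never Let Me Go”.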

SLIDE 61

CASE STUDIES C & D

Using Queries in Black-Box Explanations

(but these two case studies are not recommender systems!)

SLIDE 62

Queries in Black-Box Explanations

[Diagram: Training set → Build Model; User & Context-of-use → Recommendation; Explanation Generation queries the model → Recommendation + Explanation]

SLIDES 63-67

Are You Safe To Drive?

[Figure: cases plotted by Weight and Units consumed, with the classifier’s decision boundary shown] [Doyle et al., 2004; Bridge & Cummins, 2006]

SLIDE 68

LIME: Explanations for any Classifier

[Diagram: Training set → Build Model; queries perturbed around the instance y → Build Local Model → Classification + Explanation]

arg min_{h ∈ H} ℒ(g, h, Π_y) + Ω(h)

[Ribeiro et al., 2016]
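The objective above can be sketched as follows; this is a minimal LIME-style illustration (not the authors’ code), with a stand-in black box g, a Gaussian proximity kernel for Π_y, and an unregularised weighted linear fit in place of the full ℒ + Ω:

```python
import numpy as np

rng = np.random.default_rng(1)

def g(X):                       # stand-in black-box classifier
    return (np.sin(X[:, 0]) + X[:, 1] ** 2 > 1.0).astype(float)

y = np.array([0.5, 1.0])                         # instance to explain
X = y + 0.5 * rng.normal(size=(500, 2))          # perturbed queries around y
w = np.exp(-np.sum((X - y) ** 2, axis=1))        # proximity kernel Pi_y
A = np.c_[np.ones(len(X)), X]                    # intercept + features

# Weighted least squares: fit local linear surrogate h to the black box's
# answers, weighting each query by its proximity to y.
coef, *_ = np.linalg.lstsq(A * w[:, None] ** 0.5,
                           g(X) * w ** 0.5, rcond=None)
print("local feature weights:", coef[1:])
```

The surrogate’s weights are the explanation: near y, they say which features push the black box’s output up or down, which is sound only locally.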

SLIDE 69

CASE STUDY E

Opinionated Recommendation

SLIDES 70-72

Opinionated Recommendation

  • Reviews as a source of features & sentiment
  • Recommendations that balance similarity & sentiment
  • Novel explanation generation
  • Re-ranking of recommendations

[Muhammad et al., 2016]
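The balance between similarity and sentiment can be sketched as a weighted blend; the items, scores and the alpha weight below are my illustration, not values or the scoring function from Muhammad et al., 2016:

```python
# Re-rank items by a blend of similarity to the user's profile and
# sentiment mined from the items' reviews.
items = {
    "Cafe A": {"sim": 0.9, "sentiment": 0.4},
    "Cafe B": {"sim": 0.7, "sentiment": 0.9},
    "Cafe C": {"sim": 0.5, "sentiment": 0.8},
}

def score(item, alpha=0.5):
    """alpha trades off similarity against review sentiment."""
    return alpha * item["sim"] + (1 - alpha) * item["sentiment"]

ranking = sorted(items, key=lambda name: score(items[name]), reverse=True)
print(ranking)
```

With this blend, an item that is merely similar can be overtaken by one that reviewers genuinely liked, and the contributing review features supply the explanation.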

SLIDE 73

CASE STUDY F

Recommendation-By-Explanation

SLIDES 74-83

Recommendation-By-Explanation

[Figure: an explanation chain linking the user’s past preferences to a candidate movie; adjacent movies in the chain share features]

  • The Notebook: star-crossed-lovers; secret-love; broken-engagement; volunteer; u.s.-army; romantic-rivalry; self-discovery; ….
  • The Illusionist: fiancé-fiancée relationship; shooting; secret-love; broken-engagement; star-crossed-lovers
  • Pearl Harbour: fiancé-fiancée relationship; shooting; secret-mission; volunteer; parachute
  • Big Fish: romantic-rivalry; carnival; secret-mission; parachute

[Rana & Bridge, 2017]

SLIDE 84

Recommendation-By-Explanation

[Bar charts comparing R-by-E with a content-based recommender (CB) on Precision, Diversity and Surprise]

SLIDE 85

Comparison

  • Classical approaches: generate & rank recommendations, then generate explanations
  • Opinionated Recommendation: generate recommendations, generate explanations, then re-rank the recommendations
  • Recommendation-By-Explanation: generate reasons to recommend (explanations), then recommend the items with the best reasons
SLIDE 86

CONCLUDING REMARKS

SLIDE 87

The Future

  • Drive explanation design & evaluation by explanation goals
  • Bring explanations into the heart of recommenders
  • Design explanations that go beyond relevance to the user
    – context-aware
    – diversity
    – serendipity

SLIDE 88

The Future

  • Design ways to present and visualise recommendations and explanations
  • Design explanations for systems that are more conversational
    – why are you recommending this?
    – why aren’t you recommending that?
    – why this ahead of that?
    – what would you recommend if things were different? (counterfactuals)

SLIDE 89

Acknowledgements

Lisa Cummins Kevin Dunleavy Fred Durão Arpit Rana Marius Kaminskas Dónal Doyle Padraig Cunningham

and all other members of the Recommender Systems Group, Insight Centre for Data Analytics

SLIDE 90

References

  • Bilgic, Mustafa and Mooney, Raymond: Explaining Recommendations: Satisfaction vs. Promotion, Proceedings of Beyond Personalization 2005: A Workshop on the Next Stage of Recommender Systems Research at the 2005 International Conference on Intelligent User Interfaces, 2005
  • Bridge, Derek and Dunleavy, Kevin: If you liked Herlocker et al.'s explanations paper, then you might like this paper too, Proceedings of the Workshop on Interfaces and Human Decision Making for Recommender Systems (Workshop Programme of the Eighth ACM Conference on Recommender Systems), pp.22-27, 2014
  • Cummins, Lisa and Bridge, Derek: KLEOR: A Knowledge Lite Approach to Explanation Oriented Retrieval, Computing and Informatics, vol.25(2-3), pp.173-193, 2006
  • Doyle, Dónal and Cunningham, Pádraig and Bridge, Derek and Rahman, Yusof: Explanation Oriented Retrieval, Proceedings of the 7th European Conference on Case-Based Reasoning, 2004
  • Herlocker, Jonathan L. and Konstan, Joseph A. and Riedl, John: Explaining Collaborative Filtering Recommendations, Proceedings of the 2000 ACM Conference on Computer Supported Cooperative Work, pp.241-250, 2000
  • Kaminskas, Marius and Durão, Fred and Bridge, Derek: Item-Based Explanations for User-Based Recommendations, Proceedings of eKNOW 2017, The Ninth International Conference on Information, Process, and Knowledge Management, pp.65-70, 2017
  • Kulesza, Todd and Stumpf, Simone and Burnett, Margaret and Yang, Sherry and Kwan, Irwin and Wong, Weng-Keen: Too Much, Too Little, or Just Right? Ways Explanations Impact End Users’ Mental Models, Proceedings of the IEEE Symposium on Visual Languages and Human-Centric Computing, 2013
  • Muhammad, Khalil and Lawlor, Aonghus and Smyth, Barry: On the Use of Opinionated Explanations to Rank and Justify Recommendations, Proceedings of the FLAIRS Conference, pp.554-559, 2016
  • Rana, Arpit and Bridge, Derek: Explanation Chains: Recommendations by Explanation, Proceedings of the Poster Track of the 11th ACM Conference on Recommender Systems, CEUR Workshop Proceedings, vol-1905, 2017
  • Ribeiro, Marco and Singh, Sameer and Guestrin, Carlos: "Why Should I Trust You?": Explaining the Predictions of Any Classifier, http://arxiv.org/abs/1602.04938, 2016
  • Vig, Jesse and Sen, Shilad and Riedl, John: Tagsplanations: Explaining Recommendations Using Tags, Proceedings of the 14th International Conference on Intelligent User Interfaces, pp.47-56, 2009
SLIDE 91

Obrigado! (Thank you!)