Search Snippet Evaluation. Mikhail Lebedev, Pavel Braslavski, Denis Savenkov. PowerPoint PPT Presentation.




SLIDE 1

Search Snippet Evaluation

Mikhail Lebedev, Pavel Braslavski, Denis Savenkov

CLEF 2011

SLIDE 2

What is a snippet?


SLIDE 3

Why is it so important?

Search is ranking & representation
Bad snippets spoil good results
Overoptimized snippets lead to wrong clicks

SLIDE 4

When do we use evaluation?

Compare with competitors
Choose between different algorithms
Machine learning

SLIDE 5

Evaluation approach

User study
Judgments
Text-based metrics
Clicks

SLIDE 6

Evaluation approach

User study
Judgments
Text-based metrics
Clicks

SLIDE 7

Eye tracking

SLIDE 8

Eye tracking

Different users, different strategies

Title is much more important than body
Highlighting is attractive
User clicks even if snippet contains answer
Media content may be ignored

SLIDE 9

Evaluation approach

User study
Judgments
Text-based metrics
Clicks

SLIDE 10

Ideal snippet is:

Readable
Informative

SLIDE 11

Making ideal snippets

Machine learning
Training set
Judgments: absolute or relative
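A minimal sketch of how relative (pairwise) judgments can feed a machine-learned snippet generator: each judgment says which of two candidate snippets an assessor preferred, and is converted into a (winner, loser) feature pair for pairwise learning-to-rank. The data format and feature vectors here are hypothetical, not the authors' actual setup.

```python
# Hypothetical format: each judgment is (features_a, features_b, preferred),
# where preferred is 'a' or 'b'. Emit (winner, loser) training pairs.

def to_training_pairs(judgments):
    pairs = []
    for feats_a, feats_b, preferred in judgments:
        if preferred == 'a':
            pairs.append((feats_a, feats_b))
        else:
            pairs.append((feats_b, feats_a))
    return pairs

# Toy example: two judged snippet pairs with made-up feature vectors.
judged = [
    ([0.9, 0.5], [0.2, 0.1], 'a'),
    ([0.3, 0.4], [0.8, 0.7], 'b'),
]
print(to_training_pairs(judged))
# [([0.9, 0.5], [0.2, 0.1]), ([0.8, 0.7], [0.3, 0.4])]
```

A pairwise ranker (e.g. RankSVM-style) can then be trained so the winner's score exceeds the loser's.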

SLIDE 12

Relative assessment

SLIDE 13

Assessment issues

Assessors learn
Snippet quality depends on ranking
Assessment tool interface matters
Pairs vs. groups: time vs. quality

SLIDE 14

Evaluation approach

User study
Judgments
Text-based metrics
Clicks

SLIDE 15

Automated metrics

Highlighting
Neatness
Number of empty snippets
Unique query words
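Two of the text-based metrics above can be sketched in a few lines; the exact definitions the authors used are not given on the slide, so these are illustrative implementations: the fraction of distinct query terms covered by a snippet, and the share of empty snippets on a result page.

```python
# Illustrative definitions (assumptions, not the authors' exact metrics).

def unique_query_words(snippet, query):
    """Fraction of distinct query terms that appear in the snippet."""
    q_terms = set(query.lower().split())
    s_terms = set(snippet.lower().split())
    return len(q_terms & s_terms) / len(q_terms) if q_terms else 0.0

def empty_snippet_rate(snippets):
    """Share of result snippets that are empty or whitespace-only."""
    return sum(1 for s in snippets if not s.strip()) / len(snippets)

snips = ["Python tutorial for beginners", "", "Learn Python fast"]
print(unique_query_words(snips[0], "python tutorial"))  # 1.0
print(empty_snippet_rate(snips))
```

Such metrics are cheap to compute over millions of snippets, which is why the slides call them fast but rough.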

SLIDE 16

Evaluation approach

User study
Judgments
Text-based metrics
Clicks

SLIDE 17

Click data

Dwell time
Abandonment
Inversions
Time to first click
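Two of the click metrics above can be sketched over a query log; the session/log schema here is assumed for illustration: abandonment rate (sessions with no click at all) and mean time to first click.

```python
# Hypothetical log schema: each session has a "clicks" list, and each
# click records "t", the seconds elapsed since the result page loaded.

def abandonment_rate(sessions):
    """Fraction of sessions with no click at all."""
    return sum(1 for s in sessions if not s["clicks"]) / len(sessions)

def mean_time_to_first_click(sessions):
    """Average seconds to the first click, over sessions that clicked."""
    times = [s["clicks"][0]["t"] for s in sessions if s["clicks"]]
    return sum(times) / len(times) if times else None

log = [
    {"clicks": [{"t": 2.5}, {"t": 9.0}]},
    {"clicks": []},
    {"clicks": [{"t": 4.5}]},
]
print(abandonment_rate(log))          # 1/3 of sessions were abandoned
print(mean_time_to_first_click(log))  # 3.5
```

As the conclusions slide notes, such metrics are influenced by ranking quality, not snippet quality alone.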

SLIDE 18

Conclusions

Different goals – different methods

User study: making assumptions
Judgments: expensive, high quality
Text-based metrics: fast but rough
Clicks: influenced by ranking

SLIDE 19

Future

Integral metrics
Learning on clicks