Context Attentive Document Ranking and Query Suggestion

Wasi Uddin Ahmad (University of California, Los Angeles), Kai-Wei Chang (University of California, Los Angeles), Hongning Wang (University of Virginia)


SLIDE 1

Context Attentive Document Ranking and Query Suggestion

Wasi Uddin Ahmad, University of California, Los Angeles
Kai-Wei Chang, University of California, Los Angeles
Hongning Wang, University of Virginia
https://github.com/wasiahmad/context_attentive_ir

Code will be released soon!

SLIDE 2

Search Logs Provide Rich Context to Understand Users’ Search Tasks

Context Attentive Ranking and Suggestion 2 CS@UCLA

5/29/2012 14:06:04  coney island cincinnati
5/29/2012 14:11:49  sas
5/29/2012 14:12:01  sas shoes
5/30/2012 12:12:04  exit #72 and 275 lodging
5/30/2012 12:25:19  6pm.com
5/30/2012 12:49:21  coupon for 6pm
5/31/2012 19:40:38  motel 6 locations
5/31/2012 19:45:04  hotels near coney island

Task: an atomic information need that may result in one or more queries.

SLIDE 3

Clicks Further Enrich Context to Understand Users’ Search Tasks


[Figure: result lists R1, R2, R3 for an example task with the queries "low cost decoration", "decorate bedroom on a limited budget", and "furniture for decoration"; clicked and skipped documents are marked. Clicked snippets: "decoration options for home, patio, outdoor etc. low cost bedroom decorations." and "cheap furniture for bedroom. buying used furniture is an alternative route. used and cheap futon, dressers, cupboards." Callouts: summaries of clicked documents serve as guides to reformulate the query, and previous clicks help to rank documents.]

SLIDE 4

Our Proposal

  • Attend over previous in-task queries and clicks when performing retrieval tasks
  • Learn to utilize context across multiple retrieval activities


[Figure: given the in-task query sequence (q0, q2, q3, q5, q6) and the clicked documents so far, the model (1) ranks candidate documents (D2, D3, D5, D6) and (2) predicts the next query (q7).]

SLIDE 5

A Context Attentive Solution

  • A two-level hierarchical structure
    • Task context embedding
    • Session-query and session-click encoders
    • Context-attentive representations
  • Multi-task Learning
    • Document Ranking
    • Query Suggestion


[Architecture diagram: a word-level Query Encoder and a Document Encoder, each with inner attention, form the lower level; a Session-Query Encoder and a Session-Click Encoder aggregate the in-task queries (e.g., "it masters ny", "software engineer ny", "it position ny") and the clicked documents into context-attentive representations, which supply the initial states of the Ranker and the Recommender.]
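The "Inner Attention" boxes in the diagram denote attention pooling within a sequence: each hidden state is scored, the scores are softmax-normalized, and the states are summed with those weights. A minimal pure-Python sketch of one common formulation (the scoring vector `w` and the dimensions are illustrative assumptions, not the paper's exact parameterization):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def inner_attention(hidden_states, w):
    """Pool a sequence of hidden-state vectors into a single vector.
    Each state h_t gets the score w.h_t; the scores are softmax-normalized
    and the states are summed with those weights."""
    scores = [sum(wi * hi for wi, hi in zip(w, h)) for h in hidden_states]
    alphas = softmax(scores)
    dim = len(hidden_states[0])
    return [sum(a * h[i] for a, h in zip(alphas, hidden_states))
            for i in range(dim)]
```

With a zero scoring vector this degenerates to mean pooling; a large score on one state makes the pooled vector approach that state.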

SLIDE 6

A Context Attentive Solution

[Same bullets and architecture diagram as the previous slide, shown again as a presentation build with the upper-level components highlighted.]



SLIDE 9

Multi-task Learning Objective

  • Optimized via Regularized Multi-task Learning


The objective sums a negative log-likelihood term and a regularization term.

[Figure: the example task from earlier, with result lists R1, R2, R3 for "low cost decoration", "decorate bedroom on a limited budget", and "furniture for decoration"; Task 1: document ranking, Task 2: next-query prediction.]
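The slide labels the two parts of the loss as a negative log-likelihood and a regularization term; a sketch of the typical form of such a regularized multi-task objective (the notation is illustrative, not taken verbatim from the paper):

```latex
\mathcal{L}(\Theta) \;=\;
\underbrace{\mathcal{L}_{\text{rank}}(\Theta)}_{\text{Task 1: document ranking}}
\;+\;
\underbrace{-\sum_{t} \log P\!\left(w_t \mid w_{<t}, \text{context};\, \Theta\right)}_{\text{Task 2: next-query prediction (NLL)}}
\;+\;
\lambda \,\lVert \Theta \rVert_2^2
```

Here $w_t$ ranges over the tokens of the ground-truth next query and $\lambda$ weights the penalty on the shared parameters $\Theta$.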

SLIDE 10

Experiments


SLIDE 11

Data Source

  • AOL search log – a 13-week search log of ~650k users
    • Background set – 5 weeks
    • Training set – 6 weeks
    • Validation and test sets – 2 weeks
  • Aggregation of candidate documents for ranking
    • Candidates are sampled from the top 1000 documents retrieved by BM25 from a pool of 1M documents
  • Task segmentation
    • Query embedding: average of the query’s term vectors
    • Consecutive queries* with cosine similarity > 0.5 belong to the same task


* issued within a 30-minute interval
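The segmentation heuristic above can be sketched in a few lines: average each query's term vectors, then start a new task whenever the time gap exceeds 30 minutes or the cosine similarity with the previous query drops to 0.5 or below. The term vectors, dimensionality, and log below are illustrative (this is not the authors' released code):

```python
import math

def query_embedding(query, term_vectors, dim=3):
    """Average the term vectors of a query's words (unknown words skipped).
    `dim` is an assumed embedding dimensionality for the zero fallback."""
    vecs = [term_vectors[w] for w in query.split() if w in term_vectors]
    if not vecs:
        return [0.0] * dim
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def segment_tasks(log, term_vectors, sim_threshold=0.5, max_gap_secs=1800):
    """Group a chronological list of (timestamp_secs, query) pairs into tasks:
    a query continues the current task only if it arrives within 30 minutes of
    the previous query and their embeddings' cosine similarity > threshold."""
    tasks = []
    prev_ts, prev_emb = None, None
    for ts, q in log:
        emb = query_embedding(q, term_vectors)
        same_task = (
            tasks
            and ts - prev_ts <= max_gap_secs
            and cosine(emb, prev_emb) > sim_threshold
        )
        if same_task:
            tasks[-1].append(q)
        else:
            tasks.append([q])
        prev_ts, prev_emb = ts, emb
    return tasks
```

For example, "sas" followed a minute later by "sas shoes" stays in one task, while a dissimilar query or one issued hours later opens a new task.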

SLIDE 12

Data Statistics

  • Only the document titles are utilized in the experiments


SLIDE 13

Evaluation Metrics and Baselines

  • Document Ranking
  • Metrics – MAP, MRR, NDCG@k (where k=1,3,10)
  • Baselines – BM25, QL, FixInt, DSSM, DUET, MNSRF etc.
  • Query Suggestion
  • Metrics – MRR, F1, BLEU-k (where k=1,2,3,4)
  • Baselines – HRED-qs, Seq2seq+Attn, MNSRF etc.

MRR – assesses discrimination ability: how well the model ranks a list of candidate queries that might follow a given query.
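Mean Reciprocal Rank over a set of ranked candidate lists can be computed as below; the candidate lists in the usage example are illustrative, not from the paper's data:

```python
def mean_reciprocal_rank(ranked_lists, truths):
    """MRR over several instances.
    ranked_lists: one list of candidates per instance, best-first.
    truths: the ground-truth item for each instance.
    The reciprocal rank of an instance is 1/position of the truth in its
    candidate list (0 if the truth is absent); MRR averages these."""
    total = 0.0
    for candidates, truth in zip(ranked_lists, truths):
        if truth in candidates:
            total += 1.0 / (candidates.index(truth) + 1)
    return total / len(ranked_lists)
```

E.g., truths ranked at positions 2 and 1 give reciprocal ranks 0.5 and 1.0, so MRR = 0.75.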


SLIDE 14

Evaluation Metrics and Baselines


F1, BLEU-k – assess generation ability by measuring the overlap between the generated query term sequence and the ground-truth sequence.
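Token-level F1 between a generated query and the ground truth can be sketched as a multiset overlap; this is an illustrative implementation, not the paper's exact evaluation script:

```python
from collections import Counter

def token_f1(generated, reference):
    """Harmonic mean of token precision and recall.
    Overlap counts shared tokens with multiplicity (Counter intersection)."""
    gen, ref = generated.split(), reference.split()
    overlap = sum((Counter(gen) & Counter(ref)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(gen)
    recall = overlap / len(ref)
    return 2 * precision * recall / (precision + recall)
```

For instance, generating "cheap hotel rooms" against the truth "cheap hotel" gives precision 2/3 and recall 1, hence F1 = 0.8.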


SLIDE 15

Evaluation on Document Ranking


− Vocabulary gap limits the performance
− Do not utilize any search context information

SLIDE 16

Evaluation on Document Ranking


− Do not utilize any search context information

SLIDE 17

Evaluation on Document Ranking


+ Jointly learns ranking and suggestion
− Utilizes query history but no click history

SLIDE 18

Evaluation on Document Ranking


+ Jointly learns ranking and suggestion
+ Utilizes both query and click history

SLIDE 19

Evaluation on Query Suggestion


Do not utilize any context

SLIDE 20

Evaluation on Query Suggestion


Do not utilize click history

SLIDE 21

Evaluation on Query Suggestion


+ Utilizes both in-task query and click history

SLIDE 22

Ablation Study


  • Modeling search context is beneficial
  • Joint learning of related retrieval tasks results in improvements

∗ indicates the attention in the query recommender


SLIDE 24

Ablation Study


  • Modeling search context is beneficial
  • Joint learning of related retrieval tasks results in improvements

∗ indicates the attention in the query recommender
[Table callouts: "Performance drops", "Performance increases"]

SLIDE 25

Effect of Context Modeling


Hypothesis – longer tasks are intrinsically more difficult.

[Results are broken down by task length: short tasks (2 queries), medium tasks (3–4 queries), long tasks (5+ queries).]

SLIDE 26

Effect of Context Modeling


  • Context information helps more on short/medium tasks
  • Longer tasks are intrinsically more difficult.
SLIDE 27

Sample Complexity


In terms of #parameters, CARS > MNSRF > M-Match Tensor

SLIDE 28

10/14/19 Neural Task Modeling 28

Case Analysis

How do previous in-task queries and clicks impact predicting the next query and the clicks for it?

SLIDE 29

Conclusion & Future Work

  • A task-based approach to learning search context
  • Exploiting users’ on-task search query and click sequences
  • Jointly optimized on two companion retrieval tasks
  • Future work
    • Modeling across-task relatedness, e.g., users’ long-term search interest
    • Applying to any scenario where a user sequentially interacts with a system


SLIDE 30

Thank You!

Q&A


In-task Context: a richer way to understand users’ search intent

https://github.com/wasiahmad/context_attentive_ir

Code will be released soon!