STAMP Short-Term Attention Memory Priority Model for Session-based Recommendation
ADVISOR: JIA-LING, KOH SOURCE: KDD 2018 SPEAKER: SHAO-WEI, HUANG DATE: 2019/07/01
OUTLINE
➢ Introduction
➢ Method
➢ Experiment
➢ Conclusion
A session begins when a user opens a browser and ends when the user closes it.
⚫ Goal: predict the user’s next action (a click on an item) in the current session.
➢ Existing methods only consider the long-term memory of a session, without explicitly taking the user’s current (short-term) interest into account.
Short-Term Memory Priority (STMP)
⚫ A session prefix is a sequence of clicked items S_t = {x_1, …, x_t}, where each x_i is a d-dimensional item embedding.
⚫ m_s denotes the user’s general interest: the mean of the item embeddings in the session, m_s = (1/t) Σ_{i=1}^{t} x_i.
⚫ m_t denotes the user’s current interest: the embedding x_t of the last clicked item.
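The two memories above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's code; the session matrix and dimensions are made-up placeholders.

```python
import numpy as np

# Hypothetical session: t clicked items, each a d-dimensional embedding.
d, t = 4, 3
rng = np.random.default_rng(0)
session = rng.normal(size=(t, d))  # rows are x_1 .. x_t

m_s = session.mean(axis=0)  # general interest: mean of all item embeddings
m_t = session[-1]           # current interest: embedding of the last click
```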
Short-Term Memory Priority (STMP)
⚫ Two simple MLP cells without a hidden layer, with activation function f (tanh):
   h_s = f(W_s m_s + b_s)
   h_t = f(W_t m_t + b_t)
   where h_s, h_t, b_s, b_t ∈ R^{d×1} and W_s, W_t ∈ R^{d×d}.
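The two MLP cells above can be sketched as follows; the weights here are random placeholders, only the shapes and the tanh transform follow the slide.

```python
import numpy as np

def mlp_cell(x, W, b):
    """Single-layer transform h = tanh(W x + b), as in the STMP cells."""
    return np.tanh(W @ x + b)

d = 4
rng = np.random.default_rng(1)
W_s, W_t = rng.normal(size=(d, d)), rng.normal(size=(d, d))
b_s, b_t = np.zeros(d), np.zeros(d)
m_s, m_t = rng.normal(size=d), rng.normal(size=d)

h_s = mlp_cell(m_s, W_s, b_s)  # abstraction of the general interest
h_t = mlp_cell(m_t, W_t, b_t)  # abstraction of the current interest
```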
Short-Term Memory Priority (STMP)
⚫ Calculate the trilinear product of h_s, h_t, and the embedding x_i of each candidate item i (d-dimensional):
   z̃_i = σ(⟨h_s, h_t, x_i⟩) = σ(x_iᵀ(h_s ⊙ h_t))
⚫ Use the softmax function over all candidate items to obtain the prediction ŷ = softmax(z̃).
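A minimal sketch of the trilinear scoring and softmax steps; the item embeddings and dimensions below are illustrative placeholders, not values from the paper.

```python
import numpy as np

def score(h_s, h_t, items):
    """Trilinear product per candidate: z_i = sigmoid(x_i . (h_s * h_t))."""
    z = items @ (h_s * h_t)          # one raw score per candidate item
    return 1.0 / (1.0 + np.exp(-z))  # sigmoid

def softmax(z):
    e = np.exp(z - z.max())          # shift for numerical stability
    return e / e.sum()

d, n_items = 4, 5
rng = np.random.default_rng(2)
h_s, h_t = rng.normal(size=d), rng.normal(size=d)
items = rng.normal(size=(n_items, d))  # candidate item embeddings

y_hat = softmax(score(h_s, h_t, items))  # prediction over all candidates
```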
Short-Term Memory Priority (STMP)
⚫ The loss function is the cross-entropy between the prediction ŷ over the candidate items and the one-hot vector of the ground-truth next item.
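With a one-hot target, the cross-entropy reduces to the negative log-probability of the true next item. A tiny sketch with made-up numbers:

```python
import numpy as np

def cross_entropy(y_hat, target):
    """Negative log-likelihood of the ground-truth next item (one-hot target)."""
    return -np.log(y_hat[target])

y_hat = np.array([0.1, 0.6, 0.3])          # example softmax output
loss = cross_entropy(y_hat, target=1)      # true next item is index 1
assert np.isclose(loss, -np.log(0.6))
```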
Short-Term Attention Memory Priority (STAMP)
⚫ Generate an attention weight a_i for each item x_i in the session (each x_i d-dimensional) from x_i, the last click x_t, and m_s:
   a_i = W_0 σ(W_1 x_i + W_2 x_t + W_3 m_s + b_a)
⚫ Compose the attention weights with the item embeddings to obtain the attention-based representation m_a = Σ_{i=1}^{t} a_i x_i, which replaces m_s in STMP.
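The attention step above can be sketched as below; all weight matrices and the session are random placeholders standing in for learned parameters.

```python
import numpy as np

def attention(session, m_s, W1, W2, W3, w0, b_a):
    """a_i = w0 . sigmoid(W1 x_i + W2 x_t + W3 m_s + b_a); m_a = sum_i a_i x_i."""
    x_t = session[-1]                                     # last-clicked item
    pre = session @ W1.T + x_t @ W2.T + m_s @ W3.T + b_a  # (t, d) pre-activations
    a = (1.0 / (1.0 + np.exp(-pre))) @ w0                 # (t,) attention weights
    return a @ session                                    # attention-based m_a

d, t = 4, 3
rng = np.random.default_rng(3)
session = rng.normal(size=(t, d))
m_s = session.mean(axis=0)
W1, W2, W3 = (rng.normal(size=(d, d)) for _ in range(3))
m_a = attention(session, m_s, W1, W2, W3, w0=rng.normal(size=d), b_a=np.zeros(d))
```

In STAMP, m_a then feeds the first MLP cell in place of the plain mean m_s, so items related to the current interest contribute more to the session representation.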
➢ Yoochoose: click streams from an e-commerce website (RecSys Challenge 2015).
➢ Diginetica: transaction data from an e-commerce website (CIKM Cup 2016).
➢ Evaluation metrics: P@20 and MRR@20.
➢ Next-click prediction on 3 benchmark data sets.
➢ Baselines include neural models (e.g., GRU4Rec, NARM) and non-neural models (e.g., Item-KNN, FPMC).
➢ Compare STAMP with NARM
➢ Effects of the last click: compare model variants with and without the last-click item x_t in the trilinear layer.
➢ Comparison among the proposed models: relative performance depends on session length, with one variant stronger on short sessions and the other on long sessions.
➢ Further investigation (attention): visualize the attention weights within sample sessions (session ID, category of each item).
➢ Propose a short-term attention/memory priority model for session-
based recommendations.
➢ The next move of a user is mostly affected by the last click of a session prefix, and the model can effectively utilize such information through the temporal interest representation.
➢ The proposed attention mechanism can effectively capture long-
term and short-term interests of a session.