Next-item Recommendation with Sequential Hypergraphs


  1. Next-item Recommendation with Sequential Hypergraphs. Jianling Wang, Kaize Ding*, Liangjie Hong**, Huan Liu*, and James Caverlee. Texas A&M University; Arizona State University*; LinkedIn Inc.**

  2. Next-item Recommendation. The goal is to infer dynamic user preferences from sequential user interactions. [Figure: historic user-item interactions, 2017-2019; e.g., User A buys a bouquet, a wall decoration, then a sofa; User C buys an iPhone 8, a Nintendo Switch, then an iPhone XR.]

  3. Next-item Recommendation. Given these interaction sequences, the task is to predict the next item each user will interact with. [Figure: the same interaction sequences, with the next item to predict highlighted.]

  4. How are items treated?

  5. Items emerge and disappear. From a long-term perspective, the relationships between items are unstable ==> short-term relationships are critical for item modeling. [Figure: more than 50% of the items become inactive shortly after they appear.]

  6. The relationships change. The relationships between items change over time, and the variation grows with the length of the time gap. [Figure: neighboring items, with item co-occurrence captured by word2vec, change over time.]

  7. How are items treated? For a certain time period, the meaning of an item can be revealed by the correlations defined by user interactions in the short term. [Figure: September 2017, when the iPhone 8 came out; user C buys an iPhone 8 and a Nintendo Switch.]

  8. How are items treated? For a certain time period, the meaning of an item can be revealed by the correlations defined by user interactions in the short term. [Figure: in September 2017 the iPhone 8 co-occurs with new flagship purchases; by September 2019 it co-occurs with the Nintendo Switch Lite and AirPods Gen 1, as the iPhone 8 became a budget choice.]

  9. Challenge 1: high-order correlations and multiple-hop connections. A user may purchase multiple items within a certain time period. [Figure: users B, D, and E each buy a set of items, e.g., Apple Lightning Cable, AirPods Case, AirPods Gen 1, iPhone 8, Nintendo Switch Lite.]

  10. Challenge 1: high-order correlations and multiple-hop connections. Items connected by a multiple-hop path are related. [Figure: the same purchase sets, with multi-hop paths linking related items.]

  11. Challenge 2: the semantics of an item can change across users and over time. The same flower bouquet is linked to different purposes by different users. [Figure: user A buys a bouquet together with a sofa and wall decoration; user B buys a bouquet together with a wedding veil and bridal gown.]

  12. Challenge 2: the semantics of an item can change across users and over time. The semantic meaning of the iPhone 8 changes over time. [Figure: in September 2017 the iPhone 8 appears among flagship purchases; by September 2019, flagship purchases (iPhone 11 Pro, AirPods Pro) sit apart from the iPhone 8, which now co-occurs with budget items such as the Nintendo Switch Lite and AirPods Gen 1.]

  13. Our proposal: HyperRec, a novel end-to-end framework with sequential hypergraphs to enhance next-item recommendation. [Architecture figure: short-term hypergraphs at t1, t2, ..., tn, each processed by L HGCN layers, with residual gating between consecutive periods; static item embeddings initialize the first period; the resulting dynamic item embeddings feed a fusion layer and self-attention for dynamic user modeling, and the predicted score combines the dynamic user preference with the item embedding.]

  14. Hypergraph. A hyperedge can connect multiple nodes on a single edge: each node denotes an item, and each hyperedge connects the set of items a user interacts with within a certain short time period. A sketch of the corresponding incidence matrix follows below. [Figure: a simple graph vs. a hypergraph over items 0-4, with hyperedges grouping each user's items.]
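To make the construction concrete, here is a minimal sketch of building the incidence matrix of one short-term hypergraph; the names `build_incidence` and `user_baskets` are illustrative, not from the paper.

```python
import numpy as np

def build_incidence(num_items, user_baskets):
    """Incidence matrix H of one short-term hypergraph.

    user_baskets: one set of item ids per active user; each set becomes one
    hyperedge connecting all items that user interacted with in the period.
    H[v, e] = 1 iff item v belongs to hyperedge e.
    """
    H = np.zeros((num_items, len(user_baskets)))
    for e, items in enumerate(user_baskets):
        for v in items:
            H[v, e] = 1.0
    return H

# e.g., 5 items, two users: user 0 bought {0, 1, 2}, user 1 bought {1, 3, 4}
H = build_incidence(5, [{0, 1, 2}, {1, 3, 4}])
```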

  15. Hypergraph Convolutional Layers (HGCN). Information propagates in two phases: node embeddings are aggregated into hyperedge embeddings (nodes -> hyperedges), then propagated back (hyperedges -> nodes). A sketch of one layer follows below. [Figure: items 0-4 aggregated into hyperedge embeddings ε1 and ε2, then propagated back to the items.]
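A minimal numpy sketch of one HGCN layer using the standard degree-normalized two-phase propagation of HGNN-style hypergraph convolutions; the exact normalization and nonlinearity here are assumptions rather than the paper's precise formulation.

```python
import numpy as np

def hgcn_layer(X, H, Theta, edge_w=None):
    """One hypergraph convolution layer (two-phase propagation).

    X:     (num_nodes, d_in) item embeddings entering the layer
    H:     (num_nodes, num_edges) incidence matrix
    Theta: (d_in, d_out) learnable transform
    """
    edge_w = np.ones(H.shape[1]) if edge_w is None else edge_w
    Dv = H @ edge_w + 1e-12          # node degrees (eps guards isolated nodes)
    De = H.sum(axis=0) + 1e-12       # hyperedge degrees (items per hyperedge)
    Xn = X / np.sqrt(Dv)[:, None]    # symmetric degree normalization
    edge_emb = (H.T @ Xn) / De[:, None]                           # phase 1: nodes -> hyperedges
    node_emb = ((H * edge_w) @ edge_emb) / np.sqrt(Dv)[:, None]   # phase 2: hyperedges -> nodes
    return np.maximum(node_emb @ Theta, 0.0)                      # ReLU
```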

  16. Sequential Hypergraphs. Split the user-item interactions based on their timestamps and construct a series of short-term hypergraphs, one per time period (a bucketing sketch follows below). [Figure: hypergraphs at t1, t2, ..., tn, each processed by its own stack of L HGCN layers.]
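A sketch of the splitting step, assuming (user, item, timestamp) triples and a chosen list of period start times; the paper's exact time granularity is a design choice, not fixed here.

```python
import bisect
from collections import defaultdict

def build_period_hypergraphs(interactions, period_starts):
    """Bucket (user, item, timestamp) triples into per-period hyperedges.

    period_starts: sorted period start times; each interaction falls into
    the latest period starting at or before its timestamp.
    Returns {period_index: [item_set, ...]}, one item set (hyperedge) per
    user active in that period, ready for build_incidence above.
    """
    buckets = defaultdict(lambda: defaultdict(set))
    for user, item, ts in interactions:
        p = bisect.bisect_right(period_starts, ts) - 1  # assumes ts >= period_starts[0]
        buckets[p][user].add(item)
    return {p: list(by_user.values()) for p, by_user in buckets.items()}
```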

  17. Sequential Hypergraphs, Residual Gating: model the residual information between consecutive time periods, so the item embeddings entering the HGCN at one period carry over information from the previous period (sketched below). [Figure: residual-gating links connect consecutive HGCN stacks; static item embeddings initialize the first period.]
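A hedged sketch of the gating idea: the embedding entering period t's HGCN mixes the static item embedding with the dynamic embedding from period t-1. The elementwise sigmoid gate and its parameters (`Wg`, `bg`) are illustrative assumptions, not the paper's exact equations.

```python
import numpy as np

def residual_gate(static_emb, prev_dyn_emb, Wg, bg):
    """Gated input embeddings for the next period's HGCN.

    static_emb, prev_dyn_emb: (num_items, d)
    Wg: (2 * d, d), bg: (d,) -- learnable gate parameters (assumed form)
    """
    z = np.concatenate([static_emb, prev_dyn_emb], axis=-1)
    g = 1.0 / (1.0 + np.exp(-(z @ Wg + bg)))    # elementwise gate in (0, 1)
    return g * prev_dyn_emb + (1.0 - g) * static_emb
```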

  18. Dynamic User Modeling, Short-term User Intent: combine the items the user interacted with in the short-term period ==> this is exactly the embedding of the user's hyperedge (the `edge_emb` computed in the HGCN sketch above). [Figure: hyperedge embeddings ε1 and ε2 serve as short-term user intents.]

  19. Dynamic User Modeling, Fusion Layer: generate the representation of a (user, item, timestamp) interaction by fusing the short-term user intent, the dynamic item embedding, and the static item embedding (a sketch follows below).
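A minimal sketch of one plausible fusion: concatenate the three embeddings and apply a learned projection. The concatenation-plus-tanh form is an assumption for illustration, not the paper's exact layer.

```python
import numpy as np

def fuse_interaction(intent, dyn_item, static_item, Wf, bf):
    """Representation of one (user, item, timestamp) interaction.

    intent, dyn_item, static_item: (d,) vectors from the current period
    Wf: (3 * d, d), bf: (d,) -- learnable fusion parameters (assumed form)
    """
    z = np.concatenate([intent, dyn_item, static_item])
    return np.tanh(z @ Wf + bf)
```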

  20. Dynamic User Modeling, Self-attention: apply self-attention over the sequence of fused interaction representations to generate the dynamic user embedding (sketched below).
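A single-head self-attention sketch over the user's fused interaction sequence; positional encodings, causal masking, and multi-head details are omitted for brevity.

```python
import numpy as np

def dynamic_user_embedding(S, Wq, Wk, Wv):
    """Self-attention over fused interactions, in time order.

    S: (seq_len, d); Wq, Wk, Wv: (d, d) learnable projections.
    Returns the output at the last position as the dynamic user embedding.
    """
    Q, K, V = S @ Wq, S @ Wk, S @ Wv
    scores = Q @ K.T / np.sqrt(S.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)   # numerically stable softmax
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)
    return (attn @ V)[-1]
```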

  21. HyperRec: the full pipeline. [Architecture figure, repeated from slide 13: sequential hypergraphs with HGCN layers and residual gating produce dynamic item embeddings; the fusion layer and self-attention produce the dynamic user preference; the predicted score combines the dynamic user preference with the candidate item's embedding.]
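Putting the pieces together, a hedged sketch of the final scoring step suggested by the figure: the dynamic user preference is multiplied against a combination of the item's dynamic and static embeddings. The additive combination is an assumption read off the diagram, not a confirmed equation.

```python
import numpy as np

def predict_score(user_pref, dyn_item_emb, static_item_emb):
    """Score of one candidate item for one user at prediction time."""
    return float(user_pref @ (dyn_item_emb + static_item_emb))
```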

  22. Experiments: Data. Three datasets:

      Dataset                           #Users   #Items   #Interactions   Density   Cutting Timestamp
      Amazon (e-commerce)               74,823   64,602   1,475,092       0.0305%   Jan 1, 2018
      Etsy (e-commerce)                 15,357   56,969   489,189         0.0559%   Jan 1, 2018
      Goodreads (information sharing)   16,884   20,828   1,730,711       0.4922%   Jan 1, 2017

  23. Experiments: Metrics. Leave-one-out setting, evaluated with HIT@K (hit rate), NDCG@K (normalized discounted cumulative gain), and MRR (mean reciprocal rank), for K = 1 and 5; a sketch of these metrics follows below.
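Under leave-one-out, each test user has exactly one held-out relevant item, so all three metrics reduce to simple functions of its rank (note that HIT@1 and NDCG@1 coincide, which is why the ablation table reports them together):

```python
import numpy as np

def leave_one_out_metrics(rank, ks=(1, 5)):
    """Metrics for one test user; `rank` is the 1-based position of the
    held-out item in the ranked candidate list."""
    hit = {k: float(rank <= k) for k in ks}                  # HIT@K
    ndcg = {k: (1.0 / np.log2(rank + 1) if rank <= k else 0.0)
            for k in ks}                                     # NDCG@K, one relevant item
    mrr = 1.0 / rank                                         # MRR
    return hit, ndcg, mrr
```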

  24. Experiments: Baselines. Compared with next-item recommendation frameworks:
      • PopRec: most-popular recommendation
      • TransRec: Translation-based Recommendation (RecSys 2017)
      • GRU4Rec+: Recurrent Neural Networks with Top-k Gains for Session-based Recommendations (CIKM 2018)
      • TCN: A Simple Convolutional Generative Network for Next Item Recommendation (WSDM 2019)

  25. Experiments: Baselines. Compared with attention-based recommendation frameworks:
      • HPMN: Lifelong Sequential Modeling with Personalized Memorization (SIGIR 2019)
      • HGN: Hierarchical Gating Networks for Sequential Recommendation (KDD 2019)
      • SASRec: Self-Attentive Sequential Recommendation (ICDM 2018)
      • BERT4Rec: Sequential Recommendation with Bidirectional Encoder Representations from Transformer (CIKM 2019)

  26. HyperRec vs. Baselines.
      • HyperRec achieves the best performance on all evaluation metrics in the experiments.
      • HyperRec outperforms all the baselines, by 20.03%, 7.90%, and 17.62% in NDCG@1/HIT@1 on Amazon, Etsy, and Goodreads, respectively.
      • Its strong performance on both e-commerce and information-sharing platforms demonstrates that it generalizes to various online platforms.

  27. Impact of each component? We conduct ablation tests to examine the effectiveness of each component (Table 3, results under HIT@1/NDCG@1):

      Architecture                     Amazon   Etsy     Goodreads
      (1) HyperRec                     0.1215   0.4712   0.2809
      (2) Static Item Embedding        0.1051   0.4477   0.2643
      (3) Replace Hypergraph           0.0978   0.4588   0.2576
      (4) (-) Residual                 0.1169   0.4591   0.2626
      (5) (-) Dynamic Item Embedding   0.1131   0.4646   0.2789
      (6) (-) Short-term User Intent   0.1147   0.4616   0.2709
      (7) (-) Dynamic in Prediction    0.1151   0.4703   0.2746

  28. Impact of each component? It is essential to have dynamic item embeddings that reveal the change in items' semantic meanings through the sequential hypergraphs: in Table 3 above, the variants that replace the hypergraph or keep item embeddings static degrade the most.
