
Future of Personalized Recommendation Systems, Xing Xie, Microsoft (PowerPoint PPT presentation)



  1. Future of Personalized Recommendation Systems Xing Xie Microsoft Research Asia

  2. Recommendation Everywhere

  3. Personalized News Feed

  4. Online Advertising

  5. History
  • 1990s (Tapestry, GroupLens): content-based filtering (CB) and collaborative filtering (CF)
  • 2006 (Netflix Prize): factorization-based models (SVD++, FM, etc.)
  • 2010 (various data competitions): hybrid models with machine learning (LR, FM, GBDT, pair-wise ranking, etc.)
  • 2015 (deep learning): neural models flourish (PNN, Wide&Deep, DeepFM, xDeepFM, etc.), along with explainable recommendation, knowledge-enhanced recommendation, reinforcement learning, and transfer learning

  6. Our Research
  • Deep learning based user modeling
  • Deep learning based recommendation
  • Explainable recommendation
  • Knowledge enhanced recommendation

  7. Microsoft Recommenders • Helping researchers and developers to quickly select, prototype, demonstrate, and productionize a recommender system • Accelerating enterprise-grade development and deployment of a recommender system into production • https://github.com/microsoft/recommenders

  8. User Behavioral Data

  9. Explicit User Representation
  • Demographic: age, gender, life stage, marital status, residence, education, vocation
  • Personality: openness, conscientiousness, extraversion, agreeableness, neuroticism, impulsivity, novelty-seeking, indecisiveness
  • Interests: food, book, movie, music, sport, restaurant
  • Status: emotion, event, health, wealth, device
  • Social: friend, coworker, spouse, children, other relatives, tie strength
  • Schedule: task, driving route, metro/bus line, appointment, vacation

  10. Explicit vs Implicit User Representation
  • Explicit (feature engineering, classification/regression models)
  • Pros: easy to understand; can be directly bid on by advertisers
  • Cons: hard to obtain training data; difficult to satisfy complex and global needs
  • Implicit (deep models over IDs, texts, images, and networks; ID/text/image/network embedding into user and item embeddings)
  • Pros: unified and heterogeneous user representation; end-to-end learning
  • Cons: difficult to explain; needs fine-tuning for each task
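The implicit route on slide 10 can be sketched as a tiny numpy example: an ID embedding lookup is fused with averaged text-token embeddings through one dense layer. All sizes, tables, and the `implicit_user_vector` helper are illustrative assumptions, not the deck's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 1000 user IDs, 5000 text tokens, 16-dim embeddings.
id_table = rng.normal(size=(1000, 16))    # ID embedding table
text_table = rng.normal(size=(5000, 16))  # text token embedding table
W = rng.normal(size=(32, 24)) * 0.1       # dense layer mapping 32 -> 24 dims

def implicit_user_vector(user_id, token_ids):
    """Fuse an ID embedding with averaged text embeddings via one DNN layer."""
    id_vec = id_table[user_id]
    text_vec = text_table[token_ids].mean(axis=0)
    fused = np.concatenate([id_vec, text_vec])  # 32-dim joint input
    return np.tanh(fused @ W)                   # 24-dim implicit representation

u = implicit_user_vector(42, [7, 19, 350])
```

In a real system the tables and `W` would be trained end-to-end against a downstream objective rather than drawn at random.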

  11. Query Log based User Modeling
  Example queries: gifts for classmates; groom to bride gifts; cool math games; tie clips; mickey mouse cartoon; philips shaver; shower chair for elderly; lipstick color chart; presbyopic glasses; womans ana blouse; costco hearing aids; Dior Makeup
  Chuhan Wu, Fangzhao Wu, Junxin Liu, Shaojian He, Yongfeng Huang, Xing Xie, Neural Demographic Prediction using Search Query, WSDM 2019

  12. Query Log based User Modeling
  Observations:
  • Different words may have different importance (e.g., "birthday gift for grandson")
  • Different records have different informativeness
  • Neighboring records may be related, while distant ones usually are not
  • The same word may have different importance in different contexts
  Example query sequence: central garden street; google my health plan; medicaid new York; medicaid for elderly in new York; alcohol treatment; amazon.com; documentary grandson youtube
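The observations above suggest attention at two levels, which can be sketched with plain numpy: word-level attention builds a vector per search record, and record-level attention builds the user vector, so informative words and records get larger weights. The `attend` helper and the toy shapes are assumptions for illustration, not the paper's exact architecture.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(vectors, query):
    """Weight rows of `vectors` by similarity to a learned query, then sum."""
    weights = softmax(vectors @ query)
    return weights @ vectors

rng = np.random.default_rng(1)
word_query = rng.normal(size=8)   # learned word-level attention vector
rec_query = rng.normal(size=8)    # learned record-level attention vector

# Two search records, each a stack of 8-dim word embeddings (toy data).
records = [rng.normal(size=(4, 8)), rng.normal(size=(3, 8))]

record_vecs = np.stack([attend(words, word_query) for words in records])
user_vec = attend(record_vecs, rec_query)  # informative records weigh more
```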

  13. Query Log based User Modeling

  14. Experiments
  • Dataset:
  • 15,346,617 users in total with age category labels
  • Randomly sampled 10,000 users for experiments
  • Search queries posted from October 1, 2017 to March 31, 2018
  (Figures: mapping between age category and age range; distribution of query number per user; distribution of query length; distribution of age category)

  15. Experiments
  Compared methods: linear models with discrete features; linear models with continuous features; flat DNN models; the hierarchical LSTM model

  16. User Age Inference
  Examples: queries from a young user vs. queries from an elderly user

  17. Car / Pet Segment

  18. Universal User Representation
  • Existing user representation learning is task-specific
  • Difficult to generalize to other tasks
  • Relies heavily on labeled data
  • Costly to exploit heterogeneous unlabeled user behavior data
  • Goal: learn universal user representations from heterogeneous, multi-source user data
  • Capture global patterns of online users
  • Easily applied to different tasks as additional user features
  • Do not rely on manually labeled data

  19. Deep Learning Based Recommender System Learning latent representations Learning feature interactions

  20. Motivations • We try to design a new neural structure that • Automatically learns explicit high-order interactions • Vector-wise interaction, rather than bit-wise • Different types of feature interactions can be combined easily • Goals • Higher accuracy • Reducing manual feature engineering work Jianxun Lian, Xiaohuan Zhou, Fuzheng Zhang, Zhongxia Chen, Xing Xie, Guangzhong Sun, xDeepFM: Combining Explicit and Implicit Feature Interactions for Recommender Systems, KDD 2018

  21. Compressed Interaction Network (CIN)
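A minimal numpy sketch of one CIN layer: every pair of rows from the base field matrix and the previous layer interacts vector-wise (Hadamard product), and learned filters compress those interactions into the next layer's feature maps, followed by sum pooling. The shapes and the `cin_layer` name are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cin_layer(x0, xk, w):
    """One CIN layer.

    x0: (m, D) base field embeddings; xk: (Hk, D) previous layer;
    w: (H_next, Hk, m) compression filters.
    """
    # z[i, j, :] = xk[i] * x0[j]  -> all vector-wise interactions, (Hk, m, D)
    z = np.einsum('id,jd->ijd', xk, x0)
    # each output row h is a weighted sum of all (i, j) interaction vectors
    return np.einsum('hij,ijd->hd', w, z)

rng = np.random.default_rng(2)
m, D, H1 = 3, 4, 5                 # 3 fields, embedding dim 4, 5 feature maps
x0 = rng.normal(size=(m, D))
w1 = rng.normal(size=(H1, m, m))
x1 = cin_layer(x0, x0, w1)         # first layer interacts x0 with itself
p1 = x1.sum(axis=1)                # sum pooling over D gives the layer output
```

Stacking such layers yields explicit interactions of increasing order, one order per layer, which is the "explicit, vector-wise" property the motivation slide calls for.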

  22. Relation with CNN
  (Figure: analogy with an image CNN, showing feature maps 1 through H and the direction of filter sliding)

  23. Extreme Deep Factorization Machine (xDeepFM)
  • Combines explicit and implicit feature interaction networks
  • Integrates both memorization and generalization

  24. Data • Criteo: ads click-through-rate prediction • Dianping: restaurant recommendation • Bing News: news recommendation

  25. Experiments

  26. Experiments

  27. Knowledge Graph
  • A kind of semantic network in which nodes represent entities or concepts and edges represent the semantic relations between them

  28. Knowledge Enhanced Recommendation
  • Precision
  • More semantic content about items
  • Deeper user interest
  • Diversity
  • Different types of relations in the knowledge graph
  • Extend user interest along different paths
  • Explainability
  • Connect user interest and recommendation results
  • Improve user satisfaction, boost user trust

  29. Knowledge Graph Embedding
  • Learns a low-dimensional vector for each entity and relation in the KG while preserving its structural and semantic knowledge
  • Distance-based models
  ❑ Apply a distance-based score function to estimate the probability of a triple
  ❑ TransE, TransH, TransR, etc.
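The distance-based idea is easy to see in code. TransE scores a triple (h, r, t) as plausible when h + r is close to t; the few lines of numpy below show this, with made-up toy vectors for illustration.

```python
import numpy as np

def transe_score(h, r, t):
    """TransE distance: smaller means a more plausible (h, r, t) triple."""
    return np.linalg.norm(h + r - t)

# Toy 3-dim embeddings where (h, r, t_good) satisfies h + r = t exactly.
h = np.array([1.0, 0.0, 0.5])
r = np.array([0.0, 1.0, 0.0])
t_good = np.array([1.0, 1.0, 0.5])
t_bad = np.array([-2.0, 0.0, 3.0])

assert transe_score(h, r, t_good) < transe_score(h, r, t_bad)
```

Training pushes the distance down for observed triples and up for corrupted ones; TransH and TransR refine the same idea with relation-specific projections.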

  30. Knowledge Graph Embedding
  • Matching-based models
  ❑ Apply a similarity-based score function to estimate the probability of a triple
  ❑ SME, NTN, MLP, NAM, etc.

  31. Knowledge Graph Embedding
  Three ways to combine KGE learning with the recommender system (RS):
  • Joint training: learn entity/relation vectors and user/item vectors together under one objective
  • Successive training: learn entity/relation vectors on the KG first, then feed them into the RS task
  • Alternate training: alternate between KGE learning and RS learning

  32. Deep Knowledge-aware Network Hongwei Wang, Fuzheng Zhang, Xing Xie, Minyi Guo, DKN: Deep Knowledge-Aware Network for News Recommendation, WWW 2018

  33. Deep Knowledge-aware Network

  34. Extract Knowledge Representations
  • Additionally use contextual entity embeddings to include structural information
  • Context here means the one-step neighbors of an entity
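A minimal numpy sketch of a contextual entity embedding as the average of one-step neighbors' embeddings; the entity names, toy triples, and the `context_embedding` helper are illustrative assumptions.

```python
import numpy as np

def context_embedding(entity, triples, emb):
    """Average the embeddings of an entity's one-step neighbors in the KG."""
    neighbours = {t for h, _, t in triples if h == entity}
    neighbours |= {h for h, _, t in triples if t == entity}
    return np.mean([emb[n] for n in neighbours], axis=0)

emb = {'A': np.array([1.0, 0.0]),
       'B': np.array([0.0, 1.0]),
       'C': np.array([1.0, 1.0])}
triples = [('A', 'rel1', 'B'), ('C', 'rel2', 'A')]

ctx_A = context_embedding('A', triples, emb)  # neighbors of A are {B, C}
```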

  35. Deep Knowledge-aware Network

  36. Experiments

  37. Examples

  38. Ripple Network
  • User interests act as seed entities and propagate through the graph step by step
  • Preferences decay during propagation
  Hongwei Wang et al., Ripple Network: Propagating User Preferences on the Knowledge Graph for Recommender Systems, CIKM 2018
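The hop-by-hop propagation with decay can be sketched in numpy as follows. The `ripple_preference` helper, the uniform averaging within each hop, and the fixed decay constant are simplifying assumptions; the actual Ripple Network weights triples with learned attention against the candidate item.

```python
import numpy as np

def ripple_preference(hop_sets, emb, decay=0.5):
    """Sum entity embeddings hop by hop, discounting each hop by `decay`."""
    pref = np.zeros_like(next(iter(emb.values())))
    for hop, entities in enumerate(hop_sets):
        hop_vec = np.mean([emb[e] for e in entities], axis=0)
        pref += (decay ** hop) * hop_vec  # farther hops contribute less
    return pref

emb = {'movie': np.array([1.0, 0.0]),
       'director': np.array([0.0, 1.0]),
       'genre': np.array([1.0, 1.0])}

# Hop 0: items the user interacted with; hop 1: their KG neighbors.
pref = ripple_preference([['movie'], ['director', 'genre']], emb)
```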

  39. Ripple Network

  40. Experiments

  41. Example

  42. Explainable Recommendation Systems
  Two dimensions of a good explanation: model explainability and presentation quality, covering effectiveness, transparency, persuasiveness, trust, and readability

  43. Explainable Recommendation Systems
  Example explanations:
  • Ad: 1-800-FLOWERS.COM – Elegant Flowers for Lovers
  • Review: Fog Harbor Fish House. "Their tan tan noodles are made of magic. The chili oil is really appetizing. However, prices are on the high side."

  44. Problem Definition
  • Input
  • User set 𝑉; each 𝑣 ∈ 𝑉 is a user, described by a user ID and user attributes
  • Item set 𝑊; each 𝑤 ∈ 𝑊 is an item, described by an item ID
  • Interpretable components 𝑚𝑘
  • A recommendation model to be explained, 𝑔(𝑣, 𝑤)
  • Output
  • Explanation 𝑨 = expgen(·), generated from the selected components: 𝑧𝑘 = 1 if the 𝑘-th interpretable component is selected, and 𝑧𝑘 = 0 otherwise
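The selection vector 𝒛 above can be illustrated with a short numpy sketch; the `select_components` helper and its top-k scoring rule are assumptions for illustration, not the selection method from the deck.

```python
import numpy as np

def select_components(scores, k):
    """Binary vector z with z[k] = 1 for the top-k scored components."""
    z = np.zeros(len(scores), dtype=int)
    z[np.argsort(scores)[-k:]] = 1
    return z

scores = np.array([0.1, 0.9, 0.4, 0.7])  # toy relevance score per component
z = select_components(scores, 2)         # selects components 1 and 3
```

The explanation generator then builds the text of 𝑨 from only the components marked 1 in 𝒛.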

  45. Outline
  • Can we enhance persuasiveness (presentation quality) in a data-driven way?
  • Explanation Feedback Aware Generative Model; shipped to Bing Ads, revenue increased by 0.5%
  • Can we build an explainable deep model (enhance model explainability)?
  • Explainable Recommendation Through Attentive Multi-View Learning, AAAI 2019
  • Can we design a pipeline which better balances presentation quality and model explainability?
  • A Reinforcement Learning Framework for Explainable Recommendation, ICDM 2018

  46. Explainable Recommendation for Ads
  Scenarios: search ads (advertiser platform); native ads on MSN; native ads on Outlook.com
