

  1. Data Context Adaptation for Accurate Recommendation with Additional Information Hyunsik Jeon, Bonhun Koo, and U Kang Seoul National University IEEE BigData 2019 Hyunsik Jeon (SNU) 1

  2. Outline: Introduction, Proposed Method, Experiments, Conclusion

  3. Recommendation Systems
  [Figure: a user-item rating graph; users rate movies of various genres (Fantasy, Action, Drama, ...) on a 1-5 scale, with friendship links between users.]

  4. Recommendation Systems
  [Figure: the same rating graph; the system recommends an unrated item predicted to receive a high rating.]

  5. Problem Definition (Data Context-Aware Recommendation)
  - Given: a sparse rating matrix R and auxiliary matrices A (e.g., social networks or item-genre relationships)
  - Goal: predict the unseen rating values in R
  [Figure: an example user-movie rating matrix R with missing entries, alongside a movie-genre matrix and a user-user matrix as the A matrices.]
  - Users want to be recommended items that they will rate highly.

  6. Collective Matrix Factorization
  - Collective Matrix Factorization (CMF) is the dominant method in data context-aware recommendation.
  - Key idea of CMF: factorize the two matrices while sharing a common latent factor
    - Rating matrix: r_ij ≈ u_i^T v_j (inner product of user factor u_i and item factor v_j)
    - Item-genre matrix: a_jk ≈ v_j^T g_k (inner product of the shared item factor v_j and genre factor g_k)

  7. Collective Matrix Factorization: Details
  - Inference: r̂_ij = u_i^T v_j and â_jk = v_j^T g_k
  - Loss: L = Σ_{(i,j)∈Ω_R} (r_ij − r̂_ij)² + Σ_{(j,k)∈Ω_A} (a_jk − â_jk)² + λ(‖U‖² + ‖V‖² + ‖G‖²)
  - where R is the rating matrix, A is the additional matrix, U is the user latent matrix, V is the item latent matrix, and G is the additional context matrix (e.g., genre).
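The CMF objective above can be sketched in a few lines of NumPy. This is a minimal illustration with made-up toy sizes, not the paper's implementation; the masks select the observed entries Ω_R and Ω_A.

```python
import numpy as np

def cmf_loss(R, A, U, V, G, mask_R, mask_A, lam):
    """CMF objective: reconstruct R ≈ U V^T and A ≈ V G^T, sharing the
    item factor V, with squared error on observed entries plus L2
    regularization (a minimal sketch of the loss above)."""
    err_R = mask_R * (R - U @ V.T)          # errors on observed ratings
    err_A = mask_A * (A - V @ G.T)          # errors on observed auxiliary entries
    reg = lam * (np.sum(U**2) + np.sum(V**2) + np.sum(G**2))
    return float(np.sum(err_R**2) + np.sum(err_A**2) + reg)

# toy check: 4 users, 5 items, 3 genres, rank-2 factors; when the data are
# generated exactly from the factors and lam = 0, the loss is zero
rng = np.random.default_rng(0)
U, V, G = (rng.normal(size=(n, 2)) for n in (4, 5, 3))
R, A = U @ V.T, V @ G.T
loss = cmf_loss(R, A, U, V, G, np.ones_like(R), np.ones_like(A), lam=0.0)
```

In practice the factors are learned by minimizing this loss with gradient descent or alternating least squares over the observed entries only.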

  8. Collective Matrix Factorization: Details
  - CMF extends to biased-CMF when bias terms are added.
  - Inference: r̂_ij = u_i^T v_j + b_i + b_j + μ_R and â_jk = v_j^T g_k + b̃_j + b̃_k + μ_A
  - Loss: L = Σ_{(i,j)∈Ω_R} (r_ij − r̂_ij)² + Σ_{(j,k)∈Ω_A} (a_jk − â_jk)² + λ(‖U‖² + ‖V‖² + ‖G‖²)
  - where R is the rating matrix, A is the additional matrix, U is the user latent matrix, V is the item latent matrix, G is the additional context matrix (e.g., genre), and b_i, b_j, b̃_j, b̃_k are 1-dimensional bias terms.
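The biased prediction rule is just the inner product plus scalar offsets. A small sketch, with illustrative names (`b_u`, `b_v`, `mu` stand in for the bias terms and global offset):

```python
import numpy as np

def predict_biased(u, v, b_u, b_v, mu):
    """Biased-CMF rating prediction: latent inner product plus user bias,
    item bias, and a global offset mu (names are illustrative)."""
    return float(u @ v + b_u + b_v + mu)

r_hat = predict_biased(np.array([1.0, 0.5]), np.array([0.2, 0.4]),
                       b_u=0.1, b_v=-0.3, mu=3.5)
# 0.2 + 0.2 + 0.1 - 0.3 + 3.5 = 3.7
```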

  9. Motivation
  - Previous works have the following limitations:
    1) They do not consider that the data contexts of the rating and auxiliary matrices are different.
    2) They have restricted capability for expressing independent information of users or items (e.g., biases).
    3) They predict entries via an inner product, which is linear.
  - How can we address these limitations?

  10. Outline: Introduction, Proposed Method, Experiments, Conclusion

  11. Key Ideas
  - A novel approach for data context-aware recommendation:
    1) Data context adaptation by adaptation functions g_R and g_A, to account for the differences between R and A
    2) Latent interaction/independence factors, with no size limit on the latent independence factors
    3) Fully-connected neural networks f_R and f_A to model non-linear relationships

  12. Overall Architecture
  - Factorize the data matrices R and A jointly.
  [Figure: user, item, and genre latent vectors; a data context adaptation layer maps the shared item vectors into the rating data context and the item-genre data context; element-wise products of the adapted vectors are concatenated with independence vectors and fed to per-context MLP layers that reconstruct the Rating Matrix R and the Item-Genre Matrix A.]

  13. Latent Factors
  - Latent interaction and latent independence factors participate in different ways when predicting an entry.
  [Figure: the same architecture, highlighting the latent independence vectors and the latent interaction vectors.]

  14. Data Context Adaptation
  - Any model can be used as the adaptation functions g_R and g_A.
  - Our choice is a linear projection: g_R(z) = W_R z and g_A(z) = W_A z
    - where W_R is a projection matrix for R, W_A is a projection matrix for A, and z denotes a latent interaction vector.
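A minimal sketch of linear data context adaptation; `W_R`, `W_A`, and the vector names are illustrative stand-ins for the per-context projection matrices, and the toy values are chosen only to make the effect visible:

```python
import numpy as np

def adapt(W, z):
    """Project a shared latent interaction vector z into one data context."""
    return W @ z

d = 4
z_item = np.ones(d)          # an item's shared latent interaction vector
W_R = np.eye(d)              # projection into the rating context R
W_A = 0.5 * np.eye(d)        # projection into the auxiliary context A
z_R = adapt(W_R, z_item)     # context-adapted vector for predicting ratings
z_A = adapt(W_A, z_item)     # context-adapted vector for the auxiliary matrix
```

The point of the design is that the two contexts see different adaptations of the same shared vector, rather than sharing it verbatim as CMF does.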

  15. Non-linear Modeling
  - Any non-linear model can be used as the predictive functions f_R and f_A.
  - Our choice is a multilayer perceptron (MLP): the element-wise product of the adapted latent interaction vectors, concatenated with the latent independence vectors, is fed to f_R (producing r̂_ij) and f_A (producing â_jk).
    - Brackets [·] denote concatenation of vectors.
    - Tanh is the activation function in f_R and f_A.
    - The outputs of f_R and f_A are predicted ratings (scalars).
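A tiny MLP predictor in this style can be sketched as follows; the shapes and the construction of the input vector are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def mlp_predict(x, layers):
    """Tiny MLP with tanh hidden activations and a linear scalar output,
    mirroring the tanh-MLP choice described above."""
    h = x
    for W, b in layers[:-1]:
        h = np.tanh(W @ h + b)        # tanh activation on hidden layers
    W_out, b_out = layers[-1]
    return float(W_out @ h + b_out)   # scalar predicted rating

rng = np.random.default_rng(0)
# input: element-wise product of two adapted interaction vectors,
# concatenated with a 2-dimensional independence vector
x = np.concatenate([rng.normal(size=4) * rng.normal(size=4),
                    rng.normal(size=2)])
layers = [(rng.normal(size=(8, 6)), np.zeros(8)),   # hidden layer
          (rng.normal(size=8), 0.0)]                # linear output
r_hat = mlp_predict(x, layers)
```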

  16. Loss Function
  - Minimize the two reconstruction errors for R and A together:
    L = (1 − α) · loss_R + α · loss_A
    - loss_R = Σ_{(i,j)∈Ω_R} (r_ij − r̂_ij)² + λ · reg_R
    - loss_A = Σ_{(j,k)∈Ω_A} (a_jk − â_jk)² + λ · reg_A
  - where Ω_R and Ω_A are the observable entries in R and A, resp., reg_R and reg_A are L2 regularizations for R and A, resp., and α controls the balance of the gradients from loss_R and loss_A.
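The joint objective is a convex combination of the two per-context losses. A minimal sketch, where the error and regularizer values are supplied directly rather than computed from a model:

```python
def joint_loss(err_R, err_A, reg_R, reg_A, alpha, lam):
    """Joint objective: (1 - alpha) * loss_R + alpha * loss_A, where each
    per-context loss is its reconstruction error plus an L2 term."""
    loss_R = err_R + lam * reg_R    # rating context
    loss_A = err_A + lam * reg_A    # auxiliary context
    return (1 - alpha) * loss_R + alpha * loss_A

loss = joint_loss(err_R=2.0, err_A=4.0, reg_R=1.0, reg_A=1.0,
                  alpha=0.5, lam=0.0)
```

With alpha = 0.5 and lam = 0 the objective reduces to the simple average of the two reconstruction errors; moving alpha toward 0 or 1 shifts the gradient budget toward one context.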

  17. Regularization
  - Regularization terms (with p, q, s denoting the user, item, and genre latent vectors and their independence counterparts p̃, q̃, s̃):
    - reg_R = Σ_{u∈U} (‖p_u‖² + ‖p̃_u‖²) + Σ_{i∈I} (‖q_i‖² + ‖q̃_i‖²) + ‖W_R‖²
    - reg_A = Σ_{i∈I} (‖q_i‖² + ‖q̃_i‖²) + Σ_{g∈G} (‖s_g‖² + ‖s̃_g‖²) + ‖W_A‖²
  - where ‖·‖ is the Frobenius norm of vectors and matrices, U is the set of users, I is the set of items, and G is the set of genres.
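Such a regularizer is just a sum of squared norms over the parameters belonging to one context. A sketch, with a hypothetical helper name:

```python
import numpy as np

def reg_term(vectors, matrices):
    """Sum of squared L2/Frobenius norms over a collection of latent
    vectors and projection matrices (the shape of reg_R / reg_A above)."""
    total = sum(float(np.sum(v ** 2)) for v in vectors)
    total += sum(float(np.sum(M ** 2)) for M in matrices)
    return total

r = reg_term([np.array([1.0, 2.0])], [np.eye(2)])   # 5 + 2 = 7
```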

  18. Outline: Introduction, Proposed Method, Experiments, Conclusion

  19. Experiments
  - Experimental questions:
    - Q1. Overall performance: how much better is our method than the competitors?
    - Q2. Effects of data context adaptation: how does the data context adaptation layer affect performance?
    - Q3. Effects of interaction/independence factors: how do the dimensions of the interaction/independence vectors affect performance?
    - Q4. Neural networks: do deeper structures yield better performance? Does the activation function help improve performance?

  20. Datasets
  - 3 user-coupled datasets (social networks as the additional data)
  - 3 item-coupled datasets (item-genre relationships as the additional data)

  21. Competitors
  [Table: comparison of our method and the competitors.]

  22. Evaluation Metrics
  - RMSE (Root Mean Squared Error): sqrt( Σ_i Σ_j (r̂_ij − r_ij)² / |test instances| )
  - MAE (Mean Absolute Error): Σ_i Σ_j |r̂_ij − r_ij| / |test instances|
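Both metrics are one-liners over the test ratings. A small self-contained sketch with toy values:

```python
import numpy as np

def rmse(pred, true):
    """Root mean squared error over test ratings."""
    return float(np.sqrt(np.mean((pred - true) ** 2)))

def mae(pred, true):
    """Mean absolute error over test ratings."""
    return float(np.mean(np.abs(pred - true)))

pred = np.array([3.0, 4.0, 5.0])
true = np.array([3.0, 5.0, 3.0])
# errors: 0, -1, 2  ->  RMSE = sqrt(5/3), MAE = 1.0
```

RMSE penalizes large errors more heavily than MAE, which is why both are usually reported together.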

  23. Experimental Results
  - Q1. Overall performance: our method provides the best accuracy.

  24. Experimental Results
  - Q2. Effects of data context adaptation:
    - "without": no adaptation to each context
    - "Sep": separate adaptation for each entity

  25. Experimental Results
  - Q3. Effects of interaction/independence factors: the total capacity of the model is fixed.

  26. Experimental Results
  - Q4-1. Neural networks (depth): "d" denotes DaConA with depth d.

  27. Experimental Results
  - Q4-2. Neural networks (activation functions): "without" denotes DaConA without activation functions.

  28. Extension
  - Using multiple auxiliary information sources: ratings, social information, and genre information.
  [Figure: RMSE vs. training set size (40-90%) for DaConA, CMF, Hybrid-CDL, and Biased-CMF; DaConA achieves the best RMSE, improving on the competitors by 6.3%, 9.4%, and 11.1%.]

  29. Outline: Introduction, Proposed Method, Experiments, Conclusion

  30. Conclusion
  - We propose DaConA, a novel approach for data context-aware recommendation where additional information is given along with ratings.
  - Our key ideas: 1) data context adaptation, 2) latent interaction/independence factors, 3) non-linear modeling.
  - DaConA outperforms the state-of-the-art algorithms, and extensive experiments show that our ideas help improve performance.

  31. Thank you! https://datalab.snu.ac.kr/dacona
