SLIM: Sparse Linear Methods for Top-N Recommender Systems


  1. SLIM: Sparse Linear Methods for Top-N Recommender Systems
     Xia Ning and George Karypis
     Computer Science & Engineering, University of Minnesota, Minneapolis, MN
     Email: {xning,karypis}@cs.umn.edu
     December 14, 2011

  2. Outline
     1 Introduction: Top-N Recommender Systems; Definitions and Notations; The State-of-the-Art Methods
     2 Methods: Sparse LInear Methods for Top-N Recommendation; Learning W for SLIM; SLIM with Feature Selection
     3 Materials
     4 Experimental Results: SLIM on Binary Data; Top-N Recommendation Performance; SLIM for Long-Tail Distribution; SLIM Regularization Effects; SLIM on Rating Data
     5 Conclusions

  3. Outline (repeated)

  4. Top-N Recommender Systems
     ❑ Top-N recommendation
       ❑ E-commerce: huge numbers of products
       ❑ Recommend a short ranked list of items to each user
     ❑ Top-N recommender systems
       ❑ Neighborhood-based collaborative filtering (CF)
         ❑ Item-based [2]: fast to generate recommendations, low recommendation quality
       ❑ Model-based methods [1, 3, 5]
         ❑ Matrix factorization (MF) models: slow to learn, high recommendation quality
     ❑ SLIM: Sparse LInear Methods
       ❑ Fast and high recommendation quality

  5. Definitions and Notations
     Table 1: Definitions and Notations
       u_i   : user
       t_j   : item
       U     : all users (|U| = n)
       T     : all items (|T| = m)
       A     : user-item purchase/rating matrix, size n × m
       W     : item-item similarity matrix / coefficient matrix
       a_i^T : the i-th row of A, the purchase/rating history of u_i on T
       a_j   : the j-th column of A, the purchase/rating history of U on t_j
     ❑ Row vectors carry the transpose superscript T; otherwise vectors are column vectors by default.
     ❑ Matrix/vector notation is used instead of user/item purchase/rating profiles (a small construction sketch follows below).
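
As a minimal illustration of this notation in code, A can be built as a sparse n × m matrix; the tiny purchase log and the use of SciPy are assumptions made for the example, not part of the slides:

```python
import numpy as np
from scipy.sparse import csr_matrix

# Hypothetical purchase log: (user index, item index) pairs, already 0-indexed.
purchases = [(0, 1), (0, 3), (1, 0), (1, 3), (2, 2)]
n, m = 3, 4  # |U| users and |T| items

rows, cols = zip(*purchases)
# A is the n x m user-item matrix; a_i^T is row i of A, a_j is column j of A.
A = csr_matrix((np.ones(len(purchases)), (rows, cols)), shape=(n, m))
```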

  6. The State-of-the-Art Methods: Item-based Collaborative Filtering (1)
     ❑ Item-based k-nearest-neighbor (itemkNN) CF
       ❑ Identify a set of similar items for each item
       ❑ Item-item similarity:
         ❑ calculated from A
         ❑ cosine similarity measure (see the sketch below)
     [Figure: the user-item matrix A alongside the sparse item-item similarity matrix W, with an item's 1st and 2nd nearest neighbors marked]
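
A rough sketch of how such a sparsified cosine item-item matrix could be computed; the helper name, the dense intermediate, and the neighborhood size k are illustrative assumptions:

```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

def itemknn_similarity(A, k=20):
    """Cosine similarity between the columns (items) of A, keeping only the
    k most similar items per column, as in itemkNN."""
    S = cosine_similarity(A.T)            # m x m item-item similarities
    np.fill_diagonal(S, 0.0)              # an item is not its own neighbor
    for j in range(S.shape[0]):
        keep = np.argsort(S[:, j])[-k:]   # the k nearest neighbors of item j
        mask = np.zeros(S.shape[0], dtype=bool)
        mask[keep] = True
        S[~mask, j] = 0.0                 # sparsify: drop all other entries
    return S
```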

  7. The State-of-the-Art Methods: Item-based Collaborative Filtering (2)
     ❑ itemkNN recommendation
       ❑ Recommend items similar to those the user has purchased:
         $\tilde{a}_i^T = a_i^T \times W$
       ❑ Fast: sparse item neighborhood
       ❑ Low quality: no knowledge is learned
     [Figure: the recommendation scores for user u_i obtained by multiplying the row a_i^T with the sparse matrix W; a scoring sketch follows below]
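
A minimal scoring sketch for one user, assuming a_i is a dense 1-D NumPy copy of row a_i^T of A and N = 10 is an arbitrary list length:

```python
import numpy as np

def topn_for_user(a_i, W, N=10):
    """Score all items for one user via a_i^T * W and return the top-N
    items the user has not purchased yet."""
    scores = a_i @ W                     # one score per item
    scores[a_i > 0] = -np.inf            # exclude already-purchased items
    return np.argsort(scores)[::-1][:N]
```

With a sparse W, most of the multiply-accumulate work is skipped, which is why itemkNN scoring is fast.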

  8. The State-of-the-Art Methods: Matrix Factorization (1)
     ❑ Latent factor models
       ❑ Factorize A into low-rank user factors (U) and item factors (V^T)
       ❑ U and V^T represent user and item characteristics in a common latent space
       ❑ Formulated as an optimization problem (an illustrative sketch follows below):
         $\min_{U, V^T} \frac{1}{2}\|A - U V^T\|_F^2 + \frac{\beta}{2}\|U\|_F^2 + \frac{\lambda}{2}\|V^T\|_F^2$
     [Figure: the n × m matrix A factorized into the n × k user-factor matrix U and the k × m item-factor matrix V^T]
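
As an illustrative sketch only (not the training procedure of the cited MF methods), the objective above can be minimized by plain full-gradient descent on a dense A; the rank k, step size, and iteration count are arbitrary choices:

```python
import numpy as np

def fit_mf(A, k=10, beta=0.1, lam=0.1, lr=0.001, iters=200, seed=0):
    """Gradient descent on 1/2 ||A - U V^T||_F^2 + beta/2 ||U||_F^2 + lam/2 ||V^T||_F^2."""
    rng = np.random.default_rng(seed)
    n, m = A.shape
    U = 0.1 * rng.standard_normal((n, k))   # user factors
    V = 0.1 * rng.standard_normal((m, k))   # item factors (V^T is k x m)
    for _ in range(iters):
        R = A - U @ V.T                     # residual
        U -= lr * (-R @ V + beta * U)       # gradient step for U
        V -= lr * (-R.T @ U + lam * V)      # gradient step for V^T
    return U, V
```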

  9. The State-of-the-Art Methods: Matrix Factorization (2)
     ❑ MF recommendation
       ❑ Prediction: dot product in the latent space,
         $\tilde{a}_{ij} = U_i^T \cdot V_j$
       ❑ Slow: dense U and V^T
       ❑ High quality: user tastes and item properties are learned
     [Figure: a user's predicted scores obtained by multiplying the user's latent-factor row with V^T; a scoring sketch follows below]
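
The corresponding scoring step, reusing the hypothetical fit_mf factors from the sketch above and a dense A:

```python
import numpy as np

def topn_from_factors(U, V, A, i, N=10):
    """Predict scores u_i . v_j for every item j and return user i's
    top-N unpurchased items."""
    scores = U[i] @ V.T                  # dense: every item gets a score
    scores[A[i] > 0] = -np.inf           # drop items already purchased
    return np.argsort(scores)[::-1][:N]
```

Because U and V^T are dense, every item must be scored, which is the "slow" part noted on the slide.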

  10. Outline (repeated)

  11. SLIM for Top-N Recommendation
     ❑ Motivations:
       ❑ recommendations generated fast
       ❑ high-quality recommendations
       ❑ "have my cake and eat it too"
     ❑ Key ideas:
       ❑ retain the nature of itemkNN: a sparse W
       ❑ optimize recommendation performance: learn W from A
         ❑ its sparsity structure
         ❑ its coefficient values

  12. Learning W for SLIM
     ❑ The optimization problem:
       $\min_{W} \frac{1}{2}\|A - AW\|_F^2 + \frac{\beta}{2}\|W\|_F^2 + \lambda\|W\|_1$   (1)
       subject to $W \geq 0$, $\mathrm{diag}(W) = 0$

  13. Learning W for SLIM
     ❑ The optimization problem (1) above.
     ❑ Computing W:
       ❑ the columns of W are independent: easy to parallelize
       ❑ the decoupled per-column problems (a sketch follows below):
         $\min_{w_j} \frac{1}{2}\|a_j - A w_j\|_2^2 + \frac{\beta}{2}\|w_j\|_2^2 + \lambda\|w_j\|_1$   (2)
         subject to $w_j \geq 0$, $w_{j,j} = 0$
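
Problem (2) is an elastic-net regression with a nonnegativity constraint and a fixed zero at position j. One common way to sketch it is with scikit-learn's ElasticNet using positive=True; note that alpha and l1_ratio only roughly correspond to β and λ (scikit-learn scales the loss by the number of samples), and zeroing out column j of A before fitting is a simple way to respect w_{j,j} = 0:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

def slim_column(A, j, alpha=0.1, l1_ratio=0.5):
    """Learn column w_j of W by (approximately) solving problem (2)."""
    A = np.asarray(A, dtype=float)
    a_j = A[:, j].copy()
    X = A.copy()
    X[:, j] = 0.0                        # item j must not predict itself
    model = ElasticNet(alpha=alpha, l1_ratio=l1_ratio,
                       positive=True,    # enforces w_j >= 0
                       fit_intercept=False)
    model.fit(X, a_j)
    w_j = model.coef_.copy()
    w_j[j] = 0.0                         # enforce diag(W) = 0 explicitly
    return w_j
```

Since the columns are independent, a full W can be assembled by running this over j = 1, ..., m, in parallel if desired.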

  14. Reducing Model Learning Time
     ❑ The per-column problem over all of A:
       $\min_{w_j} \frac{1}{2}\|a_j - A w_j\|_2^2 + \frac{\beta}{2}\|w_j\|_2^2 + \lambda\|w_j\|_1$
     ❑ fsSLIM: SLIM with feature selection (a sketch follows below)
       ❑ prescribe the potential non-zero structure of w_j
       ❑ select a subset of columns from A
         ❑ e.g., via the itemkNN item-item similarity matrix
       ❑ solve the reduced problem over the selected columns A':
         $\min_{w_j} \frac{1}{2}\|a_j - A' w_j\|_2^2 + \frac{\beta}{2}\|w_j\|_2^2 + \lambda\|w_j\|_1$
     [Figure: the full matrix A and the column-subset matrix A' used by fsSLIM]
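
A sketch of the feature-selection variant: restrict the regression for w_j to the k columns of A most similar to item t_j according to an itemkNN similarity matrix S (the function name, the default k, and the reuse of ElasticNet are assumptions made for the example):

```python
import numpy as np
from sklearn.linear_model import ElasticNet

def fsslim_column(A, j, S, k=100, alpha=0.1, l1_ratio=0.5):
    """Learn w_j over A' = the k columns of A most similar to item j."""
    A = np.asarray(A, dtype=float)
    candidates = np.argsort(S[:, j])[-k:]        # k most similar items to t_j
    candidates = candidates[candidates != j]     # keep w_{j,j} = 0
    model = ElasticNet(alpha=alpha, l1_ratio=l1_ratio,
                       positive=True, fit_intercept=False)
    model.fit(A[:, candidates], A[:, j])         # the reduced problem over A'
    w_j = np.zeros(A.shape[1])
    w_j[candidates] = model.coef_                # scatter back into full length m
    return w_j
```

The smaller regression (k columns instead of m) is what reduces the model-learning time.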

  15. Outline (repeated)

