NCDREC: A Decomposability Inspired Framework for Top-N Recommendation




1. NCDREC: A Decomposability Inspired Framework for Top-N Recommendation
Athanasios N. Nikolakopoulos¹,² and John D. Garofalakis¹,²
¹ Computer Engineering and Informatics Department, University of Patras, Greece
² Computer Technology Institute & Press “Diophantus”
IEEE/WIC/ACM International Conference on Web Intelligence, Warsaw, August 2014

2. Outline
1. Introduction & Motivation: Recommender Systems - Collaborative Filtering; Challenges of Modern CF Algorithms
2. NCDREC Framework: NCDREC Model; Criterion for ItemSpace Coverage; NCDREC Algorithm: Storage and Computational Issues
3. Experimental Evaluation: Evaluation Methodology; Quality of Recommendations; Long-Tail Recommendations; Cold-Start Recommendations
4. Future Work & Conclusion

3. Recommender System Algorithms
[Diagram: users and items supply ratings to a recommender system, which outputs a recommendation list and rating predictions.]
Collaborative Filtering recommendation algorithms:
- Wide deployment in commercial environments
- Significant research efforts

4. Challenges of Modern CF Algorithms
Sparsity is an intrinsic RS characteristic related to serious problems:
- Long-tail recommendation
- Cold-start problem
- Limited ItemSpace coverage
Traditional CF techniques, such as neighborhood models, are very susceptible to sparsity. Among the most promising approaches to alleviating sparsity-related problems are:
- Dimensionality-reduction models: build a reduced latent space which is dense.
- Graph-based models: exploit transitive relations in the data, while preserving some of the “locality”.

5. Exploiting Decomposability
We attack the problem from a different perspective: Sparsity ↔ Hierarchy ↔ Decomposability.
Nearly Completely Decomposable systems:
- Pioneered by Herbert A. Simon.
- Many applications in diverse disciplines, from economics, cognitive theory, and the social sciences to computer systems performance evaluation, data mining, and information retrieval.
Main idea: exploit the innate hierarchy of the item set and view it as a decomposable space.
- Can this enrich the collaborative filtering paradigm in an efficient and scalable way?
- Does this approach offer any qualitative advantages in alleviating sparsity-related problems?

6. NCDREC Model Overview
Definitions:
- We define a D-decomposition to be an indexed family of sets D ≜ {D_1, ..., D_K} that span the ItemSpace V.
- We define D_v ≜ ∪_{D_k ∋ v} D_k to be the proximal set of items of v ∈ V.
- We also define the associated block coupling graph G_D ≜ (V_D, E_D); its vertices correspond to the D-blocks, and an edge between two vertices exists whenever the intersection of these blocks is a non-empty set.
- Finally, we introduce an aggregation matrix A_D ∈ R^{m×K}, whose jk-th element is 1 if v_j ∈ D_k and zero otherwise.
Model matrices:
- G ≜ R + εW, with W ≜ ZX^⊺
- X ≜ diag(A_D e)^{-1} A_D, Y ≜ diag(A_D^⊺ e)^{-1} A_D^⊺
- [Z]_{ik} ≜ (n_{u_i}^k)^{-1} [RA_D]_{ik} when n_{u_i}^k > 0, and zero otherwise
- S(ω) ≜ (1 − α)E + α(βH + (1 − β)D)
- H ≜ diag(Ce)^{-1} C, where [C]_{ij} ≜ r_i^⊺ r_j for i ≠ j
- D ≜ XY, E ≜ eω^⊺
NCDREC components:
- Main Component: recommendation vectors produced by projecting the NCD-perturbed data onto an f-dimensional space.
- ColdStart Component: recommendation vectors are the stationary distributions of a discrete Markov chain model.
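A minimal numpy sketch of these definitions on a toy rating matrix is given below. The block assignment A_D, the rating matrix R, and the reading of n_{u_i}^k as the number of items of block D_k rated by user u_i are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Toy setting (assumed): n users, m items, K overlapping item blocks.
rng = np.random.default_rng(0)
n, m, K, eps = 4, 6, 2, 0.01
R = rng.integers(0, 6, size=(n, m)).astype(float)        # ratings 0..5 (0 = unrated)
A_D = np.zeros((m, K)); A_D[:3, 0] = 1; A_D[2:, 1] = 1   # aggregation matrix of the D-decomposition

# X = diag(A_D e)^{-1} A_D  and  Y = diag(A_D^T e)^{-1} A_D^T  (row normalisations)
X = A_D / A_D.sum(axis=1, keepdims=True)
Y = A_D.T / A_D.T.sum(axis=1, keepdims=True)

# [Z]_{ik} = (n_{u_i}^k)^{-1} [R A_D]_{ik} when n_{u_i}^k > 0; n_{u_i}^k is taken here
# as the number of items of block D_k rated by user u_i (an assumption).
counts = (R > 0).astype(float) @ A_D
Z = np.divide(R @ A_D, counts, out=np.zeros((n, K)), where=counts > 0)

W = Z @ X.T        # NCD proximity term
G = R + eps * W    # NCD-perturbed data used by the main (low-rank) component
D_mat = X @ Y      # inter-block proximity matrix D = X Y used by the ColdStart component
```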

7. Criterion for ItemSpace Coverage
Theorem (ItemSpace Coverage). If the block coupling graph G_D is connected, there exists a unique steady-state distribution π of the Markov chain corresponding to matrix S. This distribution depends on the preference vector ω; however, irrespective of the particular vector chosen, its support includes every item of the underlying space.
Proof sketch: When G_D is connected, the Markov chain induced by the stochastic matrix S consists of a single irreducible and aperiodic closed set of states that includes all the items. This holds for every stochastic vector ω and for all positive real numbers α, β < 1. Since the state space is finite, the resulting Markov chain is ergodic. Hence π_i > 0 for all i, and the support of the distribution that defines the recommendation vector includes every item of the underlying space.
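The coverage property can be illustrated numerically. The sketch below builds S(ω) from small random row-stochastic stand-ins for H and D (purely illustrative assumptions), runs power iteration, and checks that every item receives positive stationary mass.

```python
import numpy as np

# Illustrative stand-ins: any row-stochastic H (co-occurrence) and D (NCD proximity)
# whose combination is irreducible; here both are dense random stochastic matrices.
rng = np.random.default_rng(1)
m, alpha, beta = 6, 0.9, 0.5
H = rng.random((m, m)); np.fill_diagonal(H, 0.0); H /= H.sum(axis=1, keepdims=True)
D = rng.random((m, m)); D /= D.sum(axis=1, keepdims=True)

w = np.full(m, 1.0 / m)             # any stochastic preference vector omega
E = np.outer(np.ones(m), w)         # E = e w^T (rank-one teleportation matrix)
S = (1 - alpha) * E + alpha * (beta * H + (1 - beta) * D)

pi = np.full(m, 1.0 / m)
for _ in range(500):                # power iteration: pi <- pi S
    pi = pi @ S

print(np.isclose(pi.sum(), 1.0), np.all(pi > 0))   # ergodicity => full ItemSpace coverage
```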

8. NCDREC Algorithm: Storage and Computational Issues
Input: matrices R ∈ R^{n×m}, H ∈ R^{m×m}, X ∈ R^{m×K}, Y ∈ R^{K×m}, Z ∈ R^{n×K}; parameters α, β, f, ε.
Output: the matrix Π ∈ R^{n×m} of recommendation vectors for every user.
Step 1: Find the newly added users and collect their preference vectors into matrix Ω.
Step 2: Compute Π_sparse using the ColdStart procedure.
Step 3: Initialize vector p_1 to be a random unit-length vector.
Step 4: Run the modified Lanczos procedure (NCD_PartialLBD) up to step M, using starting vector p_1.
Step 5: Compute the SVD of the bidiagonal matrix B and use it to extract f < M approximate singular triplets {ũ_j, σ_j, ṽ_j} ← {Q u_j(B), σ_j(B), P v_j(B)}.
Step 6: Orthogonalize against the approximate singular vectors to get a new starting vector p_1.
Step 7: Continue the Lanczos procedure for M more steps using the new starting vector.
Step 8: Check the convergence tolerance; if met, compute Π_full = ŨΣṼ^⊺, else go to Step 4.
Step 9: Update Π_full, replacing the rows that correspond to new users with Π_sparse.
Return Π_full.
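A hedged sketch of the computational core of Steps 3-8: the slide's restarted NCD_PartialLBD procedure is not reproduced here; SciPy's Lanczos-based svds plays the analogous role of extracting f approximate singular triplets from a sparse NCD-perturbed matrix. All sizes and the random matrix are placeholders.

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import svds

# Placeholder NCD-perturbed matrix G = R + eps*W (assumed sparse); sizes are illustrative.
n, m, f = 200, 150, 10
G = sparse_random(n, m, density=0.05, format="csr", random_state=0)

# Stand-in for Steps 3-8: Lanczos-based truncated SVD extracting f singular triplets.
U, sigma, Vt = svds(G, k=f)
order = np.argsort(sigma)[::-1]      # svds returns singular values in ascending order
U, sigma, Vt = U[:, order], sigma[order], Vt[order, :]

Pi_full = U @ np.diag(sigma) @ Vt    # Step 8: Pi_full = U Sigma V^T
# Step 9 would overwrite the rows of Pi_full belonging to newly added (cold-start)
# users with the Markov-chain scores Pi_sparse from Step 2.
```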

9. Experimental Evaluation
Datasets: Yahoo!R2Music, MovieLens.
Competing methods:
- Commute Time (CT)
- Pseudo-inverse of the user-item graph Laplacian (L†)
- Matrix Forest Algorithm (MFA)
- First Passage Time (FP)
- Katz Algorithm (Katz)
Metrics: Recall, Precision, R-Score, Normalized Discounted Cumulative Gain (NDCG@k), Mean Reciprocal Rank (MRR).
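For reference, a minimal sketch of how these metrics reduce when each ranked list hides exactly one relevant (5-star) item, as in the protocol on the next slide; `rank` is the 1-based position of that item, and the half-life form used for the R-Score is an assumption about how R(5)/R(10) are instantiated.

```python
import numpy as np

# Single-relevant-item versions of the listed metrics (assumed protocol:
# one hidden 5-star item per ranked list; `rank` is its 1-based position).
def recall_at_n(rank, n):
    return 1.0 if rank <= n else 0.0        # hit if the hidden item is in the top-n

def reciprocal_rank(rank):
    return 1.0 / rank                       # averaged over test cases this gives the MRR

def ndcg_at_k(rank, k):
    return 1.0 / np.log2(rank + 1) if rank <= k else 0.0   # IDCG = 1 for one relevant item

def r_score(rank, halflife):
    # Standard half-life utility form (assumed instantiation of R(5) and R(10))
    return 2.0 ** (-(rank - 1) / (halflife - 1))

ranks = np.array([1, 3, 12, 2])             # example positions of hidden items in 4 test lists
print(np.mean([recall_at_n(r, 10) for r in ranks]),
      np.mean([reciprocal_rank(r) for r in ranks]),
      np.mean([ndcg_at_k(r, 20) for r in ranks]))
```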

10. Quality of Recommendations
Methodology:
- Randomly sample 1.4% of the ratings of the dataset ⇒ probe set P.
- Use each item v_j rated with 5 stars by user u_i in P ⇒ test set T.
- For each item in T, randomly select another 1000 unrated items of the same user.
- Form ranked lists by ordering all 1001 items.
Table I: Recommendation quality on the MovieLens1M and Yahoo!R2Music datasets using the R-Score and MRR metrics.

           MovieLens1M                Yahoo!R2Music
           R(5)    R(10)   MRR        R(5)    R(10)   MRR
NCDREC     0.3997  0.5098  0.3008     0.3539  0.4587  0.2647
MFA        0.1217  0.1911  0.0887     0.2017  0.2875  0.1591
L†         0.1216  0.1914  0.0892     0.1965  0.2814  0.1546
FP         0.2054  0.2874  0.1524     0.1446  0.2241  0.0998
Katz       0.2187  0.3020  0.1642     0.1704  0.2529  0.1203
CT         0.2070  0.2896  0.1535     0.1465  0.2293  0.1019

[Figure: Recall@N, Precision/Recall, and NDCG@N curves on MovieLens1M and Yahoo!R2Music for NCDREC, Katz, FP, MFA, CT, and L†.]
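The protocol above can be sketched as follows; the scores are random stand-ins for whatever a competing method would assign, and the trial count is arbitrary.

```python
import numpy as np

# Hedged sketch of the ranking protocol: for each 5-star probe rating, rank the
# hidden test item against 1000 randomly selected unrated items of the same user.
rng = np.random.default_rng(0)

def rank_of_test_item(scores_unrated, score_test):
    return int(np.sum(scores_unrated >= score_test)) + 1   # 1-based rank in the 1001-item list

ranks = []
for _ in range(2000):                    # one trial per (user, hidden 5-star item) pair
    unrated_scores = rng.random(1000)    # stand-in scores for the 1000 unrated items
    test_score = rng.random()            # stand-in score for the hidden item
    ranks.append(rank_of_test_item(unrated_scores, test_score))

ranks = np.array(ranks)
print("Recall@10:", np.mean(ranks <= 10), "MRR:", np.mean(1.0 / ranks))
```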
