PRanking with Ranking – Koby Crammer, Technion, Israel Institute of Technology - PowerPoint PPT Presentation



  1. PRanking with Ranking. Koby Crammer, Technion – Israel Institute of Technology. Based on joint work with Yoram Singer at the Hebrew University of Jerusalem.

  2. The Problem. [Figure: the machine's predicted rankings compared against the user's ratings (the values 3, 3, 0, 3, 1 shown) and the resulting ranking loss.]

  3. Ranking – Formal Description. Problem setting: instances x; labels (ranks) with the ordered structure 1 ≺ 2 ≺ 3 ≺ 4 ≺ 5 (so x1 is preferred over x2 when it receives a higher rank); a ranking rule; and a ranking loss. Online framework: the algorithm works in rounds; on each round it gets an input instance, outputs a rank as its prediction, receives the correct rank value, computes the loss, and updates the rank-prediction rule.
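
To make the round structure concrete, here is a minimal sketch of this online protocol in Python; the `ranker` object with `predict`/`update` methods and the `stream` of (instance, true rank) pairs are illustrative assumptions, not names from the slides.

```python
# Minimal sketch of the online ranking protocol of slide 3 (illustrative only).
def online_rounds(ranker, stream):
    total_loss = 0
    for x, y in stream:               # one round per (instance, true rank) pair
        y_hat = ranker.predict(x)     # output a rank as the prediction
        total_loss += abs(y - y_hat)  # receive the correct rank, compute the loss
        ranker.update(x, y)           # update the rank-prediction rule
    return total_loss
```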

  4. Goal. The algorithm's loss: $L_t = \sum_{i=1}^{t} \left| y_i - \hat{y}_i \right|$. The loss of a fixed function $f$: $L_t(f) = \sum_{i=1}^{t} \left| y_i - f(x_i) \right|$. The regret: $L_t - \inf_{f \in F} L_t(f)$. No statistical assumptions are made over the data; the algorithm should do well irrespective of the specific sequence of inputs and target labels.
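
A small sketch of this loss and regret bookkeeping in Python; the function names and the numbers in the example are made up for illustration.

```python
# Cumulative ranking loss, and a regret estimate against one fixed comparator
# (the true regret takes the infimum over the whole class F).
def ranking_loss(true_ranks, predicted_ranks):
    """L_t = sum_i |y_i - y_hat_i| over the rounds seen so far."""
    return sum(abs(y - y_hat) for y, y_hat in zip(true_ranks, predicted_ranks))

def regret_vs(true_ranks, predicted_ranks, comparator_predictions):
    """Loss of the online algorithm minus the loss of one fixed ranker f."""
    return (ranking_loss(true_ranks, predicted_ranks)
            - ranking_loss(true_ranks, comparator_predictions))

print(ranking_loss([3, 3, 1, 4, 2], [3, 2, 1, 4, 4]))  # -> 3
```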

  5. Background: Binary Classification

  6. The Perceptron Algorithm (Rosenblatt, 1958). The classifier is a hyperplane defined by the weight vector w. [Figure: the hyperplane w separating classes 1 and 2.]

  7. The Perceptron Algorithm (Rosenblatt, 1958). Get a new instance x; classify x by sign(w · x). [Figure: the augmented point (x, 1) placed relative to the hyperplane w, classes 1 and 2.]

  8. The Perceptron Algorithm (Rosenblatt, 1958). Get a new instance x; classify x by sign(w · x); update w in case of a mistake. [Figure: a misclassified point (x, 1) and the hyperplane after the update, classes 1 and 2.]
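
For reference, a compact sketch of the perceptron loop described on slides 6–8; the training loop, the `epochs` parameter, and treating sign(0) as a mistake are my own illustrative choices.

```python
import numpy as np

def perceptron(X, y, epochs=10):
    """X: (n_samples, n_features) array; y: labels in {-1, +1}.
    A bias can be absorbed by appending a constant-1 feature to each x,
    as the (x, 1) on slide 7 suggests."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_t, y_t in zip(X, y):
            if np.sign(w @ x_t) != y_t:   # mistake (sign(0) counts as a mistake)
                w = w + y_t * x_t         # conservative update: only on errors
    return w
```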

  9. A Function Class for Ranking

  10. Our Approach to Ranking. Step 1: project the instance onto a direction.

  11. Our Approach to Ranking. Step 1: project the instance onto a direction. Step 2: apply thresholds to the projected value to read off the rank. [Figure: thresholds splitting the projection axis into rank intervals 1, 2, 3, 4.]
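
A sketch of this project-then-threshold prediction rule; the convention that an implicit last threshold equals +infinity (so that every score falls into some rank interval) follows my reading of PRank and is an assumption here.

```python
import numpy as np

def predict_rank(w, thresholds, x):
    """Return the smallest rank r (1-based) such that w . x < thresholds[r-1]."""
    score = w @ x
    b = np.append(thresholds, np.inf)     # implicit last threshold b_k = +inf
    return int(np.argmax(score < b)) + 1  # index of the first threshold above the score

# e.g. predict_rank(np.array([1.0, -0.5]), np.array([-1.0, 0.0, 1.0, 2.0]),
#                   np.array([0.3, 0.2]))  # score 0.2 falls below b_3 = 1.0 -> rank 3
```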

  12. Update of a Specific Algorithm. Guiding principles: don't fix it if it's not broken; make the least change possible; one step at a time.

  13. PRank. Maintain a direction w and a set of thresholds. [Figure: the projection axis along w divided by the thresholds into rank levels 1–5.]

  14. PRank. Maintain a direction w and thresholds; rank a new instance x. [Figure: the projection w · x placed among the thresholds, rank levels 1–5.]

  15. PRank. Rank a new instance x; get the correct rank y. [Figure: the interval of the correct rank highlighted among rank levels 1–5.]

  16. PRank. Rank a new instance x; get the correct rank y; compute the error set E. [Figure: the thresholds lying between w · x and the correct rank's interval marked as the error set, rank levels 1–5.]

  17. PRank – Update. Rank a new instance x; get the correct rank y; compute the error set E; update. [Figure: the direction w and the thresholds in E before the update, rank levels 1–5.]

  18. PRank – Update. Rank a new instance x; get the correct rank y; compute the error set E; update: adjust the direction w using x and shift the thresholds in the error set. [Figure: the update in progress on the projection axis.]

  19. PRank – Summary of Update. Rank a new instance x; get the correct rank y; compute the error set E; update the direction w and the thresholds in E. [Figure: the direction and thresholds after the update.]

  20. The PRank Algorithm. Maintain the direction w and the thresholds. Get an instance x; predict its rank; get the true rank y; if the prediction is wrong, compute the error set and update; otherwise continue with the next instance. [Flowchart of this loop.]
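
Putting slides 13–20 together, here is a self-contained sketch of the PRank loop in Python. The exact form of the error set and of the one-step threshold moves follows my reading of the PRank paper; treat this as an illustration rather than the authors' reference code.

```python
import numpy as np

class PRank:
    def __init__(self, n_features, n_ranks):
        self.w = np.zeros(n_features)
        self.b = np.zeros(n_ranks - 1)   # finite thresholds; b_k = +inf is implicit

    def predict(self, x):
        """Smallest rank r with w . x < b_r."""
        score = self.w @ x
        return int(np.argmax(score < np.append(self.b, np.inf))) + 1

    def round(self, x, y):
        """One online round: predict, then update w and the thresholds on a mistake."""
        y_hat = self.predict(x)
        if y_hat != y:
            score = self.w @ x
            r = np.arange(1, len(self.b) + 1)
            y_r = np.where(y > r, 1, -1)                 # desired side of each threshold
            tau = np.where(y_r * (score - self.b) <= 0,  # error set E: thresholds on
                           y_r, 0)                       # the wrong side of the score
            self.w = self.w + tau.sum() * x              # move the direction using x
            self.b = self.b - tau                        # move each threshold in E one step
        return y_hat
```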

  21. Analysis: Two Lemmas

  22. Consistency. Can the following happen – the thresholds ending up out of order? [Figure: the thresholds drawn out of order along the direction w.]

  23. Consistency. Can the following happen? No. The order of the thresholds is preserved after each round of PRank: b_1 ≤ b_2 ≤ … ≤ b_{k-1}.
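
A quick empirical sanity check of this lemma, using the PRank update sketched above on random data; the random instances and ranks are made up, and the check simply asserts that the thresholds stay sorted after every round.

```python
import numpy as np

rng = np.random.default_rng(0)
k, d = 5, 3                         # 5 rank levels, 3 features (arbitrary choices)
w, b = np.zeros(d), np.zeros(k - 1)
for _ in range(1000):
    x = rng.normal(size=d)
    y = int(rng.integers(1, k + 1))
    score = w @ x
    y_hat = int(np.argmax(score < np.append(b, np.inf))) + 1
    if y_hat != y:
        r = np.arange(1, k)
        y_r = np.where(y > r, 1, -1)
        tau = np.where(y_r * (score - b) <= 0, y_r, 0)
        w, b = w + tau.sum() * x, b - tau
    assert np.all(np.diff(b) >= 0), "threshold order violated"
print("threshold order preserved over 1000 random rounds")
```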

  24. Regret Bound. Given: an arbitrary input sequence. Easy case: assume there exists a model that ranks all the input instances correctly; then the total loss the algorithm suffers is bounded. Hard case: in general, a "regret" $L_t - \inf_{\tilde{f} \in F} L_t(\tilde{f})$ is bounded.

  25.–27. Ranking Margin. Margin(x, y) = the minimum of the distances from the projection w · x to the two thresholds bounding the correct rank y. [Figures: the margin built up on the projection axis, rank levels 1–5.]

  28. Ranking Margin. Margin = the minimum of Margin(x, y) over the examples. [Figure: the overall margin on the projection axis, rank levels 1–5.]

  29. Mistake Bound. Given: an input sequence whose instances have bounded norm and which is ranked correctly by a normalized ranker with margin > 0. Then: the number of mistakes (cumulative ranking loss) PRank makes is bounded.
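
For concreteness, the bound from the original PRank paper (Crammer & Singer) has, to the best of my recollection, the following form, with $R$ bounding the instance norms, $\gamma$ the margin of the normalized ranker, and $k$ the number of rank levels; a hedged reconstruction, not a formula taken from the slide:

```latex
% Hedged reconstruction of the PRank mistake bound:
% if \|x_t\| \le R for all t and a unit-norm ranker attains margin \gamma > 0,
% the cumulative ranking loss of PRank is bounded by
\sum_{t=1}^{T} \left| \hat{y}_t - y_t \right|
  \;\le\; \frac{(k-1)\,\bigl(R^2 + 1\bigr)}{\gamma^2}
```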

  30. Exploit Structure. The loss should exploit the structure of the label range: classification assumes no structure (under-constrained); regression assumes a metric (over-constrained); ranking assumes an order.

  31. Other Approaches. Treat ranking as classification or regression (e.g. Basu, Hirsh, Cohen 1998). Reduce the ranking problem to a classification problem over pairs of examples (e.g. Freund, Iyer, Schapire, Singer 1998; Herbrich, Graepel, Obermayer 2000) – but it is not simple to combine preference predictions over pairs into a single consistent ordering, and there is no simple adaptation to online settings.

  32. Empirical Study

  33. An Illustration. Five concentric ellipses; a training set of 50 points; three approaches: ranking (PRank), classification (MC-Perceptron), and regression (Widrow-Hoff). [Figure: the rank regions learned by each of the three methods.]

  34. The EachMovie database: 74,424 registered users; 1,648 listed movies; users' rankings of the movies; 7,451 users saw more than 100 movies; 1,801 users saw more than 200 movies.

  35. Ranking Loss, 100 Viewers. [Plot: rank loss per round for Widrow-Hoff regression (over-constrained), MC-Perceptron classification (under-constrained), and PRank (accurately constrained).]

  36. Ranking Loss, 200 Viewers. [Plot: rank loss per round for Widrow-Hoff regression, MC-Perceptron classification, and PRank.]

  37. Demonstration

  38. Demo screen: (1) the user chooses movies from this list; (2) the movies chosen and ranked by the user.

  39. Demo screen: (3) press the 'learn' key – the system learns the user's taste; (4) the system re-ranks the training set; (5) the system re-ranks a fresh set of yet-unseen movies.

  40. Demo screen: (6) press the 'flip' button to see which movies you should not view; (7) the flipped list.

  41. Many alternatives exist for formulating ranking; choose the one that best models your problem; exploit and incorporate the structure. Specifically: an online algorithm for ranking problems via projections and a conservative update of the projection's direction and the threshold values; experiments on a synthetic dataset and on the EachMovie dataset indicate that the PRank algorithm performs better than algorithms for classification and regression.
