Weighting and ranking based on pairwise comparisons – PowerPoint PPT Presentation

Sándor Bozóki, Institute for Computer Science and Control, Hungarian Academy of Sciences (MTA SZTAKI); Corvinus University of Budapest. Slides: http://www.sztaki.mta.hu/%7Ebozoki/slides


  1. Weighting and ranking based on pairwise comparisons (Páros összehasonlítás alapú súlyozás és rangsorolás). Sándor Bozóki, Institute for Computer Science and Control, Hungarian Academy of Sciences (MTA SZTAKI); Corvinus University of Budapest. October 31, 2017. Slides can be downloaded from http://www.sztaki.mta.hu/%7Ebozoki/slides

  2. Multi Criteria Decision Making: pairwise comparisons and the Analytic Hierarchy Process (Saaty, 1977); incomplete pairwise comparison matrices; efficiency (Pareto optimality).

  3. Multi Criteria Decision Making

  4. The aim of multiple criteria decision analysis: to select the overall best alternative from a finite set of alternatives, with respect to a finite set of attributes (criteria); or to rank the alternatives; or to classify the alternatives.

  5. Multiple criteria decision problems. Examples: tenders, public procurements, privatizations; evaluation of applications; environmental studies; ranking, classification.

  6. Multiple criteria decision problems. Examples: tenders, public procurements, privatizations; evaluation of applications; environmental studies; ranking, classification. Properties: the criteria contradict each other; there is no single solution that is optimal with respect to every criterion; subjective factors influence the decision; contradictory individual opinions have to be aggregated.

  7. Main tasks in multi criteria decision problems: assign weights of importance to the criteria; evaluate the alternatives; aggregate the evaluations with the criteria weights; perform sensitivity analysis.

  8. Decomposition of the goal: tree of criteria.
     main criterion 1: criterion 1.1, criterion 1.2, criterion 1.3, criterion 1.4, criterion 1.5
     main criterion 2: criterion 2.1, criterion 2.2
     main criterion 3: criterion 3.1 (subcriterion 3.1.1, subcriterion 3.1.2), criterion 3.2

  9. Estimating weights from pairwise comparisons. 'How many times is criterion 1 more important than criterion 2?' The matrix
$$A = \begin{pmatrix} 1 & a_{12} & a_{13} & \dots & a_{1n} \\ a_{21} & 1 & a_{23} & \dots & a_{2n} \\ a_{31} & a_{32} & 1 & \dots & a_{3n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & a_{n3} & \dots & 1 \end{pmatrix}$$
is given, where $a_{ij} > 0$ and $a_{ji} = 1/a_{ij}$ for all indices $i, j = 1, \dots, n$. The aim is to find a weight vector $w = (w_1, w_2, \dots, w_n)^\top \in \mathbb{R}^n_+$ such that the ratios $w_i / w_j$ are close to the $a_{ij}$.
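A minimal numpy sketch (not from the slides; the weight vector is a hypothetical example) of what "close" means in the ideal case: if the judgments came from an underlying weight vector, the matrix would be exactly the ratio matrix $a_{ij} = w_i / w_j$, and reciprocity $a_{ji} = 1/a_{ij}$ would hold automatically.

```python
# Hypothetical illustration: the perfectly consistent pairwise comparison
# matrix of a weight vector w has entries a_ij = w_i / w_j.
import numpy as np

w = np.array([0.5, 0.3, 0.2])          # hypothetical weights, sum to 1
A = w[:, None] / w[None, :]            # a_ij = w_i / w_j

print(A)                               # diagonal entries are all 1
print(np.allclose(A * A.T, 1.0))       # reciprocity a_ij * a_ji = 1 -> True
```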

  10. Evaluation of the alternatives. Alternatives are evaluated directly, or by using a function, or by pairwise comparisons as before. 'How many times is alternative 1 better than alternative 2 with respect to criterion 1.1?'
$$B = \begin{pmatrix} 1 & b_{12} & b_{13} & \dots & b_{1m} \\ b_{21} & 1 & b_{23} & \dots & b_{2m} \\ b_{31} & b_{32} & 1 & \dots & b_{3m} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ b_{m1} & b_{m2} & b_{m3} & \dots & 1 \end{pmatrix}$$

  11. Aggregation of the evaluations. Total scores are calculated as a weighted sum of the evaluations with respect to the leaf nodes of the criteria tree (bottom up); the partial sums are informative.
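A hedged Python sketch of this bottom-up aggregation; the criteria tree, local weights, and leaf evaluations below are hypothetical and only illustrate the weighted-sum recursion.

```python
# Hypothetical two-level criteria tree: local weights sum to 1 within each parent.
tree = {
    "goal": {"children": {"crit1": 0.6, "crit2": 0.4}},
    "crit1": {"children": {"crit1.1": 0.7, "crit1.2": 0.3}},
    "crit2": {"children": {}},
}
# Hypothetical leaf evaluations of two alternatives, A and B.
scores = {
    "crit1.1": {"A": 0.8, "B": 0.5},
    "crit1.2": {"A": 0.2, "B": 0.9},
    "crit2":   {"A": 0.4, "B": 0.6},
}

def total_score(node, alt):
    """Weighted sum of the subtree rooted at `node`, computed bottom up."""
    children = tree.get(node, {}).get("children", {})
    if not children:                     # leaf: use its direct evaluation
        return scores[node][alt]
    return sum(w * total_score(child, alt) for child, w in children.items())

for alt in ("A", "B"):
    print(alt, round(total_score("goal", alt), 3))
```

The partial sums mentioned on the slide correspond to the intermediate `total_score` values of the inner nodes.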

  12. Weighting methods. Eigenvector Method (Saaty): $Aw = \lambda_{\max} w$. Logarithmic Least Squares Method (LLS):
$$\min \sum_{i=1}^{n} \sum_{j=1}^{n} \left( \log a_{ij} - \log \frac{w_i}{w_j} \right)^2 \quad \text{subject to} \quad \sum_{i=1}^{n} w_i = 1, \; w_i > 0, \; i = 1, 2, \dots, n,$$
and 20+ other methods.
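A hedged numpy sketch of the two named methods on a small complete matrix; the 3x3 judgments are hypothetical. The eigenvector method takes the principal eigenvector of $A$; for complete matrices the LLS optimum coincides with the normalized row-wise geometric mean.

```python
import numpy as np

A = np.array([[1.0, 2.0, 6.0],
              [1/2, 1.0, 3.0],
              [1/6, 1/3, 1.0]])         # hypothetical (consistent) judgments

# Eigenvector Method (Saaty): A w = lambda_max w
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w_ev = np.abs(eigvecs[:, k].real)
w_ev /= w_ev.sum()

# Logarithmic Least Squares: geometric mean of the rows, normalized
w_lls = np.exp(np.log(A).mean(axis=1))
w_lls /= w_lls.sum()

print(w_ev, w_lls)   # for this consistent example both give [0.6, 0.3, 0.1]
```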

  13. Incomplete pairwise comparison matrix

  14. Incomplete pairwise comparison matrix:
$$A = \begin{pmatrix} 1 & a_{12} & & a_{14} & a_{15} & a_{16} \\ a_{21} & 1 & a_{23} & & & \\ & a_{32} & 1 & a_{34} & & \\ a_{41} & & a_{43} & 1 & a_{45} & \\ a_{51} & & & a_{54} & 1 & \\ a_{61} & & & & & 1 \end{pmatrix}$$
(blank positions denote missing comparisons).

  15. Incomplete pairwise comparison matrix and its graph: the same matrix as above; its graph has a vertex for each item and an edge $\{i, j\}$ for every known entry $a_{ij}$, here the edges $\{1,2\}, \{1,4\}, \{1,5\}, \{1,6\}, \{2,3\}, \{3,4\}, \{4,5\}$. [Graph figure not reproduced.]

  16. $\lambda_{\max}$-minimal completion:
$$A = \begin{pmatrix} 1 & a_{12} & * & \dots & a_{1n} \\ 1/a_{12} & 1 & a_{23} & \dots & * \\ * & 1/a_{23} & 1 & \dots & a_{3n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1/a_{1n} & * & 1/a_{3n} & \dots & 1 \end{pmatrix}$$

  17. $\lambda_{\max}$-minimal completion:
$$A(x) = \begin{pmatrix} 1 & a_{12} & x_1 & \dots & a_{1n} \\ 1/a_{12} & 1 & a_{23} & \dots & x_d \\ 1/x_1 & 1/a_{23} & 1 & \dots & a_{3n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1/a_{1n} & 1/x_d & 1/a_{3n} & \dots & 1 \end{pmatrix},$$
where $x_1, x_2, \dots, x_d \in \mathbb{R}_+$.

  18. $\lambda_{\max}$-minimal completion (continued): with $A(x)$ as above, the aim is to solve $\min_{x > 0} \lambda_{\max}(A(x))$.

  19. $\lambda_{\max}$-minimal completion. Theorem (Bozóki, Fülöp, Rónyai, 2010): The optimal solution of the eigenvalue minimization problem $\min_{x > 0} \lambda_{\max}(A(x))$ is unique if and only if the graph $G$ corresponding to the incomplete pairwise comparison matrix is connected.

  20. $\lambda_{\max}$-minimal completion (continued): if the graph $G$ corresponding to the incomplete pairwise comparison matrix is connected, then by the exponential parametrization $x_1 = e^{y_1}, x_2 = e^{y_2}, \dots, x_d = e^{y_d}$ the eigenvalue minimization problem is transformed into a strictly convex optimization problem.
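A hedged numerical sketch of slides 17-20, assuming numpy and scipy: the missing entries are parametrized as $x_k = e^{y_k}$ and $\lambda_{\max}(A(x))$ is minimized with a generic optimizer. The 4x4 pattern and its known entries are hypothetical; the slides' own solution method may differ.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical known entries (0-based indices); the pairs (1,3)/(3,1) and
# (2,4)/(4,2) of the original 1-based numbering are missing.
known = {(0, 1): 2.0, (0, 3): 4.0, (1, 2): 3.0, (2, 3): 1/2}

def build(y):
    """Fill the two missing entries with x = exp(y) and their reciprocals."""
    A = np.ones((4, 4))
    for (i, j), a in known.items():
        A[i, j], A[j, i] = a, 1.0 / a
    x1, x2 = np.exp(y)
    A[0, 2], A[2, 0] = x1, 1.0 / x1
    A[1, 3], A[3, 1] = x2, 1.0 / x2
    return A

def lambda_max(y):
    return np.linalg.eigvals(build(y)).real.max()

res = minimize(lambda_max, x0=np.zeros(2), method="Nelder-Mead")
print(res.fun, np.exp(res.x))   # minimal lambda_max and the completing entries
```

Since the graph of the known entries is connected, the theorem above guarantees that this minimizer is unique.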

  21. The Logarithmic Least Squares (LLS) problem:
$$\min \sum_{i, j:\, a_{ij} \text{ is known}} \left( \log a_{ij} - \log \frac{w_i}{w_j} \right)^2, \qquad w_i > 0, \; i = 1, 2, \dots, n.$$
The most common normalizations are $\sum_{i=1}^{n} w_i = 1$, $\prod_{i=1}^{n} w_i = 1$ and $w_1 = 1$.

  22. Theorem (Bozóki, Fülöp, Rónyai, 2010): Let $A$ be an incomplete or complete pairwise comparison matrix such that its associated graph $G$ is connected. Then the optimal solution $w = \exp y$ of the logarithmic least squares problem is the unique solution of the following system of linear equations:
$$(Ly)_i = \sum_{k:\, e(i,k) \in E(G)} \log a_{ik} \quad \text{for all } i = 1, 2, \dots, n, \qquad y_1 = 0,$$
where $L$ denotes the Laplacian matrix of $G$ ($\ell_{ii}$ is the degree of node $i$ and $\ell_{ij} = -1$ if nodes $i$ and $j$ are adjacent).
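A hedged numpy sketch of this theorem: assemble the Laplacian $L$ and the right-hand side from the known entries, fix $y_1 = 0$, solve the remaining equations, and exponentiate. The function name and the 0-based indexing are ours, not the slides'.

```python
import numpy as np

def lls_weights(n, known):
    """known: {(i, j): a_ij} with 0-based indices; the graph must be connected."""
    L = np.zeros((n, n))
    b = np.zeros(n)
    for (i, j), a in known.items():
        L[i, i] += 1
        L[j, j] += 1
        L[i, j] -= 1
        L[j, i] -= 1
        b[i] += np.log(a)      # contribution of a_ij to row i
        b[j] -= np.log(a)      # a_ji = 1 / a_ij contributes -log a_ij to row j
    y = np.zeros(n)
    y[1:] = np.linalg.solve(L[1:, 1:], b[1:])   # y_1 = 0 fixed by the theorem
    return np.exp(y) / np.exp(y).sum()

# Tiny demo: a 3-item chain with a_12 = 2, a_23 = 3 gives weights (0.6, 0.3, 0.1).
print(lls_weights(3, {(0, 1): 2.0, (1, 2): 3.0}))
```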

  23. Example: for the incomplete $6 \times 6$ matrix shown earlier,
$$\begin{pmatrix} 4 & -1 & 0 & -1 & -1 & -1 \\ -1 & 2 & -1 & 0 & 0 & 0 \\ 0 & -1 & 2 & -1 & 0 & 0 \\ -1 & 0 & -1 & 3 & -1 & 0 \\ -1 & 0 & 0 & -1 & 2 & 0 \\ -1 & 0 & 0 & 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} y_1 (= 0) \\ y_2 \\ y_3 \\ y_4 \\ y_5 \\ y_6 \end{pmatrix} = \begin{pmatrix} \log(a_{12} a_{14} a_{15} a_{16}) \\ \log(a_{21} a_{23}) \\ \log(a_{32} a_{34}) \\ \log(a_{41} a_{43} a_{45}) \\ \log(a_{51} a_{54}) \\ \log a_{61} \end{pmatrix}$$
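Continuing the example, a self-contained numpy check of this exact system; the numeric judgments $a_{12}, a_{14}, \dots$ are hypothetical placeholders, inserted only so the system can actually be solved.

```python
import numpy as np

# Hypothetical known judgments for the 6x6 pattern above (1-based pairs).
a = {(1, 2): 2.0, (1, 4): 4.0, (1, 5): 1.0, (1, 6): 3.0,
     (2, 3): 1/2, (3, 4): 5.0, (4, 5): 1/4}

L = np.array([[ 4, -1,  0, -1, -1, -1],
              [-1,  2, -1,  0,  0,  0],
              [ 0, -1,  2, -1,  0,  0],
              [-1,  0, -1,  3, -1,  0],
              [-1,  0,  0, -1,  2,  0],
              [-1,  0,  0,  0,  0,  1]], dtype=float)

log = np.log
rhs = np.array([log(a[1, 2] * a[1, 4] * a[1, 5] * a[1, 6]),
                log((1 / a[1, 2]) * a[2, 3]),                  # a_21 a_23
                log((1 / a[2, 3]) * a[3, 4]),                  # a_32 a_34
                log((1 / a[1, 4]) * (1 / a[3, 4]) * a[4, 5]),  # a_41 a_43 a_45
                log((1 / a[1, 5]) * (1 / a[4, 5])),            # a_51 a_54
                log(1 / a[1, 6])])                             # a_61

y = np.zeros(6)                        # y_1 = 0 by the normalization
y[1:] = np.linalg.solve(L[1:, 1:], rhs[1:])
w = np.exp(y) / np.exp(y).sum()
print(w)
```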

  24. Pairwise Comparison Matrix Calculator: weights from incomplete pairwise comparison matrices can be calculated at pcmc.online

  25. The spanning tree approach (Tsyganok, 2000, 2010): consider again the incomplete $6 \times 6$ matrix shown earlier, with known entries $a_{12}, a_{14}, a_{15}, a_{16}, a_{23}, a_{34}, a_{45}$ and their reciprocals.

  26. The spanning tree approach (Tsyganok, 2000, 2010): a spanning tree of its graph, e.g. the one with edges $\{1,2\}, \{2,3\}, \{1,4\}, \{1,5\}, \{1,6\}$, keeps only the entries
$$\begin{pmatrix} 1 & a_{12} & & a_{14} & a_{15} & a_{16} \\ a_{21} & 1 & a_{23} & & & \\ & a_{32} & 1 & & & \\ a_{41} & & & 1 & & \\ a_{51} & & & & 1 & \\ a_{61} & & & & & 1 \end{pmatrix}.$$

  28. The spanning tree approach: every spanning tree induces a weight vector. Natural ways of aggregation: arithmetic mean, geometric mean, etc.
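A hedged Python sketch of the spanning-tree approach on the $6 \times 6$ example, reusing the same hypothetical judgments as above: enumerate the spanning trees by brute force, derive one weight vector per tree by setting $w_1 = 1$ and propagating $w_j = w_i / a_{ij}$ along tree edges, then aggregate with the geometric mean.

```python
from itertools import combinations
import numpy as np

n = 6
a = {(1, 2): 2.0, (1, 4): 4.0, (1, 5): 1.0, (1, 6): 3.0,
     (2, 3): 1/2, (3, 4): 5.0, (4, 5): 1/4}    # hypothetical judgments
edges = list(a)

def tree_weights(tree_edges):
    """Weight vector induced by one edge set, or None if it is not spanning."""
    adj = {i: [] for i in range(1, n + 1)}
    for i, j in tree_edges:
        adj[i].append(j)
        adj[j].append(i)
    w, stack = {1: 1.0}, [1]                   # w_1 = 1, propagate outward
    while stack:
        i = stack.pop()
        for j in adj[i]:
            if j not in w:
                ratio = a[(i, j)] if (i, j) in a else 1.0 / a[(j, i)]  # a_ij
                w[j] = w[i] / ratio            # a_ij ~ w_i / w_j
                stack.append(j)
    return None if len(w) < n else np.array([w[k] for k in range(1, n + 1)])

vectors = [v for te in combinations(edges, n - 1)
           if (v := tree_weights(te)) is not None]
w_gm = np.exp(np.mean(np.log(vectors), axis=0))   # geometric mean over all trees
print(w_gm / w_gm.sum())
```

By the theorems on the next two slides, this geometric mean should agree, up to normalization, with the LLS weights obtained from the Laplacian system above.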

  29. Theorem (Lundy, Siraj, Greco, 2017): For complete pairwise comparison matrices, the geometric mean of the weight vectors calculated from all spanning trees is logarithmic least squares optimal.

  30. Theorem (Bozóki, Tsyganok): Let $A$ be an incomplete or complete pairwise comparison matrix such that its associated graph is connected. Then the optimal solution of the logarithmic least squares problem is equal, up to a scalar multiplier, to the geometric mean of the weight vectors calculated from all spanning trees.
