A new efficient SAT formulation for learning NCS models: numerical results
Kh. Belahcène, O. Khaled, V. Mousseau, W. Ouerdane, A. Tlili
DA2PL 2018

Outline
◮ Introductory Example
◮ NCS
◮ Learning NCS model
◮ Computational study
◮ Discussion and conclusions
Introductory Example
project   a     b     c     d     Category
p1        ⋆⋆    ⋆⋆    ⋆⋆⋆   ⋆⋆    ?
p2        ⋆     ⋆     ⋆⋆    ⋆⋆⋆   ?
p3        ⋆⋆⋆   ⋆     ⋆     ⋆⋆    ?
p4        ⋆⋆⋆   ⋆     ⋆⋆    ⋆⋆    ?
p5        ⋆⋆⋆   ⋆     ⋆⋆    ⋆⋆⋆   ?
p6        ⋆⋆⋆   ⋆⋆    ⋆     ⋆     ?
◮ Getting an overall ⋆⋆ or ⋆⋆⋆ requires getting ⋆⋆ or ⋆⋆⋆ on a sufficient coalition of criteria
◮ Getting an overall ⋆⋆⋆ requires getting ⋆⋆⋆ on a sufficient coalition of criteria
project   a     b     c     d     Category
p1        ⋆⋆    ⋆⋆    ⋆⋆⋆   ⋆⋆    ⋆⋆
p2        ⋆     ⋆     ⋆⋆    ⋆⋆⋆   ⋆
p3        ⋆⋆⋆   ⋆     ⋆     ⋆⋆    ⋆⋆
p4        ⋆⋆⋆   ⋆     ⋆⋆    ⋆⋆    ⋆⋆
p5        ⋆⋆⋆   ⋆     ⋆⋆    ⋆⋆⋆   ⋆⋆⋆
p6        ⋆⋆⋆   ⋆⋆    ⋆     ⋆     ⋆
NCS
◮ NCS (Non-Compensatory Sorting): an MCDA method based on outranking relations
◮ Axiomatically characterized by [Bouyssou and Marchant, 2007]
◮ An object is assigned to a category when it is better than the lower limit of the category on a sufficiently strong coalition of criteria,
◮ while this is not the case when comparing the object to the upper limit of the category
◮ 2 categories: Good (G), Bad (B)
◮ objects to be sorted: X = ∏_{i∈N} X_i with N = {1, . . . , n}
◮ ≿_i a total preorder on X_i
◮ limit profile b = (b_1, . . . , b_n)
◮ F = family of sufficient coalitions, an up-closed subset of 2^N
◮ assignment rule: x ∈ Good ⟺ {i ∈ N : x_i ≿_i b_i} ∈ F
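As a minimal sketch of the binary rule (the model and data here are hypothetical, not the example from the slides), the assignment boils down to one set-membership test:

```python
from itertools import combinations

# Minimal sketch of the binary NCS rule: an object is Good iff the set of
# criteria on which it reaches the limit profile b is a sufficient
# coalition, i.e. a member of the up-closed family F.

def ncs_assign(x, b, F):
    """x, b: dicts criterion -> value; F: set of frozensets (up-closed)."""
    coalition = frozenset(i for i in b if x[i] >= b[i])
    return "Good" if coalition in F else "Bad"

# Hypothetical up-closed family on criteria a..d: {a, d} and all supersets.
criteria = ["a", "b", "c", "d"]
F = {frozenset(c) for r in range(len(criteria) + 1)
     for c in combinations(criteria, r)
     if {"a", "d"} <= set(c)}

b = {"a": 2, "b": 2, "c": 2, "d": 2}   # limit profile
x = {"a": 3, "b": 1, "c": 1, "d": 2}   # reaches b exactly on {a, d}
print(ncs_assign(x, b, F))             # prints "Good"
```

Representing F explicitly as a set of frozensets is only workable for small n; the SAT formulations below exist precisely because F is exponential in the number of criteria.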
◮ an ordered set C^1 ≺ · · · ≺ C^p of p categories
◮ objects to be sorted: X = ∏_{i∈N} X_i with N = {1, . . . , n}
◮ ≿_i a total preorder on X_i, i ∈ N
◮ limit profiles b^h = (b^h_1, . . . , b^h_n) such that b^h_i ≿_i b^{h−1}_i
◮ b^h is the upper limit of C^h, and the lower limit of C^{h+1}
◮ p − 1 nested families of sufficient coalitions F^1 ⊇ · · · ⊇ F^{p−1}
◮ assignment rule: x ∈ C^{h+1} ⟺ {i ∈ N : x_i ≿_i b^h_i} ∈ F^h and {i ∈ N : x_i ≿_i b^{h+1}_i} ∉ F^{h+1}
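With nested families, the rule amounts to climbing the levels until one fails. A small sketch (hypothetical model; each family is given by its minimal coalitions, which generate the upset):

```python
# Sketch of NCS with p ordered categories: x is placed in the highest
# category whose lower limit profile it reaches on a sufficient coalition.
# min_families[h] lists the minimal coalitions generating F^{h+1}
# (nestedness: F^1 contains F^2 contains ...).

def ncs_multi(x, profiles, min_families):
    """profiles[h]: lower limit of category h+1; returns an index in 0..p-1."""
    cat = 0
    for b, gens in zip(profiles, min_families):
        coalition = frozenset(i for i in range(len(b)) if x[i] >= b[i])
        if any(coalition >= S for S in gens):
            cat += 1      # x clears this level; try the next one
        else:
            break         # with nested families, higher levels also fail
    return cat

profiles = [(1, 1, 1), (2, 2, 2)]                    # 3 categories: 0, 1, 2
min_families = [[frozenset({0, 1})],                  # F^1 generated by {0, 1}
                [frozenset({0, 1, 2})]]               # F^2 = {N} only
print(ncs_multi((2, 2, 1), profiles, min_families))   # prints 1
```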
◮ Nested intervals of values sufficient at level C^h,
◮ Nested upsets of coalitions of criteria sufficient at level C^h
◮ Uc-NCS: using a Unique set of sufficient coalitions of criteria
◮ Uv-NCS: using a Unique set of sufficient values
◮ k-NCS: representing sufficient coalitions with a k-additive capacity
◮ 1-Uc-NCS = MR-Sort [Leroy et al., 2011]
Learning NCS model
◮ Exact approaches: best restoration of the learning set, but solvable for small instances only
◮ Heuristic approaches: no guarantee about the inferred model, but handle large learning sets
◮ x_{i,h,k}: on criterion i, is the value k sufficient at level h or not?
◮ y_B: is coalition B ⊆ N sufficient or not?
◮ for each alternative u assigned below level h and each coalition B ⊆ N: (⋁_{i∈B} ¬x_{i,h,u_i}) ∨ ¬y_B
◮ for each alternative a assigned at level h or above and each coalition B ⊆ N: (⋁_{i∈B} x_{i,h,a_i}) ∨ y_{N∖B}
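A sketch of how these coalition clauses can be enumerated for the binary (single-level) case, as DIMACS-style clauses (lists of signed integers). The variable numbering and function name are illustrative, not the paper's, and the monotonicity clauses (ascending scales, up-closedness of y) are omitted for brevity:

```python
from itertools import combinations

# Coalition-based encoding, binary case: one clause per (alternative,
# coalition) pair, i.e. |X*| * 2^n clauses for n criteria.

def satc_clauses(n, values, good, bad):
    coalitions = [frozenset(c) for r in range(n + 1)
                  for c in combinations(range(n), r)]
    X = {(i, k): 1 + i * len(values) + k for i in range(n) for k in values}
    Y = {B: len(X) + 1 + j for j, B in enumerate(coalitions)}
    clauses = []
    for u in bad:            # u must not be accepted by a sufficient B:
        for B in coalitions: # (OR_{i in B} not x_{i,u_i}) or not y_B
            clauses.append([-X[i, u[i]] for i in B] + [-Y[B]])
    for a in good:           # a must be accepted:
        for B in coalitions: # (OR_{i in B} x_{i,a_i}) or y_{N\B}
            clauses.append([X[i, a[i]] for i in B]
                           + [Y[frozenset(range(n)) - B]])
    return clauses

# 3 criteria, values 0..2, one Good and one Bad example:
cls = satc_clauses(3, [0, 1, 2], good=[(2, 2, 2)], bad=[(0, 0, 0)])
print(len(cls))   # 2^3 clauses per example -> 16
```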
◮ x_{i,k}: is the value k ∈ X_i sufficiently good?, i ∈ N
◮ z_{i,g,b}: for i ∈ N, g a good alternative (g ∈ α^{−1}(Good)) and b a bad one (b ∈ α^{−1}(Bad)), does criterion i separate g from b?
◮ ¬z_{i,g,b} ∨ x_{i,g_i} and ¬z_{i,g,b} ∨ ¬x_{i,b_i}
◮ for each pair (g, b): ⋁_{i∈N} z_{i,g,b}
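The same kind of sketch for this pairwise ("separation") encoding, again as DIMACS-style clauses with illustrative variable numbering; the ascending-scale clauses on the x variables are omitted:

```python
# Pairwise encoding, binary case: 2n + 1 clauses per (good, bad) pair,
# so roughly quadratic in the number of reference assignments.

def satp_clauses(n, values, good, bad):
    X = {(i, k): 1 + i * len(values) + k for i in range(n) for k in values}
    clauses, nxt = [], len(X) + 1
    for g in good:
        for b in bad:
            z = {}
            for i in range(n):
                z[i], nxt = nxt, nxt + 1
                clauses.append([-z[i], X[i, g[i]]])    # z -> g good enough on i
                clauses.append([-z[i], -X[i, b[i]]])   # z -> b not good enough on i
            clauses.append(list(z.values()))           # some criterion separates (g, b)
    return clauses

cls = satp_clauses(3, [0, 1, 2], good=[(2, 2, 2)], bad=[(0, 0, 0)])
print(len(cls))   # 2*3 + 1 clauses for the single (good, bad) pair -> 7
```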
◮ for each level h, each g assigned at level h or above (g ∈ α^{−1}(G^h)), each b assigned below, and each i ∈ N: ¬z_{i,h,g,b} ∨ ¬x_{i,h,b_i}
◮ and ¬z_{i,h,g,b} ∨ x_{i,h,g_i}
◮ for each such pair (g, b): ⋁_{i∈N} z_{i,h,g,b}
Computational study
◮ Number of reference assignments |X∗|: 16, 32, 64, 128, 256, 512, 1024
◮ Number of criteria |N|: 3, 5, 7, 9, 11
◮ Number of categories p: 2, 3, 4, 5
◮ 100 iterations per configuration
◮ Number of reference assignments |X∗|: 16 to 1024
◮ Number of criteria: 9
◮ Number of categories: 3

[Plot: computation time (s, log scale) vs. number of reference assignments (16–1024), SATC vs. SATP formulations]

◮ The distribution of the computation time for both SAT formulations remains narrow across iterations
◮ log(t_SATC) depends seemingly linearly on log(|X∗|), i.e. roughly O(|X∗|) growth
◮ The computation time of the second formulation grows quadratically (O(|X∗|²)) with |X∗|
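These trends are consistent with the sizes of the two encodings, assuming the first produces one clause per (alternative, coalition) pair and the second one clause per (good, bad, criterion) triple. A back-of-the-envelope count (balanced learning set, constants ignored):

```python
# Rough encoding sizes: the first is linear in the number of reference
# assignments m but exponential in the number of criteria n; the second
# is quadratic in m but only linear in n.

def size_satc(m, n):
    return m * 2 ** n                     # one clause per alternative and coalition

def size_satp(m, n):
    return (m // 2) ** 2 * (2 * n + 1)    # 2n + 1 clauses per (good, bad) pair

for m in (16, 128, 1024):
    print(m, size_satc(m, 9), size_satp(m, 9))
```

With n = 9 the counts cross between m = 16 and m = 128, which matches the observed |X∗| ≈ 150 threshold below which the second formulation wins.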
◮ Number of reference assignments: fixed
◮ Number of criteria: 3, 5, 7, 9, 11
◮ Number of categories: 3

[Plot: computation time (s, log scale) vs. number of criteria (3–11), SATC vs. SATP formulations]

◮ The distribution of the computation time for both SAT formulations remains narrow across iterations
◮ The computing time of SATC increases exponentially with the number of criteria
◮ The number of criteria has only a linear impact on the computation time of SATP
◮ T = Tmin
◮ T = Trand
◮ T = Tmax
[Plot: restoration rate vs. number of reference assignments (16–512), SATC vs. SATP formulations]

[Plot: restoration rate vs. number of criteria (3–11), SATC vs. SATP formulations]
Discussion and conclusions
◮ For |N| > 5 and |X∗| < 150: the second formulation is faster than the first
◮ For |X∗| > 150: the second formulation is penalized by the quadratic growth of its size with |X∗|
◮ For |N| > 7: the first formulation is penalized by its exponential size in the number of criteria
◮ Starting a PhD
◮ Extension to noisy data
◮ Weighted cardinality constraints → portfolio selection