
SLIDE 1

A new efficient SAT formulation for learning NCS models: numerical results

  • Kh. Belahcène, O. Khaled, V. Mousseau, W. Ouerdane, A. Tlili

DA2PL’2018

SLIDE 2

Contents

Introductory Example
NonCompensatory Sorting (NCS)
Learning NCS model
  SAT formulation based on coalitions
  SAT formulation based on pairwise separation
Computational study
Discussion and conclusions

SLIDE 3

Introductory Example

Project |  a  |  b  |  c  |  d  | Category
p1      |  5  |  6  |  6  |  5  |    ?
p2      | 3.5 |  1  |  3  |  9  |    ?
p3      | 7.5 |  2  |  1  |  3  |    ?
p4      |  2  |  8  | 2.5 |  7  |    ?
p5      |  3  | 8.5 |  3  | 8.5 |    ?
p6      |  8  |  4  | 1.5 | 1.5 |    ?

Criterion-wise limits:
⋆   | < 4   | < 3   | < 2   | < 2     (boundary between ⋆ and ⋆⋆)
⋆⋆  | [4,7[ | [3,8[ | [2,5[ | [2,8[
⋆⋆⋆ | ≥ 7   | ≥ 8   | ≥ 5   | ≥ 8     (boundary between ⋆⋆ and ⋆⋆⋆)

SLIDE 4

Introductory Example

1st phase: criterion-wise sorting

Project |  a  |  b  |  c  |  d  | Category
p1      | ⋆⋆  | ⋆⋆  | ⋆⋆⋆ | ⋆⋆  |    ?
p2      | ⋆   | ⋆   | ⋆⋆  | ⋆⋆⋆ |    ?
p3      | ⋆⋆⋆ | ⋆   | ⋆   | ⋆⋆  |    ?
p4      | ⋆   | ⋆⋆⋆ | ⋆⋆  | ⋆⋆  |    ?
p5      | ⋆   | ⋆⋆⋆ | ⋆⋆  | ⋆⋆⋆ |    ?
p6      | ⋆⋆⋆ | ⋆⋆  | ⋆   | ⋆   |    ?

Criterion-wise limits:
⋆   | < 4   | < 3   | < 2   | < 2     (boundary between ⋆ and ⋆⋆)
⋆⋆  | [4,7[ | [3,8[ | [2,5[ | [2,8[
⋆⋆⋆ | ≥ 7   | ≥ 8   | ≥ 5   | ≥ 8     (boundary between ⋆⋆ and ⋆⋆⋆)
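The first phase can be sketched in a few lines: each criterion value is mapped to 1, 2 or 3 stars using the boundary profiles from the example table. The thresholds come from the slide; the function and variable names are our own.

```python
# Lower bounds of ** and *** on criteria a, b, c, d (from the example table).
TWO_STAR_MIN = {"a": 4, "b": 3, "c": 2, "d": 2}
THREE_STAR_MIN = {"a": 7, "b": 8, "c": 5, "d": 8}

def stars(criterion, value):
    """Return the criterion-wise category (1, 2 or 3 stars)."""
    if value >= THREE_STAR_MIN[criterion]:
        return 3
    if value >= TWO_STAR_MIN[criterion]:
        return 2
    return 1

def sort_project(project):
    """First phase: map a dict of criterion values to a dict of star levels."""
    return {c: stars(c, v) for c, v in project.items()}

p1 = {"a": 5, "b": 6, "c": 6, "d": 5}
print(sort_project(p1))  # {'a': 2, 'b': 2, 'c': 3, 'd': 2}
```

Running this on p1 reproduces the first row of the table above (⋆⋆, ⋆⋆, ⋆⋆⋆, ⋆⋆).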

SLIDE 5

Introductory Example

2nd phase: noncompensatory multicriteria aggregation

Project |  a  |  b  |  c  |  d  | Category
p1      | ⋆⋆  | ⋆⋆  | ⋆⋆⋆ | ⋆⋆  |    ?
p2      | ⋆   | ⋆   | ⋆⋆  | ⋆⋆⋆ |    ?
p3      | ⋆⋆⋆ | ⋆   | ⋆   | ⋆⋆  |    ?
p4      | ⋆   | ⋆⋆⋆ | ⋆⋆  | ⋆⋆  |    ?
p5      | ⋆   | ⋆⋆⋆ | ⋆⋆  | ⋆⋆⋆ |    ?
p6      | ⋆⋆⋆ | ⋆⋆  | ⋆   | ⋆   |    ?

[Figure: sufficient and insufficient coalitions of criteria]

◮ Getting an overall ⋆⋆ or ⋆⋆⋆ requires getting ⋆⋆ or ⋆⋆⋆ on a sufficient coalition of criteria
◮ Getting an overall ⋆⋆⋆ requires getting ⋆⋆⋆ on a sufficient coalition of criteria

SLIDE 6

Introductory Example

2nd phase: noncompensatory multicriteria aggregation

Project |  a  |  b  |  c  |  d  | Category
p1      | ⋆⋆  | ⋆⋆  | ⋆⋆⋆ | ⋆⋆  |   ⋆⋆
p2      | ⋆   | ⋆   | ⋆⋆  | ⋆⋆⋆ |   ⋆
p3      | ⋆⋆⋆ | ⋆   | ⋆   | ⋆⋆  |   ⋆⋆
p4      | ⋆   | ⋆⋆⋆ | ⋆⋆  | ⋆⋆  |   ⋆⋆
p5      | ⋆   | ⋆⋆⋆ | ⋆⋆  | ⋆⋆⋆ |   ⋆⋆⋆
p6      | ⋆⋆⋆ | ⋆⋆  | ⋆   | ⋆   |   ⋆

[Figure: sufficient and insufficient coalitions of criteria]

◮ Getting an overall ⋆⋆ or ⋆⋆⋆ requires getting ⋆⋆ or ⋆⋆⋆ on a sufficient coalition of criteria
◮ Getting an overall ⋆⋆⋆ requires getting ⋆⋆⋆ on a sufficient coalition of criteria

SLIDE 7

Contents

Introductory Example
NonCompensatory Sorting (NCS)
Learning NCS model
  SAT formulation based on coalitions
  SAT formulation based on pairwise separation
Computational study
Discussion and conclusions

SLIDE 8

The Noncompensatory Sorting Model (NCS)

◮ MCDA method based on outranking relations
◮ Characterized by [Bouyssou and Marchant, 2007]

An object is assigned to a category if:

◮ it is better than the lower limit of the category on a sufficiently strong subset of criteria,
◮ while this is not the case when comparing the object to the upper limit of the category.

SLIDE 9

The Noncompensatory Sorting Model (NCS)

Simplest case: 2 categories

◮ 2 categories: Good (G), Bad (B)
◮ objects to be sorted: X = ∏_{i∈N} Xi with N = {1, . . . , n}
◮ ≽i a total preorder on Xi
◮ limit profile b = (b1, . . . , bn)
◮ F = family of sufficient coalitions, a subset of 2^N up-closed by inclusion

Assignment rule: for all x = (x1, . . . , xn) ∈ X,
x ∈ G iff {i ∈ N : xi ≽i bi} ∈ F
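The two-category assignment rule can be sketched directly. The limit profile and the family of sufficient coalitions below are illustrative (not from the slides), and the preorders ≽i are simplified to ≥ on numeric scales.

```python
from itertools import combinations

N = ["a", "b", "c"]
b = {"a": 10, "b": 10, "c": 10}  # limit profile (illustrative)
# F: any coalition containing at least two criteria is sufficient.
# F is up-closed by inclusion: supersets of sufficient coalitions are sufficient.
F = {frozenset(s) for r in (2, 3) for s in combinations(N, r)}

def is_good(x):
    """Assignment rule: x in G iff {i : x_i >= b_i} is a sufficient coalition."""
    reached = frozenset(i for i in N if x[i] >= b[i])
    return reached in F

print(is_good({"a": 12, "b": 11, "c": 3}))  # True: {a, b} is sufficient
print(is_good({"a": 12, "b": 4, "c": 3}))   # False: {a} alone is not
```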

SLIDE 10

The Noncompensatory Sorting Model (NCS)

More than 2 categories:

◮ an ordered set C1 ≺ · · · ≺ Cp of p categories
◮ objects to be sorted: X = ∏_{i∈N} Xi with N = {1, . . . , n}
◮ ≽i a total preorder on Xi, i ∈ N
◮ limit profiles b^h = (b^h_1, . . . , b^h_n) such that b^h_i ≽i b^{h−1}_i, with i ∈ N, h = 1..p − 1
◮ b^h is the upper limit of Ch and the lower limit of Ch+1
◮ p − 1 embedded families of sufficient coalitions F1 ⊆ F2 ⊆ . . . ⊆ Fp−1 (subsets of 2^N up-closed by inclusion)

Assignment rule: for all x = (x1, . . . , xn) ∈ X,
x ∈ Ch iff {i ∈ N : xi ≽i b^h_i} ∈ Fh and {i ∈ N : xi ≽i b^{h+1}_i} ∉ Fh+1
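Because the profiles are ordered and the families nested, the assignment rule admits a "climbing" reading: assign x to the highest category whose lower frontier it reaches on a sufficient coalition. The sketch below uses that reading; the instance data, names, and the use of ≥ for ≽i are our own simplifications.

```python
from itertools import combinations

def ncs_category(x, profiles, families, N):
    """Return h in 1..p such that x is assigned to category C_h.

    profiles[h] = limit profile b^h (frontier between C_h and C_{h+1}),
    families[h] = family F_h of sufficient coalitions, for h = 1..p-1.
    """
    p = len(profiles) + 1
    h = 1
    # Climb as long as x reaches the next frontier on a sufficient coalition.
    while h < p:
        reached = frozenset(i for i in N if x[i] >= profiles[h][i])
        if reached not in families[h]:
            break
        h += 1
    return h

N = ("a", "b", "c")
F = {frozenset(s) for r in (2, 3) for s in combinations(N, r)}  # >= 2 criteria
profiles = {1: {"a": 5, "b": 5, "c": 5}, 2: {"a": 8, "b": 8, "c": 8}}
families = {1: F, 2: F}  # Uc-NCS: a unique set of sufficient coalitions

print(ncs_category({"a": 9, "b": 9, "c": 1}, profiles, families, N))  # 3
print(ncs_category({"a": 6, "b": 6, "c": 1}, profiles, families, N))  # 2
print(ncs_category({"a": 6, "b": 1, "c": 1}, profiles, families, N))  # 1
```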

SLIDE 11

Variants of the NCS Model

Preference parameters

◮ Nested intervals of values sufficient at level Ch (i.e., b^1, b^2, . . . , b^{p−1})
◮ Nested upsets of coalitions of criteria sufficient at level Ch (i.e., F1 ⊆ F2 ⊆ . . . ⊆ Fp−1)

Particular cases

◮ Uc-NCS: using a Unique set of sufficient coalitions of criteria (i.e., F1 = F2 = . . . = Fp−1)
◮ Uv-NCS: using a Unique set of sufficient values (i.e., b^1 = b^2 = . . . = b^{p−1})
◮ k-NCS: representing sufficient coalitions with a k-additive capacity
◮ 1-Uc-NCS = MR-Sort [Leroy et al., 2011]

SLIDE 12

Contents

Introductory Example
NonCompensatory Sorting (NCS)
Learning NCS model
  SAT formulation based on coalitions
  SAT formulation based on pairwise separation
Computational study
Discussion and conclusions

SLIDE 13

Learning NCS model

Learning NCS using a MIP formulation [Leroy et al., 2011]

◮ Best restoration of the learning set
◮ Solvable for small instances only

Learning NCS using a heuristic [Sobrie et al., 2015]

◮ No guarantee about the inferred model
◮ Handles large learning sets

Learning NCS using SAT formulations

SLIDE 14

A SAT formulation based on coalitions

Binary variables

◮ x_{i,h,k}: on criterion i, is value k sufficient at level h?
◮ y_B: is coalition B ⊆ N sufficient?

Clauses

1. Ascending scales: for all criteria, frontiers and ordered pairs of values k < k′:
   x_{i,h,k′} ∨ ¬x_{i,h,k}
2. Hierarchy of profiles: for all criteria, values and ordered pairs of frontiers h < h′:
   x_{i,h,k} ∨ ¬x_{i,h′,k}
3. Coalition strength: for all ordered pairs of coalitions B ⊂ B′:
   y_{B′} ∨ ¬y_B
4. Alternatives are outranked by the boundary above them: for all coalitions B, frontiers h and alternatives a assigned immediately below the frontier:
   (⋁_{i∈B} ¬x_{i,h,a_i}) ∨ ¬y_B
5. Alternatives outrank the boundary below them: for all coalitions B, frontiers h and alternatives a assigned immediately above the frontier:
   (⋁_{i∈B} x_{i,h,a_i}) ∨ y_{N∖B}
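The structural clauses 1–3 depend only on the instance dimensions, not on the learning set, so they can be generated mechanically. The sketch below does that for a tiny instance; the integer variable encoding (DIMACS-style signed literals, negative = negated) and all names are our own, and clauses 4–5, which need the reference assignments, are omitted.

```python
from itertools import combinations

def coalition_clauses(criteria, frontiers, values, subsets):
    """Generate structural clauses 1-3 of the coalition-based formulation."""
    x = {}   # x[i, h, k] -> variable number
    y = {}   # y[B]       -> variable number
    nvar = 0
    for i in criteria:
        for h in frontiers:
            for k in values:
                nvar += 1
                x[i, h, k] = nvar
    for B in subsets:
        nvar += 1
        y[B] = nvar

    clauses = []
    # 1. Ascending scales: k < k'  =>  (x_{i,h,k'} or not x_{i,h,k})
    for i in criteria:
        for h in frontiers:
            for k, k2 in combinations(values, 2):
                clauses.append([x[i, h, k2], -x[i, h, k]])
    # 2. Hierarchy of profiles: h < h'  =>  (x_{i,h,k} or not x_{i,h',k})
    for i in criteria:
        for k in values:
            for h, h2 in combinations(frontiers, 2):
                clauses.append([x[i, h, k], -x[i, h2, k]])
    # 3. Coalition strength: B strict subset of B'  =>  (y_{B'} or not y_B)
    for B in subsets:
        for B2 in subsets:
            if B < B2:
                clauses.append([y[B2], -y[B]])
    return clauses

N = ("a", "b")
subsets = [frozenset(s) for r in range(3) for s in combinations(N, r)]
cls = coalition_clauses(N, [1, 2], [0, 1, 2], subsets)
print(len(cls))  # 23 = 12 (scales) + 6 (hierarchy) + 5 (strength)
```

Note that the `subsets` argument enumerates all of 2^N, which is exactly the exponential growth in |N| discussed in the computational study.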

SLIDE 15

Contents

Introductory Example
NonCompensatory Sorting (NCS)
Learning NCS model
  SAT formulation based on coalitions
  SAT formulation based on pairwise separation
Computational study
Discussion and conclusions

SLIDE 16

A SAT formulation based on pairwise separation

The simplest case: 2 categories, Good (G) and Bad (B)

α : X → {Good, Bad}: an assignment of alternatives to categories.

Binary variables

◮ x_{i,k}: is value k ∈ Xi sufficiently good? (i ∈ N)
◮ z_{i,g,b}: for i ∈ N, g a good alternative (g ∈ α⁻¹(Good)) and b a bad alternative (b ∈ α⁻¹(Bad)), does criterion i distinguish g_i from b_i?

Clauses

1. Ascending scales:
   ⋀_{i∈N} ⋀_{k′ ≽i k} (x_{i,k′} ∨ ¬x_{i,k})
2. Pairwise separation:
   ⋀_{i∈N, g∈α⁻¹(Good), b∈α⁻¹(Bad)} (¬z_{i,g,b} ∨ ¬x_{i,b_i})
   ⋀_{i∈N, g∈α⁻¹(Good), b∈α⁻¹(Bad)} (¬z_{i,g,b} ∨ x_{i,g_i})
   ⋀_{g∈α⁻¹(Good), b∈α⁻¹(Bad)} (⋁_{i∈N} z_{i,g,b})
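These clauses can be generated directly from a labeled learning set. The sketch below does so for a toy two-category instance; the integer variable encoding (signed literals, negative = negated), all names, and the instance itself are our own.

```python
def pairwise_clauses(criteria, values, alts, alpha):
    """Generate the two-category pairwise-separation clauses."""
    x, z = {}, {}
    nvar = 0
    for i in criteria:
        for k in values[i]:
            nvar += 1
            x[i, k] = nvar                 # x_{i,k}: k sufficiently good on i
    good = [a for a in alts if alpha[a] == "Good"]
    bad = [a for a in alts if alpha[a] == "Bad"]
    for i in criteria:
        for g in good:
            for b in bad:
                nvar += 1
                z[i, g, b] = nvar          # criterion i separates g from b

    clauses = []
    # 1. Ascending scales: k <= k' on criterion i => (x_{i,k'} or not x_{i,k})
    for i in criteria:
        ks = sorted(values[i])
        for lo in range(len(ks)):
            for hi in range(lo + 1, len(ks)):
                clauses.append([x[i, ks[hi]], -x[i, ks[lo]]])
    # 2. Pairwise separation, one triple of clause groups per (g, b) pair
    for g in good:
        for b in bad:
            for i in criteria:
                clauses.append([-z[i, g, b], -x[i, alts[b][i]]])  # b_i not good
                clauses.append([-z[i, g, b], x[i, alts[g][i]]])   # g_i good
            clauses.append([z[i, g, b] for i in criteria])        # some i separates
    return clauses

alts = {"g1": {"a": 2, "b": 1}, "b1": {"a": 1, "b": 1}}
alpha = {"g1": "Good", "b1": "Bad"}
cls = pairwise_clauses(("a", "b"), {"a": [1, 2], "b": [1, 2]}, alts, alpha)
print(len(cls))  # 7 = 2 (scales) + 5 (separation for the single (g, b) pair)
```

On criterion b the two alternatives tie, so z_{b,g1,b1} is forced false and separation must come from criterion a. The (g, b) double loop is also where the quadratic growth in the number of examples, noted in the computational study, comes from.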

SLIDE 17

A SAT formulation based on pairwise separation

More than two categories

1. Ascending scales:
   ⋀_{i∈N, h∈[2..p]} ⋀_{k′ ≽i k ∈ X⋆} (x_{i,h,k′} ∨ ¬x_{i,h,k})
2. Hierarchy of profiles:
   ⋀_{i∈N, h<h′∈[2..p], k∈X⋆} (x_{i,h,k} ∨ ¬x_{i,h′,k})
3. Pairwise separation:
   ⋀_{i∈N, h∈[2..p], g∈α⁻¹(G^h), b∉α⁻¹(G^h)} (¬z_{i,h,g,b} ∨ ¬x_{i,h,b_i})
   ⋀_{i∈N, h∈[2..p], g∈α⁻¹(G^h), b∉α⁻¹(G^h)} (¬z_{i,h,g,b} ∨ x_{i,h,g_i})
   ⋀_{h∈[2..p], g∈α⁻¹(G^h), b∉α⁻¹(G^h)} (⋁_{i∈N} z_{i,h,g,b})

SLIDE 18

Contents

Introductory Example
NonCompensatory Sorting (NCS)
Learning NCS model
  SAT formulation based on coalitions
  SAT formulation based on pairwise separation
Computational study
Discussion and conclusions

SLIDE 19

Experimental design

Baseline configuration: 128 reference assignments, 9 criteria and 3 categories.

◮ Number of reference assignments |X∗|: 16, 32, 64, 128, 256, 512, 1024
◮ Number of criteria |N|: 3, 5, 7, 9, 11
◮ Number of categories p: 2, 3, 4, 5
◮ 100 iterations

SLIDE 20

Computing time

◮ Number of reference assignments: 16, 32, 64, 128, 256, 512, 1024
◮ Number of criteria: 9
◮ Number of categories: 3

[Figure: computation time (s, log scale) vs. number of reference assignments, SATC vs. SATP formulations]

◮ The distribution of the computing time for both SAT formulations remains tightly grouped around its central value.
◮ log(t_SATC) is seemingly linearly dependent on log(|X∗|) (O(|X∗|)).
◮ The computation time of the second formulation increases quadratically (O(|X∗|²)) with the number of reference assignments.

SLIDE 21

Computing time

◮ Number of reference assignments: 128
◮ Number of criteria: 3, 5, 7, 9, 11
◮ Number of categories: 3

[Figure: computation time (s, log scale) vs. number of criteria, SATC vs. SATP formulations]

◮ The distribution of the computing time for both SAT formulations remains tightly grouped around its central value.
◮ The computing time of SATC increases exponentially with the number of criteria (O(2^|N|)).
◮ The number of criteria has a linear impact on the computation time of SATP.

SLIDE 22

Generalization: preliminary remark

The first formulation returns a unique set of sufficient coalitions; the second formulation returns a set of compatible sufficient coalitions.

We study three situations:

◮ T = Tmin,
◮ T = Trand,
◮ T = Tmax.

SLIDE 23

Generalization

[Figure: restoration rate vs. number of reference assignments (16–512), SATC vs. SATP formulations]

Increasing the number of reference assignments improves generalization.

[Figure: restoration rate vs. number of criteria (3–11), SATC vs. SATP formulations]

Generalization decreases as the number of criteria increases.

SLIDE 24

Contents

Introductory Example
NonCompensatory Sorting (NCS)
Learning NCS model
  SAT formulation based on coalitions
  SAT formulation based on pairwise separation
Computational study
Discussion and conclusions

SLIDE 25

Discussion and conclusions

◮ For |N| > 5 and |X∗| < 150: the second formulation is faster than the first, and generalization is equivalent for both formulations.
◮ For |X∗| > 150: the second formulation is penalized by the quadratic growth of its number of clauses in the number of examples.
◮ For |N| > 7: the first formulation is penalized by the exponential growth of its number of clauses in the number of criteria.

Further research

◮ Starting a PhD
◮ Extension to noisy data
◮ Weighted cardinality constraints → portfolio selection
