SLIDE 1

IUKM 2019 - Nara, Japan

Choquet integral in decision making and metric learning
Vicenç Torra
Hamilton Institute, Maynooth University, Ireland
March 27, 2018

SLIDE 2

Outline

Overview

Basics and objectives:

  • Using the Choquet integral in two types of applications: decision making and metric learning (re-identification)
  • Distances and distributions for non-additive measures

Vicenç Torra; Choquet integral in decision making and metric learning IUKM 2019 - Nara, Japan 1 / 62

SLIDE 3

Outline

  • 1. Preliminaries
  • Choquet integral: mathematical perspective
  • Non-additive measures
  • Now we need an integral
  • Choquet integral: Application perspective
  • Aggregation operators and CI in decision: MCDM
  • Aggregation operators and CI in reidentification: risk assessment
  • Zooming out
  • 2. Distances in classification (filling the gaps)
  • 3. Distributions

SLIDE 4

Choquet integral: a mathematical introduction

SLIDE 5

Non-additive measures


SLIDE 7

Definitions: measures

Additive measures.

  • Let (X, A) be a measurable space; then a set function µ is an additive measure if it satisfies:
    (i) µ(A) ≥ 0 for all A ∈ A,
    (ii) µ(X) < ∞,
    (iii) for every countable sequence Ai (i ≥ 1) of pairwise disjoint sets of A (i.e., Ai ∩ Aj = ∅ when i ≠ j),
         µ(∪_{i=1}^∞ Ai) = Σ_{i=1}^∞ µ(Ai)

Finite case: µ(A ∪ B) = µ(A) + µ(B) for disjoint A, B

Vicen¸ c Torra; Choquet integral in decision making and metric learning IUKM 2019 - Nara, Japan 5 / 62

slide-8
SLIDE 8

Definitions Outline

Definitions: measures

Additive measures. Example:

  • Lebesgue measure. Unique measure λ s.t. λ([a, b]) = b − a for

every finite interval [a, b]

Vicen¸ c Torra; Choquet integral in decision making and metric learning IUKM 2019 - Nara, Japan 6 / 62

slide-9
SLIDE 9

Definitions Outline

Definitions: measures

Additive measures. Example:

  • Lebesgue measure. Unique measure λ s.t. λ([a, b]) = b − a for

every finite interval [a, b]

  • Probability. When µ(X) = 1.

Vicen¸ c Torra; Choquet integral in decision making and metric learning IUKM 2019 - Nara, Japan 6 / 62

slide-10
SLIDE 10

Definitions Outline

Definitions: measures

Additive measures. Example:

  • Lebesgue measure. The unique measure λ s.t. λ([a, b]) = b − a for every finite interval [a, b]

  • Probability. When µ(X) = 1.
  • Or just price ...


SLIDE 12

Definitions: measures

  • Non-additive measures
  • Let (X, A) be a measurable space. A non-additive (fuzzy) measure µ on (X, A) is a set function µ : A → [0, 1] satisfying the following axioms:
    (i) µ(∅) = 0, µ(X) = 1 (boundary conditions)
    (ii) A ⊆ B implies µ(A) ≤ µ(B) (monotonicity)
  • Naturally, additivity implies monotonicity:
    e.g., if B = A ∪ C (with A ∩ C = ∅), then µ(B) = µ(A) + µ(C) ≥ µ(A)
  • But with non-additive measures we allow either
    µ(B = A ∪ C) < µ(A) + µ(C) or µ(B = A ∪ C) > µ(A) + µ(C),
    e.g., µ(B) = 0.5 < µ(A) + µ(C) = 0.3 + 0.4 = 0.7.
    A way to represent interactions.

SLIDE 13

Definitions: measures

  • Non-additive measures. Price
  • When we have a discount, for disjoint A and B we have µ(A ∪ B) < µ(A) + µ(B), but still µ(A ∪ B) ≥ µ(A)
  • There is quite a large number of families of such measures

SLIDE 14

Definitions: measures

  • Non-additive measures. Distorted probabilities
  • Let m : R⁺ → R⁺ be a continuous and increasing function such that m(0) = 0, and let P be a probability. Then
    µm,P(A) = m(P(A))    (1)
  • If m(x) = x^p, then µm,P(A) = (P(A))^p
  • Used in economics: prospect theory (Kahneman and Tversky, 1979). Small probabilities tend to be overestimated, while large ones tend to be underestimated.
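The distortion m can be made concrete. Below is a sketch (not from the slides) using the inverse-S probability weighting of Tversky and Kahneman's 1992 form, with their estimated γ = 0.61; it is continuous, increasing, and satisfies m(0) = 0, so it is a valid distortion.

```python
# One concrete choice of distortion m (assumption: the Tversky & Kahneman
# 1992 inverse-S weighting; gamma = 0.61 is their estimate).

def m(x, gamma=0.61):
    """Inverse-S probability weighting: overweights small probabilities."""
    return x**gamma / (x**gamma + (1 - x)**gamma) ** (1 / gamma)

for p in (0.01, 0.1, 0.5, 0.9, 0.99):
    print(f"P = {p:4.2f}  ->  m(P) = {m(p):.3f}")
# small probabilities come out larger, large ones smaller
```

Composing this m with a probability P gives a non-additive measure µ(A) = m(P(A)) of the kind defined in (1).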

SLIDE 16

Definitions: measures

  • Non-additive measures. Distorted Lebesgue measures
  • Let m : R⁺ → R⁺ be a continuous and increasing function such that m(0) = 0, and let λ be the Lebesgue measure. Then
    µm(A) = m(λ(A))    (2)
  • If m(x) = x², then µm(A) = (λ(A))²
  • If m(x) = x^p, then µm(A) = (λ(A))^p

SLIDE 17

Definitions: measures

  • Non-additive measures. A large number of families
  • Sugeno λ-measures: µ(A ∪ B) = µ(A) + µ(B) + λµ(A)µ(B) (λ > −1)
  • For P a non-empty set of probability measures, the upper and lower probabilities:
    ⊲ P̄(A) = sup_{P∈P} P(A)
    ⊲ P̲(A) = inf_{P∈P} P(A)
    (dual in the sense that P̄(A) = 1 − P̲(A^c))
  • m-dimensional distorted probabilities (NT/NT, 2005, 2011, 2012, 2018)

(figure: distorted probabilities (DP) as a subfamily of the unconstrained fuzzy measures)

SLIDE 18

Now we need an integral

SLIDE 19

Definitions: integrals

  • Additive measure: the way you add up areas does not change¹ the result

(figure: (a) Riemann integral vs. (c) Lebesgue integral, with the level sets {x | f(x) ≥ ai} and {x | f(x) = bi})

  • Riemann integral (a) vs. Lebesgue integral (c)
  • Riemann sum:
    Σ_{I∈C} f(x(I)) · µ(I)
    (C a non-overlapping collection of intervals, x(I) an element of I)
  • Lebesgue sum:
    Σ_{ai∈Range(f)} (ai − ai−1) µ(Γ(ai))
    where Γ(a) := {x | f(x) ≥ a}

¹ Well, if it is calculable.

SLIDE 20

Definitions: integrals

  • Lebesgue integral
  • fdµ :=

∞ µf(r)dr where µf(r) = µ({x|f(x) ≥ r})

SLIDE 22

Definitions: integrals

  • Choquet integral (Choquet, 1954):
  • Let µ be a non-additive measure and f a measurable function. The Choquet integral of f w.r.t. µ is
    (C)∫ f dµ := ∫₀^∞ µf(r) dr,
    where µf(r) := µ({x | f(x) > r}).
  • Properties.
  • When the measure is additive, this is the Lebesgue integral (the standard integral)

SLIDE 23

Definitions: integrals

Choquet integral. Discrete version

  • Let µ be a non-additive measure and f a measurable function. The Choquet integral of f w.r.t. µ is
    (C)∫ f dµ = Σ_{i=1}^N [f(x_{s(i)}) − f(x_{s(i−1)})] µ(A_{s(i)}),
    where s indicates that the indices have been permuted so that 0 ≤ f(x_{s(1)}) ≤ · · · ≤ f(x_{s(N)}) ≤ 1, with f(x_{s(0)}) = 0 and A_{s(i)} = {x_{s(i)}, . . . , x_{s(N)}}.
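The discrete version above translates directly into code. The following is a minimal sketch (function and variable names are ours, not from the slides):

```python
# Discrete Choquet integral, implemented exactly as in the definition:
# sort the elements by f, then accumulate increments times the measure
# of the remaining "upper" set A_{s(i)}.

def choquet(f, mu, X):
    """Discrete Choquet integral of f : X -> [0, 1] w.r.t. the measure mu."""
    xs = sorted(X, key=lambda x: f[x])        # the permutation s (ascending f)
    total, prev = 0.0, 0.0
    for i, x in enumerate(xs):
        A = frozenset(xs[i:])                 # A_{s(i)} = {x_{s(i)}, ..., x_{s(N)}}
        total += (f[x] - prev) * mu[A]
        prev = f[x]
    return total

# A non-additive measure on X = {a, b}: mu({a}) + mu({b}) < mu({a, b})
mu = {frozenset('a'): 0.3, frozenset('b'): 0.4, frozenset('ab'): 1.0}

# Sanity check of a property stated on a later slide: (C) ∫ χ_A dµ = µ(A)
chi_a = {'a': 1.0, 'b': 0.0}
print(choquet(chi_a, mu, 'ab'))               # prints 0.3 = mu({a})
```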

SLIDE 25

Definitions: integrals

  • Choquet integral. Example:
  • Distorted probability µm(A) = m(P(A)) (with m(0) = 0, m(1) = 1). Depending on m, CI_{µm}(f) yields: (a) max, (b) median, (c) min, (d) mean (expectation)
  • Upper and lower probabilities: bounds for expectations
    CI_{P̲}(f) ≤ inf_{P∈P} E_P(f) ≤ sup_{P∈P} E_P(f) ≤ CI_{P̄}(f)
  • (C)∫ χ_A dµ = µ(A)

SLIDE 26

Application I: Aggregation operators & the Choquet integral in decision making

SLIDE 27

MCDM: Aggregation for (numerical) utility functions

SLIDE 32

Aggregation and Choquet integral in MCDM

  • Decision, utility functions

Alternatives = { Ford T, Seat 600, Simca 1000, VW, Citr. Acadiane }
Criteria = { Seats, Security, Price, Comfort, Trunk }
Decision making process: modelling = criteria + utilities, aggregation, selection

                    Seats  Security  Price  Comfort  Trunk
Ford T                 20        20      –        –      –
Seat 600               60       100     50        –      –
Simca 1000            100        30    100       50     70
VW Beetle              80        50     30       70    100
Citroën Acadiane       20        40     60       40      –

SLIDE 34

Aggregation and Choquet integral in MCDM

  • Decision, utility functions

Alternatives = { Ford T, Seat 600, Simca 1000, VW, Citr. Acadiane }
Criteria = { Seats, Security, Price, Comfort, Trunk }
Decision making process: modelling, aggregation = C, selection

                    Seats  Security  Price  Comfort  Trunk  C = AM
Ford T                 20        20      –        –      –       8
Seat 600               60       100     50        –      –      42
Simca 1000            100        30    100       50     70      70
VW                     80        50     30       70    100      66
Citr. Acadiane         20        40     60       40      –      32

SLIDE 37

Aggregation and Choquet integral in MCDM

  • MCDM: Aggregation to deal with contradictory criteria
  • But there are occasions in which ordering is clear

when ai ≤ bi it is clear that a ≤ b E.g., Seats Security Price Comfort trunk C = AM Seat 600 60 100 50 42 Simca 1000 100 30 100 50 70 70 Aggregation operators are appropriate because they satisfy monotonicity

SLIDE 41

Aggregation and Choquet integral in MCDM

  • Decision making process: modelling, aggregation, selection (= order the alternatives, pick the first)
  • The function of aggregation functions:
  • Different aggregation functions lead to different orders (on the Pareto frontier)
  • Aggregation establishes which points are equivalent
  • Different aggregations lead to different curves of equivalent points (level curves)

(figure: criteria satisfaction scores are aggregated into a consensus value per alternative, which induces a ranking; level curves of f₁ and f₂ through points x₁ and x₂)

SLIDE 43

Aggregation and Choquet integral in MCDM

  • Aggregation functions and different level curves
  • Arithmetic mean
  • Geometric mean, Harmonic mean, ...
  • Weighted mean
  • OWA, ...
  • Choquet integral (generalization of the AM, WM, OWA)
    ⊲ can represent interactions between criteria
    ⊲ non-independent criteria allowed
SLIDE 44

Aggregation and Choquet integral in MCDM

  • Aggregation functions and their parameters
  • Arithmetic mean: no parameters
  • Geometric mean, harmonic mean, ...: no parameters
  • Weighted mean: a weighting vector
  • OWA, ...: a weighting vector
  • Choquet integral (generalization of the AM, WM, OWA): a measure
    ⊲ to represent interactions between criteria:
      w(security, price, comfort) > (or <) w(security) + w(price) + w(comfort)
    ⊲ non-independent criteria allowed:
      µ({c1, c2}) ≠ µ({c1}) + µ({c2})
    ⊲ (C)∫ χ_A dµ = µ(A)
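That the Choquet integral generalizes these operators can be checked numerically. A small sketch (our own data and names): an additive measure recovers the weighted mean, and the symmetric measure µ(A) = |A|/n recovers the arithmetic mean (a particular OWA).

```python
# Choquet integral with two special measures: additive -> weighted mean,
# symmetric (cardinality-based) -> arithmetic mean.

def choquet(f, mu, X):
    xs = sorted(X, key=lambda x: f[x])
    total, prev = 0.0, 0.0
    for i, x in enumerate(xs):
        total += (f[x] - prev) * mu(frozenset(xs[i:]))
        prev = f[x]
    return total

f = {'c1': 0.2, 'c2': 0.9, 'c3': 0.5}          # utilities of one alternative
w = {'c1': 0.5, 'c2': 0.3, 'c3': 0.2}          # criterion weights

mu_additive = lambda A: sum(w[x] for x in A)    # additive measure
mu_symmetric = lambda A: len(A) / 3             # mu(A) = |A| / n

print(choquet(f, mu_additive, f))               # 0.47 = 0.5*0.2 + 0.3*0.9 + 0.2*0.5
print(choquet(f, mu_symmetric, f))              # (0.2 + 0.9 + 0.5) / 3
```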

SLIDE 45

MCDM: what can fuzzy measures (and the CI) represent?

SLIDE 47

Aggregation and Choquet integral in MCDM

  • What the Choquet integral can represent, and the WM/probability model cannot:
  • an element/criterion is added to the set, and the preference is reversed
  • Example. Buying a house. When public transport is available, the preference changes²:
    ⊲ If there is no bus, I prefer a public library nearby to a restaurant; but if there is a bus, I instead prefer the restaurant nearby.
    ⊲ Mathematically, with B = Bus, R = Restaurant, L = Library, we have µ({R}) ≤ µ({L}) but µ({R, B}) ≥ µ({L, B})

² Ellsberg's paradox.

SLIDE 48

MCDM: Learn/identify the parameters (e.g. the measures)

SLIDE 49

Aggregation and Choquet integral in MCDM

  • Available information?
  • Find measures from an outcome: a column vector with the observed outputs

                    Seats  Security  Price  Comfort  Trunk  C = CIµ
Seat 600               60       100     50        –      –       42
Simca 1000            100        30    100       50     70       70
. . .

  • Find measures from preferences, a (partial) order <: S = {(ri, ti)}i

                    Seats  Security  Price  Comfort  Trunk  C = CIµ
Seat 600               60       100     50        –      –      4th
Simca 1000            100        30    100       50     70      1st
. . .

SLIDE 50

Aggregation and Choquet integral in MCDM

  • Available information?
  • Measures from an outcome, a column vector ⇒ min Σr (CP(ar) − or)²
  • Measures from preferences, a (partial) order <: S = {(ri, ti)}i
    ⊲ Formulation: find µ such that, for all (r, t) ∈ S, CP(evaluation of car r) > CP(evaluation of car t)
    ⊲ i.e., with rows ar and at for records r and t:
      CP(ar1, . . . , arn) > CP(at1, . . . , atn)
    Unfortunately, often there is no solution, so we minimize the failures y(r,t) ≥ 0:
      CP(ar1, . . . , arn) − CP(at1, . . . , atn) + y(r,t) > 0.

SLIDE 51

Aggregation and Choquet integral in MCDM

  • Available information?
  • Measures from an outcome, a column vector ⇒ min Σr (CP(ar) − or)²
  • Measures from preferences, a (partial) order <: S = {(ri, ti)}i
    ⊲ Formulation: find µ such that, for all (r, t) ∈ S:

    Minimize    Σ_{(r,t)∈S} y(r,t)
    Subject to  CP(ar1, . . . , arn) − CP(at1, . . . , atn) + y(r,t) > 0
                y(r,t) ≥ 0
                logical constraints on P

SLIDE 52

Aggregation and Choquet integral in MCDM

  • Aggregation and selection
  • Selection of the alternative with maximum value of C = CI with µ (maximum distance to the nadir, i.e., the worst combination):
    d((a1, . . . , an), (0, . . . , 0))
  • Selection of the alternative with minimum distance to the ideal:
    d((a1, . . . , an), (100, . . . , 100))
    where d is computed as an aggregation

(figure: level curves of f₁ and f₂ through points x₁ and x₂)

SLIDE 53

Application II: The Choquet integral in metric learning (re-identification)

SLIDE 54

Aggregation operators and CI in reidentification

  • Re-identification. Record linkage for databases, supervised approach
  • ML/optimization for distance-based record linkage (files A and B aligned).
    ⊲ Goal: as many correct re-identifications as possible: for each record i, we need d(ai, bj) ≥ d(ai, bi) for all j, where ai = (ai1, . . . , ain) and bi = (bi1, . . . , bin)

SLIDE 55

Aggregation operators and CI in reidentification

  • Re-identification. Record linkage for databases. Supervised approach
  • ML/optimization for the distance-based approach (A and B aligned)
    ⊲ Goal: as many correct re-identifications as possible. But if there is an error for ai: Ki = 1 and d(ai, bj) + C·Ki ≥ d(ai, bi) for all j
    ⊲ or, expanding d:
      Cp(diff1(ai1, bj1), . . . , diffn(ain, bjn)) + C·Ki ≥ Cp(diff1(ai1, bi1), . . . , diffn(ain, bin))
  • Formalization:

    Minimize    Σ_{i=1}^N Ki
    Subject to: Cp(diff1(ai1, bj1), . . . , diffn(ain, bjn)) −
                − Cp(diff1(ai1, bi1), . . . , diffn(ain, bin)) + C·Ki > 0
                Ki ∈ {0, 1}
                additional constraints according to C
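The objective being minimized is the number of linkage errors. A toy sketch (synthetic data, illustrative names) of how a learned distance is evaluated: record ai is correctly re-identified when its true counterpart bi is its nearest record in B.

```python
# Toy evaluation of a weighted distance for re-identification.
A = [(1.0, 0.2), (0.4, 0.9), (0.7, 0.7)]
B = [(1.1, 0.1), (0.5, 0.8), (0.6, 0.6)]   # b_i is the masked version of a_i

def d(a, b, w=(0.5, 0.5)):                  # a simple weighted squared distance
    return sum(wk * (ak - bk) ** 2 for wk, ak, bk in zip(w, a, b))

correct = sum(
    1 for i, a in enumerate(A)
    if min(range(len(B)), key=lambda j: d(a, B[j])) == i
)
print(f"{correct}/{len(A)} records re-identified")   # prints "3/3 records re-identified"
```

Replacing `d` by a Choquet-integral-based distance (and optimizing its measure as above) is the approach of the following slides.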

SLIDE 57

Aggregation operators and CI in reidentification

  • Re-identification. Record linkage for databases. Supervised approach
  • ML/optimization for the distance-based approach (A and B aligned)
  • Formalization for the CI:

    Minimize    Σ_{i=1}^N Ki
    Subject to: CIµ(diff1(ai1, bj1), . . . , diffn(ain, bjn)) −
                − CIµ(diff1(ai1, bi1), . . . , diffn(ain, bin)) + C·Ki > 0
                Ki ∈ {0, 1}
                additional constraints for µ

(but also WM, OWA, and the bilinear distance)

SLIDE 58

Zooming out: trying to understand

Aggregation, distances, and independence

SLIDE 59

Aggregation, distance and independence

  • Aggregation and distance.
  • Arithmetic mean (AM): Euclidean distance
  • Weighted mean (WM): weighted Euclidean distance
  • Choquet integral (CI): Choquet-integral-based distance
  • ——: bilinear/Mahalanobis distance
  • In a single picture: Mahalanobis and Choquet distances

(figure: a hierarchy in which the Mahalanobis distance / covariance matrix and the Choquet integral / fuzzy measure both generalize the weighted mean / additive measure / diagonal matrix, which in turn generalizes the Euclidean distance / arithmetic mean / uniform 1/n diagonal)

SLIDE 60

Aggregation, distance and independence

  • Aggregation, distance and independence.
  • Dependence can be represented only with the Choquet integral and Mahalanobis distances:
    ⊲ Mahalanobis: covariance matrix
    ⊲ Choquet integral: fuzzy measure
  • In a single framework: Mahalanobis and Choquet distances

(figure: the same hierarchy of distances and measures as above)

SLIDE 61

Filling gaps:

Aggregation, distances, and independence

SLIDE 62

Aggregation, distance and independence

  • Mahalanobis distance
  • between x ∈ R^d and a vector m ∈ R^d, with respect to the covariance matrix Σ:
    (x − m)ᵀ Σ⁻¹ (x − m)

SLIDE 63

Aggregation, distance and independence

  • Choquet integral distance
  • between x ∈ R^d and a vector m ∈ R^d, with respect to a non-additive measure µ:
    CIµ((x − m) ◦ (x − m))
    where v ◦ w is the Hadamard or Schur (elementwise) product of v and w (i.e., v ◦ w = (v1w1, . . . , vnwn)).

SLIDE 64

Aggregation, distance and independence

  • Choquet-Mahalanobis integral distance
  • between x ∈ R^d and a vector m ∈ R^d, with respect to µ and a positive-definite matrix Q:
    CMI(m, µ, Q) = CIµ(v ◦ w), where
    ⊲ L Lᵀ = Q is the Cholesky decomposition of the matrix Q,
    ⊲ v = (x − m)ᵀ L,
    ⊲ w = Lᵀ (x − m), and
    ⊲ v ◦ w is the Hadamard (elementwise) product of v and w.
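A short numeric sketch of this definition (our own names and data). Note that with v = (x − m)ᵀL and w = Lᵀ(x − m) the two vectors coincide, so v ◦ w = w², and with the uniform additive measure µ(A) = |A|/n the Choquet integral is the arithmetic mean, so CMI reduces to (1/n)(x − m)ᵀQ(x − m):

```python
# Choquet-Mahalanobis distance via the Cholesky factor of Q.
import numpy as np

def choquet(z, mu, n):
    order = np.argsort(z)                  # ascending permutation s
    total, prev = 0.0, 0.0
    for i in range(n):
        total += (z[order[i]] - prev) * mu(frozenset(order[i:].tolist()))
        prev = z[order[i]]
    return total

def cmi(x, m, mu, Q):
    L = np.linalg.cholesky(Q)              # Q = L L^T
    w = L.T @ (x - m)                      # here v = w, so v ◦ w = w ** 2
    return choquet(w ** 2, mu, len(x))

x = np.array([1.0, 2.0]); m = np.array([0.0, 0.5])
Q = np.array([[2.0, 0.3], [0.3, 1.0]])     # positive definite
mu_uniform = lambda A: len(A) / 2          # uniform additive measure |A| / n
print(cmi(x, m, mu_uniform, Q))            # equals (1/2) (x - m)^T Q (x - m)
print(0.5 * (x - m) @ Q @ (x - m))
```

A non-additive µ here gives a distance that is neither Mahalanobis nor the plain Choquet distance, which is the point of the construction.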

SLIDE 65

Choquet integral based distribution: generalized distance

Well defined when Σ is a covariance matrix:

  • When Σ⁻¹ is a positive-definite matrix, the Cholesky decomposition is unique. This is the case when Σ is a covariance matrix valid for generating a probability density function.

Proper generalization:

  • It generalizes both the Mahalanobis and the Choquet-integral-based distance:
  • with Σ equal to the identity, the definition results in the Choquet integral of (x − x̄) ◦ (x − x̄) with respect to µ;
  • with µ corresponding to the additive measure µ(A) = |A|/n, the definition results in 1/n of the Mahalanobis distance with respect to Σ.

SLIDE 66

Aggregation, distance and independence

  • Aggregation and distance.
  • Arithmetic mean (AM): Euclidean distance
  • Weighted mean (WM): weighted Euclidean distance
  • Choquet integral (CI): Choquet-integral-based distance
  • ——: bilinear/Mahalanobis distance
  • Choquet-Mahalanobis integral: CMI distance

(figure: the hierarchy of distances and measures as before, now topped by the Choquet-Mahalanobis distance with a positive semi-definite matrix and a fuzzy measure)

SLIDE 67

A natural construction:

Distributions

SLIDE 68

Distributions

  • E.g., in classification: data drawn from normal (Gaussian) distributions.
  • Parameters N(µ, Σ) determined from real data, or known
  • Set of k classes Ω = {ω1, . . . , ωk}
  • covariance matrices Σi, means x̄i
  • class-conditional probability density function (Gaussian distribution):
    P(x|ωi) = 1 / ((2π)^{m/2} |Σi|^{1/2}) · exp(−(1/2) (x − x̄i)ᵀ Σi⁻¹ (x − x̄i))

(figures: two classes; two classes with different correlations)

SLIDE 69

Distributions

  • Define distributions based on the Choquet integral. Why?
  • Non-additive measures on a set X permit us to represent interactions between the objects in X: similar to covariances, but capturing different types of interactions!

SLIDE 70

Distributions

Definition:

  • Let Y = {Y1, . . . , Yn} be random variables, µ : 2^Y → [0, 1] a non-additive measure, and m a vector in R^n.
  • The exponential family of Choquet-integral-based class-conditional probability density functions is defined by:
    PC_{m,µ}(x) = (1/K) · exp(−(1/2) CIµ((x − m) ◦ (x − m)))
    where K is a constant defined so that the function is a probability density, and where v ◦ w denotes the Hadamard or Schur (elementwise) product of vectors v and w (i.e., v ◦ w = (v1w1, . . . , vnwn)).

Notation:

  • We denote it by C(m, µ).
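The constant K has no closed form for a general µ, but it can be approximated numerically. A sketch (our own toy measure and names; n = 2, m = 0, quadrature on a grid):

```python
# Numeric normalization constant K for the Choquet-integral-based density,
# with a non-additive measure: mu({x1}) + mu({x2}) = 0.7 < 1 = mu(X).
import numpy as np

mu = {frozenset([0]): 0.3, frozenset([1]): 0.4, frozenset([0, 1]): 1.0}

def ci(z):                                     # discrete Choquet integral, n = 2
    lo, hi = sorted(range(2), key=lambda i: z[i])
    return z[lo] * mu[frozenset([0, 1])] + (z[hi] - z[lo]) * mu[frozenset([hi])]

grid = np.linspace(-12.0, 12.0, 401)
h = grid[1] - grid[0]
K = sum(np.exp(-0.5 * ci((gx * gx, gy * gy)))  # unnormalized density at (gx, gy)
        for gx in grid for gy in grid) * h * h
print("K ≈", K)                                # the density is then exp(-CI/2) / K
```

The tails are negligible well inside the grid here, so simple rectangle quadrature suffices for a sketch.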

SLIDE 71

Distributions

  • Shapes (level curves)

(figure: sampled level curves on [−15, 15]², four panels)
(a) µA({x}) = 0.1 and µA({y}) = 0.1, (b) µB({x}) = 0.9 and µB({y}) = 0.9, (c) µC({x}) = 0.2 and µC({y}) = 0.8, and (d) µD({x}) = 0.4 and µD({y}) = 0.9.

SLIDE 73

Distributions

Property:

  • The family of distributions N(m, Σ) in R^n with a diagonal matrix Σ of rank n, and the family of distributions C(m, µ) with an additive measure µ with all µ({xi}) ≠ 0, are equivalent.

(µ(X) is not necessarily 1 here)

Follows from additivity: an additive µ is (up to rescaling) a probability, which corresponds to a diagonal Σ.

Corollary:

  • The distribution N(0, I) corresponds to C(0, µ1), where µ1 is the additive measure defined as µ1(A) = |A| for all A ⊆ X.
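The corollary can be checked numerically. A minimal sketch, assuming the standard discrete Choquet integral and encoding µ1(A) = |A| as a dictionary over index subsets (the test point x is illustrative):

```python
import numpy as np

def choquet(values, mu):
    """Discrete Choquet integral of a non-negative vector w.r.t. a set
    function mu given as {frozenset of indices: measure value}."""
    idx = list(np.argsort(values))            # ascending order
    total, prev = 0.0, 0.0
    for k, i in enumerate(idx):
        # idx[k:] are the indices whose value is >= values[i]
        total += (values[i] - prev) * mu[frozenset(idx[k:])]
        prev = values[i]
    return total

# mu1(A) = |A|: the counting measure from the corollary (n = 2 here).
mu1 = {frozenset(s): len(s) for s in [(), (0,), (1,), (0, 1)]}

x = np.array([0.7, -1.2])
# mu1 is additive, so the Choquet integral of the squared coordinates
# is just their sum, and exp(-CI/2) is the unnormalized N(0, I) density.
ci = choquet(x * x, mu1)
assert abs(ci - np.sum(x * x)) < 1e-12
```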


SLIDE 74

Distributions

Properties:

  • In general, the two families of distributions N(m, Σ) and C(m, µ) are different.
  • C(m, µ) is always symmetric w.r.t. the Y1 and Y2 axes.

(Figure: sample clouds from C(m, µ) distributions on [-15.0, 15.0] x [-15.0, 15.0], symmetric w.r.t. the axes.)


SLIDE 75

Distributions

Properties:

  • In general, the two families of distributions N(m, Σ) and C(m, µ) are different.
  • C(m, µ) is always symmetric w.r.t. the Y1 and Y2 axes.

(Figure: sample clouds from C(m, µ) distributions on [-15.0, 15.0] x [-15.0, 15.0], symmetric w.r.t. the axes.)

  • Using the CMI distance, we consider both types of interactions:
  • Mahalanobis: Σ
  • Choquet (measure): µ


SLIDE 76

Distributions

Definition:

  • Y = {Y1, . . . , Yn} random variables, µ : 2^Y → [0, 1] a measure, m a vector in R^n, and Q a positive-definite matrix.
  • The exponential family of Choquet-Mahalanobis integral based class-conditional probability-density functions is defined by

P^CM_{m,µ,Q}(x) = (1/K) e^(-(1/2) CI_µ(v ◦ w)),

where K is a constant chosen so that the function is a probability density, LL^T = Q is the Cholesky decomposition of the matrix Q, v = (x − m)^T L, w = L^T(x − m), and v ◦ w denotes the elementwise product of the vectors v and w.

Notation:

  • We denote it by CMI(m, µ, Q).
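The density above can be sketched in code. A minimal sketch, assuming the measure is encoded as a dictionary from frozensets of variable indices to numbers (the 2-D non-additive measure used for illustration is hypothetical, not from the slides). Note that v ◦ w = w ◦ w ≥ 0, so the Choquet integral of it is well defined:

```python
import numpy as np

def choquet(values, mu):
    """Discrete Choquet integral of a non-negative vector w.r.t. the
    set function mu, given as {frozenset of indices: measure value}."""
    idx = list(np.argsort(values))            # ascending order
    total, prev = 0.0, 0.0
    for k, i in enumerate(idx):
        # idx[k:] are the indices whose value is >= values[i]
        total += (values[i] - prev) * mu[frozenset(idx[k:])]
        prev = values[i]
    return total

def cmi_kernel(x, m, mu, Q):
    """Unnormalized CMI(m, mu, Q) density exp(-CI_mu(v o w) / 2),
    with Q = L L^T, v = (x - m)^T L and w = L^T (x - m)."""
    L = np.linalg.cholesky(Q)                 # Q = L @ L.T
    w = L.T @ (x - m)                         # here v o w equals w * w
    return np.exp(-0.5 * choquet(w * w, mu))

# Hypothetical symmetric, non-additive measure on a 2-D space.
mu = {frozenset(): 0.0, frozenset({0}): 0.1,
      frozenset({1}): 0.1, frozenset({0, 1}): 1.0}
val = cmi_kernel(np.array([1.0, 0.5]), np.zeros(2), mu, np.eye(2))
```

With the additive counting measure µ1(A) = |A|, this kernel reduces to the usual multivariate normal kernel, matching the property on the next slide.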


SLIDE 77

Distributions

Property:

  • The distribution CMI(m, µ, Q) generalizes both the multivariate normal distributions and the Choquet integral based distributions. In addition:
  • A CMI(m, µ, Q) with µ = µ1 corresponds to a multivariate normal distribution,
  • A CMI(m, µ, Q) with Q = I corresponds to a C(m, µ).
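The first special case can be verified numerically. A sketch, assuming µ1 is the counting measure µ1(A) = |A| of the earlier corollary (the concrete Q and x are illustrative choices):

```python
import numpy as np

def choquet(values, mu):
    """Discrete Choquet integral of a non-negative vector w.r.t. a set
    function mu given as {frozenset of indices: measure value}."""
    idx = list(np.argsort(values))
    total, prev = 0.0, 0.0
    for k, i in enumerate(idx):
        total += (values[i] - prev) * mu[frozenset(idx[k:])]
        prev = values[i]
    return total

# With the additive mu1, CI_mu1 sums its argument, so the CMI kernel
# collapses to exp(-(x-m)^T Q (x-m) / 2): a multivariate normal kernel
# with Q playing the role of the inverse covariance.
mu1 = {frozenset(s): len(s) for s in [(), (0,), (1,), (0, 1)]}
Q = np.array([[2.0, 0.3], [0.3, 1.0]])       # positive definite
x, m = np.array([0.4, -0.8]), np.zeros(2)

L = np.linalg.cholesky(Q)                    # Q = L @ L.T
w = L.T @ (x - m)
cmi = np.exp(-0.5 * choquet(w * w, mu1))
normal = np.exp(-0.5 * (x - m) @ Q @ (x - m))
assert abs(cmi - normal) < 1e-12
```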


SLIDE 78

Distributions

Graphically:

  • Choquet integral (CI distribution), Mahalanobis distance (multivariate normal distribution), and their generalization (CMI distribution):

(Diagram, reconstructed as a hierarchy, most general first:)

  • Choquet-Mahalanobis distance: fuzzy measure + positive semi-definite matrix
  • Choquet integral: fuzzy measure; Mahalanobis distance: covariance matrix
  • Weighted Choquet integral (weighted mean): additive measure; diagonal matrix
  • Euclidean distance (arithmetic mean): uniform measure 1/n; (1/n) diagonal matrix


SLIDE 79

Distributions

1st Example: Interactions expressed only in terms of a measure.

  • No correlation exists between the variables.
  • CMI with σ1 = 1, σ2 = 1, ρ12 = 0.0, µx = 0.01, µy = 0.01.


SLIDE 80

Distributions

2nd Example: Interactions expressed only in terms of a covariance matrix.

  • CMI with σ1 = 1, σ2 = 1, ρ12 = 0.9, µx = 0.10, µy = 0.90.


SLIDE 81

Distributions

3rd Example: Interactions expressed in terms of both the covariance matrix and the measure.

  • CMI with σ1 = 1, σ2 = 1, ρ12 = 0.9, µx = 0.01, µy = 0.01.


SLIDE 82

Distributions

More properties: data do not always follow the normality assumption.

  • Spherical and elliptical distributions
  • They generalize, respectively, N(0, I) and N(m, Σ)


SLIDE 83

Distributions

More properties: data do not always follow the normality assumption.

  • Spherical and elliptical distributions
  • They generalize, respectively, N(0, I) and N(m, Σ)
  • CMI(m, µ, Q) neither contains nor is contained in the spherical / elliptical distributions.


SLIDE 84

Distributions

More properties: data do not always follow the normality assumption.

  • Spherical and elliptical distributions
  • They generalize, respectively, N(0, I) and N(m, Σ)
  • CMI(m, µ, Q) neither contains nor is contained in the spherical / elliptical distributions.

Example:

  • With a non-additive µ, CMI(m, µ, Q) is not representable as a spherical/elliptical distribution


SLIDE 85

Distributions

More properties: data do not always follow the normality assumption.

  • Spherical and elliptical distributions
  • They generalize, respectively, N(0, I) and N(m, Σ)
  • CMI(m, µ, Q) neither contains nor is contained in the spherical / elliptical distributions.

Example:

  • With a non-additive µ, CMI(m, µ, Q) is not representable as a spherical/elliptical distribution
  • No CMI for the following spherical distribution: the spherical distribution with density

f(r) = (1/K) e^(-((r - r0)/σ)^2),

where r0 is the radius at which the density is maximal, σ is a variance parameter, and K is the normalization constant.
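The incompatibility can be illustrated directly. A small sketch of the radial density above (the values r0 = 3 and σ = 1 are illustrative assumptions): its mode lies on the sphere r = r0, whereas any CMI(m, µ, Q) density is maximal at the single point x = m, since the Choquet integral of the non-negative vector v ◦ w vanishes only there.

```python
import numpy as np

# Spherical density from the slide, up to the constant K:
# f(r) = (1/K) * exp(-((r - r0) / sigma)^2), maximal on r = r0.
def f(r, r0=3.0, sigma=1.0):
    return np.exp(-((r - r0) / sigma) ** 2)

r = np.linspace(0.0, 6.0, 601)
dens = f(r)
# The density peaks at r = r0 = 3, not at the origin, so along any ray
# it is not monotonically decreasing -- unlike every CMI density.
assert abs(r[np.argmax(dens)] - 3.0) < 1e-9
```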


SLIDE 86

Summary


SLIDE 87

Summary

Summary:

  • Choquet integral and non-additive measures for decision and reidentification
  • Definition of distances based on the Choquet integral
  • Comparison with the Mahalanobis distance
  • Construction of distributions
  • Relationship with multivariate normal and spherical distributions


SLIDE 88

Thank you
