slide-1
SLIDE 1

Preferences, Invariances, Optimization

Ilya Loshchilov, Marc Schoenauer, Michèle Sebag. TAO, CNRS − INRIA − Université Paris-Sud. Dagstuhl, March 2014

slide-2
SLIDE 2

Position

One goal of
◮ Machine learning: optimal decision making
◮ Preference learning: optimization

This talk: black-box optimization. When to use preference learning?
◮ when dealing with the user in the loop (Herdy et al., 96)
◮ when dealing with computationally expensive criteria (Herdy et al., 96; surrogate models)

slide-3
SLIDE 3

Position

One goal of
◮ Machine learning: optimal decision making
◮ Preference learning: multi-objective optimization

This talk: black-box multi-objective optimization. When to use preference learning?
◮ when dealing with the user in the loop (Herdy et al., 96)
◮ when dealing with computationally expensive criteria (Herdy et al., 96; surrogate models)

slide-4
SLIDE 4

Optimizing coffee taste

Features
◮ Search space X ⊂ R_+^d (recipe x: 33% arabica, 25% robusta, etc.)
◮ A non-computable objective
◮ The expert can (by tasting) emit preferences x ≺ x′.

Interactive optimization (see also Viappiani et al. 11)
1. Alg. generates two or more candidates x, x′, x″, ...
2. Expert emits preferences
3. goto 1.

Issues
◮ Asking as few questions as possible (= active ranking)
◮ Modelling the expert's taste (surrogate model)
◮ Enforcing the exploration vs exploitation trade-off
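The interactive loop above can be sketched in a few lines; `propose` and `ask_expert` are hypothetical callables standing in for the candidate generator and the tasting expert, and the query budget replaces the open-ended "goto 1".

```python
def interactive_optimize(propose, ask_expert, n_rounds=10):
    """Interactive preference-based optimization loop (sketch).

    1. the algorithm proposes candidates, 2. the expert emits a
    preference, 3. goto 1 -- here, until the query budget is spent.
    """
    prefs = []  # accumulated preference pairs (winner, loser)
    for _ in range(n_rounds):
        x, x2 = propose(prefs)            # generate two candidates
        prefs.append(ask_expert(x, x2))   # expert states which is better
    return prefs

# Toy run: the "expert" simply prefers the larger number.
propose = lambda prefs: (len(prefs), len(prefs) + 1)
ask_expert = lambda a, b: (max(a, b), min(a, b))
print(interactive_optimize(propose, ask_expert, n_rounds=3))
# → [(1, 0), (2, 1), (3, 2)]
```

The accumulated pairs are exactly the training data an active-ranking surrogate would consume.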

slide-5
SLIDE 5

Expensive black-box optimization

Notations
◮ Search space: X ⊂ R^d
◮ Computable objective F: X → R
◮ Not well behaved (non-convex, non-differentiable, etc.)

Evolutionary optimization
1. Alg. generates candidate solutions (population) x_1, ..., x_λ
2. Compute F(x_i) and rank the x_i accordingly
3. goto 1.

Issues
◮ Computational cost: number of F computations
◮ Learn a surrogate model F̂
◮ When to use F and when F̂? When to refresh F̂?

slide-6
SLIDE 6

Overview

Motivations
Black-box optimization...
... with surrogate models
Multi-objective optimization

slide-7
SLIDE 7

Covariance-Matrix Adaptation (CMA-ES): Rank-µ Update

Sampling: x_i = m + σ y_i, with y_i ∼ N(0, C)   (illustration: λ = 150 solutions sampled with C = I and σ = 1)

Mean update: m ← m + σ y_w, with y_w = Σ_{i=1}^{µ} w_i y_{i:λ}

Rank-µ update: C_µ = (1/µ) Σ_{i=1}^{µ} y_{i:λ} y_{i:λ}^T, and C ← (1 − c_µ) C + c_µ C_µ   (illustration: C_µ computed from the µ = 50 best points, equal weights w_1 = · · · = w_µ = 1/µ, learning rate c_µ = 1)

Remark: the old (sample) distribution shape has a great influence on the new distribution → iterations needed.

◮ Source code available: https://www.lri.fr/~hansen/cmaes_inmatlab.html
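The sampling, mean update and rank-µ covariance update above can be sketched numerically. This is an illustrative toy, not the referenced implementation: the function name, the sphere test run, and the choices σ = 0.5 and c_µ = 0.5 are all assumptions, and step-size adaptation is omitted.

```python
import numpy as np

def rank_mu_step(m, sigma, C, f, lam=150, mu=50, c_mu=1.0, rng=None):
    """One illustrative CMA-ES iteration with equal weights w_i = 1/mu."""
    rng = np.random.default_rng() if rng is None else rng
    d = len(m)
    # Sample lambda candidates: x_i = m + sigma * y_i, y_i ~ N(0, C)
    Y = rng.multivariate_normal(np.zeros(d), C, size=lam)
    X = m + sigma * Y
    # Rank by objective value; keep the mu best steps y_{i:lambda}
    Y_best = Y[np.argsort([f(x) for x in X])[:mu]]
    # Mean update: m <- m + sigma * y_w, with y_w the average best step
    m_new = m + sigma * Y_best.mean(axis=0)
    # Rank-mu update: C_mu = (1/mu) sum y y^T; C <- (1-c_mu) C + c_mu C_mu
    C_mu = Y_best.T @ Y_best / mu
    C_new = (1.0 - c_mu) * C + c_mu * C_mu
    return m_new, C_new

rng = np.random.default_rng(0)
sphere = lambda x: float(x @ x)
m, C = np.array([3.0, 3.0]), np.eye(2)
for _ in range(10):
    m, C = rank_mu_step(m, 0.5, C, sphere, c_mu=0.5, rng=rng)
```

Because C_µ is built only from selected (good) steps, the sampling ellipsoid elongates along the directions where progress was made, which is the point of the rank-µ update.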

slide-8
SLIDE 8

Invariance: Guarantees for Generalization

Invariance properties of CMA-ES

◮ Invariance to order-preserving transformations in function space (like all comparison-based algorithms)
◮ Translation and rotation invariance, to affine transformations of the search space

[Figure: illustration of invariance under transformations of the search space.]

CMA-ES is almost parameterless

◮ Tuning on a small set of functions (Hansen & Ostermeier 2001)
◮ Except: population size for multi-modal functions. More: IPOP-CMA-ES (Auger & Hansen, 05) and BIPOP-CMA-ES (Hansen, 09)

Information-Geometric Optimization

Yann Ollivier et al. 2012

slide-9
SLIDE 9

BBOB – Black-Box Optimization Benchmarking

◮ ACM-GECCO workshops: 2009, 2010, 2012
◮ Set of 25 benchmark functions, dimensions 2 to 40
◮ With known difficulties (non-separability, #local optima, condition number...)

◮ Noisy and non-noisy versions

Competitors include
◮ BFGS (Matlab version)
◮ Fletcher-Powell
◮ DFO (Derivative-Free Optimization, Powell 04)
◮ Differential Evolution
◮ Particle Swarm Optimization
◮ and others.

Fraction of runs reaching the specified accuracy vs number of F computations.

slide-10
SLIDE 10

Overview

Motivations
Black-box optimization...
... with surrogate models
Multi-objective optimization

slide-11
SLIDE 11

Surrogate Models for CMA-ES

Exploiting the first evaluated solutions as training set E = {(x_i, F(x_i))}

Using Ranking-SVM
◮ Builds F̂ using Ranking-SVM: x_i ≻ x_j iff F(x_i) < F(x_j)
◮ Kernel and parameters are problem-dependent

  • T. Runarsson (2006). "Ordinal Regression in Evolutionary Computation"

◮ ACM: use C from CMA-ES as Gaussian kernel

  • I. Loshchilov et al. (2010). "Comparison-based optimizers need comparison-based surrogates"
  • I. Loshchilov et al. (2012). "Self-Adaptive Surrogate-Assisted CMA-ES"
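The ordering that Ranking-SVM trains on can be made concrete: from the training set E, one ordered pair is emitted per couple with x_i ≻ x_j iff F(x_i) < F(x_j) (minimization). A hypothetical helper, not the authors' code:

```python
def ranking_pairs(archive):
    """Ordered pairs for a ranking surrogate: i preferred to j iff
    F(x_i) < F(x_j). `archive` is a list of (x, F(x)) tuples, as in
    the training set E; returns index pairs (i, j)."""
    pairs = []
    for i, (_, fi) in enumerate(archive):
        for j, (_, fj) in enumerate(archive):
            if fi < fj:
                pairs.append((i, j))
    return pairs

E = [([0.0], 1.0), ([1.0], 0.5), ([2.0], 2.0)]
print(ranking_pairs(E))  # → [(0, 2), (1, 0), (1, 2)]
```

Only the ordering of F values matters here, which is why the resulting surrogate inherits invariance to order-preserving transformations of F.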
slide-12
SLIDE 12

About Model Learning

Non-separable Ellipsoid problem

K(x_i, x_j) = exp( −(x_i − x_j)^T (x_i − x_j) / 2σ^2 );   K_C(x_i, x_j) = exp( −(x_i − x_j)^T C_µ^{−1} (x_i − x_j) / 2σ^2 )

[Figure: level sets on (X1, X2) for the isotropic kernel K vs the covariance-adapted kernel K_C.]

Invariance to affine transformations of the search space.
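The two kernels above can be sketched as follows; K_C is just the RBF kernel computed in the Mahalanobis metric of the CMA covariance (here a `C` argument standing in for C_µ), which is what buys the affine invariance. An illustrative sketch, not the paper's implementation:

```python
import numpy as np

def rbf_kernel(xi, xj, sigma=1.0):
    """Isotropic Gaussian kernel K."""
    d = xi - xj
    return np.exp(-float(d @ d) / (2.0 * sigma ** 2))

def cma_kernel(xi, xj, C, sigma=1.0):
    """Covariance-adapted kernel K_C: RBF in the metric defined by C,
    i.e. the plain RBF after the change of variables x -> C^{-1/2} x."""
    d = xi - xj
    return np.exp(-float(d @ np.linalg.solve(C, d)) / (2.0 * sigma ** 2))

xi, xj = np.array([1.0, 0.0]), np.array([0.0, 0.0])
C = np.diag([4.0, 1.0])  # distribution elongated along the first axis
# The adapted kernel treats the stretched direction as "closer":
assert cma_kernel(xi, xj, C) > rbf_kernel(xi, xj)
```

On an ill-conditioned ellipsoid, distances along the elongated axis are discounted, so the surrogate sees the problem roughly as it would see a sphere.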

slide-13
SLIDE 13

The devil is in the hyper-parameters

SVM Learning

◮ Number of training points: N_training = 30√d for all problems, except Rosenbrock and Rastrigin, where N_training = 70√d
◮ Number of iterations: N_iter = 50000√d
◮ Kernel function: RBF with σ equal to the average distance between the training points
◮ The cost of constraint violation: C_i = 10^6 (N_training − i)^2.0

Offspring Selection

◮ Number of test points: N_test = 500
◮ Number of evaluated offspring: λ′ = λ/3
◮ Offspring selection pressure parameters: σ²_sel0 = 2σ²_sel1 = 0.8

slide-14
SLIDE 14

Sensitivity analysis

The speed-up of ACM-ES is very sensitive

◮ w.r.t. number of training points.

[Plot: speed-up vs number of training points (50 to 400), dimension 10, for F1 Sphere, F5 LinearSlope, F6 AttractSector, F8 Rosenbrock, F10 RotEllipsoid, F11 Discus, F12 Cigar, F13 SharpRidge, F14 SumOfPow, and their average.]

◮ w.r.t. lifelength of the surrogate model

slide-15
SLIDE 15

Self-adaptation of the F̂ lifelength

Principle: iterated preference learning
◮ After n generations, gather new examples {(x_i, F(x_i))}
◮ Evaluate the rank loss of the old F̂
◮ Low error: F̂ could have been used for more generations
◮ High error: F̂ should have been relearned earlier

Self-adaptation: n = g(rank loss(F̂))

[Plot: model error vs number of generations.]
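A toy version of the rule n = g(rank loss(F̂)) above; the thresholds and the ±1 schedule are invented for illustration and are not the paper's g:

```python
def adapt_lifelength(n, rank_loss, n_max=20, err_low=0.3, err_high=0.45):
    """Hypothetical g(.) mapping the measured rank loss of the old
    surrogate to the number of generations n it is trusted for.
    Thresholds and the +/-1 schedule are illustrative assumptions."""
    if rank_loss < err_low:
        return min(n + 1, n_max)   # low error: could have kept it longer
    if rank_loss > err_high:
        return max(n - 1, 0)       # high error: should relearn earlier
    return n                       # acceptable error: keep the current n

n = 5
n = adapt_lifelength(n, rank_loss=0.1)   # accurate surrogate -> trust longer
```

The key property is the feedback loop: the rank loss on freshly evaluated points decides how long the next surrogate is trusted before relearning.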

slide-16
SLIDE 16

ACM-ES algorithm

Surrogate-assisted CMA-ES with online adaptation of the model hyper-parameters.

slide-17
SLIDE 17

Online adaptation of model hyper-parameters

[Plot: F8 Rosenbrock 20-D. Fitness and the hyper-parameter values N_training, C_base, C_pow, c_sigma, plotted against the number of function evaluations.]

Online adaptation of the hyper-parameters improves on optimally tuned hyper-parameters.

slide-18
SLIDE 18

Results on black-box optimization competition (BBOB)

BIPOP-s∗aACM and IPOP-s∗aACM (with restarts) on the 24 noiseless 20-dimensional functions

[ECDF plot: proportion of functions vs log10(ERT / dimension), comparing several dozen entries (RANDOMSEARCH, SPSA, DIRECT, GA, PSO and DE variants, NEWUOA, BFGS, G3PCX, AMALGAM, MOS, IPOP-CMA-ES, BIPOP-CMA-ES, IPOP-saACM-ES, BIPOP-saACM-ES, among others) against the "best 2009" reference.]

f1-f24: BIPOP-s∗aACM and IPOP-s∗aACM significantly improve on BIPOP-CMA-ES and IPOP-CMA-ES respectively: progress on top of advanced CMA-ES variants.

slide-19
SLIDE 19

Overview

Motivations
Black-box optimization...
... with surrogate models
Multi-objective optimization

slide-20
SLIDE 20

Multi-objective CMA-ES (MO-CMA-ES)

◮ MO-CMA-ES = µ_mo independent (1+1)-CMA-ES.
◮ Each (1+1)-CMA samples a new offspring; the size of the temporary population is 2µ_mo.
◮ Only the µ_mo best solutions are kept for the new population, after hypervolume-based non-dominated sorting.
◮ The update of the CMA individuals then takes place.

[Figure: objective space (Objective 1, Objective 2) with dominated points and the Pareto front.]
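The selection step above can be illustrated with plain non-dominated sorting (minimization); the hypervolume-based ranking inside each front is omitted from this sketch:

```python
def dominates(a, b):
    """a dominates b (minimization): no worse everywhere, better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_fronts(points):
    """Sort objective vectors into successive non-dominated fronts
    (indices into `points`); a sketch of the MO-CMA-ES selection step."""
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i]) for j in remaining)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

pts = [(1, 4), (2, 2), (4, 1), (3, 3), (5, 5)]
print(nondominated_fronts(pts))  # → [[0, 1, 2], [3], [4]]
```

Selection then fills the new population front by front until µ_mo solutions are kept.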

slide-21
SLIDE 21

A Multi-Objective Surrogate Model

Rationale
◮ Find a unique function F̂(x) that defines the aggregated quality of the solution x in the multi-objective case.
◮ Idea originally proposed using a mixture of One-Class SVM and regression-SVM¹

[Figure: the aggregate surrogate F_SVM, shown in objective space (dominated points, current Pareto front, new Pareto front) and in decision space (X1, X2), with target values p − e, p, p + e and band width 2e.]

¹ I. Loshchilov, M. Schoenauer, M. Sebag (GECCO 2010). "A Mono Surrogate for Multiobjective Optimization"
slide-22
SLIDE 22

Using the Surrogate Model

Filtering

◮ Generate N_inform pre-children
◮ For each pre-child A and its nearest parent B, compute Gain(A, B) = Fsvm(A) − Fsvm(B)
◮ The new child is the point with the maximum value of Gain

[Figure: decision space (X1, X2) with the true Pareto front and the SVM-approximated front.]
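The filtering rule can be sketched as below; `f_svm` stands in for the learned aggregate surrogate (assumed to be maximized), and Euclidean distance is assumed for "nearest parent":

```python
import numpy as np

def filter_pre_children(pre_children, parents, f_svm):
    """Keep the pre-child A with maximal Gain(A, B) = Fsvm(A) - Fsvm(B),
    where B is A's nearest parent. Illustrative sketch of the slide's
    filtering step, not the authors' implementation."""
    def gain(a):
        b = min(parents,
                key=lambda p: np.linalg.norm(np.asarray(a) - np.asarray(p)))
        return f_svm(a) - f_svm(b)
    return max(pre_children, key=gain)

# Toy surrogate: a larger coordinate sum means closer to the new front.
f_svm = lambda x: x[0] + x[1]
child = filter_pre_children([(1.0, 0.0), (0.0, 2.0)], [(0.0, 0.0)], f_svm)
print(child)  # → (0.0, 2.0)
```

Only one of the N_inform pre-children is truly evaluated, which is where the savings in F computations come from.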

slide-23
SLIDE 23

Dominance-Based Surrogate

Using Rank-SVM

Which ordered pairs?

◮ Considering all possible ≻ relations may be too expensive.
◮ Primary constraints: x and its nearest dominated point
◮ Secondary constraints: any two points not belonging to the same front (according to non-dominated sorting)

[Figure: objective space (Objective 1, Objective 2) with points a, b, c, d, e, f, the surrogate F_SVM, and the primary and secondary ">" constraints.]

All primary constraints, and a limited number of secondary constraints
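The primary constraints can be sketched as one ordered pair per point, linking it to the nearest point it dominates; the choice of squared Euclidean distance in objective space and the tie handling are assumptions of this sketch, not taken from the paper:

```python
def dominates(a, b):
    """a dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def primary_constraints(points):
    """For each point x, emit (x, y) with y the nearest point that x
    dominates -- a sketch of the 'primary constraints' selection."""
    pairs = []
    for a in points:
        dominated = [b for b in points if dominates(a, b)]
        if dominated:
            nearest = min(dominated,
                          key=lambda b: sum((u - v) ** 2 for u, v in zip(a, b)))
            pairs.append((a, nearest))
    return pairs

pts = [(1.0, 1.0), (2.0, 2.0), (4.0, 4.0)]
print(primary_constraints(pts))
# → [((1.0, 1.0), (2.0, 2.0)), ((2.0, 2.0), (4.0, 4.0))]
```

This keeps the number of training constraints linear in the archive size instead of quadratic.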

slide-24
SLIDE 24

Dominance-Based Surrogate (2)

Construction of the surrogate model

◮ Initialize archive Ωactive as the set of Primary constraints, and

Ωpassive as the set of Secondary constraints.

◮ Learn the model for 1000 |Ωactive| iterations. ◮ Add the most violated passive contraint from Ωpassive to Ωactive

and optimize the model for 10 |Ωactive| iterations.

◮ Repeat the last step 0.1|Ωactive| times.

slide-25
SLIDE 25

Experimental Validation

Parameters

Surrogate Models

◮ ASM: aggregated surrogate model based on One-Class SVM and Regression SVM
◮ RASM: proposed Rank-based SVM

SVM Learning

◮ Number of training points: at most N_training = 1000 points
◮ Number of iterations: 1000 |Ω_active| + |Ω_active|² ≈ 2N_training²
◮ Kernel function: RBF with σ equal to the average distance between the training points
◮ The cost of constraint violation: C = 1000

Offspring Selection

◮ Number of pre-children: p = 2 and p = 10

slide-26
SLIDE 26

Experimental Validation

Comparative Results

ASM and Rank-based ASM applied on top of NSGA-II (with hypervolume secondary criterion) and MO-CMA-ES, on the ZDT and IHR functions. N = how many more true evaluations are needed than by the best performer.

slide-27
SLIDE 27

Discussion 1. Preferences → Optimization

Preference learning for robust black-box optimization
◮ ACM-ES: speed-up (×2, ×4) over the state of the art on uni-modal problems.
◮ Invariant to rank-preserving and orthogonal transformations of the search space
◮ The cost of the speed-up is O(d³)
◮ Source code available: https://www.lri.fr/~ilya/

Lessons learned
◮ Preference learning is repeatedly used in the optimization platform
◮ Hyper-parameter adjustment is critical
◮ ML assessment criterion: not the average case but the worst case; a critical hyper-parameter: the stopping criterion. Ill-conditioned Gram matrices are encountered with probability 1.

slide-28
SLIDE 28

Discussion 2. Optimization → Preferences?

Eliciting preferences?
◮ Subjectiveness is an issue (phrasing of questions, choice of units)
◮ Designing a questionnaire: an optimization problem?
◮ Which criterion: stability?

Designing aggregation operators / voting rules?
◮ A learning or an optimization problem?

slide-29
SLIDE 29

References

Black box optimization

Arnold, L., Auger, A., Hansen, N., Ollivier, Y. Information-Geometric Optimization Algorithms: A Unifying Picture via Invariance Principles. ArXiv e-prints, 2011.
Hansen, N., Ostermeier, A. Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation 9(2): 159-195, 2001.

Surrogate models for optimization

Loshchilov, I., Schoenauer, M., Sebag, M. Self-adaptive surrogate-assisted covariance matrix adaptation evolution strategy. GECCO 2012, ACM Press: 321-328, 2012.
Loshchilov, I., Schoenauer, M., Sebag, M. A mono surrogate for multiobjective optimization. GECCO 2010, ACM Press: 471-478.

Related

Viappiani, P., Boutilier, C. Optimal Bayesian Recommendation Sets and Myopically Optimal Choice Query Sets. NIPS 2010: 2352-2360.
Hoos, H. H. Programming by optimization. Commun. ACM 55(2): 70-80.