LEARNING GENERATIVE MODELS ACROSS INCOMPARABLE SPACES
Charlotte Bunne, David Alvarez-Melis, Andreas Krause, Stefanie Jegelka
Poster #173

Generative Modeling
[Figure: a generative network maps noise samples to the data distribution Px.]
… enforce …
… learn across different dimensions.
… translate between representations.
… learn manifolds.
1. How to compare samples from incomparable spaces?
2. What should be preserved? What can we modify?
3. How to stabilize learning despite additional freedom?
Optimal Transport

[Figure: moving generated samples y_i onto data samples x_l at cost L(y_i, x_l).]

… distance between distributions: minimal cost of a transport plan.
… find an optimal transport plan T.
… classical Wasserstein distances assume that spaces are comparable!
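A minimal sketch (not the poster's code) of what "find an optimal transport plan T" means for two small discrete samples. The cost uses squared Euclidean distance between points, which only makes sense when both sample sets live in the same space, exactly the comparability assumption the slide flags:

```python
# Hypothetical sketch: classical discrete optimal transport as a linear
# program. Cost C[i, j] = ||x_i - y_j||^2 requires x and y to share a
# common metric space (the "comparable spaces" assumption).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 2))          # data samples
y = rng.normal(size=(5, 2))          # generated samples (same dimension!)
n, m = len(x), len(y)

# Inter-space cost matrix.
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)

# Marginal constraints on the plan T (flattened row-major):
# rows of T sum to 1/n, columns sum to 1/m.
A_eq = np.zeros((n + m, n * m))
for i in range(n):
    A_eq[i, i * m:(i + 1) * m] = 1.0
for j in range(m):
    A_eq[n + j, j::m] = 1.0
b_eq = np.concatenate([np.full(n, 1 / n), np.full(m, 1 / m)])

res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
T = res.x.reshape(n, m)              # the optimal transport plan
print("transport cost:", res.fun)
```

For large samples one would use entropic regularization (Sinkhorn iterations) instead of an exact LP, but the feasible set, couplings with fixed marginals, is the same.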
Gromov-Wasserstein Discrepancy

[Figure: intra-space distances within each space (e.g. D_hi, D_lm among data samples and D_ij, D_kl among generated samples) are compared pairwise via L(D_hi, D_lm), L(D_ij, D_kl) under a transport plan.]

Definition:
GW(D, D̄) := min_T Σ_{i,j,k,l} L(D_ik, D̄_jl) T_ij T_kl
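To make the definition concrete, here is a small sketch (my notation, not the poster's code) that evaluates the GW objective Σ_{i,j,k,l} L(D_ik, D̄_jl) T_ij T_kl for a fixed coupling T. Note that only intra-space distance matrices enter, so the two sample sets may live in spaces of different dimension:

```python
# Hypothetical sketch: evaluating the Gromov-Wasserstein objective for a
# given coupling T, with loss L(a, b) = (a - b)^2. The minimization over
# T (e.g. by entropic projected gradient descent) is omitted.
import numpy as np

def pairwise_sq_dists(z):
    """Intra-space squared Euclidean distance matrix."""
    sq = (z ** 2).sum(-1)
    return sq[:, None] + sq[None, :] - 2 * z @ z.T

def gw_objective(D, Dbar, T):
    """sum_{i,j,k,l} (D_ik - Dbar_jl)^2 T_ij T_kl."""
    M = (D[:, None, :, None] - Dbar[None, :, None, :]) ** 2  # (n, m, n, m)
    return np.einsum('ijkl,ij,kl->', M, T, T)

rng = np.random.default_rng(1)
x = rng.normal(size=(4, 2))                 # samples in a 2-D space
y = rng.normal(size=(5, 3))                 # samples in a 3-D space -- incomparable!
D, Dbar = pairwise_sq_dists(x), pairwise_sq_dists(y)
T = np.full((4, 5), 1 / (4 * 5))            # independent (uniform) coupling
print("GW objective:", gw_objective(D, Dbar, T))
```

Sanity check: if both distance matrices are identical and T is the identity coupling, the objective is exactly zero, since GW only compares relational structure.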
Charlotte Bunne Poster #173 173
g!(z) = y generator data
noise
… …
. . . . . .
… …
D D GW f⍵() adversary
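A schematic forward pass of the architecture above (all names and the plain linear maps are my stand-ins for trained networks, not the authors' implementation): generated and data samples are first mapped by adversary branches into a common feature space, and the GW loss compares the resulting intra-space distance matrices.

```python
# Hypothetical sketch of the GW GAN forward pass. Random linear layers
# stand in for the trained generator g_theta and adversary f_omega; the
# actual method alternates min over theta, max over omega, min over T.
import numpy as np

rng = np.random.default_rng(2)
z = rng.normal(size=(6, 4))            # noise
G = rng.normal(size=(4, 3))            # "generator" weights: z -> y in R^3
X = rng.normal(size=(8, 5))            # data samples in R^5 (different dimension)
W_y = rng.normal(size=(3, 2))          # adversary branch for generated samples
W_x = rng.normal(size=(5, 2))          # adversary branch for data samples

y_feat = (z @ G) @ W_y                 # f_omega(g_theta(z))
x_feat = X @ W_x                       # f_omega(x)

def dist_matrix(a):
    """Intra-space squared Euclidean distances in feature space."""
    sq = (a ** 2).sum(-1)
    return sq[:, None] + sq[None, :] - 2 * a @ a.T

D, Dbar = dist_matrix(x_feat), dist_matrix(y_feat)
T = np.full((len(X), len(z)), 1 / (len(X) * len(z)))   # placeholder coupling
M = (D[:, None, :, None] - Dbar[None, :, None, :]) ** 2
gw = np.einsum('ijkl,ij,kl->', M, T, T)
print("GW loss in feature space:", gw)
```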
GW GAN
… recovers geometrical properties of the target distribution,
… but global aspects are undetermined.
→ shape the generated distribution via design constraints: style adversary.
[Figure: generated samples f_ω(g_θ(Z)) and data samples f_ω(X) in the adversary's feature space.]

… the adversary can arbitrarily distort the space.
→ regularize the adversary by enforcing it to define unitary transformations.
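One common way to realize such a constraint, sketched here under my own assumptions rather than as the poster's exact regularizer, is a penalty pushing the adversary's weight matrix toward orthonormal columns; an orthogonal map preserves pairwise distances and so cannot arbitrarily distort the space the GW loss is computed in:

```python
# Hypothetical sketch: soft orthogonality penalty ||W^T W - I||_F^2 on an
# adversary weight matrix W, added to the training loss.
import numpy as np

def orthogonality_penalty(W):
    """Frobenius-norm deviation of W^T W from the identity."""
    k = W.shape[1]
    M = W.T @ W - np.eye(k)
    return (M ** 2).sum()

rng = np.random.default_rng(3)
W_random = rng.normal(size=(5, 3))
Q, _ = np.linalg.qr(rng.normal(size=(5, 3)))   # orthonormal columns

print(orthogonality_penalty(W_random))   # nonzero
print(orthogonality_penalty(Q))          # ~0
```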
By utilizing the Gromov-Wasserstein discrepancy we disentangle data and generator space.

This enables us to learn generative models across different dimensions and to shape the generated distributions with design constraints.

More details tonight at Poster #173.