Solving mode collapse with Autoencoder GANs

  1. Solving mode collapse with Autoencoder GANs. Mihaela Rosca. Thanks to: Balaji Lakshminarayanan, David Warde-Farley, Shakir Mohamed.

  2. Autoencoders: an encoder maps the input to a code, and a decoder maps the code back to a reconstruction; training minimizes an L1/L2 reconstruction loss. Image credit: mikicon, the Noun Project.
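
A minimal sketch of the autoencoder on this slide, in PyTorch: an encoder compresses the input to a code, a decoder reconstructs it, and training minimizes an L1 or L2 reconstruction loss. The layer sizes and the choice of MSE here are illustrative assumptions, not taken from the talk.

    import torch
    import torch.nn as nn

    class Autoencoder(nn.Module):
        def __init__(self, data_dim=784, code_dim=32):
            super().__init__()
            # Encoder: data -> code.
            self.encoder = nn.Sequential(nn.Linear(data_dim, 256), nn.ReLU(),
                                         nn.Linear(256, code_dim))
            # Decoder: code -> reconstruction.
            self.decoder = nn.Sequential(nn.Linear(code_dim, 256), nn.ReLU(),
                                         nn.Linear(256, data_dim))

        def forward(self, x):
            return self.decoder(self.encoder(x))

    model = Autoencoder()
    x = torch.randn(16, 784)                      # stand-in batch
    loss = nn.functional.mse_loss(model(x), x)    # L2; use l1_loss for L1
    loss.backward()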

  3. Adversarial autoencoders: improve reconstruction quality by adding a GAN loss. (Adversarial Autoencoders, A. Makhzani, J. Shlens, N. Jaitly, I. Goodfellow, B. Frey)
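
One hedged way to realize the GAN loss this slide adds: a discriminator learns to tell real images from reconstructions, and the autoencoder is additionally trained to fool it. The network shape, the weight `lam`, and the non-saturating loss form are my assumptions for illustration.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    disc = nn.Sequential(nn.Linear(784, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1))

    def discriminator_loss(real, recon):
        # Classify real images as 1 and reconstructions as 0.
        ones = torch.ones(real.size(0), 1)
        zeros = torch.zeros(recon.size(0), 1)
        return (F.binary_cross_entropy_with_logits(disc(real), ones)
                + F.binary_cross_entropy_with_logits(disc(recon.detach()), zeros))

    def autoencoder_loss(real, recon, lam=0.1):
        # Reconstruction loss plus a GAN term that pushes reconstructions
        # to look real to the discriminator.
        gan = F.binary_cross_entropy_with_logits(disc(recon), torch.ones(recon.size(0), 1))
        return F.l1_loss(recon, real) + lam * gan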

  4. By construction, autoencoders learn to cover the entire training data.

  5. Autoencoder GANs: combine the reconstruction power of autoencoders with the sampling power of GANs!

  6. How to sample?

  7. Work in the code space, not the data space.

  8. 1) Learning the code distribution. Assumption: learning the code distribution is simpler than learning the data distribution. Learn p(codes), reconstruct with the autoencoder, and sample by decoding draws from p(codes); a sketch follows below. Image credit: mikicon, the Noun Project.
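
A sketch of option 1 as I read it: train the autoencoder first, then fit a density model to the codes of the training set, and sample by decoding draws from that model. Fitting a Gaussian mixture is my stand-in for "learn p(codes)"; the talk does not prescribe a model, and `model` and `train_images` are assumed from the earlier sketch.

    import torch
    from sklearn.mixture import GaussianMixture

    # Stage 1 is the trained autoencoder `model`; `train_images` is an (N, 784) tensor.
    with torch.no_grad():
        codes = model.encoder(train_images)

    # Stage 2: learn p(codes) -- here a Gaussian mixture as a simple stand-in.
    gmm = GaussianMixture(n_components=10).fit(codes.numpy())

    # Sampling: draw a code from p(codes), then decode it.
    z, _ = gmm.sample(16)
    samples = model.decoder(torch.as_tensor(z, dtype=torch.float32))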

  9. 2) Match the code distribution to a desired prior; reconstruct as before and sample by decoding draws from the prior.

  10. Working in the code space: either learn p(codes) or match a prior; both keep reconstruction and enable sampling.

  11. Learning the code distribution

  12. Learning the code distribution: PPGN. (Plug and Play Generative Models, A. Nguyen, J. Clune, Y. Bengio, A. Dosovitskiy, J. Yosinski)

  13. PPGN, key ingredients. Reconstructions: autoencoder + GAN + perceptual loss in feature space. Samples: Markov chain; conditioning.

  14. PPGN: L1/L2 reconstruction, GAN loss, perceptual loss. Not depicted: activation maximization; the encoder is a pretrained classifier.

  15. PPGN: sampling using a denoising autoencoder. Learn ∇ log p(c) using a DAE, then sample using MCMC; a sketch follows below.
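
A rough sketch of this sampler under standard assumptions rather than the exact PPGN recipe: for a denoising autoencoder R trained at noise level sigma, (R(c) - c) / sigma^2 approximates ∇ log p(c) (Alain and Bengio, 2014), so a Langevin-style chain can use it as the score. The step sizes and the `dae` module are illustrative.

    import torch

    def dae_langevin_sample(dae, code_dim, steps=200, eps=0.01, sigma=0.1):
        """Langevin-style chain over codes, with the score estimated by a DAE.

        (dae(c) - c) / sigma**2 approximates grad log p(c) when `dae` was
        trained to denoise codes corrupted with noise of std `sigma`.
        """
        c = torch.randn(1, code_dim)                 # start from the prior
        for _ in range(steps):
            with torch.no_grad():
                score = (dae(c) - c) / sigma**2      # approximate grad log p(c)
            c = c + eps * score + (2 * eps) ** 0.5 * torch.randn_like(c)
        return c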

  16. Limitations of PPGN (learning the code distribution). Not end to end: an encoder must be pretrained on the same dataset. Markov chains: when to stop? missing rejection step. Depends on labels for: conditioning samples; the pretrained encoder.

  17. Matching a desired prior

  18. 2) Match the code distribution to a desired prior; reconstruct as before and sample by decoding draws from the prior.

  19. Sound familiar?

  20. Variational inference: the ELBO. The likelihood term encourages good reconstructions; the KL term matches the prior.
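
Written out, the bound on this slide is the standard ELBO, with the slide's labels attached to its two terms:

    \log p(x) \;\ge\;
    \underbrace{\mathbb{E}_{q(z|x)}\big[\log p(x|z)\big]}_{\text{likelihood term: good reconstructions}}
    \;-\;
    \underbrace{\mathrm{KL}\big(q(z|x)\,\|\,p(z)\big)}_{\text{KL term: match the prior}}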

  21. Variational autoencoders: with a Gaussian encoder and prior, the KL term (match prior) is computed analytically, while reconstructing and sampling proceed as before; a sketch of the loss follows below.
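
A minimal sketch of the VAE objective this slide refers to, assuming a Gaussian encoder and prior so the KL term is analytic, and a Gaussian decoder so the likelihood term becomes a squared-error reconstruction loss; dimensions and loss scaling are illustrative.

    import torch
    import torch.nn.functional as F

    def reparameterize(mu, logvar):
        # z = mu + sigma * noise, keeping the sampling step differentiable.
        return mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)

    def vae_loss(x, recon, mu, logvar):
        # KL(q(z|x) || N(0, I)) in closed form -- "computed analytically".
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        # Likelihood term as a reconstruction loss (Gaussian decoder).
        rec = F.mse_loss(recon, x, reduction='sum')
        return rec + kl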

  22. AlphaGAN: combining GANs and VAEs. From VAEs: variational inference, an encoder network, reconstructions, a posterior over latents that matches the prior. From GANs: an implicit decoder; discriminators used to match distributions. (Variational Approaches for Auto-Encoding Generative Adversarial Networks, M. Rosca, B. Lakshminarayanan, D. Warde-Farley, S. Mohamed)

  23. AlphaGAN: a code discriminator matches the code distribution to the prior, and a data discriminator handles sampling and reconstructing; both terms are estimated from samples.

  24. Q: how do we estimate the terms in the ELBO using GANs?

  25. Density ratio trick: estimate the ratio of two distributions only from samples, by building a binary classifier to distinguish between them.
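
In symbols: train a binary classifier D to give label 1 to samples from p and label 0 to samples from q; at optimality D*(x) = p(x) / (p(x) + q(x)), so the ratio is recovered as

    \frac{p(x)}{q(x)} \;=\; \frac{D^{*}(x)}{1 - D^{*}(x)},
    \qquad
    \log\frac{p(x)}{q(x)} \;=\; \operatorname{logit}\big(D^{*}(x)\big)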

  26. Using GANs for variational inference - the ELBO

  27. ELBO - likelihood term

  28. ELBO - the KL term

  29. From the ELBO to loss functions. We want to match: the reconstruction and data distributions (the likelihood term); the code and prior distributions (the KL term). Tools for matching distributions: the density ratio trick (GAN); observed likelihoods / reconstruction losses (VAE).

  30. New loss functions via the density ratio trick: match the prior with a GAN loss on codes; avoid mode collapse with a reconstruction loss; improve reconstruction quality with a GAN loss that makes samples close to reconstructions. A sketch follows below.
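
A hedged sketch of how these losses could combine in the encoder/decoder update; the weighting, the non-saturating GAN form, and the modules `enc`, `dec`, `code_disc`, `data_disc` are my stand-ins for the paper's networks, not its exact objective.

    import torch
    import torch.nn.functional as F

    def encoder_decoder_loss(x, enc, dec, code_disc, data_disc, lam=1.0):
        ones = lambda t: torch.ones(t.size(0), 1)
        z = enc(x)                          # codes inferred from data
        x_rec = dec(z)                      # reconstructions
        x_gen = dec(torch.randn_like(z))    # samples decoded from the prior

        # Reconstruction loss: every data point must be covered (avoid mode collapse).
        rec = F.l1_loss(x_rec, x)
        # Code GAN loss: make inferred codes look like prior draws (match prior).
        code_gan = F.binary_cross_entropy_with_logits(code_disc(z), ones(z))
        # Data GAN losses: make reconstructions and samples look like real data
        # (improve reconstruction quality, pull samples toward reconstructions).
        data_gan = (F.binary_cross_entropy_with_logits(data_disc(x_rec), ones(x_rec))
                    + F.binary_cross_entropy_with_logits(data_disc(x_gen), ones(x_gen)))
        return lam * rec + code_gan + data_gan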

  31. Samples

  32. CIFAR-10, Inception score: measured both with a classifier trained on ImageNet and with a classifier trained on CIFAR-10. (Improved Techniques for Training GANs, T. Salimans, I. Goodfellow, W. Zaremba, V. Cheung, A. Radford, X. Chen)
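
For reference, the Inception score from Salimans et al., for a classifier p(y|x) and generator distribution p_g, is

    \mathrm{IS} \;=\; \exp\Big( \mathbb{E}_{x \sim p_g}\, \mathrm{KL}\big( p(y \mid x) \,\|\, p(y) \big) \Big),
    \qquad p(y) = \mathbb{E}_{x \sim p_g}\big[ p(y \mid x) \big]

Higher is better; swapping the ImageNet classifier for one trained on CIFAR-10 changes which notion of class the score is sensitive to.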

  33. CelebA, sample diversity, measured as 1 - MS-SSIM. (Improved Techniques for Training GANs, T. Salimans, I. Goodfellow, W. Zaremba, V. Cheung, A. Radford, X. Chen)
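
A sketch of the diversity measure on this slide: average MS-SSIM over random pairs of generated samples and report one minus it, so higher means more diverse (less mode collapse). The `ms_ssim(a, b)` helper is hypothetical here; an implementation such as the `pytorch_msssim` package could stand in for it.

    import torch

    def sample_diversity(samples, ms_ssim, pairs=100):
        # `samples`: (N, C, H, W) images; `ms_ssim` is assumed to return the
        # mean MS-SSIM between two image batches, in [0, 1].
        n = samples.size(0)
        idx_a = torch.randint(n, (pairs,))
        idx_b = torch.randint(n, (pairs,))
        score = ms_ssim(samples[idx_a], samples[idx_b])
        return 1.0 - float(score)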

  34. Matching the prior: AGE. The encoder is the discriminator: it forces data codes to match the prior and sample codes to not match the prior; a sketch follows below. (It Takes (Only) Two: Adversarial Generator-Encoder Networks, D. Ulyanov, A. Vedaldi, V. Lempitsky)
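
A rough sketch of the AGE idea as stated on this slide, with a per-dimension Gaussian moment-matching divergence to N(0, I) standing in for the paper's exact choice; treat the specifics as illustrative assumptions.

    import torch

    def divergence_to_prior(codes):
        # KL(N(mu, var) || N(0, I)) per code dimension, from batch moments.
        mu, var = codes.mean(0), codes.var(0)
        return 0.5 * (mu.pow(2) + var - var.log() - 1.0).sum()

    def encoder_loss(enc, x_real, x_fake):
        # The encoder acts as the discriminator: pull data codes toward
        # the prior, push sample codes away from it.
        return divergence_to_prior(enc(x_real)) - divergence_to_prior(enc(x_fake))

    def generator_loss(enc, x_fake):
        # The generator tries to make its sample codes match the prior.
        return divergence_to_prior(enc(x_fake))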

  35. Analysis of methods

  36. Summary Try Autoencoder GANs if mode collapse is a problem. Combining different learning principles results in a family of novel algorithms.

  37. References:
  - Generative Adversarial Networks. I. Goodfellow, J. Pouget-Abadie, M. Mirza, B. Xu, D. Warde-Farley, S. Ozair, A. Courville, Y. Bengio
  - Auto-Encoding Variational Bayes. D. P. Kingma, M. Welling
  - Adversarial Autoencoders. A. Makhzani, J. Shlens, N. Jaitly, I. Goodfellow, B. Frey
  - Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks. A. Radford, L. Metz, S. Chintala
  - Improved Techniques for Training GANs. T. Salimans, I. Goodfellow, W. Zaremba, V. Cheung, A. Radford, X. Chen
  - Plug and Play Generative Models. A. Nguyen, J. Clune, Y. Bengio, A. Dosovitskiy, J. Yosinski
  - Adversarial Variational Bayes: Unifying Variational Autoencoders and Generative Adversarial Networks. L. Mescheder, S. Nowozin, A. Geiger
  - It Takes (Only) Two: Adversarial Generator-Encoder Networks. D. Ulyanov, A. Vedaldi, V. Lempitsky
  - Variational Approaches for Auto-Encoding Generative Adversarial Networks. M. Rosca, B. Lakshminarayanan, D. Warde-Farley, S. Mohamed
