  1. Residual Flows for Invertible Generative Modeling Ricky T. Q. Chen, Jens Behrmann, David Duvenaud, Jörn-Henrik Jacobsen

  2. Invertible Residual Networks (i-ResNet) It can be shown that the residual block y = x + g(x) can be inverted by the fixed-point iteration x_{k+1} = y - g(x_k), and that it has a unique inverse (i.e. is invertible) if Lip(g) < 1, i.e. g is a contractive Lipschitz map. Enforced with spectral normalization (Behrmann et al. 2019).
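The fixed-point inversion on this slide can be sketched numerically. This is a minimal NumPy illustration, not the authors' implementation: the residual branch g, the weight matrix W, and the target Lipschitz bound 0.8 are all illustrative choices, and the spectral normalization is done with an exact 2-norm rather than the power iteration used in practice.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
W = rng.standard_normal((d, d))
W *= 0.8 / np.linalg.norm(W, 2)  # scale so Lip(g) <= 0.8 < 1 (spectral normalization)

def g(x):
    return np.tanh(W @ x)  # contractive residual branch (tanh is 1-Lipschitz)

def f(x):
    return x + g(x)  # residual block y = x + g(x)

def invert(y, n_iter=100):
    # Banach fixed-point iteration: x_{k+1} = y - g(x_k) converges to the
    # unique x with f(x) = y because g is a contraction (Lip(g) < 1).
    x = y.copy()
    for _ in range(n_iter):
        x = y - g(x)
    return x

x = rng.standard_normal(d)
y = f(x)
recovery_error = np.max(np.abs(invert(y) - x))
print(recovery_error)
```

The iteration converges geometrically at rate Lip(g), so the recovery error here shrinks like 0.8^100.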

  3. Applying Change of Variables to i-ResNets If f(x) = x + g(x) with Lip(g) < 1, then log p(x) = log p(f(x)) + log det|I + J_g(x)| = log p(f(x)) + sum_{k=1..inf} ((-1)^{k+1} / k) tr([J_g(x)]^k) (Behrmann et al. 2019).
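The trace series above can be checked on a small example. The following NumPy sketch (not from the slides) estimates each tr(J^k) with Hutchinson's trick, v^T J^k v with Rademacher v, so only matrix-vector products are needed; the function name logdet_series and the term/sample counts are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
J = rng.standard_normal((d, d))
J *= 0.5 / np.linalg.norm(J, 2)  # spectral norm < 1, so the series converges

def logdet_series(J, n_terms=20, n_samples=20000):
    # log det(I + J) = tr(log(I + J)) = sum_{k>=1} (-1)^{k+1}/k * tr(J^k).
    # Hutchinson: tr(A) = E[v^T A v] for Rademacher v, so each sample needs
    # only repeated matrix-vector products w = J^k v.
    d = J.shape[0]
    est = 0.0
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=d)
        w = v
        for k in range(1, n_terms + 1):
            w = J @ w                          # w = J^k v
            est += (-1) ** (k + 1) / k * (v @ w)
    return est / n_samples

approx = logdet_series(J)
exact = np.linalg.slogdet(np.eye(d) + J)[1]
print(approx, exact)
```

Truncating at a fixed number of terms introduces bias, which is exactly what the Russian roulette estimator on the following slides removes.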

  4. Unbiased Estimation of Log Probability Density Enter the “Russian roulette” estimator (Kahn, 1955). Suppose we want to estimate the infinite series Delta = sum_{k=1..inf} Delta_k (require that the series converges).


  8. Unbiased Estimation of Log Probability Density Enter the “Russian roulette” estimator (Kahn, 1955). Suppose we want to estimate Delta = sum_{k=1..inf} Delta_k (require convergence). Flip a coin b with probability q of heads; if heads, evaluate the remaining terms reweighted by 1/q: Delta_hat = Delta_1 + (1[b = 1] / q) * sum_{k=2..inf} Delta_k. The tail has probability q of being evaluated, yet E[Delta_hat] = Delta.

  9. Unbiased Estimation of Log Probability Density If we repeatedly apply the same procedure infinitely many times, we obtain an unbiased estimator of the infinite series, computed in finite time with probability 1! The k-th term is weighted by the inverse probability of seeing >= k tosses: sum_{k=1..inf} Delta_k = E_{n ~ p(N)} [ sum_{k=1..n} Delta_k / P(N >= k) ]. Equivalently, directly sample n as the index of the first successful coin toss. Residual Flow: log p(x) = log p(f(x)) + E_{n, v} [ sum_{k=1..n} ((-1)^{k+1} / k) * (v^T [J_g(x)]^k v) / P(N >= k) ], with n ~ p(N) and v ~ N(0, I).
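As an empirical sanity check (not from the slides), here is a minimal NumPy sketch of the Russian roulette estimator applied to the scalar series log(1 + a) = sum_{k>=1} (-1)^{k+1} a^k / k. The geometric stopping rule with parameter p and the function name russian_roulette_log1p are illustrative choices; each term is reweighted by 1/P(N >= k) exactly as on the slide.

```python
import numpy as np

def russian_roulette_log1p(a, p=0.5, rng=None):
    # Unbiased estimate of log(1 + a) = sum_{k>=1} (-1)^{k+1} a^k / k for |a| < 1.
    # Draw n ~ Geometric(p): n is the first successful ("stop") toss, so
    # P(N >= k) = (1 - p)^(k-1). Truncate at n and reweight each term.
    rng = rng or np.random.default_rng()
    n = rng.geometric(p)  # support {1, 2, ...}
    est = 0.0
    for k in range(1, n + 1):
        term = (-1) ** (k + 1) * a**k / k
        est += term / (1 - p) ** (k - 1)  # reweight by 1 / P(N >= k)
    return est

rng = np.random.default_rng(0)
samples = [russian_roulette_log1p(0.5, rng=rng) for _ in range(200_000)]
mean_est = np.mean(samples)
print(mean_est, np.log(1.5))
```

Each individual sample is computed in finite time, and the average over samples converges to log(1.5) with no truncation bias.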

  10. Decoupled Training Objective &amp; Estimation Bias The estimator is unbiased, but requires variable compute and memory!

  11. Constant-Memory Backpropagation Naive gradient computation: 1. Estimate the log-determinant series. 2. Differentiate through it. Alternative (Neumann-series) gradient formulation: 1. Analytically differentiate. 2. Estimate the resulting series. No need to store a random number of terms in memory!
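The Neumann series behind the alternative gradient formulation can be illustrated numerically. This is a NumPy sketch under the Lipschitz assumption from the earlier slides, not the authors' implementation: differentiating log det(I + J_g) analytically yields terms involving (I + J)^{-1}, and for ||J|| < 1 that inverse equals the convergent series sum_{k>=0} (-J)^k, which can be estimated term by term without storing the forward computation graph.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4
J = rng.standard_normal((d, d))
J *= 0.5 / np.linalg.norm(J, 2)  # enforce spectral norm < 1 (Lipschitz constraint)

# Neumann series: (I + J)^{-1} = sum_{k>=0} (-J)^k, convergent since ||J|| < 1.
# Each term is obtained from the previous one by a single multiplication,
# so no history of terms needs to be kept in memory.
inv_approx = np.zeros_like(J)
term = np.eye(d)
for _ in range(60):
    inv_approx += term
    term = term @ (-J)

neumann_error = np.max(np.abs(inv_approx - np.linalg.inv(np.eye(d) + J)))
print(neumann_error)
```

The series converges geometrically at rate ||J||, so 60 terms already match the exact inverse to floating-point precision.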

  12. Density Estimation Experiments (LipSwish) Contribution Summary: - Unbiased estimator of the log-likelihood. - Memory-efficient computation of the log-likelihood. - LipSwish activation function [not discussed in talk]. [results table]


  14. Qualitative Samples CelebA: data vs. Residual Flow samples. CIFAR-10: data vs. samples from Residual Flow, PixelCNN, and Flow++. [sample images]

  15. Qualitative Samples CelebA: data vs. Residual Flow samples. CelebA-HQ 256x256 samples. [sample images]

  16. Thanks for Listening! Code and pretrained models: https://github.com/rtqichen/residual-flows Co-authors: Jens Behrmann, David Duvenaud, Jörn-Henrik Jacobsen.
