Progressive Growing of GANs for Improved Quality, Stability, and Variation (PowerPoint PPT Presentation)

SLIDE 1

Progressive Growing of GANs for Improved Quality, Stability, and Variation

Paper: T.Karras, T.Aila, S.Laine, J. Lehtinen Nvidia and Aalto Univ. ICLR Oral 2018 Stefano Blumberg, UCL

SLIDE 2

Why I chose Paper

  • Excellent Results
  • Introduced New Pipeline
  • Idea for a current project
  • ‘Training paper’: many applications
  • Made me feel poor
SLIDE 3

Everybody Knows about GANs?

SLIDE 4

Learning Across Different Scales

  • Training directly at high resolution is too difficult; start small and grow the networks
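The growing schedule can be sketched in a few lines: training starts at 4×4 and the resolution doubles until the target is reached (the function name and the 4→1024 endpoints follow the paper's CelebA-HQ setup; the helper itself is illustrative, not from the slides):

```python
def resolution_schedule(start=4, target=1024):
    """Return the sequence of training resolutions, doubling each phase."""
    res = [start]
    while res[-1] < target:
        res.append(res[-1] * 2)
    return res

# For CelebA-HQ: [4, 8, 16, 32, 64, 128, 256, 512, 1024]
```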
SLIDE 5

Fading in Layers
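The transcript carries no bullets for this slide. As a hedged sketch of the technique the title names: when a new, higher-resolution layer is added, its output is blended with the upsampled output of the previous stage by a weight alpha that ramps linearly from 0 to 1, so the new layer is faded in smoothly rather than switched on abruptly (function and argument names here are illustrative, not from the deck):

```python
import numpy as np

def faded_output(old_branch, new_branch, alpha):
    """Linearly crossfade from the old (upsampled) path to the new layer.

    alpha ramps 0 -> 1 over the transition phase: at 0 only the
    low-resolution path contributes, at 1 only the new layer does.
    """
    return (1.0 - alpha) * old_branch + alpha * new_branch
```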

SLIDE 6

Increasing Variation

  • Prevent Mode Collapse
  • Minibatch Discrimination (Salimans et al.): compute feature statistics across the minibatch, encouraging generated images to show the same statistical variation as the training images
  • Simplified here to Minibatch Standard Deviation:
  • Compute the std of each feature at each spatial location over the minibatch
  • Average the estimates over all features and locations into a single value
  • Replicate the value and concatenate it to all spatial locations
  • Add the value as an extra feature map towards the end of the discriminator
  • (More complicated methods didn’t improve results)
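The steps above can be sketched in NumPy (the function name and the NCHW layout are my assumptions, not from the slides):

```python
import numpy as np

def minibatch_stddev(x, eps=1e-8):
    """Append a minibatch-standard-deviation feature map.

    x: activations of shape (N, C, H, W).
    Returns shape (N, C + 1, H, W).
    """
    # Std over the minibatch for every feature at every spatial location.
    std = np.sqrt(x.var(axis=0) + eps)              # (C, H, W)
    # Average the estimates into a single scalar.
    mean_std = std.mean()
    # Replicate the scalar over all spatial locations as one extra map.
    extra = np.full((x.shape[0], 1, x.shape[2], x.shape[3]),
                    mean_std, dtype=x.dtype)
    return np.concatenate([x, extra], axis=1)
```

The extra map gives the discriminator a direct signal about how much variation the batch contains, penalising a generator that collapses to near-identical samples.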
SLIDE 7

Equalised Learning Rate

  • Ignore complex weight initialisation; scale weights at runtime instead
  • Parameter dynamic range and learning speed are the same for all weights
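A minimal dense-layer sketch of the idea: weights are stored with trivial N(0, 1) initialisation and multiplied by the per-layer He constant sqrt(2 / fan_in) at every forward pass, rather than baking the scale into the init (the function names here are mine, not the paper's):

```python
import numpy as np

def he_scale(fan_in, gain=np.sqrt(2.0)):
    # Per-layer constant from He et al. initialisation.
    return gain / np.sqrt(fan_in)

def equalized_dense(x, w, b):
    """Dense layer with runtime weight scaling.

    w is stored as N(0, 1); scaling happens at every forward pass,
    so all weights share the same dynamic range and learning speed
    under optimisers like Adam that normalise per-parameter updates.
    """
    c = he_scale(w.shape[0])
    return x @ (w * c) + b
```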

SLIDE 8

Pixelwise Feature Vector Normalisation

  • Prevent magnitudes in the generator and discriminator from spiralling out of control
  • Normalise the feature vector at each pixel to unit length
  • Replaces batch norm, layer norm, etc.
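In NumPy this is one line: each pixel's channel vector is divided by its root-mean-square magnitude (NCHW layout assumed; the function name is mine):

```python
import numpy as np

def pixel_norm(x, eps=1e-8):
    """Normalise the feature vector at each pixel to unit length.

    x: (N, C, H, W). Divides each pixel's channel vector by its RMS
    over the channel axis; used in the generator in place of
    batch/layer normalisation.
    """
    return x / np.sqrt((x ** 2).mean(axis=1, keepdims=True) + eps)
```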
SLIDE 9

Multi Scale Statistical Similarity for Assessing GAN Results

  • Current methods fail to react to variation in colour or texture, or to assess image quality
  • Intuition: generated samples should have local image structure similar to the training set at all scales
  • Measure statistical similarity between patch descriptors taken from corresponding Laplacian pyramid levels (each level captures a specific spatial frequency band)
  • Then compute the Wasserstein distance between the two patch sets
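The distance on each pyramid level can be approximated with random projections, since in 1-D the Wasserstein distance between equal-size sets is just the mean absolute difference of sorted values (a sketch; the function name and the use of generic descriptor matrices rather than the paper's exact 7×7 patch extraction are my assumptions):

```python
import numpy as np

def sliced_wasserstein(a, b, n_proj=64, rng=None):
    """Approximate Wasserstein distance between two descriptor sets.

    a, b: (n, d) patch descriptors (e.g. from one Laplacian-pyramid
    level of real vs. generated images). Projects onto random unit
    directions and averages the 1-D Wasserstein distances.
    """
    rng = np.random.default_rng(rng)
    d = a.shape[1]
    total = 0.0
    for _ in range(n_proj):
        v = rng.standard_normal(d)
        v /= np.linalg.norm(v)
        pa, pb = np.sort(a @ v), np.sort(b @ v)
        total += np.abs(pa - pb).mean()
    return total / n_proj
```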
SLIDE 10

CelebA-HQ

  • High quality version of CelebA
  • 30,000 images at 1024×1024 resolution
SLIDE 11

Contributions

  • Learning across Different Scales
  • Fading in Layers
  • Increasing Variation
  • Equalised Learning Rate
  • Multi Scale Statistical Similarity for Assessing GANs

  • CelebA-HQ
SLIDE 12

Loss and Evaluation

  • Design choices are orthogonal to the loss function chosen (LSGAN, WGAN-GP)
  • Evaluated with Sliced Wasserstein Distance and Multi-Scale Structural Similarity (MS-SSIM)

SLIDE 13

Training

  • 8 NVIDIA Tesla V100 GPUs ($10-11K each), 4 days of training
  • Minibatch size reduced at higher resolutions to save memory
SLIDE 14

Results

  • Progressive growing: 2-5.4x training speedup and convergence to better minima
  • Results on LSUN bedroom
  • Record inception score of 8.80 on unsupervised CIFAR-10

SLIDE 15

Takeaway

  • Better hardware is nice
  • Progressive growing is very good
  • Being ‘very hacky’ does produce advantages