Generative Adversarial Networks (GANs) - Prof. Seungchul Lee - PowerPoint PPT Presentation





SLIDE 1

Generative Adversarial Networks (GANs)

  • Prof. Seungchul Lee

Industrial AI Lab.

SLIDE 2

Source

  • "Mastering GAN (Generative Adversarial Network) in One Hour" (1시간만에 GAN 완전 정복하기)

– by 최윤제 (Yunjey Choi)
– YouTube: https://www.youtube.com/watch?v=odpjk7_tGY0
– Slides: https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  • CSC321 Lecture 19: GAN

– By Prof. Roger Grosse at Univ. of Toronto
– http://www.cs.toronto.edu/~rgrosse/courses/csc321_2018/

  • CS231n: CNN for Visual Recognition

– Lecture 13: Generative Models
– By Prof. Fei-Fei Li at Stanford University
– http://cs231n.stanford.edu/

SLIDE 3

Supervised Learning

  • Discriminative model

SLIDE 4

Unsupervised Learning

  • Generative model

[Diagram: latent space]

SLIDE 5

Model Distribution vs. Data Distribution

SLIDE 6

Probability Distribution

SLIDE 7

Probability Distribution

SLIDE 8

Probability Distribution

SLIDE 9

Probability Density Estimation Problem

  • If p_model(x) can be estimated to be close to p_data(x), then new data can be generated by sampling from p_model(x)
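As a minimal illustration of this idea (my own sketch, not from the slides): estimate a simple parametric p_model from data, then sample new data from it.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" data from an unknown distribution p_data (here secretly Gaussian)
data = rng.normal(loc=3.0, scale=0.5, size=10_000)

# Density estimation: fit a Gaussian p_model to the data
mu_hat, sigma_hat = data.mean(), data.std()

# If p_model is close to p_data, sampling from it generates realistic data
new_samples = rng.normal(loc=mu_hat, scale=sigma_hat, size=1000)
print(new_samples.shape)
```

GANs avoid writing down p_model explicitly; this Gaussian fit just shows why a good density model makes generation trivial.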

SLIDE 10

Generative Models from Lower Dimension

  • Learn the transformation via a neural network
  • Start by sampling the code vector z from a fixed, simple distribution (e.g. a uniform or Gaussian distribution)
  • Then this code vector is passed as input to a deterministic generator network G, which produces an output sample x = G(z)

[Diagram: latent space]

Source: Prof. Roger Grosse at U of Toronto
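A toy NumPy sketch of this procedure (the weights are random stand-ins for a trained generator; the layer sizes are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: sample code vectors z from a fixed, simple distribution
z = rng.normal(size=(5, 2))              # 5 samples from a 2-D Gaussian

# Step 2: a deterministic generator network G (here a tiny untrained MLP)
W1, b1 = rng.normal(size=(2, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 3)), np.zeros(3)

def G(z):
    h = np.tanh(z @ W1 + b1)             # hidden layer
    return h @ W2 + b2                   # output sample x = G(z)

x = G(z)                                 # 5 generated 3-D samples
print(x.shape)
```

The network itself is deterministic; all randomness comes from sampling z.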

SLIDE 11

Deterministic Transformation (by Network)

  • 1-dimensional example
  • Remember:

– The network does not generate a distribution
– It maps a known distribution to the target distribution

Source: Prof. Roger Grosse at U of Toronto

SLIDE 12

Deterministic Transformation (by Network)

  • High-dimensional example

Source: Prof. Roger Grosse at U of Toronto

SLIDE 13
Prob. Density Function by Deep Learning

  • Generative model of images

Source: Prof. Roger Grosse at U of Toronto

SLIDE 14

Generative Adversarial Networks (GANs)

  • In generative modeling, we'd like to train a network that models a distribution, such as a distribution over images.
  • GANs do not work with any explicit density function!

– Instead, they take a game-theoretic approach

SLIDE 15

Turing Test

  • One way to judge the quality of the model is to sample from it.
  • GANs are based on a very different idea:

– Train the model to produce samples that are indistinguishable from the real data, as judged by a discriminator network whose job is to tell real from fake

SLIDE 16

Generative Adversarial Networks (GAN)

  • The idea behind Generative Adversarial Networks (GANs): train two different networks

– Generator network: tries to produce realistic-looking samples
– Discriminator network: tries to distinguish between real and fake data

  • The generator network tries to fool the discriminator network
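Numerically, the two objectives look like this (a sketch of the standard GAN formulation; the logit values are made-up examples, and D outputs the probability that its input is real):

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Hypothetical discriminator outputs for one real and one generated sample
D_real = sigmoid(2.0)    # D(x): D is fairly confident the real sample is real
D_fake = sigmoid(-1.5)   # D(G(z)): D is fairly confident the fake is fake

# Discriminator objective: maximize log D(x) + log(1 - D(G(z)))
disc_objective = np.log(D_real) + np.log(1.0 - D_fake)

# Generator objective: fool D by minimizing log(1 - D(G(z)))
gen_objective = np.log(1.0 - D_fake)

print(disc_objective, gen_objective)
```

A perfect discriminator drives disc_objective toward 0; a successful generator drives D(G(z)) toward 1, making gen_objective very negative.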

SLIDE 17

Autoencoder

  • Dimension reduction
  • Recover the input data

– Learns an encoding of the inputs so as to recover the original input from the encodings as well as possible
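A minimal linear autoencoder sketch (my own example, assuming an orthonormal encoder; real autoencoders learn nonlinear encoders by gradient descent):

```python
import numpy as np

rng = np.random.default_rng(0)

# Orthonormal encoder: 10-D inputs -> 3-D codes (decoder is its transpose)
A = rng.normal(size=(10, 3))
Q, _ = np.linalg.qr(A)           # columns of Q are orthonormal

def encode(x):  return x @ Q     # 10-D -> 3-D (dimension reduction)
def decode(h):  return h @ Q.T   # 3-D -> 10-D (reconstruction)

# Inputs that actually lie in the 3-D subspace are recovered exactly
codes = rng.normal(size=(4, 3))
x = decode(codes)
x_rec = decode(encode(x))
print(np.allclose(x, x_rec))     # True
```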

SLIDE 18

Generative Adversarial Networks (GAN)

  • Analogous to Turing Test

[Diagram: generator produces generated data]

SLIDE 19

Generative Adversarial Networks (GAN)

  • Analogous to Turing Test

[Diagram: generator produces fake samples; discriminator classifies real vs. fake]

SLIDE 20

Intuition for GAN

SLIDE 21

Discriminator Perspective (1/2)

SLIDE 22

Discriminator Perspective (2/2)

SLIDE 23

Generator Perspective

SLIDE 24

Loss Function of Discriminator

SLIDE 25

Loss Function of Generator

SLIDE 26

Non-Saturating Game

SLIDE 27

Non-Saturating Game
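In the standard (saturating) game the generator minimizes log(1 − D(G(z))), which gives vanishing gradients when the discriminator easily rejects fakes; the non-saturating variant minimizes −log D(G(z)) instead. A numeric comparison (d stands for D(G(z)); 0.01 is a made-up early-training score):

```python
# When the discriminator confidently rejects fakes, d = D(G(z)) is near 0
d = 0.01

# Saturating loss log(1 - d): gradient w.r.t. d is -1/(1 - d)
grad_saturating = -1.0 / (1.0 - d)     # about -1: weak signal early in training

# Non-saturating loss -log d: gradient w.r.t. d is -1/d
grad_nonsaturating = -1.0 / d          # -100: strong signal exactly when needed

print(grad_saturating, grad_nonsaturating)
```

Both losses push d toward 1, but the non-saturating version gives the generator a much larger gradient while its samples are still easy to reject.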

SLIDE 28

Solving a MinMax Problem
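In practice the minimax problem is solved by alternating gradient steps. A toy saddle-point example (f(x, y) = x² − y², minimized over x and maximized over y; this illustrates the alternating-update idea, not the GAN objective itself):

```python
# Alternating gradient descent/ascent on f(x, y) = x**2 - y**2
# min over x, max over y; the saddle point is (0, 0)
x, y, lr = 1.0, 1.0, 0.1

for _ in range(200):
    x -= lr * (2 * x)      # descent step on x (df/dx = 2x)
    y += lr * (-2 * y)     # ascent step on y  (df/dy = -2y)

print(x, y)                # both approach the saddle point (0, 0)
```

GAN training alternates in the same way: a gradient step on the discriminator's parameters, then a gradient step on the generator's.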

SLIDE 29

GAN Implementation in TensorFlow

SLIDE 30

TensorFlow Implementation

[Diagram: generator 100 → 256 → 784; discriminator 784 → 256 → 1]
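A NumPy sketch of these shapes (the slide's actual TensorFlow code is not reproduced here; the layer sizes 100 → 256 → 784 for the generator and 784 → 256 → 1 for the discriminator are read off the diagram, and the weights are untrained stand-ins):

```python
import numpy as np

rng = np.random.default_rng(0)
relu = lambda t: np.maximum(t, 0.0)
sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))

# Generator: 100-D noise -> 256 hidden -> 784-D (28x28) image
G_W1 = rng.normal(0, 0.01, (100, 256))
G_W2 = rng.normal(0, 0.01, (256, 784))
def generator(z):
    return sigmoid(relu(z @ G_W1) @ G_W2)       # pixel values in (0, 1)

# Discriminator: 784-D image -> 256 hidden -> 1 probability of "real"
D_W1 = rng.normal(0, 0.01, (784, 256))
D_W2 = rng.normal(0, 0.01, (256, 1))
def discriminator(x):
    return sigmoid(relu(x @ D_W1) @ D_W2)

z = rng.normal(size=(16, 100))                  # batch of 16 noise vectors
fake_images = generator(z)
scores = discriminator(fake_images)
print(fake_images.shape, scores.shape)          # (16, 784) (16, 1)
```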

SLIDE 31

TensorFlow Implementation

SLIDE 32

TensorFlow Implementation

SLIDE 33

TensorFlow Implementation

SLIDE 34

TensorFlow Implementation

SLIDE 35

TensorFlow Implementation

SLIDE 36

TensorFlow Implementation

SLIDE 37

After Training

  • After training, use the generator network to generate new data

SLIDE 38

GAN Samples

SLIDE 39

Conditional GAN

SLIDE 40

Conditional GAN

  • In an unconditioned generative model, there is no control over the modes of the data being generated.
  • In a Conditional GAN (CGAN), the generator learns to generate a fake sample with a specific condition or characteristics (such as a label associated with an image, or a more detailed tag) rather than a generic sample from an unknown noise distribution.
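A shape-level sketch of label conditioning (my own illustration; the generator input is the noise vector concatenated with a one-hot label, and the weights are untrained stand-ins):

```python
import numpy as np

rng = np.random.default_rng(0)

def one_hot(label, num_classes=10):
    v = np.zeros(num_classes)
    v[label] = 1.0
    return v

# Conditional generator input: 100-D noise + 10-D one-hot label = 110-D
z = rng.normal(size=100)
g_input = np.concatenate([z, one_hot(7)])        # condition on digit "7"

# Hypothetical (untrained) generator: 110 -> 256 -> 784
W1 = rng.normal(0, 0.01, (110, 256))
W2 = rng.normal(0, 0.01, (256, 784))
fake_digit = np.tanh(np.maximum(g_input @ W1, 0) @ W2)

print(g_input.shape, fake_digit.shape)           # (110,) (784,)
```

The discriminator is conditioned the same way, receiving the image concatenated with the label, so it judges "real digit 7" rather than just "real digit".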

SLIDE 41

Conditional GAN

  • MNIST digits generated conditioned on their class label

SLIDE 42

Conditional GAN

  • A simple modification to the original GAN framework that conditions the model on additional information for better multi-modal learning
  • There are many practical applications of GANs when we have explicit supervision available

SLIDE 43

Normal Distribution of MNIST

  • A standard normal distribution
  • This is how we would like points corresponding to MNIST digit images to be distributed in the latent space

SLIDE 44

Generator at GAN

SLIDE 45

Generator at Conditional GAN

  • Feed a random point in latent space and the desired digit.
  • Even if the same latent point is used for two different digits, the process will work correctly, since the latent space only encodes features such as stroke width or angle.

SLIDE 46

CGAN Implementation

SLIDE 47

CGAN Implementation

SLIDE 48

CGAN Implementation

SLIDE 49

CGAN Implementation

  • Generate fake MNIST images with the CGAN