

slide-1
SLIDE 1

Sequential Neural Processes

Gautam Singh*1 Jaesik Yoon*2 Youngsung Son3 Sungjin Ahn1

1Rutgers University 2SAP 3ETRI

*Equal Contribution

slide-2
SLIDE 2

Background: GQN, NP and Meta-Learning

"What if the stochastic process also had some underlying temporal dynamics?"

[1] Eslami, SM Ali, et al. "Neural scene representation and rendering." Science 360.6394 (2018): 1204-1210. [2] Garnelo, Marta, et al. "Neural processes." arXiv preprint arXiv:1807.01622 (2018).
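As background, the Neural Process idea referenced above can be sketched very roughly as follows. This is our own illustrative code with hypothetical names (`encode`, `aggregate`, `decode`), not the architecture from [2]: a context set of (x, y) pairs is encoded point-wise, aggregated into a single permutation-invariant representation, and that representation conditions predictions at every target input.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, y, W_enc):
    # Point-wise encoder: one linear layer over concatenated (x, y) pairs.
    h = np.concatenate([x, y], axis=-1) @ W_enc
    return np.tanh(h)

def aggregate(h):
    # Permutation-invariant aggregation over the context set.
    return h.mean(axis=0)

def decode(x_target, r, W_dec):
    # Decoder conditions every target input on the same representation r.
    inp = np.concatenate([x_target, np.tile(r, (len(x_target), 1))], axis=-1)
    return inp @ W_dec

d_hidden = 8
W_enc = rng.normal(size=(2, d_hidden))
W_dec = rng.normal(size=(1 + d_hidden, 1))

x_ctx = rng.uniform(-1, 1, size=(5, 1))
y_ctx = np.sin(3 * x_ctx)
r = aggregate(encode(x_ctx, y_ctx, W_enc))          # one vector summarizes the context
y_pred = decode(rng.uniform(-1, 1, size=(10, 1)), r, W_dec)
print(y_pred.shape)  # (10, 1)
```

The key property this toy version shares with a real NP is that the representation r has no notion of time: it summarizes a single, static function.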


slide-8
SLIDE 8

Stochastic Processes with Time Structure

Examples: bouncing 2D shapes; the temperature of a rod changing with time; a moving 3D object.
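A hypothetical toy analogue of the examples above (our own construction, not a dataset from the paper): at each time-step the "function" we can query is a sine curve whose amplitude and phase follow simple latent dynamics, so the whole object is a stochastic process with time structure rather than a single static function.

```python
import numpy as np

rng = np.random.default_rng(0)

def rollout(T=20, drift=0.3):
    # One realization of a temporally evolving family of curves.
    amp, phase = 1.0, 0.0
    curves = []
    for _ in range(T):
        amp += 0.05 * rng.normal()   # slow random drift in amplitude
        phase += drift               # deterministic drift in phase
        x = np.linspace(-1, 1, 50)
        curves.append(amp * np.sin(4 * x + phase))
    return np.stack(curves)          # shape (T, 50): one curve per time-step

curves = rollout()
print(curves.shape)  # (20, 50)
```

A static NP sees only one row of such an array at a time; the point of the slides that follow is to model how the rows relate across time.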


slide-17
SLIDE 17

Simple Extension of the Baselines

  • Append time t to the query in Neural Processes or GQN.
  • Our findings show that this does not work well, since it does not model time explicitly.
  • Poor generation quality.
  • Cannot generalize to long time-horizons.
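The naive extension above amounts to treating time as just one more query dimension, as in this sketch (names are ours; the point is only that time is never modelled explicitly):

```python
import numpy as np

def with_time(x, t):
    # Append the scalar time-step to every query point, so an otherwise
    # time-unaware NP/GQN-style model receives (x, t) pairs.
    t_col = np.full((len(x), 1), float(t))
    return np.concatenate([x, t_col], axis=-1)

x = np.linspace(-1, 1, 5).reshape(-1, 1)
q = with_time(x, t=3)
print(q.shape)  # (5, 2): original query dim plus one time dim
```

Because t is merely an input feature here, nothing in the model enforces temporal dynamics, which is consistent with the failure modes listed above.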


slide-20
SLIDE 20

Sequential Neural Processes

Meta-Transfer Learning. "We not only learn from the current context but also utilize our knowledge of the past stochastic processes"

slide-21
SLIDE 21

Sequential Neural Processes

Meta-Transfer Learning. "We not only learn from the current context but also utilize our knowledge of the past stochastic processes"

Transition Model


slide-24
SLIDE 24

Sequential Neural Processes

Meta-Transfer Learning. "We need not learn everything from the current context but only use it to update our prior hypothesis."
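A minimal sketch of this idea (our own illustration, not the paper's implementation): the latent z_t is not inferred from scratch at every step; a transition model carries z_{t-1} forward, and the current context summary only updates that prior belief.

```python
import numpy as np

rng = np.random.default_rng(0)

d_z, d_r = 4, 4
W_trans = rng.normal(size=(d_z + d_r, d_z)) * 0.1

def transition(z_prev, r_t):
    # Prior for z_t given the previous latent state and the aggregated
    # encoding r_t of the current context (a stand-in vector here).
    return np.tanh(np.concatenate([z_prev, r_t]) @ W_trans)

z = np.zeros(d_z)
for t in range(5):
    r_t = rng.normal(size=d_r)   # hypothetical context summary at step t
    z = transition(z, r_t)       # belief is updated, not recomputed
print(z.shape)  # (4,)
```

With no context at a step (r_t empty or zero), the transition alone rolls the belief forward, which is what enables prediction beyond observed time-steps.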

slide-25
SLIDE 25

Inference and Learning

  • We train the model via a variational approximation.
  • This leads to the following ELBO training objective.

A realization of the inference model using a backward RNN.
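For reference, the ELBO referred to above has the usual sequential latent-variable form. The notation below is our paraphrase and may differ in detail from the paper's: D_t and C_t denote the targets and context at step t, and z_t is the temporal latent.

```latex
\log p(D_{1:T} \mid C_{1:T})
\;\ge\; \sum_{t=1}^{T} \mathbb{E}_{q}\Big[
  \log p(D_t \mid C_t, z_t)
  \;-\; \mathrm{KL}\big( q(z_t \mid z_{<t},\, C_{\le t},\, D_{\le t}) \,\big\|\, p(z_t \mid z_{<t},\, C_t) \big)
\Big]
```

The posterior q conditions on the whole sequence, which is why the slide realizes it with a backward RNN.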


slide-28
SLIDE 28

Demonstrations

slide-29
SLIDE 29

Color Cube

Context is shown in the first 5 time-steps, and the remaining time-steps are predicted purely from the action commands given to the object. The actions can be translations (L, R, U, D) or rotations (clockwise, anti-clockwise). 1st time-step without context.

slide-30
SLIDE 30

Color Cube

Context is shown in the first 5 time-steps, and the remaining time-steps are predicted purely from the action commands given to the object. The actions can be translations (L, R, U, D) or rotations (clockwise, anti-clockwise). 10th time-step without context.

slide-31
SLIDE 31

Color Cube

Context is shown in the first 5 time-steps, and the remaining time-steps are predicted purely from the action commands given to the object. The actions can be translations (L, R, U, D) or rotations (clockwise, anti-clockwise). 20th time-step without context, beyond the training time horizon.

slide-32
SLIDE 32

Meta-Transfer Learning


slide-35
SLIDE 35

Comparing against GQN

slide-36
SLIDE 36

Color Shapes : Tracking and Updating

Context is shown intermittently, and we allow the predictions to diverge from the ground truth. On seeing the context, we observe that the belief about the object is updated. Context is being shown.


slide-41
SLIDE 41

Color Shapes : Tracking and Updating

Context is shown intermittently, and we allow the predictions to diverge from the ground truth. On seeing the context, we observe that the belief about the object is updated. Context is removed.


slide-48
SLIDE 48

Color Shapes : Tracking and Updating

Context is shown intermittently, and we allow the predictions to diverge from the ground truth. On seeing the context, we observe that the belief about the object is updated. Context is shown again.


slide-50
SLIDE 50

1D Regression


slide-55
SLIDE 55

Negative Log-Likelihood

slide-56
SLIDE 56

Thank You

Today 10:45 AM - 12:45 PM, East Exhibition Hall B + C, Poster 132

Do visit our poster