

SLIDE 1

AMMI – Introduction to Deep Learning 1.3. What is really happening?

François Fleuret https://fleuret.org/ammi-2018/ Wed Aug 29 16:56:56 CAT 2018

ÉCOLE POLYTECHNIQUE FÉDÉRALE DE LAUSANNE

SLIDE 2

(Zeiler and Fergus, 2014)

François Fleuret AMMI – Introduction to Deep Learning / 1.3. What is really happening? 1 / 13

SLIDE 3

(Zeiler and Fergus, 2014)


SLIDE 4

(Google’s Deep Dreams)


SLIDE 5

(Google’s Deep Dreams)
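Deep Dream images are produced by gradient ascent on the input image itself, so as to amplify the activations of a chosen layer. A minimal numpy sketch of that loop, using a single hand-written 3×3 filter in place of a trained network (the filter, sizes, and step size are all illustrative, not Google's actual method):

```python
import numpy as np

def conv2d(x, w):
    """Valid cross-correlation of image x with filter w."""
    H, W = x.shape
    k = w.shape[0]
    out = np.zeros((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + k, j:j + k] * w)
    return out

def objective(x, w):
    """Total ReLU-gated filter response, the quantity being amplified."""
    return np.maximum(conv2d(x, w), 0.0).sum()

def dream_step(x, w, lr=0.1):
    """One gradient-ascent step on the input image."""
    mask = (conv2d(x, w) > 0).astype(x.dtype)  # ReLU gate
    g = np.zeros_like(x)
    k = w.shape[0]
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            g[i:i + k, j:j + k] += mask[i, j] * w  # chain rule through the conv
    return x + lr * g

rng = np.random.default_rng(0)
w = np.array([[1.0, 0.0, -1.0]] * 3)   # a hand-written vertical-edge filter
x = 0.1 * rng.normal(size=(8, 8))      # "image" initialized with faint noise

before = objective(x, w)
for _ in range(20):
    x = dream_step(x, w)
after = objective(x, w)
# The filter's response grows step after step; the image drifts toward
# whatever pattern excites the filter most.
```

With a real network the gradient comes from backpropagation through many layers, and the ascent is typically run at several image scales, which is what produces the hallucinated textures on these slides.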


SLIDE 6

(Thorne Brandt)


SLIDE 7

(Duncan Nicoll)


SLIDE 8

(Szegedy et al., 2014) (Nguyen et al., 2015)
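Szegedy et al. (2014) showed that imperceptible perturbations can flip a network's prediction. The effect already appears for a purely linear model, which is the intuition behind the fast gradient sign method of Goodfellow et al. (a later paper, not cited on these slides). A minimal numpy sketch, with made-up weights and input:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A fixed linear "classifier" p(y = 1 | x) = sigmoid(w . x); the weights and
# the input below are made up purely for illustration.
rng = np.random.default_rng(0)
w = rng.normal(size=100)

# An input the model classifies confidently as class 1.
x = 0.05 * np.sign(w)
logit_before = w @ x

# For y = 1, the gradient of the cross-entropy loss w.r.t. x is
# -(1 - sigmoid(w . x)) * w, so its sign is -sign(w): step along it.
eps = 0.1
x_adv = x - eps * np.sign(w)
logit_after = w @ x_adv

# Confidence in class 1 flips from ~1 to ~0 although no single coordinate
# of the input moved by more than eps.
print(sigmoid(logit_before), sigmoid(logit_after))
```

On a deep network the gradient is obtained by backpropagation, but the mechanism is the same: many tiny coordinate moves, each aligned with the gradient, add up to a large change in the output.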


SLIDE 9

Relations with biology


SLIDE 10

[Figure 1: HCNNs as models of sensory cortex. (a) The encoding/decoding framework: stimulus, then RGC, LGN, V1, V2, V4, PIT, CIT, AIT, then behavior, over a 100-ms visual presentation. (b) Operations in a linear-nonlinear (LN) layer: filter, threshold, pool, normalize, applied as spatial convolutions. Caption: the basic framework in which sensory cortex is studied is one of encoding, the process by which stimuli are transformed into patterns of neural activity, and decoding, the process by which neural activity generates behavior; HCNNs have been used to make models of the encoding step, that is, they describe …]

(Yamins and DiCarlo, 2016)


SLIDE 11

[Figure, panels a to e: HCNNs predict neural responses. (a) Categorization performance (balanced accuracy) vs. IT single-site neural predictivity (% explained variance) for pixels, V1-like, SIFT, PLOS09, HMAX, V2-like, the HMO top hidden layer, and a category ideal observer. (b) HCNN top-hidden-layer response predictions vs. IT neural responses on test images sorted by category, IT site 56, r = 0.87 ± 0.15 across HCNN models. (c) Single-site neural predictivity for monkey V4 (n = 128) and monkey IT (n = 168) across control models, ideal observers, and HCNN layers. (d) Representational dissimilarity of the HCNN model vs. human IT (fMRI) over animate/inanimate categories. (e) RDM voxel correlation (Kendall's τ) with human V1-V3 and human IT across layers 1 to 7, convolutional and fully connected.]

(Yamins and DiCarlo, 2016)


SLIDE 12

Species      Nb. neurons    Nb. synapses
Roundworm    302            7.5 × 10^3
Jellyfish    800
Sea slug     1.8 × 10^4
Fruit fly    1.0 × 10^5     1.0 × 10^7
Ant          2.5 × 10^5
Cockroach    1.0 × 10^6
Frog         1.6 × 10^7
Mouse        7.1 × 10^7     1.0 × 10^11
Rat          2.0 × 10^8     4.5 × 10^11
Octopus      3.0 × 10^8
Human        8.6 × 10^10    1.0 × 10^15

(Wikipedia “List of animals by number of neurons”)


SLIDE 13

Device                                Nb. transistors
Intel i7 Haswell-E (8 cores)          2.6 × 10^9
Intel Xeon Broadwell-E5 (22 cores)    7.2 × 10^9
AMD Epyc (32 cores)                   19.2 × 10^9
Nvidia GeForce GTX 1080               7.2 × 10^9
AMD Vega 10                           12.5 × 10^9
NVidia GV100                          21.1 × 10^9

(Wikipedia “Transistor count”)


SLIDE 14

[Plot: number of transistors per CPU/GPU over time, 1960 to 2020, log scale from 10^3 to 10^18, with CPUs and GPUs shown separately and horizontal reference lines at the number of fruit fly, mouse, and human synapses.]

(Wikipedia “Transistor count”)
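The plot puts the synapse and transistor counts from the previous two slides on a common axis. A back-of-the-envelope version of the same comparison (counts copied from the tables; a synapse is of course not functionally equivalent to a transistor):

```python
# Counts copied from the previous two slides (Wikipedia figures).
synapses = {
    "fruit fly": 1.0e7,
    "mouse":     1.0e11,
    "human":     1.0e15,
}
gv100_transistors = 21.1e9  # NVidia GV100, the largest count in the table

for species, n in synapses.items():
    print(f"{species}: {n / gv100_transistors:.2g} x GV100 transistor count")
```

A single GV100 already has far more transistors than a fruit fly has synapses, sits within a factor of five of a mouse, and is still about four orders of magnitude short of a human.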


SLIDE 15

The end

SLIDE 16

References

  • A. M. Nguyen, J. Yosinski, and J. Clune. Deep neural networks are easily fooled: High confidence predictions for unrecognizable images. In Conference on Computer Vision and Pattern Recognition (CVPR), 2015.
  • C. Szegedy, W. Zaremba, I. Sutskever, J. Bruna, D. Erhan, I. Goodfellow, and R. Fergus. Intriguing properties of neural networks. In International Conference on Learning Representations (ICLR), 2014.
  • D. L. K. Yamins and J. J. DiCarlo. Using goal-driven deep learning models to understand sensory cortex. Nature Neuroscience, 19:356–365, Feb 2016.
  • M. D. Zeiler and R. Fergus. Visualizing and understanding convolutional networks. In European Conference on Computer Vision (ECCV), 2014.