Deep Learning Made Easy with GraphLab Create. Piotr Teterwak, Dato. PowerPoint PPT Presentation.


SLIDE 1

Deep Learning Made Easy with GraphLab Create

Piotr Teterwak, Dato Team

SLIDE 2

Who I am

Piotr Teterwak, Software Engineer

SLIDE 3

GraphLab Create

  • A platform for building predictive applications, fast

  • Data engineering on Big Data
  • Interactive visualization
  • Fast machine learning toolkits
  • Easy deployment
  • Python frontend, C++ backend

SLIDE 4

The Dato Team

SLIDE 5

Making Deep Learning Easy

SLIDE 6

Deep Learning

SLIDE 7

Deep Learning Made Easy

  • Intuitive API
  • Transfer Learning
  • Integration with other tools in GraphLab

Create

SLIDE 8

What is Deep Learning?

SLIDE 9

Machine Learning

  • Algorithms that can learn from data without being explicitly programmed.
  • One example is image classification, i.e. binning an image into one of a fixed number of categories.
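Classification as "binning into one of a fixed number of categories" can be sketched in a few lines of numpy. The weights, labels, and pixel vectors here are made-up stand-ins, not anything from the talk:

```python
import numpy as np

# Toy image classifier: score a flattened "image" (pixel vector)
# against each category and pick the highest-scoring one.
rng = np.random.default_rng(0)
labels = ["cat", "dog", "bird"]
W = rng.normal(size=(len(labels), 4))   # one score row per category

def classify(pixels):
    scores = W @ pixels                  # one score per category
    return labels[int(np.argmax(scores))]

print(classify(np.array([0.9, 0.1, 0.4, 0.7])))
```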

SLIDE 10

Machine Learning

“cat”

SLIDE 11

Deep Learning

“cat”

f1(x) f2(x) f3(x)
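The f1(x), f2(x), f3(x) chain on this slide is literally function composition. A minimal numpy sketch, with random weights standing in for trained ones:

```python
import numpy as np

# Deep learning as function composition: the prediction is f3(f2(f1(x))),
# where each layer is a linear map followed by a ReLU nonlinearity.
def make_layer(W):
    return lambda h: np.maximum(0.0, W @ h)

rng = np.random.default_rng(0)
f1, f2, f3 = (make_layer(rng.normal(size=(4, 4))) for _ in range(3))

x = rng.normal(size=4)
y = f3(f2(f1(x)))        # the "deep" prediction
print(y)
```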

SLIDE 12

Deep Neural Networks

P(cat|x)   P(dog|x)

http://deeplearning.stanford.edu/wiki/images/4/40/Network3322.png

SLIDE 13

Deep Neural Networks

  • Can model any function with enough hidden units.
  • This is tremendously powerful: given enough units, it is possible to train a neural network to solve arbitrarily difficult problems.
  • But such networks are also very difficult to train: too many parameters means too much memory and computation time.
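To see why "too many parameters" bites so quickly: a fully connected layer with m inputs and n outputs alone has m*n weights plus n biases. A quick count (the layer sizes below are illustrative, not from the talk):

```python
# Parameter count of a fully connected network: each layer with m
# inputs and n outputs contributes m*n weights plus n biases.
def param_count(layer_sizes):
    return sum(m * n + n for m, n in zip(layer_sizes, layer_sizes[1:]))

# A modest fully connected net on 256x256 RGB images already has
# over 800 million parameters:
print(param_count([256 * 256 * 3, 4096, 4096, 1000]))  # 826188776
```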

SLIDE 14

Neural Nets and GPUs

  • Many operations in neural net training can happen in parallel
  • Training reduces to matrix operations, many of which can be easily parallelized on a GPU
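The point can be seen with plain numpy: a whole batch's forward pass through one layer is a single matrix multiply, which is exactly the kind of dense linear algebra GPU libraries parallelize. Shapes here are arbitrary:

```python
import numpy as np

# One layer's forward pass for a whole batch is a single matrix
# multiply -- the kind of dense linear algebra GPUs excel at.
batch = np.ones((64, 512))                 # 64 examples, 512 features
W = np.full((512, 256), 0.01)              # layer weights
activations = np.maximum(0.0, batch @ W)   # all 64 examples at once
print(activations.shape)                   # (64, 256)
```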

SLIDE 15

Convolutional Neural Nets

  • Strategic removal of edges

Input Layer → Hidden Layer


SLIDE 21

Convolutional Neural Nets

http://ufldl.stanford.edu/wiki/images/6/6c/Convolution_schematic.gif
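The convolution in the schematic can be sketched in a few lines of numpy (valid mode, single channel; the example image and averaging kernel are made up):

```python
import numpy as np

# Valid-mode 2D convolution: slide the kernel over the image and take
# a weighted sum at every position, as in the schematic above.
def conv2d(image, kernel):
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.empty((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)
kernel = np.ones((3, 3)) / 9.0            # 3x3 averaging filter
print(conv2d(image, kernel))              # a 3x3 output map
```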

SLIDE 22

Pooling layer

Ranzato, LSVR tutorial @ CVPR, 2014. www.cs.toronto.edu/~ranzato

SLIDE 23

Pooling layer

http://ufldl.stanford.edu/wiki/images/6/6c/Pooling_schematic.gif
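Pooling is just as small; a 2x2 max-pool over non-overlapping windows, matching the schematic:

```python
import numpy as np

# 2x2 max pooling: downsample a feature map by keeping the maximum
# of each non-overlapping 2x2 block.
def max_pool(fmap, size=2):
    h, w = fmap.shape
    trimmed = fmap[:h - h % size, :w - w % size]
    return trimmed.reshape(h // size, size, w // size, size).max(axis=(1, 3))

fmap = np.arange(16, dtype=float).reshape(4, 4)
print(max_pool(fmap))   # [[ 5.  7.]
                        #  [13. 15.]]
```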

SLIDE 24

Overall architecture

A. Krizhevsky, I. Sutskever and G.E. Hinton. "ImageNet Classification with Deep Convolutional Neural Networks". NIPS (2012)

SLIDE 25

Hierarchical Representation

  • Y. Bengio (2009)

Hands - Face - Ground

SLIDE 26

Input → Learned hierarchy → Deep learning features → Output

Lee et al., "Convolutional Deep Belief Networks for Scalable Unsupervised Learning of Hierarchical Representations", ICML 2009

SLIDE 27

Where can we use Deep Learning?

SLIDE 28

Image tagging

SLIDE 29

A quick demo!

SLIDE 30
Deep learning workflow

Labelled data → Train Set (80%) / Test Set (20%) → Create Model → Validate? → Probably not good enough → Adjust hyper-parameters → (back to Create Model)

  • Notice the cycle… you can only break out of this with intuition, time, and lots of frustration.
  • But, when you do, magic happens!
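The split at the top of that workflow is easy to sketch in plain Python (illustrative only; GraphLab Create had its own split utilities):

```python
import random

# The first step of the workflow: shuffle the labelled data and
# split it 80% / 20% into a train set and a test set.
def split_80_20(rows, seed=0):
    rows = rows[:]                      # shuffle a copy, not the input
    random.Random(seed).shuffle(rows)
    cut = int(0.8 * len(rows))
    return rows[:cut], rows[cut:]

data = list(range(100))
train, test = split_80_20(data)
print(len(train), len(test))            # 80 20
```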

SLIDE 31

Simplifying Deep Learning with Deep Features and Transfer Learning

SLIDE 32

Transfer learning

  • Train a model on one task, use it for another task
  • Examples:
  • Learn to walk, use that knowledge to run
  • Train an image tagger to recognize cars, use that knowledge to recognize trucks

SLIDE 33

Input Learned ¡hierarchy

Lee ¡et ¡al. ¡‘Convolutional ¡Deep ¡Belief ¡Networks ¡for ¡Scalable ¡Unsupervised ¡Learning ¡of ¡Hierarchical ¡Representations’ ¡ICML ¡2009

Deep learning features

Output

SLIDE 34

Feature extraction

Input → Learned hierarchy

Mid-level features are probably useful for other tasks which require detection of facial anatomy.

Lee et al., "Convolutional Deep Belief Networks for Scalable Unsupervised Learning of Hierarchical Representations", ICML 2009

SLIDE 35

Feature extraction

Extract activations from some deep layer of the neural network

http://deeplearning.stanford.edu/wiki/images/4/40/Network3322.png

SLIDE 36

Transfer learning using deep features

Labelled data → Extract features using a neural net trained on a different task → Train Set (80%) / Test Set (20%) → Create Simpler Model → Validate? → Probably works → Deploy → $$$
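What "extract features using a neural net trained on a different task" means can be sketched with a toy numpy net. The real pipeline uses a trained convolutional network; the random weights here are stand-ins:

```python
import numpy as np

# Deep feature extraction, sketched: run the input through every layer
# except the task-specific output layer, and keep those activations.
rng = np.random.default_rng(0)
weights = [rng.normal(size=(8, 8)) for _ in range(3)]   # toy 3-layer net

def deep_features(x, weights):
    h = x
    for W in weights[:-1]:             # stop before the output layer
        h = np.maximum(0.0, W @ h)     # ReLU activations
    return h                           # penultimate-layer activations

feats = deep_features(rng.normal(size=8), weights)
print(feats.shape)                     # (8,)
```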

SLIDE 37

Using an ImageNet-trained network as a general feature extractor

  • Uses the classic AlexNet architecture pioneered by Alex Krizhevsky et al. in "ImageNet Classification with Deep Convolutional Neural Networks"
  • It turns out that a neural network trained on ~1 million images across about 1,000 classes makes a surprisingly general feature extractor
  • First illustrated by Donahue et al. in "DeCAF: A Deep Convolutional Activation Feature for Generic Visual Recognition"

SLIDE 38

Demo

SLIDE 39

Caltech-101

SLIDE 40

Caltech-101

Extract features here

SLIDE 41

Deep Features and Logistic Regression
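The recipe of this slide, sketched end to end in numpy: random vectors stand in for deep features, and a plain gradient-descent logistic regression stands in for GraphLab Create's classifier toolkit:

```python
import numpy as np

# Fit a logistic regression on top of fixed "deep features".
rng = np.random.default_rng(0)
feats = rng.normal(size=(200, 8))                   # stand-in deep features
y = (feats[:, 0] + feats[:, 1] > 0).astype(float)   # toy labels

w = np.zeros(8)
for _ in range(500):                                # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(feats @ w)))
    w -= 0.1 * feats.T @ (p - y) / len(y)

pred = 1.0 / (1.0 + np.exp(-(feats @ w))) > 0.5
acc = float(np.mean(pred == y))
print(acc)
```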

SLIDE 42

What else can we do with Deep Features?

SLIDE 43

Finding similar images
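"Similar images" here means nearest neighbours in deep-feature space. A minimal sketch, with random vectors standing in for real deep features:

```python
import numpy as np

# Find the k most similar "images" by Euclidean distance between
# their deep-feature vectors.
rng = np.random.default_rng(1)
features = rng.normal(size=(10, 8))     # one feature row per image

def most_similar(query_idx, features, k=3):
    dists = np.linalg.norm(features - features[query_idx], axis=1)
    return np.argsort(dists)[1:k + 1]   # drop the query itself (distance 0)

print(most_similar(0, features))        # indices of the 3 nearest images
```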

SLIDE 44

Clustering images

Goldberger et al.

Set of Images
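Clustering a set of images by their deep features can be sketched with a bare-bones k-means; the 2-D points below are stand-ins for feature vectors:

```python
import numpy as np

# Bare-bones k-means over feature vectors: assign each point to its
# nearest centre, then move each centre to the mean of its points.
def kmeans(X, k, iters=10):
    centers = X[:k].astype(float).copy()      # simple initialization
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

X = np.vstack([np.zeros((5, 2)), np.full((5, 2), 10.0)])
labels = kmeans(X, k=2)
print(labels)   # first five points share one cluster, last five the other
```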

SLIDE 45

How general are these Deep Features?

SLIDE 46

Deep Features are Generalizable

SLIDE 47

Thank you!

  • To learn more: http://www.dato.com/learn
  • Play with the demos:
  • Pathways: https://pathways-demo.herokuapp.com/
  • Phototag: http://phototag.herokuapp.com/
  • Contact:
  • piotr@dato.com
  • We are hiring! jobs@dato.com
  • Thank you to Nvidia for designing and providing the hardware to make this possible
  • Please complete the Presenter Evaluation. Your feedback is important!