Neural Networks Part 1 (Yingyu Liang, yliang@cs.wisc.edu, Computer Sciences Department, University of Wisconsin-Madison)


slide 1

Neural Networks

Part 1

Yingyu Liang yliang@cs.wisc.edu Computer Sciences Department University of Wisconsin, Madison

[Based on slides from Jerry Zhu]

slide 2

Motivation I: learning features

  • Example task: classify an image as indoor or outdoor
  • Experience/Data: images with labels (indoor / outdoor)

slide 3

Motivation I: learning features

  • Features designed by hand for the example task

(Figure omitted: extract features from the image, then predict the label "Indoor".)

slide 4

Motivation I: learning features

  • More complicated tasks: features are hard to design by hand
  • Would like to learn features
slide 5

Motivation II: neuroscience

  • Inspirations from human brains
  • Networks of simple and homogenous units
slide 7

Motivation II: neuroscience

  • Human brain: about 100,000,000,000 neurons
  • Each neuron receives input from about 1,000 others
  • Impulses that arrive simultaneously are added together

▪ an impulse can either increase or decrease the probability of nerve pulse firing

  • If the sum is sufficiently strong, a nerve pulse is generated
  • The pulse forms the input to other neurons
  • The interface between two neurons is called a synapse

http://www.bris.ac.uk/synaptic/public/brainbasic.html

slide 8

Successful applications

  • Computer vision: object localization

Slides from Kaiming He, MSRA

slide 9

Successful applications

  • NLP: Question & Answer

Figures from the paper “Ask Me Anything: Dynamic Memory Networks for Natural Language Processing”, by Ankit Kumar, Ozan Irsoy, Peter Ondruska, Mohit Iyyer, James Bradbury, Ishaan Gulrajani, and Richard Socher

slide 10

Successful applications

  • Game: AlphaGo
slide 11

Outline

  • A single neuron

▪ Linear perceptron
▪ Non-linear perceptron
▪ Learning of a single perceptron
▪ The power of a single perceptron

  • Neural network: a network of neurons

▪ Layers, hidden units
▪ Learning of neural network: backpropagation
▪ The power of neural network
▪ Issues

  • Everything revolves around gradient descent
slide 13

Linear perceptron

(Diagram omitted: a linear perceptron; the constant input 1 feeds the bias weight.)
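The slide's formula is not reproduced in this transcript, but a linear perceptron simply outputs a weighted sum of its inputs plus a bias. A minimal sketch (my own illustration with hypothetical weights, not taken from the slides):

```python
import numpy as np

def linear_perceptron(x, w, b):
    """Linear perceptron: output the weighted sum w·x plus the bias b."""
    return np.dot(w, x) + b

# Hypothetical weights for a two-input unit
w = np.array([0.5, -0.3])
b = 0.1
out = linear_perceptron(np.array([1.0, 2.0]), w, b)  # 0.5*1 - 0.3*2 + 0.1 = 0.0
```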

slide 14

Learning in linear perceptron

slide 15

Learning in linear perceptron
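The slide's derivation is not preserved here, but the standard recipe is gradient descent on the squared error between the perceptron's output and the label. A sketch of that update (my own illustration; the data and learning rate are made up):

```python
import numpy as np

def train_linear(X, y, lr=0.05, epochs=200):
    """Stochastic gradient descent on squared error E = 0.5*(a - y)^2
    for a linear perceptron a = w·x + b (illustrative sketch)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            err = np.dot(w, xi) + b - yi   # dE/da, with a = w·x + b
            w -= lr * err * xi             # chain rule: dE/dw = err * x
            b -= lr * err                  # dE/db = err
    return w, b

# Fit the hypothetical data y = 2x; the learned w should approach 2, b approach 0
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 2.0, 4.0, 6.0])
w, b = train_linear(X, y)
```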

slide 16

Visualization of gradient descent
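The slide's picture is not preserved, but the idea it visualizes can be reproduced numerically: each gradient step moves the weight downhill toward the minimum of the error surface. A tiny one-dimensional sketch (my own example function, not from the slides):

```python
def gradient_descent(grad, w0, lr=0.1, steps=50):
    """Repeatedly step opposite the gradient; return the whole trajectory."""
    path = [w0]
    for _ in range(steps):
        path.append(path[-1] - lr * grad(path[-1]))
    return path

# Minimize E(w) = (w - 3)^2, whose gradient is 2*(w - 3)
path = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
# the trajectory slides monotonically from 0 toward the minimum at w = 3
```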

slide 17

The (limited) power of linear perceptron

slide 18

Non-linear perceptron


slide 19

Non-linear perceptron

Now we see the reason for bias terms: the constant input 1 in the diagram carries the bias weight, which shifts the unit's activation threshold.

slide 20

Non-linear perceptron for AND
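The slide's figure is not preserved, but a threshold perceptron computing AND is easy to write down. One possible weight choice (my own illustration; the slides may use different values):

```python
def step(z):
    """Threshold (step) activation: fire iff the weighted sum is positive."""
    return 1 if z > 0 else 0

def perceptron_and(x1, x2):
    """With w1 = w2 = 1 and bias -1.5, the weighted sum exceeds 0
    only when both binary inputs are 1."""
    return step(1.0 * x1 + 1.0 * x2 - 1.5)
```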

slide 21

Example Question

(Diagram omitted: a perceptron with three binary inputs: Weather, Company, Proximity.)

  • Will you go to the festival?
  • Go only if at least two conditions are favorable

All inputs are binary; 1 is favorable
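One way to answer the question with a single threshold unit (my own weight choice, not necessarily the slide's): give each condition equal weight and set the threshold between 1 and 2.

```python
def go_to_festival(weather, company, proximity):
    """Equal weights with threshold 1.5 implement
    'go if at least two of the three binary conditions are favorable'."""
    return 1 if 1.0 * weather + 1.0 * company + 1.0 * proximity - 1.5 > 0 else 0
```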

slide 22

Example Question

(Diagram omitted: a perceptron with three binary inputs: Weather, Company, Proximity.)

  • Will you go to the festival?
  • Go only if Weather is favorable and at least one of the other two conditions is favorable

All inputs are binary; 1 is favorable
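This variant can also be solved by a single threshold unit if Weather gets a larger weight (my own weight choice, not necessarily the slide's):

```python
def go_to_festival(weather, company, proximity):
    """Doubling Weather's weight (2 vs. 1) with threshold 2.5 implements
    'Weather must be favorable, plus at least one of the other two':
    the sum can exceed 2.5 only when weather = 1 and company + proximity >= 1."""
    return 1 if 2.0 * weather + 1.0 * company + 1.0 * proximity - 2.5 > 0 else 0
```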

slide 23

Sigmoid activation function: Our second non-linear perceptron

slide-22
SLIDE 22

slide 24

Sigmoid activation function: Our second non-linear perceptron

slide 25

Sigmoid activation function: Our second non-linear perceptron
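The formulas on these slides are not preserved, but the sigmoid itself is standard: a smooth, differentiable replacement for the hard threshold.

```python
import math

def sigmoid(z):
    """Sigmoid activation: squashes any real z smoothly into (0, 1).
    sigmoid(0) = 0.5; large positive z gives values near 1, large negative near 0."""
    return 1.0 / (1.0 + math.exp(-z))
```

Unlike the step function, the sigmoid has a well-defined gradient everywhere, which is what makes gradient-descent learning possible on the next slides.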

slide 26

Learning in non-linear perceptron
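The slide's derivation is not preserved; the standard result is that the squared-error gradient picks up an extra a(1-a) factor from the sigmoid's derivative. A one-input sketch of that update (my own illustration):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sgd_step(w, b, x, y, lr=0.5):
    """One gradient step on E = 0.5*(a - y)^2 with a = sigmoid(w*x + b).
    By the chain rule dE/dw = (a - y) * a * (1 - a) * x; the extra
    a*(1 - a) factor is the derivative of the sigmoid itself."""
    a = sigmoid(w * x + b)
    delta = (a - y) * a * (1 - a)
    return w - lr * delta * x, b - lr * delta

# Repeated steps on a single example (x=1, y=1) push the output toward 1
w, b = 0.0, 0.0
for _ in range(2000):
    w, b = sgd_step(w, b, x=1.0, y=1.0)
```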

slide 27

The (limited) power of non-linear perceptron

  • Even with a non-linear sigmoid function, the decision boundary a perceptron can produce is still linear

  • AND, OR, NOT revisited
  • How about XOR?
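One way to convince yourself (my own illustration, not from the slides): brute-force a small grid of weights and check which truth tables a single threshold unit can reproduce. AND and OR show up; XOR never does, and in fact no real-valued weights work for XOR at all.

```python
from itertools import product

def separable(labels):
    """Brute-force search: does ANY unit w1*x1 + w2*x2 + b > 0 over a small
    weight grid reproduce the given labels on the four binary inputs?"""
    grid = [-2, -1, -0.5, 0.5, 1, 2]
    inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
    return any(
        all((w1 * x1 + w2 * x2 + b > 0) == bool(y)
            for (x1, x2), y in zip(inputs, labels))
        for w1, w2, b in product(grid, grid, grid)
    )

# AND ([0,0,0,1]) and OR ([0,1,1,1]) are linearly separable; XOR ([0,1,1,0]) is not
```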
slide 28

The (limited) power of non-linear perceptron

  • Even with a non-linear sigmoid function, the decision boundary a perceptron can produce is still linear

  • AND, OR, NOT revisited
  • How about XOR?
  • This contributed to the first AI winter
slide 29

Brief history of neural networks

slide 30

(Multi-layer) neural network

  • Given sigmoid perceptrons as building blocks
  • Can you produce an output like 0 1 0 1 0 along an input direction?
  • Such an output has a non-linear decision boundary
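A concrete answer to the slide's question: stacking sigmoid units does produce non-linear boundaries. The classic example is XOR, which a single perceptron cannot compute but a two-layer network can. The weights below are hand-picked for illustration (not from the slides):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def xor_net(x1, x2):
    """Two sigmoid hidden units (one ≈OR, one ≈AND) feed a sigmoid output
    computing roughly 'OR and not AND', i.e. XOR. The weights are large
    so each sigmoid saturates near 0 or 1."""
    h_or = sigmoid(20 * x1 + 20 * x2 - 10)    # ≈ x1 OR x2
    h_and = sigmoid(20 * x1 + 20 * x2 - 30)   # ≈ x1 AND x2
    return sigmoid(20 * h_or - 20 * h_and - 10)
```

The hidden layer carves the input space with two linear boundaries, and the output unit combines them, which is exactly the extra power the next slides develop.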