Neural Networks



  1. Neural Networks Part 1 • Yingyu Liang, yliang@cs.wisc.edu • Computer Sciences Department, University of Wisconsin, Madison • [Based on slides from Jerry Zhu]

  2. Motivation I: learning features • Example task. Experience/data: images with labels (indoor vs. outdoor)

  3. Motivation I: learning features • Features designed by hand for the example task: extract features from the image, then classify it (e.g., as indoor)

  4. Motivation I: learning features • For more complicated tasks, good features are hard to design by hand • We would like to learn the features instead

  5. Motivation II: neuroscience • Inspiration from the human brain • Networks of simple and homogeneous units

  6. Motivation II: neuroscience • Human brain: ~100,000,000,000 neurons • Each neuron receives input from ~1,000 others • Impulses arriving simultaneously are added together ▪ an impulse can either increase or decrease the probability of a nerve pulse firing • If the combined input is sufficiently strong, a nerve pulse is generated • The pulse forms the input to other neurons • The interface between two neurons is called a synapse http://www.bris.ac.uk/synaptic/public/brainbasic.html

  7. Successful applications • Computer vision: object localization (slides from Kaiming He, MSRA)

  8. Successful applications • NLP: question answering (figures from the paper “Ask Me Anything: Dynamic Memory Networks for Natural Language Processing” by Ankit Kumar, Ozan Irsoy, Peter Ondruska, Mohit Iyyer, James Bradbury, Ishaan Gulrajani, and Richard Socher)

  9. Successful applications • Games: AlphaGo

  10. Outline • A single neuron ▪ Linear perceptron ▪ Non-linear perceptron ▪ Learning of a single perceptron ▪ The power of a single perceptron • Neural network: a network of neurons ▪ Layers, hidden units ▪ Learning of neural networks: backpropagation ▪ The power of neural networks ▪ Issues • Everything revolves around gradient descent

  11. Linear perceptron • Input x = (x1, …, xd), weights w = (w1, …, wd), bias b • Output: o = w1x1 + … + wdxd + b = wᵀx + b
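A minimal sketch of this definition in Python (the two-input example values are illustrative, not from the slides):

```python
# Linear perceptron: the output is a weighted sum of the inputs plus a bias.
def linear_perceptron(x, w, b):
    return sum(wi * xi for wi, xi in zip(w, x)) + b

# Illustrative values: o = 0.5*1.0 - 0.3*2.0 + 0.1 = 0.0
print(linear_perceptron([1.0, 2.0], [0.5, -0.3], 0.1))
```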

  12. Learning in linear perceptron • Training data: input-label pairs (x, y) • Squared error on one example: E(w, b) = ½ (y − o)² with o = wᵀx + b • Goal: find w, b that minimize the total error on the training data

  13. Learning in linear perceptron • Minimize the error by gradient descent • Gradient: ∂E/∂wj = −(y − o) xj and ∂E/∂b = −(y − o) • Update with learning rate η: wj ← wj + η (y − o) xj, b ← b + η (y − o)
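A runnable sketch of this update rule (the data, learning rate, and epoch count are illustrative assumptions):

```python
# Stochastic gradient descent for a linear perceptron with squared error.
def train_linear(data, lr=0.1, epochs=200):
    d = len(data[0][0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        for x, y in data:
            o = sum(wi * xi for wi, xi in zip(w, x)) + b  # prediction
            err = y - o                                   # (y - o)
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Illustrative data generated by y = 2*x1 - x2 + 1; training should
# recover w close to (2, -1) and b close to 1.
data = [([1, 0], 3), ([0, 1], 0), ([1, 1], 2), ([0, 0], 1)]
print(train_linear(data))
```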

  14. Visualization of gradient descent [figure]

  15. The (limited) power of linear perceptron • The output o = wᵀx + b is a linear function of the input • A linear perceptron can only represent linear relationships

  16. Non-linear perceptron • Pass the weighted sum through a non-linear activation function • Step (threshold) activation: o = 1 if wᵀx + b > 0, and o = 0 otherwise

  17. Non-linear perceptron • The unit fires when wᵀx > −b, so the bias sets the firing threshold • Now we see the reason for bias terms

  18. Non-linear perceptron for AND • Binary inputs x1, x2; choose, e.g., w1 = w2 = 1 and b = −1.5 • Then wᵀx + b > 0 only when x1 = x2 = 1, so the unit computes AND
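A quick check of this construction (weight values as chosen above):

```python
# AND via a step-activation perceptron with w1 = w2 = 1, b = -1.5.
def perceptron(x, w, b):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, perceptron([x1, x2], [1, 1], -1.5))  # fires only on (1, 1)
```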

  19. Example Question • Will you go to the festival? • Go only if at least two conditions are favorable • Inputs to the perceptron: Weather, Company, Proximity • All inputs are binary; 1 is favorable (a sketch follows slide 20)

  20. Example Question • Will you go to the festival? • Go only if Weather is favorable and at least one of the other two conditions is favorable • Inputs to the perceptron: Weather, Company, Proximity • All inputs are binary; 1 is favorable (one possible answer is sketched below)
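One possible set of weights for each question, as a runnable sketch (these particular values are an assumption; any weights with the same threshold behavior work):

```python
# Step-activation perceptron over binary inputs.
def perceptron(x, w, b):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

inputs = [(wt, co, pr) for wt in (0, 1) for co in (0, 1) for pr in (0, 1)]

# Slide 19: go if at least two of (weather, company, proximity) are favorable.
# Equal weights with a threshold between 1 and 2 (here b = -1.5) implement it.
for x in inputs:
    assert perceptron(x, [1, 1, 1], -1.5) == (1 if sum(x) >= 2 else 0)

# Slide 20: go if weather is favorable AND at least one of the other two is.
# A larger weight on weather makes it a necessary condition.
for wt, co, pr in inputs:
    assert perceptron((wt, co, pr), [2, 1, 1], -2.5) == (1 if wt and (co or pr) else 0)

print("both example questions are solved by a single perceptron")
```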

  21. Sigmoid activation function: our second non-linear perceptron • Replace the step with the sigmoid σ(z) = 1 / (1 + e^(−z)) • Output: o = σ(wᵀx + b)

  22. Sigmoid activation function: our second non-linear perceptron • σ is smooth and differentiable, with values in (0, 1) • Useful identity: σ′(z) = σ(z)(1 − σ(z))

  23. Sigmoid activation function: our second non-linear perceptron • The sigmoid is a soft version of the step function • Large positive wᵀx + b gives o ≈ 1; large negative gives o ≈ 0
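The sigmoid and its derivative identity as a short sketch:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)  # sigma'(z) = sigma(z) * (1 - sigma(z))

print(sigmoid(0.0))        # 0.5: the soft step is centered at z = 0
print(sigmoid(6.0))        # ~0.998: saturates toward 1
print(sigmoid_prime(0.0))  # 0.25: steepest slope at the center
```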

  24. Learning in non-linear perceptron • Same idea: gradient descent on E = ½ (y − o)², now with o = σ(wᵀx + b) • Chain rule: ∂E/∂wj = −(y − o) o (1 − o) xj • Update: wj ← wj + η (y − o) o (1 − o) xj
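A sketch of this update for a single sigmoid unit, trained here on AND (the data and hyperparameters are illustrative):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Gradient descent for one sigmoid unit with squared error.
def train_sigmoid(data, lr=0.5, epochs=5000):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            o = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            g = (y - o) * o * (1 - o)  # -dE/dz for E = 0.5 * (y - o)^2
            w = [wi + lr * g * xi for wi, xi in zip(w, x)]
            b += lr * g
    return w, b

# AND is linearly separable, so a single unit can learn it.
and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_sigmoid(and_data)
for x, y in and_data:
    o = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
    print(x, y, round(o, 2))
```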

  25. The (limited) power of non-linear perceptron • Even with a non-linear sigmoid function, the decision boundary a perceptron can produce is still linear • AND, OR, NOT revisited • How about XOR?

  26. The (limited) power of non-linear perceptron • Even with a non-linear sigmoid function, the decision boundary a perceptron can produce is still linear • AND, OR, NOT revisited • How about XOR? XOR is not linearly separable, so no single perceptron can compute it • This contributed to the first AI winter

  27. Brief history of neural networks

  28. (Multi-layer) neural network • Given sigmoid perceptrons as building blocks • Can you produce an output with a non-linear decision boundary? [figure: XOR-style pattern of 0s and 1s that no single line can separate]
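One way to answer yes: stack two layers of sigmoid units. The hand-picked weights below (my assumption, not from the slides) wire up XOR as (x1 OR x2) AND NOT (x1 AND x2):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def unit(x, w, b):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Two hidden units feed one output unit; large weights make each sigmoid
# behave almost like a step function.
def xor_net(x1, x2):
    h_or = unit([x1, x2], [10, 10], -5)        # ~ x1 OR x2
    h_and = unit([x1, x2], [10, 10], -15)      # ~ x1 AND x2
    return unit([h_or, h_and], [10, -10], -5)  # ~ OR AND (NOT AND) = XOR

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(xor_net(x1, x2)))  # 0, 1, 1, 0: non-linear boundary
```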
