CSSE463: Image Recognition Day 18
Upcoming schedule:
Lightning talks shortly Midterm exam Monday Sunset detector due Wednesday
Multilayer feedforward neural nets
Many perceptrons
Organized into layers
Input (sensory) layer
Hidden layer(s): 2 proven sufficient to model any arbitrary function
Output (classification) layer
Powerful: calculates functions of the input and maps them to the output layer
Example: sensory layer (HSV inputs x1, x2, x3) → hidden layer (functions) → classification layer (apple/orange/banana outputs y1, y2, y3). Q4
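The layered structure above can be sketched as a forward pass. This is a Python/NumPy sketch rather than the course's MATLAB demo; the weights are hypothetical (randomly initialized, not trained), and a 3-3-3 architecture is assumed to match the HSV → fruit example.

```python
import numpy as np

def sigmoid(z):
    """Logistic activation used at each perceptron."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical untrained weights for a 3-3-3 net: 3 sensory inputs (H, S, V),
# one hidden layer of 3 units, 3 outputs (apple / orange / banana).
rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(3, 3))   # hidden weights (3 units x 3 inputs)
b_hidden = np.zeros(3)
W_out = rng.normal(size=(3, 3))      # output weights (3 classes x 3 hidden)
b_out = np.zeros(3)

def forward(x):
    """Map an HSV input vector to three class activations."""
    h = sigmoid(W_hidden @ x + b_hidden)   # hidden layer: functions of input
    y = sigmoid(W_out @ h + b_out)         # output layer: classification
    return y

y = forward(np.array([0.1, 0.8, 0.7]))    # some HSV feature vector
```

Each layer is just many perceptrons evaluated in parallel: a matrix-vector product followed by an elementwise nonlinearity.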
2 inputs, 1 hidden layer of 5 neurons, 1 output
Initialize all weights randomly
For each labeled example:
Calculate output using the current network
Update weights across the network, from output to input, using Hebbian learning
Iterate until convergence
Epsilon decreases at every iteration
Matlab does this for you. matlabNeuralNetDemo.m
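The training loop above can be sketched in Python/NumPy (the course demo uses MATLAB). Note one substitution: this sketch updates weights by gradient-descent backpropagation, the standard rule for feedforward nets, rather than a literal Hebbian rule. The XOR data, seed, and decay constant are illustrative choices, and the 2-5-1 architecture matches the demo.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: XOR, which a single perceptron cannot represent.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0.0, 1.0, 1.0, 0.0])

rng = np.random.default_rng(1)
# Demo architecture: 2 inputs, 1 hidden layer of 5 units, 1 output.
W1 = rng.normal(size=(5, 2)); b1 = np.zeros(5)   # initialized randomly
W2 = rng.normal(size=5);      b2 = 0.0

def predict(x):
    return sigmoid(W2 @ sigmoid(W1 @ x + b1) + b2)

err_before = sum((predict(x) - t) ** 2 for x, t in zip(X, T))

eps = 1.0                                    # learning rate ("epsilon")
for _ in range(5000):
    for x, t in zip(X, T):                   # for each labeled example:
        h = sigmoid(W1 @ x + b1)             # output of current network
        y = sigmoid(W2 @ h + b2)
        dy = (y - t) * y * (1 - y)           # output-layer delta
        dh = dy * W2 * h * (1 - h)           # hidden-layer deltas
        W2 -= eps * dy * h;  b2 -= eps * dy  # update from output to input
        W1 -= eps * np.outer(dh, x); b1 -= eps * dh
    eps *= 0.999                             # epsilon decreases each iteration

err_after = sum((predict(x) - t) ** 2 for x, t in zip(X, T))
```

The decaying learning rate makes early updates large (coarse search) and late updates small (fine convergence).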
(Network diagram: inputs x1, x2, x3; outputs y1, y2, y3) Q5
Most networks are reasonably robust with respect to these parameter choices.
However, figuring out how to normalize your input and determine the architecture of your net is a black art. You might need to experiment.
Re-run the network with different initial weights and different architectures.
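The "re-run with different initial weights" advice amounts to random restarts: train several times from different seeds and keep the best run. A minimal Python/NumPy sketch, assuming a tiny backprop-trained net and XOR data purely for illustration:

```python
import numpy as np

def train_once(seed, X, T, epochs=2000):
    """Train a tiny 2-2-1 net from one random initialization;
    return the final sum-of-squares error."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(size=(2, 2)); b1 = np.zeros(2)
    W2 = rng.normal(size=2);      b2 = 0.0
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    eps = 0.5
    for _ in range(epochs):
        for x, t in zip(X, T):
            h = sig(W1 @ x + b1)
            y = sig(W2 @ h + b2)
            dy = (y - t) * y * (1 - y)
            dh = dy * W2 * h * (1 - h)
            W2 -= eps * dy * h;  b2 -= eps * dy
            W1 -= eps * np.outer(dh, x); b1 -= eps * dh
    return sum((sig(W2 @ sig(W1 @ x + b1) + b2) - t) ** 2
               for x, t in zip(X, T))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([0.0, 1.0, 1.0, 0.0])
errors = {seed: train_once(seed, X, T) for seed in range(5)}
best = min(errors, key=errors.get)   # keep the best-performing restart
```

Some seeds land in poor local minima; picking the best of several restarts is a cheap hedge against that.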
Sonka, pp. 404-407.
Laurene Fausett. Fundamentals of Neural Networks. Approachable for the beginner.
C.M. Bishop. Neural Networks for Pattern Recognition. Technical reference focused on the art of constructing networks (learning rate, # of hidden layers, etc.)
Matlab neural net help
SVM: Training can be expensive.
Training can take a long time with large data sets.
But the classification runtime and space are O(sd), where s = # of support vectors and d = data dimension.
In the worst case, s = size of the whole training set (like nearest-neighbor).
But no worse than implementing a neural net with s hidden nodes.
Empirically shown to have good generalizability even with small training sets.
Neural networks: can tune architecture.
Q3
y1 is just the weighted sum of contributions of individual support vectors plus a bias:

y1 = Σ_{i=1}^{numSupVecs} svcoeff_i · K(sv_i, x) + bias,  where K(sv_i, x) = exp(−‖sv_i − x‖² / (2s²))

d = data dimension, e.g., 294; s = kernel width. numSupVecs, svcoeff (alpha), and bias are learned during training. Note: looking at which of your training examples are support vectors can be revealing! (Keep in mind for the sunset detector and term project.)
Much easier computation than training. Could implement on a device without MATLAB (e.g., a …)
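To see how light the classification step is, here is a Python sketch of the weighted sum above, assuming a Gaussian (RBF) kernel with width s. The support vectors, coefficients, and bias are hypothetical stand-ins for values learned during training.

```python
import numpy as np

def rbf_kernel(sv, x, s):
    """Gaussian (RBF) kernel with width s: exp(-||sv - x||^2 / (2 s^2))."""
    return np.exp(-np.sum((sv - x) ** 2) / (2 * s ** 2))

def svm_decision(x, sup_vecs, sv_coeff, bias, s):
    """Weighted sum of support-vector contributions plus bias.
    Runtime is O(s*d): one kernel evaluation per support vector."""
    return sum(a * rbf_kernel(sv, x, s)
               for a, sv in zip(sv_coeff, sup_vecs)) + bias

# Hypothetical learned parameters (numSupVecs = 2, d = 2 for this toy case).
sup_vecs = np.array([[0.0, 0.0], [1.0, 1.0]])
sv_coeff = np.array([1.0, -1.0])   # alphas with class labels folded in
bias = 0.0

score = svm_decision(np.array([0.1, 0.0]), sup_vecs, sv_coeff, bias, s=1.0)
```

The sign of the score gives the predicted class; no optimization happens at test time, which is why this could run on a device without MATLAB.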