CSSE463: Image Recognition Day 31: Bayesian classifiers


Slide 1

CSSE463: Image Recognition Day 31

- Today: Bayesian classifiers
- Questions?

Slide 2

Bayesian classifiers

- Use training data
- Assume that you know the probabilities of each feature
- If 2 classes:
  - Classes w1 and w2; say, circles vs. non-circles
  - A single feature, x
  - Both classes equally likely
  - Both types of errors equally bad
- Where should we set the threshold between classes? Here?
- Where in the graph are the 2 types of errors?

[Figure: class-conditional densities p(x|w1) for non-circles and p(x|w2) for circles over feature x, with a threshold marking the region detected as circles.] Q1-4
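To make the threshold question concrete, here is a small sketch (not from the slides) assuming Gaussian class-conditional densities with made-up parameters; with equal priors and equal error costs, the best threshold sits where the two densities cross:

```python
from scipy.stats import norm
from scipy.optimize import brentq

# Illustrative class-conditional densities; the Gaussian shapes and
# parameters are assumptions, not values from the lecture.
w1 = norm(loc=0.35, scale=0.12)   # p(x|w1), non-circles
w2 = norm(loc=0.65, scale=0.10)   # p(x|w2), circles

# With equal priors and costs, the minimum-error threshold is where
# p(x|w1) = p(x|w2); the densities cross exactly once between the means.
t = brentq(lambda x: w1.pdf(x) - w2.pdf(x), w1.mean(), w2.mean())
print(f"threshold x* = {t:.3f}")

# The two error types visible in the graph:
false_alarms = w1.sf(t)    # non-circles with x > t, wrongly detected as circles
misses       = w2.cdf(t)   # circles with x < t, missed
print(f"false alarms: {false_alarms:.3f}, misses: {misses:.3f}")
```

The two printed tail masses are exactly the two error types the slide asks you to locate in the graph.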

Slide 3

What if we have prior information?

- Bayesian probabilities say that if we only expect 10% of the objects to be circles, that should affect our classification
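Concretely, the decision rule now compares P(x|w)P(w) instead of P(x|w) alone, so with P(circle) = 0.1 and P(non-circle) = 0.9 we choose "circle" only when

$$\frac{p(x \mid \text{circle})}{p(x \mid \text{non-circle})} > \frac{P(\text{non-circle})}{P(\text{circle})} = \frac{0.9}{0.1} = 9,$$

which moves the threshold from the crossing point into the circle density's territory and shrinks the region detected as circles.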

Q5-8

Slide 4

Bayesian classifier in general

- Bayes rule:

  $$P(a \mid b) = \frac{P(b \mid a)\,P(a)}{P(b)}$$

- Verify with an example
- For classifiers:

  $$P(w_i \mid x) = \frac{P(x \mid w_i)\,P(w_i)}{P(x)}$$

  - x = feature(s)
  - wi = class
  - P(w|x) = posterior probability
  - P(w) = prior
  - P(x) = unconditional probability
- Find the best class by the maximum a posteriori (MAP) principle: find the class i that maximizes P(wi|x).
- The denominator doesn't affect the calculations
- Example: indoor/outdoor classification

In the classifier form, P(x|wi) is learned from examples (a histogram), P(wi) is learned from the training set (or left out if unknown), and P(x) is fixed.
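Putting the pieces together, here is a minimal MAP sketch in Python (not the lecture's code), assuming histogram likelihoods and the 10% circle prior from the earlier slide; the sample data, bins, and class names are all illustrative:

```python
import numpy as np

bins = np.linspace(0.0, 1.0, 11)   # 10 equal-width histogram bins over [0, 1]

def histogram_likelihood(samples):
    """Estimate P(x|w) as a normalized histogram over the fixed bins."""
    counts, _ = np.histogram(samples, bins=bins)
    return counts / counts.sum()

# Fake training data for each class (illustrative assumption).
rng = np.random.default_rng(0)
p_x_given_w = {
    "circle":     histogram_likelihood(rng.normal(0.65, 0.10, 500).clip(0, 1)),
    "non-circle": histogram_likelihood(rng.normal(0.35, 0.12, 500).clip(0, 1)),
}
p_w = {"circle": 0.1, "non-circle": 0.9}   # the 10% prior from the earlier slide

def map_classify(x):
    """Pick the class maximizing P(x|w) P(w); the denominator P(x) is dropped."""
    b = min(int(np.digitize(x, bins)) - 1, len(bins) - 2)
    return max(p_w, key=lambda w: p_x_given_w[w][b] * p_w[w])

print(map_classify(0.75))   # strong likelihood ratio overrides the small prior
print(map_classify(0.50))   # the 9:1 prior flips this borderline bin to "non-circle"
```

Because P(x) is the same for every class, the argmax over P(x|w)P(w) gives the same answer as the full posterior.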

Slide 5

Indoor vs. outdoor classification

- I can use low-level image info (color, texture, etc.)
- But there's another source of really helpful info!

Slide 6

Camera Metadata Distributions

[Figure: histograms of camera metadata for indoor (I) vs. outdoor (O) images: flash fired, p(FF|I) vs. p(FF|O), over on/off; subject distance, p(SD|I) vs. p(SD|O); exposure time, p(ET|I) vs. p(ET|O); and scene brightness, p(BV|I) vs. p(BV|O).]

Slide 7

Why we need Bayes Rule

Problem: We know conditional probabilities like P(flash was on | indoor). We want conditional probabilities like P(indoor | flash was on, exposure time = 0.017, subject distance = 8 ft, SVM output).

Let w = class of image, and x = all the evidence. More generally, we know P(x|w) from the training set (why?), but we want P(w|x):

$$P(w_i \mid x) = \frac{P(x \mid w_i)\,P(w_i)}{P(x)}$$

Slide 8

Using Bayes Rule

$$P(w \mid x) = \frac{P(x \mid w)\,P(w)}{P(x)}$$

The denominator is constant for an image, so

$$P(w \mid x) = \alpha\,P(x \mid w)\,P(w)$$

Q9

Slide 9

Using Bayes Rule

$$P(w \mid x) = \frac{P(x \mid w)\,P(w)}{P(x)}$$

The denominator is constant for an image, so

$$P(w \mid x) = \alpha\,P(x \mid w)\,P(w)$$

We have two types of features: from image metadata (M) and from low-level features, like color (L). Conditional independence means P(x|w) = P(M|w)P(L|w), so

$$P(w \mid x) = \alpha\,P(M \mid w)\,P(L \mid w)\,P(w)$$

Here P(M|w) comes from the histograms, P(L|w) from the SVM, and P(w) from the priors (initial bias).
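As a small sketch of this fusion step (hypothetical numbers, not from the lecture), the placeholder values below stand in for the metadata histograms, the SVM output, and the priors:

```python
# Evidence fusion under conditional independence:
# P(w|x) = alpha * P(M|w) * P(L|w) * P(w).
classes = ("indoor", "outdoor")
prior      = {"indoor": 0.5, "outdoor": 0.5}    # P(w), assumed
p_meta     = {"indoor": 0.70, "outdoor": 0.20}  # P(M|w): e.g., flash on is common indoors
p_lowlevel = {"indoor": 0.40, "outdoor": 0.55}  # P(L|w): hypothetical SVM-derived value

# Unnormalized posteriors; alpha = 1 / P(x) just rescales them.
unnorm = {w: p_meta[w] * p_lowlevel[w] * prior[w] for w in classes}
alpha = 1.0 / sum(unnorm.values())
posterior = {w: alpha * unnorm[w] for w in classes}

print(posterior)                             # indoor comes out near 0.72 here
print(max(posterior, key=posterior.get))     # MAP decision: "indoor"
```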

Slide 10

Bayesian network

- An efficient way to encode conditional probability distributions and calculate marginals
- Use for classification by having the classification node at the root
- Examples:
  - Indoor-outdoor classification
  - Automatic image orientation detection

Slide 11

Indoor vs. outdoor classification

[Diagram: Bayesian network with nodes labeled SVM, KL Divergence, Color Features, SVM, Texture Features, and EXIF header.]

Each edge in the graph has an associated matrix of conditional probabilities
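As a tiny illustration of such an edge matrix (assumed structure and numbers, not the paper's actual network), take a single edge class → flash, whose matrix is P(flash | class):

```python
import numpy as np

p_class = np.array([0.5, 0.5])   # P(class): [indoor, outdoor], assumed priors

# CPT on the edge class -> flash: rows are class states, columns are
# flash states [on, off]; each row sums to 1.
p_flash_given_class = np.array([
    [0.7, 0.3],   # P(flash | indoor), made-up
    [0.1, 0.9],   # P(flash | outdoor), made-up
])

# Marginal of the child node: P(flash) = sum_c P(flash|c) P(c)
p_flash = p_class @ p_flash_given_class
print(p_flash)        # [0.4, 0.6]

# Posterior at the root given evidence flash=on (column 0), via Bayes rule:
post = p_flash_given_class[:, 0] * p_class / p_flash[0]
print(post)           # [0.875, 0.125] -> indoor is the MAP class
```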

Slide 12

Effects of Image Capture Context

Recall for a class C is the fraction of C classified correctly.

Slide 13

Orientation detection

- See the IEEE TPAMI paper
  - Hardcopy or posted
  - Also uses a single-feature Bayesian classifier (answer to #1-4)
- Keys:
  - 4-class problem (North, South, East, West)
  - Priors really helped here!
- You should be able to understand the two papers (both posted)