CSSE463: Image Recognition Day 17: Bayesian classifiers
SLIDE 1

CSSE463: Image Recognition Day 17

• Today: Bayesian classifiers
• Tomorrow: Lightning talks and exam questions
• Weds night: Lab 4 due
• Thursday: Exam

• Bring your calculator.

• Questions?

SLIDE 2

Exam Thursday

Closed book, notes, computer

BUT you may bring handwritten notes (one side of one sheet of paper)

You may also want a calculator.

Similar to the written assignment in format:

Some questions from daily quizzes

Some extensions of quizzes

Some applications of image-processing algorithms

Some questions asking about the process you followed in lab

Also a few Matlab-specific questions

Write 2-3 lines of code to…

SLIDE 3

Bayesian classifiers

• Use training data
• Assume that you know the probabilities of each feature.
• If 2 classes:
  • Classes ω1 and ω2
  • Say, circles vs. non-circles
  • A single feature, x
  • Both classes equally likely
  • Both types of errors equally bad
• Where should we set the threshold between classes? Here?
• Where in the graph are the 2 types of errors?

[Figure: class-conditional densities P(x|ω1) for non-circles and P(x|ω2) for circles plotted against the feature x; samples beyond the threshold are detected as circles]

Q1-4
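As a sketch of the equal-priors case above (in Python rather than the course's Matlab), suppose the two class-conditional densities are Gaussians with invented means and equal variance. With equal priors, the best threshold falls where the two densities cross, which for equal variances is the midpoint of the means:

```python
import math

# Hypothetical class-conditional densities for the circle example:
# Gaussians with made-up means and equal variance (not from the slides).
def gauss(x, mu, sigma):
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

mu_noncircle, mu_circle, sigma = 2.0, 6.0, 1.0

def classify(x):
    # Equal priors: pick the class whose likelihood p(x|class) is larger.
    return "circle" if gauss(x, mu_circle, sigma) > gauss(x, mu_noncircle, sigma) else "non-circle"

# With equal priors and equal variances the densities cross at the
# midpoint of the means, (2 + 6) / 2 = 4, so that is the threshold.
print(classify(3.9))  # non-circle
print(classify(4.1))  # circle
```

Points misclassified on each side of the threshold are exactly the two error types the slide asks about: non-circles called circles, and circles called non-circles.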

SLIDE 4

What if we have prior information?

• Bayesian probabilities say that if we only expect 10% of the objects to be circles, that should affect our classification

Q5-8
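A hedged sketch of how that 10% prior changes the decision, reusing invented Gaussian likelihoods: the comparison becomes p(x|class) P(class), which moves the boundary toward the circle mean so that fewer points are called circles:

```python
import math

def gauss(x, mu):
    # Unit-variance Gaussian density (illustrative choice, not from the slides).
    return math.exp(-(x - mu) ** 2 / 2) / math.sqrt(2 * math.pi)

# Hypothetical class means; only 10% of objects are expected to be circles.
mu_noncircle, mu_circle = 2.0, 6.0
priors = {"non-circle": 0.9, "circle": 0.1}

def classify(x):
    # Compare posteriors: P(class|x) is proportional to p(x|class) * P(class).
    post_circle = gauss(x, mu_circle) * priors["circle"]
    post_non = gauss(x, mu_noncircle) * priors["non-circle"]
    return "circle" if post_circle > post_non else "non-circle"

# With equal priors the threshold would sit at x = 4; the 10% prior
# shifts it to about x = 4.55, so x = 4.3 is now called non-circle.
print(classify(4.3))  # non-circle
print(classify(5.0))  # circle
```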

SLIDE 5

Bayesian classifier in general

• Bayes rule:

  P(a|b) = P(b|a) P(a) / P(b)

• Verify with an example
• For classifiers:
  • x = feature(s)
  • ωi = class
  • P(ωi|x) = posterior probability
  • P(ωi) = prior
  • P(x) = unconditional probability
• Find the best class by the maximum a posteriori (MAP) principle: find the class ωi that maximizes P(ωi|x).
• The denominator doesn't affect the calculations
• Example:
  • indoor/outdoor classification

  P(ωi|x) = P(x|ωi) P(ωi) / P(x)

  where P(x|ωi) is learned from examples (histograms), P(ωi) is learned from the training set (or left out if unknown), and P(x) is fixed.
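The MAP rule can be sketched with made-up histogram likelihoods (all numbers below are illustrative, not from the course):

```python
# MAP sketch: P(w_i|x) is proportional to p(x|w_i) * P(w_i).
# Likelihoods come from invented histograms over feature bins; the slide
# says these would be learned from training examples.

likelihood = {  # p(x_bin | class), one histogram per class
    "indoor":  [0.50, 0.30, 0.15, 0.05],
    "outdoor": [0.05, 0.15, 0.30, 0.50],
}
prior = {"indoor": 0.6, "outdoor": 0.4}  # invented priors

def map_class(x_bin):
    # Maximize p(x|w_i) * P(w_i); the denominator P(x) is the same for
    # every class, so it can be dropped from the argmax.
    return max(prior, key=lambda w: likelihood[w][x_bin] * prior[w])

print(map_class(0))  # indoor:  0.50*0.6 > 0.05*0.4
print(map_class(3))  # outdoor: 0.05*0.6 < 0.50*0.4
```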

SLIDE 6

Indoor vs. outdoor classification

• I can use low-level image info (color, texture, etc.)

• But there's another source of really helpful info!

SLIDE 7

Camera Metadata Distributions

[Figure: histograms of camera metadata values for indoor (I) vs. outdoor (O) images: flash fired, p(FF|I) and p(FF|O); subject distance, p(SD|I) and p(SD|O); exposure time, p(ET|I) and p(ET|O); scene brightness value, p(BV|I) and p(BV|O)]

SLIDE 8

Why we need Bayes Rule

Problem: We know conditional probabilities like P(flash was on | indoor).

We want conditional probabilities like P(indoor | flash was on, exposure time = 0.017, subject distance = 8 ft, SVM output).

Let ω = the class of the image, and x = all the evidence.

More generally, we know P(x|ω) from the training set (why?), but we want P(ω|x):

P(ωi|x) = P(x|ωi) P(ωi) / P(x)

SLIDE 9

Using Bayes Rule

P(ω|x) = P(x|ω) P(ω) / P(x)

The denominator is constant for a given image, so

P(ω|x) ∝ P(x|ω) P(ω)

Q9

SLIDE 10

Using Bayes Rule

P(ω|x) = P(x|ω) P(ω) / P(x)

The denominator is constant for a given image, so

P(ω|x) ∝ P(x|ω) P(ω)

We have two types of features: image metadata (M) and low-level features, like color (L).

Conditional independence means P(x|ω) = P(M|ω) P(L|ω), so

P(ω|x) ∝ P(M|ω) P(L|ω) P(ω)

where P(M|ω) comes from histograms, P(L|ω) comes from the SVM, and P(ω) is the prior (initial bias).
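A sketch of this combination with invented numbers; the M and L likelihoods below are assumed values standing in for the metadata histograms and the SVM output:

```python
# Conditional-independence combination from the slide:
# P(w|x) is proportional to P(M|w) * P(L|w) * P(w), where M is metadata
# evidence and L is low-level (color/texture) evidence. Numbers invented.

p_meta = {"indoor": 0.7, "outdoor": 0.2}   # P(M|w), e.g. from a flash histogram
p_low  = {"indoor": 0.3, "outdoor": 0.6}   # P(L|w), e.g. from an SVM output
prior  = {"indoor": 0.5, "outdoor": 0.5}   # P(w)

# Multiply the three factors, then normalize so posteriors sum to 1
# (normalizing plays the role of dividing by the constant P(x)).
posterior_unnorm = {w: p_meta[w] * p_low[w] * prior[w] for w in prior}
total = sum(posterior_unnorm.values())
posterior = {w: v / total for w, v in posterior_unnorm.items()}

print(posterior)  # indoor wins: 0.105 vs 0.060 before normalizing
```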

SLIDE 11

Bayesian network

• Efficient way to encode conditional probability distributions and calculate marginals

• Use for classification by putting the classification node at the root

• Examples
  • Indoor-outdoor classification
  • Automatic image orientation detection

SLIDE 12

Indoor vs. outdoor classification

[Diagram: Bayesian network for indoor vs. outdoor classification; nodes include an SVM on color features, an SVM on texture features, KL divergence, and the EXIF header]

Each edge in the graph has an associated matrix of conditional probabilities
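For instance, one such matrix might be P(flash | class); the numbers below are invented, and marginalizing out the class node gives P(flash):

```python
# Hypothetical conditional-probability table for one edge of the network,
# P(flash | class), stored as a matrix indexed by class then flash state.
cpt = {
    "indoor":  {"on": 0.7, "off": 0.3},
    "outdoor": {"on": 0.1, "off": 0.9},
}
prior = {"indoor": 0.5, "outdoor": 0.5}  # invented prior at the root node

# Marginal P(flash = on), summing the class node out of the joint:
p_on = sum(prior[w] * cpt[w]["on"] for w in prior)
print(p_on)  # 0.5*0.7 + 0.5*0.1 = 0.4
```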

SLIDE 13

Effects of Image Capture Context

Recall for a class C is the fraction of C images classified correctly.
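That definition can be written directly (the labels and predictions below are invented):

```python
# Recall for class c: of the examples whose true label is c,
# what fraction did the classifier also label c?
def recall(true_labels, predicted, c):
    hits = sum(t == c and p == c for t, p in zip(true_labels, predicted))
    total = sum(t == c for t in true_labels)
    return hits / total

truth = ["in", "in", "in", "out", "out"]
preds = ["in", "out", "in", "out", "out"]
print(recall(truth, preds, "in"))   # 2 of 3 indoor images found -> 0.666...
print(recall(truth, preds, "out"))  # both outdoor images found -> 1.0
```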

SLIDE 14

Orientation detection

• See IEEE TPAMI paper
  • Hardcopy or posted

• Also uses a single-feature Bayesian classifier (answer to #1-4)

• Keys:
  • 4-class problem (North, South, East, West)
  • Priors really helped here!

• You should be able to understand the two papers (both posted)