
CSSE463: Image Recognition Day 17. Today: Bayesian classifiers



  1. CSSE463: Image Recognition Day 17
     - Today: Bayesian classifiers
     - Tomorrow: Lightning talks and exam questions
     - Wednesday night: Lab 4 due
     - Thursday: Exam. Bring your calculator.
     - Questions?

  2. Exam Thursday
     - Closed book, notes, computer, BUT you may bring handwritten notes (one side of a sheet of paper)
     - You may also want a calculator.
     - Similar to the written assignment in format:
       - Some questions from daily quizzes
       - Some extensions of quizzes
       - Some applications of image-processing algorithms
       - Some questions asking about the process you followed in lab
     - Also a few Matlab-specific questions: "Write 2-3 lines of code to..."

  3. Bayesian classifiers
     - Two classes, ω1 and ω2: say, circles vs. non-circles
     - A single feature, x
     - Assume that you know the probabilities of the feature under each class, P(x|ω1) and P(x|ω2) (use training data to estimate them)
     - Both classes equally likely; both types of errors equally bad
     - Where should we set the threshold between classes? Here, where the curves cross? (See the sketch below.)
     - Where in the graph are the 2 types of errors?
     [Figure: the two class-conditional distributions over x, with a candidate threshold marked "Here?" and x below it detected as circles]
     Q1-4
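A minimal Matlab sketch of this setup (all numbers are hypothetical, chosen only for illustration): two assumed Gaussian class-conditional densities over one feature x, a candidate threshold T, and the two error types, circles missed (x at or above T) and non-circles falsely detected (x below T).

    % Two assumed class-conditional densities over a single feature x.
    gauss = @(x, mu, s) exp(-(x - mu).^2 ./ (2*s^2)) ./ (s*sqrt(2*pi));
    x = linspace(-4, 10, 400);
    p_x_circle    = gauss(x, 1, 1.0);   % p(x | circle), hypothetical
    p_x_noncircle = gauss(x, 5, 1.5);   % p(x | non-circle), hypothetical
    T  = 2.8;                           % candidate threshold: x < T => "circle"
    dx = x(2) - x(1);
    % Error type 1: circles we miss (their feature lands at or above T).
    miss_rate = sum(p_x_circle(x >= T)) * dx;
    % Error type 2: non-circles detected as circles (feature lands below T).
    false_alarm_rate = sum(p_x_noncircle(x < T)) * dx;
    plot(x, p_x_circle, x, p_x_noncircle); xline(T, '--');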

  4. What if we have prior information?
     - Bayesian probabilities say that if we only expect 10% of the objects to be circles, that should affect our classification (see the sketch below)
     Q5-8
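Continuing the hypothetical sketch above: with the prior P(circle) = 0.1, we compare prior-weighted likelihoods rather than raw ones, and the region decided as "circle" shrinks toward the circle mean.

    % Effect of a prior (same hypothetical densities as before).
    gauss = @(x, mu, s) exp(-(x - mu).^2 ./ (2*s^2)) ./ (s*sqrt(2*pi));
    x = linspace(-4, 10, 400);
    p_c  = gauss(x, 1, 1.0);                  % p(x | circle)
    p_nc = gauss(x, 5, 1.5);                  % p(x | non-circle)
    equal_region  = p_c > p_nc;               % equal priors
    biased_region = 0.1 * p_c > 0.9 * p_nc;   % P(circle) = 0.1
    % Fewer x values get labeled "circle" once circles are known to be rare.
    fprintf('circle region: %d -> %d grid points\n', ...
            nnz(equal_region), nnz(biased_region));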

  5. Bayesian classifier in general
     - Bayes rule: p(a|b) = p(b|a) p(a) / p(b). Verify with an example.
     - For classifiers, with x = feature(s) and ωi = class:
         P(ωi|x) = p(x|ωi) P(ωi) / p(x)
     - P(ωi|x) = posterior probability
     - p(x|ωi) = likelihood, learned from examples (histogram)
     - P(ωi) = prior, learned from the training set (or leave out if unknown)
     - p(x) = unconditional probability; it is fixed for a given x, so the denominator doesn't affect the calculations
     - Find the best class by the maximum a posteriori (MAP) principle: find the class ωi that maximizes P(ωi|x). (A sketch follows.)
     - Example: indoor/outdoor classification
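A minimal MAP classifier sketch in Matlab (the function and variable names are mine, not from the slides): given the per-class likelihoods p(x|ωi) and priors P(ωi), the best class maximizes their product; dividing every class's score by p(x) would not change the argmax.

    % map_classify.m: pick the class maximizing likelihood * prior.
    % likelihoods(i) = p(x | omega_i); priors(i) = P(omega_i).
    function c = map_classify(likelihoods, priors)
        % p(x) is a common denominator across classes, so it is omitted:
        % it never changes which class attains the maximum.
        [~, c] = max(likelihoods(:) .* priors(:));
    end

For example, map_classify([0.3; 0.05], [0.1; 0.9]) returns 2: the strong prior overturns the likelihood, since 0.3 * 0.1 = 0.030 < 0.05 * 0.9 = 0.045.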

  6. Indoor vs. outdoor classification
     - I can use low-level image info (color, texture, etc.)
     - But there's another source of really helpful info!

  7. Camera Metadata Distributions
     [Figure: class-conditional histograms of four metadata cues for indoor (I) vs. outdoor (O) scenes: flash fired, p(FF|I) vs. p(FF|O); scene brightness, p(BV|I) vs. p(BV|O); subject distance, p(SD|I) vs. p(SD|O); exposure time, p(ET|I) vs. p(ET|O)]

  8. Why we need Bayes Rule
     - Problem: we know conditional probabilities like P(flash was on | indoor)
     - We want conditional probabilities like P(indoor | flash was on, exposure time = 0.017, subject distance = 8 ft, SVM output)
     - Let ω = class of image and x = all the evidence. More generally, we know P(x|ω) from the training set (why?), but we want P(ω|x):
         P(ωi|x) = p(x|ωi) P(ωi) / p(x)
     (A worked numeric example follows.)
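A worked inversion with assumed numbers (hypothetical, picked only to show the direction of the rule):

    % Suppose P(flash on | indoor) = 0.7, P(flash on | outdoor) = 0.1,
    % and P(indoor) = 0.5 (all assumed).  Bayes rule inverts the conditional:
    p_on_in = 0.7;  p_on_out = 0.1;  p_in = 0.5;
    p_on = p_on_in * p_in + p_on_out * (1 - p_in);   % total probability: 0.40
    p_in_given_on = p_on_in * p_in / p_on            % 0.35 / 0.40 = 0.875

So seeing the flash fire raises the indoor probability from 0.5 to 0.875 under these assumed numbers.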

  9. Using Bayes Rule
     - P(ω|x) = P(x|ω) P(ω) / P(x)
     - The denominator is constant for an image, so P(ω|x) ∝ P(x|ω) P(ω)
     Q9

  10. Using Bayes Rule
      - P(ω|x) = P(x|ω) P(ω) / P(x)
      - The denominator is constant for an image, so P(ω|x) ∝ P(x|ω) P(ω)
      - We have two types of features: image metadata (M) and low-level features, like color (L)
      - Conditional independence means P(x|ω) = P(M|ω) P(L|ω), so
          P(ω|x) ∝ P(M|ω) P(L|ω) P(ω)
        where P(M|ω) comes from histograms, P(L|ω) comes from the SVM, and P(ω) is the prior (the initial bias). (See the sketch below.)
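A sketch of the combined score (all values hypothetical; in the real system P(M|ω) would come from the metadata histograms and P(L|ω) from the SVM output):

    % Per-class factors, ordered [indoor; outdoor] (hypothetical values).
    p_M   = [0.02; 0.10];   % metadata likelihoods P(M | w), from histograms
    p_L   = [0.60; 0.40];   % low-level likelihoods P(L | w), e.g. SVM-based
    prior = [0.50; 0.50];   % priors P(w), the initial bias
    score = p_M .* p_L .* prior;      % proportional to P(w | x)
    posterior = score / sum(score)    % normalize; the max gives the MAP class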

  11. Bayesian network
      - An efficient way to encode conditional probability distributions and calculate marginals
      - Use it for classification by placing the classification node at the root
      - Examples: indoor/outdoor classification; automatic image orientation detection

  12. Indoor vs. outdoor classification
      - Each edge in the graph has an associated matrix of conditional probabilities
      [Figure: network diagram; its node labels include SVM Color Features, SVM Texture Features, KL Divergence, and EXIF header]

  13. Effects of Image Capture Context
      - Recall for a class C is the fraction of true C examples classified correctly (computed below).
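In Matlab terms, assuming a confusion matrix CM whose entry (i, j) counts items of true class i labeled as class j (a layout I am assuming, not one given in the slides):

    CM = [40 10; 5 45];               % hypothetical counts for 2 classes
    recall = diag(CM) ./ sum(CM, 2)   % per class: correctly labeled / total true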

  14. Orientation detection
      - See the IEEE TPAMI paper (hardcopy or posted)
      - It also uses a single-feature Bayesian classifier (the answer to Q1-4)
      - Keys: it is a 4-class problem (North, South, East, West), and priors really helped here!
      - You should be able to understand the two papers (both posted)
