Model Fitting


  1. Model Fitting. Based on a lecture prepared by Tal Hassner.

  2. Sources • Spread across the textbook...

  3. Fitting: Motivation [Image: 9300 Harris Corners Pkwy, Charlotte, NC] • We’ve learned how to detect edges, corners, blobs. Now what? • We would like to form a higher-level, more compact representation of the features in the image by grouping multiple features according to a simple model

  4. Fitting • Choose a parametric model to represent a set of features • [Images: simple model: lines; simple model: circles; complicated model: car] • Source: K. Grauman

  5. Fitting • Choose a parametric model to represent a set of features • Line, ellipse, spline, etc. • Three main questions: • What model represents this set of features best? • Which of several model instances gets which feature? • How many model instances are there? • Computational complexity is important • It is infeasible to examine every possible set of parameters and every possible combination of features

  6. Fitting: Issues Case study: Line detection • Noise in the measured feature locations • Extraneous data: clutter (outliers), multiple lines • Missing data: occlusions

  7. Fitting: Issues • If we know which points belong to the line, how do we find the “optimal” line parameters? • Least squares • What if there are outliers? • RANSAC • What if there are many lines? • Voting methods: Hough transform • What if we’re not even sure it’s a line? • Model selection

  8. Fitting a line to points under noise

  9. Least squares line fitting • Data: $(x_1, y_1), \dots, (x_n, y_n)$ • Line equation: $y_i = m x_i + b$ • Find $(m, b)$ to minimize $E = \sum_{i=1}^{n} (y_i - m x_i - b)^2$

  10. Least squares line fitting • Data: $(x_1, y_1), \dots, (x_n, y_n)$ • Line equation: $y_i = m x_i + b$ • Find $(m, b)$ to minimize $E = \sum_{i=1}^{n} (y_i - m x_i - b)^2$ • In matrix form: $E = \|Y - XB\|^2$ with $Y = \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix}$, $X = \begin{bmatrix} x_1 & 1 \\ \vdots & \vdots \\ x_n & 1 \end{bmatrix}$, $B = \begin{bmatrix} m \\ b \end{bmatrix}$ • $E = (Y - XB)^T (Y - XB) = Y^T Y - 2 (XB)^T Y + (XB)^T (XB)$ • $\frac{dE}{dB} = 2 X^T X B - 2 X^T Y = 0$ • Normal equations: $X^T X B = X^T Y$, the least squares solution to $XB = Y$

  11. The solution to the system of equations $XB = Y$ in MATLAB: B = X\Y;
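A minimal MATLAB sketch of the normal-equations fit from slides 9-11; the data in x and y are made-up values for illustration, not from the slides:

    % Least squares fit of y = m*x + b via B = X\Y (slides 9-11)
    x = [0; 1; 2; 3; 4];               % made-up point coordinates
    y = [0.1; 0.9; 2.2; 2.8; 4.1];
    X = [x, ones(size(x))];            % each row of X is [x_i, 1]
    B = X \ y;                         % least squares solution of X*B = y
    m = B(1);                          % slope
    b = B(2);                          % intercept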

  12. Problem with “vertical” least squares • Not rotation-invariant • Fails completely for vertical lines

  13. Total least squares • Distance between point $(x_i, y_i)$ and the line $ax + by = d$ (with $a^2 + b^2 = 1$): $|a x_i + b y_i - d|$ • Unit normal: $N = (a, b)$ • $E = \sum_{i=1}^{n} (a x_i + b y_i - d)^2$

  14. Total least squares • Distance between point $(x_i, y_i)$ and the line $ax + by = d$ (with $a^2 + b^2 = 1$): $|a x_i + b y_i - d|$ • Unit normal: $N = (a, b)$ • $E = \sum_{i=1}^{n} (a x_i + b y_i - d)^2$ • Proof of the distance formula: see Wikipedia

  15. Total least squares • Distance between point $(x_i, y_i)$ and the line $ax + by = d$ (with $a^2 + b^2 = 1$): $|a x_i + b y_i - d|$ • Unit normal: $N = (a, b)$ • Find $(a, b, d)$ to minimize the sum of squared perpendicular distances $E = \sum_{i=1}^{n} (a x_i + b y_i - d)^2$

  16. Total least squares • Distance between point $(x_i, y_i)$ and the line $ax + by = d$ (with $a^2 + b^2 = 1$): $|a x_i + b y_i - d|$ • Find $(a, b, d)$ to minimize the sum of squared perpendicular distances $E = \sum_{i=1}^{n} (a x_i + b y_i - d)^2$ • $\frac{\partial E}{\partial d} = -2 \sum_{i=1}^{n} (a x_i + b y_i - d) = 0 \Rightarrow d = \frac{a}{n} \sum_i x_i + \frac{b}{n} \sum_i y_i = a \bar{x} + b \bar{y}$ • Substituting back: $E = \sum_{i=1}^{n} \big( a (x_i - \bar{x}) + b (y_i - \bar{y}) \big)^2 = \|UN\|^2 = (UN)^T (UN)$ with $U = \begin{bmatrix} x_1 - \bar{x} & y_1 - \bar{y} \\ \vdots & \vdots \\ x_n - \bar{x} & y_n - \bar{y} \end{bmatrix}$, $N = \begin{bmatrix} a \\ b \end{bmatrix}$ • $\frac{dE}{dN} = 2 (U^T U) N = 0$ • Solution to $(U^T U) N = 0$, subject to $\|N\|^2 = 1$: eigenvector of $U^T U$ associated with the smallest eigenvalue (least squares solution to the homogeneous linear system $UN = 0$)

  17. Total least squares • $U^T U = \begin{bmatrix} \sum_{i=1}^{n} (x_i - \bar{x})^2 & \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) \\ \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) & \sum_{i=1}^{n} (y_i - \bar{y})^2 \end{bmatrix}$ is the second moment matrix of the points

  18. Total least squares • $U^T U = \begin{bmatrix} \sum_{i=1}^{n} (x_i - \bar{x})^2 & \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) \\ \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) & \sum_{i=1}^{n} (y_i - \bar{y})^2 \end{bmatrix}$, the second moment matrix • [Figure: the fitted line passes through the centroid $(\bar{x}, \bar{y})$ with unit normal $N = (a, b)$; each point contributes the offset $(x_i - \bar{x}, y_i - \bar{y})$]
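A minimal MATLAB sketch of the total least squares fit from slides 13-18, reusing the x and y vectors from the previous sketch:

    % Total least squares fit of a*x + b*y = d, with a^2 + b^2 = 1 (slides 13-18)
    xc = x - mean(x);                  % center the points
    yc = y - mean(y);
    U = [xc, yc];                      % rows are (x_i - xbar, y_i - ybar)
    [V, D] = eig(U' * U);              % eigen-decomposition of the second moment matrix
    [~, k] = min(diag(D));             % index of the smallest eigenvalue
    N = V(:, k);                       % unit normal (a, b)
    a = N(1);  b = N(2);
    d = a * mean(x) + b * mean(y);     % the line passes through the centroid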

  19. Least squares: Robustness to noise • Least squares fit to the red points [figure]

  20. Least squares: Robustness to noise • Least squares fit with an outlier [figure] • Problem: squared error heavily penalizes outliers

  21. What happens when there are outliers?

  22. RANSAC • Random sample consensus (RANSAC): Very general framework for model fitting in the presence of outliers M. A. Fischler, R. C. Bolles. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Comm. of the ACM, Vol 24, pp 381-395, 1981.

  23. Fitting a Line • Least squares fit

  24. RANSAC • Select sample of m points at random

  25. RANSAC • Select sample of m points at random • Calculate model parameters that fit the data in the sample

  26. RANSAC • Select sample of m points at random • Calculate model parameters that fit the data in the sample • Calculate error function for each data point

  27. RANSAC • Select sample of m points at random • Calculate model parameters that fit the data in the sample • Calculate error function for each data point • Select data that support current hypothesis

  28. RANSAC • Select sample of m points at random • Calculate model parameters that fit the data in the sample • Calculate error function for each data point • Select data that support current hypothesis

  29. RANSAC for line fitting Repeat N times: • Draw s points uniformly at random • Fit line to these s points • Find inliers to this line among the remaining points (i.e., points whose distance from the line is less than t ) • If there are d or more inliers, accept the line and refit using all inliers
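A MATLAB sketch of the loop on slide 29; pts is assumed to be an n-by-2 matrix of points, s, t, d, N are the parameters named on the slides, and the line offset is called rho here only to avoid clashing with the inlier count d:

    % RANSAC line fitting (slide 29): repeat N times, keep the best-supported line
    bestInliers = [];
    for iter = 1:N
        idx = randperm(size(pts, 1), s);           % draw s points uniformly at random
        sample = pts(idx, :);
        c = mean(sample, 1);                       % fit a line nrm(1)*x + nrm(2)*y = rho to the sample
        [V, D] = eig((sample - c)' * (sample - c));
        [~, k] = min(diag(D));
        nrm = V(:, k);                             % unit normal of the sample line
        rho = nrm' * c';
        dists = abs(pts * nrm - rho);              % distance of every point to the line
        inliers = find(dists < t);                 % inliers: points closer than the threshold t
        if numel(inliers) >= d && numel(inliers) > numel(bestInliers)
            bestInliers = inliers;                 % keep the hypothesis with the most support
        end
    end
    % finally, refit the line to pts(bestInliers, :) using all inliers, as on the slide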

  30. Choosing the parameters • Initial number of points s: typically the minimum number needed to fit the model • Distance threshold t: choose t so that the probability for an inlier is p (e.g. 0.95); for zero-mean Gaussian noise with std. dev. σ, $t^2 = 3.84\sigma^2$ • Number of iterations N: choose N so that, with probability p, at least one random sample is free from outliers (e.g. p = 0.99), given the outlier ratio e • Required N for p = 0.99:

      s \ e    5%   10%   20%   25%   30%   40%   50%
        2       2     3     5     6     7    11    17
        3       3     4     7     9    11    19    35
        4       3     5     9    13    17    34    72
        5       4     6    12    17    26    57   146
        6       4     7    16    24    37    97   293
        7       4     8    20    33    54   163   588
        8       5     9    26    44    78   272  1177

  Source: M. Pollefeys
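The N values in this table are consistent with requiring that at least one of the N samples be outlier-free: $\big(1 - (1 - e)^s\big)^N = 1 - p$, i.e. $N = \log(1 - p) / \log\big(1 - (1 - e)^s\big)$. A short MATLAB check, with p, s and e as on the slide:

    % Iterations N so that, with probability p, at least one sample of size s
    % contains no outliers, for outlier ratio e; reproduces the table above
    p = 0.99;
    s = 2:8;
    e = [0.05 0.10 0.20 0.25 0.30 0.40 0.50];
    [E, S] = meshgrid(e, s);                       % rows: s, columns: e (as in the table)
    N = ceil(log(1 - p) ./ log(1 - (1 - E).^S));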

  31. Choosing the parameters • Initial number of points s: typically the minimum number needed to fit the model • Distance threshold t: choose t so that the probability for an inlier is p (e.g. 0.95); for zero-mean Gaussian noise with std. dev. σ, $t^2 = 3.84\sigma^2$ • Number of iterations N: choose N so that, with probability p, at least one random sample is free from outliers (e.g. p = 0.99), given the outlier ratio e • Consensus set size d: should match the expected inlier ratio • Source: M. Pollefeys

  32. RANSAC pros and cons • Pros • Simple and general • Applicable to many different problems • Often works well in practice • Cons • Lots of parameters to tune • Can’t always get a good initialization of the model based on the minimum number of samples • Sometimes too many iterations are required • Can fail for extremely low inlier ratios • We can often do better than brute-force sampling

  33. What happens when there is more than one line?

  34. Voting schemes • Let each feature vote for all the models that are compatible with it • Hopefully the noise features will not vote consistently for any single model • Missing data doesn’t matter as long as there are enough features remaining to agree on a good model
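As a concrete instance of voting (the Hough transform named on slide 7), each point can vote for all lines through it under the parameterization $\rho = x \cos\theta + y \sin\theta$. The accumulator sketch below is illustrative, not from the slides; it assumes an n-by-2 point matrix pts and uses arbitrary 1-unit and 1-degree bins:

    % Hough-style voting for lines: rho = x*cos(theta) + y*sin(theta)
    thetas = -90:89;                               % theta bins, in degrees
    rhoMax = ceil(max(sqrt(sum(pts.^2, 2))));      % largest possible |rho|
    acc = zeros(2*rhoMax + 1, numel(thetas));      % accumulator over (rho, theta)
    for i = 1:size(pts, 1)
        for j = 1:numel(thetas)
            th = thetas(j) * pi / 180;
            rho = pts(i,1)*cos(th) + pts(i,2)*sin(th);
            k = round(rho) + rhoMax + 1;           % nearest rho bin
            acc(k, j) = acc(k, j) + 1;             % this point votes for this line
        end
    end
    % peaks in acc correspond to lines supported by many points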
