

  1. Fast and Robust Normal Estimation for Point Clouds with Sharp Features. Alexandre Boulch & Renaud Marlet, University Paris-Est, LIGM (UMR CNRS), Ecole des Ponts ParisTech. Symposium on Geometry Processing 2012.

  2. Outline: Normal estimation for point clouds; Our method; Experiments; Conclusion.

  3. Outline, current section: Normal estimation for point clouds.

  4. Data. Point clouds from photogrammetry or laser acquisition may be noisy. [Figure: sensitivity to noise vs. robustness to noise]

  5. Data. Point clouds from photogrammetry or laser acquisition may be noisy and may have outliers. [Figure: a point P and its regression plane]

  6. Data. Point clouds from photogrammetry or laser acquisition may be noisy, may have outliers, and most often have sharp features. [Figure: smoothed vs. preserved sharp features]

  7. Data. Point clouds from photogrammetry or laser acquisition may be noisy, may have outliers, most often have sharp features, and may be anisotropic. [Figure: sensitivity to anisotropy vs. robustness to anisotropy]

  8. Data. Point clouds from photogrammetry or laser acquisition may be noisy, may have outliers, most often have sharp features, may be anisotropic, and may be huge (more than 20 million points).

  9. Outline, current section: Our method.

  10. Basics of the method (2D case here for readability). Let P be a point and N_P be its neighborhood.

  11-12. Basics of the method. We consider two cases: ◮ P lies on a planar surface.

  13-14. Basics of the method. We consider two cases: ◮ P lies on a planar surface; ◮ P lies next to a sharp feature.

  15. Basics of the method. If Area(N_1) > Area(N_2), picking points in N_1 × N_1 is more probable than in N_2 × N_2, and N_1 × N_2 leads to "random" normals.

  16. Basics of the method: main idea. Draw as many primitives as necessary to estimate the normal distribution, and then the most probable normal. ◮ Discretize the problem. N.B. We compute the normal direction, not its orientation.

  17. Main idea (continued). ◮ Discretize the problem. ◮ Fill a Hough accumulator.

  18. Main idea (continued). ◮ Discretize the problem. ◮ Fill a Hough accumulator. ◮ Select the best normal.
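The three steps above can be sketched as follows. This is a minimal illustration of the voting idea only: the crude (theta, phi) grid, its resolution, and the fixed triplet count T are simplifying assumptions of ours, not the equal-area accumulator or the adaptive bounds presented in the next slides.

```python
# Minimal sketch of randomized-Hough normal estimation in 3D.
import numpy as np

N_THETA, N_PHI = 16, 32  # accumulator resolution (hypothetical choice)

def triplet_normal(p0, p1, p2):
    """Unoriented unit normal of the plane through three points."""
    n = np.cross(p1 - p0, p2 - p0)
    norm = np.linalg.norm(n)
    if norm < 1e-12:
        return None                      # degenerate (collinear) triplet
    n = n / norm
    return n if n[2] >= 0 else -n        # direction only: fold to upper hemisphere

def bin_index(n):
    """Map a folded unit direction to a (theta, phi) accumulator bin."""
    theta = np.arccos(np.clip(n[2], -1.0, 1.0))   # in [0, pi/2] after folding
    phi = np.arctan2(n[1], n[0]) % (2 * np.pi)
    it = min(int(theta / (np.pi / 2) * N_THETA), N_THETA - 1)
    ip = min(int(phi / (2 * np.pi) * N_PHI), N_PHI - 1)
    return it, ip

def estimate_normal(neighbors, T=500, rng=None):
    """Vote T random triplets into the accumulator; return the mean
    normal of the most voted bin."""
    rng = np.random.default_rng(rng)
    acc = np.zeros((N_THETA, N_PHI))
    sums = np.zeros((N_THETA, N_PHI, 3))
    for _ in range(T):
        i, j, k = rng.choice(len(neighbors), size=3, replace=False)
        n = triplet_normal(neighbors[i], neighbors[j], neighbors[k])
        if n is None:
            continue
        b = bin_index(n)
        acc[b] += 1
        sums[b] += n
    best = np.unravel_index(np.argmax(acc), acc.shape)
    m = sums[best] / acc[best]
    return m / np.linalg.norm(m)
```

For points sampled on a plane, every non-degenerate triplet votes for the same direction, so the most voted bin recovers the plane normal.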

  19. Robust Randomized Hough Transform. Notation: ◮ T, the number of primitives picked after T iterations; ◮ T_min, the number of primitives to pick; ◮ M, the number of bins of the accumulator; ◮ p̂_m, the empirical mean of bin m; ◮ p_m, the theoretical mean of bin m.

  20. Robust Randomized Hough Transform: global upper bound. We want T_min such that P( max_{m ∈ {1, …, M}} |p̂_m − p_m| ≤ δ ) ≥ α. From Hoeffding's inequality, for a given bin: P( |p̂_m − p_m| ≥ δ ) ≤ 2 exp(−2 δ² T_min). Considering the whole accumulator: T_min ≥ (1 / (2 δ²)) ln( 2M / (1 − α) ).
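The bound is directly computable. A small helper (the function name and the example parameter values are ours):

```python
import math

def t_min(M, delta, alpha):
    """Minimum number of primitives so that, with probability at least
    alpha, every one of the M bins deviates from its true mean by at
    most delta (the Hoeffding + union bound above)."""
    return math.ceil(math.log(2 * M / (1 - alpha)) / (2 * delta ** 2))

# e.g. M = 100 bins, delta = 0.05, alpha = 0.95  ->  1659 primitives
```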

  21. Robust Randomized Hough Transform: confidence interval. Idea: if the same bin is picked often enough, we want to stop drawing primitives. From the Central Limit Theorem, we can stop if p̂_{m1} − p̂_{m2} ≥ 2 √(1/T), i.e. when the confidence intervals of the two most voted bins do not intersect (confidence level 95%).
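The stopping test is a one-liner over the current vote counts (helper name is ours):

```python
import math

def can_stop(votes, T):
    """Stop when the 95% confidence intervals of the two most voted
    bins no longer intersect: p_hat_1 - p_hat_2 >= 2 * sqrt(1/T)."""
    p = sorted((v / T for v in votes), reverse=True)
    return p[0] - p[1] >= 2 * math.sqrt(1.0 / T)
```

With 100 draws the interval half-width is 0.1, so a 60-vote leader over a 30-vote runner-up suffices, while 40 over 35 does not.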

  22. Accumulator. Our primitives are plane directions (defined by two angles). We use the accumulator of Borrmann et al. (3D Research, 2011): ◮ fast computation; ◮ bins of similar area.
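Borrmann et al.'s accumulator design is more elaborate than what fits here; the sketch below only conveys the "bins of similar area" idea with a simple latitude-ring partition of the hemisphere of directions (this partition is our simplification, not the paper's construction).

```python
import math

def ring_bins(n_rings=8, total_bins=256):
    """Azimuth-bin count per latitude ring, proportional to the ring's
    solid angle, so every bin covers roughly the same area."""
    edges = [r * (math.pi / 2) / n_rings for r in range(n_rings + 1)]
    return [max(1, round(total_bins * (math.cos(edges[r]) - math.cos(edges[r + 1]))))
            for r in range(n_rings)]

def direction_bin(theta, phi, n_rings=8, total_bins=256):
    """Bin of a direction given as polar angle theta in [0, pi/2]
    and azimuth phi in [0, 2*pi)."""
    counts = ring_bins(n_rings, total_bins)
    r = min(int(theta / (math.pi / 2) * n_rings), n_rings - 1)
    return r, min(int(phi / (2 * math.pi) * counts[r]), counts[r] - 1)
```

Rings near the pole are short, so they get few azimuth bins; rings near the equator get many, keeping bin areas comparable.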

  23-24. Discretization issues. The use of a discrete accumulator may be a cause of error.

  25. Discretization issues. Solution: iterate the algorithm using randomly rotated accumulators.
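One way to realize "randomly rotated accumulators" is to keep the accumulator fixed and rotate the voted directions instead; a uniform random rotation can then be drawn with the standard QR construction below. This particular construction is our assumption; the slides do not prescribe one.

```python
import numpy as np

def random_rotation(rng):
    """Haar-uniform random 3D rotation via QR of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q = q * np.sign(np.diag(r))       # fix column signs to make Q unique
    if np.linalg.det(q) < 0:
        q[:, 0] = -q[:, 0]            # force det = +1 (a proper rotation)
    return q
```

Each run would rotate all candidate normals by Q before binning and rotate the selected normal back by Q.T; the runs are then combined as in slide 26.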

  26. Normal selection. From the normal directions obtained by rotation of the accumulator, the final normal can be chosen as: the mean over the best cluster, the mean over all the normals, or the normal of best confidence.

  27. Dealing with anisotropy. The robustness to anisotropy depends on the way we select the planes (triplets of points). [Figure: sensitivity vs. robustness to anisotropy]

  28. Dealing with anisotropy: random point selection among nearest neighbors. The triplets are randomly selected among the K nearest neighbors. Fast, but cannot deal with anisotropy.
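The K-nearest-neighbor variant is straightforward; a brute-force sketch (function name is ours, and a real implementation would use a spatial index rather than a full sort):

```python
import numpy as np

def knn_triplet(points, p_idx, K, rng):
    """Random triplet among the K nearest neighbors of points[p_idx].
    Fast, but inherits the sampling anisotropy of the cloud."""
    d = np.linalg.norm(points - points[p_idx], axis=1)
    knn = np.argsort(d)[1:K + 1]                  # skip the point itself
    return points[rng.choice(knn, size=3, replace=False)]
```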

  29. Dealing with anisotropy: uniform point selection on the neighborhood ball. ◮ Pick a point Q in the neighborhood ball.

  30. Uniform point selection (continued). ◮ Pick a point Q in the neighborhood ball. ◮ Consider a small ball around Q.

  31. Uniform point selection (continued). ◮ Pick a point Q in the neighborhood ball. ◮ Consider a small ball around Q. ◮ Pick a point randomly in the small ball.

  32. Uniform point selection (continued). ◮ Pick a point Q in the neighborhood ball. ◮ Consider a small ball around Q. ◮ Pick a point randomly in the small ball. ◮ Iterate to get a triplet. Deals with anisotropy, but at a high computation cost.
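The four steps above can be sketched as follows, assuming a brute-force neighborhood query and a rejection loop when the small ball around Q happens to contain no points (the radii and names are ours):

```python
import numpy as np

def ball_triplet(points, p_idx, radius, small_r, rng):
    """Pick each of the three points by first drawing Q uniformly in the
    neighborhood ball, then drawing a neighbor inside the small ball
    around Q; empty small balls are simply redrawn."""
    center = points[p_idx]
    d = np.linalg.norm(points - center, axis=1)
    nbrs = points[(d > 0) & (d <= radius)]
    triplet = []
    while len(triplet) < 3:
        v = rng.normal(size=3)
        # uniform point in the ball: random direction times radius * U^(1/3)
        q = center + radius * rng.random() ** (1 / 3) * v / np.linalg.norm(v)
        close = nbrs[np.linalg.norm(nbrs - q, axis=1) <= small_r]
        if len(close):
            triplet.append(close[rng.integers(len(close))])
    return np.array(triplet)
```

The rejection loop is what makes this variant costly: many draws of Q land in empty regions of the ball, especially for thin surfaces.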

  33. Dealing with anisotropy: cube discretization of the neighborhood ball. ◮ Discretize the neighborhood ball.

  34. Cube discretization (continued). ◮ Discretize the neighborhood ball. ◮ Pick a cube.

  35. Cube discretization (continued). ◮ Discretize the neighborhood ball. ◮ Pick a cube. ◮ Pick a point randomly in this cube.

  36. Cube discretization (continued). ◮ Discretize the neighborhood ball. ◮ Pick a cube. ◮ Pick a point randomly in this cube. ◮ Iterate to get a triplet. A good compromise between speed and robustness to anisotropy.
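A sketch of the cube-discretization sampling; the grid resolution and helper names are ours:

```python
import numpy as np

def cube_triplet(points, p_idx, radius, n_cells, rng):
    """Bucket the neighbors into an n_cells^3 grid over the neighborhood
    ball, then draw each triplet point by picking a non-empty cube
    uniformly and a random point inside it. Dense and sparse regions of
    the ball thus get comparable chances of being sampled."""
    center = points[p_idx]
    d = np.linalg.norm(points - center, axis=1)
    nbrs = points[(d > 0) & (d <= radius)]
    cells = {}
    for p in nbrs:
        key = tuple(((p - center + radius) / (2 * radius) * n_cells).astype(int))
        cells.setdefault(key, []).append(p)
    keys = list(cells)
    triplet = []
    for _ in range(3):
        bucket = cells[keys[rng.integers(len(keys))]]
        triplet.append(bucket[rng.integers(len(bucket))])
    return np.array(triplet)
```

Compared with the uniform-on-ball scheme, there is no rejection loop: only occupied cubes are candidates, which is where the speed comes from.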

  37. Outline, current section: Experiments.

  38. Methods used for comparison. ◮ Regression: Hoppe et al. (SIGGRAPH, 1992), plane fitting; Cazals & Pouget (SGP, 2003), jet fitting. Both are robust to noise and fast.

  39. Methods used for comparison. ◮ Voronoï diagram: Dey & Goswami (SCG, 2004), NormFet. Handles sharp features and anisotropy, and is fast.

  40. Methods used for comparison. ◮ Sample consensus models: Li et al. (Computers & Graphics, 2010). Summary:

      Property     Plane fitting  Jet fitting  NormFet  Sample Consensus
      Noise        yes            yes          -        yes
      Outliers     -              -            -        yes
      Sharp feat.  -              -            yes      yes
      Anisotropy   -              -            yes      -
      Fast         yes            yes          yes      -

  41. Precision. Two error measures:
      ◮ Root Mean Square: RMS = √( (1/|C|) Σ_{P ∈ C} ∠(n_{P,ref}, n_{P,est})² )
      ◮ Root Mean Square with threshold: RMS_τ = √( (1/|C|) Σ_{P ∈ C} v_P² ), where v_P = ∠(n_{P,ref}, n_{P,est}) if ∠(n_{P,ref}, n_{P,est}) < τ, and v_P = π/2 otherwise. RMS_τ is more suited for sharp features.
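Both measures are easy to compute from unit normals; note the absolute value of the dot product, since the method estimates directions, not orientations (function names are ours):

```python
import numpy as np

def normal_angles(ref, est):
    """Unoriented angles between rows of two (n, 3) unit-normal arrays."""
    dots = np.abs(np.sum(ref * est, axis=1))   # direction, not orientation
    return np.arccos(np.clip(dots, 0.0, 1.0))

def rms(ref, est):
    return np.sqrt(np.mean(normal_angles(ref, est) ** 2))

def rms_tau(ref, est, tau):
    a = normal_angles(ref, est)
    v = np.where(a < tau, a, np.pi / 2)        # saturate errors above tau
    return np.sqrt(np.mean(v ** 2))
```

Saturating every error above τ at π/2 means one badly smoothed normal near an edge costs the same large penalty regardless of how wrong it is, which is why RMS_τ discriminates better on sharp features.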

  42. A visual on the error measures: same RMS, different RMS_τ.

  43. Precision (with noise). Precision for a uniformly sampled cube, depending on noise.

  44. Precision (with noise and anisotropy). Precision for a corner with anisotropy, depending on noise.

  45. Computation time. Computation time for a sphere, as a function of the number of points.

  46. Robustness to outliers. Noisy model (0.2%) + 100% outliers.

  47. Robustness to outliers. Noisy model (0.2%) + 200% outliers.

  48. Robustness to anisotropy.

  49. Preservation of sharp features.

  50. Robustness to “natural” noise, outliers and anisotropy. Point cloud created by photogrammetry.

  51. Outline, current section: Conclusion.
