

  1. Object Contour Tracking Using Multi-feature Fusion Based Particle Filter
   Xiaofeng Lu [1][3], Li Song [1][2], Songyu Yu [1], Nam Ling [2]
   [1] Institute of Image Communication and Information Processing, SJTU, China
   [2] Department of Computer Engineering, Santa Clara University, USA
   [3] SCIE, Shanghai University, China
   ICIEA 2012, Singapore, 18 July 2012

  2. Outline
   1. Main Idea
   2. Multi-Feature Fusion Based Particle Filter ---- Rough Object Location
   3. Object Contour Tracking ---- Accurate Object Contour Location
   4. Summary and Future Work

  3. 1. Main Idea
   • An accurate object contour is useful for computer vision and pattern recognition applications
   • The algorithm consists of a rough object location step and a rough-region-based accurate contour detection step
   • Rough object location is realized by fusing color histogram and Harris corner features in a particle filter framework
   • A region-based temporal differencing model is adopted for object contour detection, which is faster than traditional active contour models

  4. 1. Main Idea
   Previous works
   • Our previous CamShift guided particle filter (CAMSGPF) significantly improved object tracking accuracy [1]
   • The active contour model (ACM) was first proposed by Kass et al. in 1988 for object contour segmentation
     - Parametric active contour model: snake representation
     - Geometric active contour model: curve evolution using level sets
     - Both apply an energy-minimizing function to a deformable contour to approach the true target boundaries
   • Real-time, accurate object contour tracking is critical for further pattern recognition applications
   [1] Z. Wang, X. Yang, Y. Xu, and S. Yu, “CamShift guided particle filter for visual tracking,” Pattern Recognition Letters, Vol. 30, pp. 407-413, Mar. 2009.

  5. 2. Multi-Feature Fusion Based Particle Filter
   Particle Filter Introduction [M. Isard and A. Blake, 1998]
   • Bayesian approach to dynamic state estimation
   • Multi-hypothesis; complexity grows with the number of samples
   • Construct the posterior probability density function (pdf) p(x_k | z_{1:k})
   • State transition model (2) and observation model (3):
       x_k = f(x_{k-1}, n_{k-1})   (2)
       z_k = θ(x_k, E_k)           (3)
   Illustration: the target in frame T is converted to the chosen color space and particles are propagated; each particle receives a weight (e.g. w1 = 0.22, w2 = 0.51, w3 = 0.21, w4 = 0.43, w5 = 0.32, ..., w20 = 0.36), and the most closely matched candidate in frame T is selected. (20 particles are sampled; each particle represents one state.)
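The propagate-weight-resample cycle described above can be sketched in a minimal 1-D NumPy demo. The Gaussian models, noise levels, and measurement value here are illustrative assumptions, not the paper's actual tracking models; only the 20-particle count comes from the slide.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 20                                  # the slide samples 20 particles
particles = rng.normal(0.0, 1.0, N)     # each particle is one state hypothesis

def propagate(x):
    # state transition x_k = f(x_{k-1}) + n_{k-1}, here a random walk
    return x + rng.normal(0.0, 0.1, x.shape)

def likelihood(x, z, sigma=0.5):
    # observation weight p(z_k | x_k), assumed Gaussian for this demo
    return np.exp(-0.5 * ((z - x) / sigma) ** 2)

z = 0.3                                 # current measurement (made up)
particles = propagate(particles)
weights = likelihood(particles, z)
weights /= weights.sum()                # normalize so weights form a pdf

# resampling: high-weight particles survive, low-weight ones die out
idx = rng.choice(N, size=N, p=weights)
particles = particles[idx]
estimate = particles.mean()             # posterior mean estimate
```

After resampling, the particle cloud concentrates around states consistent with the measurement, which is the multi-hypothesis behavior the slide refers to.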

  6. 2. Multi-Feature Fusion Based Particle Filter
   Particle Filter for object tracking
   • Initialization of PF: the particle filter is an optimal Bayesian algorithm for nonlinear, non-Gaussian object tracking; the state is initialized as (1)
   • System observation model in PF: the posterior probability density function is approximated by a set of weighted particles, and each particle weight is updated as (4)
   • Dynamic state transition model in PF: the evolution of the particle set is achieved by propagating each particle with the dynamic model (5)
       x_0 = [x, y, w, h]^T                                                            (1)
       w_k^i ∝ p(z_k | x_k^i) p(x_k^i | x_{0:k-1}^i) / q(x_k^i | x_{0:k-1}^i, z_k) ∝ exp(−λ D²(h_0^c, h^c(x_k^i)))   (4)
       x'_k = x_{k-1} + (x̂_{k-1} − x_{k-2}) + n_k                                     (5)
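The dynamic model (5) propagates each particle along the velocity estimated from the last two frames, plus zero-mean Gaussian noise. A sketch for the [x, y, w, h] state of (1), with made-up state values and noise level:

```python
import numpy as np

rng = np.random.default_rng(1)

def propagate_particle(x_prev, x_hat_prev, x_prev2, noise_std=2.0):
    # x'_k = x_{k-1} + (x_hat_{k-1} - x_{k-2}) + n_k
    velocity = x_hat_prev - x_prev2          # estimated displacement per frame
    noise = rng.normal(0.0, noise_std, 4)    # n_k, zero-mean Gaussian
    return x_prev + velocity + noise

x_prev2   = np.array([100.0, 80.0, 40.0, 60.0])  # estimate at frame k-2
x_hat_prev = np.array([104.0, 82.0, 40.0, 60.0]) # estimate at frame k-1
x_prev    = np.array([105.0, 81.0, 41.0, 59.0])  # one particle at frame k-1

x_new = propagate_particle(x_prev, x_hat_prev, x_prev2)
```

The velocity term biases the particle toward where the object is heading, so fewer particles are wasted in regions the target has already left.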

  7. 2. Multi-Feature Fusion Based Particle Filter
   Color weight estimation – color feature
   • An HSV color space histogram is used for color weight estimation. After particle propagation, the Bhattacharyya distance and Bhattacharyya coefficient measure the similarity between each candidate and the reference template, as in (9) and (10)
   • The color feature is robust to deformation but sensitive to illumination changes and similar-color clutter
       d_color = sqrt(1 − ρ[p_ref, q_can]),  where ρ[p_ref, q_can] = Σ_{i=1}^{N} sqrt(p_ref(y_i) q_can(y_i))   (9)
       weight^j_color = (1 / (sqrt(2π) σ)) exp(−d²_color / 2σ²)                                               (10)

  8. 2. Multi-Feature Fusion Based Particle Filter
   Harris corner weight estimation – corner feature
   • Obtain the motion edge through frame differencing and the Canny operator
   • The Harris corner detector is used to extract feature points on the motion edge, reducing the edge-matching computation
   Fig. 1: (a) and (b) adjacent frames; (c) temporal differencing result; (d) Canny edge detection result; (e) result of the AND logic operation; (f) Harris corner detection result
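The first stage of that pipeline, temporal differencing with a threshold, can be sketched in plain NumPy. The Canny and Harris stages the slide adds on top (e.g. via `cv2.Canny` and `cv2.cornerHarris`) and the AND combination are omitted here; the frames and threshold are toy assumptions:

```python
import numpy as np

def motion_mask(frame_a, frame_b, thresh=20):
    # absolute frame difference, then threshold to a binary motion mask
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    return (diff > thresh).astype(np.uint8)

# toy frames: a bright 3x3 'object' block shifts one pixel to the right
f1 = np.zeros((8, 8), dtype=np.uint8); f1[2:5, 1:4] = 200
f2 = np.zeros((8, 8), dtype=np.uint8); f2[2:5, 2:5] = 200
mask = motion_mask(f1, f2)
```

Only the leading and trailing columns of the moving block change between frames, so the mask fires exactly along the motion boundary, which is what makes it a useful support for edge and corner extraction.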

  9. 2. Multi-Feature Fusion Based Particle Filter
   Harris corner weight estimation – corner feature
   • The corner feature is more stable under rotation, zoom, and illumination changes
   • The Hausdorff distance [2] represents the similarity between the candidate and reference point sets
   • The Hausdorff distance does not require a point-to-point correspondence between the two finite point sets
   For two finite point sets A = {a_1, a_2, ..., a_n} and B = {b_1, b_2, ..., b_m}, the Hausdorff distance is defined by (11)-(13) based on the Euclidean distance, and the Harris corner weight of the candidate follows from this similarity measurement as (14):
       d(A, B) = max[h(A, B), h(B, A)]                           (11)
       h(A, B) = max_{a_i ∈ A} min_{b_j ∈ B} ||a_i − b_j||       (12)
       h(B, A) = max_{b_i ∈ B} min_{a_j ∈ A} ||b_i − a_j||       (13)
       weight^j_contour = (1 / (sqrt(2π) σ)) exp(−d²_contour / 2σ²)   (14)
   [2] D. P. Huttenlocher, G. A. Klanderman, and W. A. Rucklidge, “Comparing images using the Hausdorff distance,” IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol. 15, pp. 850-863, 1993.
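Definitions (11)-(13) translate almost literally into code. The two point sets below are made-up 2-D examples; note that no pairing between points is ever established:

```python
import numpy as np

def directed_hausdorff(A, B):
    # (12)/(13): h(A, B) = max over a in A of min over b in B of ||a - b||
    return max(np.min(np.linalg.norm(B - a, axis=1)) for a in A)

def hausdorff(A, B):
    # (11): symmetric Hausdorff distance
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

A = np.array([[0.0, 0.0], [1.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0], [4.0, 0.0]])
d = hausdorff(A, B)   # B's outlier (4, 0) is 3 away from the nearest point of A
```

Because the distance is driven by the single worst-matched point, an unmatched corner anywhere in either set immediately inflates d, which is why it works as a set-to-set similarity without correspondences.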

  10. 2. Multi-Feature Fusion Based Particle Filter
   Multi-feature fusion based PF object rough location
   • From (15), we obtain the fused weight of the current candidate
   • The feature fusion method is tolerant to illumination changes and deformation
       weight^j_t = α weight^j_t(color) + β weight^j_t(contour)   (15)
   Fig. 2: Particle propagation schematic diagram — for each particle n, the color histogram and Harris corners are computed from the input image; after propagation, the same features are computed for particle n+1
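The fusion rule (15) is a weighted combination per particle. The slide does not give α and β, so the values below (and the renormalization over the particle set) are assumptions for the sketch:

```python
import numpy as np

def fuse_weights(w_color, w_corner, alpha=0.6, beta=0.4):
    # (15): fused weight = alpha * color weight + beta * corner weight
    w = alpha * np.asarray(w_color) + beta * np.asarray(w_corner)
    return w / w.sum()          # renormalize so weights sum to 1

# made-up per-particle weights from the two cues
w = fuse_weights([0.22, 0.51, 0.21], [0.30, 0.10, 0.60])
```

A particle that scores well on either cue keeps a reasonable fused weight, which is the robustness-to-illumination-and-deformation argument on the slide: when one feature degrades, the other still carries the track.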

  11. 2. Multi-Feature Fusion Based Particle Filter
   Experimental results – rough location
   • Color and Harris corner fusion based PF object rough location
   Fig. 3: David2_WMM_xvid results; Fig. 4: Anvar1GlobalNormal results
   Blue ellipse: original PF result; green ellipse: CAMSGPF result; red ellipse: tracking result of the proposed fusion scheme

  12. 2. Multi-Feature Fusion Based Particle Filter
   Experimental results – rough location
   • David sequence tracking demo: CAMSGPF vs. the color and Harris corner fusion based PF

  13. 2. Multi-Feature Fusion Based Particle Filter
   Experimental results – rough location
   • CAR sequence tracking demo: CAMSGPF vs. the color and Harris corner fusion based PF

  14. 3. Object Contour Tracking
   Region-based temporal differencing model (RTDM)
   • Obtain a binary region image from the rough object location by RTDM
   • Get the exact object contour through morphological operations and a contour finder function
   • This contour tracking method runs in real time and reduces the computational cost of the whole algorithm
   Fig. 5: Region-based TDM results: (a) current frame; (b) object binary region image; (c) object rough location and exact contour
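A rough NumPy sketch of the RTDM idea: threshold the frame difference inside the rough-location window, then close small gaps with a naive binary dilation. A real implementation would use morphology and contour-finder routines (e.g. `cv2.morphologyEx`, `cv2.findContours`); the window, threshold, and toy frames here are assumptions:

```python
import numpy as np

def binary_region(frame_a, frame_b, box, thresh=25):
    # temporal differencing restricted to the rough object location
    x0, y0, x1, y1 = box
    roi = np.abs(frame_a[y0:y1, x0:x1].astype(np.int16)
                 - frame_b[y0:y1, x0:x1].astype(np.int16))
    return (roi > thresh).astype(np.uint8)

def dilate(mask):
    # 3x3 binary dilation: a pixel is set if any 8-neighbour is set
    padded = np.pad(mask, 1)
    out = np.zeros_like(mask)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= padded[1 + dy: 1 + dy + mask.shape[0],
                          1 + dx: 1 + dx + mask.shape[1]]
    return out

# toy frames: a bright block appears inside the rough-location window
f1 = np.zeros((10, 10), dtype=np.uint8); f1[4:6, 4:6] = 255
f2 = np.zeros((10, 10), dtype=np.uint8)
region = dilate(binary_region(f1, f2, (2, 2, 9, 9)))
```

Restricting differencing to the rough window is what keeps this cheap compared with evolving an active contour over the whole frame.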

  15. 3. Object Contour Tracking
   Object exact location guided particle propagation
   • Particles face sample impoverishment during PF propagation
   • If the random noise of the state transition model is small, sample impoverishment becomes obvious; if the random noise is large, more particles are needed for exact state prediction
   • The adaptive noise coefficient model (ANCM) computes state-adaptive noise tied to the object scale, balancing sampling effectiveness and diversity, as in (16)
       x'_k = x_{k-1} + (x̂_{k-1} − x_{k-2}) + n_k      (5)
       n'_k = r_k · n_k = (a · wr_k + b · hr_k) · n_k    (16)
   (n_k follows a zero-mean Gaussian distribution; wr_k and hr_k represent the object width and height scale changes; a = 0.7, b = 0.3)
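Equation (16) scales the base transition noise by a coefficient built from the object's scale changes. The a = 0.7, b = 0.3 values come from the slide; the example scale factors are made up:

```python
import numpy as np

def adaptive_noise(n_k, wr_k, hr_k, a=0.7, b=0.3):
    # (16): n'_k = r_k * n_k, with r_k = a * wr_k + b * hr_k
    r_k = a * wr_k + b * hr_k   # larger scale change -> larger noise
    return r_k * n_k

rng = np.random.default_rng(2)
n_k = rng.normal(0.0, 1.0, 4)            # zero-mean Gaussian base noise
small = adaptive_noise(n_k, 0.5, 0.5)    # nearly static object: tight spread
large = adaptive_noise(n_k, 2.0, 2.0)    # fast-scaling object: wide spread
```

When the object barely changes size, the particles stay tightly clustered (good sampling effectiveness); when it scales quickly, the spread grows automatically (good diversity), which is exactly the trade-off ANCM is balancing.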

  16. 3. Object Contour Tracking
   Experimental results – ANCM
   • Adaptive noise coefficient model (ANCM) tracking results
   Fig. 6: Tracking results on the ‘people’ sequence from TRECVID. The first row shows the tracking results of the proposed method with ANCM; the second row shows the results without ANCM.

  17. 3. Object Contour Tracking
   Experimental results – exact contour
   • Object rough location and exact contour tracking
   Fig. 7: Object rough location and exact contour tracking in a room scene for the Lu_1 video sequence
   Fig. 8: Object rough location and exact contour tracking in a weak and changing illumination scene for the Zhang_2 video sequence
   The rough location is marked with a green rectangle and the exact contour with a blue connecting line; the exact contour is less accurate around the person’s face.

  18. 4. Summary and Future Work
   Summary
   • This new object contour tracking framework combines two steps for accurate object contour tracking
   • Step 1: rough object location, realized by fusing color histogram and Harris corner features in a particle filter framework
   • Step 2: region-based TDM, applied for exact contour detection, which is simpler than active contour models
   • The experimental results demonstrate the performance of the proposed model
