
SLIDE 1

Object Contour Tracking Using Multi-feature Fusion based Particle Filter

Xiaofeng Lu[1][3], Li Song[1][2], Songyu Yu[1], Nam Ling[2]

[1] Institute of Image Communication and Information Processing, SJTU, China. [2] Department of Computer Engineering, Santa Clara University, USA. [3] SCIE, Shanghai University, China.

ICIEA 2012, Singapore, 18 July 2012

SLIDE 2

Outline

1. Main Idea
2. Multi-Feature Fusion based Particle Filter: Rough Object Location
3. Object Contour Tracking: Accurate Object Contour Location
4. Summary and Future Work

SLIDE 3

Main idea

An accurate object contour is useful for computer vision (CV) and pattern recognition (PR) applications.

The algorithm involves a rough object-location step and a rough-region-based accurate contour-detection step.

Rough object location is realized by fusing color-histogram and Harris-corner features within a particle filter framework.

A region-based temporal differencing model is adopted for object contour detection; it is faster than traditional active contour models.

  • 1. Main Idea

SLIDE 4

Previous works

Our previous CamShift guided particle filter (CAMSGPF) significantly improved object tracking accuracy [1].

The active contour model (ACM) was first proposed by Kass et al. in 1988 for object contour segmentation.

Parametric active contour model: snake representation. Geometric active contour model: curve evolution using level sets. Both apply an energy-minimizing function to a deformable contour to approach the true target boundaries.

Real-time, accurate object contour tracking is critical for further pattern recognition applications.

[1] Z. Wang, X. Yang, Y. Xu, and S. Yu, "CamShift guided particle filter for visual tracking," Pattern Recognition Letters, Vol. 30, pp. 407-413, Mar. 2009.

SLIDE 5

Particle Filter Introduction [M. Isard and A. Blake, 1998]

A Bayesian approach to dynamic state estimation. It maintains multiple hypotheses, and its complexity grows with the number of samples. The state transition model and observation model are given as (2) and (3), and from them the posterior probability density function (pdf) p(x_k | z_{1:k}) is constructed.

  • 2. Multi-Feature Fusion based Particle Filter

x_k = f_k(x_{k−1}, n_{k−1})    (2)

z_k = E_k(x_k, θ_k)    (3)

Slide diagram: color space conversion, then particle propagation. Sample 20 particles; each particle represents one state.

w1=0.22  w2=0.51  w3=0.21  w4=0.43  w5=0.32  w6=0.31  ...  w20=0.36

Diagram labels: states of all particles; most closely matched target in frame T; target in frame T.
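As a concrete illustration of the predict/weight/resample cycle sketched on this slide, here is a minimal bootstrap particle filter in Python. This is not the paper's implementation: the Gaussian likelihood, the noise level, and the function names are illustrative assumptions.

```python
import numpy as np

def propagate(particles, noise_std, rng):
    # State transition: each particle drifts under Gaussian noise (a stand-in for eq. (2)).
    return particles + rng.normal(0.0, noise_std, particles.shape)

def update_weights(particles, observation, sigma):
    # Observation model: Gaussian likelihood of each particle's distance to the
    # measurement, normalized so the weights form a distribution (a stand-in for eq. (3)).
    d2 = np.sum((particles - observation) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return w / np.sum(w)

def resample(particles, weights, rng):
    # Draw N indices proportional to weight to fight degeneracy.
    n = len(particles)
    return particles[rng.choice(n, size=n, p=weights)]

# One tracking step with 20 particles, matching the slide's example.
rng = np.random.default_rng(0)
particles = np.zeros((20, 2))               # 20 hypotheses of a 2-D state
particles = propagate(particles, 0.5, rng)  # predict
weights = update_weights(particles, np.array([0.2, 0.1]), 1.0)  # weight
particles = resample(particles, weights, rng)                   # resample
```

The state estimate at each frame is then typically taken as the weighted mean of the particle states.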

SLIDE 6

Particle Filter for object tracking

Initialization of PF

The particle filter is an optimal Bayesian algorithm for nonlinear, non-Gaussian object tracking. The state is initialized as in (1).

System observation model in PF

The posterior probability density function is approximated by a set of weighted particles; the particle weights are updated as in (4).

Dynamic state transition model in PF

The evolution of the particle set is achieved by propagating each particle with the dynamic model (5).

  • 2. Multi-Feature Fusion based Particle Filter

x = (x_c, y_c, w, h)^T    (1)

w_k^i ∝ p(z_k | x_k^i) p(x_k^i | x_{k−1}^i) / q(x_k^i | x_{0:k−1}^i, z_k) ∝ exp(−λ D²[h, h(x_k^i)])    (4)

x′_k = x̂_{k−1} + (x_{k−1} − x_{k−2}) + n_k    (5)
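The second-order dynamic model (5), which propagates each particle by the previous estimate plus the last inter-frame velocity, can be sketched as follows (a minimal illustration; the function name and noise handling are assumptions):

```python
import numpy as np

def second_order_propagate(x_hat_prev, x_prev, x_prev2, noise_std, rng):
    # Eq. (5): x'_k = x_hat_{k-1} + (x_{k-1} - x_{k-2}) + n_k,
    # i.e. last estimate plus last velocity plus zero-mean Gaussian noise.
    velocity = x_prev - x_prev2
    return x_hat_prev + velocity + rng.normal(0.0, noise_std, np.shape(x_hat_prev))
```

With zero noise this reduces to a constant-velocity prediction, which makes the model easy to sanity-check.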

SLIDE 7

Color weight estimation: color feature

An HSV color-space histogram is used for color weight estimation. After particle propagation, the Bhattacharyya coefficient and Bhattacharyya distance measure the similarity between the candidate and the reference template, as in (9) and (10).

The color feature is robust to deformation but sensitive to illumination changes and similar-color clutter.

d_color = sqrt(1 − ρ[p_ref, q_can]) = sqrt(1 − Σ_{i=1}^{N} sqrt(p_ref^i(y) · q_can^i(y)))    (9)

weight_color^j = (1 / (sqrt(2π) σ_color)) exp(−d_color² / (2σ_color²))    (10)
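Equations (9) and (10) can be sketched over normalized histograms as follows; the sigma value and the function names are illustrative assumptions, not values from the paper:

```python
import numpy as np

def bhattacharyya_distance(p_ref, q_can):
    # Eq. (9): rho is the Bhattacharyya coefficient of two normalized histograms;
    # the distance is sqrt(1 - rho), 0 for identical histograms.
    rho = np.sum(np.sqrt(p_ref * q_can))
    return np.sqrt(max(0.0, 1.0 - rho))

def color_weight(p_ref, q_can, sigma=0.2):
    # Eq. (10): Gaussian of the Bhattacharyya distance.
    d = bhattacharyya_distance(p_ref, q_can)
    return np.exp(-d ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)
```

A candidate whose histogram matches the reference gets the maximal weight; dissimilar candidates decay smoothly with distance.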

SLIDE 8

Harris corner weight estimation: corner feature

Obtain the motion edge through frame differencing and the Canny operator. The Harris corner detector is used to extract feature points on the motion edge, reducing the edge-matching computation.

Fig. 1 (a) and (b) are adjacent frames; (c) result of temporal differencing; (d) result of Canny edge detection; (e) result of the AND logic operation; (f) result of Harris corner detection

SLIDE 9

Harris corner weight estimation: corner feature

Corner features are more stable under rotation, zoom, and illumination change. We use the Hausdorff distance [2] to represent the similarity between the candidate and reference point sets. The Hausdorff distance does not need to establish a point-to-point correspondence between two finite point sets.

For two finite point sets A = {a1, a2, ..., an} and B = {b1, b2, ..., bm}, the Hausdorff distance is defined by (11), (12), and (13) based on the Euclidean distance:

d(A, B) = max[h(A, B), h(B, A)]    (11)

h(A, B) = max_{a_i∈A} min_{b_j∈B} ||a_i − b_j||    (12)

h(B, A) = max_{b_j∈B} min_{a_i∈A} ||b_j − a_i||    (13)

  • We can define the Harris corner weight of the candidate from this similarity measurement as (14):

weight_contour^j = (1 / (sqrt(2π) σ_contour)) exp(−d_contour² / (2σ_contour²))    (14)

[2] D. P. Huttenlocher, G. A. Klanderman, and W. A. Rucklidge, "Comparing images using the Hausdorff distance," IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol. 15, pp. 850-863, 1993.
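The Hausdorff distance (11)-(13) between two finite point sets has a direct numpy sketch (an illustration only; in the tracker it would be computed over the Harris corner sets):

```python
import numpy as np

def directed_hausdorff(A, B):
    # Eq. (12)/(13): h(A, B) = max over a in A of min over b in B of ||a - b||.
    pairwise = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return np.max(np.min(pairwise, axis=1))

def hausdorff_distance(A, B):
    # Eq. (11): symmetric Hausdorff distance; no point-to-point
    # correspondence between the two sets is required.
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))
```

Note that the directed distance is asymmetric, which is why (11) takes the maximum of both directions.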

SLIDE 10

Multi-feature fusion based PF object rough location

  • 2. Multi-Feature Fusion based Particle Filter

weight_t^j = α · weight_color,t^j + β · weight_contour,t^j    (15)

From (15) we obtain the fusion weight of the current candidate. The feature fusion method is tolerant to illumination change and deformation.

  • Fig. 2 Particle propagation schematic diagram

Diagram labels: input image; particle n; particle n+1; color histogram of particle n; color histogram of particle n+1; Harris corner of particle n; Harris corner of particle n+1; propagation.
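The fusion rule (15) is a per-particle linear combination of the two cues; a minimal sketch follows (the final normalization step and the default α = β = 0.5 are assumptions, since resampling needs a proper distribution and the paper's coefficients are not stated here):

```python
import numpy as np

def fuse_weights(w_color, w_contour, alpha=0.5, beta=0.5):
    # Eq. (15): weight_j = alpha * color weight + beta * corner weight,
    # per particle, then normalized across particles to sum to 1.
    w = alpha * np.asarray(w_color, dtype=float) + beta * np.asarray(w_contour, dtype=float)
    return w / np.sum(w)
```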

SLIDE 11

Experimental results -- rough location

Color and Harris corner fusion based PF object rough location. The blue ellipse is the original PF result; the green ellipse is the CAMSGPF result; the red ellipse is the tracking result of our proposed fusion scheme.

  • Fig. 3 David2_WMM_xvid results
  • Fig. 4 Anvar1GlobalNormal results

SLIDE 12

Experimental results -- rough location

David sequence tracking demo


CAMSGPF Color and Harris corner fusion based PF

Color and Harris corner fusion based object tracking for David sequence


SLIDE 13

Experimental results -- rough location

CAR sequence tracking demo


CAMSGPF Color and Harris corner fusion based PF

Color and Harris corner fusion based object tracking for CAR sequence


SLIDE 14

Region-based temporal differencing model (RTDM)

Obtain a binary region image from the rough object location by RTDM. Get the exact object contour through morphology operations and a contour-finder function. This object contour tracking method runs in real time and reduces the computational cost of the whole algorithm.

  • 3. Object Contour Tracking
  • Fig. 5 Region-based TDM results: (a) current frame; (b) object binary region image; (c) object rough location and exact contour
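The RTDM step can be illustrated with a plain numpy frame-differencing sketch (the morphology cleanup and contour finding are omitted; the threshold value and function name are assumptions):

```python
import numpy as np

def temporal_difference_mask(frame_prev, frame_curr, threshold=25):
    # Binary motion region from the absolute difference of two grayscale frames.
    # In the full method this mask is restricted to the rough object location,
    # then cleaned with morphology operations before contour finding.
    diff = np.abs(frame_curr.astype(np.int16) - frame_prev.astype(np.int16))
    return (diff > threshold).astype(np.uint8)
```

Casting to a signed type before subtracting avoids the uint8 wrap-around that would otherwise corrupt the difference image.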

SLIDE 15

Object exact location guided particle propagation

Particles face sample impoverishment during PF propagation. If the random noise of the state transition model is small, sample impoverishment becomes obvious; if the random noise is large, more particles are needed to realize exact state prediction.

The adaptive noise coefficient model (ANCM) computes state-adaptive noise associated with the object scale to balance sampling effectiveness and diversity, as in (16).

n′_k = r_k ∗ n_k = (a ∗ wr_k + b ∗ hr_k) ∗ n_k    (16)

(n_k follows a zero-mean Gaussian distribution; wr_k and hr_k represent the object width and height scale variation; a = 0.7, b = 0.3.)

x′_k = x̂_{k−1} + (x_{k−1} − x_{k−2}) + n_k    (5)
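The adaptive noise coefficient model (16) scales the Gaussian transition noise by the object's width/height scale change; a minimal sketch using the slide's a = 0.7, b = 0.3 (the function name is illustrative):

```python
import numpy as np

def adaptive_noise(n_k, wr_k, hr_k, a=0.7, b=0.3):
    # Eq. (16): n'_k = r_k * n_k with r_k = a * wr_k + b * hr_k,
    # where n_k is zero-mean Gaussian noise and wr_k, hr_k are the
    # object's width and height scale variations.
    r_k = a * wr_k + b * hr_k
    return r_k * np.asarray(n_k)
```

When the object scale is stable (wr_k = hr_k = 1) the noise is unchanged; growing or shrinking objects get proportionally larger or smaller transition noise.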

SLIDE 16

Experimental results -- ANCM

Adaptive noise coefficient model (ANCM) tracking results. The first row shows the tracking results of our proposed method with ANCM; the second row shows the results without ANCM.

  • Fig. 6 Tracking results for the 'people' sequence from TRECVID

SLIDE 17

Experimental results -- exact contour

Object rough location and exact contour tracking. The rough object location is shown with a green rectangle and the exact contour with a blue connection line. The exact contour is not as good on the person's face.

  • Fig. 7 Object rough location and exact contour tracking in a room scene for the Lu_1 video sequence
  • Fig. 8 Object rough location and exact contour tracking in a weak, changing-illumination scene for the Zhang_2 video sequence

SLIDE 18

Summary

This new object contour tracking framework combines two steps for accurate object contour tracking.

Step 1: rough object location, realized by a color histogram and Harris corner feature fusion method in a particle filter framework.

Step 2: region-based TDM, applied for exact contour detection; it is simpler than active contour models.

The experimental results demonstrate the performance of the proposed model.

  • 4. Summary and Future Work

SLIDE 19

Ongoing and future work:

Embedding object motion information into the exact contour step. A Snake active contour model embedded in the PF framework is ongoing work.

  • Fig. 9 VFC snake active contour model embedded PF framework results

B. Li and S. T. Acton, "Active contour external force using vector field convolution for image segmentation," IEEE Trans. on Image Processing, Vol. 16, No. 8, pp. 2096-2106, 2007.

SLIDE 20

Thanks!
