Summary: Bayesian (Multi-)Sensor Tracking - PowerPoint PPT Presentation


  1.–5. Summary: Bayesian (Multi-)Sensor Tracking
  • Basis: In the course of time, one or several sensors produce measurements of targets of interest. Each target is characterized by its current state vector, which is expected to change with time.
  • Objective: Learn as much as possible about the individual target states at each time by analyzing the ‘time series’ constituted by the sensor data.
  • Problem: Imperfect sensor information: inaccurate, incomplete, and possibly ambiguous. Moreover, the targets’ temporal evolution is usually not well known.
  • Approach: Interpret measurements and state vectors as random variables (RVs). Describe what is known about them by probability density functions (pdfs).
  • Solution: Derive iteration formulae for calculating the pdfs, and develop a mechanism for initiation. In doing so, exploit all available background information. Derive state estimates from the pdfs along with appropriate quality measures.
  Sensor Data Fusion - Methods and Applications, 3rd Lecture on April 24, 2019 — slides 1–5

  6. Bayesian Multiple Sensor Tracking: Basic Idea
  Iterative updating of conditional probability densities! Kinematic target state x_k at time t_k, accumulated multiple sensor data Z^k; a priori knowledge: target dynamics models, sensor model, other context.
  • prediction (dynamics model, context): p(x_{k−1} | Z^{k−1}) ⟶ p(x_k | Z^{k−1})
  • filtering (sensor data Z_k, sensor model): p(x_k | Z^{k−1}) ⟶ p(x_k | Z^k)
  • retrodiction (filtering output, dynamics model): p(x_l | Z^k) ⟶ p(x_{l−1} | Z^k)
  − finite mixtures: inherent ambiguity (data, model, road network)
  − optimal estimators: e.g. minimum mean squared error (MMSE)
  − initiation of the pdf iteration: multiple hypothesis track extraction
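The prediction–filtering recursion on this slide can be sketched numerically. Below is a minimal illustration on a discretized 1-D state space; the transition matrix and sensor likelihood are hypothetical placeholders, not the lecture's models:

```python
import numpy as np

# Hypothetical 1-D state grid; a pdf is represented as a probability vector.
n = 5

# Dynamics model p(x_k | x_{k-1}): the target tends to move one cell to the right.
T = np.zeros((n, n))
for j in range(n):
    T[j, j] = 0.2
    T[j, min(j + 1, n - 1)] += 0.8

def predict(p_post):
    """Prediction: p(x_k | Z^{k-1}) = sum_j p(x_k | x_{k-1}=j) p(x_{k-1}=j | Z^{k-1})."""
    return T.T @ p_post

def filter_step(p_pred, likelihood):
    """Filtering: multiply by the sensor likelihood p(z_k | x_k) and renormalize (Bayes)."""
    p = likelihood * p_pred
    return p / p.sum()

# One prediction-filtering cycle: start certain at cell 0,
# then observe 'somewhere near cell 1'.
p = np.eye(n)[0]
p = predict(p)
p = filter_step(p, likelihood=np.array([0.1, 0.7, 0.1, 0.05, 0.05]))
print(p)  # posterior mass concentrated on cell 1
```

Retrodiction would run the analogous smoothing recursion backwards over the stored filtering output.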

  7. Recapitulation: The Multivariate Gaussian Pdf
  − wanted: probabilities ‘concentrated’ around a center x̄
  − quadratic distance: q(x) = ½ (x − x̄)^⊤ P^{−1} (x − x̄); q(x) defines an ellipsoid around x̄, its volume and orientation being determined by a matrix P (symmetric: P^⊤ = P, positive definite: all eigenvalues > 0).
  − first attempt: p(x) = e^{−q(x)} / ∫ dx e^{−q(x)} (normalized!)
  − p(x) = N(x; x̄, P) = |2πP|^{−1/2} e^{−½ (x − x̄)^⊤ P^{−1} (x − x̄)}
  − Gaussian mixtures: p(x) = Σ_i p_i N(x; x̄_i, P_i) (weighted sums)
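The density and mixture formulae above can be evaluated directly. A minimal sketch with NumPy (the example center and covariance are illustrative):

```python
import numpy as np

def gaussian_pdf(x, x_bar, P):
    """Evaluate N(x; x_bar, P) = |2 pi P|^(-1/2) exp(-q(x))."""
    diff = x - x_bar
    q = 0.5 * diff @ np.linalg.solve(P, diff)   # quadratic distance q(x)
    return np.exp(-q) / np.sqrt(np.linalg.det(2 * np.pi * P))

def mixture_pdf(x, weights, means, covs):
    """Gaussian mixture: weighted sum of Gaussian components."""
    return sum(w * gaussian_pdf(x, m, P) for w, m, P in zip(weights, means, covs))

# Illustrative 2-D Gaussian centered at the origin
P = np.array([[2.0, 0.5],
              [0.5, 1.0]])
x_bar = np.zeros(2)
print(gaussian_pdf(x_bar, x_bar, P))  # peak value |2 pi P|^(-1/2)
```

The peak value at the center is exactly |2πP|^{−1/2}, since q(x̄) = 0 there.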

  8. Very First Look at an Important Data Fusion Algorithm
  Kalman filter: x_k = (r_k^⊤, ṙ_k^⊤)^⊤, Z^k = {z_k, Z^{k−1}}
  − initiation: p(x_0) = N(x_0; x_{0|0}, P_{0|0}), initial ignorance: P_{0|0} ‘large’
  − prediction (dynamics model F_{k|k−1}, D_{k|k−1}): N(x_{k−1}; x_{k−1|k−1}, P_{k−1|k−1}) ⟶ N(x_k; x_{k|k−1}, P_{k|k−1})
      x_{k|k−1} = F_{k|k−1} x_{k−1|k−1}
      P_{k|k−1} = F_{k|k−1} P_{k−1|k−1} F_{k|k−1}^⊤ + D_{k|k−1}
  − filtering (current measurement z_k, sensor model H_k, R_k): N(x_k; x_{k|k−1}, P_{k|k−1}) ⟶ N(x_k; x_{k|k}, P_{k|k})
      x_{k|k} = x_{k|k−1} + W_{k|k−1} ν_{k|k−1},   ν_{k|k−1} = z_k − H_k x_{k|k−1}
      S_{k|k−1} = H_k P_{k|k−1} H_k^⊤ + R_k
      W_{k|k−1} = P_{k|k−1} H_k^⊤ S_{k|k−1}^{−1}   (‘Kalman gain matrix’)
      P_{k|k} = P_{k|k−1} − W_{k|k−1} S_{k|k−1} W_{k|k−1}^⊤
  A deeper look into the dynamics and sensor models is necessary!
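The prediction and filtering formulae translate almost line by line into code. A minimal sketch assuming a 1-D position/velocity state; the matrices F, D, H, R below are illustrative placeholders:

```python
import numpy as np

def kf_predict(x, P, F, D):
    """Prediction: x_pred = F x,  P_pred = F P F^T + D."""
    return F @ x, F @ P @ F.T + D

def kf_filter(x_pred, P_pred, z, H, R):
    """Filtering: update the predicted estimate with measurement z."""
    nu = z - H @ x_pred                    # innovation
    S = H @ P_pred @ H.T + R               # innovation covariance
    W = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain matrix
    return x_pred + W @ nu, P_pred - W @ S @ W.T

# Illustrative 1-D position/velocity model, unit time step
F = np.array([[1.0, 1.0],
              [0.0, 1.0]])                 # constant-velocity dynamics
D = 0.01 * np.eye(2)                       # process noise covariance
H = np.array([[1.0, 0.0]])                 # position-only measurement
R = np.array([[1.0]])                      # measurement noise covariance

x, P = np.zeros(2), 100.0 * np.eye(2)      # initial ignorance: P 'large'
x, P = kf_predict(x, P, F, D)
x, P = kf_filter(x, P, np.array([1.2]), H, R)
print(x)  # estimate pulled strongly toward the measured position 1.2
```

With a 'large' initial covariance, the first filtering step is dominated by the measurement, which is exactly the intended effect of the initiation scheme.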

  9.–12. How to Deal with Probability Density Functions?
  • pdf p(x): extract probability statements about the RV x by integration!
  • naïvely: positive and normalized functions (p(x) ≥ 0, ∫ dx p(x) = 1)
  • conditional pdf p(x|y) = p(x, y) / p(y): impact of information on y on the RV x?
  • marginal density p(x) = ∫ dy p(x, y) = ∫ dy p(x|y) p(y): enter y!
  • Bayes: p(x|y) = p(y|x) p(x) / p(y), with p(y) = ∫ dx p(y|x) p(x); i.e. p(x|y) ← p(y|x), p(x)!
  • certain knowledge on x: p(x) = δ(x − y) ‘=’ lim_{σ→0} (1/(√(2π) σ)) e^{−(x−y)²/(2σ²)}
  • transformed RV y = t[x]: p(y) = ∫ dx p(y, x) = ∫ dx p(y|x) p_x(x) = ∫ dx δ(y − t[x]) p_x(x) =: [T p_x](y)   (T: p_x ↦ p, “transfer operator”)
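The Bayes rule above, including the marginalization for p(y), can be checked numerically on a grid. A small sketch with an assumed standard Gaussian prior and a Gaussian sensor likelihood; the numbers are illustrative:

```python
import numpy as np

# 1-D grid approximation: a pdf is a vector of values on x, integrals are sums * dx
x = np.linspace(-5.0, 5.0, 2001)
dx = x[1] - x[0]

prior = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)   # p(x) = N(x; 0, 1)
y_obs, sigma = 1.0, 0.5
lik = np.exp(-0.5 * ((y_obs - x) / sigma) ** 2)    # p(y|x), Gaussian sensor

evidence = np.sum(lik * prior) * dx                # p(y) = integral dx p(y|x) p(x)
posterior = lik * prior / evidence                 # Bayes: p(x|y)

print(np.sum(posterior) * dx)                      # normalized: approximately 1
```

For this conjugate Gaussian case the posterior mean has the closed form y·σ⁻²/(σ⁻² + 1) = 0.8, which the grid computation reproduces.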
