  1. SensIT Collaborative Signal Processing Candidate Tracking Benchmarks v0.3
     Jim Reich, Xerox PARC
     DARPA CSP Workshop, January 15, 2001, Palo Alto, CA

  2. Issues to Consider as we go
     • What are the unique challenges of the scenario?
       – Is there a way to make the scenario more “fundamental” and focus on the challenge?
     • What information needs to be combined from multiple nodes, and how often?
     • What are some likely queries?
     • How would we benchmark success?
     • Can this test be implemented as a real-world experiment?

  3. Assumptions
     • Single vehicle unless otherwise shown
     • Unless otherwise stated, vehicles attempt to maintain constant speed and do not shift gears
     • Unless otherwise stated, vehicles start beyond the sensor field and arrive in sequence
     • Complications to be handled as we get better:
       – Variation of acoustic signature with aspect
       – Acceleration, braking, and gear shifts (these change the acoustic signals)
       – Spatio-temporally varying propagation models
       – Node unreliability

  4. Linear Array Laydown Example (optional)
     [Diagram: sensor nodes (microphone/microphone array, PIR, 3-axis seismic) and a query source laid out along a 600 m stretch of road; nodes may continue along the road (optional)]

  5. 2D Array Laydown Example
     [Diagram: sensor nodes (microphone/microphone array, PIR, 3-axis seismic) randomly distributed in [X, Y, θ] over a 300 m field]

  6. Benchmarking
     • Generic Benchmarks
       – Energy Consumption
         • Total, per node (max, avg)
         • f(computation, communication)
       – Detection Accuracy
         • Frequency of false positives, negatives
       – Detection Latency
         • Mean, max vs. query source location
       – Tracking Accuracy
         • Mean, max, std.
         • f(desired output frequency)
       – Tracking Latency
         • Mean, max vs. query source location
     • Task-specific Benchmarks
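The tracking-accuracy figures above (mean, max, std of position error, as a function of desired output frequency) imply a scoring procedure against ground truth. Below is a minimal sketch of one way to compute them; the function name, the (x, y) track format, and the interpolation onto the estimate's timestamps are illustrative assumptions rather than part of the benchmark definition.

    # Sketch: scoring tracking accuracy (mean / max / std of position error)
    # against ground truth. Time bases are aligned by linear interpolation;
    # names and formats are illustrative, not from the benchmark spec.
    import numpy as np

    def tracking_accuracy(t_est, xy_est, t_true, xy_true):
        """Interpolate truth onto the estimate's timestamps and return
        (mean, max, std) of the Euclidean position error in meters."""
        x_true = np.interp(t_est, t_true, xy_true[:, 0])
        y_true = np.interp(t_est, t_true, xy_true[:, 1])
        err = np.hypot(xy_est[:, 0] - x_true, xy_est[:, 1] - y_true)
        return err.mean(), err.max(), err.std()

    if __name__ == "__main__":
        t_true = np.linspace(0.0, 60.0, 601)            # 0.1 s ground-truth samples
        truth = np.stack([5.0 * t_true, np.full_like(t_true, 10.0)], axis=1)
        t_est = np.arange(0.0, 60.0, 1.0)               # 1 Hz tracker output
        est = np.stack([5.0 * t_est + np.random.normal(0, 2, t_est.size),
                        10.0 + np.random.normal(0, 2, t_est.size)], axis=1)
        print("mean/max/std error (m): %.2f / %.2f / %.2f"
              % tracking_accuracy(t_est, est, t_true, truth))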

  7. 1: Track Single Target
     Task
     • Estimate target position vs. time
     Challenges
     • Localize target
     • Maintain accurate estimate in large gaps between sensors
     • Fuse data from multiple sensor types
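For the single-target task, a standard baseline estimator is a constant-velocity Kalman filter fed by fused position fixes. The sketch below assumes the network delivers noisy (x, y) fixes at 1 s intervals; the state layout and noise values are placeholder assumptions, not values from the slides.

    # Sketch: constant-velocity Kalman filter for Task 1, assuming fused
    # (x, y) position fixes with known measurement noise arrive at 1 Hz.
    import numpy as np

    dt = 1.0
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)     # state: [x, y, vx, vy]
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)     # we observe position only
    Q = np.diag([0.1, 0.1, 1.0, 1.0])             # process noise (maneuver slack)
    R = np.diag([25.0, 25.0])                     # 5 m std measurement noise

    def kf_step(x, P, z):
        # Predict through the constant-velocity model.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the new position fix z = [x_meas, y_meas].
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P
        return x, P

    x, P = np.zeros(4), np.eye(4) * 100.0          # vague prior
    for z in [np.array([0.0, 0.0]), np.array([5.2, 0.3]), np.array([9.8, -0.4])]:
        x, P = kf_step(x, P, z)
    print("position estimate:", x[:2], "velocity estimate:", x[2:])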

  8. 2: Track Single Maneuvering Target
     Task
     • Estimate target position vs. time
     Challenges
     • No road, hence no prior knowledge of vehicle trajectory
     • Constant-direction dynamics models no longer adequate
     • Many sensors making simultaneous observations
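When a single constant-direction model is inadequate, one generic remedy is to watch the filter's normalized innovation squared and inflate the process noise (or switch dynamics models) while it exceeds a chi-square gate. The sketch below illustrates that idea; the gate value and inflation factor are assumptions, and this is not necessarily the approach the slide has in mind.

    # Sketch: simple maneuver handling for Task 2. When the normalized
    # innovation squared (NIS) of a constant-velocity filter exceeds a
    # chi-square gate, the process noise is inflated so the filter can
    # follow the turn. Gate and inflation factor are illustrative.
    import numpy as np

    CHI2_GATE_2DOF_99 = 9.21   # 99% gate for a 2-D position innovation

    def adapt_process_noise(innovation, S, Q_nominal, inflate=10.0):
        """Return (Q for the next predict step, maneuver flag): nominal Q during
        straight-line motion, inflated Q when the innovation rejects the model."""
        nis = float(innovation @ np.linalg.inv(S) @ innovation)
        if nis > CHI2_GATE_2DOF_99:
            return Q_nominal * inflate, True
        return Q_nominal, False

    # Example: a 40 m innovation against a 5 m measurement std trips the gate.
    innov = np.array([40.0, 0.0])
    S = np.diag([25.0, 25.0])
    Q_nom = np.diag([0.1, 0.1, 1.0, 1.0])
    Q_next, maneuvering = adapt_process_noise(innov, S, Q_nom)
    print("maneuver detected:", maneuvering)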

  9. 3: Track Accelerating/Decelerating Target
     Task
     • Estimate target position vs. time
     Challenges
     • Vehicle signature time-varying
     • Constant-velocity dynamics models no longer adequate
     • Gear shift requires maintaining internal discrete state (current gear)
     Scenario: vehicle begins stationary and idling at point “A”, accelerates, maintains constant velocity, then decelerates and stops at point “B”. Extra credit: handle gear shifts.

  10. 4: Count Stationary (idling) Targets
     Task
     • Count number of targets
     • Locate targets
     Challenges
     • Multiple vehicles
     • Unknown number of vehicles
     • Cannot depend on peak-finding (CPA) of acoustic signal
     Task-Specific Benchmarks
     • Accuracy of count vs. dynamic range of acoustic outputs from ensemble of vehicles
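Because CPA peak-finding is unavailable for stationary vehicles, a crude baseline is to threshold per-node acoustic energy and count spatially separated groups of "hot" nodes. The sketch below does that with connected-components clustering; the detection threshold and linking distance are illustrative assumptions, and the method presumes vehicles are farther apart than the linking distance.

    # Sketch: baseline counter for Task 4. Nodes whose acoustic energy exceeds
    # a detection threshold are grouped into connected components (any two hot
    # nodes within link_dist are linked) and each component counts as one
    # vehicle. All constants are illustrative assumptions.
    import numpy as np

    def count_idling_vehicles(node_xy, node_energy, detect_thresh=0.5, link_dist=60.0):
        hot = node_xy[node_energy > detect_thresh]   # nodes that hear something
        n = len(hot)
        seen = [False] * n
        count = 0
        for start in range(n):
            if seen[start]:
                continue
            count += 1                               # new component found
            stack = [start]
            seen[start] = True
            while stack:
                i = stack.pop()
                for j in range(n):
                    if not seen[j] and np.hypot(*(hot[i] - hot[j])) < link_dist:
                        seen[j] = True
                        stack.append(j)
        return count

    nodes = np.array([[50.0, 50.0], [70.0, 60.0], [200.0, 220.0], [150.0, 40.0]])
    energy = np.array([3.0, 2.5, 4.0, 0.1])          # two hot groups, one quiet node
    print("estimated count:", count_idling_vehicles(nodes, energy, detect_thresh=1.0))
    # -> 2 (nodes 0 and 1 merge into one cluster; node 2 is its own cluster)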

  11. 5: Two-Way Traffic
     Task
     • Track target positions
     • Estimate target crossing time
     Challenges
     • Vehicles in close proximity; need to use dynamics to keep identities separate
     Task-Specific Benchmarks
     • Accuracy of crossing time estimate
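The crossing-time benchmark can be scored by interpolating the gap between the two along-road track estimates through its zero crossing. The sketch below assumes both tracks are sampled on a common time base in a 1-D road coordinate; that format is an assumption for illustration.

    # Sketch for Task 5: estimate when two opposing tracks cross by linearly
    # interpolating the zero crossing of their along-road position gap.
    import numpy as np

    def crossing_time(t, s_a, s_b):
        gap = np.asarray(s_a) - np.asarray(s_b)
        sign_change = np.where(np.diff(np.sign(gap)) != 0)[0]
        if sign_change.size == 0:
            return None                              # tracks never cross
        i = sign_change[0]
        # Linear interpolation of the zero crossing between samples i and i+1.
        return t[i] + (t[i + 1] - t[i]) * gap[i] / (gap[i] - gap[i + 1])

    t = np.arange(0.0, 10.0, 1.0)
    s_a = 0.0 + 20.0 * t            # vehicle A: 20 m/s eastbound
    s_b = 300.0 - 15.0 * t          # vehicle B: 15 m/s westbound
    print("estimated crossing time: %.2f s" % crossing_time(t, s_a, s_b))  # ~8.57 s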

  12. 6: Convoy on a Road
     Task
     • Count number of vehicles of each type
     • Determine order of vehicles
     Challenges
     • Multiple vehicles
     • Classification and state information must follow vehicle along full length of road
     Task-Specific Benchmarks
     • Accuracy of count & order vs. vehicle spacing & convoy velocity
     Vary inter-vehicle spacing to vary problem difficulty.
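If each roadside node reports a classification label at each vehicle's closest point of approach (CPA), convoy size and order fall out of sorting those reports by time. The sketch below assumes such a report format and well-separated CPA peaks; both are assumptions rather than requirements stated on the slide.

    # Sketch for Task 6: recover convoy size and ordering from one node's
    # CPA detections, each tagged with a class label. Assumes inter-vehicle
    # spacing is large enough that CPA peaks do not merge.
    from collections import Counter

    def convoy_summary(detections):
        """detections: list of (cpa_time_s, class_label) from one roadside node."""
        ordered = [label for _, label in sorted(detections)]
        return ordered, Counter(ordered)

    reports = [(42.0, "tracked"), (12.5, "wheeled"), (27.1, "wheeled"), (55.3, "wheeled")]
    order, counts = convoy_summary(reports)
    print("order:", order)          # ['wheeled', 'wheeled', 'tracked', 'wheeled']
    print("counts:", dict(counts))  # {'wheeled': 3, 'tracked': 1}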

  13. 7: Perimeter Violation Sensing
     Task
     • Alert on violation of perimeter
     • Ignore activity outside of perimeter (distractors)
     • Identify violator type and track location
     Challenges
     • Filter out distractor
     • Respond quickly while minimizing quiescent activity
     Task-Specific Benchmarks
     • Detection delay
     • Power usage during periods of no violation
     • Frequency of false positives
     (all of the above vs. distractor/violator source amplitude ratio)
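A simple way to trade detection delay against quiescent power is an adaptive energy threshold with a short confirmation window, so nodes can stay mostly idle and distractor spikes are suppressed. The sketch below is one such detector; every constant in it is an illustrative assumption.

    # Sketch for Task 7: a low-duty-cycle energy detector for perimeter alerts.
    # Each node keeps a running noise-floor estimate and raises an alert when
    # the frame energy exceeds the floor by a margin for several consecutive
    # frames (to suppress brief distractor spikes).
    import numpy as np

    def detect_violation(frames, margin=4.0, alpha=0.01, confirm=3):
        """frames: per-frame acoustic energy. Returns index of first alert or None."""
        floor = frames[0]
        streak = 0
        for i, e in enumerate(frames):
            if e > margin * floor:
                streak += 1
                if streak >= confirm:
                    return i                              # alert frame index
            else:
                streak = 0
                floor = (1 - alpha) * floor + alpha * e   # adapt floor only when quiet
        return None

    rng = np.random.default_rng(1)
    frames = rng.uniform(0.8, 1.2, 200)                   # background noise frames
    frames[150:] += 10.0                                  # vehicle crosses the perimeter
    print("alert frame:", detect_violation(frames))       # -> 152, i.e. 2 frames after onset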

  14. 8: Tracking in an Obstacle Field
     Task
     • Track vehicle position
     Challenges
     • Obstacles cause individual sensors to lose lock on the target
     • Different sensing modalities are blocked differently by obstacles (e.g. seismic vs. acoustic)

  15. 9: Road Junction Merge/Split on Localized One-Time Sensor
     Task
     • Track targets
     • Maintain target identities
     • Re-establish identity of both targets when right-hand magnetometer is crossed
     Challenges
     • Need to conserve number and type of targets as they pass through tunnel
     • Need to reason about targets
       – Seeing blue at top right magnetometer guarantees red at bottom
     Task-Specific Benchmarks
     • Time to propagate data from RHS magnetometer to red car in the lower RHS
     Note: targets can only be distinguished from each other by magnetometers (shown), which give a one-time “red/blue” output when the vehicle passes over them.
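The identity-conservation reasoning on this slide ("seeing blue at the top right magnetometer guarantees red at the bottom") amounts to assigning the reported color to the track currently at the sensor and giving the other track the remaining color by elimination. The sketch below shows that bookkeeping with illustrative data structures; none of the names come from the slides.

    # Sketch for Task 9: propagate a one-time magnetometer reading to both
    # tracks. The track whose predicted position is at the magnetometer takes
    # the reported label; the other track gets the remaining label.
    import numpy as np

    def apply_magnetometer_hit(tracks, mag_xy, reported_color, colors=("red", "blue")):
        """tracks: dict name -> {'xy': np.array([x, y]), 'color': None or str}."""
        # Which track is at the magnetometer right now?
        nearest = min(tracks, key=lambda k: np.linalg.norm(tracks[k]["xy"] - mag_xy))
        tracks[nearest]["color"] = reported_color
        other = next(k for k in tracks if k != nearest)
        tracks[other]["color"] = next(c for c in colors if c != reported_color)
        return tracks

    tracks = {"upper": {"xy": np.array([480.0, 210.0]), "color": None},
              "lower": {"xy": np.array([470.0, 40.0]), "color": None}}
    apply_magnetometer_hit(tracks, mag_xy=np.array([485.0, 205.0]), reported_color="blue")
    print({k: v["color"] for k, v in tracks.items()})   # {'upper': 'blue', 'lower': 'red'}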

  16. 10: Cluster Behavior
     Task
     • Track cluster centroid
     • Keep count of vehicles in cluster, adjusting as some leave and join
     Challenges
     • Large number of targets
     • Coalescing many similar targets, limiting exponential hypothesis blowup
     • Measuring global properties of cluster (centroid, count) rather than properties of single target
     Task-Specific Benchmarks
     • Maximum number of targets which can be handled simultaneously
     • Centroid accuracy
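The cluster-level quantities (centroid and count) can be computed from individual vehicle detections without keeping per-vehicle identities, e.g. a weighted mean for the centroid and near-duplicate merging for the count. The sketch below illustrates this; the merge radius and weighting scheme are assumptions.

    # Sketch for Task 10: global cluster properties from localized detections.
    # Centroid = energy-weighted mean of detections; count = detections left
    # after merging near-duplicates that likely belong to the same vehicle.
    import numpy as np

    def cluster_summary(det_xy, det_weight, merge_radius=15.0):
        det_xy = np.asarray(det_xy, dtype=float)
        w = np.asarray(det_weight, dtype=float)
        # Greedily drop detections within merge_radius of an already-kept one.
        keep = []
        for i, p in enumerate(det_xy):
            if all(np.linalg.norm(p - det_xy[j]) > merge_radius for j in keep):
                keep.append(i)
        centroid = (det_xy * w[:, None]).sum(axis=0) / w.sum()
        return centroid, len(keep)

    dets = [[100, 100], [108, 103], [160, 140], [210, 180]]   # first two: same vehicle
    weights = [1.0, 0.8, 1.2, 0.9]
    centroid, count = cluster_summary(dets, weights)
    print("centroid:", centroid.round(1), "count:", count)    # count -> 3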
