SLIDE 1

May 8, 2017  Sam Siewert, GPU Tech 2017 – Drone Net Concept

“Drone Net”

Using Tegra for Multi-Spectral Detection and Tracking in Shared Air Space

SLIDE 2

Significance


Motivation – Large Numbers of sUAS

– Droneii, FAA, Sandia, ASSURE – Counter UAS Challenge – senseFly Catalog of Uses

Problem – Default solution

– Part 107 for sUAS and beyond – ADS-B for sUAS insufficient, infeasible – RADAR/LIDAR feasibility

Drone Net hypothesis

– Networked, multi-modal (passive/active), information and sensor data fusion – EO/IR + acoustic, spectral fusion, machine learning – Compare to and validate with LIDAR/RADAR, ADS-B

SLIDE 3

The sUAS Market is Vigorous [Droneii]

SLIDE 4

Proposed and State of Art


Counter UAS [DARPA, ONR, SRC, Mitre]

– RADAR/LIDAR, IFF – Neutralization, Jamming, Interception

Security [FAA, DHS, FBI, Sandia]

– EO/IR, RADAR/LIDAR, Geo-fencing – Detection Field Testing [FBI at JFK]

Safety and Compliance [DOT FAA]

– Part 107 – sUAS Registration, Pilot Certification, Classification
– ADS-B NAS, See/Sense-and-Avoid
– PrecisionHawk LATAS (Low-Altitude Traffic and Airspace Safety)
– NASA / Verizon UTM [UAS Traffic Management]

SLIDE 5

Goals and Objectives


EO/IR sUAS Detection Feasibility

– Baseline – UAS, GA, Wildlife, Insects, … – DNN, DBN, SVM Machine Vision and Learning Classification, Identification

Passive, Passive + Active

– Performance Evaluation – ROC, PR, F-measure, Confusion Matrices – Data, Image, Information Fusion – Acoustic Camera, LIDAR [Next Steps]

Complementary Spatial, Temporal, Spectral Resolution

– flightradar24.com – Enhanced Aggregation
– Network of Sensor Fusion Nodes
– Long Baseline [Optical and Acoustic Localization]
– Campus Drone Net – Geo-Net [Florida, Alaska, Arizona, Colorado]

Tegra K1, X1, X2

SLIDE 6

Method – Information Fusion


EO/IR Multi-spectral imaging

– Visible, NIR to LWIR – Pixel Level and Feature Level Registration – Lower Cost than common bore-sight – Test sUAS with “Flash Pop” Stimulators

Link Drone Net Nodes [Wireless, Wired]

ADS-B Aggregation

– Drone Net receivers – Garmin, Appareo, etc. – Test UAS transceivers – e.g. uAvionix ping1090, ping2020

Acoustic Cameras and Cues [Improve Classification]

Active LIDAR [RADAR]

GP-GPU Real-Time Processing
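Pixel-level registration across bands can be illustrated with a toy translation search: a sketch only, assuming integer shifts and already resolution-matched frames. The function name and synthetic frames are hypothetical; a real system would use calibrated intrinsics/extrinsics and feature-level matching.

```python
import numpy as np

def register_translation(ref, mov, max_shift=5):
    """Estimate the integer (dy, dx) shift that aligns mov to ref by
    exhaustive search over small shifts, maximizing correlation.
    A toy stand-in for pixel-level multi-spectral registration."""
    best, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(mov, dy, axis=0), dx, axis=1)
            score = np.sum(ref * shifted)   # correlation of overlap
            if score > best:
                best, best_shift = score, (dy, dx)
    return best_shift

# Synthetic frames: a bright blob displaced by (+2, -3) pixels
ref = np.zeros((32, 32)); ref[10:14, 10:14] = 1.0
mov = np.roll(np.roll(ref, 2, axis=0), -3, axis=1)
dy, dx = register_translation(ref, mov)   # shift that undoes the motion
```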

SLIDE 7

Method – Machine Vision & Learning

Machine Vision using SoC Linux

– Salient Object Detection

Shape, Behavior and Contrast/Color/Texture in Multiple Bands

Performance [ROC, PR, F-measure, confusion matrices]

– Real-Time Detection, Segmentation, Tracking, Classification, Identification

Machine Learning (Traditional, ANN)

– Expert systems – Bayesian inference, Dempster-Shafer reasoning [DBN]
– PCA [Principal Component Analysis]
– SVM [Support Vector Machines]
– Clustering [e.g. K-means]
– GPU-Accelerated DNN (cuDNN)
– Supervised, Unsupervised learning

Leverage Open Source: ROS, OpenCV, PyBrain, PyML, MLpack, cuDNN, Caffe
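Of the techniques listed, clustering is the simplest to sketch. Below is a minimal NumPy K-means (a hypothetical illustration, not the project's code; OpenCV and MLpack ship production versions) with deterministic farthest-point initialization:

```python
import numpy as np

def kmeans(X, k, iters=10):
    """Minimal K-means: greedy farthest-point init, then alternate
    nearest-centroid assignment and centroid update."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])        # farthest from chosen set
    centers = np.array(centers)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)            # assignment step
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)  # update step
    return labels, centers

# Two well-separated synthetic feature clusters (e.g., aircraft vs. insect blobs)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(5.0, 0.1, (20, 2))])
labels, centers = kmeans(X, 2)
```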

SLIDE 8

Conceptual Configuration

 Sam Siewert 8

[Diagram: many multi-spectral focal planes – Panchromatic, NIR, RGB, LWIR – feed a Jetson Tegra X1 with GP-GPU co-processing for saliency & behavioral assessment, thermal fusion assessment, and 2D/3D spatial assessment, backed by cloud analytics and machine learning plus a local Flash SD card database.]

SLIDE 9

Experimental System Block Diagram

2 Watts at Idle Plus 1.5 Watts per Camera (Three Cameras) = 6.5W; e.g. Sobel at 30 Hz, 20 Megapixels/Sec/Watt, 7.3W Peak – SPIE Sensor Tech + Apps


1) Sync’d Capture
2) Resolution Match
3) Image Registration
4) Detection
5) Classification
6) Identification
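The six stages above can be sketched as a chain of stage functions. Everything here is hypothetical placeholder logic on synthetic frames, purely to show the data flow from capture to identification:

```python
import numpy as np

def synced_capture():
    """Stage 1: time-aligned visible + LWIR frames (synthetic here)."""
    vis = np.zeros((64, 64)); vis[30:34, 30:34] = 255.0
    lwir = np.zeros((32, 32)); lwir[15:17, 15:17] = 255.0
    return vis, lwir

def resolution_match(vis, lwir):
    """Stage 2: upsample LWIR to visible resolution (pixel replication)."""
    f = vis.shape[0] // lwir.shape[0]
    return vis, np.kron(lwir, np.ones((f, f)))

def registration(vis, lwir):
    """Stage 3: placeholder - assume bore-sight alignment after upsample."""
    return vis, lwir

def detection(vis, lwir):
    """Stage 4: flag pixels bright in both bands (toy fusion detector)."""
    return (vis > 128) & (lwir > 128)

def classify_identify(mask):
    """Stages 5-6: trivial classification by detected blob area."""
    area = int(mask.sum())
    return "sUAS-candidate" if 0 < area < 100 else "unknown"

vis, lwir = synced_capture()
vis, lwir = resolution_match(vis, lwir)
vis, lwir = registration(vis, lwir)
label = classify_identify(detection(vis, lwir))
```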

SLIDE 10

Needs Debugging – Literally!

Many Insects Detected in Visible to LWIR

Opportunity to Work on Bird / Aviation Interaction Testing


SLIDE 11

2015/16 – ADAC & ERAU Sponsored

UAA – ADAC, SmartCam ERAU (Undergraduate Research Team)

– Dr. Sam Siewert, PI, Assistant Professor – Demi Matthew Vis – AE/SE Student – Ryan Claus – SE Student – Nicholas DiPinto – SE Student – Arctic Power Team – Power Team Poster

CU Boulder – Embedded Systems Engineering Graduate Program

– Ram Krishnamurthy – MS EE – Surjith Singh – MS, ESE – Akshay Singh – ME, ESE – Shivasankar Gunasekaran – ME, ESE – Swaminath Badrinath – ME, ESE

Industry Advising/Collaboration Participants

– Randall Myers, Mentor Graphics


This material is based upon work supported by the U.S. Department of Homeland Security under Grant Award Number DHS-14-ST-061-COE-001A-02. The views and conclusions contained in this document are those of the authors and should not be interpreted as necessarily representing the official policies, either expressed or implied, of the U.S. Department of Homeland Security.

SLIDE 12

2016/17 Team – ERAU Sponsored

ERAU – Drone Net

– Dr. Sam Siewert, PI, Assistant Prof. – Dr. Iacopo Gentilini, Co-I – Dr. Stephen Bruder, Co-I – Dr. Mehran Andalibi, Co-I – Demi Matthew Vis – AE/SE Student – Ryan Claus – SE Student

CU Boulder – Embedded Systems Graduate

– Ram Krishnamurthy – MS EE – Surjith Singh – MS, ESE – Akshay Singh – ME, ESE – Shivasankar Gunasekaran – ME, ESE – Omkar Seelam – ME, ESE

Industry Advising/Collaboration Participants

– Randall Myers, Mentor Graphics (PCB, CAD, Systems) – Joe Butler, Intel Corporation (IoT)

SLIDE 13

Detection Experiments for Aircraft and UAS

Preliminary Roof-top Field Trials at ERAU Prescott

SLIDE 14

AIAA – Drone Net Feasibility Results


  • S. Siewert, M. Vis, R. Claus, R. Krishnamurthy, S. B. Singh, A. K. Singh, S. Gunasekaran, “Image and Information Fusion Experiments with a Software-Defined Multi-Spectral Imaging System for Aviation and Marine Sensor Networks”, AIAA SciTech 2017, Grapevine, Texas, January 2017.

SLIDE 15

Open Reference SDMSI Configuration

2 Basler Pulse Visible Cameras
1 FLIR Vue LWIR Camera with ZnSe Window
Jetson TK1, Panda Wireless, USB3 Hub, Power, NEMA Enclosure

SLIDE 16

Smart Camera Deployment - Aerial

UAV Systems – ERAU ICARUS Group
Experimental Aviation and Small Aircraft – ERAU
Kite Aerial Photography, Balloon Missions (ERAU, UAA, CU Boulder)


Sam Siewert – ERAU ICARUS Group

SLIDE 17

Actual - Roof Mount Experiment

Starting point – evolve to aircraft, buoy and UAS later
Embry-Riddle flight line provides lots of light aircraft traffic
Simple UAS testing in campus (semi-urban) environment
Wildlife – insects, bats, birds, etc.

SLIDE 18

Information Fusion Concepts

Integration and System of Systems Between ADS-B and S-AIS for Vessel / Aircraft / UAS Awareness
Smart Cameras Can Monitor and Plan Uplink Opportunity as Well as Wake Up and Uplink


System Fusion For Uplink

SLIDE 19

Baseline Motion Trigger Detection

Difference Images over Time (adjustable)
Threshold – Statistically Significant Pixel Change
Filters (Atmospheric, Cloud, Constant Background Motion) – Dispersion of Changes
Detection Performance – ROC, PR-Curve, F-measure [TP, FP, FN, TN analysis]
Classification/Identification – Confusion Matrix
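A minimal sketch of such a motion trigger, assuming a statistical threshold (mean + k·sigma of the absolute difference image) and a dispersion filter to reject frame-wide change such as cloud or atmospheric motion. Names and constants are illustrative, not the deployed code:

```python
import numpy as np

def motion_trigger(prev, cur, k=3.0, max_disperse=0.5):
    """Difference two frames, keep statistically significant pixel
    changes, and reject triggers whose changed pixels are dispersed
    across the whole frame. Returns (triggered, change_mask)."""
    diff = np.abs(cur.astype(float) - prev.astype(float))
    mask = diff > diff.mean() + k * diff.std()   # significant change
    if not mask.any():
        return False, mask
    ys, xs = np.nonzero(mask)
    # dispersion: spread of changed pixels as a fraction of frame size
    disp = max(ys.std() / prev.shape[0], xs.std() / prev.shape[1])
    return disp < max_disperse, mask

prev = np.zeros((64, 64))
cur = prev.copy()
cur[20:24, 40:44] = 255.0          # small, compact moving target
triggered, mask = motion_trigger(prev, cur)
```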


https://en.wikipedia.org/wiki/Precision_and_recall

PR Is Best for Image Retrieval (e.g. https://images.google.com/); ROC Is Best for Target Detection
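The PR and ROC operating points fall directly out of the frame-by-frame TP/FP/FN/TN counts; a small helper (hypothetical counts used in the example) makes the relationship explicit:

```python
def detection_metrics(tp, fp, fn, tn):
    """Precision/Recall (PR), F-measure, and the ROC operating point
    (TPR, FPR) from frame-by-frame TP/FP/FN/TN counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)          # recall = TPR (ROC y-axis)
    f_measure = 2 * precision * recall / (precision + recall)
    fpr = fp / (fp + tn)             # ROC x-axis
    return {"precision": precision, "recall": recall,
            "f_measure": f_measure, "tpr": recall, "fpr": fpr}

# Hypothetical counts from a reviewed video sequence
m = detection_metrics(tp=80, fp=20, fn=10, tn=890)
```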

SLIDE 20

Frame by Frame Analysis

TP – Determined by Human Review, Frame by Frame
Alternative Is by Physical Experiment Design
“AutoIt” Program to Analyze

SLIDE 21

Aircraft Detection Performance - Baseline

Video Links – Aircraft, Bugs, FP, TP+FP, [TN], [Full]

SLIDE 22

UAS Detection Performance – Baseline

Video Link – UAS+Aircraft, Bugs, FP, TP+FP, [TN], [Full]

SLIDE 23

Candidate SOD (BinWang14) - Aircraft

Modified to Run BinWang14 SOD => MD Baseline
Video Links – TP+FP, [TN], [Full]
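BinWang14 [19] is a fast self-tuning background subtraction algorithm. The sketch below is only in that spirit, NOT the published method: a per-pixel exponential running mean and variance, with a k-sigma foreground test and model updates suppressed on foreground pixels. All names and constants are illustrative.

```python
import numpy as np

class RunningBackground:
    """Simplified background-subtraction sketch: exponential running
    mean/variance per pixel; a pixel is foreground when it deviates
    by more than k sigma. Not the Wang & Dudek algorithm."""
    def __init__(self, shape, alpha=0.05, k=3.0):
        self.mean = np.zeros(shape)
        self.var = np.ones(shape)
        self.alpha, self.k = alpha, k

    def apply(self, frame):
        frame = frame.astype(float)
        d = frame - self.mean
        fg = np.abs(d) > self.k * np.sqrt(self.var)
        # self-tuning flavor: only background pixels update the model
        self.mean += self.alpha * d * ~fg
        self.var += self.alpha * (d * d - self.var) * ~fg
        return fg

bg = RunningBackground((32, 32))
for _ in range(20):                    # learn an empty scene
    bg.apply(np.zeros((32, 32)))
frame = np.zeros((32, 32))
frame[5:8, 5:8] = 200.0                # new bright object (3x3 pixels)
fg = bg.apply(frame)
```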

SLIDE 24

Candidate SOD (BinWang14) - UAS

Modified to Run BinWang14 SOD => MD Baseline
Video Links – TP+FP, [TN], [Full]

SLIDE 25

AIAA – Drone Detection Conclusions

Drone Net Likely Requires Custom Detection – SOD
Classification Based on Shape, Behavior and Contrast/Color/Texture in Multiple Bands (RGB, NIR, LWIR)
Considering Acoustic Cue Fusion
Cross-Check with ADS-B, RADAR/LIDAR Data
Produce Improved flightradar24.com Meta-data
Find Ghost UAS and Aircraft [Non-compliant], Log Others

SLIDE 26

SPIE – Benchmark Results


  • S. Siewert, V. Angoth, R. Krishnamurthy, K. Mani, K. Mock, S. B. Singh, S. Srivistava, C. Wagner, R. Claus, M. Vis, “Software Defined Multi-Spectral Imaging for Arctic Sensor Networks”, SPIE Proceedings, Volume 9840, Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery XXII, Baltimore, Maryland, April 2016.

SLIDE 27

FPGA Results - Sobel

ALUTs: 10,187; Registers: 13,561; Logic utilization: 7,427 / 32,070 (23%)


Table 2. Sobel Continuous Transform Power Consumption by Cyclone V FPGA

Resolution | Transform Power (Watts) | (Pixels/sec) per Watt | Saturation FPS | Bus Transfer Rate (MB/sec)
320x240    | 5.655 | 2,050,716 | 151  | 11.06
640x480    | 5.700 | 2,107,284 | 39.1 | 11.46
1280x960   | 5.704 | 2,143,506 | 9.95 | 11.66
2560x1920  | 5.696 | 2,157,303 | 2.50 | 11.72
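The Sobel continuous transform benchmarked above can be sketched directly; the toy convolution below (a naive sketch, not the FPGA/GPU kernel) shows the operator, and the last line sanity-checks the efficiency column: (pixels/sec)/Watt = width x height x saturation FPS / power, which reproduces the first row of Table 2.

```python
import numpy as np

def sobel(img):
    """3x3 Sobel gradient magnitude via explicit convolution."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h, w)); gy = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(kx * patch)
            gy[i, j] = np.sum(ky * patch)
    return np.hypot(gx, gy)

# Vertical step edge: response concentrated at the boundary columns
img = np.zeros((8, 8)); img[:, 4:] = 1.0
mag = sobel(img)

# Table 2 sanity check for 320x240: pixels/sec per Watt at saturation
eff = round(320 * 240 * 151 / 5.655)
```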

SLIDE 28

FPGA Results – Pyramidal

ALUTs: 24,456; Registers: 34,062; Logic utilization: 17,721 / 32,070 (55%)


Table 3. Pyramidal Laplacian Resolution Up-Conversion Continuous Transform Power

Resolution | Transform Power (Watts) | (Pixels/sec) per Watt | Saturation FPS | Bus Transfer Rate (MB/sec)
320x240    | 6.009 | 889,546 | 69.6 | 5.10
640x480    | 6.013 | 904,281 | 17.7 | 5.19
1280x960   | 6.038 | 905,624 | 4.45 | 5.21
2560x1920  | 6.192 | 889,054 | 1.12 | 5.25

Table 4. Pyramidal Gaussian Resolution Down-Conversion Continuous Transform Power

Resolution | Transform Power (Watts) | (Pixels/sec) per Watt | Saturation FPS | Bus Transfer Rate (MB/sec)
320x240    | 5.968 | 2,445,040 | 190  | 13.92
640x480    | 6.018 | 2,399,202 | 47.0 | 13.77
1280x960   | 6.023 | 2,427,813 | 11.9 | 13.95
2560x1920  | 6.109 | 2,309,154 | 2.87 | 13.45
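The up/down conversions benchmarked above can be sketched minimally. These stand-ins are deliberately simplified, assuming a 2x2 box average for down-conversion and pixel replication for up-conversion; the real pyramidal transforms use a 5x5 Gaussian kernel and Laplacian interpolation.

```python
import numpy as np

def pyr_down(img):
    """Down-conversion sketch: 2x2 box average (real Gaussian pyramids
    low-pass filter with a 5x5 kernel before decimating)."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def pyr_up(img):
    """Up-conversion sketch: pixel replication (a Laplacian pyramid
    interpolates and adds back detail instead)."""
    return np.kron(img, np.ones((2, 2)))

img = np.arange(16, dtype=float).reshape(4, 4)
small = pyr_down(img)   # 4x4 -> 2x2
big = pyr_up(small)     # 2x2 -> 4x4
```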

SLIDE 29

GP-GPU Results - Sobel


Table 5. Sobel Continuous Transform Power (GP-GPU)

Resolution | Power at 1 Hz (Watts) | Power at 30 Hz (Watts) | (Pixels/sec) per Watt @ 1 Hz | (Pixels/sec) per Watt @ 30 Hz | Saturation FPS
320x240    | 4.241 | 4.932 | 18,109    | 467,153    | 1624
640x480    | 4.256 | 4.984 | 72,180    | 1,849,117  | 840
1280x960   | 4.266 | 5.142 | 288,045   | 7,169,195  | 237
2560x1920  | 4.325 | 7.326 | 1,136,462 | 20,127,764 | 55

SLIDE 30

GP-GPU Results - Pyramidal


Table 6. Pyramidal Up and Down Conversion Continuous Transform Power (GP-GPU)

Resolution | Power at 1 Hz (Watts) | Power at 20 Hz (Watts) | (Pixels/sec) per Watt @ 1 Hz | (Pixels/sec) per Watt @ 20 Hz | Saturation FPS
320x240    | 4.104 | 4.824 | 18,713    | 477,612    | 1120
640x480    | 4.116 | 5.460 | 74,636    | 1,687,912  | 325
1280x960   | 4.152 | 6.864 | 295,954   | 5,370,629  | 82
2560x1920  | 4.224 | 13.44 | 1,163,636 | 10,971,429 | 20

SLIDE 31

SPIE On-going Work

We Have Completed the Hough Lines Continuous Transform Test, Available on GitHub
Hough Power Curves Not Yet Produced – In Progress
Goal to Identify All Continuous Transform Primitives Used in Infrared + Visible Fusion and 3D Mapping
Pixel-Level Emphasis, But Also Plan to Review Feature Level

– Camera Extrinsic and Intrinsic Transformations – Registration – Resolution and AR Matching – Methods of Pixel Level Fusion in Review [10] [11], [12], [14]

SLIDE 32

SPIE Benchmark Conclusions

Please Download our Benchmarks

– https://github.com/siewertserau/fusion_coproc_benchmarks – MIT License

Test on NVIDIA GP-GPU or FPGA SoCs (Altera, Xilinx) and Share Results Back
Please Help Us Add Benchmarks Critical to Continuous 3D Mapping and Infrared + Visible Fusion (Suite of Primitives)
Open-Source Hardware, Firmware, Software for Multispectral Smart Camera Applications

SLIDE 33

Research References

SLIDE 34

References

1. S. Siewert, V. Angoth, R. Krishnamurthy, K. Mani, K. Mock, S. B. Singh, S. Srivistava, C. Wagner, R. Claus, M. Demi Vis, “Software Defined Multi-Spectral Imaging for Arctic Sensor Networks”, SPIE Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery XXII, Baltimore, Maryland, April 2016.
2. S. Siewert, J. Shihadeh, R. Myers, J. Khandhar, V. Ivanov, “Low Cost, High Performance and Efficiency Computational Photometer Design”, SPIE Sensing Technology and Applications, SPIE Proceedings, Volume 9121, Baltimore, Maryland, May 2014.
3. Piella, G. (2003). A general framework for multiresolution image fusion: from pixels to regions. Information Fusion, 4(4), 259-280.
4. Blum, R. S., & Liu, Z. (Eds.). (2005). Multi-Sensor Image Fusion and Its Applications. CRC Press.
5. Liu, Z., Blasch, E., Xue, Z., Zhao, J., Laganiere, R., & Wu, W. (2012). Objective assessment of multiresolution image fusion algorithms for context enhancement in night vision: a comparative study. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34(1), 94-109.
6. Simone, G., Farina, A., Morabito, F. C., Serpico, S. B., & Bruzzone, L. (2002). Image fusion techniques for remote sensing applications. Information Fusion, 3(1), 3-15.
7. Mitchell, H. B. (2010). Image Fusion: Theories, Techniques and Applications. Springer Science & Business Media.
8. Szeliski, R. (2010). Computer Vision: Algorithms and Applications. Springer Science & Business Media.
9. Sharma, G., Jurie, F., & Schmid, C. (2012). Discriminative spatial saliency for image classification. CVPR 2012, pp. 3506-3513. IEEE.
10. Toet, A. (2011). Computational versus psychophysical bottom-up image saliency: A comparative evaluation study. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(11), 2131-2146.

SLIDE 35

References

11. Valenti, R., Sebe, N., & Gevers, T. (2009). Image saliency by isocentric curvedness and color. ICCV 2009, pp. 2185-2192. IEEE.
12. Wang, M., Konrad, J., Ishwar, P., Jing, K., & Rowley, H. (2011). Image saliency: From intrinsic to extrinsic context. CVPR 2011, pp. 417-424. IEEE.
13. Liu, F., & Gleicher, M. (2006). Region enhanced scale-invariant saliency detection. ICME 2006, pp. 1477-1480. IEEE.
14. Cheng, M. M., Mitra, N. J., Huang, X., & Hu, S. M. (2014). SalientShape: Group saliency in image collections. The Visual Computer, 30(4), 443-453.
15. http://global.digitalglobe.com/sites/default/files/DG_WorldView2_DS_PROD.pdf
16. http://www.spaceimagingme.com/downloads/sensors/datasheets/DG_WorldView3_DS_2014.pdf
17. Richards, M. A., Scheer, J. A., & Holm, W. A. (2010). Principles of Modern Radar. SciTech Publishing.
18. Brown, C. D., & Davis, H. T. (2006). Receiver operating characteristics curves and related decision measures: A tutorial. Chemometrics and Intelligent Laboratory Systems, 80(1), 24-38.
19. Wang, B., & Dudek, P. (2014). A fast self-tuning background subtraction algorithm. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops. IEEE.
20. Panagiotakis, C., et al. (2012). Segmentation and sampling of moving object trajectories based on representativeness. IEEE Transactions on Knowledge and Data Engineering, 24(7), 1328-1343.

SLIDE 36

References

21. Public SDMSI shared data web site for video sequences captured and used in the two experiments presented in this paper: http://mercury.pr.erau.edu/~siewerts/extra/papers/AIAA-SDMSI-data-2017/
22. Perazzi, F., et al. (2012). Saliency filters: Contrast based filtering for salient region detection. CVPR 2012. IEEE.
23. Achanta, R., et al. (2012). SLIC superpixels compared to state-of-the-art superpixel methods. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34(11), 2274-2282.
24. Hou, X., & Zhang, L. (2007). Saliency detection: A spectral residual approach. CVPR 2007. IEEE.
25. Cheng, M. M., Mitra, N. J., Huang, X., Torr, P. H. S., & Hu, S. M. (2015). Global contrast based salient region detection. IEEE Transactions on Pattern Analysis and Machine Intelligence (IEEE TPAMI), 37(3), 569-582.
26. flightradar24.com – ADS-B, primary/secondary RADAR flight localization and aggregation services.
27. Birch, G. C., Griffin, J. C., & Erdman, M. K. (2015). UAS Detection, Classification, and Neutralization: Market Survey 2015. No. SAND2015-6365. Sandia National Laboratories (SNL-NM), Albuquerque, NM.

SLIDE 37

Drone Detection and Neutralization Companies


Leading Drone Detection Companies

  • https://www.blacksagetech.com/
  • https://www.droneshield.com/
  • http://www.dedrone.com/en/

List of Drone Detection Companies and Experiments

  • Test at JFK by FBI – https://www.faa.gov/news/updates/?newsId=85546
  • https://www.x20.org/uav-drone-detection-anti-drone-defense-systems/
  • http://industrialcamera.net/security-surveillance-system-products-services/uav-detection/
  • https://www.hgh-infrared.com/es/Aplicaciones/Seguridad/UAV-Detection
  • http://www.cerbair.com/en/
  • http://www.israeldefense.co.il/en/content/rafael-unveils-drone-dome-drone-detection-and-neutralization-system