
Computational Neuroscience for Technology: Event-based Vision

  1. Computational Neuroscience for Technology: Event-based Vision Sensors and Information Processing Jörg Conradt Neuroscientific System Theory Center of Competence on Neuro-Engineering Technische Universität München http://www.nst.ei.tum.de 28.11.2014 Qualcomm Vienna

  2. Neuromorphic Engineering
"Neuro-morphic": "neuro" as in "to do with neurons", "morphic" as in "structure or form"; i.e., neurally inspired. Discovering key principles by which brains work and implementing them in technical systems that intelligently interact with their environment. Examples:
• Analog VLSI circuits to implement neural processing
• Sensors, such as silicon retinae or cochleae
• Distributed adaptive control, e.g. CPG-based locomotion
• Massively parallel self-growing computing architectures
• Neuro-electronic implants, rehabilitation

  3. Computation in Brains
Getting to know your Brain:
• 1.3 kg, about 2% of body weight
• 10^11 neurons
• Neuron growth: 250,000/min (early pregnancy) ... but also loss of ~1 neuron/second

  4. Computation in Brains
Getting to know your Brain:
• 1.3 kg, about 2% of body weight
• 10^11 neurons
• Neuron growth: 250,000/min (early pregnancy) ... but also loss of ~1 neuron/second
"Operating Mode" of Neurons:
• Analog leaky integration in soma
• Digital pulses (spikes) along neurites
• 10^14 stochastic synapses
• Typical operating "frequency": ≤100 Hz (typically ~10 Hz), asynchronous
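The neuronal "operating mode" above maps directly onto the classic leaky integrate-and-fire model. A minimal simulation sketch in Python; the time constant, threshold, and input drive are illustrative assumptions, not physiological constants:

```python
# Minimal sketch of the "operating mode" above: analog leaky integration in
# the soma, with a digital pulse (spike) emitted whenever the membrane
# potential crosses threshold. All constants here are illustrative assumptions.
import numpy as np

DT = 1e-3          # simulation step: 1 ms
TAU = 20e-3        # membrane time constant (assumption)
V_THRESH = 1.0     # spike threshold (arbitrary units)
V_RESET = 0.0

def simulate_lif(input_current, v=0.0):
    """Integrate a leaky neuron over an input current trace; return spike times."""
    spikes = []
    for step, i_in in enumerate(input_current):
        v += DT / TAU * (-v + i_in)      # leaky integration toward the input
        if v >= V_THRESH:                # threshold crossing: emit a spike
            spikes.append(step * DT)
            v = V_RESET                  # and reset the membrane potential
    return spikes

# A weak constant suprathreshold drive makes the neuron fire in the tens of Hz,
# orders of magnitude below a CPU's GHz clock.
spikes = simulate_lif(np.full(1000, 1.05))   # one second of input
print(len(spikes), "spikes in 1 s")
```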

  5. Computation in Brains ... and Machines
Getting to know your Brain vs. your Computer's CPU:
• Brain: 1.3 kg, about 2% of body weight | CPU: 50 g, irrelevant for most applications
• Brain: 10^11 neurons | CPU: 2 x 10^9 transistors (Intel Itanium)
• Brain: neuron growth of 250,000/min (early pregnancy), but also loss of ~1 neuron/second | CPU: ideally no modification over lifetime
"Operating Mode" of Neurons vs. CPUs:
• Neurons: analog leaky integration in soma | CPUs: no analog components
• Neurons: digital pulses (spikes) along neurites | CPUs: digital signal propagation
• Neurons: 10^14 stochastic synapses | CPUs: reliable signal propagation
• Neurons: typical operating "frequency" ≤100 Hz (typically ~10 Hz), asynchronous | CPUs: typical operating frequency of several GHz, synchronous

  6. Computation in Brains ... and Machines
(Brain/CPU comparison repeated from the previous slide.)
[Figure: number of neurons per species on a logarithmic scale from 10^0 to 10^12: sponge, C. elegans, zebrafish, honeybee, frog, mouse, cat, chimpanzee, elephant, human]

  7. Research Area «Neuroscientific System Theory»
Neuronal-style information processing in closed-control-loop systems:
• Distributed local information processing
• Growing and adaptive networks of computational units
• Neuromorphic sensor fusion and distributed actuator networks
• Event-based perception, cognition, and action
[Figure: example projects: on-board vision-based control of miniature UAVs, high-speed event-based vision, event-based navigation, distributed sensing and actuation, distributed local cognitive maps for spatial representation]

  8. Retinal-Inspired Dynamic Vision Sensor
• 128 x 128 pixels, each signaling temporal changes of illumination ("events")
• Asynchronous updates (no image frames): 15 μs latency, up to 1 Mevents/sec
[Figure: embedded DVS (eDVS) module, ca. 52 mm x 30 mm: DVS sensor plus ARM7 μC with UART/SPI interface]
Joint work with the Neuromorphic Sensors Group, ETH/University of Zurich
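To make the contrast with frames concrete, here is a minimal sketch of how such an event stream is commonly represented in software: one (x, y, polarity, timestamp) record per event. This assumes a plain tuple encoding and does not reproduce the eDVS's actual UART wire format:

```python
# Minimal sketch of an event-stream representation: one (x, y, polarity, t_us)
# record per event. The eDVS's real serial protocol is not reproduced here.
import numpy as np

# Structured array: one record per event, timestamps in microseconds.
event_dtype = np.dtype([("x", np.uint8), ("y", np.uint8),
                        ("p", np.int8), ("t", np.uint32)])

def make_events(raw_tuples):
    """Pack an iterable of (x, y, polarity, t_us) tuples into an array."""
    return np.array(raw_tuples, dtype=event_dtype)

events = make_events([(12, 40, +1, 100), (13, 40, -1, 115), (12, 41, +1, 130)])

# Unlike a frame, the stream is sparse and ordered in time: process per event.
for ev in events:
    print(f"event at ({ev['x']},{ev['y']}) polarity {ev['p']:+d} at t={ev['t']} us")
```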

  9. Event-Based Vision
• Frame-based: full frames, discrete time
• Event-based: singular pixels, continuous time

  10. Event-Based Vision
Advantages:
• Sparse data stream: ~0.1 MB/s bandwidth
• ~10 μs response time
• Local contrast adjustment per pixel
• Automatic preprocessing
Challenges:
• Static scenes produce no events
• New vision algorithms needed for tracking, localization, ...
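The per-pixel contrast adjustment follows from how a DVS pixel works: it emits an event whenever its log intensity moves a fixed contrast threshold away from the level stored at the last event. A minimal single-pixel simulation sketch; the threshold value and input signal are illustrative assumptions:

```python
# Minimal sketch of how a single DVS pixel turns a brightness signal into
# events. The threshold and input below are assumptions, not sensor constants.
import math

def dvs_pixel_events(samples, threshold=0.15):
    """samples: list of (t_us, intensity); returns (t_us, polarity) events."""
    events = []
    ref = math.log(samples[0][1])           # log intensity at the last event
    for t_us, intensity in samples[1:]:
        level = math.log(intensity)
        while level - ref > threshold:      # brightness increased enough
            ref += threshold
            events.append((t_us, +1))
        while ref - level > threshold:      # brightness decreased enough
            ref -= threshold
            events.append((t_us, -1))
    return events

# A pixel watching an exponential intensity ramp: doubling brightness yields
# the same number of events at any absolute level (local contrast invariance).
ramp = [(t, 100 * 2 ** (t / 1000)) for t in range(0, 3001, 10)]
print(dvs_pixel_events(ramp)[:5])
```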

  11. An Asynchronous Temporal Difference Vision Sensor Conradt et al, ISCAS 2009

  12. An Asynchronous Temporal Difference Vision Sensor Conradt et al, ISCAS 2009

  13. Application Example: Tracking an Active Object
Setup: a target marker LED blinks with a known PWM period; the marker board (ca. 36 mm x 26 mm, with power switch, microcontroller, LED, and rechargeable battery) is observed by an eDVS.
Per-event tracking algorithm:
• Each pixel stores the time t_M of its previous event in a 128 x 128 memory; an incoming event at position P_E with timestamp t_E yields the inter-event interval Δt_E = t_E - t_M.
• Time deviation from the LED's expected blink interval Δt_exp: t_D = Δt_E - Δt_exp (e.g. t_E = 5612 μs, t_M = 4600 μs, Δt_exp = 1000 μs gives t_D = 12 μs).
• Position deviation from the currently tracked position P_T: P_D = |P_E - P_T|² (e.g. P_E = (3,2), P_T = (7,3) gives P_D = 17).
• Weighting functions G_T and G_P map t_D and P_D to relative weights w_T and w_P, near 1 for small deviations and falling off for large ones.
• Position update: P_{T+1} = (1 - η(w_T + w_P)) · P_T + η(w_T + w_P) · P_E
Müller et al, ROBIO 2011
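A minimal per-event sketch of the update rule summarized above. The Gaussian shapes of the weighting functions G_T and G_P, their widths, and the learning rate η are illustrative assumptions; the paper's actual parameters may differ:

```python
# Per-event LED tracker in the spirit of the slide's equations. Weight widths
# (SIGMA_P, SIGMA_T) and learning rate ETA are assumptions, not paper values.
import math

import numpy as np

SENSOR_SIZE = 128
DT_EXPECTED_US = 1000.0       # expected LED blink interval (assumed PWM period)
ETA = 0.1                     # learning rate (assumption)
SIGMA_P, SIGMA_T = 5.0, 50.0  # weight fall-off scales (assumptions)

last_event_time = np.zeros((SENSOR_SIZE, SENSOR_SIZE))  # t_M per pixel, in us
p_tracked = np.array([64.0, 64.0])                      # current estimate P_T

def on_event(x, y, t_us):
    """Update the tracked position for one incoming event (x, y, t_us)."""
    global p_tracked
    dt_e = t_us - last_event_time[y, x]          # inter-event interval here
    last_event_time[y, x] = t_us
    p_e = np.array([float(x), float(y)])
    t_d = dt_e - DT_EXPECTED_US                  # time deviation
    p_d = float(np.sum((p_e - p_tracked) ** 2))  # squared position deviation
    w_t = math.exp(-t_d ** 2 / (2 * SIGMA_T ** 2))
    w_p = math.exp(-p_d / (2 * SIGMA_P ** 2))
    w = ETA * (w_t + w_p)
    p_tracked = (1.0 - w) * p_tracked + w * p_e  # the slide's update rule
    return p_tracked
```

Events that match both the LED's blink interval and the current position estimate pull the estimate strongly; background events earn near-zero weight and barely move it.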

  14. Application Example: Tracking an Active Object
[Figure: pan/tilt servo unit (ca. 240 mm) carrying a laser diode and an eDVS sensor, pointed at the marker]
Müller et al, ROBIO 2011

  15. Application Example: Tracking an Active Object
[Video/plot: tracking performance over 10 s with the pan/tilt servos, laser diode, and eDVS sensor; demo by Georg Müller]
Müller et al, ROBIO 2011

  16. (Fun) Application Example: High Speed Balancing of Short Poles Conradt et al, ISCAS 2009

  17. (Fun) Application Example: High Speed Balancing of Short Poles
Converting events into meaning: [Figure: incoming events vs. the current estimate of the pole]
Conradt et al, ECV 2009
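One simple way to convert events into a running estimate (not necessarily the method used in the pole balancer) is to model the pole as a line and nudge the line parameters toward every incoming event, a per-event least-mean-squares update:

```python
# Per-event line estimate: model the pole as x = base + slope * (y / 128) and
# nudge (base, slope) toward each event. The gain ETA is an assumption.
import numpy as np

base, slope = 64.0, 0.0      # current line estimate
ETA = 0.02                   # per-event gain (assumption)

def on_event(x, y):
    """Nudge the line estimate toward the event at pixel (x, y)."""
    global base, slope
    y_n = y / 128.0                  # normalized row keeps both steps balanced
    err = x - (base + slope * y_n)   # horizontal error at this row
    base += ETA * err                # shift the line toward the event...
    slope += ETA * err * y_n         # ...and tilt it, weighted by the row

# Noisy events along a slightly tilted pole (x = 60 + 0.1*y) pull the estimate
# toward base ~ 60 and slope ~ 12.8 (0.1 px per row in normalized units):
rng = np.random.default_rng(0)
for _ in range(5000):
    y = int(rng.integers(0, 128))
    on_event(60 + 0.1 * y + float(rng.normal(0, 1.0)), y)
print(round(base, 1), round(slope, 1))
```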

  18. Extended Application Example: High Speed Tracking Bamford et al, submitted EBCCSP’15

  19. Application Example: SLAM for Event-Based Vision Systems Framework: Particle Filter Weikersdorfer et al, ROBIO 2012
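A minimal sketch of the particle-filter idea in an event-based setting: each particle is a pose hypothesis, and every incoming event slightly diffuses, reweights, and occasionally resamples the particle set. The map, motion noise, and likelihood model below are illustrative assumptions, far simpler than the paper's:

```python
# Event-driven particle filter for 2D localization, much simplified relative
# to Weikersdorfer et al. (ROBIO 2012). Map and noise models are assumptions.
import numpy as np

rng = np.random.default_rng(1)
N = 500                                        # particle count (assumption)
particles = rng.uniform(0, 10, size=(N, 2))    # (x, y) pose hypotheses, meters
weights = np.full(N, 1.0 / N)

def map_likelihood(points):
    """Likelihood that a map feature sits at each point. Here: one known
    landmark at (5, 5); a real map is far richer."""
    d2 = np.sum((points - np.array([5.0, 5.0])) ** 2, axis=-1)
    return np.exp(-d2 / 0.5)

def on_event(event_offset_xy):
    """Process one event, given its position relative to the sensor."""
    global particles, weights
    # 1. Diffuse particles slightly (motion model between sparse events).
    particles += rng.normal(0, 0.01, size=particles.shape)
    # 2. Each particle predicts where the event lies in world coordinates.
    predicted = particles + event_offset_xy
    # 3. Reweight by map agreement and renormalize.
    weights *= map_likelihood(predicted)
    weights /= weights.sum()
    # 4. Resample when the effective particle count collapses.
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles, weights[:] = particles[idx], 1.0 / N

for _ in range(200):                  # events from a sensor sitting near (4, 4)
    on_event(np.array([1.0, 1.0]) + rng.normal(0, 0.05, 2))
print(particles.mean(axis=0))         # should approach (4, 4)
```

Because each event is a single pixel rather than a full frame, the per-event update is cheap, which is what makes filtering at event rate feasible.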

  20. Application Example: SLAM for Event-Based Vision Systems Event Based Localization Weikersdorfer et al, ROBIO 2012

  21. Application Example: SLAM for Event-Based Vision Systems Event Based Mapping Weikersdorfer et al, ICVS 2013

  22. Application Example: SLAM for Event-Based Vision Systems Event Based Mapping Weikersdorfer et al, ICVS 2013

  23. Application Example: SLAM for Event-Based Vision Systems
[Video: playback at 5x real time]
Hoffmann et al, ECMR 2013

  24. Event-based 3D SLAM with a Depth-Augmented Dynamic Vision Sensor
David Weikersdorfer, David B. Adrian, Daniel Cremers, Jörg Conradt; Technische Universität München, Germany
• Sensor combination: event-based dynamic vision and PrimeSense object distance
• Sparse sensory data allows real-time processing and integration
• Localization update rates of >100 Hz on a standard PC
• Suitable for small autonomous mobile robots in high-speed application scenarios
Weikersdorfer et al, ICRA 2014
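The core geometric step is small enough to sketch: each event pixel, paired with the depth reported by the co-registered PrimeSense camera, back-projects to one 3D point. The camera intrinsics below are illustrative assumptions, not calibration values from the paper:

```python
# Back-project an event plus its depth to a 3D point in the camera frame.
# Intrinsics are illustrative assumptions for a 128x128 sensor.
import numpy as np

FX = FY = 100.0          # focal lengths in pixels (assumption)
CX = CY = 64.0           # principal point (assumption)

def event_to_3d(x, y, depth_m):
    """Pinhole back-projection of an event pixel with known depth."""
    return np.array([(x - CX) * depth_m / FX,
                     (y - CY) * depth_m / FY,
                     depth_m])

# A single event at pixel (80, 50) with 2.0 m depth becomes one sparse 3D
# point; since only changing pixels fire, the point cloud stays small enough
# for >100 Hz localization updates on a standard PC.
print(event_to_3d(80, 50, 2.0))
```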

  25. An Autonomous Indoor Quadrocopter (eb3D SLAM)
Parrot AR.Drone, diameter ca. 80 cm.
[Figure: selected hardware components: eDVS (red), Mini-PC (blue), PrimeSense IR projector (yellow) and IR camera (green), and drone control board (magenta)]

  26. An Autonomous Indoor Quadrocopter (eb3D SLAM)
[Plot: eb-3DSLAM evaluation against ground truth: desired trajectory vs. external tracker (OptiTrack Flex13)]

  27. Outlook: An Autonomous Indoor Micro Quadrocopter (3D SLAM)
Key technical specs: diameter 16 cm, weight 38 g, autonomy 14 min.
Micro-eDVS (DVS sensor, lightweight lenses, microcontroller, UART port), key specs:
• 20 x 20 mm, 3 g
• 75 mW power
• 64 KB SRAM
• 32-bit, 64 MHz

  28. Neuromorphic Computing Landscape

  29. SpiNNaker Project
• A million mobile phone processors in one computer
• Able to model about 1% of the human brain (roughly 10^9 of its ~10^11 neurons)...
• ...or 10 mice!

  30. SpiNNaker Chip
[Figure: chip package with mobile DDR SDRAM interface; multi-chip packaging by UNISEM Europe]

  31. Distributed Neuronal Computation "on chip"
Asynchronous spiking neural computation hardware for low-power real-time operation in closed-loop systems ... to simulate 1 billion spiking neurons in real time.
Hardware variants: 18-core stand-alone prototype, 72-core evaluation board, 864-core "rack version".
• Multi-channel spiking input and output
• Stand-alone spiking computing system
• Simulates ~20,000 neurons in real time
• Small (~20 x 20 mm); low power (~600 mW)
• Flexibly configurable, extendable, stackable
http://www.nst.ei.tum.de
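Networks for SpiNNaker-class hardware are typically described through the PyNN API. A minimal sketch, assuming the sPyNNaker backend package is installed; population sizes, rates, and weights are illustrative:

```python
# Minimal PyNN description of a spiking network for a SpiNNaker-style backend.
# Swapping the import for another PyNN backend runs the same model elsewhere.
import pyNN.spiNNaker as sim  # assumes the sPyNNaker package is installed

sim.setup(timestep=1.0)  # 1 ms simulation timestep

# 100 Poisson spike sources driving 100 leaky integrate-and-fire neurons.
stimulus = sim.Population(100, sim.SpikeSourcePoisson(rate=10.0))
neurons = sim.Population(100, sim.IF_curr_exp())
sim.Projection(stimulus, neurons, sim.OneToOneConnector(),
               synapse_type=sim.StaticSynapse(weight=0.5, delay=1.0))

neurons.record(["spikes"])
sim.run(1000.0)                       # simulate one second

spikes = neurons.get_data("spikes")   # retrieve the recorded spike trains
sim.end()
```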

  32. The SpiNNaker-Robot Interface Board
[Figure: NST SpOmniBot with WLAN interface, embedded 768-core SpiNNaker board, and OmniBot IO board; the sensory/actuator interface board exchanges SpiNNaker packets and integrates stereo eDVS sensors, a SpiNNaker retina, a balancer, ... as an "additional chip"]
Denk et al, ICANN 2013

  33. Interface to Compliant Robotic Arm

  34. Interface to Compliant Robotic Arm

  35. Example Projects: SpOmniBee
• 2 laterally facing retinas
• Retina events feed into SpiNNaker
• Distributed computation of local optical flow (one common approach is sketched below)
• Global combination of optic flow
• Computation of motor commands on SpiNNaker, sent to the robot
Goal: keep the SpOmnibot centered in a marked hallway, similar to Srinivasan's bee experiments.
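For the optical-flow step, one standard event-based approach (the slide does not say which method the project uses) fits a local plane to the time surface of recent event timestamps, in the style of Benosman et al.; the plane's slopes give the inverse of the image velocity. A minimal sketch with assumed window size and synthetic data:

```python
# Local optical flow from events via a plane fit on the time surface.
# Window size and the synthetic edge below are assumptions.
import numpy as np

H = W = 128
time_surface = np.full((H, W), -np.inf)   # timestamp of last event per pixel

def on_event_flow(x, y, t_s, r=2):
    """Update the time surface, then fit t = a*x + b*y + c over a (2r+1)^2
    neighborhood; flow velocity is (1/a, 1/b) in pixels per second."""
    time_surface[y, x] = t_s
    patch = time_surface[max(0, y - r):y + r + 1, max(0, x - r):x + r + 1]
    ys, xs = np.nonzero(np.isfinite(patch))
    if len(xs) < 5:
        return None                       # not enough recent events to fit
    ts = patch[ys, xs]
    A = np.column_stack([xs, ys, np.ones(len(xs))])
    (a, b, _), *_ = np.linalg.lstsq(A, ts, rcond=None)
    # The time surface's spatial gradient is the inverse of image velocity.
    vx = 1.0 / a if abs(a) > 1e-9 else 0.0
    vy = 1.0 / b if abs(b) > 1e-9 else 0.0
    return vx, vy

# A vertical edge sweeping right at 100 px/s: column x fires at t = x / 100.
for x in range(40, 60):
    for y in range(60, 69):
        flow = on_event_flow(x, y, t_s=x / 100.0)
print(flow)   # roughly (100.0, 0.0) px/s
```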
