Computational Neuroscience for Technology: Event-based Vision - PowerPoint PPT Presentation


SLIDE 1

Computational Neuroscience for Technology: Event-based Vision Sensors and Information Processing

Jörg Conradt
Neuroscientific System Theory
Center of Competence on Neuro-Engineering
Technische Universität München

http://www.nst.ei.tum.de

28.11.2014 Qualcomm Vienna

SLIDE 2

Neuromorphic Engineering

Neuro: “to do with neurons”, i.e. neurally inspired; morphic: “structure or form”

Discovering key principles by which brains work and implementing them in technical systems that intelligently interact with their environment.

Examples:

  • Analog VLSI circuits to implement neural processing
  • Sensors, such as Silicon Retinae or Cochleae
  • Distributed Adaptive Control, e.g. CPG-based locomotion
  • Massively Parallel Self-Growing Computing Architectures
  • Neuro-Electronic Implants, Rehabilitation

SLIDE 3

Computation in Brains

Getting to know your Brain

  • 1.3 kg, about 2% of body weight
  • 10^11 neurons
  • Neuron growth: 250,000 per minute (early pregnancy) … but also a loss of 1 neuron per second

SLIDE 4

Computation in Brains

“Operating Mode” of Neurons

  • Analog leaky integration in soma
  • Digital pulses (spikes) along neurites
  • 10^14 stochastic synapses
  • Typical operating “frequency”: ≤100 Hz, typically ~10 Hz, asynchronous
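These two properties, analog leaky integration in the soma and digital spikes on the output, are captured by the textbook leaky integrate-and-fire model. The following is a minimal illustrative sketch, not material from the talk; all parameter values are generic assumptions.

import numpy as np

def lif_neuron(input_current, dt=1e-3, tau=20e-3, v_rest=-65e-3,
               v_thresh=-50e-3, v_reset=-65e-3, r_m=1e7):
    """Leaky integrate-and-fire neuron: analog leaky integration of the
    input current, a digital spike whenever the threshold is crossed.
    All parameter values are illustrative, not taken from the slides."""
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        # analog part: leaky integration towards v_rest plus driven input
        v += (-(v - v_rest) + r_m * i_in) * dt / tau
        # digital part: emit a spike and reset once the threshold is reached
        if v >= v_thresh:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# A constant 2 nA input drives this neuron at a few tens of Hz,
# in the asynchronous, low-rate regime described above.
print(len(lif_neuron(np.full(1000, 2e-9))), "spikes in 1 s")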

SLIDE 5

Computation in Brains ... and Machines

Getting to know your Computer’s CPU:

  • 50 g, irrelevant for most applications
  • 2 × 10^9 transistors (Intel Itanium)
  • Ideally no modification over lifetime

“Operating Mode” of CPUs

  • No analog components
  • Digital signal propagation
  • Reliable signal propagation
  • Typical operating frequency: several GHz, synchronous

SLIDE 6

[Figure: number of neurons across species on a logarithmic scale (10^3 to 10^12), from sponge and C. elegans through honeybee, zebrafish, frog, mouse, cat, and elephant up to chimpanzee and human]


Computation in Brains ... and Machines

SLIDE 7

Research Area «Neuroscientific System Theory»

  • High-Speed Event-Based Vision
  • On-Board Vision-Based Control of Miniature UAVs
  • Distributed Local Cognitive Maps for Spatial Representation
  • Event-Based Navigation

Neuronal-Style Information Processing in Closed-Control-Loop Systems

  • Distributed Local Information Processing
  • Growing and Adaptive Networks of Computational Units
  • Neuromorphic Sensor Fusion and Distributed Actuator Networks
  • Event-Based Perception, Cognition, and Action

Distributed Sensing and Actuation

SLIDE 8

[Figure: embedded DVS (eDVS) board, 52 mm × 30 mm, with ARM7 microcontroller, DVS sensor, and UART/SPI interface]

  • 128 × 128 pixels, each signals temporal changes of illumination (“events”)
  • Asynchronous updates (no image frames)
  • 15 μs latency, up to 1 Mevents/sec

Joint work with the Neuromorphic Sensors Group, ETH/University Zurich

Retina-Inspired Dynamic Vision Sensor

SLIDE 9

Event-Based Vision

Frame-based: discrete time, full frames
Event-based: continuous time, individual pixels
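The difference between the two representations can be made concrete with a small data-structure sketch; the names and the synthetic events below are illustrative assumptions, not part of the slides.

from collections import namedtuple

# Event-based: each pixel reports its own brightness changes, individually
# time-stamped with microsecond resolution; there is no frame clock.
Event = namedtuple("Event", ["timestamp_us", "x", "y", "polarity"])

event_stream = [
    Event(5601, 64, 30, +1),   # pixel (64, 30) became brighter
    Event(5612, 3, 2, -1),     # pixel (3, 2) became darker
    Event(5640, 64, 31, +1),
]

# Frame-based: the full 128 x 128 array is read out at fixed intervals
# (e.g. every 33 ms), whether or not anything in the scene has changed.
FRAME_INTERVAL_US = 33_000
frame = [[0] * 128 for _ in range(128)]   # one complete image per readout

# Static pixels generate no events at all, which keeps the event stream
# sparse, while the frame-based readout has constant bandwidth.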

SLIDE 10

Advantages

  • Sparse data stream: ~0.1 MB/s bandwidth
  • ~10 μs response time
  • Local contrast adjustment per pixel
  • Automatic preprocessing

Challenges

  • Static scenes (no change means no events)
  • New vision algorithms needed for tracking, localization, ...

Event-Based Vision

SLIDE 11

An Asynchronous Temporal Difference Vision Sensor

Conradt et al, ISCAS 2009

SLIDE 12

An Asynchronous Temporal Difference Vision Sensor

Conradt et al, ISCAS 2009

SLIDE 13

[Figure: active target, 26 mm × 36 mm, with marker LED, rechargeable battery, a microcontroller generating the PWM blink signal, and a power switch; eDVS processing pipeline with a 128×128 per-pixel event-time memory, the two weighting functions G_T and G_P, and event-count histograms over pixel X and pixel Y]

Per-event tracking rule (numbers are the worked example from the slide):

  • The marker LED blinks at a fixed rate, so a new event at the marker is expected Δt_exp = 1000 μs after the previous one.
  • A 128×128 memory stores the previous event time per pixel. For an incoming event at time t_E = 5612 μs and position P_E = (3, 2), the stored previous event time is t_M = 4600 μs, giving Δt_E = t_E – t_M.
  • Time deviation: t_D = Δt_E – Δt_exp = 12 μs. The weighting function G_T maps the time deviation t_D (in μs) to a relative weight w_T.
  • Position deviation from the currently tracked position P_T = (7, 3): P_D = (P_E – P_T)² = 17. The weighting function G_P maps the position deviation P_D (in pixels) to a relative weight w_P.
  • Position update: P_{T+1} = (1 – η(w_T + w_P)) • P_T + η(w_T + w_P) • P_E

Müller et al, ROBIO 2011

Application Example: Tracking an Active Object
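A minimal Python sketch of this per-event update rule follows. The Gaussian shape of the weighting functions, the learning rate η, and the σ values are illustrative assumptions; the slide specifies only the roles of G_T and G_P, not their exact form.

import math

# 128 x 128 per-pixel memory of the last event time, plus the tracker state.
last_event_time_us = [[0] * 128 for _ in range(128)]
DT_EXP_US = 1000            # expected inter-event interval at the marker LED
ETA = 0.05                  # learning rate η (illustrative value)
tracked = [7.0, 3.0]        # current tracked position P_T

def gaussian_weight(deviation, sigma):
    """Relative weight in [0, 1]; the exact shape of G_T / G_P is assumed."""
    return math.exp(-(deviation ** 2) / (2.0 * sigma ** 2))

def on_event(t_us, x, y):
    """Fold a single incoming event (t_E, P_E) into the tracked position."""
    global tracked
    dt_e = t_us - last_event_time_us[y][x]        # Δt_E = t_E - t_M
    last_event_time_us[y][x] = t_us
    t_d = dt_e - DT_EXP_US                        # time deviation t_D
    w_t = gaussian_weight(t_d, sigma=100.0)       # G_T: blink-period match
    p_d = (x - tracked[0]) ** 2 + (y - tracked[1]) ** 2   # position deviation P_D
    w_p = gaussian_weight(math.sqrt(p_d), sigma=10.0)     # G_P: proximity to P_T
    w = ETA * (w_t + w_p)
    # P_{T+1} = (1 - η(w_T + w_P)) • P_T + η(w_T + w_P) • P_E
    tracked = [(1.0 - w) * tracked[0] + w * x,
               (1.0 - w) * tracked[1] + w * y]

last_event_time_us[2][3] = 4600   # t_M from the slide's worked example
on_event(5612, 3, 2)              # the event t_E = 5612 μs at P_E = (3, 2)
print(tracked)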

SLIDE 14

[Figure: tracking setup with eDVS sensor, pan servo, tilt servo, and laser diode; marker, 240 mm]

Application Example: Tracking an Active Object

Müller et al, ROBIO 2011
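Closing the loop from the tracked marker position to the pan/tilt servos can be done with simple proportional control in image coordinates. The sketch below is a generic illustration; the gain, field of view, and servo interface are assumptions rather than values from the slide.

# Proportional pan/tilt control: steer so the tracked marker moves toward
# the centre of the 128 x 128 eDVS image. All constants are assumptions.
FOV_DEG = 60.0                   # assumed sensor field of view
DEG_PER_PIXEL = FOV_DEG / 128.0
K_P = 0.5                        # proportional gain

pan_deg, tilt_deg = 0.0, 0.0     # current servo commands

def update_servos(tracked_x, tracked_y):
    """One control step: pixel error of the tracked marker -> servo angles."""
    global pan_deg, tilt_deg
    err_x = tracked_x - 64.0     # horizontal error in pixels
    err_y = tracked_y - 64.0     # vertical error in pixels
    pan_deg += K_P * err_x * DEG_PER_PIXEL
    tilt_deg += K_P * err_y * DEG_PER_PIXEL
    return pan_deg, tilt_deg

print(update_servos(80.0, 60.0))   # marker to the right of centre -> pan right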

SLIDE 15

[Figure/video: tracking setup (eDVS sensor, pan servo, tilt servo, laser diode) and tracking result over time, 0–10 s]

Georg Müller

Application Example: Tracking an Active Object

Müller et al, ROBIO 2011

SLIDE 16

(Fun) Application Example: High Speed Balancing of Short Poles

Conradt et al, ISCAS 2009

SLIDE 17

[Video: incoming events (left) and current estimate (right)]

Converting Events into Meaning
(Fun) Application Example: High Speed Balancing of Short Poles

Conradt et al, ECV 2009
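One generic way to turn the incoming events into a continuously updated estimate of the pole is an exponentially forgetting line fit over recent event coordinates, sketched below. This is an illustrative stand-in, not necessarily the estimator used in the paper; the decay factor is an assumption.

import math

# Exponentially forgetting statistics over recent event coordinates.
DECAY = 0.995                     # forgetting factor per event (assumed)
n = sx = sy = sxx = syy = sxy = 0.0

def on_event(x, y):
    """Fold one event into the running statistics of the event cloud."""
    global n, sx, sy, sxx, syy, sxy
    n, sx, sy = DECAY * n + 1.0, DECAY * sx + x, DECAY * sy + y
    sxx, syy = DECAY * sxx + x * x, DECAY * syy + y * y
    sxy = DECAY * sxy + x * y

def current_estimate():
    """Return the pole estimate: centroid and orientation of the event cloud."""
    mx, my = sx / n, sy / n
    cxx, cyy = sxx / n - mx * mx, syy / n - my * my
    cxy = sxy / n - mx * my
    angle = 0.5 * math.atan2(2.0 * cxy, cxx - cyy)   # principal-axis angle
    return (mx, my), angle

# Events arriving along a tilted pole shift the estimate within microseconds,
# without ever waiting for a complete image frame.
for i in range(50):
    on_event(60.0 + 0.3 * i, 10.0 + i)
print(current_estimate())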

SLIDE 18

Extended Application Example: High Speed Tracking

Bamford et al, submitted EBCCSP’15

SLIDE 19

Application Example: SLAM for Event-Based Vision Systems
Framework: Particle Filter

Weikersdorfer et al, ROBIO 2012
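The framework is a particle filter in which every single event, rather than a complete frame, contributes a small weight update to the pose hypotheses. Below is a heavily simplified 2D translation-only sketch under assumed map and likelihood models; it illustrates the idea, not the actual implementation from the paper.

import random

# Toy map: value ~ how likely an event is to originate from a world cell.
MAP_W, MAP_H = 64, 64
world_map = [[0.01] * MAP_W for _ in range(MAP_H)]
world_map[20][30] = 0.9           # one strongly contrasting landmark (assumed)

# Particles: pose hypotheses (translation only here) with importance weights.
particles = [{"x": random.uniform(0, MAP_W), "y": random.uniform(0, MAP_H), "w": 1.0}
             for _ in range(200)]

def on_event(ev_x, ev_y):
    """Per-event weight update, with resampling when the weights degenerate."""
    for p in particles:
        # project the event into the map under this particle's pose hypothesis
        wx = int(p["x"] + ev_x) % MAP_W
        wy = int(p["y"] + ev_y) % MAP_H
        p["w"] *= 0.5 + world_map[wy][wx]       # soft per-event likelihood
    weights = [p["w"] for p in particles]
    n_eff = sum(weights) ** 2 / sum(w * w for w in weights)
    if n_eff < len(particles) / 2:              # resample when degenerate
        chosen = random.choices(particles, weights=weights, k=len(particles))
        particles[:] = [{"x": c["x"], "y": c["y"], "w": 1.0} for c in chosen]

on_event(10, 10)   # feed one synthetic event through the filter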

SLIDE 20

Application Example: SLAM for Event-Based Vision Systems
Event-Based Localization

Weikersdorfer et al, ROBIO 2012

SLIDE 21

Application Example: SLAM for Event-Based Vision Systems
Event-Based Mapping

Weikersdorfer et al, ICVS 2013
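Mapping is the complementary step: given a pose estimate (e.g. from a particle filter as above), every event votes for the world cell it was generated from, which gradually builds the map used for localization. A toy sketch with the same simplifying translation-only assumption:

# Toy event-based mapping: each event, projected through the current pose
# estimate, increments an occurrence count in the corresponding map cell.
MAP_W, MAP_H = 64, 64
event_counts = [[0] * MAP_W for _ in range(MAP_H)]

def update_map(pose_x, pose_y, ev_x, ev_y):
    """Accumulate one event into the map (translation-only toy model)."""
    wx = int(pose_x + ev_x) % MAP_W
    wy = int(pose_y + ev_y) % MAP_H
    event_counts[wy][wx] += 1

update_map(5.0, 3.0, 10, 10)   # one event observed from pose (5, 3)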

SLIDE 22

Application Example: SLAM for Event-Based Vision Systems
Event-Based Mapping

Weikersdorfer et al, ICVS 2013

SLIDE 23

?

Application Example: SLAM for Event-Based Vision Systems

Hoffmann et al, ECMR 2013

5x real time

SLIDE 24

Event-based 3D SLAM with a depth-augmented Dynamic Vision Sensor

  • Sensor combination: event-based dynamic vision and PrimeSense object distance
  • Sparse sensory data allows real-time processing and integration
  • Localization update rates of > 100 Hz on a standard PC
  • Suitable for small autonomous mobile robots in high-speed application scenarios

David Weikersdorfer, David B. Adrian, Daniel Cremers, Jörg Conradt

Technische Universität München, Germany

Weikersdorfer et al, ICRA 2014
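Pairing each event with a distance value from the PrimeSense depth camera turns every event into a sparse 3D point via standard pinhole back-projection. The sketch below shows that step; the intrinsic parameters are placeholders, not the calibrated values of the actual sensor.

# Back-project an event (u, v) with its associated depth into a 3D point.
# Intrinsics are placeholder values, not a real calibration.
FX, FY = 100.0, 100.0    # focal lengths in pixels (assumed)
CX, CY = 64.0, 64.0      # principal point of a 128 x 128 sensor (assumed)

def event_to_point(u, v, depth_m):
    """Pinhole back-projection: pixel coordinates + depth -> camera-frame 3D point."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return x, y, depth_m

# Only changing pixels generate events, so the resulting 3D point stream is
# sparse enough for >100 Hz pose updates on a standard PC.
print(event_to_point(80, 50, 1.5))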

SLIDE 25

An Autonomous Indoor Quadrocopter (eb3D SLAM)

Selected hardware components: eDVS (red), Mini-PC (blue), PrimeSense IR projector (yellow) and IR camera (green), and drone control board (magenta).

  • Platform: Parrot AR.Drone, diameter ca. 80 cm
SLIDE 26

An Autonomous Indoor Quadrocopter (eb3D SLAM)

Evaluation against ground truth: eb-3DSLAM trajectory vs. desired trajectory vs. external tracker (OptiTrack Flex13)
SLIDE 27

Outlook: An Autonomous Indoor Micro Quadrocopter (3D SLAM)

Micro-eDVS

[Figure: micro-eDVS board with DVS sensor, microcontroller, UART port, and lightweight lenses; micro quadrocopter, 16 cm]

Key Specs:

  • 20 × 20 mm, 3 g
  • 75 mW power
  • 64 KB SRAM
  • 32-bit, 64 MHz

Key Technical Specs:

  • Diameter 16 cm
  • Weight 38 g
  • Autonomy 14 min
SLIDE 28
SLIDE 29

Neuromorphic Computing Landscape

SLIDE 30
  • A million mobile phone processors in one computer
  • Able to model about 1% of the human brain…
  • …or 10 mice!

SpiNNaker Project

SLIDE 31

[Figure: SpiNNaker chip with Mobile DDR SDRAM interface; multi-chip packaging by UNISEM Europe]

SpiNNaker Chip

SLIDE 32

Distributed Neuronal Computation “on chip”

Asynchronous Spiking Neural Computation Hardware for low-power real-time operation in Closed-Loop Systems

864-core “Rack Version”, 72-core evaluation board, 18-core stand-alone prototype

  • Multi-channel spiking input and output
  • Stand-alone spiking computing system
  • Simulates ~20,000 neurons in real time
  • Small (~20 × 20 mm); low power (~600 mW)
  • Flexibly configurable, extendable, stackable

... to simulate 1 billion spiking neurons in real time

http://www.nst.ei.tum.de

SLIDE 33

The SpiNNaker – Robot Interface Board

[Diagram: the interface board integrates as an “additional chip” between the SpiNNaker system and an IO board, translating sensory/actuator data into SpiNNaker packets for devices such as the OmniBot, retina, or balancer]

[Photo: NST SpOmniBot with WLAN interface, embedded 768-core SpiNNaker board, stereo eDVS sensors, and interface board]

Denk et al, ICANN 2013

SLIDE 34

Interface to Compliant Robotic Arm

SLIDE 35

Interface to Compliant Robotic Arm

SLIDE 36

Example Projects: SpOmniBee

  • 2 laterally facing retinas
  • Retina events feed into SpiNNaker
  • Distributed computation of local optical flow
  • Global combination of optic flow
  • Computation of motor commands on SpiNNaker, sent to the robot

Goal: keep the SpOmniBot centered in a hallway with wall markings, similar to Srinivasan’s bee experiments
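The control law mirrors Srinivasan’s bee experiments: if the left retina sees stronger optic flow than the right one, the robot is closer to the left wall and should steer right, and vice versa. A minimal sketch of that balancing rule, assuming the flow magnitudes themselves are computed elsewhere (e.g. on SpiNNaker as listed above); the gain and command interface are illustrative.

# Bee-inspired hallway centering: steer away from the side with larger flow.
K_STEER = 0.8            # steering gain (assumed)
FORWARD_SPEED = 0.2      # constant forward drive in m/s (assumed)

def centering_command(left_flow_mag, right_flow_mag):
    """Map the two lateral retinas' average flow magnitudes to a drive command.

    Returns (forward velocity, rotational velocity); positive rotation is
    taken here to mean turning right."""
    rotation = K_STEER * (left_flow_mag - right_flow_mag)
    return FORWARD_SPEED, rotation

print(centering_command(1.4, 0.9))   # closer to the left wall -> turn right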

SLIDE 37

Application Example: Event-Based Optic Flow for Robot Control

SLIDE 38

Example Projects: Ball Balancer

  • One Retina to observe the platform in top-down view
  • Two motors to control platform’s pan and tilt angles
  • SpiNNaker Interface Board is connected to retina and motors

Goal: Keep ball (one or more!) on a certain trajectory

Live Demo at HBP (EU Flagship) Inaugural Summit, Oct. 2013, Lausanne
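Keeping the ball on the trajectory is a classical closed-loop control task: the retina provides a fast ball-position estimate, and the two motors tilt the platform to drive the position error to zero. A minimal PD-control sketch; gains, units, and interfaces are assumptions, not values from the demo.

# PD control of the platform's pan and tilt angles from the tracked ball
# position. Gains are illustrative, not tuned values from the demo.
K_P, K_D = 0.02, 0.2
prev_err_x = prev_err_y = 0.0

def control_step(ball_x, ball_y, target_x, target_y, dt):
    """One control update: image-space position error -> platform angles."""
    global prev_err_x, prev_err_y
    err_x, err_y = target_x - ball_x, target_y - ball_y
    d_err_x = (err_x - prev_err_x) / dt
    d_err_y = (err_y - prev_err_y) / dt
    prev_err_x, prev_err_y = err_x, err_y
    tilt_about_y = K_P * err_x + K_D * d_err_x   # accelerates the ball along x
    tilt_about_x = K_P * err_y + K_D * d_err_y   # accelerates the ball along y
    return tilt_about_x, tilt_about_y

# Event-based tracking reports changes within tens of microseconds, so this
# loop can run far faster than a frame-based camera would allow.
print(control_step(60, 70, 64, 64, dt=0.001))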

SLIDE 39
SLIDE 40

Neural Principles for System Control

http://www.nst.ei.tum.de