From Context Awareness to Socially Aware and Interactive Systems - - PowerPoint PPT Presentation



slide-1
SLIDE 1

From Context Awareness to Socially Aware and Interactive Systems

Paul Lukowicz DFKI/University of Kaiserslautern Germany

slide-2
SLIDE 2

Overview

  • Part 1: Introduction of key concepts
  • Part 2: Basic technologies

– Sensing – Reasoning

  • Part 3:

– phenomenology of complex socio-technical systems – applications and outlook

slide-3
SLIDE 3

Overview

  • Standard Sensors

– Motion, Sound, Location, FSR, RFID, Bluetooth

  • Exotic Sensors
slide-4
SLIDE 4

Acceleration Sensor

  • Acceleration sensors provide two types of information
slide-5
SLIDE 5

Microphone GPS Compass Bluetooth Wireless LAN Accelerometer Gyroscope Camera

Phone Sensors

slide-6
SLIDE 6
  • Acceleration sensors provide two types of information

 Angle towards the gravity vector (tilt)

Acceleration Sensor

1g X Y Z

slide-7
SLIDE 7

Acceleration Sensor

  • Acceleration sensors provide two types of information

 Angle towards the gravity vector (tilt)  Change of motion speed

X Y Z

slide-8
SLIDE 8

Acceleration Sensor

  • Acceleration sensors provide two types of information

 Angle towards the gravity vector (tilt)  Change of motion speed  These components cannot be easily separated!

1g

slide-9
SLIDE 9

Greeting

[Figure: example acceleration signal over time, with values between −2 g and +2 g]

slide-10
SLIDE 10

Filtering and Acceleration Components

Raw signal; low-pass (fcut = 2 Hz) extracts the gravity/tilt component; high-pass (fcut = 2 Hz) extracts the motion component
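The 2 Hz low-pass/high-pass split can be sketched with a simple first-order IIR filter (the filter design is an assumption; the slides do not specify it):

```python
import numpy as np

def split_acceleration(raw, fs, fcut=2.0):
    """Split raw accelerometer samples into a low-pass component
    (gravity/tilt) and a high-pass component (motion) using a
    first-order IIR low-pass with cutoff fcut (Hz) at rate fs (Hz)."""
    alpha = 1.0 / (1.0 + fs / (2.0 * np.pi * fcut))  # dt / (RC + dt)
    gravity = np.empty(len(raw))
    g = raw[0]
    for i, x in enumerate(raw):
        g += alpha * (x - g)            # low-pass tracks the slow gravity part
        gravity[i] = g
    motion = np.asarray(raw) - gravity  # the residual is the motion part
    return gravity, motion
```

On a stationary sensor the low-pass output converges to the gravity projection on that axis, while the high-pass output goes to zero.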

slide-11
SLIDE 11

Walking Up and Down Stairs

Acceleration on the upper leg Down Up

slide-12
SLIDE 12

IMUs

  • Xsens Inertial Measurement Unit (IMU)

– 3-axis accelerometer – 3-axis magnetometer – 3-axis gyroscope

Derives absolute orientation of the device with respect to

  • gravity direction
  • north
slide-13
SLIDE 13

Application of IMUs

slide-14
SLIDE 14

Magnetic Field and Light

slide-15
SLIDE 15

Multisensor Signal

detect sitting down

slide-16
SLIDE 16

Sound

FFT and LDA with kitchenette sounds

slide-17
SLIDE 17

Water Pipe Sounds

Fogarty, Au, Hudson, UIST 2006

slide-18
SLIDE 18

Water Pipe Sounds

Fogarty, Au, Hudson, UIST 2006

slide-19
SLIDE 19

Water Pipe Sounds

slide-20
SLIDE 20

Audio: Conversation

  • Intensity difference reveals whether the owner is speaking or listening

– human voice distinguishable from other noises

slide-21
SLIDE 21

Dietary Monitoring

Initial success in detecting and classifying chewing sounds

– How many bites? – What type of food was eaten?

Oliver Amft, Mathias Stäger, Paul Lukowicz, and Gerhard Tröster. Analysis of Chewing Sounds for Dietary Monitoring, Proc. UBICOMP 2005, Tokyo

microphone gets sounds through bone conduction

slide-22
SLIDE 22

Microphone position

1 2 3 4 5 6

slide-23
SLIDE 23

Bite Detection

4 bites of apple

slide-24
SLIDE 24

FSR for Muscle Activity Monitoring

  • thickness <0.5mm
  • sampling rate: >100Hz
  • power loss <1mW

typical force curve

slide-25
SLIDE 25

FSRs in Shoe Soles

normal, up, down

slide-26
SLIDE 26

Muscle Activity

  • Movement is produced by the interplay of different muscles
  • Even small differences in movement patterns can be clearly seen in the muscle activity

slide-27
SLIDE 27

Long vs. Short Strides

short long

slide-28
SLIDE 28

Up vs. Downstairs

up down

slide-29
SLIDE 29

Up vs. Downstairs

up down

slide-30
SLIDE 30

Squats Technique

correct asymmetric cheat

slide-31
SLIDE 31

RFID

slide-32
SLIDE 32

RFID Example: Intel

Patterson, Fox, Kauz, Philipose, ISWC 2005

slide-33
SLIDE 33

Bluetooth: Presence of People

BT scans can reveal identities of people who interact with each other

BT

slide-34
SLIDE 34

Location

  • Types

– absolute coordinates – semantic (e.g. which room) – proximity

  • Parameters

– precision (30 cm to a few meters, up to room level) – reliability (how often am I right?) – installation – price – API

  • outdoor location trivial (GPS),
  • indoor location unsolved
slide-35
SLIDE 35

Beacon Based Principles

slide-36
SLIDE 36

Beacon Based Principles

Distance Estimation Methods

  • 1. Time of flight

– for indoor environments the signal speed should be low (otherwise the required timing precision becomes infeasible) – problems with multi-path

  • 2. Signal strength

– problems with non distance dependent attenuation
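Both distance estimation methods reduce to one-line formulas. The sketch below assumes a log-distance path-loss model with an illustrative reference power of −59 dBm at 1 m and path-loss exponent n = 2 (both values are assumptions, not from the slides):

```python
SPEED_OF_LIGHT = 3.0e8   # m/s for RF; use ~343 m/s for ultrasound

def dist_time_of_flight(t, speed=SPEED_OF_LIGHT):
    """Distance from a (one-way) signal travel time in seconds."""
    return speed * t

def dist_signal_strength(rssi, tx_ref=-59.0, n=2.0):
    """Log-distance path-loss model: rssi = tx_ref - 10*n*log10(d),
    where tx_ref is the received power at 1 m (illustrative value)."""
    return 10.0 ** ((tx_ref - rssi) / (10.0 * n))
```

The second formula also shows the slide's caveat: any attenuation not caused by distance (walls, bodies) shifts `rssi` and directly corrupts the estimate.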

slide-37
SLIDE 37

Beacon Technical Issues

  • Occlusions/Absorption

– E.g. metal absorbs radio signals

  • Reflections

– signals bounce around providing false information

  • Noise
  • User Acceptance

– electro-smog “hysteria”

slide-38
SLIDE 38

Beacon Technologies

  • Ultrasound: Time Of Flight

– used with RF for communication and synchronization

  • RF: Time Of Flight, Signal Strength

– WiFi (RADAR, Ekahau) – Bluetooth, ZigBee – Active RFID – Ultra-wideband (UWB, e.g. Ubisense)

slide-39
SLIDE 39

Beacon Technologies

  • Ultrasound: Time Of Flight

– used with RF for communication and synchronization

  • RF: Time Of Flight, Signal Strength

– WiFi (RADAR, Ekahau) – Bluetooth, ZigBee – Active RFID – Ultra-wideband (UWB, e.g. Ubisense)

slide-40
SLIDE 40

Wifi Signal Strength

slide-41
SLIDE 41

WLAN Fingerprinting
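The fingerprinting idea can be sketched in a few lines: match a live RSSI scan against a pre-recorded radio map by distance in signal space. The −100 dBm default for access points missing on one side is an illustrative assumption:

```python
def nearest_fingerprint(scan, radio_map, floor=-100.0):
    """Return the location whose stored fingerprint is closest to the
    live scan in signal space. Both scan and fingerprints map BSSID ->
    RSSI (dBm); APs missing on one side get the weak default `floor`."""
    def dist(fp):
        aps = set(scan) | set(fp)
        return sum((scan.get(a, floor) - fp.get(a, floor)) ** 2 for a in aps) ** 0.5
    return min(radio_map, key=lambda loc: dist(radio_map[loc]))
```

In practice the radio map is built in a survey phase, one fingerprint (or several) per location.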

slide-42
SLIDE 42

UWB Location

slide-43
SLIDE 43

Example of UWB

  • Ubisense

– Ubisense delivers a precise, real-time location system (RTLS) utilizing ultra-wideband (UWB) technology that locates assets and people within 30cm / 12" in 3D.

15,000 Euro for a research set!

slide-44
SLIDE 44

Performance UWB

slide-45
SLIDE 45

Floor Sensors

  • Capacitive tracking of objects on the floor

– www.future-shape.com

slide-46
SLIDE 46
slide-47
SLIDE 47

Overview

  • 1. Standard Sensors

– Motion, Sound, Location, FSR, RFID, Bluetooth

  • Exotic Sensors
slide-48
SLIDE 48

Relative position information?


Many activities are determined by the relative positions of two body parts. ("Hand is near chest.")

slide-49
SLIDE 49

Relative Positioning


Acceleration, gyroscope, and earth magnetic field sensors (example: Xsens): relative position information only indirectly available (Euler angles + knowledge about sensor positions)

Ultrasound (example: Relate bricks): obstacles shield the measurements

RF based (example: Ubisense): low accuracy and infrastructure needed

slide-50
SLIDE 50

Physical Principle


The transmitter generates a magnetic field at a certain resonant frequency f; the receiver is calibrated to the same resonant frequency f

slide-51
SLIDE 51


The magnetic field is oscillating (received by coils 1–5)

slide-52
SLIDE 52

Drinking (Magnetic Sensing)

slide-53
SLIDE 53

Poor Man's X-Ray: Capacitors

slide-54
SLIDE 54

Physical background

  • The human body can be considered part of a capacitor
  • Changes in distance or movement of internal organs result in capacitance changes

slide-55
SLIDE 55

Sensing Setup

attached to fit tightly but remain comfortable

slide-56
SLIDE 56

Signal Examples

slide-57
SLIDE 57

Eye Tracking

Andreas Bulling, ETH Zürich

slide-58
SLIDE 58

Eye Tracking

slide-59
SLIDE 59

Eye Activity when Reading

  • Bulling, ETH Zürich
slide-60
SLIDE 60

Bread cutter: running; cutting → bread, salami. Mixer: mixing level; consistency of fluid. Water boiler: amount of boiled water.

Current Sensing

slide-61
SLIDE 61

Current Sensing

iSensor:

  • stream raw ADC values of the current power consumption
  • stream recognized device activity events
slide-62
SLIDE 62

Kitchen Scenario

Why we choose a kitchen scenario:

  • Lots of kitchen devices provide different modes
  • Kitchen devices can be used in different ways
  • In general: preparing food is still quite hard to recognize

Analyzed kitchen devices:

slide-63
SLIDE 63

Kitchen Scenario – Signal Analysis Device: Mixer – Raw ADC values

slide-64
SLIDE 64

Kitchen Scenario – Signal Analysis Device: Bread cutter – Raw ADC values

slide-65
SLIDE 65

Kitchen Scenario – Signal Analysis Device: Water boiler (cold water) – Raw ADC values

slide-66
SLIDE 66

Kitchen Scenario – Signal Analysis Device: Fridge – Raw ADC values

slide-67
SLIDE 67

Kitchen Scenario – Signal Analysis Device: Egg boiler, toaster, coffee machine

3 or 5 eggs boiled; soft, medium, or hard boiled; small or big coffee; lightly, brown, or dark toasted

slide-68
SLIDE 68

Kitchen Scenario – Data Processing

Objective: avoid data streaming!

  • → send only events (e.g. mixer mode: level 2)

Jennic ZigBee microcontroller for data processing

  • → just simple feature calculation methods and recognition algorithms

Features: Min, Max, Sum, Variance, Average (take the max over the last 600 ADC samples and compute the features over the last 21 maxima); Time: duration of a device activity/mode
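The on-node feature computation described above might look like this; the block and window sizes are taken from the slide, but the exact max-pooling scheme is an interpretation:

```python
import statistics

def current_features(adc, block=600, n_max=21):
    """Max-pool the raw ADC stream in blocks of `block` samples, then
    compute the slide's features (min, max, sum, average, variance)
    over the last `n_max` block maxima."""
    maxima = [max(adc[i:i + block]) for i in range(0, len(adc) - block + 1, block)]
    window = maxima[-n_max:]          # only the most recent maxima
    return {
        "min": min(window),
        "max": max(window),
        "sum": sum(window),
        "avg": sum(window) / len(window),
        "var": statistics.pvariance(window),
    }
```

Working on block maxima instead of raw samples keeps memory and arithmetic cheap enough for a small microcontroller, which is the point of the slide.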

slide-69
SLIDE 69

Kitchen Scenario – Data Processing

Objective: avoid data streaming!!!

  • > send only events (e.g. mixer mode: level 2)

Jennic Zigbee microcontroller for data processing

  • > just simple feature calculation methods and recognition

algorithms Thresholds could be sufficient!!!

slide-70
SLIDE 70

Kitchen Scenario – Data Processing

Some examples:

slide-71
SLIDE 71

Kitchen Scenario – Data Processing

slide-72
SLIDE 72

Evaluation

Single device usage: Fridge – door open events: 35 hours monitored, 91% recognition rate, 9% missed

slide-73
SLIDE 73

Evaluation

Water boiler

  • boil 0.75l, 1.0l, 1.25l, 1.5l and 1.7l of cold water

deviation: between 0.006l and 0.076l

slide-74
SLIDE 74

Evaluation

Bread cutter

  • cut slices of bread and salami; 6 different people, 28 slices of bread, 29 slices of salami

Classification: 93.10%, 96.42%

Cutting activities were recognized twice

slide-75
SLIDE 75

Evaluation

Mixer - 7 users, mix both a liquid and a creamy fluid: chocolate milk (250ml milk, 2 spoons of chocolate powder) + quark-yoghurt dish (250g quark and 1 yoghurt)

Result: 100% of chocolate milks = fluid liquids 100% of quark-yoghurt dishes = creamy liquids 57% of quark-yoghurt dishes = change from medium consistency to a creamy one

slide-76
SLIDE 76

Evaluation

6 persons performed:

slide-77
SLIDE 77

Evaluation

Several devices at the same time:

+ + +

slide-78
SLIDE 78

Overview

  • Part 1: Introduction of key concepts
  • Part 2: Basic technologies

– Sensing – Reasoning

  • Part 3:

– phenomenology of complex socio-technical systems – applications and outlook

slide-79
SLIDE 79

Overview

  • 1. General Background
  • 2. Examples
slide-80
SLIDE 80

What is Context Recognition ?

Embedded Controllers: feedback control loop Artificial Intelligence: imitating human cognition

  • often video based
  • includes interpretation

Context Recognition: mapping signals from simple sensors onto a set of predefined, environment related states

slide-81
SLIDE 81

Context Aware Systems 2000

  • Gellersen, Schmidt, Beigl 1999, from the SmartIts project
  • also Dey, Abowd, Pentland, Starner, Schiele, ... around 2000

acceleration, sound, light...

slide-82
SLIDE 82

Context Aware Systems 2000

  • Gellersen, Schmidt, Beigl 1999, from the SmartIts project
  • also Dey, Abowd, Pentland, Starner, Schiele, ... around 2000

acceleration, sound, light... modes of locomotion

slide-83
SLIDE 83

Microphone GPS Compass Bluetooth Wireless LAN Accelerometer Gyroscope Camera

....and 2012

slide-84
SLIDE 84

Context Recognition

IEEE PAMI 06, PMC 07, Pattern Rec 08, ACM TIST 11, Springer PAA 11 ISWC, Pervasive,..

slide-85
SLIDE 85

Basic Principle

gyro acc

slide-86
SLIDE 86

Basic Principle

class membership inferred from coordinates only

gyro acc ???????

slide-87
SLIDE 87

Basic Principle

gyro acc

class membership inferred from coordinates only

slide-88
SLIDE 88

Separation Plane

gyro acc

recognition given through separation plane

slide-89
SLIDE 89

complex separation plane

gyro acc

  • often complex and impossible to represent in an analytic way

slide-90
SLIDE 90

Recognition Issues

  • 1. Representation of the separation plane – memory requirements!
  • 2. Locating a point with respect to the separation plane
  • 3. Identifying the separation plane – generation from a sample – cost minimization

gyro acc

slide-91
SLIDE 91

Learning Separation Planes

gyro acc

slide-92
SLIDE 92

Learning Separation Planes

separation plane is not perfect

  • ambiguous sensor data
  • size of the sample
  • sensor noise

gyro acc

slide-93
SLIDE 93

Greeting

[Figure: example acceleration signal over time, with values between −2 g and +2 g]

slide-94
SLIDE 94

Analyzing Recordings

[Figure: torso and wrist acceleration [g] over time [s], annotated with sitting, standing up, standing, sitting down; arm on table, arm resting on table, arm moving, handshake]

slide-95
SLIDE 95

Time Series

z′1 z′2 z′3 … z′N and z″1 z″2 z″3 … z″N

Process P′ Process P″

t

slide-96
SLIDE 96

Time Series

z*1 z*2 z*3 z*N …

Process P′ Process P″

t ?????

slide-97
SLIDE 97

Context Rec. Challenges

  • 1. Ambiguous sensors

– e.g. phone in a pocket tracking patient care

  • 2. Lack of representative training data

– huge inter and intra person variations – training with real users often not practicable

  • 3. Inconsistent state space

– e.g. walking, having a meal, and taking a pill – different granularities, temporal scales, and levels of abstraction

  • 4. Dynamically changing sensor configurations

– phone moves around, different phones, new sensors

slide-98
SLIDE 98

Context Problem Types

state action situation parameters

slide-99
SLIDE 99

Context Problem Types

state action situation parameters

  • physical characterization
  • ‘classical’ signal processing

?

slide-100
SLIDE 100

Context Problem Types

state action situation parameters

  • separable in a time-invariant feature space
  • sliding window algorithms (window size, window shift) over the signal stream
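The sliding-window segmentation can be sketched in a few lines, with the window size and shift as named on the slide:

```python
def sliding_windows(stream, size, shift):
    """Yield fixed-length windows of `size` samples over the signal
    stream, advancing the window start by `shift` samples each step."""
    for start in range(0, len(stream) - size + 1, shift):
        yield stream[start:start + size]
```

Each window is then mapped to the feature space and classified independently; overlapping windows (shift < size) trade computation for temporal resolution.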

slide-101
SLIDE 101

Context Problem Types

state action situation parameters

  • characterized by signal sequence
  • time series analysis
  • ‘spotting’ problem

Does this section contain a gesture?

  • Or this one?
slide-102
SLIDE 102

Context Problem Types

state action situation parameters

  • specific, sequence/distribution of actions
  • PCFG, Bayesian networks, ontologies…
slide-103
SLIDE 103

Context Problem Types

state action situation parameters

slide-104
SLIDE 104

Context Recognition Example

does this section contain a gesture?

  • or this one?
  • no model of NULL class
  • long NULL class segments
  • variable length ‘true positives’
slide-105
SLIDE 105

Confusion Matrix

slide-106
SLIDE 106

Spotting Errors

slide-107
SLIDE 107

Confusion Matrix

slide-108
SLIDE 108

Precision-Recall Graphs

slide-109
SLIDE 109

Spotting Errors

slide-110
SLIDE 110

Overview

  • 1. General Background
  • 2. Examples
slide-111
SLIDE 111

Example: Assembly Monitoring

slide-112
SLIDE 112

Example: Assembly Monitoring

1,3 microphones 2,4,5 accelerometers 6 computer

P. Lukowicz, J. Ward, H. Junker, M. Stäger, G. Tröster, A. Atrash, T. Starner. Recognizing Workshop Activity Using Body Worn Microphones and Accelerometers, Pervasive Computing, pages 18–22, Vienna, Austria, 18.–23. April 2004.

slide-113
SLIDE 113

Drilling

∘Well defined, steady motion

  • slowly turn handle to lower, then return
  • constant machine noise

time

z-axis sound x-axis

slide-114
SLIDE 114

Sawing

∘Well defined, repetitive motion

  • back and forth
  • loud correlated sound

time

x-axis z-axis sound

slide-115
SLIDE 115

Approach

Sound classif. Motion classif. Sound Analysis Segment info. Sound data Motion data Final predictions IA Prediction Sequence Audio Ch2. FFT LDA Audio Ch1.

  • Min. Dist.

Audio Ch2. jm smoothing

slide-116
SLIDE 116

Signal Segmentation

IA Audio Ch1. Audio Ch2.

I_chest / I_wrist ≈ 1 ⇒ d_chest ≈ d_wrist

I_chest / I_wrist >> 1 ⇒ d_chest << d_wrist

hand on the switch

slide-117
SLIDE 117

Sound Classification

FFT LDA IA Audio Ch1. Audio Ch2.

  • Min. Dist.

Pre-computed LDA training vectors

slide-118
SLIDE 118

Sound Classification

Minimum distance or K-Nearest Neighbour

Test frame 256 point FFT reduced to 3D with LDA
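A hypothetical sketch of this chain: 256-point FFT of a test frame, projection to 3-D with a pre-trained LDA matrix, then minimum-distance (nearest-centroid) classification. `lda_proj` and `centroids` stand in for the slide's pre-computed LDA training vectors:

```python
import numpy as np

def min_distance_classify(frame, centroids, lda_proj):
    """Classify one audio frame: 256-point FFT magnitude spectrum,
    projected to a low-dimensional space by the (pre-trained) LDA
    matrix `lda_proj`, then assigned to the nearest class centroid."""
    spectrum = np.abs(np.fft.rfft(frame, n=256))   # 129 magnitude bins
    feat = spectrum @ lda_proj                     # e.g. 129x3 -> 3-D
    labels = list(centroids)
    dists = [np.linalg.norm(feat - centroids[c]) for c in labels]
    return labels[int(np.argmin(dists))]
```

Replacing the nearest-centroid step with k-NN over all training vectors gives the slide's alternative classifier.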

slide-119
SLIDE 119

Frame by Frame Evaluation

Audio Ch1. Audio Ch2.

  • ground truth
slide-120
SLIDE 120

Frame by Frame Evaluation

FFT LDA Audio Ch1. Audio Ch2.

  • Min. Dist.
  • ground truth
  • LDA
slide-121
SLIDE 121

Sawing

Frame by Frame Evaluation

IA Prediction Sequence Audio Ch2. FFT LDA Audio Ch1.

  • Min. Dist.

Audio Ch2.

  • ground truth
  • LDA
  • LDA+IA
slide-122
SLIDE 122
  • ground truth
  • LDA+IA

Continuous Evaluation

  • ground truth
  • LDA+IA
  • jm(LDA+IA)

IA Prediction Sequence Audio Ch2. FFT LDA Audio Ch1.

  • Min. Dist.

Audio Ch2. jm smoothing

slide-123
SLIDE 123

Acceleration Modeling

HMM models, ML classification; 3×3-axis raw data (sampled at 100 Hz); single Gaussian, mostly feed-forward

slide-124
SLIDE 124

Results

  • Sound: 109 Events

– 97 correct – 94 insertions – 2 deletions – 10 substitutions

  • Motion: of 109 Events

– 102 correct – 23 insertions – 7 substitutions

Sound classif. Motion classif. Sound Analysis Segment info. Sound data Motion data

slide-125
SLIDE 125

Results

  • Sound: 109 Events

– 97 correct – 94 insertions – 2 deletions – 10 substitutions

  • Motion: of 109 Events

– 102 correct – 23 insertions – 7 substitutions

Sound classif. Motion classif. Sound Analysis Segment info. Sound data Motion data Final predictions

slide-126
SLIDE 126

Sensor Fusion

  • Sound: 109 Events

– 97 correct – 94 insertions – 2 deletions – 10 substitutions

  • Motion: of 109 Events

– 102 correct – 23 insertions – 7 substitutions

  • Combined 109 Events

– 92 correct – 0 insertions – 17 deletions – 0 substitutions

Sound classif. Motion classif. Sound Analysis Segment info. Sound data Motion data Final predictions

slide-127
SLIDE 127

Classifier Fusion

  • Methods

– Comparison – Highest Rank – Borda Count – Logistic Regression

  • Aims

– improve person independent performance – deletion/insertion tradeoffs

Sound classif. Motion classif. Sound Analysis Segment info. Sound data Motion data Final predictions Fusion

NULL

slide-128
SLIDE 128

Classifier Fusion

Sound classif. Motion classif. Sound Analysis Segment info. Sound data Motion data Final predictions Fusion

NULL

  • Methods

– Comparison – Highest Rank – Borda Count – Logistic Regression

  • Aims

– improve person independent performance – deletion/insertion tradeoffs

  • Result

– logistic regression 10% better for the person-independent case (400 events in 20 sequences, from 5 subjects)

Activity Recognition of Assembly Tasks Using Body-Worn Microphones and Accelerometers, J.A. Ward and P. Lukowicz and G. Troester and T. Starner, IEEE Trans. Pattern Analysis and Machine Intelligence, 28:10, October 2006

slide-129
SLIDE 129

Ultrasound

Location classif. Motion classif. Hands Location Segment info. Ultrasound Motion data Final predictions Fusion

NULL

Ultrasound

slide-130
SLIDE 130

Approach

drifts away after a few seconds; 2 Hz sampling rate and many errors

slide-131
SLIDE 131

Approach

LSQ estimate from US, "plausibility" filtered

  • ext. Kalman filter from US+IMU
  • ext. Kalman filter from US
slide-132
SLIDE 132

Approach

EM clustering on a multivariate Gaussian mixture; discretization into motion strings, creation of a matching cost graph, minimum thresholding

  • on location-spotted segments: distance-based spotting with temporal smoothing

slide-133
SLIDE 133

Approach

  • more detailed examination of the output of trajectory spotting with respect to location (with thresholds or SVM classification)

  • concurrency based plausibility analysis
slide-134
SLIDE 134

Results

location spotting, trajectory spotting, SVM-based post-processing, concurrency resolution

slide-135
SLIDE 135

Muscle Monitoring

Candidate class.. HMM classif. Segment info. Motion data Final predictions Similarity search Partitioning Candidate sel. HMM Features Hand muscles

Sensing Muscle Activities with Body-Worn Sensors, Amft, O., Junker, H., Lukowicz, P., Tröster, G., and Schuster, C., In BSN 2006: Proceedings of the International Workshop on Wearable and Implantable Body Sensor Networks., pp. 138-141, April 2006

slide-136
SLIDE 136

Muscle Monitoring

slide-137
SLIDE 137

Skoda Quality Control

slide-138
SLIDE 138

2nd Generation Activities

slide-139
SLIDE 139

Sensors: Location

slide-140
SLIDE 140

Sensors: Location

Ubisense tags

slide-141
SLIDE 141

Location Raw Data

slide-142
SLIDE 142

Smoothed and Filtered Result

slide-143
SLIDE 143

Sensors: Motion

slide-144
SLIDE 144

Sensors: Inertial

slide-145
SLIDE 145

Motion Signal

slide-146
SLIDE 146

Sensors: FSR for Muscle Activity

slide-147
SLIDE 147

FSR for Muscle Activity Monitoring

Sensing Muscle Activities with Body-Worn Sensors, Amft, O., Junker, H., Lukowicz, P., Tröster, G., and Schuster, C., In BSN 2006: Proceedings of the International Workshop on Wearable and Implantable Body Sensor Networks., pp. 138-141, April 2006

slide-148
SLIDE 148

Combined Signal Examples

slide-149
SLIDE 149
slide-150
SLIDE 150

Algorithmic Approach

  • Processing separation

– Activity-wise – Sensor- / feature-wise

  • High recall spotting stage
  • Incremental fusion of additional information

Spotter for class 1 Spotter for class N

slide-151
SLIDE 151

Results

1 open hood 2 close hood 3 open trunk 4 check trunk 5 close trunk 6 fuel lid 7 open left door 8 close left door 9 open right door 10 close right door 11 open two doors 12 close two doors 13 mirror 14 check trunk gaps 15 lock check left 16 lock check right 17 check hood gaps 18 open the spare wheel box 19 close the spare wheel box 20 writing

Spotter for class 1 … spotter for class N: motion spotting, FSR classification of spotted gestures; masking step FSR; masking step location; masking step FSR + location → final

  • 8 subjects,
  • 3680 checking
  • 560 minutes of data
slide-152
SLIDE 152

Overview

  • Part 1: Introduction of key concepts
  • Part 2: Basic technologies

– Sensing – Reasoning

  • Part 3:

– phenomenology of complex socio-technical systems (collective sensing and communication phenomena) – applications and outlook

slide-153
SLIDE 153

Application Types

– exploiting people's mobility to sense parameters over large areas
– sensing/recognizing collective phenomena
– collaborative sensing and reasoning to improve accuracy

slide-154
SLIDE 154

Application Types

– exploiting people's mobility to sense parameters over large areas
– sensing/recognizing collective phenomena
– collaborative sensing and reasoning to improve accuracy
– influencing behaviour in the large

slide-155
SLIDE 155

Application Types

– exploiting people's mobility to sense parameters over large areas
– sensing/recognizing collective phenomena
– collaborative sensing and reasoning to improve accuracy
– influencing behaviour in the large

slide-156
SLIDE 156

Large Scale Sensing

exploiting people's mobility to sense parameters over large areas

slide-157
SLIDE 157

Example

TomTom realtime traffic combining

  • Speed/location data from TomTom systems
  • Phone GSM cell data from
slide-158
SLIDE 158

Crowd Density Classification

from number of discoverable devices to crowd density

slide-159
SLIDE 159

Discoverable Bluetooth Devices

Weppner, Lukowicz, PhoneSense 11

slide-160
SLIDE 160


Experiment Area - Munich Oktoberfest


slide-161
SLIDE 161

3 days at the Oktoberfest 2010

slide-162
SLIDE 162


Methods I

  • Individual device based estimation

– Number of Bluetooth devices – Mean signal strength – Variance of the signal strengths
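The per-device features of Methods I are straightforward to compute from a single inquiry scan; the `{address: RSSI}` input format is an assumption for illustration:

```python
import statistics

def crowd_features(scan):
    """Crowd-density features from one Bluetooth inquiry scan, given as
    {device_address: rssi_dbm}: device count, mean and variance of the
    observed signal strengths."""
    rssi = list(scan.values())
    return {
        "n_devices": len(rssi),
        "mean_rssi": statistics.fmean(rssi),
        "var_rssi": statistics.pvariance(rssi) if len(rssi) > 1 else 0.0,
    }
```

Methods II then aggregates these per-device values across collaborating phones (averages and variances over devices).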

slide-163
SLIDE 163


Methods II

  • Collaborative device-based estimation

– Average number of devices – Variance of the number of devices – Variance of all signal strengths

slide-164
SLIDE 164

Crowd Density Classification

[Figure: crowd density classification accuracies for settings a)–i) and density levels 0%, 30%, 60%, 90%; per-setting accuracies range from 29% to 82%]

multiple phones, collaborative

slide-165
SLIDE 165

Application Types

– exploiting people's mobility to sense parameters over large areas
– collaborative sensing and reasoning to improve accuracy

slide-166
SLIDE 166

Inertial tracking

true path estimated

s = ∫ v(t) dt = ∬ a(t) dt dt
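The double integration, done numerically with Euler steps, also shows why pure inertial tracking drifts away from the true path: any acceleration bias is integrated twice.

```python
def dead_reckon(accel, dt):
    """Double-integrate 1-D acceleration samples (m/s^2) into positions
    (m) with Euler steps: v += a*dt, s += v*dt. A constant bias in `a`
    grows quadratically in `s`, which is the drift shown on the slide."""
    v, s, path = 0.0, 0.0, []
    for a in accel:
        v += a * dt   # first integration: velocity
        s += v * dt   # second integration: position
        path.append(s)
    return path
```

A bias of even 0.01 m/s² displaces the estimate by roughly 18 m after one minute, which is why the following slides anchor the estimate with collaborative corrections.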

slide-167
SLIDE 167

Collaborative Location

slide-168
SLIDE 168

Collaborative Location

slide-169
SLIDE 169

Collaborative Location

slide-170
SLIDE 170

Collaborative Location

slide-171
SLIDE 171

Experimental Results

[Figure: mean localisation error [m] over time [min], comparing the growing PDR error with the collaborative localisation error]

Based on 60 km of traces from 10 people at a 3-day festival in Malta

Kloch, Lukowicz, ISWC 11

slide-172
SLIDE 172

Application Types

– exploiting people's mobility to sense parameters over large areas
– sensing/recognizing collective phenomena
– collaborative sensing and reasoning to improve accuracy

slide-173
SLIDE 173

Simple Idea

= consumer confidence

slide-174
SLIDE 174

Allianz Arena Experiments

3 second-league and 1 first-league game at the Allianz Arena, recorded with between 8 and 12 participants

slide-175
SLIDE 175

Approval Disapproval Cheering

Allianz Arena Experiments

  • Classification of relevant audio segments works with an accuracy of almost 100%
  • Automatically selecting relevant segments from the continuous audio stream is still work in progress (cooperation between many phones!)

slide-176
SLIDE 176

Application Types

– exploiting people's mobility to sense parameters over large areas
– sensing/recognizing collective phenomena
– collaborative sensing and reasoning to improve accuracy
– influencing behaviour in the large

slide-177
SLIDE 177

As Coupling Gets Stronger, System Behavior Can Change Completely: Traffic Breakdowns

Thanks to Yuki Sugiyama

slide-178
SLIDE 178

Socially Interactive Systems Research

scenario boundary conditions
sensing, information processing and spread
impact on human decision making and social dynamics
impact on physical dynamics (evacuation or traffic)

slide-179
SLIDE 179

System Simulations

scenario boundary conditions
sensing, information processing and spread
impact on human decision making and social dynamics
impact on physical dynamics (evacuation or traffic)

slide-180
SLIDE 180

Simulation Framework

[Diagram: Helbing movement model and AmI guidance system; models for trust and decision; PDR and collaborative localisation]

slide-181
SLIDE 181

Helbing Motion Simulation

slide-182
SLIDE 182

Helbing +Collaborative PDR

slide-183
SLIDE 183

Helbing +Collaborative PDR+Trust

slide-184
SLIDE 184

[LPF12] P. Lukowicz, A. Pentland, and A. Ferscha. From context awareness to socially aware computing. Pervasive Computing, IEEE, 11(1):32–41, 2012.
slide-185
SLIDE 185

Literature

[AL09] O. Amft and P. Lukowicz. From backpacks to smartphones: Past, present, and future of wearable computers. Pervasive Computing, IEEE, 8(3):8–13, 2009.

[BLA08] D. Bannach, P. Lukowicz, and O. Amft. Rapid prototyping of activity recognition applications. Pervasive Computing, IEEE, 7(2):22–31, 2008.

[CAL10] J. Cheng, O. Amft, and P. Lukowicz. Active capacitive sensing: Exploring a new wearable sensing modality for activity recognition. Pervasive Computing, pages 319–336, 2010.

[GLBR10] H. Gellersen, P. Lukowicz, M. Beigl, and T. Riedel. Cooperative relative positioning. Pervasive Computing, IEEE, (99):1–1, 2010.

[GBLH11] A. Grunerbl, G. Bahle, P. Lukowicz, and F. Hanser. Using indoor location to assess the state of dementia patients: Results and experience report from a long term, real world study. In Intelligent Environments (IE), 2011 7th International Conference on, pages 32–39. IEEE, 2011.

[KKL+10] K. Kloch, J. Kantelhardt, P. Lukowicz, P. Wüchner, and H. de Meer. Ad-hoc information spread between mobile devices: A case study in analytical modeling of controlled self-organization in IT systems. Architecture of Computing Systems-ARCS 2010, pages 101–112, 2010.

slide-186
SLIDE 186

Literature

[KPLF11] K. Kloch, G. Pirkl, P. Lukowicz, and C. Fischer. Emergent behaviour in collaborative indoor localisation: an example of self-organisation in ubiquitous sensing systems. Architecture of Computing Systems-ARCS 2011, pages 207–218, 2011.

[KWK+09] K. Kunze, F. Wagner, E. Kartal, E. Morales Kluge, and P. Lukowicz. Does context matter? A quantitative evaluation in a real world maintenance scenario. Pervasive Computing, pages 372–389, 2009.

[LTGLH07] P. Lukowicz, A. Timm-Giel, M. Lawo, and O. Herzog. Wearit@work: Toward real-world industrial wearable computing. Pervasive Computing, IEEE, 6(4):8–13, 2007.

[LCG11] P. Lukowicz, T. Choudhury, and H. Gellersen. Beyond context awareness. Pervasive Computing, IEEE, 10(4):15–17, 2011.

[LNN+12] P. Lukowicz, S. Nanda, V. Narayanan, H. Albelson, D.L. McGuinness, and M.I. Jordan. Qualcomm context-awareness symposium sets research agenda for context-aware smartphones. Pervasive Computing, IEEE, 11(1):76–79, 2012.

[WLG11] J.A. Ward, P. Lukowicz, and H.W. Gellersen. Performance metrics for activity recognition. ACM Transactions on Intelligent Systems and Technology (TIST), 2(1):6, 2011.

[WLTS06] J.A. Ward, P. Lukowicz, G. Troster, and T.E. Starner. Activity recognition of assembly tasks using body-worn microphones and accelerometers. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 28(10):1553–1567, 2006.

slide-187
SLIDE 187

Some Research Groups

  • MIT Media Lab

– http://hd.media.mit.edu/

  • Dartmouth College
  • K-Lab Pisa

– http://www-kdd.isti.cnr.it/