From Context Awareness to Socially Aware and Interactive Systems - - PowerPoint PPT Presentation
Paul Lukowicz, DFKI / University of Kaiserslautern, Germany
Overview
- Part 1: Introduction of key concepts
- Part 2: Basic technologies
– Sensing – Reasoning
- Part 3:
– phenomenology of complex socio-technical systems – applications and outlook
Overview
- Standard Sensors
– Motion, Sound, Location, FSR, RFID, Bluetooth
- Exotic Sensors
Acceleration Sensor
- Acceleration sensors provide two types of information
Phone Sensors
- Microphone, GPS, Compass, Bluetooth, Wireless LAN, Accelerometer, Gyroscope, Camera
- Acceleration sensors provide two types of information
– Angle towards the gravity vector (tilt)
– Change of motion speed
- These components cannot be easily separated!
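The tilt component can be sketched in a few lines: if a sample is dominated by gravity (i.e. quasi-static), the angle to the gravity vector follows directly from the axis values. This is an illustrative sketch, not code from the talk:

```python
import math

def tilt_deg(ax, ay, az):
    """Angle between the device z-axis and the gravity vector, in degrees.

    Assumes (ax, ay, az) is a quasi-static (e.g. low-pass filtered)
    acceleration sample in g, so it is dominated by gravity.
    """
    return math.degrees(math.atan2(math.hypot(ax, ay), az))

# Device lying flat: z-axis aligned with gravity -> 0 degrees tilt
print(tilt_deg(0.0, 0.0, 1.0))   # 0.0
# Device on its side: 90 degrees tilt
print(tilt_deg(1.0, 0.0, 0.0))   # 90.0
```

During motion, gravity and motion acceleration mix in the same samples, which is exactly why the two components cannot easily be separated.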
Greeting Example (Begrüssung)
[Figure: acceleration signal over time, ranging between -2g and +2g]
Filtering and Acceleration Components
- Raw signal split into a low-pass component (gravity/tilt) and a high-pass component (motion), fcut = 2Hz
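A minimal sketch of this split, assuming a first-order IIR filter with the 2 Hz cut-off from the slide (filter order and sampling rate are illustrative assumptions):

```python
import math

def split_gravity_motion(samples, fs=100.0, fcut=2.0):
    """Split a 1-axis acceleration stream into a slow (gravity/tilt)
    component and a fast (motion) component with a first-order IIR
    low-pass; the high-pass part is the residual."""
    dt = 1.0 / fs
    rc = 1.0 / (2.0 * math.pi * fcut)
    alpha = dt / (rc + dt)
    low, lows, highs = samples[0], [], []
    for x in samples:
        low = low + alpha * (x - low)   # low-pass: tracks gravity/tilt
        lows.append(low)
        highs.append(x - low)           # high-pass: motion component
    return lows, highs

# A constant 1 g signal has no motion component
lows, highs = split_gravity_motion([1.0] * 50)
```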
Walking Up and Down Stairs
Acceleration on the upper leg (down vs. up)
IMUs
- XSENS Inertial Measurement Unit (IMU)
– 3-axis accelerometer – 3-axis magnetometer – 3-axis gyroscope
Derives absolute orientation of the device with respect to
- gravity direction
- north
Application of IMUs
Magnetic Field and Light
Multisensor Signal
detect sitting down
Sound
FFT and LDA with kitchenette sounds
Water Pipe Sounds
Fogarty, Au, Hudson, UIST 2006
Audio: Conversation
- Intensity difference reveals whether the owner is
speaking or listening
– human voice distinguishable from other noises
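A crude sketch of the idea: with a body-worn microphone, the owner's own voice arrives much louder than anyone else's, so a simple frame-energy threshold already separates speaking from listening. Threshold and frame values are arbitrary assumptions, not from the talk:

```python
import math

def frame_rms(frame):
    """Root-mean-square intensity of one audio frame."""
    return math.sqrt(sum(x * x for x in frame) / len(frame))

def owner_speaking(frame, threshold=0.2):
    """Speaking/listening decision by intensity alone; the threshold
    value here is an arbitrary illustrative assumption."""
    return frame_rms(frame) > threshold

print(owner_speaking([0.5, -0.6, 0.4, -0.5]))     # loud frame -> True
print(owner_speaking([0.05, -0.04, 0.03, 0.02]))  # quiet frame -> False
```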
Dietary Monitoring
Initial success in detecting and classifying chewing sounds
– How many bites ? – What type of food was eaten?
Oliver Amft, Mathias Stäger, Paul Lukowicz, and Gerhard Tröster Analysis of Chewing Sounds for Dietary Monitoring,
- Proc. UBICOMP 2005, Tokyo
microphone gets sounds through bone conduction
Microphone position
1 2 3 4 5 6
Bite Detection
4 bites of apple
FSR for Muscle Activity Monitoring
- thickness <0.5mm
- sampling rate: >100Hz
- power loss <1mW
typical force curve
FSRs in the Shoe Soles
normal, up, down
Muscle Activity
- Movement is produced by the interplay of several muscles
- Even small differences in the movement patterns are clearly
visible in the muscle activity
Long vs. Short Strides
short long
Up vs. Downstairs
up down
Squats Technique
correct, asymmetric, cheat
RFID
RFID Example: Intel
Patterson, Fox, Kautz, Philipose, ISWC 2005
Bluetooth: Presence of People
BT scans can reveal identities of people who interact with each other
BT
Location
- Types
– absolute coordinates – semantic (e.g. which room) – proximity
- Parameters
– precision (30cm to several meters, up to one room) – reliability (how often is it right?) – installation – price – API
- outdoor location trivial (GPS),
- indoor location unsolved
Beacon Based Principles
Distance Estimation Methods
- 1. Time of flight
– for indoor distances the signal speed should be low – problems with multi-path
- 2. Signal strength
– problems with non distance dependent attenuation
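The signal-strength method is usually based on a log-distance path-loss model; this sketch shows why non-distance-dependent attenuation hurts: the reference power and path-loss exponent are environment-dependent assumptions, and any extra attenuation is misread as distance.

```python
def distance_from_rssi(rssi_dbm, p0_dbm=-40.0, d0_m=1.0, n=2.0):
    """Log-distance path-loss model: estimate distance from received
    signal strength. p0_dbm (RSSI at reference distance d0_m) and the
    path-loss exponent n are illustrative, environment-dependent
    assumptions; walls and bodies shift the estimate badly."""
    return d0_m * 10.0 ** ((p0_dbm - rssi_dbm) / (10.0 * n))

print(distance_from_rssi(-40.0))  # at the reference power -> 1.0 m
print(distance_from_rssi(-60.0))  # 20 dB weaker, n=2 -> 10.0 m
```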
Beacon Technical Issues
- Occlusions/Absorption
– E.g. metal absorbs radio signals
- Reflections
– signals bounce around providing false information
- Noise
- User Acceptance
– electro-smog “hysteria”
Beacon Technologies
- Ultrasound: Time Of Flight
– used with RF for communication and synchronization
- RF: Time Of Flight, Signal Strength
– WiFi (RADAR, Ekahau) – Bluetooth, Zigbee – Active RFID – Ultra-wideband (UWB, e.g. Ubisense)
Wifi Signal Strength
WLAN Fingerprinting
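WLAN fingerprinting sidesteps the path-loss model: training scans are recorded per location and a live scan is matched against them. A minimal nearest-fingerprint sketch (location names, access-point IDs and the floor value are illustrative assumptions):

```python
def nearest_fingerprint(scan, fingerprints):
    """Match a live RSSI scan {bssid: dBm} against a database of
    per-location training scans and return the location whose
    fingerprint is closest in signal space. Access points missing
    from a scan are assumed at a floor value."""
    FLOOR = -100.0
    def dist(a, b):
        keys = set(a) | set(b)
        return sum((a.get(k, FLOOR) - b.get(k, FLOOR)) ** 2 for k in keys)
    return min(fingerprints, key=lambda loc: dist(scan, fingerprints[loc]))

db = {  # hypothetical training fingerprints
    "office": {"ap1": -40, "ap2": -70},
    "corridor": {"ap1": -65, "ap2": -50},
}
print(nearest_fingerprint({"ap1": -42, "ap2": -72}, db))  # office
```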
UWB Location
Example of UWB
- Ubisense
– Ubisense delivers a precise, real-time location system (RTLS) utilizing ultra-wideband (UWB) technology that locates assets and people within 30cm / 12" in 3D.
15'000 Euro for a research set!
Performance UWB
Floor Sensors
- Capacitive tracking of objects on the floor
– www.future-shape.com
Overview
- 1. Standard Sensors
– Motion, Sound, Location, FSR, RFID, Bluetooth
- Exotic Sensors
Relative position information?
Many activities are determined by the relative positions of two body parts. ("Hand is near chest.")
Relative Positioning
- Acceleration, gyroscope and earth magnetic field sensors (example: Xsens)
– relative position information only indirectly available (Euler angles + knowledge about sensor positions)
- Ultrasound (example: Relate bricks)
– obstacles shield measurements
- RF based (example: Ubisense)
– low accuracy and infrastructure needed
Physical Principle
Transmitter generates a magnetic field at a certain resonant frequency f Receiver is calibrated to resonant frequency f
Magnetic field is oscillating (coils 1-5)
Drinking (Magnetic)
Poor Man's X-Ray: Capacitors
Physical background
- Human body can be considered as part of a
capacitor
- changes of distance or movement of internal organs
result in capacitance changes
Sensing Setup
attached to fit tightly but remain comfortable
Signal Examples
Eye Tracking
Andreas Bulling, ETH Zürich
Eye Activity when Reading
- Bulling, ETH Zürich
Bread cutter: running, cutting (bread vs. salami)
Mixer: mixing level, consistency of fluid
Water boiler: amount of boiled water
Current Sensing
iSensor:
- streams raw ADC values of the current power
consumption
- streams recognized device activity events
Kitchen Scenario
Why we chose a kitchen scenario:
- Lot of kitchen devices provide different modes
- Kitchen devices can be used in different ways
- In general: Preparing food is still quite hard to
recognize
Analyzed kitchen devices:
Kitchen Scenario – Signal Analysis Device: Mixer – Raw ADC values
Kitchen Scenario – Signal Analysis Device: Bread cutter – Raw ADC values
Kitchen Scenario – Signal Analysis Device: Water boiler (cold water) – Raw ADC values
Kitchen Scenario – Signal Analysis Device: Fridge – Raw ADC values
Kitchen Scenario – Signal Analysis Device: Egg boiler, toaster, coffee machine
3 or 5 eggs boiled; soft, medium or hard boiled; small or big coffee; bright/brown or dark toasted
Kitchen Scenario – Data Processing
Objective: avoid data streaming!!!
- > send only events (e.g. mixer mode: level 2)
Jennic Zigbee microcontroller for data processing
- > just simple feature calculation methods and recognition
algorithms
Features: Min, Max, Sum, Variance, Average (take the max over the last 600 ADC values and calculate the features using the last 21 max values); Time: duration of a device activity/mode
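The feature step above can be sketched as follows, using the block sizes from the slide (600 ADC values per block, last 21 maxima); everything else is an illustrative assumption:

```python
def adc_features(adc, block=600, keep=21):
    """Take the maximum over consecutive blocks of raw ADC samples,
    then compute min/max/sum/variance/average over the most recent
    `keep` maxima, as a microcontroller-friendly feature set."""
    maxima = [max(adc[i:i + block]) for i in range(0, len(adc) - block + 1, block)]
    window = maxima[-keep:]
    n = len(window)
    avg = sum(window) / n
    var = sum((x - avg) ** 2 for x in window) / n
    return {"min": min(window), "max": max(window),
            "sum": sum(window), "variance": var, "average": avg}

feats = adc_features(list(range(600 * 3)))  # 3 blocks of dummy samples
print(feats["max"])  # 1799
```

Such features are cheap enough for the Zigbee node, which is the point of avoiding raw-data streaming.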
Kitchen Scenario – Data Processing
Objective: avoid data streaming!!!
- > send only events (e.g. mixer mode: level 2)
Jennic Zigbee microcontroller for data processing
- > just simple feature calculation methods and recognition
algorithms
Thresholds could be sufficient!!!
Kitchen Scenario – Data Processing
Some examples:
Kitchen Scenario – Data Processing
Evaluation
Single device usage: Fridge – door open events: 35 hours monitored, 91% recognition rate, 9% missed
Evaluation
Water boiler
- boil 0.75l, 1.0l, 1.25l, 1.5l and 1.7l of cold water
deviation: between 0.006l and 0.076l
Evaluation
Bread cutter
- cut slices of bread and salami, 6 different people,
28 slices of bread, 29 slices of salami
Classification: 93.10% / 96.42%
Cutting activities were recognized two times
Evaluation
Mixer - 7 users, mix both a liquid and a creamy fluid: chocolate milk (250ml milk, 2 spoons of chocolate powder) + quark-yoghurt dish (250g quark and 1 yoghurt)
Result: 100% of chocolate milks = fluid liquids 100% of quark-yoghurt dishes = creamy liquids 57% of quark-yoghurt dishes = change from medium consistency to a creamy one
Evaluation
6 persons performed:
Evaluation
Several devices at the same time:
Overview
- Part 1: Introduction of key concepts
- Part 2: Basic technologies
– Sensing – Reasoning
- Part 3:
– phenomenology of complex socio-technical systems – applications and outlook
Overview
- 1. General Background
- 2. Examples
What is Context Recognition ?
Embedded Controllers: feedback control loop Artificial Intelligence: imitating human cognition
- often video based
- includes interpretation
Context Recognition: mapping signals from simple sensors onto a set of predefined, environment related states
Context Aware Systems 2000
- Gellersen, Schmidt, Beigl 1999, from the SmartIts project
- also Dey, Abowd, Pentland, Starner, Schiele, ... around 2000
acceleration, sound, light... → modes of locomotion
....and 2012
- Microphone, GPS, Compass, Bluetooth, Wireless LAN, Accelerometer, Gyroscope, Camera
Context Recognition
IEEE PAMI 06, PMC 07, Pattern Rec 08, ACM TIST 11, Springer PAA 11 ISWC, Pervasive,..
Basic Principle
gyro acc
Basic Principle
class membership inferred from coordinates only
gyro acc ???????
Separation Plane
gyro acc
recognition given through separation plane
complex separation plane
gyro acc
– often complex and impossible to
represent in an analytic way
Recognition Issues
- 1. representation of the
separation plane – memory requirements !
- 2. locating a point with respect
to the separation plane
- 3. Identifying the separation
plane – Generation from a sample – cost minimization
gyro acc
Learning Separation Planes
gyro acc
Learning Separation Planes
separation plane is not perfect
- ambivalent sensor data
- size of the sample
- sensor noise
gyro acc
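Learning a linear separation plane from a labelled sample can be sketched with a perceptron in a 2D (acc, gyro) feature space. The data, labels and learning rate are illustrative assumptions; as noted above, real planes are rarely this clean and noisy samples keep them imperfect:

```python
def train_perceptron(samples, epochs=50, lr=0.1):
    """Learn a linear separation plane w.x + b = 0 from labelled
    samples ((x1, x2), y) with y in {+1, -1}."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in samples:
            if y * (w[0] * x1 + w[1] * x2 + b) <= 0:  # misclassified
                w[0] += lr * y * x1
                w[1] += lr * y * x2
                b += lr * y
    return w, b

# hypothetical (acc, gyro) feature points for two classes
data = [((1.0, 1.0), 1), ((2.0, 1.5), 1), ((-1.0, -1.0), -1), ((-2.0, -0.5), -1)]
w, b = train_perceptron(data)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else -1
print(predict(1.5, 1.0))    # 1
print(predict(-1.5, -1.0))  # -1
```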
Analyzing Recordings
[Figure: torso and wrist acceleration [g] over time [s], annotated: standing up, sitting down, sitting, standing; arm on table, arm resting on table, arm moving, handshake]
Time Series
z'_1 z'_2 z'_3 ... z'_N (Process P')
z''_1 z''_2 z''_3 ... z''_N (Process P'')
t
Time Series
z*_1 z*_2 z*_3 ... z*_N
Process P' or Process P''?
t
Context Rec. Challenges
- 1. Ambiguous sensors
– e.g. phone in a pocket tracking patient care
- 2. Lack of representative training data
– huge inter- and intra-person variations – training with real users often not practicable
- 3. Inconsistent state space
– e.g. walking, having a meal and taking a pill – different granularities, temporal scales, and levels of abstraction
- 4. Dynamically changing sensor configurations
– phone moves around, different phones, new sensors
Context Problem Types
state action situation parameters
Context Problem Types
state action situation parameters
- physical characterization
- ‘classical’ signal processing
?
Context Problem Types
state action situation parameters
- separable in a time-invariant
feature space
- sliding window algorithms
size window shift
Signal stream
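The sliding-window front end above can be sketched in a few lines; window size and shift values are illustrative:

```python
def sliding_windows(stream, size, shift):
    """Segment a signal stream into windows of `size` samples,
    advanced by `shift` samples (overlapping when shift < size) -
    the standard front end for action recognition in a
    time-invariant feature space."""
    for start in range(0, len(stream) - size + 1, shift):
        yield stream[start:start + size]

windows = list(sliding_windows(list(range(10)), size=4, shift=2))
print(len(windows))    # 4
print(windows[0])      # [0, 1, 2, 3]
print(windows[-1])     # [6, 7, 8, 9]
```

Features (mean, variance, FFT coefficients, ...) are then computed per window and fed to the classifier.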
Context Problem Types
state action situation parameters
- characterized by signal sequence
- time series analysis
- ‘spotting’ problem
Does this section contain a gesture ?
- or this one?
Context Problem Types
state action situation parameters
- specific, sequence/distribution of actions
- PCFG, Bayesian networks, ontologies…
Context Problem Types
state action situation parameters
Context Recognition Example
does this section contain a gesture ?
- or this one?
- no model of NULL class
- long NULL class segments
- variable length ‘true positives’
Confusion Matrix
Spotting Errors
Confusion Matrix
Precision-Recall Graphs
Spotting Errors
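For spotting, precision-recall graphs are built from the event counts: insertions act as false positives, deletions as false negatives. A minimal sketch (substitutions are ignored here for simplicity; the example numbers are reused from the sound results later in the deck):

```python
def precision_recall(tp, fp, fn):
    """precision = TP / (TP + FP), recall = TP / (TP + FN)."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# 97 correct events, 94 insertions, 2 deletions
p, r = precision_recall(tp=97, fp=94, fn=2)
print(round(p, 2), round(r, 2))  # 0.51 0.98
```

High recall with low precision is typical for a first spotting stage; later fusion stages trade insertions against deletions.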
Overview
- 1. General Background
- 2. Examples
Example: Assembly Monitoring
Example: Assembly Monitoring
1,3 microphones 2,4,5 accelerometers 6 computer
- P. Lukowicz, J. Ward, H. Junker, M. Stäger, G. Tröster, A. Atrash, T. Starner
Recognizing Workshop Activity Using Body Worn Microphones and Accelerometers, Pervasive Computing, pages 18-22, Vienna, Austria, 18.-23. April 2004.
Drilling
∘Well defined, steady motion
- slowly turn handle to lower, then return
- constant machine noise
Sawing
∘Well defined, repetitive motion
- back and forth
- loud correlated sound
Approach
Sound classif. Motion classif. Sound Analysis Segment info. Sound data Motion data Final predictions IA Prediction Sequence Audio Ch2. FFT LDA Audio Ch1.
- Min. Dist.
Audio Ch2. jm smoothing
Signal Segmentation
IA Audio Ch1. Audio Ch2.
- I_chest / I_wrist ≈ 1 ⇒ d_chest ≈ d_wrist
- I_chest / I_wrist >> 1 ⇒ d_chest << d_wrist
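The intensity analysis (IA) can be sketched as a simple energy comparison between time-aligned frames of the two microphones; the ratio threshold is an arbitrary illustrative assumption:

```python
def closer_microphone(chest_frame, wrist_frame, ratio=4.0):
    """Compare frame energies of the chest and wrist microphones.
    A large intensity ratio means the sound source is much closer
    to one of the two microphones."""
    def energy(frame):
        return sum(x * x for x in frame) / len(frame)
    ec, ew = energy(chest_frame), energy(wrist_frame)
    if ec > ratio * ew:
        return "chest"
    if ew > ratio * ec:
        return "wrist"
    return "similar"

print(closer_microphone([0.8, -0.9, 0.7], [0.1, -0.1, 0.1]))  # chest
print(closer_microphone([0.2, -0.2], [0.2, 0.2]))             # similar
```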
hand on the switch
Sound Classification
FFT LDA IA Audio Ch1. Audio Ch2.
- Min. Dist.
Pre-computed LDA training vectors
Sound Classification
Minimum distance
- or K-Nearest Neighbour
Test frame 256 point FFT reduced to 3D with LDA
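The last step, minimum-distance classification in the reduced space, can be sketched as follows. The 256-point FFT and the LDA projection to 3D are assumed to have happened already (they are not shown), and the class means are hypothetical:

```python
import math

def min_distance_class(vec, class_means):
    """Assign a test frame (already FFT'd and LDA-projected to 3D)
    to the class whose precomputed training mean is closest in
    Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(class_means, key=lambda c: dist(vec, class_means[c]))

means = {  # hypothetical 3D LDA means for three workshop sounds
    "drilling": (1.0, 0.0, 0.0),
    "sawing": (0.0, 1.0, 0.0),
    "hammering": (0.0, 0.0, 1.0),
}
print(min_distance_class((0.9, 0.1, 0.0), means))  # drilling
```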
Frame by Frame Evaluation
Audio Ch1. Audio Ch2.
- ground truth
Frame by Frame Evaluation
FFT LDA Audio Ch1. Audio Ch2.
- Min. Dist.
- ground truth
- LDA
Sawing
Frame by Frame Evaluation
IA Prediction Sequence Audio Ch2. FFT LDA Audio Ch1.
- Min. Dist.
Audio Ch2.
- ground truth
- LDA
- LDA+IA
- ground truth
- LDA+IA
Continuous Evaluation
- ground truth
- LDA+IA
- jm(LDA+IA)
IA Prediction Sequence Audio Ch2. FFT LDA Audio Ch1.
- Min. Dist.
Audio Ch2. jm smoothing
Acceleration Modeling
HMM models, ML classification, 3x3-axis raw data prediction (sampled at 100Hz), single Gaussian, mostly feed-forward
Results
- Sound: 109 Events
– 97 correct – 94 insertions – 2 deletions – 10 substitutions
- Motion: of 109 Events
– 102 correct – 23 insertions – 7 substitutions
Sound classif. Motion classif. Sound Analysis Segment info. Sound data Motion data
Results
- Sound: 109 Events
– 97 correct – 94 insertions – 2 deletions – 10 substitutions
- Motion: of 109 Events
– 102 correct – 23 insertions – 7 substitutions
Sound classif. Motion classif. Sound Analysis Segment info. Sound data Motion data Final predictions
Sensor Fusion
- Sound: 109 Events
– 97 correct – 94 insertions – 2 deletions – 10 substitutions
- Motion: of 109 Events
– 102 correct – 23 insertions – 7 substitutions
- Combined 109 Events
– 92 correct – 0 insertions – 17 deletions – 0 substitutions
Sound classif. Motion classif. Sound Analysis Segment info. Sound data Motion data Final predictions
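The effect in the combined result (insertions dropping to zero while deletions rise) is what a simple agreement-based fusion produces: accept an event only when both classifiers report it. This is an illustrative sketch of that idea, not the exact fusion used in the work:

```python
def fuse_by_agreement(sound_pred, motion_pred):
    """Accept a segment's event only when the sound and motion
    classifiers agree; disagreements become None (rejected).
    Removes one classifier's spurious insertions at the price of
    deleting events only one classifier caught."""
    return [s if s == m else None for s, m in zip(sound_pred, motion_pred)]

# hypothetical per-segment predictions
sound  = ["saw", "drill", "saw", None, "drill"]
motion = ["saw", "drill", None, "saw", "drill"]
print(fuse_by_agreement(sound, motion))
# ['saw', 'drill', None, None, 'drill']
```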
Classifier Fusion
- Methods
– Comparison – Highest Rank – Borda Count – Logistic Regression
- Aims
– improve person independent performance – deletion/insertion tradeoffs
Sound classif. Motion classif. Sound Analysis Segment info. Sound data Motion data Final predictions Fusion
NULL
Classifier Fusion
Sound classif. Motion classif. Sound Analysis Segment info. Sound data Motion data Final predictions Fusion
NULL
- Methods
– Comparison – Highest Rank – Borda Count – Logistic Regression
- Aims
– improve person independent performance – deletion/insertion tradeoffs
- Result
– logistic regression 10% better for the person independent case (400 events in 20 sequences, from 5 subjects)
Activity Recognition of Assembly Tasks Using Body-Worn Microphones and Accelerometers, J.A. Ward and P. Lukowicz and G. Troester and T. Starner, IEEE Trans. Pattern Analysis and Machine Intelligence, 28:10, October 2006
Ultrasound
Location classif. Motion classif. Hands Location Segment info. Ultrasound Motion data Final predictions Fusion
NULL
Ultrasound
Approach
drifts away after a few sec; 2Hz sampling rate and many errors
Approach
LSQ estimate from US, "plausibility" filtered
- ext. Kalman from US+IMU
- ext. Kalman from US
Approach
EM clustering
- on multivariate Gaussian mixtures
- discretization into motion strings, creation of a matching cost graph, minimum thresholding
- on location-spotted segments
- distance based spotting with temporal smoothing
Approach
- more detailed examination of the output of trajectory spotting with respect to location (with
thresholds or SVM classification)
- concurrency based plausibility analysis
Results
location spotting, trajectory spotting, SVM based post processing, concurrency resolution
Muscle Monitoring
Candidate class.. HMM classif. Segment info. Motion data Final predictions Similarity search Partitioning Candidate sel. HMM Features Hand muscles
Sensing Muscle Activities with Body-Worn Sensors, Amft, O., Junker, H., Lukowicz, P., Tröster, G., and Schuster, C., In BSN 2006: Proceedings of the International Workshop on Wearable and Implantable Body Sensor Networks., pp. 138-141, April 2006
Muscle Monitoring
Skoda Quality Control
2nd Generation Activities
Sensors: Location
Ubisense tags
Location Raw Data
Smoothed and Filtered Result
Sensors: Motion
Sensors: Inertial
Motion Signal
Sensors: FSR for Muscle Activity
FSR for Muscle Activity Monitoring
Combined Signal Examples
Algorithmic Approach
- Processing separation
– Activity-wise – Sensor- / feature-wise
- High recall spotting stage
- Incremental fusion of additional information
Spotter for class 1 Spotter for class N
Results
1 open hood, 2 close hood, 3 open trunk, 4 check trunk, 5 close trunk, 6 fuel lid, 7 open left door, 8 close left door, 9 open right door, 10 close right door, 11 open two doors, 12 close two doors, 13 mirror, 14 check trunk gaps, 15 lock check left, 16 lock check right, 17 check hood gaps, 18 open the spare wheel box, 19 close the spare wheel box, 20 writing
Spotter for class 1 Spotter for class N Motion spotting FSR classification of spotted gestures Masking step FSR Masking step location Masking step FSR + location final
- 8 subjects,
- 3680 checking
- 560 minutes of data
Overview
- Part 1: Introduction of key concepts
- Part 2: Basic technologies
– Sensing – Reasoning
- Part 3:
– phenomenology of complex socio-technical systems (collective sensing and communication phenomena) – applications and outlook
Application Types
- exploiting people's mobility to sense parameters over large areas
- sensing/recognizing collective phenomena
- collaborative sensing and reasoning to improve accuracy
Application Types
- exploiting people's mobility to sense parameters over large areas
- sensing/recognizing collective phenomena
- collaborative sensing and reasoning to improve accuracy
- influencing behaviour in the large
Large Scale Sensing
exploiting people's mobility to sense parameters over large areas
Example
TomTom realtime traffic combining
- Speed/location data from TomTom systems
- Phone GSM cell data from
Crowd Density Classification
from number of discoverable devices to crowd density
Discoverable BlueTooth Devices
Weppner, Lukowicz, PhoneSense 11
Experiment Area - Munich Oktoberfest
3 days at the Oktoberfest 2010
Methods I
- Individual device based estimation
– Number of Bluetooth devices – Mean signal strength – Variance of the signal strengths
Methods II
- Collaborative device based estimation
– Average number of devices – Variance of the number of devices – Variance of all signal strengths
Crowd Density Classification
[Figure: classification accuracies for settings a)-i) at crowd densities 0%, 30%, 60%, 90%; accuracies range from 29% to 82%]
multiple phones, collaborative
Application Types
- exploiting people's mobility to sense parameters over large areas
- collaborative sensing and reasoning to improve accuracy
Inertial tracking
true path estimated
s = ∫ v(t) dt
v = ∫ a(t) dt
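The double integration above explains the drift: any acceleration bias is integrated twice. A minimal Euler-integration sketch (step count and values are illustrative):

```python
def dead_reckon(acc, dt):
    """Inertial dead reckoning for one axis: velocity is the integral
    of acceleration, position the integral of velocity (v = int a dt,
    s = int v dt), via simple Euler integration. A constant bias in
    `acc` grows quadratically in `s`, so the estimated path drifts
    away from the true path."""
    v, s = 0.0, 0.0
    for a in acc:
        v += a * dt
        s += v * dt
    return v, s

# Constant 1 m/s^2 for 1 s in 1000 steps: v -> 1 m/s, s -> ~0.5 m
v, s = dead_reckon([1.0] * 1000, dt=0.001)
```

Collaborative localisation bounds this drift by correcting positions whenever two users meet.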
Collaborative Location
Collaborative Location
Collaborative Location
Collaborative Location
Experimental Results
[Figure: mean localisation error (m) vs. time (min), comparing the PDR error with the collaborative localisation error]
Based on 60km of traces from 10 people at a 3 day festival in Malta
Kloch, Lukowicz, ISWC 11
Application Types
- exploiting people's mobility to sense parameters over large areas
- sensing/recognizing collective phenomena
- collaborative sensing and reasoning to improve accuracy
Simple Idea
= consumer confidence
Allianz Arena Experiments
3 second-league and 1 first-league game at the Allianz Arena, recorded with between 8 and 12 participants
Approval Disapproval Cheering
Allianz Arena Experiments
- Classification of relevant audio segments works
with an accuracy of almost 100 %
- Automatically selecting relevant segments from
the continuous audio stream is still a work in progress (requires cooperation between many phones!)
Application Types
- exploiting people's mobility to sense parameters over large areas
- sensing/recognizing collective phenomena
- collaborative sensing and reasoning to improve accuracy
- influencing behaviour in the large
As Coupling Gets Stronger, System Behavior Can Change Completely: Traffic Breakdowns
Thanks to Yuki Sugiyama
Socially Interactive Systems Research
- scenario boundary conditions
- sensing, information processing and spread
- impact on human decision making and social dynamics
- impact on physical dynamics (evacuation or traffic)
System Simulations
- scenario boundary conditions
- sensing, information processing and spread
- impact on human decision making and social dynamics
- impact on physical dynamics (evacuation or traffic)
Simulation Framework
Movement Helbing AmI guidance system Models:
- Trust
- …
Decision
?
PDR Collaborative localisation
Helbing Motion Simulation
Helbing +Collaborative PDR
Helbing +Collaborative PDR+Trust
[LPF12] P. Lukowicz, S. Pentland, and A. Ferscha. From context awareness to socially aware computing. Pervasive Computing, IEEE, 11(1):32–41, 2012.
Literature
[AL09] O. Amft and P. Lukowicz. From backpacks to smartphones: Past, present, and future of wearable computers. Pervasive Computing, IEEE, 8(3):8–13, 2009.
[BLA08] D. Bannach, P. Lukowicz, and O. Amft. Rapid prototyping of activity recognition applications. Pervasive Computing, IEEE, 7(2):22–31, 2008.
[CAL10] J. Cheng, O. Amft, and P. Lukowicz. Active capacitive sensing: Exploring a new wearable sensing modality for activity recognition. Pervasive Computing, pages 319–336, 2010.
[GLBR10] H. Gellersen, P. Lukowicz, M. Beigl, and T. Riedel. Cooperative relative positioning. Pervasive Computing, IEEE, (99):1–1, 2010.
[GBLH11] A. Grunerbl, G. Bahle, P. Lukowicz, and F. Hanser. Using indoor location to assess the state of dementia patients: Results and experience report from a long term, real world study. In Intelligent Environments (IE), 2011 7th International Conference on, pages 32–39. IEEE, 2011.
[KKL+10] K. Kloch, J. Kantelhardt, P. Lukowicz, P. Wüchner, and H. de Meer. Ad-hoc information spread between mobile devices: A case study in analytical modeling of controlled self-organization in IT systems. Architecture of Computing Systems-ARCS 2010, pages 101–112, 2010.
Literature
[KPLF11] K. Kloch, G. Pirkl, P. Lukowicz, and C. Fischer. Emergent behaviour in collaborative indoor localisation: an example of self-organisation in ubiquitous sensing systems. Architecture of Computing Systems-ARCS 2011, pages 207–218, 2011.
[KWK+09] K. Kunze, F. Wagner, E. Kartal, E. Morales Kluge, and P. Lukowicz. Does context matter? A quantitative evaluation in a real world maintenance scenario. Pervasive Computing, pages 372–389, 2009.
[LTGLH07] P. Lukowicz, A. Timm-Giel, M. Lawo, and O. Herzog. WearIT@work: Toward real-world industrial wearable computing. Pervasive Computing, IEEE, 6(4):8–13, 2007.
[LCG11] P. Lukowicz, T. Choudhury, and H. Gellersen. Beyond context awareness. Pervasive Computing, IEEE, 10(4):15–17, 2011.
[LNN+12] P. Lukowicz, S. Nanda, V. Narayanan, H. Abelson, D.L. McGuinness, and M.I. Jordan. Qualcomm context-awareness symposium sets research agenda for context-aware smartphones. Pervasive Computing, IEEE, 11(1):76–79, 2012.
[WLG11] J.A. Ward, P. Lukowicz, and H.W. Gellersen. Performance metrics for activity recognition. ACM Transactions on Intelligent Systems and Technology (TIST), 2(1):6, 2011.
[WLTS06] J.A. Ward, P. Lukowicz, G. Troster, and T.E. Starner. Activity recognition of assembly tasks using body-worn microphones and accelerometers. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 28(10):1553–1567, 2006.
Some Research Groups
- MIT Media Lab
– http://hd.media.mit.edu/
- Dartmouth College
- K-Lab Pisa