6.808: Mobile and Sensor Computing, Lecture 10: The Pothole Patrol



SLIDE 1

6.808: Mobile and Sensor Computing

Lecture 10: The Pothole Patrol

Hari Balakrishnan hari@mit.edu

Slides from Jakob Eriksson

SLIDE 2

SLIDE 3
  • road decay unavoidable, hard to predict
  • current monitoring methods costly/ineffective


SLIDE 4

the Pothole Patrol

'Pothole detected at ....'

  • opportunistic accelerometer sensing (3-axis accelerometer, 380 Hz)
  • GPS localization
  • opportunistic data upload via open WiFi access points
  • P2 central server: aggregation and reporting

SLIDE 5

Acceleration vector

  • ax: on the road plane, perpendicular to the road direction
  • ay: along the road direction
  • az: perpendicular to the road plane

SLIDE 6

experimental platform

  • 7 Boston/Cambridge taxis
  • small computer in glove box
  • 380 Hz 3-axis accelerometer
  • 802.11a/b/g wireless interface
  • GPS receiver on roof
  • <time,location,heading,speed,ax,ay,az>
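Each logged sample bundles the GPS state with the three accelerometer axes. A minimal sketch of such a record, matching the tuple above (the class and field names are assumptions for illustration):

```python
from dataclasses import dataclass

@dataclass
class Sample:
    """One 380 Hz accelerometer sample tagged with GPS state.

    Mirrors the tuple <time, location, heading, speed, ax, ay, az>;
    names and types are assumptions, not the deployed schema.
    """
    time: float     # seconds into the trace
    lat: float      # GPS latitude (degrees)
    lon: float      # GPS longitude (degrees)
    heading: float  # degrees clockwise from north
    speed: float    # m/s, from GPS
    ax: float       # on the road plane, perpendicular to the road
    ay: float       # along the road direction
    az: float       # perpendicular to the road plane

# a hypothetical sample
s = Sample(time=371.2, lat=42.36, lon=-71.09, heading=90.0,
           speed=11.0, ax=0.02, ay=0.01, az=0.35)
```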


SLIDE 7


wide-area sensing

SLIDE 8

SLIDE 9

[Figure: CDF, fraction of road segments vs. number of repeat encounters (1 to 1000, log scale)]

  • 7 cars in 10 days
  • 2492 unique km, 9730 total km
  • every covered segment appears at least once

SLIDE 10

P2 architecture

  • vehicle clients: 3-axis accelerometer + GPS sensors → pothole detector → pothole records
  • central server: location interpolator → clustering → reported pothole locations

SLIDE 11

sensor placement

  • highly accurate
  • difficult mounting
  • extreme exposure

Try to stay inside the vehicle instead. Pros? Cons?

SLIDE 12

Attached to Windshield [acceleration trace]
  • very clean signal
  • ‘gold standard’
  • difficult to mount

Attached to Dashboard [acceleration trace]
  • good signal
  • easy to mount
  • ‘out of the way’

Attached to Embedded PC [acceleration trace]
  • very poor signal
  • no mounting necessary

SLIDE 13

challenge: “pothole” v. “not pothole”


How do I identify pothole vs others?

SLIDE 14

pothole v. not pothole

SLIDE 15

SLIDE 16

P2 detector: 256-sample windows of all event classes go IN; rejection filters peel off the non-pothole classes, and pothole detections come OUT.

  • speed filter: rejects low-speed events (door slams, curbs)
  • high-pass filter: rejects acceleration, braking, turns
  • z-peak: rejects minor anomalies
  • xz-ratio: rejects expansion joints, rail crossings, speed bumps
  • speed vs. z ratio: rejects smaller highway anomalies

Events are usually of much shorter duration than 256 samples

SLIDE 17

(same detector diagram as Slide 16)

Need to learn threshold parameters (will come back to it later)
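The cascade above can be sketched as a sequence of rejection filters over one window. This is a sketch only: the threshold values and the exact mapping of each filter to a rejected event class are assumptions for illustration, not the trained parameters.

```python
import numpy as np

def detect_pothole(ax, az, speed,
                   t_speed=5.0, t_z=2.0, t_xz=0.6, t_sz=0.2):
    """Run one 256-sample window through a P2-style filter cascade.

    ax, az: side-to-side and vertical acceleration for the window.
    All thresholds here are illustrative placeholders.
    """
    ax = np.asarray(ax, dtype=float)
    az = np.asarray(az, dtype=float)

    # high-pass (crude): subtract the window mean to drop
    # low-frequency components (gravity, acceleration/braking/turns)
    ax = ax - ax.mean()
    az = az - az.mean()

    # speed filter: rejects low-speed events such as door slams, curbs
    if speed < t_speed:
        return False

    # z-peak: rejects windows without a significant vertical spike
    z_peak = np.max(np.abs(az))
    if z_peak < t_z:
        return False

    # xz-ratio: a pothole hits one side of the car, producing sideways
    # energy; rejects speed bumps/joints that excite both wheels equally
    if np.max(np.abs(ax)) / z_peak < t_xz:
        return False

    # speed vs. z ratio: at high speed even small anomalies spike,
    # so require the z peak to scale with speed
    if z_peak < t_sz * speed:
        return False

    return True
```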

SLIDE 18

SLIDE 19

x-z ratio “high enough”


SLIDE 20

hand-labeled training data

SLIDE 21

training the detector

Type                          Count   Percentage
Smooth road (SM)                 64          23%
Potholes (PH)                    63          23%
Manholes (MH)                    59          21%
Railroad Crossing (RC)           18           6%
Crosswalk/Exp. Joint (CWEJ)      76          27%

  • manually label training samples
SLIDE 22

loosely-labeled training

  • needed to avoid over-training with unrepresentative manually curated data
  • manual labeling under-samples “smooth” roads

SLIDE 23

training the detector

  • pick an objective function
  • optimize over 3 threshold parameters:
  • z-peak
  • xz-ratio
  • speed vs. z-ratio

For each parameter setting t, compute the score s(t) = correct − incorrect² over the detections reported when using t.
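That objective can be optimized with an exhaustive grid search over the three thresholds, scoring each setting on labeled windows. A sketch, assuming the windows have already been reduced to (z-peak, xz-ratio, z-vs-speed, label) tuples and reading "correct/incorrect" as agreement/disagreement with the label; both are assumptions for illustration:

```python
import itertools

def train_thresholds(samples, z_grid, xz_grid, sz_grid):
    """Grid search for the three detector thresholds.

    `samples`: list of (z_peak, xz_ratio, z_over_speed, is_pothole).
    Maximizes s(t) = correct - incorrect**2, which penalizes mistakes
    quadratically, so false positives cost far more than extra
    detections gain.
    """
    best, best_score = None, float("-inf")
    for tz, txz, tsz in itertools.product(z_grid, xz_grid, sz_grid):
        correct = incorrect = 0
        for z, xz, zs, label in samples:
            detected = z >= tz and xz >= txz and zs >= tsz
            if detected == label:
                correct += 1
            else:
                incorrect += 1
        score = correct - incorrect ** 2
        if score > best_score:
            best, best_score = (tz, txz, tsz), score
    return best, best_score
```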

SLIDE 24

detector performance

After training on loosely labeled data: e.g., 7.3% of detected “potholes” are railroad crossings.

Note: the actual false positive rate is not 7.6%. Why? These percentages are measured among the segments reported as potholes by the algorithm, not among all road segments.

SLIDE 25

estimating false +ve rate

Road           # potholes   # windows   # detections    rate
Storrow Dr.    few                1865              3   0.16%
Memorial Dr.   few                1781              2   0.12%
Hwy I-93       few                2877              5   0.17%
Binney St      some               6887             25   0.63%
Beacham St     many               1643            231     14%

The detection rate (# detections / # windows) on roads known to have few potholes gives an upper bound on the false-positive rate.
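The arithmetic behind the bound is simple. A minimal sketch, assuming that on a known-smooth road essentially every detection is a false positive:

```python
def fp_upper_bound(num_windows, num_detections):
    """Per-window detection rate on a known-smooth road.

    Since (almost) none of the windows contain real potholes there,
    nearly every detection is a false positive, so this rate is an
    upper bound on the detector's false-positive rate.
    """
    return num_detections / num_windows

# figures from the table: Storrow Dr., 3 detections in 1865 windows
storrow = fp_upper_bound(1865, 3)   # ~0.16%
```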

SLIDE 26

clustering

SLIDE 27
  • 7 taxis over 10 days
  • 9730 total km of road covered
  • 2492 unique km of road covered
  • 1.4 million sample windows
  • 4131 severe detections in 2709 locations (after clustering)

experiments
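The clustering step that merges repeated detections of the same pothole into one location can be sketched as greedily grouping nearby detections. The 20 m radius, the greedy centroid scheme, and the minimum-support parameter are all assumptions for illustration, not the paper's algorithm:

```python
def cluster_detections(points, radius=20.0, min_count=1):
    """Greedy single-pass clustering of detection coordinates.

    `points` are (x, y) positions in meters (e.g. projected GPS fixes).
    Each detection joins the first cluster whose centroid lies within
    `radius`; clusters with fewer than `min_count` members are dropped.
    Parameters are illustrative placeholders.
    """
    clusters = []  # each cluster: [sum_x, sum_y, count]
    for x, y in points:
        for c in clusters:
            cx, cy = c[0] / c[2], c[1] / c[2]
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                c[0] += x; c[1] += y; c[2] += 1
                break
        else:
            clusters.append([x, y, 1])
    return [(c[0] / c[2], c[1] / c[2])
            for c in clusters if c[2] >= min_count]
```

Requiring `min_count > 1` would implement the "multiple independent encounters" idea: a location is only reported once several passes over the same spot agree.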

SLIDE 28

SLIDE 29

48 spot-checks:

Class                            Count
Potholes                            39
Sunk-in manholes                     3
Railways and expansion joints        4
Undetermined                         2

SLIDE 30

SLIDE 31

SLIDE 32
P2: the Pothole Patrol

  • automatic wide-area road quality monitoring
  • use of opportunistic mobility
  • mobile sensing w/ delay-tolerant communication
  • machine learning classifier with labeled and loosely-labeled data
  • data collection and curation is hard!
  • low-cost approach to help solve a costly problem