SLIDE 1

Vehicle Velocity Prediction Using Artificial Neural Networks and Effect of Real-World Signals on Prediction Window

by Tushar D. Gaikwad

Committee:

  • Dr. Zachary Asher, Chair

  • Dr. Richard Meyer
  • Dr. Alvis Fong

Western Michigan University

SLIDE 2

WESTERN MICHIGAN UNIVERSITY Energy Efficient & Autonomous Vehicles Laboratory

Introduction Methodology Results Conclusion and Future Work

Agenda

➢ Introduction/Background:

  • Intelligent Transportation System (ITS)
  • Autonomous vehicles
  • Challenges and strategies
  • Artificial Intelligence(AI)
  • Research gap
  • Literature review
  • Novel contribution

➢ Methodology:

  • Approach for velocity prediction
  • Drive cycle development
  • Analogy for neural networks

➢ Results:

  • Assessment methods
  • Effect of different signals
  • Effect of different neural networks
  • Effect on different prediction window
  • Forward prediction for every 10 sec

➢ Conclusion and future work

  • Summary
  • Conclusion
  • Future Work

SLIDE 3

Acknowledgement

➢ University Faculty:

  • Dr. Zachary Asher
  • Dr. Richard Meyer
  • Dr. Alvis Fong
  • Dr. Thomas Bradley
  • Dr. Jennifer Hudson
  • Dr. Ilya Kolmanovsky


➢ Graduate Students:

  • Farhang Motallebiaraghi
  • Nick Goberville
  • Nicholas Brown
  • Arron Rabinowitz
  • Amol Patil
  • Johan Fanas
  • Yogesh Jagdale
  • Parth Kadav
  • Marsad Zoardar
SLIDE 4

Introduction

  • The shift we are witnessing toward Intelligent Transportation Systems (ITS) will be the most disruptive since the initial days of the automobile.
  • It has the potential to completely transform the movement of people and goods, enabling safer and smarter transportation.
  • ITS consists of several technologies, such as Advanced Driver Assistance Systems (ADAS), Automated Driving Functions (ADF), and Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) communication.

SLIDE 5

Autonomous Vehicles

  • Autonomous vehicle technology is key to improving driver and passenger safety.
  • The control strategies and the AI software that powers them are some of its most critical components.
  • Increases in computational capability enable the training of complex, deep neural networks.

SLIDE 6

Motivation 1: Fuel Economy

Fuel economy:

  • Climate change is projected to significantly affect the world.
  • Severe consequences of global warming are expected sooner than previously thought.
  • Governments around the world have imposed various fuel economy (FE) requirements.

Strategy:

  • Optimal Energy Management
  • Ego-vehicle velocity prediction determines the constraints of the energy optimization problem used to increase fuel economy.

[Gaikwad, Tushar D., Zachary D. Asher, Kuan Liu, Mike Huang, and Ilya Kolmanovsky. Vehicle Velocity Prediction and Energy Management Strategy Part 2: Integration of Machine Learning Vehicle Velocity Prediction with Optimal Energy Management to Improve Fuel Economy. SAE Technical Paper 2019-01-1212, 2019.]

SLIDE 7

Motivation 2: Safety

Safety:

  • Over 37,000 people were killed in fatal collisions in the US in 2017.
  • These accidents often originate from driver inattention or impairment.
  • Driver error is responsible for more than 90% of crashes in the U.S.

Strategy:

  • Collision Risk Estimator
  • Ego-vehicle velocity predictions can be integrated into collision risk estimators, thereby helping drivers avoid accidents.

[Phillips, Derek J., Real-time Prediction of Automotive Collision Risk from Monocular Video (2019)]

SLIDE 8

Challenges and Strategies

How to predict vehicle velocity?

SLIDE 9

Artificial Intelligence (AI)

  • AI has the ability to model and extract unseen features and relationships.
  • It is a powerful tool for predicting the future output of any complex system.
  • Machine learning (ML) is about extracting knowledge from data.
  • ML is the intersection of statistics, AI, and computer science and is also known as predictive analytics or statistical learning.
  • Deep learning is the subset of ML that uses multilayered neurons.

SLIDE 10

Artificial Intelligence

  • Deep learning models are algorithms inspired by the structure and function of the brain, called artificial neural networks.
  • Deep learning networks consist of hidden layers with multiple neurons.
  • As we construct larger neural networks and train them with more and more data, their performance continues to increase.

SLIDE 11

Research Gap

  • Projects should address a research gap:
  • Are there certain types of perception algorithms that work better than others?
  • Are there certain sensor inputs that enable high-quality predictions?
  • Which model, with which sensor/signal inputs, performs best for each prediction window?

SLIDE 12

Literature

Year | Researchers/Group | Remarks
2008-2009 | University of Florida; University of Wisconsin-Milwaukee | Used V2I and GPS signals as inputs to a perception model based on a traffic model; later used a NN, which was found to outperform the traffic models.
2014 | University of Minnesota | Used historical data of speed and spacing relative to the leading vehicle, plus V2I, as inputs to a traffic model.
2014 | Lefèvre et al. | Compared ego-vehicle velocity prediction over 1-10 sec across parametric and non-parametric models.
2015 | Lemieux et al. | Used deep learning networks to predict ego-vehicle velocity and route.
2015 | Sun et al. | Radial basis function neural networks achieved good vehicle speed prediction on four standard driving cycles.
2015 | Amir Rezaei et al. | Studied prediction for 1, 6, and 10 seconds with GPS/GIS inputs to an ANN.
2015 | Hellström and Jankovic | Proposed a model of a human driver operating an accelerator pedal and used it for prediction.
2017 | Colorado State University | Used current and previous vehicle velocity and GPS data as inputs to a shallow NN perception model.
2017 | Olabiyi et al. | Used deep neural networks (DNNs) for prediction.
2017 | Zhang et al. | Utilized V2V and V2I communications for future vehicle velocity prediction; also developed an energy management strategy based on vehicle velocity prediction.
2017 | David Baker et al. | Studied error distribution across different prediction windows with a NARX model, using vehicle speed and GPS from CAN.
2018 | Beijing Institute of Technology, China | Demonstrated velocity forecasting with the aid of historical data in a Gaussian-function neural network.
2019 | Liu Kuan et al., University of Michigan | Explored a variety of perception models, including auto-regressive moving average, shallow NN, long short-term memory (LSTM) deep NN, Markov chain, and conditional linear Gaussian models; the LSTM deep NN provided the best prediction fidelity.

SLIDE 13

Novel Contribution

  • Use of different real-world signals, in groups, to estimate their effect on prediction.
  • Use of different models, including DNNs and classical machine learning models, to identify the best prediction model.
  • Understanding the effects of different inputs and models on different prediction windows.
  • Use of two assessment methods to interpret the results.

SLIDE 14

Methodology

  • Vehicle Velocity Prediction Strategy
  • Drive Input
  • Deep Learning and Machine Learning Models
  • Prediction Window
  • Assessment

SLIDE 15

Methodology

  • Vehicle Velocity Prediction Strategy

SLIDE 16

Drive Cycle Development and Signal Recording

  • Collected in August 2019 in Fort Collins, Colorado.
  • Dataset from repeated drives collected along a fixed route by the same driver.
  • Route details:
  1. Parking lot
  2. West on Mulberry until Shields
  3. South on Shields until Prospect
  4. East on Prospect until College
  5. North on College until Mulberry
  6. West on Mulberry until the parking lot
  7. Parking lot

SLIDE 17

Drive Cycle Development and Signal Recording

Data Collection

  • Advanced driver assistance system (ADAS) data for the vehicle's forward cone from a smart radar.
  • Vehicle-to-infrastructure (V2I) data in the form of traffic signal information and segment travel times.
  • Ego-vehicle parameters using a Freematics logger.

Recorded Parameters:

  1. Time
  2. Latitude
  3. Longitude
  4. Distance
  5. Brake Pedal Position
  6. Transmission Gear
  7. Engine Speed
  8. Max Torque
  9. Min Torque
  10. Engine Torque
  11. Turn Signal
  12. Vehicle Speed
  13. Longitudinal Acceleration
  14. Lateral Acceleration
  15. Yaw Rate
  16. Altitude
  17. SPaT
  18. Segment Speed
  19. ADAS

SLIDE 18

Drive Cycle Development and Signal Recording

  • Recorded signals (figures): velocity over longitude vs. latitude vs. time; radar object detections immediately in front.

SLIDE 19

Drive Cycle Development and Signal Recording

  • Recorded signals (figures): velocity vs. time for one drive instance; velocity on the map; terrain.

SLIDE 20

Methodology

  • Vehicle Velocity Prediction Strategy

SLIDE 21

Neural Network Analogy

Artificial Neural Networks- Temporal Lobe

  • Long term Memory (Weights)
  • Things last through time

CNN - Occipital Lobe

  • Vision
  • Recognition of objects

Yet to be created (Parietal Lobe)

  • Sensation and Perception
  • Spatial Coordination System

RNN- Frontal Lobe

  • Short Term Memory
  • Personality, Behavior

(Figure: brain lobes mapped to networks: ANN with stored weights, CNN, yet-to-create network, RNN (LSTM))

SLIDE 22

Neural Network Analogy

Artificial Neural Networks- Temporal Lobe

  • Long term Memory (Weights)
  • Things last through time

CNN - Occipital Lobe

  • Vision
  • Recognition of objects

Yet to be created (Parietal Lobe)

  • Sensation and Perception
  • Spatial Coordination System

RNN- Frontal Lobe

  • Short Term Memory
  • Personality, Behavior


Model types and networks:

  • RNN: LSTM, CNN-LSTM
  • CNN: CNN
  • Machine Learning: Decision Trees, Bagged Trees, Random Forest, Extra Trees, Linear Regression, LR with Interactions, Ridge, KNN

SLIDE 23

Neural Network Analogy

  • LSTM (Long Short-Term Memory):
  • A special type of Recurrent Neural Network (RNN)
  • A typical LSTM unit is composed of a cell, an input gate, an output gate, and a forget gate
  • The cell remembers values, and the three gates regulate the flow of information into and out of the cell
  • CNN (Convolutional Neural Network):
  • CNNs are a powerful artificial neural network technique used for regression and classification
  • These networks preserve the structure of the problem
  • They are also used for regression problems and sequence prediction
  • A typical CNN consists of convolution layers, pooling layers, and fully connected layers
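The LSTM unit described above can be sketched as a single forward step in plain NumPy. This is only an illustration of the cell-and-gates mechanics, not the networks trained in this work; the weight shapes and names are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM cell step: three gates regulate flow into and out of the cell."""
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b   # all four gate pre-activations at once
    f = sigmoid(z[0*H:1*H])   # forget gate: what to drop from the cell state
    i = sigmoid(z[1*H:2*H])   # input gate: which candidate values to write
    o = sigmoid(z[2*H:3*H])   # output gate: what part of the state to expose
    g = np.tanh(z[3*H:4*H])   # candidate values
    c = f * c_prev + i * g    # the cell remembers values across time steps
    h = o * np.tanh(c)        # hidden output is a filtered view of the cell
    return h, c

# Hypothetical sizes: 3 input features, hidden size 4
rng = np.random.default_rng(0)
x, h0, c0 = rng.normal(size=3), np.zeros(4), np.zeros(4)
W, b = rng.normal(size=(16, 7)), np.zeros(16)
h1, c1 = lstm_step(x, h0, c0, W, b)
```

Because the output gate multiplies a tanh of the cell state, the hidden output stays bounded even as the cell accumulates information over time.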


SLIDE 24

Neural Network Analogy

  • Ensembles:
  • Ensemble learning is a technique that combines the predictions from multiple models.
  • It works better if the predictions from the sub-models are uncorrelated or only weakly correlated.
  • Random Forest:
  • When selecting a split point in a decision tree, the learning algorithm can consider all variables and all variable values to select the optimal split point.
  • The random forest algorithm changes this procedure so that the learning algorithm is limited to a random sample of features.
  • Extra Trees:
  • Also known as Extremely Randomized Trees.
  • Extra Trees randomizes certain decisions to minimize over-learning from the data and overfitting.
  • It splits nodes by choosing cut-points entirely at random.
  • Extra Trees tends to maintain higher performance in the presence of noisy features.
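The contrast between the two ensembles can be sketched with scikit-learn on synthetic data. This is a generic regression example, not the actual drive-cycle dataset; the data and hyperparameters are placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, ExtraTreesRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(300, 5))
y = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=300)  # synthetic target

# Random Forest: bootstrap samples plus a random subset of features at each split
rf = RandomForestRegressor(n_estimators=50, max_features="sqrt", random_state=0).fit(X, y)

# Extra Trees: additionally chooses the split cut-points at random
et = ExtraTreesRegressor(n_estimators=50, max_features="sqrt", random_state=0).fit(X, y)

pred_rf = rf.predict(X[:5])
pred_et = et.predict(X[:5])
```

The `max_features` argument is where both ensembles restrict the split search to a random feature sample; Extra Trees goes one step further by also randomizing the cut-point itself.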


SLIDE 25


Training and Testing

  • Divide the data into training and testing datasets
  • Develop the neural network
  • Train the neural network with the training dataset
  • Test the neural network and assess the output


(Diagram: the recorded dataset of 13 drive cycles is split into a training dataset of 12 drive cycles and a testing dataset of 1 drive cycle; the developed neural network is trained on the former and produces predicted output on the latter.)
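The split described above can be sketched in a few lines, with the drive cycles represented as a list. The cycle names are placeholders for the recorded data.

```python
# 13 recorded drive cycles: hold one out for testing, train on the other 12
drive_cycles = [f"cycle_{i:02d}" for i in range(13)]  # placeholders for recorded data

test_index = 12                      # which cycle to hold out
testing = [drive_cycles[test_index]]
training = [c for i, c in enumerate(drive_cycles) if i != test_index]
```

Splitting by whole drive cycle (rather than by random samples) keeps the test cycle entirely unseen during training, which matters for time-series data.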

SLIDE 26

Methodology

  • Vehicle Velocity Prediction Strategy


SLIDE 27

Results

  • Assessment


  • Mean Absolute Error (MAE) is a measure of the difference between two variables.
  • Assume z_t1, …, z_tn are the prediction results and y_t1, …, y_tn are the target values. The MAE is given by

    MAE(Yt, Zt) = (1/n) Σ_{i=1..n} |y_ti − z_ti|

  • Time shift is a measure of the time lag between the predicted time series and the target time series.
  • It uses a cross-correlation technique to find the time-shift error, which is given by

    time shift = argmax_δ Σ_t |y_{t−δ} · z_t|
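Both assessment metrics can be sketched in NumPy. This is a minimal reading of the formulas above; the lag search uses a plain cross-correlation argmax, which mirrors the time-shift definition under the assumption that the peak of the correlation marks the best alignment.

```python
import numpy as np

def mae(target, pred):
    """Mean absolute error between the target and predicted series."""
    return np.mean(np.abs(np.asarray(target) - np.asarray(pred)))

def time_shift(target, pred):
    """Lag (in samples) at which the prediction best aligns with the target,
    found as the argmax of the cross-correlation."""
    target, pred = np.asarray(target), np.asarray(pred)
    n = len(target)
    c = np.correlate(pred, target, mode="full")
    lags = np.arange(-(n - 1), n)
    return lags[np.argmax(c)]

t = np.arange(200)
target = np.sin(0.1 * t)
pred = np.concatenate([np.zeros(5), target[:-5]])  # prediction delayed by 5 samples
```

With a 1 Hz velocity trace, a lag of 5 samples corresponds to a 5-second time-shift error, which is exactly the quantity reported alongside MAE in the results.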

SLIDE 28

Results

  • Assessment


  • MAE = 1.6763 m/s, Time Shift = 0 seconds (increase in MAE)
  • MAE = 1.6763 m/s, Time Shift = 10 seconds (increase in MAE and time shift)

SLIDE 29

Results

  • Assessment


  • MAE = 4.3901 m/s
  • Time Shift = 10 Seconds
SLIDE 30

Results

  • Assessment
  • Effect of Different Signals
  • Effect of different Models
  • Effect on Prediction Window
  • 1-10 seconds forward prediction

SLIDE 31

Effect of Different Signals on Prediction

  • Signals are divided into groups:

  • Data Group A: Current Velocity, GPS
  • Data Group B: Current Velocity, GPS, Previous 5 Seconds EGO and Engine Parameters
  • Data Group C: Current Velocity, GPS, Previous 5 Seconds EGO and Engine Parameters, Radar Data
  • Data Group D: Current Velocity, GPS, Previous 5 Seconds EGO and Engine Parameters, SPaT
  • Data Group E: Current Velocity, GPS, Previous 5 Seconds EGO and Engine Parameters, SPaT, Segment Speed
  • Data Group F: Current Velocity, GPS, Previous 5 Seconds EGO and Engine Parameters, SPaT, Segment Speed, Radar
  • Data Group G: Current Velocity, GPS, Previous 5 Seconds EGO and Engine Parameters, Segment Speed

SLIDE 32

Effect of Different Signals on Prediction


SLIDE 33

Effect of Different Signals on Prediction


SLIDE 34

Effect of Different Signals on Prediction

(Figures: prediction with Group D; prediction with Group E)

SLIDE 35

Results

  • Assessment
  • Effect of Different Signals
  • Effect of different Models
  • Effect on Prediction Window
  • 1-10 seconds forward prediction


SLIDE 36

Effect of Different Models on Prediction


SLIDE 37

Effect of Different Models on Prediction


SLIDE 38

Effect of Different Models on Prediction

(Figures: LSTM and CNN predictions)

SLIDE 39

Effect of Different Models on Prediction

(Figures: Random Forest and Extra Trees predictions)

SLIDE 40

Results

  • Assessment
  • Effect of Different Signals
  • Effect of different Models
  • Effect on Prediction Window
  • 1-10 seconds forward prediction


SLIDE 41

  • Effect of different group signals

Effect on Prediction Window

(Charts: effect of different signal groups (A-G) on MAE (m/s), and on time shift (seconds), for prediction windows of 10, 15, 20, and 30 sec.)

SLIDE 42

  • Effect of different prediction models

Effect on Prediction Window

(Charts: effect of different neural networks on MAE (m/s), and on time shift (seconds), for prediction windows of 10, 15, 20, and 30 sec.)

SLIDE 43

Results

  • Assessment
  • Effect of Different Signals
  • Effect of different Models
  • Effect on Prediction Window
  • 1-10 seconds forward prediction


SLIDE 44

  • Forward prediction window
  • Prediction for every second, shown for every 10-second interval

Effect on Forward Prediction Window

(Figure: LSTM)

SLIDE 45

  • Forward prediction window
  • Prediction for every second, shown for every 10-second interval

Effect on Forward Prediction Window

(Figure: LSTM)

SLIDE 46

  • Forward prediction window
  • Prediction for every second, shown for every 10-second interval

Effect on Forward Prediction Window

(Figures: CNN; Random Forest)

SLIDE 47

Summary

Group D, LSTM, 10-second prediction: MAE = 1.78 m/s, Time Shift = 2.35 s

  • Autonomous vehicles are the key to moving toward ITS.
  • They use AI to power different strategies that enable safer and smarter transportation.
  • Velocity prediction is very important for developing those strategies.
SLIDE 48

Conclusion

  • For evaluating results, it is better to use MAE along with time shift.
  • Datasets with SPaT data perform better in terms of accuracy.
  • Classical machine learning models give better time-shift error.
  • Among the classical machine learning models, Random Forest and Extra Trees perform best in terms of time shift, which is 1.50 sec.
  • Overall, artificial NNs give better MAE.
  • LSTM performs best in terms of MAE, which is 1.78 m/s.
  • Increasing the prediction window increases MAE and time shift in the case of ANNs.


SLIDE 49

Future Work


  • Study with more drive cycles on different types of roads
  • Inclusion of V2V data and camera data
  • More V2I data
  • Implementation of the trained model on an NVIDIA Drive PX2 to test prediction accuracy on the road

Overall conclusion: Accurate velocity prediction can be achieved using an LSTM with a dataset whose features include SPaT data; this prediction can be used in different autonomous vehicle strategies.

SLIDE 50

Thank You!

SLIDE 51

  • Forward prediction
  • Prediction for every second, shown for every 10-second interval

Forward Prediction

(Figures: CNN; Random Forest)

SLIDE 52

  • Forward prediction
  • Prediction for every second, shown for every 10-second interval

Forward Prediction

(Figure: LSTM)

SLIDE 53

  • Forward prediction
  • Prediction for every second, shown for every 10-second interval

CNN and LSTM

(Figures: LSTM and CNN)

SLIDE 54

  • Feature analysis with Extra Trees

Backup for Feature Selection

Feature score ranking:

Rank | Feature No. | Parameter
1 | 5 | Brake Pedal Position
2 | 12 | Vehicle Speed
3 | 7 | Engine Speed
4 | 4 | Accelerator Pedal
5 | 28 | Traffic Phase 1st
6 | 34 | Previous 2 sec Velocity
7 | 23 | Distance to Traffic 1st
8 | 33 | Previous 1 sec Velocity
9 | 1 | Latitude
10 | 30 | Traffic Phase 3rd
11 | 29 | Traffic Phase 2nd
12 | 3 | Distance
13 | 11 | Turn Signal
14 | 6 | Transmission Gear
15 | 10 | Engine Torque
16 | 24 | Distance to Traffic 2nd
17 | 31 | Traffic Phase 4th
18 | 2 | Longitude
19 | 25 | Distance to Traffic 3rd
20 | 22 | Traffic Indices 5th
21 | 16 | Altitude
22 | 26 | Distance to Traffic 4th
23 | 32 | Traffic Phase 5th
24 | 27 | Distance to Traffic 5th
25 | 20 | Traffic Indices 3rd
26 | 21 | Traffic Indices 4th
27 | 18 | Traffic Indices 1st
28 | 19 | Traffic Indices 2nd
29 | 35 | Previous 3 sec Velocity
30 | 37 | Previous 5 sec Velocity
31 | 17 | Steer Angle
32 | 8 | Max Torque
33 | 36 | Previous 4 sec Velocity
34 | 13 | Longitudinal Acceleration
35 | 14 | Lateral Acceleration
36 | 15 | Yaw Rate
37 | 9 | Min Torque

SLIDE 55

Linear Regression

  • 1. Example: Salary vs. Experience
  • 2. Find the minimum sum of squares
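The salary-vs-experience fit can be sketched by minimizing the sum of squared errors with NumPy's least-squares solver. The data here are hypothetical numbers chosen only to make the example exact.

```python
import numpy as np

# Hypothetical salary-vs-experience data (the slide's illustrative example)
experience = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
salary = 30000.0 + 5000.0 * experience  # exactly linear, for illustration

# Minimize the sum of squared errors: solve the least-squares problem A @ [slope, intercept] ≈ salary
A = np.column_stack([experience, np.ones_like(experience)])
(slope, intercept), *_ = np.linalg.lstsq(A, salary, rcond=None)
```

The recovered slope and intercept are the line that minimizes the squared vertical distances, which is the "minimum sum of squares" criterion named on the slide.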


SLIDE 56

Decision Tree

  • 1. X1 and X2 as input variables for prediction of Y
  • 2. Find the split points
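The split-point search can be sketched for one variable: try every candidate cut and keep the one that minimizes the summed squared error of the two sides. The toy data are hypothetical, not the thesis dataset.

```python
import numpy as np

def best_split(x, y):
    """Return the cut-point on x that minimizes the total squared error of the two sides."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    best_cut, best_sse = None, np.inf
    for j in range(1, len(x)):
        left, right = y[:j], y[j:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_cut, best_sse = (x[j - 1] + x[j]) / 2.0, sse
    return best_cut

x = np.arange(10, dtype=float)      # X1 values 0..9
y = np.where(x < 5, 1.0, 9.0)       # a clear step in Y at x = 5
```

A full decision tree applies this search recursively, and over every input variable (X1 and X2 in the slide's example), to build the whole set of split points.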


SLIDE 57

CNN

  • 1. Feature detectors (filters) are applied to the input data; together these comprise the convolutional layer.
  • 2. On top of that, ReLU is applied to remove linearity, i.e., increase nonlinearity.
  • 3. Max pooling is then applied to provide spatial invariance (in case the data is not aligned identically). It reduces the size of the data and also helps avoid overfitting.
  • 4. Flattening creates one long vector.
  • 5. The vector is given as input to a fully connected ANN.
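Steps 2-4 of this pipeline can be sketched in NumPy for a single 2-D feature map. The sizes are hypothetical and the convolution itself (step 1) is omitted for brevity.

```python
import numpy as np

def relu(fmap):
    """Step 2: remove linearity by zeroing negative activations."""
    return np.maximum(fmap, 0.0)

def max_pool(fmap, size=2):
    """Step 3: 2x2 max pooling reduces the data size and adds spatial invariance."""
    h, w = fmap.shape
    h2, w2 = h // size, w // size
    return fmap[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

rng = np.random.default_rng(0)
fmap = rng.normal(size=(6, 6))        # output of a hypothetical convolutional layer
vec = max_pool(relu(fmap)).ravel()    # step 4: flatten into one long vector
```

The flattened vector `vec` is what would be fed to the fully connected ANN in step 5.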


SLIDE 58

RNN

  • 1. Start with a simple ANN: 3 inputs, 2 outputs, 1 hidden layer.
  • 2. Observe the ANN in a new dimension.
  • 3. A temporal loop is added to the hidden layer: the hidden layer not only produces an output but also feeds back into itself.
  • 4. When unrolled, this represents an RNN.
  • 5. This gives the neurons a short-term memory, which allows them to pass information forward through time.
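The temporal loop in step 3 can be sketched as a vanilla RNN update, where the hidden state feeds back into itself at each time step. The weight names and sizes are hypothetical.

```python
import numpy as np

def rnn_step(x, h_prev, Wx, Wh, b):
    """The hidden layer's output depends on the current input AND its own previous output."""
    return np.tanh(Wx @ x + Wh @ h_prev + b)

rng = np.random.default_rng(0)
Wx, Wh, b = rng.normal(size=(2, 3)), rng.normal(size=(2, 2)), np.zeros(2)

h = np.zeros(2)               # short-term memory starts empty
for t in range(5):            # "unrolling" the loop over 5 time steps
    x_t = rng.normal(size=3)  # 3 inputs per step, 2 hidden units
    h = rnn_step(x_t, h, Wx, Wh, b)
```

The `for` loop is the unrolled view from step 4: the same weights are reused at every time step, with only the hidden state `h` carrying information forward.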


SLIDE 59

LSTM

  • 1. A sigmoid layer called the "forget gate layer" decides whether to throw away parts of the cell state.
  • 2. A sigmoid layer called the "input gate layer" decides which values to update, and a tanh layer creates a vector of new candidate values, C̃t, that could be added to the state.
  • 3. The old cell state Ct−1 is updated into the new cell state Ct: the old state is multiplied by ft, forgetting the things we decided to forget earlier, and then it ∗ C̃t is added, i.e., the new candidate values scaled by how much we decided to update each state value.
  • 4. The output is a filtered version of the cell state Ct: a sigmoid layer decides which parts of the cell state to output, and multiplying the sigmoid output by the tanh of the cell state gives the final output.
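The four steps correspond to the standard LSTM gate equations, written here in the slide's notation (ft, it, C̃t, ot):

```latex
f_t = \sigma(W_f \cdot [h_{t-1}, x_t] + b_f)         % 1. forget gate
i_t = \sigma(W_i \cdot [h_{t-1}, x_t] + b_i)         % 2. input gate
\tilde{C}_t = \tanh(W_C \cdot [h_{t-1}, x_t] + b_C)  % 2. candidate values
C_t = f_t * C_{t-1} + i_t * \tilde{C}_t              % 3. cell-state update
o_t = \sigma(W_o \cdot [h_{t-1}, x_t] + b_o)         % 4. output gate
h_t = o_t * \tanh(C_t)                               % 4. filtered output
```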


SLIDE 60

Cross Correlation

  • What is Cross Correlation?

Cross-correlation is a measure of similarity of two series as a function of the displacement of one relative to the other. This is also known as a sliding dot product or sliding inner-product.
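The sliding dot product can be illustrated directly in NumPy with two small series, one a displaced copy of the other (the data are made up for the illustration):

```python
import numpy as np

a = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
b = np.array([0.0, 0.0, 0.0, 1.0, 2.0])   # the same bump, displaced by 2 samples

# Cross-correlation: the dot product of b against a at every displacement
c = np.correlate(b, a, mode="full")
lags = np.arange(-(len(a) - 1), len(a))
peak_lag = lags[np.argmax(c)]             # displacement of best alignment
```

The displacement at which the sliding dot product peaks is exactly the quantity used as the time-shift metric in the assessment slides.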


SLIDE 61

  • Group A

Effect of Different Signals on Prediction

Data Group A: Current Velocity, GPS

  • MAE = 3.46 m/s
  • Time Shift = 9.6 s
SLIDE 62

Effect of Different Signals on Prediction

  • Group B

Data Group B: Current Velocity, GPS, Previous 5 Seconds EGO and Engine Parameters

  • MAE = 2.19 m/s
  • Time Shift = 4.2 s
SLIDE 63

Effect of Different Signals on Prediction

  • Group C

Data Group C: Current Velocity, GPS, Previous 5 Seconds EGO and Engine Parameters, Radar Data

  • MAE = 2.16 m/s
  • Time Shift = 4.2 s

SLIDE 64

Effect of Different Signals on Prediction

  • Group D

Data Group D: Current Velocity, GPS, Previous 5 Seconds EGO and Engine Parameters, SPaT

  • MAE = 1.86 m/s
  • Time Shift = 2.7 s

SLIDE 65

Effect of Different Signals on Prediction

  • Group E

Data Group E: Current Velocity, GPS, Previous 5 Seconds EGO and Engine Parameters, SPaT, Segment Speed

  • MAE = 1.92 m/s
  • Time Shift = 3.1 s