  1. Travel Time Estimation using Approximate Belief States on a Hidden Markov Model (Walid Krichene)

  2. Overview ◮ Context ◮ Inference on a HMM ◮ Modeling framework and exact inference ◮ Approximate Inference: the Boyen-Koller algorithm ◮ Graph Partitioning

  3. Context ◮ Mobile Millennium project ◮ Travel time estimation on an arterial network ◮ Input data: probe vehicles that periodically send their GPS locations, processed using path inference ◮ observation = (path, travel time along the path)

  4. Objective Improve the inference algorithm ◮ Time complexity is exponential in the size of the network (number of links) ◮ Solution: assume links are independent ◮ But this loses the structure of the network ◮ Need approximate inference that keeps the structure

  5. Overview ◮ Context ◮ Inference on a HMM ◮ Modeling framework and exact inference ◮ Approximate Inference: the Boyen-Koller algorithm ◮ Graph Partitioning

  6. Graphical Model ◮ Nodes: random variables ◮ Conditional independence: $x$ and $y$ are independent conditionally on $(n_1, n_2)$ but not on $n_1$ alone

  7. Hidden Markov Model ◮ Hidden variables $s_t \in \{s^1, \ldots, s^N\}$ ◮ Observed variables $y_t$ ◮ $(s_0, \ldots, s_t)$ is a Markov process ◮ Hidden variables are introduced to simplify the model ◮ Interesting because it provides efficient algorithms for inference and parameter estimation

  8. Parametrization of a HMM ◮ Initial probability distribution: $\pi_i = P(s_0^i)$ ◮ Transition matrix: $T_{i,j} = P(s_{t+1}^j \mid s_t^i)$ ◮ Observation model: $P(y_t \mid s_t)$ ◮ This completely characterizes the HMM: we can compute the probability of any event (illustrated in the sketch below).
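To make the parametrization concrete, here is a minimal Python sketch that computes the probability of a joint state/observation trajectory directly from $(\pi, T, P(y \mid s))$. All names, the values, and the Gaussian observation model are illustrative assumptions, not from the talk:

```python
import numpy as np

# P(s_{0:T}, y_{0:T}) = pi_{s_0} * prod_t P(y_t | s_t) * prod_t T_{s_t, s_{t+1}}
# follows directly from the three parameters (pi, T, observation model).

N = 3                                   # number of hidden states (toy value)
pi = np.full(N, 1.0 / N)                # initial distribution pi_i = P(s_0 = s^i)
T = np.array([[0.8, 0.1, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])         # T[i, j] = P(s_{t+1} = s^j | s_t = s^i)
means = np.array([1.0, 2.0, 4.0])       # assumed per-state observation means

def obs_lik(y, s):
    """P(y_t | s_t): unit-variance Gaussian density (an assumption)."""
    return np.exp(-0.5 * (y - means[s]) ** 2) / np.sqrt(2 * np.pi)

def joint_prob(states, ys):
    """P(s_{0:T} = states, y_{0:T} = ys) from the three parameters."""
    p = pi[states[0]] * obs_lik(ys[0], states[0])
    for t in range(len(states) - 1):
        p *= T[states[t], states[t + 1]] * obs_lik(ys[t + 1], states[t + 1])
    return p

print(joint_prob([0, 0, 1], [1.1, 0.9, 2.3]))
```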

  9. Inference General inference problem: compute $P(s_t \mid y_{0:T})$ ◮ Filtering if $t = T$ ◮ Prediction if $t > T$ ◮ Smoothing if $t < T$. Let $y = y_{0:T}$. Then
$$P(s_t \mid y) = \frac{P(s_t, y)}{P(y)} = \frac{\alpha(s_t)\,\beta(s_t)}{\sum_{s_t} \alpha(s_t)\,\beta(s_t)}$$
where $\alpha(s_t) \triangleq P(y_0, \ldots, y_t, s_t)$ and $\beta(s_t) \triangleq P(y_{t+1}, \ldots, y_T \mid s_t)$.

  10. Message passing algorithms Recursive algorithms to compute $\alpha(s_t)$ and $\beta(s_t)$: ◮ $\alpha(s_{t+1}) = \sum_{s_t} \alpha(s_t)\, T_{s_t, s_{t+1}}\, P(y_{t+1} \mid s_{t+1})$ ◮ $\beta(s_t) = \sum_{s_{t+1}} \beta(s_{t+1})\, P(y_{t+1} \mid s_{t+1})\, T_{s_t, s_{t+1}}$ ◮ Complexity: $O(N^2 T)$ operations ◮ $\alpha$ recursion: for every $t$, there are $N$ possible values of $s_{t+1}$, and each $\alpha(s_{t+1})$ requires $N$ multiplications. A numpy sketch of both recursions follows.
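A minimal numpy sketch of the two recursions, assuming a precomputed matrix `obs_lik` with `obs_lik[t, i]` $= P(y_t \mid s_t = s^i)$; the array layout and names are assumptions:

```python
import numpy as np

def forward_backward(pi, T, obs_lik):
    """alpha/beta recursions for an HMM with transition matrix T.
    Returns alpha, beta and the smoothed posterior P(s_t | y_{0:T})."""
    Tlen, N = obs_lik.shape
    alpha = np.zeros((Tlen, N))
    beta = np.ones((Tlen, N))                        # beta(s_T) = 1
    alpha[0] = pi * obs_lik[0]                       # base case: pi_i P(y_0 | s_0)
    for t in range(Tlen - 1):                        # alpha recursion (forward)
        alpha[t + 1] = (alpha[t] @ T) * obs_lik[t + 1]
    for t in range(Tlen - 2, -1, -1):                # beta recursion (backward)
        beta[t] = T @ (beta[t + 1] * obs_lik[t + 1])
    posterior = alpha * beta                         # P(s_t | y) up to normalization
    posterior /= posterior.sum(axis=1, keepdims=True)
    return alpha, beta, posterior
```

Each step is one matrix-vector product, hence $O(N^2)$ per time step and $O(N^2 T)$ overall, matching the slide. In practice $\alpha$ and $\beta$ are rescaled at each step to avoid numerical underflow on long sequences.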

  11. Parameter estimation Parameters of the HMM: $\theta = (\pi, T, \eta)$ ◮ $T$: transition matrix ◮ $\pi$: initial state probability distribution ◮ $\eta$: parameters of the observation model $P(y_t \mid s_t, \eta)$. Parameter estimation: maximize the log-likelihood with respect to $\theta$:
$$\ln \sum_{s_0} \sum_{s_1} \cdots \sum_{s_T} \pi_{s_0} \prod_{t=0}^{T} P(y_t \mid s_t, \eta) \prod_{t=0}^{T-1} T_{s_t, s_{t+1}}$$

  12. Expectation Maximization algorithm ◮ E step: estimate the hidden (unobserved) variables given the observed variables and the current estimate of $\theta$ ◮ M step: maximize the likelihood function under the assumption that the latent variables are known (they are "filled in" with their expected values)

  13. Expectation Maximization algorithm In the case of HMMs:
$$\hat{T}_{ij} = \frac{\sum_{t=0}^{T-1} \xi(s_t^i, s_{t+1}^j)}{\sum_{t=0}^{T-1} \gamma(s_t^i)} \qquad \hat{\eta}_{ij} = \frac{\sum_{t=0}^{T} \gamma(s_t^i)\, y_t^j}{\sum_{t=0}^{T} \gamma(s_t^i)} \qquad \hat{\pi}_i = \frac{\alpha(s_0^i)\,\beta(s_0^i)}{\sum_{s_0} \alpha(s_0)\,\beta(s_0)}$$
where $\xi$ and $\gamma$ are simple functions of $\alpha$ and $\beta$. Time complexity: $O(N^2 T)$ operations. A sketch of these updates follows.
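A sketch of these M-step updates in the same numpy setting as the forward-backward sketch above. The feature matrix `Y` used for $\hat{\eta}$ is an assumption about how the observations are encoded (e.g., one-hot for a discrete observation model):

```python
import numpy as np

def em_updates(alpha, beta, T, obs_lik, Y):
    """M-step from the smoothed marginals gamma and xi.
    alpha, beta: (T+1, N) arrays from forward_backward above;
    Y: (T+1, M) observation features (assumed encoding for eta)."""
    Tlen, N = alpha.shape
    gamma = alpha * beta                              # gamma[t, i] = P(s_t = s^i | y)
    gamma /= gamma.sum(axis=1, keepdims=True)
    xi = np.zeros((Tlen - 1, N, N))                   # xi[t, i, j] = P(s_t=s^i, s_{t+1}=s^j | y)
    for t in range(Tlen - 1):
        x = alpha[t][:, None] * T * (obs_lik[t + 1] * beta[t + 1])[None, :]
        xi[t] = x / x.sum()
    T_hat = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]   # hat T_ij
    eta_hat = (gamma.T @ Y) / gamma.sum(axis=0)[:, None]       # hat eta_ij
    pi_hat = gamma[0]            # = alpha(s_0^i) beta(s_0^i), normalized
    return T_hat, eta_hat, pi_hat
```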

  14. Overview ◮ Context ◮ Inference on a HMM ◮ Modeling framework and exact inference ◮ Approximate Inference: the Boyen-Koller algorithm ◮ Graph Partitioning

  15. Modeling framework ◮ System modeled using a Hidden Markov Model ◮ $L$ links. Hidden variables: link $l$ has a discrete state $S_t^l \in \{1, \ldots, K\}$; the state of the entire system is $S_t = (S_t^1, \ldots, S_t^L)$, giving $N = K^L$ possible states; $(S_t)$ is a Markov process: $P(S_{t+1} \mid S_0, \ldots, S_t) = P(S_{t+1} \mid S_t)$. Observed variables: we observe travel times, random variables whose distributions depend on the states of the links. A toy encoding of the joint state is sketched below.
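A toy sketch of the factored state space. The mixed-radix encoding between per-link states and a single joint-state index is a standard trick, not something the talk specifies:

```python
# Each of L links has a state in {0, ..., K-1}; the joint state
# S_t = (S_t^1, ..., S_t^L) is one of N = K**L configurations.

K, L = 3, 4           # K states per link, L links (small toy values)
N = K ** L            # joint state space size: 81 here, K^L in general

def encode(link_states):
    """(S^1, ..., S^L) -> single joint-state index in {0, ..., N-1}."""
    idx = 0
    for s in link_states:
        idx = idx * K + s
    return idx

def decode(idx):
    """Joint-state index -> per-link states (inverse of encode)."""
    out = []
    for _ in range(L):
        out.append(idx % K)
        idx //= K
    return out[::-1]

assert decode(encode([2, 0, 1, 2])) == [2, 0, 1, 2]
```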

  16. HMM (figure)

  17. Parametrization of the HMM Transition model: $T_t(s^i \to s^j) \triangleq P(s_{t+1}^j \mid s_t^i)$, a transition matrix of size $K^L \times K^L$.

  18. Parametrization of the HMM Observation model: probability of observing response $y = (l, x_i, x_f, \delta)$ given state $s$ at time $t$:
$$O_t(s \to y) \triangleq P(y_t \mid s_t) = g_t^{l,s}(\delta) \times \int_{x_i}^{x_f} \rho_t^l(x)\, dx$$
◮ $g_t^{l,s}$: distribution of total travel time on link $l$ in state $s$ ◮ $\rho_t^l$: probability distribution of vehicle locations (derived from traffic assumptions). Assumptions: processes are time-invariant during one-hour time slices. A numerical sketch follows.
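A numerical sketch of this observation likelihood for a single response. The exponential travel-time density and uniform location density are toy assumptions for illustration:

```python
import numpy as np

def obs_likelihood(delta, x_i, x_f, g, rho, n_grid=201):
    """O_t(s -> y) = g(delta) * integral_{x_i}^{x_f} rho(x) dx,
    with the integral done by a simple trapezoidal rule."""
    xs = np.linspace(x_i, x_f, n_grid)
    vals = rho(xs)
    dx = xs[1] - xs[0]
    location_mass = dx * (vals.sum() - 0.5 * (vals[0] + vals[-1]))
    return g(delta) * location_mass

# Toy example: exponential travel times, uniform locations on a 100 m link.
g = lambda d: 0.1 * np.exp(-0.1 * d)          # travel-time density g_t^{l,s}
rho = lambda x: np.full_like(x, 1.0 / 100.0)  # location density rho_t^l
print(obs_likelihood(delta=25.0, x_i=10.0, x_f=60.0, g=g, rho=rho))
```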

  19. Travel time estimation ◮ Estimate the state of the system ◮ Estimate the parameters of the models (observation model) ◮ Update the estimates when new responses are observed. Belief state: $p_t(s) \triangleq P(s_t \mid y_{0:t})$, a probability distribution over the possible states.

  20. Travel time estimation Bayesian tracking of the belief state: forward-backward propagation ($O(N^2 T)$ time). Each update can be done in $O(N^2)$: $p_t \xrightarrow{T[\cdot]} q_{t+1} \xrightarrow{O_y[\cdot]} p_{t+1}$ (a sketch of one update follows). Parameter estimation of the model ◮ update the parameters of the probability distribution of vehicle locations: solve $\max \sum_{x \in X_t^l} \ln \rho_t^l(x)$ where $X_t^l$ are the observed vehicle locations.
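A minimal sketch of one tracking update, assuming the predict-then-condition reading of the diagram above, i.e. taking $T[\cdot]$ and $O_y[\cdot]$ to be the usual prediction and observation operators (names are illustrative):

```python
import numpy as np

def track_step(p_t, T, obs_lik_next):
    """One O(N^2) belief update p_t -> q_{t+1} -> p_{t+1}: predict through
    the transition model, then condition on the new observation y_{t+1}
    and renormalize (Bayes rule)."""
    q = p_t @ T                  # q_{t+1}(s') = sum_s p_t(s) T(s, s'): O(N^2)
    p = q * obs_lik_next         # multiply by P(y_{t+1} | s_{t+1} = s')
    return p / p.sum()           # renormalize to a probability distribution
```

For the factored model this is exactly where intractability bites: `p_t` has $K^L$ entries and `T` has $K^{2L}$, which is what motivates the Boyen-Koller approximation.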

  21. Parameter estimation of the model ◮ update the transition matrix: EM algorithm in $O(N^2 T)$ operations ◮ Exact inference and parameter estimation are therefore done in $O(N^2 T) = O(K^{2L} T)$ time, since $N = K^L$.

  22. Computational intractability Exact inference and the EM algorithm are not tractable: the belief state and the transition matrix are exponential in the size of the network, so the EM algorithm takes time exponential in $L$. For example, with $K = 4$ states per link and $L = 20$ links, $N = 4^{20} \approx 10^{12}$ joint states.
