
Verifying Continuous-Time Markov Chains - PowerPoint PPT Presentation



  1. Verifying Continuous-Time Markov Chains
     Lecture 1+2: Discrete-Time Markov Chains
     Joost-Pieter Katoen
     RWTH Aachen University, Software Modeling and Verification Group
     http://www-i2.informatik.rwth-aachen.de/i2/mvps11/
     VTSA Summerschool, Liège, Belgium, September 20, 2011

     Overview
     1 Motivation
     2 What are discrete-time Markov chains?
     3 Reachability probabilities
     4 Qualitative reachability and all that
     5 Verifying probabilistic CTL
     6 Expressiveness of probabilistic CTL
     7 Probabilistic bisimulation
     8 Verifying ω-regular properties

     Probabilities help
     ◮ When analysing system performance and dependability
       ◮ to quantify arrivals, waiting times, time between failures, QoS, ...
     ◮ When modelling unreliable and unpredictable system behavior
       ◮ to quantify message loss, processor failure
       ◮ to quantify unpredictable delays, express soft deadlines, ...
     ◮ When building protocols for networked embedded systems
       ◮ randomized algorithms
     ◮ When problems are undecidable deterministically
       ◮ repeated reachability of lossy channel systems, ...

  2. Illustrative example: Security
     Security: Crowds protocol [Reiter & Rubin, 1998]
     ◮ A protocol for anonymous web browsing (variants: mCrowds, BT-Crowds)
     ◮ Hide user’s communication by random routing within a crowd
       ◮ sender selects a crowd member randomly using a uniform distribution
       ◮ selected router flips a biased coin:
         ◮ with probability 1 − p: direct delivery to final destination
         ◮ otherwise: select a next router randomly (uniformly)
       ◮ once a routing path has been established, use it until crowd changes
     ◮ Rebuild routing paths on crowd changes
     ◮ Property: Crowds protocol ensures “probable innocence”:
       ◮ probability that the real sender is discovered is < 1/2 if N ≥ p/(p − 1/2) · (c + 1)
       ◮ where N is the crowd’s size and c is the number of corrupt crowd members

     Illustrative example: Leader election
     Distributed system: Leader election [Itai & Rodeh, 1990]
     ◮ A round-based protocol in a synchronous ring of N > 2 nodes
       ◮ the nodes proceed in a lock-step fashion
       ◮ each slot = 1 message is read + 1 state change + 1 message is sent
       ⇒ this synchronous computation yields a discrete-time Markov chain
     ◮ Each round starts by each node choosing a uniform id ∈ {1, . . . , K}
     ◮ Nodes pass their selected id around the ring
     ◮ If there is a unique id, the node with the maximum unique id is leader
     ◮ If not, start another round and try again . . .

     Properties of leader election
     Almost surely eventually a leader will be elected:
         P_{=1}(♦ leader elected)
     With probability at least 0.8, a leader is elected within k steps:
         P_{≥0.8}(♦^{≤k} leader elected)

     Probability to elect a leader within L rounds:
         P_{≥q}(♦^{≤(N+1)·L} leader elected)
     (see the simulation sketch below)
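     To make the round structure concrete, here is a minimal Monte Carlo sketch in Python (not part of the original slides) that estimates the probability of electing a leader within L rounds for the synchronous-ring protocol described above. The parameters N = 4 nodes, K = 8 ids, L = 2 rounds and the sample count are arbitrary illustration values; the lecture computes such probabilities exactly by model checking the underlying DTMC rather than by simulation.

     ```python
     import random
     from collections import Counter

     def leader_within(num_nodes: int, num_ids: int, max_rounds: int) -> bool:
         """One run of the round-based election: every node draws an id uniformly
         from {1, ..., K}; a leader exists as soon as some id is held by exactly
         one node (the node with the maximum such unique id wins)."""
         for _ in range(max_rounds):
             ids = [random.randint(1, num_ids) for _ in range(num_nodes)]
             if any(count == 1 for count in Counter(ids).values()):
                 return True      # a unique id exists -> leader elected this round
         return False             # no leader within the given number of rounds

     def estimate(num_nodes: int = 4, num_ids: int = 8, max_rounds: int = 2,
                  samples: int = 100_000) -> float:
         """Monte Carlo estimate of Pr(leader elected within max_rounds rounds),
         i.e. the quantity bounded in P_{>=q}(<>^{<=(N+1)*L} leader elected)."""
         hits = sum(leader_within(num_nodes, num_ids, max_rounds) for _ in range(samples))
         return hits / samples

     if __name__ == "__main__":
         print(f"estimated probability of a leader within L rounds: {estimate():.4f}")
     ```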

  3. What is probabilistic model checking?

     Probabilistic models
                          Nondeterminism: no                   Nondeterminism: yes
     Discrete time        discrete-time Markov chain (DTMC)    Markov decision process (MDP)
     Continuous time      CTMC                                 CTMDP
     Other models: probabilistic variants of (priced) timed automata, or hybrid automata
     (a small sketch contrasting the DTMC and MDP columns follows below)

     Probability theory is simple, isn’t it?
     “In no other branch of mathematics is it so easy to make mistakes as in probability theory.”
     Henk Tijms, “Understanding Probability” (2004)
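     As a concrete reading of the table above (this snippet is mine, not from the slides; the two-state example and the action names "a", "b" are made up), a DTMC attaches one probability distribution to each state, whereas an MDP attaches one distribution per nondeterministically chosen action, to be resolved by a scheduler:

     ```python
     # DTMC: one probability distribution per state
     dtmc = {
         "s0": {"s0": 0.5, "s1": 0.5},
         "s1": {"s1": 1.0},
     }

     # MDP: per state, a set of actions, each with its own distribution;
     # the choice among actions is nondeterministic
     mdp = {
         "s0": {"a": {"s0": 0.5, "s1": 0.5},
                "b": {"s1": 1.0}},
         "s1": {"a": {"s1": 1.0}},
     }

     def check_stochastic(model, has_actions: bool) -> None:
         """Well-formedness check only: each distribution must sum to 1."""
         for state, entry in model.items():
             dists = entry.values() if has_actions else [entry]
             for dist in dists:
                 assert abs(sum(dist.values()) - 1.0) < 1e-12, (state, dist)

     check_stochastic(dtmc, has_actions=False)
     check_stochastic(mdp, has_actions=True)
     print("both models are well-formed")
     ```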

  4. Geometric distribution
     Let X be a discrete random variable, natural k > 0 and 0 < p ≤ 1.
     The mass function of a geometric distribution is given by:
         Pr{X = k} = (1 − p)^(k−1) · p
     We have E[X] = 1/p and Var[X] = (1 − p)/p^2, and cdf Pr{X ≤ k} = 1 − (1 − p)^k.
     (figure: geometric distributions and their cdf’s)

     Memoryless property
     Theorem
     1. For any random variable X with a geometric distribution:
            Pr{X = k + m | X > m} = Pr{X = k}   for any m ∈ T, k ≥ 1
        This is called the memoryless property, and X is a memoryless r.v.
     2. Any discrete random variable which is memoryless is geometrically distributed.

     Joint distribution function
     The joint distribution function of stochastic process X = {X_t | t ∈ T} is given
     for n, t_1, . . . , t_n ∈ T and d_1, . . . , d_n by:
         F_X(d_1, . . . , d_n; t_1, . . . , t_n) = Pr{X(t_1) ≤ d_1, . . . , X(t_n) ≤ d_n}
     The shape of F_X depends on the stochastic dependency between the X(t_i).

     Stochastic independence
     Random variables X_i on probability space P are independent if:
         F_X(d_1, . . . , d_n; t_1, . . . , t_n) = ∏_{i=1}^{n} F_X(d_i; t_i) = ∏_{i=1}^{n} Pr{X(t_i) ≤ d_i}

     Markov property
     Markov process
     A discrete-time stochastic process {X(t) | t ∈ T} over state space {d_0, d_1, . . .}
     is a Markov process if for any t_0 < t_1 < . . . < t_n < t_{n+1}:
         Pr{X(t_{n+1}) = d_{n+1} | X(t_0) = d_0, X(t_1) = d_1, . . . , X(t_n) = d_n}
             = Pr{X(t_{n+1}) = d_{n+1} | X(t_n) = d_n}
     The distribution of X(t_{n+1}), given the values X(t_0) through X(t_n), only depends
     on the current state X(t_n).
     The conditional probability distribution of future states of a Markov process only
     depends on the current state and not on its further history: the next state of the
     stochastic process only depends on the current state, and not on states assumed
     previously. This is the Markov property.
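     A small numerical sanity check of the slide’s formulas (my sketch, not from the lecture; p = 0.3 and the ranges of k and m are arbitrary): it confirms that the stated cdf matches the pmf and that the memoryless property Pr{X = k + m | X > m} = Pr{X = k} holds.

     ```python
     def geom_pmf(k: int, p: float) -> float:
         """Pr{X = k} = (1 - p)^(k-1) * p, for k = 1, 2, ..."""
         return (1 - p) ** (k - 1) * p

     def geom_cdf(k: int, p: float) -> float:
         """Pr{X <= k} = 1 - (1 - p)^k"""
         return 1 - (1 - p) ** k

     p = 0.3

     # the cdf agrees with summing the pmf
     assert abs(geom_cdf(10, p) - sum(geom_pmf(k, p) for k in range(1, 11))) < 1e-12

     # memoryless property: Pr{X = k + m | X > m} = Pr{X = k}
     for m in range(0, 5):
         for k in range(1, 6):
             cond = geom_pmf(k + m, p) / (1 - geom_cdf(m, p))  # Pr{X = k+m} / Pr{X > m}
             assert abs(cond - geom_pmf(k, p)) < 1e-12

     print("geometric pmf/cdf consistent; memoryless property holds numerically")
     ```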
