  1. Real-Time Modeling and Test Case Generation Konrad Krentz

  2. A Convincing Safety Case [1]  Fault Tree Analysis  Formal Methods  Testing  Architecture  (…)  UPPAAL

  3. Agenda  UPPAAL Overview  Timed Automata  Model-Checking  Timed Testing

  4. (Figure only) [2]

  5. Approach [2,4] (Figure: example automata with states 1–4 and actions a, b, c)

  6. UPPAAL’s Architecture [3]

  7. Agenda  UPPAAL Overview  Timed Automata  Model-Checking  Timed Testing

  8. From State Machines to Timed Automata (example: Train)

  9. Timed Automaton [3] A timed automaton is a tuple (L, l0, C, A, E, I) where:
 L is the set of locations
 l0 ∈ L is the initial location
 C is the set of clocks
 A = I ∪ O ∪ {τ} is the set of actions (inputs, outputs, and the internal action τ)
 E ⊆ L × A × B(C) × 2^C × L is the set of transitions (source, action, guard, clocks to reset, target)
 I: L → B(C) assigns an invariant to each location, where B(C) denotes the clock constraints over C
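The tuple above can be sketched as a plain data structure. This is purely illustrative: the field names and the train example below are invented for this sketch and are not UPPAAL's modeling language.

```python
from dataclasses import dataclass

# Illustrative encoding of the timed-automaton tuple (L, l0, C, A, E, I).
# Guards and invariants are tuples of simple constraints (clock, op, bound).

@dataclass(frozen=True)
class Edge:
    source: str        # location in L
    action: str        # action in A
    guard: tuple       # clock constraints that must hold to fire
    resets: frozenset  # subset of C reset to 0 when the edge fires
    target: str        # location in L

@dataclass
class TimedAutomaton:
    locations: set     # L
    initial: str       # l0, an element of L
    clocks: set        # C
    actions: set       # A = inputs, outputs, and the internal action
    edges: list        # E, the transitions
    invariants: dict   # I: location -> clock constraint that must hold there

# Toy example: a train may enter the crossing 3 to 5 time units after approaching.
train = TimedAutomaton(
    locations={"safe", "near", "cross"},
    initial="safe",
    clocks={"x"},
    actions={"approach", "enter", "leave"},
    edges=[
        Edge("safe", "approach", (), frozenset({"x"}), "near"),
        Edge("near", "enter", (("x", ">=", 3),), frozenset(), "cross"),
        Edge("cross", "leave", (), frozenset({"x"}), "safe"),
    ],
    invariants={"near": (("x", "<=", 5),)},
)
```

The invariant on "near" forces the train to leave that location within 5 time units, while the guard on the "enter" edge forbids entering before 3 have elapsed.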

  10. Demo

  11. Agenda  UPPAAL Overview  Timed Automata  Model-Checking  Timed Testing

  12. Model-Checking with UPPAAL [2] (Figure: the verification engine)

  13. Reachability Properties [3,6] Let φ be a state formula. E<> φ: φ is satisfied in some reachable state.

  14. Safety Properties [3,6] Let φ be a state formula. A[] φ: φ holds in all reachable states. E[] φ: there is a path along which φ always holds.

  15. Liveness Properties [3,6] Let φ and ψ be state formulas. A<> φ: φ will eventually be satisfied on every path. φ --> ψ: whenever φ holds, ψ will eventually be satisfied.
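Checking a reachability property E<> φ amounts to searching the state space for a state satisfying φ. UPPAAL does this symbolically over clock zones; the sketch below only illustrates the principle on a finite untimed graph, and all names in it are invented.

```python
from collections import deque

def reachable(initial, successors, phi):
    """Check E<> phi by breadth-first search of a finite state graph.

    Illustrative explicit-state sketch; a real timed model checker
    explores zones (sets of clock valuations) instead of plain states.
    """
    seen = {initial}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if phi(state):
            return True   # some reachable state satisfies phi
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False          # phi holds in no reachable state

# Toy transition system: states 0 -> 1 -> 2 -> 3.
succ = {0: [1], 1: [2], 2: [3], 3: []}
print(reachable(0, lambda s: succ[s], lambda s: s == 3))   # True
```

A safety property A[] φ is the dual: it holds exactly when no state satisfying "not φ" is reachable.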

  16. Demo

  17. Agenda  UPPAAL Overview  Timed Automata  Model-Checking  Timed Testing

  18. Challenges in Timed Testing  When to stimulate the system and when to expect a response?  What is a correct response?  The test system may itself become a real-time system

  19. Offline Test Generation [4]

  20. Diagnostic Trace → Test Case [4]
1. Partition the model into subnetworks E (environment) and S (system under test)
2. Project the trace onto E (remove invisible actions and sum adjacent delay actions)
3. Add verdicts
Assumptions: deterministic, weakly input-enabled, output-urgent, isolated outputs
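The projection step (step 2) can be sketched in code. The trace encoding below, with delays as ('delay', d) tuples and actions as strings, is invented for illustration:

```python
def trace_to_test(trace, observable):
    """Project a diagnostic trace onto the environment's view.

    Invisible internal actions are dropped and delay actions that
    become adjacent are summed, following the projection step of the
    offline test-generation recipe; a 'pass' verdict is appended.
    Illustrative sketch only.
    """
    test = []
    for step in trace:
        if isinstance(step, tuple) and step[0] == "delay":
            if test and isinstance(test[-1], tuple):
                # Previous projected step is also a delay: sum them.
                test[-1] = ("delay", test[-1][1] + step[1])
            else:
                test.append(step)
        elif step in observable:
            test.append(step)
        # else: invisible internal action, dropped by the projection
    return test + ["pass"]

trace = [("delay", 2), "tau", ("delay", 3), "in!", "internal", "out?"]
print(trace_to_test(trace, {"in!", "out?"}))
# [('delay', 5), 'in!', 'out?', 'pass']
```

Dropping "tau" makes the two delays adjacent, so they merge into a single 5-unit delay, which is why summing happens during the projection rather than before it.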

  21. Generating Diagnostic Traces Based on Coverage Criteria Two approaches: 1. Add auxiliary variables to the original model and express the coverage criterion as a reachability formula 2. Use a coverage specification language (“observers”)

  22. Edge Coverage Criterion “Traverse each of the selected edges.” Query: E<> e[0] and e[1] and e[2]
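The auxiliary-variable encoding can be illustrated with an explicit-state search over the product of the automaton and the coverage flags; every name below is invented for the sketch.

```python
from collections import deque

def edge_coverage_trace(initial, edges, watched):
    """Find a shortest trace traversing all watched edges, if any.

    Mirrors the slide's encoding: each watched edge gets an auxiliary
    coverage flag set when it fires, and the query
    E<> e[0] and e[1] and ... becomes plain reachability in the
    product of the automaton with the flag vector. Illustrative only.
    """
    goal = frozenset(watched)
    start = (initial, frozenset())      # (location, flags set so far)
    seen = {start}
    queue = deque([(start, [])])
    while queue:
        (state, covered), trace = queue.popleft()
        if covered == goal:
            return trace                # every watched edge traversed
        for (src, dst) in edges:
            if src != state:
                continue
            nxt = (dst, covered | ({(src, dst)} & goal))
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, trace + [(src, dst)]))
    return None                         # some watched edge is unreachable

edges = [(1, 2), (2, 3), (2, 4), (4, 2)]
print(edge_coverage_trace(1, edges, [(1, 2), (2, 4)]))
# [(1, 2), (2, 4)]
```

Because breadth-first search explores by length, the first trace that sets all flags is also a shortest one, matching the idea behind time-optimal test generation.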

  23. Location Coverage Criterion “Visit each of the selected locations.” Query: E<> l[0] and l[1] and l[2] and l[3]

  24. Definition-Use Pair Coverage “Cover all DU-pairs of variable v.” A DU-pair is an assignment (definition) of v followed by a use of v with no redefinition in between.
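Collecting the DU-pairs of one variable along a linear trace can be sketched as follows; the ('def'/'use', variable) step encoding is invented for illustration.

```python
def du_pairs(trace, var):
    """Collect definition-use pairs of `var` along a linear trace.

    A DU-pair is (i, j) such that step i defines var, step j uses it,
    and no step in between redefines it. Steps are ('def', v) or
    ('use', v) tuples. Illustrative sketch only.
    """
    pairs = []
    last_def = None
    for i, (kind, v) in enumerate(trace):
        if v != var:
            continue
        if kind == "def":
            last_def = i          # a redefinition starts a new pair
        elif kind == "use" and last_def is not None:
            pairs.append((last_def, i))
    return pairs

trace = [("def", "v"), ("use", "v"), ("def", "v"), ("use", "v")]
print(du_pairs(trace, "v"))   # [(0, 1), (2, 3)]
```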

  25. Observers as a Coverage Specification Language An observer is a tuple (Q, q0, Qf, B) where:
 Q is the set of observer locations
 q0 ∈ Q is the initial observer location
 Qf ⊆ Q is the set of accepting observer locations
 edges have the form q --b--> q', where b ∈ B evaluates to a boolean depending on the input/output transition taken by the observed timed automaton (the observer cannot react to delay transitions)
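The observer tuple can be sketched as a small class that is stepped alongside the observed automaton. The set-of-locations bookkeeping matches the trace-generation algorithm described on a later slide; all names here are invented.

```python
class Observer:
    """Coverage observer (Q, q0, Qf, edges) run alongside an automaton.

    Edges are (q, predicate, q') triples; the predicate inspects the
    input/output transition just taken by the observed automaton.
    Delay transitions are invisible to the observer. Illustrative
    sketch of the definition on the slide above.
    """

    def __init__(self, q0, accepting, edges):
        self.current = {q0}          # set of possible observer locations
        self.accepting = set(accepting)
        self.edges = edges
        self.reached = set()         # accepting locations seen so far

    def step(self, transition):
        nxt = set(self.current)      # the observer may also stay put
        for (q, pred, q2) in self.edges:
            if q in self.current and pred(transition):
                nxt.add(q2)
        self.current = nxt
        self.reached |= self.current & self.accepting

# Toy location-coverage observer: accept once 'cross' is entered.
obs = Observer(
    q0="init",
    accepting={"saw_cross"},
    edges=[("init", lambda t: t.endswith("->cross"), "saw_cross")],
)
for t in ["safe->near", "near->cross", "cross->safe"]:
    obs.step(t)
print(obs.reached)   # {'saw_cross'}
```

Tracking a *set* of current locations keeps the sketch correct even when the observer is nondeterministic, i.e. when several edge predicates match the same transition.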

  26. Location Coverage Observer Predicate: target_loc(X) holds iff the observed transition ends in location X. (Figure: observer edges target_loc(safe), target_loc(near_cross), target_loc(cross) leading to accepting locations loc(safe), loc(near_cross), loc(cross))

  27. Edge Coverage Observer Predicate: edge(X) holds iff edge X is traversed by the observed transition. (Figure: observer edges edge(0), edge(1), edge(2) leading to accepting locations edge_cov(0), edge_cov(1), edge_cov(2))

  28. Definition-Use Pair Coverage Observer Predicates: def(X) holds iff variable X is defined; use(X) holds iff variable X is used; edge(E) holds iff edge E is traversed.

  29. Generating Traces from Observers  The algorithm simulates the observer by exploring the timed automaton without regard to time (only input/output transitions)  It keeps track of the set of reachable observer locations, the current trace, and the visited states  It outputs the trace that covers the most accepting observer locations
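The algorithm above can be sketched as a depth-bounded enumeration of untimed traces, keeping the one whose replay through the observer covers the most accepting locations. The toy successor map and observer below are invented; a real implementation would track observer locations incrementally instead of replaying each trace.

```python
def best_trace(initial, successors, run_observer, max_depth=4):
    """Return the untimed trace (up to max_depth actions) whose replay
    through the observer covers the most accepting locations.

    Mirrors the slide's algorithm: delays are ignored and only
    input/output actions are explored. Illustrative sketch only.
    """
    best, best_cov = [], set()
    stack = [(initial, [])]
    while stack:
        state, trace = stack.pop()
        cov = run_observer(trace)
        if len(cov) > len(best_cov):
            best, best_cov = trace, cov
        if len(trace) < max_depth:      # depth bound handles cycles
            for action, nxt in successors(state):
                stack.append((nxt, trace + [action]))
    return best, best_cov

# Toy observer: one accepting location per action of interest seen.
def run_observer(trace):
    return {a for a in trace if a in {"enter", "leave"}}

succ = {
    "safe": [("approach", "near")],
    "near": [("enter", "cross")],
    "cross": [("leave", "safe")],
}
trace, cov = best_trace("safe", lambda s: succ[s], run_observer)
print(trace, sorted(cov))   # ['approach', 'enter', 'leave'] ['enter', 'leave']
```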

  30. Offline Testing with UPPAAL COVER [4]

  31. Online Testing [4]

  32. Online Testing with UPPAAL TRON

  33. Offline versus Online Testing [4]

  34. Conclusion  High initial effort  Features: worst-case execution time calculation, schedulability analysis, deadlock detection  Testing has been applied successfully in many industrial case studies  “Certification must consider multiple sources of evidence and ultimately rests on informed engineering judgment and experience” [Rushby]

  35. Sources (1/2)
1. Hermann Kopetz: Real-Time Systems: Design Principles for Distributed Embedded Applications, Kluwer Academic Publishers, 1997
2. Kim Guldstrand Larsen: Formal Methods for Real-Time Systems, 1998, http://www.cs.aau.dk/~kgl/ARTES
3. Gerd Behrmann et al.: A Tutorial on Uppaal, 2004, http://www.it.uu.se/research/group/darts/papers/texts/new-tutorial.pdf
4. Anders Hessel et al.: Testing Real-Time Systems Using UPPAAL, Springer Berlin, 2008, http://www.cs.aau.dk/~marius/tron/FMT2008.pdf
5. Anders Hessel et al.: Time-Optimal Real-Time Test Case Generation Using UPPAAL, http://www.cs.aau.dk/~marius/tron/FMT2008.pdf

  36. Sources (2/2)
6. Gerd Behrmann, Kim Larsen: Intro to UPPAAL, www.cs.aau.dk/~adavid/RTSS05/uppaal-intro.pdf
7. http://i12www.ira.uka.de/~engelc/lehre/keypraktWS0607/slides/UppaalHandout.pdf
