On the Generation of Test Cases for Embedded Software in Avionics - PowerPoint PPT Presentation



SLIDE 1

On the Generation of Test Cases for Embedded Software in Avionics


Overview of CESAR

Philipp Rümmer
Oxford University, Computing Laboratory
philr@comlab.ox.ac.uk
8th KeY Symposium, May 19th 2009

1 / 16

SLIDE 2

The Wonderful World of Oz Model Checking

Background:
• I recently joined Daniel Kröning’s group in Oxford (. . . after 8 years of KeY . . . )
• Now employed by the CESAR project
Outline of the talk:
• Overview of CESAR
• Model checking Matlab/Simulink
• (Bounded) model checking to generate test cases

2 / 16

SLIDE 3

CESAR

“Cost-Efficient methods and processes for SAfety Relevant embedded systems”
• Artemis project, started March 1st 2009
• Some universities involved: Oxford, Manchester, INRIA, KTH, Athens + Fraunhofer
• Some industry partners: Airbus/EADS, Volvo, Thales
• Overall project topic: introduction of formal methods in the development of embedded software
• Domains: aerospace, automotive, rail, industrial automation

3 / 16

SLIDE 4

CESAR: Most relevant sub-projects

SP2: Requirements engineering
• Define a formalised requirements capturing language
• In particular also: non-functional requirements
• Support requirement validation: consistency, completeness
SP3: Component-based development
• Define a component-based design language
• Incremental approaches for validation, verification, certification and qualification

4 / 16

SLIDE 5

Simulink as de facto standard for embedded software

5 / 16

SLIDE 6

Simulink for embedded software (2)

• Graphical dataflow language on top of MathWorks’ Matlab
• Supports discrete + continuous dataflow + stateflows
• Automatic simulation and code generation
• S-functions
• Further relevant language: Lustre/Scade

6 / 16

SLIDE 7

Simulink for embedded software (2)

• Graphical dataflow language on top of MathWorks’ Matlab
• Supports discrete + continuous dataflow + stateflows
• Automatic simulation and code generation
• S-functions
• Further relevant language: Lustre/Scade
Issues with Simulink:
• Quite low-level
• No (strong) encapsulation
• Unclear semantics

6 / 16

SLIDE 8

CESAR SP3: Component-based development

• New high-level component-based modelling/design language (CBD language)
  Mainly: University of Manchester
• Verification methods for: CBD language, Simulink, Lustre, etc.
  • Primary approach: white-box test-case generation
  • Model checking as underlying method
  • Focus on: compositionality, replicated components
  • Tool support
  • Compositionality and coverage criteria
  Mainly: Oxford University

7 / 16

SLIDE 9

Model Checking

• Algorithmic verification approach (in contrast to: deductive)
• Applicable to properties in temporal logic
• Can generate counterexamples for violated properties
• Two kinds of model checking that are mostly unrelated:
  • (Full) model checking
  • Bounded model checking

8 / 16

SLIDE 10

Full model checking

• Basically: exhaustive search in the state space of a program
• Explicit or symbolic
• Abstraction to make the state space small/finite
• Tool developed in Oxford: SatAbs
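A minimal executable illustration of the explicit-state flavour (a toy sketch only; SatAbs itself works symbolically with abstraction): exhaustively explore the reachable states of a tiny transition system and check whether a bad state occurs. The transition relation and function name are made up for this example.

```c
#include <assert.h>

/* Toy transition system over 4-bit states: successor of s is 2*s mod 16.
 * Explicit-state reachability: depth-first search with a "seen" set,
 * returning 1 if the bad state is reachable from the initial state. */
int bad_state_reachable(int init, int bad) {
    char seen[16] = {0};
    int stack[16], top = 0;
    stack[top++] = init & 15;
    while (top > 0) {
        int s = stack[--top];
        if (s == bad) return 1;       /* safety property violated */
        if (seen[s]) continue;
        seen[s] = 1;
        int succ = (2 * s) & 15;      /* transition relation R(s, s') */
        if (!seen[succ]) stack[top++] = succ;
    }
    return 0;                         /* bad state unreachable */
}
```

From initial state 1 the reachable set is {1, 2, 4, 8, 0}, so 5 is unreachable while 8 is reachable.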

9 / 16

SLIDE 11

Bounded model checking

Principle of bounded model checking
• Examine a finite unwinding of the program:
  I(s0) ∧ R(s0, s1) ∧ · · · ∧ R(sn−1, sn) ∧ ¬P(sn)
  I(s) . . . initial states
  R(s, s′) . . . transition relation of the program
  P(s) . . . safety property to be checked
• Typically: completely propositional encoding
• Incomplete for verification, but complete for disproving safety properties
• Quite similar to KeY . . . (yes it is!)
• How to encode the heap? (→ answered by Carsten)
• Tool developed in Oxford: CBMC
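The unwinding can be illustrated on a toy C loop (a hand-written sketch of what CBMC does internally, not actual tool output; the function names are invented): each step R(si, si+1) becomes one guarded copy of the loop body, and an unwinding assertion checks that the chosen bound was large enough.

```c
#include <assert.h>

/* Original program: count up to 3. */
int count_up(int start) {
    int x = start;
    while (x < 3)
        x++;
    return x;
}

/* Hand-unwound version with bound n = 3 (sufficient for start >= 0):
 * each guarded copy of the body is one application of R(s_i, s_{i+1}),
 * starting from the initial state I(s0). */
int count_up_unwound(int start) {
    int x = start;       /* I(s0)      */
    if (x < 3) x++;      /* R(s0, s1)  */
    if (x < 3) x++;      /* R(s1, s2)  */
    if (x < 3) x++;      /* R(s2, s3)  */
    assert(!(x < 3));    /* unwinding assertion: bound suffices */
    return x;            /* the property P(s3) is checked here  */
}
```

For inputs where the bound suffices, both versions agree, which is exactly why a counterexample to the unwound formula is a genuine counterexample to the program.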

10 / 16

SLIDE 12

Test case generation using model checking

Basic idea: specify trap properties and use the counterexample from the model checker as a test case
• Idea goes back to 1996 (Callahan, Engels)
• Model-based testing technique
Example (model checking to achieve statement coverage):
1: void f(int x) {
2:   ...
3:   if (x > 10000)
4:     causeSegFault(...);
5: }
How to reach the statement in line 4?
• Trap property: pc = 4 is never reached
• Counterexample found: f(10001)
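The trap can be made executable in plain C (illustrative only; causeSegFault is replaced by a flag so the sketch runs, and reaches_target is a made-up harness name): the checker searches for an input that violates the assertion "the target line is never reached", and that input is the generated test.

```c
/* Reachability as a trap: target_hit records that control
 * arrived at the statement we want a test case for. */
static int target_hit = 0;

void f(int x) {
    if (x > 10000)
        target_hit = 1;   /* stands in for causeSegFault(...);
                             the checker would see assert(0) here */
}

/* A model checker refuting "target_hit stays 0" reports an input
 * such as x = 10001; that counterexample is the test case. */
int reaches_target(int x) {
    target_hit = 0;
    f(x);
    return target_hit;
}
```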

11 / 16

SLIDE 13

Test case generation using model checking (2)

Criteria that are most relevant for us:
Structural coverage
• Trap properties to cover: statements, control edges, MC/DC, dataflows
Mutation detection
• Generate faulty mutants of the program: exchange operators, literals, introduce bit-faults, etc.
• Try to verify equivalence of the original program and the mutant
  ⇒ use the counterexample as a test case
• Hypothesis: coupling effect
  Tests that detect simple faults probably also detect complex faults
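A minimal sketch of mutation detection (hypothetical functions, not project code): one relational operator is exchanged, and the equivalence check fails exactly at the distinguishing input, which becomes the generated test case.

```c
/* Original program and a mutant with one exchanged operator (> vs >=). */
int original(int x) { return x >  10000 ? 1 : 0; }
int mutant(int x)   { return x >= 10000 ? 1 : 0; }

/* The model checker tries to prove original(x) == mutant(x) for all x.
 * The proof fails exactly at x = 10000; that counterexample is a test
 * that kills the mutant. */
int distinguishes(int x) {
    return original(x) != mutant(x);
}
```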

12 / 16

SLIDE 14

Verification of Simulink programs using CBMC

For discrete dataflow and stateflows:
• Compile the Simulink model to an imperative program
• Statically generate a schedule for the program (order in which Simulink blocks are executed)
• C library with implementations of blocks
• Simply include the code for S-functions
• Resulting code can be model-checked
• Counterexamples/tests can be translated back to Simulink
Alternative approach: compile Simulink to Lustre
PhD student working on the Simulink front-end for CBMC: Michele Mazzucchi (ETH)
Unclear: how to handle continuous dataflow?
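The compilation scheme can be sketched for a trivial discrete model (entirely hypothetical names and model; a real Simulink-to-C translation is more involved): library blocks become C functions, and the statically computed schedule becomes the call order inside a step function.

```c
/* Toy model: input -> Gain(2) -> UnitDelay -> output. */
typedef struct { double state; } unit_delay_t;

/* "Block library": one C function per block kind. */
static double gain_block(double in, double k)            { return k * in; }
static double unit_delay_out(unit_delay_t *d)            { return d->state; }
static void   unit_delay_update(unit_delay_t *d, double in) { d->state = in; }

/* One simulation step; the call order is the static schedule. */
double step(unit_delay_t *d, double input) {
    double g = gain_block(input, 2.0);  /* 1: Gain                */
    double y = unit_delay_out(d);       /* 2: UnitDelay output    */
    unit_delay_update(d, g);            /* 3: commit delay state  */
    return y;
}

/* Two steps on constant input 1.0: the first output is the delay's
 * initial state 0.0, the second is the delayed gain output 2.0. */
double second_output(void) {
    unit_delay_t d = {0.0};
    step(&d, 1.0);
    return step(&d, 1.0);
}
```

Code in this shape is ordinary C, so CBMC can be applied to it directly, and a counterexample trace maps back to an input sequence for the model.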

13 / 16

SLIDE 15

Floating-point arithmetic

• Essential to analyse Simulink models
• Difficult to handle, also in bounded model checking
  • Bit-precise encoding: large and very hard for SAT solvers
  • Interval arithmetic: no counterexamples ⇒ unusable
Approach currently investigated in our group:
• Bit-precise encoding
• Combination of under- and over-approximation
  • Over-approximation: only include clauses for the higher-valued bits
  • Under-approximation: assert that the lowest-valued bits are zero
PhD student working on floating-point support: Angelo Brillout (ETH)
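A small executable illustration of why bit-precise reasoning is needed (a generic IEEE 754 example, not taken from the CESAR work): over the reals, x + 1 > x always holds, but in single precision it fails once x is large enough that the gap between neighbouring floats exceeds 2. Only a bit-precise encoding can produce such a counterexample; interval arithmetic merely reports "maybe".

```c
/* Does x + 1.0f exceed x in IEEE 754 single precision?
 * True for small x, false e.g. at x = 1e8, where the distance
 * between adjacent floats is already 8. */
int plus_one_increases(float x) {
    volatile float y = x + 1.0f;   /* force rounding to binary32 */
    return y > x;
}
```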

14 / 16

SLIDE 16

Conclusion

Overall goal of CESAR: improve quality + reduce costs of embedded software using formal methods
• Testing ⇒ independent of compiler correctness, hardware, etc. (in contrast to deductive verification)
• Bounded model checking to generate tests
• CESAR includes pilot applications for evaluation: aerospace, automotive, rail, industrial automation
• Future work: basically everything

15 / 16

SLIDE 17

Thanks for your attention!

16 / 16