SLIDE 1

VALIDATION

The goal of validation is to judge the quality of the software from the user's point of view; e.g., reliability.

The goal of all validation techniques is:

– to reveal failures,
– to localize the faults that caused the failures, and
– ultimately, to correct the faults, and thereby to achieve the highest possible confidence that the component conducts itself according to its specification.

Validation can be considered an activity both to improve the software and to evaluate the software. However, the quality of poorly understood software can't be improved via validation.

SLIDE 2

DIFFERENT TECHNIQUES

(A) Execution-based validation (testing)
    (A1) Black-box (functional) testing
         Use the specification to develop test cases (the code is invisible):
         (A11) by constructing equivalence classes
         (A12) by performing boundary-value analysis
         (ideally using both A11 and A12 together)
    (A2) White-box (structural) testing
         Use the source code to develop test cases (the specification may be invisible) in order to achieve:
         (A21) statement coverage
         (A22) control-path coverage
         (A23) control-construct coverage
         (A24) multiple-condition coverage
         (... and many more)
(B) Non-execution-based validation (reading)
    Really "verification," this is presented here as a contrasting example to the execution-based techniques.
    (B1) symbolic execution
    (B2) reading
         (B21) sequential
         (B22) control-flow oriented
         (B23) stepwise abstraction

SLIDE 3

READING VIA STEPWISE ABSTRACTION

Example:

01: if x != 0 then
02:   y := 5;
03: else
04:   z := z - x;
05: endif
06: if z > 1 then
07:   z := z / x;
08: else
09:   z := 0;
10: endif

Develop abstractions of this code using Mills' functional notation.

Abstraction, lines 01–05:

x ≠ 0 → y := 5 | z := z − x

Abstraction, lines 06–10:

z > 1 → z := z/x | z := 0

Abstraction, lines 01–10:

(x ≠ 0 → (z > 1 → y, z := 5, z/x | y, z := 5, 0)
       | (z > 1 → z := (z − x)/x | z := 0))
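The agreement between the code and its composed abstraction can also be checked mechanically. A minimal sketch in Python (the function names `program` and `abstraction` are illustrative, not from the slides; x, y, z are passed in and returned explicitly):

```python
def program(x, y, z):
    # Direct transcription of lines 01-10 of the example.
    if x != 0:
        y = 5
    else:
        z = z - x
    if z > 1:
        z = z / x   # note: fails for x = 0 when the new z > 1
    else:
        z = 0
    return x, y, z

def abstraction(x, y, z):
    # The composed abstraction of lines 01-10 in Mills' notation.
    if x != 0:
        if z > 1:
            y, z = 5, z / x
        else:
            y, z = 5, 0
    else:
        if z - x > 1:
            z = (z - x) / x
        else:
            z = 0
    return x, y, z

# The two should agree on every input that does not divide by zero.
for x, y, z in [(1, 0, 4), (1, 0, 0), (0, 0, 0), (2, 7, 3)]:
    assert program(x, y, z) == abstraction(x, y, z)
print("abstraction agrees with the code on the sample inputs")
```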

SLIDE 4

PRINCIPLES OF THE CODE-READING PROCESS

1. Detect how the program would fail if run

   Determine the meaning (behavior) of a component through reading.

   → [pgm]

   – define the meaning of every elemental program part, and describe it as a conditional instruction.
     Remark: conditions and parallel instructions should be described as formally as possible.
   – aggregate the elemental functions according to the control flow.

   Compare the abstract meaning function with the given specification to detect possible failures.

   → is f = [pgm] ?

   Remark: if specifications for single design parts exist, the diagnosis can proceed in a stepwise fashion.

2. Isolate the faults that would lead to failures

   Search out the cause of the inappropriate behavior (i.e., the fault) in the code.
   Remark: The deep understanding of the code gained in the reading step should ease fault localization.

SLIDE 5

REQUIRED DOCUMENTS FOR CODE READING

Steps                        Specification f   Source code pgm   Executable component
1. detect likely failures                      X (→ [pgm])
   compare (f = [pgm] ?)     X                 X
2. isolate faults            X                 X

SLIDE 6

TESTING

Goal: to define a finite set T of tests (T ⊆ input × output) where:

– the probability of revealing all failures is high
– the belief that all failures were revealed is strong

There are different criteria:

– for choosing tests
– for deciding about the "completeness" of the tests (i.e., when to stop testing)

Each test is specified through a pair

Ti: (TF, TD)

where:

– TF represents the test case (i.e., a possible sequence of invocations of the component)
– TD represents the test data (i.e., pairs (i ∈ input, o ∈ output))

Remark: o describes the expected result, according to the specification, for the input i.
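The pair structure Ti = (TF, TD) can be written down directly. A minimal sketch in Python (the `Test` type and the sample values are illustrative assumptions, not from the slides):

```python
# A test T_i = (TF, TD): TF is the test case (invocation sequence),
# TD is the test data (pairs of input and expected output).
from typing import NamedTuple

class Test(NamedTuple):
    tf: str    # test case: a possible sequence of invocations
    td: list   # test data: [(input i, expected result o), ...]

t1 = Test(tf="single call of the component",
          td=[((1, 4), (5, 4.0)), ((0, 0), (0, 0))])
print(len(t1.td), "test-data pairs in:", t1.tf)
```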

SLIDE 7

TESTING (Continued)

Test results:

Ti: (o ∈ output, o′ ∈ [pgm](i))

– o: expected result
– o′: actual result

Remark: when testing is not done systematically, mismatches between o and o′ are frequently overlooked.

Documentation of a test case:

T1: comment (test's purpose)
    Test case: f1
    Test data (input, expected result)
    Test result (actual result)

Myers: A successful test case is one that causes a failure!
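Comparing the expected result o with the actual result o′ can be made systematic with a tiny harness. A sketch in Python, where `component` is the slides' example program with y initialised to 0 (that initialisation is an assumption; the slides leave y's initial value open):

```python
def component(x, z):
    # The example program from the earlier slides; y = 0 is an
    # assumed initial value, since the slides do not fix it.
    y = 0
    if x != 0:
        y = 5
    else:
        z = z - x
    if z > 1:
        z = z / x
    else:
        z = 0
    return y, z

# T1: purpose -- exercise the component on a few specified inputs.
test_data = [
    # (input i, expected result o according to the specification)
    ((1, 4), (5, 4.0)),
    ((1, 0), (5, 0)),
    ((0, 0), (0, 0)),
]

for i, o in test_data:
    o_actual = component(*i)          # o' in the slide's notation
    status = "ok" if o_actual == o else "FAILURE revealed"
    print(i, o, o_actual, status)
```

Recording both o and o′ for every test is what keeps mismatches from being overlooked.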

SLIDE 8

BLACK-BOX TESTING

Example:

(x ≠ 0 → (z > 1 → y, z := 5, z/x | y, z := 5, 0)
       | (z > 1 → z := (z − x)/x | z := 0))

Test data

1. Divide the inputs into equivalence classes for which identical behavior is expected from the component.
   (a) x ≠ 0 ∧ z > 1
   (b) x ≠ 0 ∧ z ≤ 1
   (c) x = 0 ∧ z > 1
   (d) x = 0 ∧ z ≤ 1

2. Select test data

   Using equivalence classes:
   Use one (or more) tests per equivalence class.
   {<x=1, z=4>, <x=1, z=0>, <x=0, z=4>, <x=0, z=0>}

   Using boundary-value analysis:
   Give special treatment to boundary values (boundaries of the equivalence classes).
   (a) {<−1, inc(1)>, <+1, inc(1)>, <−∞, +∞>, <+∞, +∞>}
   (b) {<−1, 1>, <+1, 1>, <−∞, −∞>, <+∞, −∞>}
   (c) {<0, inc(1)>, <0, +∞>}
   (d) {<0, 1>, <0, −∞>}

   (+∞/−∞ corresponds to the largest/smallest representable number, inc(x) to the smallest representable number that is larger than x.)
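The equivalence-class step can be expressed as predicates over the inputs, with one representative test per class. A minimal sketch in Python (the dictionary layout is an illustrative assumption):

```python
# Each equivalence class is a predicate over the inputs (x, z);
# every class gets at least one representative test input.
classes = {
    "a": lambda x, z: x != 0 and z > 1,
    "b": lambda x, z: x != 0 and z <= 1,
    "c": lambda x, z: x == 0 and z > 1,
    "d": lambda x, z: x == 0 and z <= 1,
}

# One representative per class, as on the slide.
tests = {
    "a": (1, 4),
    "b": (1, 0),
    "c": (0, 4),
    "d": (0, 0),
}

# Completeness check: every equivalence class is hit by its test.
for name, predicate in classes.items():
    x, z = tests[name]
    assert predicate(x, z), f"test for class {name} misses its class"
print("all four equivalence classes covered")
```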

SLIDE 9

PRINCIPLES OF THE BLACK-BOX TESTING PROCESS

1. Detect failures

   Define the tests (TF, TD) using the component's specification:
   – identify test cases
   – identify test data

   Criteria for completeness of the tests are:
   – at least one test is defined for each equivalence class
   – at least one test is defined for every vague point in the specification

   The program is executed for the input part of each test (to obtain an actual result).

   Failures are diagnosed in the output by comparing the expected result with the actual result.

2. Isolate faults

   Search for the cause (i.e., the fault in the code) of the detected failure by reading/debugging.

SLIDE 10

REQUIRED DOCUMENTS FOR BLACK-BOX TESTING

Steps                 Specification f   Source code pgm   Executable component
Generate test cases   X
Execute test cases                                        X
Diagnosis             X                                   X
Isolate faults        X                 X                 X

SLIDE 11

WHITE-BOX TESTING

01: if x != 0 then
02:   y := 5;
03: else
04:   z := z - x;
05: endif
06: if z > 1 then
07:   z := z / x;
08: else
09:   z := 0;
10: endif

[Control-flow graph: the decision x != 0 branches (Y) to y := 5 and (N) to z := z − x; both rejoin at the decision z > 1, which branches (Y) to z := z / x and (N) to z := 0.]

Test data (for the source code)

1. Statement coverage
   Select tests to execute each statement at least once:
   {<x=0, z=1>, <x=1, z=3>}

2. Control-path coverage
   Select tests to traverse each edge of the program's control-flow graph at least once:
   {<x=0, z=1>, <x=1, z=3>}

3. Complete control-path coverage
   Select tests to traverse each elementary path at least once:
   {<x=0, z=1>, <x=1, z=3>, <x=0, z=3>, <x=1, z=1>}
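The statement-coverage criterion can be checked by instrumenting the program. A minimal sketch in Python (the instrumentation style and the `covered` set are illustrative assumptions; real tools such as coverage.py do this automatically):

```python
# Instrument the example program with a set of executed line labels,
# then check that the slide's two tests execute every statement.

def program(x, z, covered):
    y = 0
    if x != 0:
        covered.add("02"); y = 5
    else:
        covered.add("04"); z = z - x
    if z > 1:
        covered.add("07"); z = z / x
    else:
        covered.add("09"); z = 0
    return y, z

statements = {"02", "04", "07", "09"}
covered = set()
for x, z in [(0, 1), (1, 3)]:   # the slide's statement-coverage tests
    program(x, z, covered)

ratio = len(covered & statements) / len(statements)
print("statement coverage:", ratio)   # 1.0 -> criterion satisfied
```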

SLIDE 12

REFINEMENT OF COVERAGE CRITERIA

Multiple-condition coverage

In the case of combined boolean conditions (e.g., a ∧ b), make sure that all cases are tested:

a | b | a ∧ b
t | t |   t
t | f |   f
f | t |   f
f | f |   f

In the case of comparison operators (e.g., a ≤ b), make sure that both possibilities are tested (a = b, a < b).

In the case of loops (e.g., while <expr> do S end), make sure that the loop is executed 0 times, 1 time, and n > 1 times.
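The truth table above can be generated rather than written by hand. A minimal sketch in Python:

```python
# Enumerate every truth-value combination for the combined
# condition a AND b, as in the multiple-condition-coverage table.
from itertools import product

rows = [(a, b, a and b) for a, b in product([True, False], repeat=2)]
for a, b, ab in rows:
    print(a, b, ab)

# Only one of the four combinations makes the conjunction true.
assert sum(ab for _, _, ab in rows) == 1
```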

SLIDE 13

PRINCIPLES OF THE WHITE-BOX TESTING PROCESS

1. Detect failures

   Define the tests (TF, TD) using the source code:
   – identify test cases
   – identify test data

   Criteria for completeness of the tests are:
   – according to the "statement-coverage" approach:
     (# executed statements / # existing statements) = 1
   – according to the "control-path-coverage" approach:
     (# executed paths / # existing paths) = 1
     (The number of independent paths is given by the cyclomatic number v(G) := # edges − # nodes + 2.)

   The program is executed for the input part of each test (to obtain an actual result).

   Failures are diagnosed in the output by comparing the expected result with the actual result.

   The attained coverage values are checked with a support tool.

2. Isolate faults

   Search for the cause (i.e., the fault in the code) of the detected failure by reading/debugging.
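The cyclomatic number v(G) = #edges − #nodes + 2 can be computed directly from the example program's control-flow graph. A minimal sketch in Python (the node names and edge list are an illustrative transcription of the graph, not from the slides):

```python
# Control-flow graph of the example program: two decisions,
# four assignments, and a common exit node.
edges = [
    ("x != 0", "y := 5"), ("x != 0", "z := z - x"),
    ("y := 5", "z > 1"), ("z := z - x", "z > 1"),
    ("z > 1", "z := z / x"), ("z > 1", "z := 0"),
    ("z := z / x", "exit"), ("z := 0", "exit"),
]
nodes = {n for e in edges for n in e}

# McCabe's cyclomatic number: v(G) = #edges - #nodes + 2.
v = len(edges) - len(nodes) + 2
print("v(G) =", v)   # two binary decisions -> 3 independent paths
```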

SLIDE 14

REQUIRED DOCUMENTS FOR WHITE-BOX TESTING

Steps                 Specification f   Source code pgm   Executable component
Generate test cases                     X
Execute test cases                      (X)               X
Diagnosis             X                 (X)               X
Isolate faults        X                 X                 X
