
Testing: Terminology, System Testing, Types of Errors, Function Testing - PDF document



  1. Outline
     ● Terminology
     ● Types of errors
     ● Dealing with errors
     ● Quality assurance vs Testing
     ● Component Testing
        ■ Unit testing
        ■ Integration testing
     ● System testing
        ■ Function testing
        ■ Structure testing
        ■ Performance testing
        ■ Acceptance testing
        ■ Installation testing
     ● Testing Strategy
     ● Design Patterns & Testing

     Testing Terminology
     ● Reliability: The measure of success with which the observed behavior of a system conforms to some specification of its behavior.
     ● Failure: Any deviation of the observed behavior from the specified behavior.
     ● Error: The system is in a state such that further processing by the system will lead to a failure.
     ● Fault (Bug): The mechanical or algorithmic cause of an error.

     There are many different types of errors and ways of dealing with them.
     [Illustration slides: Erroneous State ("Error"); Algorithmic Fault]
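     To make these four terms concrete, here is a minimal C sketch (not from the slides; the function name and values are invented for illustration) in which an algorithmic fault puts the program into an erroneous state, which surfaces as a failure at the output:

         #include <stdio.h>

         /* Fault (bug): the accumulator is never initialized. */
         int sum_scores(const int *scores, int n)
         {
             int sum;                  /* fault: missing "= 0" */
             for (int i = 0; i < n; i++)
                 sum += scores[i];     /* error: sum holds a garbage-based value */
             return sum;
         }

         int main(void)
         {
             int scores[] = { 70, 80, 90 };
             /* Failure: the printed value may deviate from the specified 240. */
             printf("%d\n", sum_scores(scores, 3));
             return 0;
         }

     Until the wrong value reaches the output, the system is merely in an erroneous state; the failure is the observable deviation, and the missing initialization is the fault behind it.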

  2. [Illustration slides:]
     ● Mechanical Fault
     ● How do we deal with Errors and Faults?
        ■ Verification?
        ■ Modular Redundancy?
        ■ Declaring the Bug as a Feature?
        ■ Patching?

  3. Testing?

     Examples of Faults and Errors
     ● Faults in the interface specification
        ■ Mismatch between what the client needs and what the server offers
        ■ Mismatch between requirements and implementation
     ● Algorithmic faults
        ■ Missing initialization
        ■ Branching errors (too soon, too late)
        ■ Missing test for nil
     ● Mechanical faults (very hard to find)
        ■ Documentation does not match actual conditions or operating procedures
     ● Errors
        ■ Stress or overload errors
        ■ Capacity or boundary errors
        ■ Timing errors
        ■ Throughput or performance errors

     Dealing with Errors
     ● Verification:
        ■ Assumes a hypothetical environment that does not match the real environment
        ■ The proof might be buggy (omits important constraints; simply wrong)
     ● Modular redundancy:
        ■ Expensive
     ● Declaring a bug to be a "feature":
        ■ Bad practice
     ● Patching:
        ■ Slows down performance
     ● Testing (this lecture):
        ■ Testing is never good enough

     Another View on How to Deal with Errors
     ● Error prevention (before the system is released):
        ■ Use good programming methodology to reduce complexity
        ■ Use version control to prevent inconsistent systems
        ■ Apply verification to prevent algorithmic bugs
     ● Error detection (while the system is running):
        ■ Testing: Create failures in a planned way
        ■ Debugging: Start with an unplanned failure
        ■ Monitoring: Deliver information about the system's state; find performance bugs
     ● Error recovery (recover from a failure once the system is released):
        ■ Database systems (atomic transactions)
        ■ Modular redundancy (see the sketch after this slide group)
        ■ Recovery blocks

     Some Observations
     ● It is impossible to completely test any nontrivial module or any system
        ■ Theoretical limitations: Halting problem
        ■ Practical limitations: Prohibitive in time and cost
     ● Testing can only show the presence of bugs, not their absence (Dijkstra)

     Testing takes creativity
     ● Testing is often viewed as dirty work.
     ● To develop an effective test, one must have:
        ◆ Detailed understanding of the system
        ◆ Knowledge of the testing techniques
        ◆ Skill to apply these techniques in an effective and efficient manner
     ● Testing is done best by independent testers
        ■ We often develop a certain mental attitude that the program should work in a certain way, when in fact it does not.
        ■ Programmers often stick to the data set that makes the program work.
        ■ A program often does not work when tried by somebody else ("Don't mess up my code!"). Don't let this somebody else be the end-user.
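     The slides name modular redundancy as a recovery technique without showing code. Here is a minimal sketch of the idea, assuming triple modular redundancy with a majority vote (all function names are invented for illustration):

         #include <stdio.h>

         /* Three independently implemented replicas of the same computation. */
         int square_a(int x) { return x * x; }
         int square_b(int x) { return x * x; }
         int square_c(int x) { return x * x + 1; }   /* this replica is faulty */

         /* Majority vote: a single faulty replica is outvoted, so the system
            recovers from the fault instead of failing. */
         int square_tmr(int x)
         {
             int a = square_a(x), b = square_b(x), c = square_c(x);
             if (a == b || a == c)
                 return a;
             return b;   /* either b == c holds, or there is no majority */
         }

         int main(void)
         {
             printf("%d\n", square_tmr(5));   /* prints 25 despite the faulty replica */
             return 0;
         }

     The price is exactly what the slide notes: every computation is performed three times, which is why modular redundancy is listed as expensive.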

  4. Testing Activities
     [Figure: subsystem code passes the Unit Test to become a tested subsystem; tested subsystems pass the Integration Test to become integrated subsystems; the integrated subsystems pass the Functional Test, checked against the requirements analysis document, to yield a functioning system. All tests so far are run by the developers.]

     Testing Activities (ctd.)
     [Figure, continued: the functioning system passes the Performance Test, checked against the global requirements, to become a validated system; the validated system passes the Acceptance Test, checked against the client's understanding of the requirements, to become an accepted system; the accepted system passes the Installation Test in the user environment, supported by the user manual, to become a usable system and finally the system in use. The performance test is run by the developer, the acceptance test by the client, and everyday use amounts to tests (?) by the user.]

     Quality Assurance encompasses Testing
     [Diagram: Quality Assurance
     ● Usability Testing: Scenario Testing, Prototype Testing, Product Testing
     ● Fault Avoidance: Design Methodology, Configuration Management, Verification, Reviews (Walkthrough, Inspection)
     ● Fault Tolerance: Atomic Transactions, Modular Redundancy
     ● Fault Detection: Testing (Component Testing, Integration Testing, System Testing), Debugging (Correctness Debugging, Performance Debugging)]

     Fault Handling Techniques
     [Diagram: Fault Handling
     ● Fault Avoidance: Design Methodology, Configuration Management, Verification
     ● Fault Detection: Reviews (Walkthrough, Inspection), Testing (Component, Integration, System), Debugging (Correctness, Performance)
     ● Fault Tolerance: Atomic Transactions, Modular Redundancy]

     Component Testing
     ● Unit Testing:
        ■ Individual subsystem
        ■ Carried out by developers
        ■ Goal: Confirm that the subsystem is correctly coded and carries out the intended functionality (a minimal example follows after the next slide)
     ● Integration Testing:
        ■ Groups of subsystems (collections of classes) and eventually the entire system
        ■ Carried out by developers
        ■ Goal: Test the interfaces among the subsystems

     System Testing
     ● System Testing:
        ■ The entire system
        ■ Carried out by developers
        ■ Goal: Determine if the system meets the requirements (functional and global)
     ● Acceptance Testing:
        ■ Evaluates the system delivered by developers
        ■ Carried out by the client; may involve executing typical transactions on site on a trial basis
        ■ Goal: Demonstrate that the system meets customer requirements and is ready to use
     ● Implementation (coding) and testing go hand in hand
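     As a concrete illustration of the unit-testing level described above, a developer-written, assert-based test of one small unit might look like this C sketch (the Counter "subsystem" and all names are invented for illustration):

         #include <assert.h>
         #include <stdio.h>

         /* Unit under test: a tiny counter "subsystem". */
         typedef struct { int count; } Counter;

         static void counter_init(Counter *c)        { c->count = 0; }
         static void counter_increment(Counter *c)   { c->count++; }
         static int  counter_value(const Counter *c) { return c->count; }

         /* Unit test: exercises this one unit in isolation; integration
            tests would later combine it with other subsystems. */
         int main(void)
         {
             Counter c;
             counter_init(&c);
             assert(counter_value(&c) == 0);
             counter_increment(&c);
             counter_increment(&c);
             assert(counter_value(&c) == 2);
             printf("unit tests passed\n");
             return 0;
         }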

  5. Unit Testing
     ● Informal:
        ■ Incremental coding
     ● Static Analysis:
        ■ Hand execution: Reading the source code
        ■ Walk-Through (informal presentation to others)
        ■ Code Inspection (formal presentation to others)
        ■ Automated tools checking for
           ◆ syntactic and semantic errors
           ◆ departure from coding standards
     ● Dynamic Analysis:
        ■ Black-box testing (test the input/output behavior)
        ■ White-box testing (test the internal logic of the subsystem or object)
        ■ Data-structure based testing (data types determine test cases)

     Black-box Testing
     ● Focus: I/O behavior. If, for any given input, we can predict the output, then the module passes the test.
        ■ Almost always impossible to generate all possible inputs ("test cases")
     ● Goal: Reduce the number of test cases by equivalence partitioning:
        ■ Divide input conditions into equivalence classes
        ■ Choose test cases for each equivalence class. (Example: If an object is supposed to accept a negative number, testing one negative number is enough.)

     Black-box Testing (Continued)
     ● Selection of equivalence classes (no rules, only guidelines):
        ■ Input is valid across a range of values. Select test cases from 3 equivalence classes:
           ◆ Below the range
           ◆ Within the range
           ◆ Above the range
          (A code sketch of this guideline follows at the end of this section.)
        ■ Input is valid if it is from a discrete set. Select test cases from 2 equivalence classes:
           ◆ Valid discrete value
           ◆ Invalid discrete value
     ● Another solution to select only a limited number of test cases:
        ■ Get knowledge about the inner workings of the unit being tested => white-box testing

     White-box Testing
     ● Focus: Thoroughness (coverage). Every statement in the component is executed at least once.
     ● Four types of white-box testing:
        ■ Statement Testing
        ■ Loop Testing
        ■ Path Testing
        ■ Branch Testing

     White-box Testing Example

         #include <stdio.h>

         void FindMean(FILE *ScoreFile)
         {
             float SumOfScores = 0.0;
             int   NumberOfScores = 0;
             float Mean = 0.0;
             float Score;

             /* Read in and sum the scores */
             fscanf(ScoreFile, "%f", &Score);
             while (!feof(ScoreFile)) {
                 if (Score > 0.0) {
                     SumOfScores = SumOfScores + Score;
                     NumberOfScores++;
                 }
                 fscanf(ScoreFile, "%f", &Score);
             }

             /* Compute the mean and print the result */
             if (NumberOfScores > 0) {
                 Mean = SumOfScores / NumberOfScores;
                 printf("The mean score is %f\n", Mean);
             } else {
                 printf("No scores found in file\n");
             }
         }

     White-box Testing (Continued)
     ● Statement Testing (Algebraic Testing): Test single statements (choice of operators in polynomials, etc.)
     ● Loop Testing:
        ■ Cause execution of the loop to be skipped completely (exception: repeat loops)
        ■ Loop to be executed exactly once
        ■ Loop to be executed more than once
       For FindMean above, this means running it on an empty file, a file with one score, and a file with several scores.
     ● Path Testing:
        ■ Make sure all paths in the program are executed
     ● Branch Testing (Conditional Testing): Make sure that each possible outcome from a condition is tested at least once, e.g.

           if (i = TRUE) printf("YES\n"); else printf("NO\n");

       Test cases: 1) i = TRUE; 2) i = FALSE
       (The second test case exposes the fault: the condition uses assignment "=" instead of comparison "==", so the program prints "YES" even for i = FALSE.)
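     To tie the range guideline from the black-box slides to code: for an input that is valid across a range, one representative test case per equivalence class suffices. A minimal C sketch, assuming a hypothetical is_valid_month function that accepts 1..12 (the function name and values are invented for illustration):

         #include <assert.h>
         #include <stdio.h>

         /* Unit under test: input is valid across the range 1..12. */
         static int is_valid_month(int m)
         {
             return m >= 1 && m <= 12;
         }

         int main(void)
         {
             /* One representative test case per equivalence class. */
             assert(is_valid_month(0)  == 0);   /* below the range  */
             assert(is_valid_month(6)  == 1);   /* within the range */
             assert(is_valid_month(13) == 0);   /* above the range  */
             printf("equivalence-class tests passed\n");
             return 0;
         }

     For the discrete-set guideline the same pattern applies with two cases: one valid member of the set and one value outside it.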
