Software Testing
1. Software Testing
Credits: IPL (Cantata++); Rick Mercer (Franklin, Beedle & Associates); Satish Mishra (HU Berlin); Hyoung Hong (Concordia University); Pressman
Instructor: Peter Baumann • email: p.baumann@jacobs-university.de • tel: -3178 • office: room 88, Research 1
"Hey, it compiles – let's ship it!"
320312 Software Engineering (P. Baumann)

2. Test Your Testing! [Myers 1982]
• Program reads 3 integers from the command line, interprets them as side lengths of a triangle
• Outputs the triangle type:
  • Non-equilateral
  • Equilateral
  • Isosceles
• ...test cases?
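Myers' exercise is usually answered with far too few test cases. As a point of reference, here is a minimal sketch of such a triangle classifier together with a first batch of test cases; the function name `classify` and the "not a triangle" category for invalid input are illustrative assumptions, not part of the exercise text.

```python
def classify(a, b, c):
    """Classify three integer side lengths into the exercise's categories."""
    sides = sorted((a, b, c))
    # Invalid inputs the exercise statement leaves implicit: non-positive
    # sides, or sides violating the triangle inequality.
    if sides[0] <= 0 or sides[0] + sides[1] <= sides[2]:
        return "not a triangle"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "non-equilateral"   # scalene, in the slide's terminology

# One test case per obvious category -- plus the cases most people forget:
assert classify(3, 3, 3) == "equilateral"
assert classify(3, 3, 5) == "isosceles"
assert classify(3, 4, 5) == "non-equilateral"
assert classify(1, 2, 3) == "not a triangle"   # degenerate: 1 + 2 == 3
assert classify(0, 4, 5) == "not a triangle"   # zero side
```

Most answers cover the three named categories but miss the degenerate and zero/negative cases, which is exactly Myers' point.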

3. Why Tests? – Software Costs
"If debugging is the process of removing bugs, then programming must be the process of putting them in."
[Chart: cost rising from Requirements through Design and Implementation and Testing to Maintenance]

4. Some Better-Test-Well Applications
• Train control – Alcatel
• Medical systems – GE Medical
• Cantata++ running under Symbian – Nokia Series 60
• Nuclear reactor control – Thales
• EFA Typhoon – BAe Systems
• International Space Station – Dutch Space
• Airbus A340 – Ultra Electronics

5. What Is Software Testing?
• Software testing = process of exercising a program with the specific intent of finding errors prior to delivery to the end user

6. Who Tests the Software?
• Developer: understands the system, but will test "gently"; driven by "delivery"
• Independent tester: must learn about the system, but will attempt to break it; driven by quality
"Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it." – Brian Kernighan

7. Test Feature Space
• Level: unit, integration, system, acceptance, regression
• Quality: correctness, reliability, robustness, safety, security, usability, accessibility, performance, maintainability, portability, interoperability, ...
• Box view: white box, grey box, black box
• Automation: manual, semi-automatic, automatic

8. What Testing Shows
• Errors
• Requirements conformance
• Performance
• An indication of quality

9. Testing & the Design Cycle
[V-model: project work flow down the left, dynamic testing up the right]
• What users really need ↔ Acceptance testing
• Requirements ↔ System testing
• Design ↔ Integration testing
• Code ↔ Unit testing

10. Unit Testing
• Test unit = code that tests the target
  • Usually one or more test module/class
  • In OO programs: target is frequently one class
• Test case = test of an assertion ("design promise") or particular feature
  • "Writing to, then deleting an item from an empty stack yields an empty stack": isempty( pop( push( empty(), x ) ) )
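The slide's stack promise can be stated directly as executable code. A minimal sketch, assuming a functional stack interface with the four operations the slide names (`empty`, `push`, `pop`, `isempty`); the list-based implementation is illustrative only.

```python
# A tiny functional stack, just enough to state the design promise.
def empty():
    return []                 # the empty stack

def push(s, x):
    return s + [x]            # returns a new stack with x on top

def pop(s):
    return s[:-1]             # returns a new stack with the top removed

def isempty(s):
    return s == []

# The test case from the slide, verbatim:
x = 42
assert isempty(pop(push(empty(), x)))
```

Each such test case checks one design promise in isolation, which is what makes unit tests cheap to diagnose when they fail.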

11. Unit Testing (cont.)
[Diagram: the software engineer derives test cases for the module to be tested, covering its interface, local data structures, boundary conditions, independent paths, and error handling paths; results are checked]

12. Unit Test Environment
• Test driver = dummy environment for the class under test
• Test stub = dummy methods of classes that are used, but not available
• Some unit testing frameworks:
  • C++: cppunit
  • Java: JUnit
  • Server-side Java code (web apps!): Cactus
  • JavaScript: JSpec
[Diagram: driver feeds test cases to the module under test, which calls stubs; results are collected]
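The driver/stub split can be sketched in any xUnit-style framework; here is a hedged example using Python's `unittest` in place of the C++/Java frameworks named on the slide. All names (`order_total`, `PriceServiceStub`) are hypothetical, invented for illustration.

```python
import unittest

# Module under test: in production it would call a PriceService class that
# is not yet available (hypothetical scenario).
def order_total(items, price_service):
    """Sum price * quantity over (name, qty) pairs."""
    return sum(price_service.price_of(name) * qty for name, qty in items)

class PriceServiceStub:
    """Test stub: canned answers standing in for the unavailable class."""
    def price_of(self, name):
        return {"apple": 2, "pear": 3}[name]

class OrderTotalTest(unittest.TestCase):
    """Test driver: the dummy environment that feeds test cases to the
    module under test and collects the results."""
    def test_total(self):
        items = [("apple", 2), ("pear", 1)]
        self.assertEqual(order_total(items, PriceServiceStub()), 7)

# Run with: python -m unittest <this_file>
```

The stub lets the unit test run long before the real `PriceService` exists, which is exactly the situation the diagram describes.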

13. Equivalence Class Testing
• Practically never can do exhaustive testing on input combinations
  • Example: a loop executed up to 20 times yields 10^14 possible paths; at 1 test per millisecond, 3,170 years to test completely
• How to find "good" test cases? Good = likely to produce an error
• Idea: build equivalence classes of test input situations, test one candidate per class
  • See lab

14. Test Your Testing, Reloaded
• Program reads 3 integers from the command line, interprets them as side lengths of a triangle
• Outputs the triangle type:
  • Non-equilateral
  • Equilateral
  • Isosceles
• ...test cases?
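With equivalence classes in hand, the triangle exercise gets a more systematic answer: partition the input space and pick one representative per class. A sketch, assuming a `classify(a, b, c)` routine like the one the exercise asks for (the function body below is an illustrative implementation, not the official solution):

```python
def classify(a, b, c):
    sides = sorted((a, b, c))
    if sides[0] <= 0 or sides[0] + sides[1] <= sides[2]:
        return "not a triangle"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "non-equilateral"

# One representative per equivalence class of the input space:
cases = {
    "equilateral":         (5, 5, 5),
    "isosceles":           (5, 5, 8),   # permutations (5, 8, 5), (8, 5, 5) too
    "non-equilateral":     (3, 4, 5),
    "zero side":           (0, 4, 5),
    "negative side":       (-3, 4, 5),
    "degenerate triangle": (1, 2, 3),   # a + b == c
    "inequality violated": (1, 2, 10),
}
for name, (a, b, c) in cases.items():
    print(f"{name:20s} -> {classify(a, b, c)}")
```

Seven well-chosen cases replace an astronomically large input space; the permutation note matters because many buggy implementations only compare the first two sides.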

15. Integration Testing
• Integration testing = test interactions among units
  • Import/export type compatibility
  • Range errors
  • Representation
  • ...and many more
• Sample integration problems:
  • F1 calls F2( char[] s ) – F1 assumes array of size 10, F2 assumes size 8
  • F1 calls F2( elapsed_time ) – F1 thinks in seconds, F2 thinks in milliseconds
• Strategies: big-bang, incremental (top-down, bottom-up, sandwich)
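The second sample problem is worth spelling out, because each unit is correct in isolation and only the interaction is wrong. A minimal sketch with hypothetical function names; the 5-second threshold is an assumption for illustration.

```python
# Callee: a timeout check that expects MILLISECONDS.
def timeout_message(elapsed_ms):
    return "timed out" if elapsed_ms > 5000 else "ok"

# Caller: measures elapsed time in SECONDS. The integration bug would be
# passing the seconds value straight through: timeout_message(elapsed_s).
def check_request(elapsed_s):
    return timeout_message(elapsed_s * 1000)   # explicit unit conversion

# An integration test across the boundary catches what both unit tests miss:
assert check_request(2) == "ok"
assert check_request(6) == "timed out"
# The buggy raw call would wrongly report "ok" for a 6-second request:
assert timeout_message(6) == "ok"
```

Unit tests of `timeout_message` (fed milliseconds) and of the caller's timing logic (in seconds) would both pass; only a test exercising the call across the interface exposes the mismatch.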

16. Top-Down Integration
• The top module is tested with stubs
• Stubs are replaced one at a time, "depth first"
• As new modules are integrated, some subset of tests is re-run
[Diagram: module A on top, with modules B through G integrated beneath it]

17. Bottom-Up Integration
• Drivers are replaced one at a time, "depth first"
• Worker modules are grouped into builds and integrated
[Diagram: module A on top; worker modules D, E form a cluster]

18. Sandwich Testing
• Top modules are tested with stubs
• Worker modules are grouped into builds and integrated
[Diagram: module A on top; worker modules D, E form a cluster]

19. System Testing
• System testing = determine whether the system meets requirements
  • Tests the integrated hardware and software
• Focus on use & interaction of system functionalities
  • Rather than details of implementations
• Should be carried out by a group independent of the code developers
• Alpha testing: end users at the developer's site
• Beta testing: at the end user's site, w/o developer!

20. Acceptance Testing
• Goal: get approval from the customer
  • Try to structure it!
• Be absolutely sure that the demo works
• Customer may be tempted to demand more functionality when exposed to the new system
  • Ideally: get test cases agreed on already during the analysis phase
  • ...will not work in practice; the customer will feel tied down
  • At least: agree on schedule & criteria beforehand
• Best: prepare with stakeholders well in advance

21. Testing Methods: static / dynamic / regression
• Static testing
  • Collects information about software without executing it
  • Reviews, walkthroughs, and inspections; static analysis; formal verification; documentation testing
• Dynamic testing
  • Collects information about software by executing it
  • Does the software behave correctly? In both development and target environments?
  • White-box vs. black-box testing; coverage analysis; memory leaks; performance profiling
• Regression testing

22. Static Analysis
• Control flow analysis and data flow analysis
  • Provide objective data, e.g., for code reviews, project management, end-of-project statistics
  • Extensively used for compiler optimization and software engineering
• Examples of errors that can be found:
  • Unreachable statements
  • Variables used before initialization
  • Variables declared but never used
  • Possible array bound violations
• Extensive tool support for deriving metrics from source code
  • E.g., up to 300 source code metrics [Cantata++]
  • Code construct counts, complexity metrics, file metrics
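Two of the listed defect kinds can be caught by a very crude data-flow sketch: collect which names a program assigns and which it reads, then compare the sets. This is a toy approximation (real analyzers track control flow and scopes) using Python's standard `ast` module on an illustrative snippet.

```python
import ast

SRC = """
def f():
    x = 1            # assigned but never used
    return y + 1     # y used without any prior assignment
"""

tree = ast.parse(SRC)
assigned, used = set(), set()
for node in ast.walk(tree):
    if isinstance(node, ast.Name):
        if isinstance(node.ctx, ast.Store):
            assigned.add(node.id)     # name being written
        else:
            used.add(node.id)         # name being read

unused = assigned - used              # "declared but never used"
maybe_undefined = used - assigned     # "used before initialization" (crude)
print("never used:", sorted(unused))
print("possibly uninitialized:", sorted(maybe_undefined))
```

Note how the analysis reads the source without ever running it: that is the defining property of static testing from the previous slide.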

23. Formal Verification
• Given a model of a program and a property, determine whether the model satisfies the property, based on mathematics
  • Algebra, logic, ...
  • See earlier (invariants) and later!
• Examples:
  • Safety: if the light for east-west is green, then the light for south-north should be red
  • Liveness: if a request occurs, there should eventually be a response
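The safety example can be checked mechanically by the simplest form of model checking: enumerate every reachable state of a model and test the property in each. The traffic-light model below (transition rules, start state) is an invented toy, not from the slides; it only illustrates the exhaustive-state-exploration idea.

```python
# Toy model: two lights (east-west, south-north), each cycling
# green -> yellow -> red. A light may advance only while the cross
# direction shows red.
COLORS = ["green", "yellow", "red"]

def successors(state):
    ew, sn = state
    nxt = lambda c: COLORS[(COLORS.index(c) + 1) % 3]
    succs = []
    if sn == "red":                 # east-west may advance
        succs.append((nxt(ew), sn))
    if ew == "red":                 # south-north may advance
        succs.append((ew, nxt(sn)))
    return succs

# Explore all states reachable from (green, red) and check the safety
# property: if east-west is green, south-north must be red.
frontier, seen = [("green", "red")], set()
while frontier:
    s = frontier.pop()
    if s in seen:
        continue
    seen.add(s)
    assert not (s[0] == "green" and s[1] != "red"), f"unsafe state {s}"
    frontier.extend(successors(s))
print(len(seen), "reachable states, safety property holds")
```

Unlike testing, which samples behaviors, this check covers every behavior of the model, which is why a passing run constitutes a proof about the model (though not about the real controller, unless the model is faithful).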
