Learning objectives

PowerPoint presentation transcript (© 2007 Mauro Pezzè & Michal Young, Ch. 11). Understand the rationale and basic approach for systematic combinatorial testing, and learn how to apply representative combinatorial approaches: category-partition testing, pairwise combination testing, and catalog-based testing.


Slide 1: Learning objectives
• Understand the rationale and basic approach for systematic combinatorial testing
• Learn how to apply some representative combinatorial approaches
  – Category-partition testing
  – Pairwise combination testing
  – Catalog-based testing
• Understand key differences and similarities among the approaches, and the application domains for which they are suited

Slide 2: Combinatorial testing (title slide)

Slide 3: Combinatorial testing: Basic idea
• Identify distinct attributes that can be varied
  – in the data, environment, or configuration
  – Example: the browser could be “IE” or “Firefox”, the operating system could be “Vista”, “XP”, or “OSX”
• Systematically generate combinations to be tested
  – Example: IE on Vista, IE on XP, Firefox on Vista, Firefox on OSX, ...
• Rationale: test cases should be varied and include possible “corner cases”

Slide 4: Key ideas in combinatorial approaches
• Category-partition testing
  – separate (manual) identification of values that characterize the input space from (automatic) generation of combinations for test cases
• Pairwise testing
  – systematically test interactions among attributes of the program input space with a relatively small number of test cases
• Catalog-based testing
  – aggregate and synthesize the experience of test designers in a particular organization or application domain, to aid in identifying attribute values
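Not part of the slides: a minimal Python sketch of the pairwise idea from slides 3 and 4. It greedily picks complete test cases until every pair of values from two different attributes appears in some test. The attribute values follow the browser/OS example above; the third attribute, "locale", is hypothetical and is added only because with just two attributes pairwise coverage degenerates to the full cross product. Real pairwise tools use more refined covering-array algorithms.

```python
from itertools import combinations, product

# Attribute values follow the slide's browser/OS example; "locale" is a
# hypothetical third attribute added so the pairwise suite is smaller
# than the exhaustive one.
attributes = {
    "browser": ["IE", "Firefox"],
    "os": ["Vista", "XP", "OSX"],
    "locale": ["en", "de"],
}
names = list(attributes)

def pairs_of(test):
    """All pairings of values from two different attributes covered by one test."""
    return {frozenset([(a, test[a]), (b, test[b])])
            for a, b in combinations(names, 2)}

# Every pairing of values from two different attributes must be covered.
uncovered = {frozenset([(a, va), (b, vb)])
             for a, b in combinations(names, 2)
             for va in attributes[a] for vb in attributes[b]}

candidates = [dict(zip(names, values)) for values in product(*attributes.values())]

suite = []
while uncovered:
    # Greedily add the candidate test that covers the most uncovered pairs.
    best = max(candidates, key=lambda t: len(pairs_of(t) & uncovered))
    suite.append(best)
    uncovered -= pairs_of(best)

print(len(candidates), "exhaustive combinations")  # 2 * 3 * 2 = 12
print(len(suite), "tests cover every pair")        # typically 6 or 7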

Slide 5: Category partition (manual steps)
1. Decompose the specification into independently testable features
  – for each feature, identify
    • parameters
    • environment elements
  – for each parameter and environment element, identify elementary characteristics (categories)
2. Identify relevant values
  – for each characteristic (category), identify (classes of) values
    • normal values
    • boundary values
    • special values
    • error values
3. Introduce constraints

Slide 6: An informal specification: check configuration
Check Configuration
• Check the validity of a computer configuration
• The parameters of check-configuration are:
  – Model
  – Set of components

Slide 7: An informal specification: parameter Model
Model
• A model identifies a specific product and determines a set of constraints on available components. Models are characterized by logical slots for components, which may or may not be implemented by physical slots on a bus. Slots may be required or optional. Required slots must be assigned a suitable component to obtain a legal configuration, while optional slots may be left empty or filled depending on the customer's needs.
• Example: The required “slots” of the Chipmunk C20 laptop computer include a screen, a processor, a hard disk, memory, and an operating system. (Of these, only the hard disk and memory are implemented using actual hardware slots on a bus.) The optional slots include external storage devices such as a CD/DVD writer.

Slide 8: An informal specification: parameter Set of Components
Set of Components
• A set of (slot, component) pairs, corresponding to the required and optional slots of the model. A component is a choice that can be varied within a model, and which is not designed to be replaced by the end user. Available components and a default for each slot are determined by the model. The special value empty is allowed (and may be the default selection) for optional slots. In addition to being compatible or incompatible with a particular model and slot, individual components may be compatible or incompatible with each other.
• Example: The default configuration of the Chipmunk C20 includes 20 gigabytes of hard disk; 30 and 40 gigabyte disks are also available. (Since the hard disk is a required slot, empty is not an allowed choice.) The default operating system is RodentOS 3.2, personal edition, but RodentOS 3.2 mobile server edition may also be selected. The mobile server edition requires at least 30 gigabytes of hard disk.
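The sketch below is my own illustration (not part of the informal specification) of how the check-configuration inputs on slides 6 to 8 might be represented as data: a model with required and optional slots and a per-slot compatibility table, plus a configuration given as slot-to-component selections. The names Model and check_configuration are hypothetical, and component-to-component constraints (such as the mobile server edition requiring at least a 30-gigabyte disk) are deliberately omitted to keep the sketch short.

```python
from dataclasses import dataclass, field

# My own data model for the check-configuration inputs; names are hypothetical.
@dataclass
class Model:
    name: str
    required_slots: set[str]
    optional_slots: set[str]
    compatible: dict[str, set[str]] = field(default_factory=dict)  # slot -> allowed components

def check_configuration(model: Model, selection: dict[str, str]) -> bool:
    """True iff every required slot is filled with a compatible component and
    no selected component violates the model's slot constraints."""
    for slot in model.required_slots:
        if selection.get(slot) in (None, "empty"):
            return False                      # required slot left empty
    for slot, component in selection.items():
        if slot not in model.required_slots | model.optional_slots:
            return False                      # extra or mismatched slot
        if component != "empty" and component not in model.compatible.get(slot, set()):
            return False                      # component incompatible with model/slot
    return True

# Illustrative use, loosely following the Chipmunk C20 example on slides 7-8.
c20 = Model(
    name="Chipmunk C20",
    required_slots={"hard disk", "operating system"},
    optional_slots={"external storage"},
    compatible={
        "hard disk": {"20 GB", "30 GB", "40 GB"},
        "operating system": {"RodentOS 3.2 personal", "RodentOS 3.2 mobile server"},
        "external storage": {"CD/DVD writer"},
    },
)
print(check_configuration(c20, {"hard disk": "30 GB",
                                "operating system": "RodentOS 3.2 mobile server"}))  # True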

Slide 9: Step 1: Identify independently testable units and categories
Parameter Model
  – Model number
  – Number of required slots for selected model (#SMRS)
  – Number of optional slots for selected model (#SMOS)
Parameter Components
  – Correspondence of selection with model slots
  – Number of required components with selection ≠ empty
  – Required component selection
  – Number of optional components with selection ≠ empty
  – Optional component selection
Environment element: Product database
  – Number of models in database (#DBM)
  – Number of components in database (#DBC)

Slide 10: Step 1: Identify independently testable units
• Choosing categories
  – no hard-and-fast rules for choosing categories
  – not a trivial task!
• Categories reflect the test designer's judgment
  – regarding which classes of values may be treated differently by an implementation
• Choosing categories well requires experience and knowledge
  – of the application domain and product architecture; the test designer must look under the surface of the specification and identify hidden characteristics

Slide 11: Step 2: Identify relevant values
• Identify (list) representative classes of values for each of the categories
  – Ignore interactions among values for different categories (considered in the next step)
• Representative values may be identified by applying
  – Boundary value testing
    • select extreme values within a class
    • select values outside but as close as possible to the class
    • select interior (non-extreme) values of the class
  – Erroneous condition testing
    • select values outside the normal domain of the program

Slide 12: Step 2: Identify relevant values: Model
Model number
  – Malformed
  – Not in database
  – Valid
Number of required slots for selected model (#SMRS)
  – 0
  – 1
  – Many
Number of optional slots for selected model (#SMOS)
  – 0
  – 1
  – Many
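As a small aside (not from the slides), the value classes identified in Step 2 can be written down as plain data, which is what makes the later automatic generation of combinations possible. The dictionary below records the Model categories of slide 12; even these three small categories already multiply to 27 candidate combinations.

```python
from itertools import product

# Step 2 value classes for the Model parameter (slide 12), as plain data.
model_value_classes = {
    "Model number": ["Malformed", "Not in database", "Valid"],
    "Number of required slots (#SMRS)": ["0", "1", "Many"],
    "Number of optional slots (#SMOS)": ["0", "1", "Many"],
}

# Ignoring interactions for now (Step 3 introduces constraints), even these
# three categories multiply quickly: 3 * 3 * 3 = 27 combinations.
combos = list(product(*model_value_classes.values()))
print(len(combos))  # 27
print(combos[0])    # ('Malformed', '0', '0')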

Slide 13: Step 2: Identify relevant values: Component
Correspondence of selection with model slots
  – Omitted slots
  – Extra slots
  – Mismatched slots
  – Complete correspondence
Number of required components with non-empty selection
  – 0
  – < number of required slots
  – = number of required slots
Required component selection
  – Some defaults
  – All valid
  – ≥ 1 incompatible with slots
  – ≥ 1 incompatible with another selection
  – ≥ 1 incompatible with model
  – ≥ 1 not in database
Number of optional components with non-empty selection
  – 0
  – < #SMOS
  – = #SMOS
Optional component selection
  – Some defaults
  – All valid
  – ≥ 1 incompatible with slots
  – ≥ 1 incompatible with another selection
  – ≥ 1 incompatible with model
  – ≥ 1 not in database

Slide 14: Step 2: Identify relevant values: Database
Number of models in database (#DBM)
  – 0
  – 1
  – Many
Number of components in database (#DBC)
  – 0
  – 1
  – Many
Note: 0 and 1 are unusual (special) values. They might cause unanticipated behavior alone or in combination with particular values of other parameters.

Slide 15: Step 3: Introduce constraints
• A combination of values for each category corresponds to a test case specification
  – in the example we have 314,928 test cases
  – most of which are impossible!
  – example: zero slots and at least one incompatible slot
• Introduce constraints to
  – rule out impossible combinations
  – reduce the size of the test suite if it is too large

Slide 16: Step 3: the error constraint
• [error] indicates a value class that
  – corresponds to erroneous values
  – needs to be tried only once
• Example: for Model number, Malformed and Not in database are error value classes
  – No need to test all possible combinations of errors
  – One test is enough (we assume that handling an error case bypasses other program logic)
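To see where the 314,928 on slide 15 comes from, and how the [error] constraint of slide 16 prunes it, here is a rough counting sketch. The per-category value-class counts are read off slides 12 to 14; the way error classes are folded back in (one test case each) is a simplified illustration of the rule on slide 16, not the full category-partition constraint notation.

```python
from math import prod

# category -> number of value classes identified in Step 2 (slides 12-14)
class_counts = {
    "Model number": 3,
    "#SMRS": 3,
    "#SMOS": 3,
    "Correspondence of selection with model slots": 4,
    "Number of required components with non-empty selection": 3,
    "Required component selection": 6,
    "Number of optional components with non-empty selection": 3,
    "Optional component selection": 6,
    "#DBM": 3,
    "#DBC": 3,
}
print(prod(class_counts.values()))  # 314928 unconstrained combinations

# [error] example from slide 16: "Malformed" and "Not in database" are error
# classes of Model number, so each gets exactly one test case instead of being
# combined with every value of every other category.
error_classes = 2
valid_model_classes = class_counts["Model number"] - error_classes
rest = prod(n for cat, n in class_counts.items() if cat != "Model number")
print(valid_model_classes * rest + error_classes)  # 104978 after this one constraint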
