BIO / PRESENTATION / PAPER
T18, September 30, 2004, 11:30 AM
Refining Requirements with Test Cases
Tanya Martin-McClellan, Texas Mutual Insurance Company
Better Software Conference & EXPO, September 27-30, 2004, San Jose, CA USA


SLIDE 1


T18

September 30, 2004 11:30 AM

REFINING REQUIREMENTS WITH TEST CASES

Tanya Martin-McClellan Texas Mutual Insurance Company

SLIDE 2

Tanya Martin-McClellan

Tanya Martin-McClellan is a Senior QA Specialist with Texas Mutual Insurance Company. She has played many roles in software development: tester, technical writer, project manager, product designer, technical support, and trainer. In those roles, she has seen just about everything that can go wrong do so. She collects requirements horror stories for entertainment and moral support. ☺ In her spare time, she blogs on LiveJournal as “happytester” and plays Dance Dance Revolution.

Contact information: (512) 505-6275, tmartinm@texasmutual.com, 221 W 6th St Ste 300, Austin, TX 78701.

SLIDE 3

Refining Requirements with Test Cases

Tanya Martin-McClellan Senior QA Specialist Texas Mutual Insurance Company

SLIDE 4

Why are requirements important?

“The hardest single part of building a software system is deciding what to build. No other part of the work so cripples the resulting system if done wrong. No other part is more difficult to rectify later.”

Frederick P. Brooks, Jr., “No Silver Bullet: Essence and Accidents of Software Engineering” (Addison-Wesley)

SLIDE 5

The challenge

If we want to ensure that the requirements are as solid as they can be before we start testing, we have to start challenging them earlier. But how much earlier?

[Timeline diagram, simplified waterfall methodology for reference: Project Begins → Requirements Agreed On → Design and Prototype → Construction → Testing → Project Ends]

SLIDE 6

Learning objectives

  • Spot ambiguity in requirements
  • Spot unaddressed requirement issues
  • Communicate issues effectively through test cases

SLIDE 7

Assumptions

  • You have written requirements to work from
  • You get to plan your tests prior to the testing phase
  • You share your test designs with the rest of the project team
  • Your projects use a methodology that is known, documented, or otherwise accessible to you

SLIDE 8

Spotting ambiguity

SLIDE 9

Weasel Words

SLIDE 10

Requirement Short Checklist

  • Does it specify how the user will interact with the application?
  • Does it specify an expected system response?
  • Does it cover all expected exception handling and alternate paths?
  • Could you write a test case that would measure whether the requirement was met?

SLIDE 11

Spotting unaddressed requirements

SLIDE 12

Why would there be unaddressed requirements?

  • Failures of system or process
  • Failures of training
  • Specific failures
SLIDE 13

How to miss functional requirements

  • Keep the users out of it
  • Lose track of requirements between gathering them and construction
  • Don’t think things through to their logical conclusions

SLIDE 14

How to miss training and documentation requirements

  • Don’t ask about training or documentation
  • Don’t ask who the audience is
  • Don’t ask what the goal is
SLIDE 15

How to miss usability requirements

  • Keep user input out of this – leave it to the architect or the developers
  • Don’t have a prototype
  • Don’t listen to prototype feedback if you do have a prototype
  • Don’t plan any usability testing
  • If you do plan usability testing, don’t account for time to make changes based on the results or retesting

SLIDE 16

How to miss performance requirements

  • Don’t talk about it with users or IT Operations
  • Consider it to be secondary to functionality
  • Don’t plan any performance testing in the project plan
  • Don’t account for additional cycles of improvement and retesting if you do

SLIDE 17

How to miss security requirements

  • Consider it secondary to functionality
  • Keep the users out of it
  • Ignore the possibility of deliberate misuse
  • Ignore your confidential or protected data
SLIDE 18

How to miss regulatory requirements

  • Don’t talk to your legal department about the project
  • Don’t read any legislative updates that relate to your industry

SLIDE 19

How to spot what’s missing

  • Use a checklist of questions
  • Think every requirement through to its logical conclusion. Are the requirements suggesting something that isn’t explicitly stated?

SLIDE 20

Communicate issues effectively through test cases

SLIDE 21

Two ways to do this

  • Use a notes, risks, assumptions, or other related section of your document to communicate where the gaps are.
  • Make an assumption about what the answer should be to your question. Write your test case based on that assumption, and document the assumption.

SLIDE 22

The final, vital step

  • Review your test cases with the project team.
  • Discuss the issues, questions, and assumptions that are part of your test cases.
  • Listen carefully to the responses you receive.

SLIDE 23

Thank you!

Y’all stop by and see me when you’re in Austin, y’hear?

SLIDE 24

Generic Test Case List (DRAFT), Page 1 of 2. Categories: Environment & Configuration, Security, Basic Functions.

1. Application can be installed
2. Application can be reinstalled
3. Application can be uninstalled
4. Application can be upgraded
5. Closing the session unlocks all locked records
6. More than one user may not update the same record at the same time
7. No unexpected crashes/environment is stable
8. No unexpected error messages are displayed
9. Searching for a record does not cause a record lock
10. Session times out within N minutes
11. Sessions closed without saving/submitting records do not lock records
12. Sessions closed without saving/submitting records do not update the data source
13. User attempting to update a record that is being updated by another user receives an appropriate error message
14. Valid user can open N sessions of the application
15. Valid user can open no more than Y sessions of the application
16. Viewing a record in display mode does not cause a record lock
17. When user selects a cancel or back out option, no updates occur to the record he/she is editing
18. Audit trail can be accessed
19. Audit trail contains sufficient information to determine who changed what information in the application, and when it was done
20. Invalid login attempts are logged
21. Invalid user cannot log in
22. Repeated failed login attempts will lock out user
23. User without access to function X receives error when attempting function X
24. User without access to menu option X cannot see menu option X when logged in
25. Valid user can log in
26. Entry areas accept valid input of the expected data type when cut and pasted
27. Entry areas accept valid input of the expected data type when entered manually
28. Entry areas allow all valid values as input
29. Entry areas appropriately limit the size of inputs
30. Error message provides sufficient information to both instruct the customer on next steps and assist IT in diagnosing the cause of the error
31. Error message text matches the error situation
32. Field labels are accurate (match the fields they control)
33. Navigation options work consistently
34. No unexpected data is displayed
35. Online Help for page matches the page from which it is launched
36. Online Help launches correctly
37. Online Help matches the application from which it is launched
38. Position to option positions to the first record that matches what was entered
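Many of these generic cases translate directly into automated checks. As an illustration, here is a minimal sketch of how item 22 ("Repeated failed login attempts will lock out user") might be exercised. The `AuthService` class is a hypothetical stand-in for your application's real login interface, and the lockout threshold of 3 is an assumed value, not one given in the list.

```python
# Hypothetical stand-in for the application's login layer; substitute
# your real interface. MAX_ATTEMPTS = 3 is an assumed threshold.
class AuthService:
    MAX_ATTEMPTS = 3

    def __init__(self, users):
        self.users = users      # username -> password
        self.failures = {}      # username -> consecutive failed attempts
        self.locked = set()

    def login(self, user, password):
        if user in self.locked:
            return "locked"
        if self.users.get(user) == password:
            self.failures[user] = 0
            return "ok"
        self.failures[user] = self.failures.get(user, 0) + 1
        if self.failures[user] >= self.MAX_ATTEMPTS:
            self.locked.add(user)
        return "denied"

def test_repeated_failures_lock_out_user():
    auth = AuthService({"goodguy": "teriyaki"})
    for _ in range(AuthService.MAX_ATTEMPTS):
        assert auth.login("goodguy", "wrong") == "denied"
    # Once locked, even the correct password is rejected
    assert auth.login("goodguy", "teriyaki") == "locked"
```

The point is the same as with a written test case: the threshold and the "locked" response are decisions the requirements must supply, and the test makes them impossible to leave unstated.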

SLIDE 25

Generic Test Case List (DRAFT), Page 2 of 2. Category: Usability & Standards.

39. Search results display results that match the passed parameters
40. Values entered into entry areas are saved appropriately
41. API accepts input as documented
42. API provides output as documented
43. API handles error conditions as documented
44. “Look and feel” is consistent within the application
45. “Look and feel” of the application is consistent with other TMI products
46. Accessibility standards followed
47. Application can be navigated using only the keyboard
48. Application can be navigated using only the mouse
49. Error message text is spelled correctly
50. Error message text is understood by customer
51. Field labels are legible
52. Field labels are spelled correctly or abbreviated to generally accepted standards
53. Field labels are understood by target audience
54. Field labels clearly tie to entry areas
55. Navigation options are clearly labeled
56. Number of records displayed per page matches company standard
57. Online Help has no spelling or grammar errors
58. Online Help is business-appropriate
59. Online Help is understood by the customer
60. Users understand navigation options

SLIDE 26

Weasel words: a few, a lot, abnormal, abnormally, acceptable, accessible, adequate, after, all, ample, any, apparent, apparently, applicable, appropriate, as desired, as needed, as required, attractive, authorize, average, bad, before, brief, briefly, by and large, clear, colorful, common, commonly, comprehensible, concise, confirm, considerable, considering, constantly, continually, cool, correct, cute, direct, easy to understand, easy to use, efficient, enough, enters, eventually, extensive, fast, foundational, functional, generally, generous, good, gradually, idiot-proof, immediately, important, improper, in general, inadequate, incidental, incorrect, intelligible, intermittently, invalid, large, legible, lengthy, like, little, long, meaningful, more, natural, naturally, neutral, norm, normal, normally, objectionable, obvious, obviously, occasionally, often, on a regular basis, ordinary, plain, pleasant, pleasing, plenty, pretty, prior to, proper, quick, quickly, rapid, rarely, regular, regularly, repeatedly, review, right, robust, route, secure, send, several, short, significant, similar to, slightly, small, some, sometimes, sooner or later, standard, steadily, striking, substantial, sudden, suitable, sustained, timely, to be determined, trendy, typical, unacceptable, unclear, undesirable, unimportant, unintelligible, unsuitable, unusual, useful, user-friendly, usual, usually, valid, various, vibrant, when needed, whichever, wrong

SLIDE 27

Refining Requirements with Test Cases

Summary:

By writing test cases to address missing or incomplete requirements and reviewing your test plan with the project team, you can bring those requirement deficits to light and increase the chances that they will be addressed prior to the testing phase.

Learning Objectives:

  • Spot ambiguity in requirements
  • Spot unaddressed requirement issues
  • Communicate those effectively through test cases

Keywords: verification, requirements, test cases

Other Documents Provided:

  • Generic requirement test case list to get you started (GenericTC.xls)
  • Weasel words text file for comparison (WeaselWords.txt)

Before we start:

This presentation assumes that:

  • you are working in an IT organization that follows a development methodology
  • your job includes designing tests
  • you work as part of a project team to create software
  • requirement definition, however it is done where you work, involves creating written requirements that are shared with you before or while you are designing tests

When working in a more dynamic, less formal setting, these lessons may not apply.

About requirements:

“The hardest single part of building a software system is deciding what to build. No other part of the work so cripples the resulting system if done wrong. No other part is more difficult to rectify later.”

Frederick P. Brooks, Jr., “No Silver Bullet: Essence and Accidents of Software Engineering” (Addison-Wesley)

In all of the software development methodologies I have seen, requirements gathering is near the beginning of the project, and testing is near the end. Where in the process you design your tests and review them varies. There is no single best time to refine requirements – too early, and the users don’t yet know what they need; too late, and changes can no longer be made within schedule.

SLIDE 28

As QA professionals, our verification of the requirements can be carried out both through the processes used by project managers and through our own test-related processes. This presentation concentrates on the latter method.

Spotting ambiguity:

When reading a requirement, sometimes you are instantly aware that the requirement is worded too ambiguously. An extreme example is the requirements I got once for a new web application. The product manager was just letting the developer do whatever he wanted and hoping a viable product would come of it, so the requirements were: “1. It has to work. 2. It has to look cool.” The only way to be more ambiguous is to have no requirements at all – which is a different problem. Usually, though, it is not as obvious when the requirements are ambiguous, especially if you have to evaluate them quickly, or you aren’t given an opportunity to ask questions when they are being created.

One way to spot ambiguity in written requirements is by searching them for “weasel words.” [I have included this as a .txt file in the conference materials for your convenience.] These words or phrases, if not defined in the document, are indicators of ambiguous requirements because they indicate a desire on the part of the customer, but don’t provide the necessary information to measure whether the product will meet that desire. For example, the word “various” in a written requirement means that the person who wrote the requirement is trying to capture several different situations that are handled the same way. However, if they don’t list all of the situations specifically, you may fail to test some of them, not knowing they are part of the requirement.

Keep in mind that these words aren’t by themselves a sure-fire indicator of an ambiguous requirement. However, when you see a requirement with one or more “weasel words,” consider it carefully.

  • Does it specify exactly how the user is going to interact with the application?
  • If it’s a system-to-system requirement, does it specify exactly how the two systems will interact?
  • Does it say exactly what the expected system response will be?
  • Does it cover exception handling or alternate paths?
  • Could you write a test case to measure that?

If not, the requirements are still ambiguous. All requirements start out that way, but our verification helps to make them less so. Finding ambiguity in requirements will improve the overall quality of the end product and help keep the project on track, but the most serious requirement issues I have seen have not been simple ambiguity – those have been marked by requirements that are completely missing.
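Searching a requirements document for weasel words is easy to script. The sketch below assumes a plain-text word list like the WeaselWords.txt file mentioned above, one word or phrase per line; the function names are my own, not part of any standard tool.

```python
import re

def load_weasel_words(path="WeaselWords.txt"):
    """Read the weasel-word list, one word or phrase per line."""
    with open(path, encoding="utf-8") as f:
        return [line.strip().lower() for line in f if line.strip()]

def find_weasel_words(requirement_text, weasel_words):
    """Return (line_number, word) pairs for every weasel word found."""
    hits = []
    for lineno, line in enumerate(requirement_text.splitlines(), start=1):
        lowered = line.lower()
        for word in weasel_words:
            # \b word boundaries keep "some" from matching inside "something"
            if re.search(r"\b" + re.escape(word) + r"\b", lowered):
                hits.append((lineno, word))
    return hits
```

Running something like this over a draft requirements document gives you a line-by-line list of places to question during review.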

Spotting unaddressed requirement issues:

SLIDE 29

How often has this happened to you?

1. During customer acceptance, the customer asks what happened to feature X that he was promised at the beginning of the project. He refuses to sign off until feature X is put in, and there’s no longer enough time for regression testing if it’s added.

2. After the requirements have been agreed to, the customer mentions in passing that there is another business unit who will be affected by the project. That business unit hasn’t been involved in the project, but the project leader wants them to be included in customer acceptance “to give them a feel for how it will work.” During customer acceptance, the new business unit has no idea what’s going on. They want the project halted until they determine how they will have to change their processes to accommodate the changes to their workflow, or the project can go ahead, but must be temporarily disabled to keep the changes from occurring.

3. Your security expert doesn’t get brought into the project until after the coding has begun. She strongly recommends several changes to the design, but the architect doesn’t have time to rework the design. No one makes any public statements about the changes, so you aren’t sure when you get the product for testing whether you’re getting the product the security expert wanted, or whether it is what you were expecting before she got involved.

4. When your testing phase is about to begin, the project leader asks to “throw in” a usability test. You get the test results, but no one knows what, if anything, to do about them.

5. When your testing phase is going on, the project leader asks to “throw in” performance testing. You get the test results, but no one knows what, if anything, to do about them.

6. The project is complete, and you’re wrapping things up to hand it off to support when they ask, “What do we do when we need to fix user data on the fly?” You suddenly realize that, even though that’s something that gets done all the time in the old application, no one built a safe and easy way to do it in the new application.

All of these are examples of missed requirements.

1. This was a missed functionality requirement. Your project team had the right customer, and they told you what they wanted, but someone forgot to write it down, or write the code, somewhere along the line.

2. This is a combination of missing functionality requirements and missing documentation and training requirements. In this case, your project team did not have the right customers at the beginning of the project.

3. This was a missed security requirement. Security cannot be effectively “built in” after the functionality – it has to be designed in. Losing sight of that at the beginning of the project leads to situations like this.

SLIDE 30

4. This was a missed usability requirement. The project manager should anticipate these at the beginning of the project and plan for it. Usability testing could be done earlier, with the prototype, to allow suggested changes to be included in the final product.

5. This was a missed performance requirement. If no performance standard exists, either it needs to be created and tested for/monitored against, or no testing is required. A descriptive performance test, while interesting, can’t “pass” or “fail” because there is no criterion to judge it against.

6. This was a missed supportability requirement. The team or teams who support the application should be consulted at the beginning of the project to ensure these don’t get missed. In addition to preventing missed requirements, this improves rapport with the people who have to support your application. Bad rapport with this group could lead to poor support for the application, or even bad-mouthing your application to the customers when they have to make changes.

There are many reasons a requirement could be missed:

  • Process failure: maybe your methodology doesn’t capture a certain category of requirements.
  • Training failure: maybe your project team doesn’t know how to use the methodology correctly.
  • Specific project failures:
      • Timeline pressures caused the team to rush through steps
      • Political pressures prevented honest discussion of some issues
      • High turnover within the project team caused the team to lose focus
      • Personality issues between team members hindered discussions

Some specific actions that cause requirement failure include:

  • Not talking to the users
  • Talking to the wrong users
  • Not talking to ALL the users
  • Not writing down requirements
  • Not keeping track of the requirements
  • Not having open reviews of the requirements
  • Not having a prototype when a prototype is appropriate
  • Not taking notes during prototype reviews
  • Not asking questions when things are unclear
  • Not considering the effects of all decisions
SLIDE 31

  • Not listening to feedback
  • Not asking about training/documentation/usability/performance/security/supportability requirements
  • Allowing design free-for-alls without any review
  • Keeping people who disagree with each other from discussing the requirements
  • “Throwing in” usability or performance tests without a plan
  • “Throwing in” designs for performance or security after construction has begun
  • Not considering security or legal concerns

The only thing that can prevent missed requirements is a very strong project manager who anticipates problems and avoids them, but those are very rare, indeed. The next best thing is if everyone on the project team, including the QA lead, takes it upon himself to assist the project manager. How you can best do so depends on your project manager’s preferences and your own, but here are some suggestions that have worked for me:

  • Show interest in your project manager’s methodology. Look into the methodology to find out whether there are things that aren’t being addressed early enough in the project plan, and talk to your project manager about it. Maybe they had to put this together themselves, and forgot about the need for security, or usability tests. Maybe they brought the methodology with them from another job where some of the test factors were significantly different. Maybe they inherited the methodology and want to replace it. You won’t know unless you ask.
  • Bring all requirement issues to the attention of the project manager first before attempting to get them resolved. No one likes to be ambushed by issues.
  • Find a project management checklist from a project management source like http://www.gantthead.com/ and use it to evaluate the requirements you get and the project milestones. Some of the lists are already phrased as questions to ask your customers. They are all good for their purposes; you need to pick one that works for you and for your project. (Checklists for a project to build a hardware device, a client application, and an e-commerce site, for example, may differ greatly.)
  • Remember, the project leader has the same goal as you on this: they want “SMART” requirements – Specific, Measurable, Achievable, Realistic, and Timely. You’re going to care about the “specific and measurable” part more than anyone else on the team, so use your insight to help your project leader.

SLIDE 32

Communicating requirement issues clearly with test cases

There are two ways I have been able to communicate ambiguous or missing requirements through test cases. However, neither of them would work if I did not sit down to review my test cases with the customers and project manager. I consider the test cases to be the last step in design in which these requirement issues can be effectively resolved, so these methods are designed to encourage the rest of the project team to take the issues as seriously as I do.

Method One: highlight in test cases. If your test case or test plan document has a place to indicate issues, assumptions, or risks, use this to raise the issues you have spotted with the requirements and discuss briefly what the potential impact would be of failing to specify the requirements. An example:

Notes/Assumptions/Limitations common to this set:

  • This assumes that the QA data set will have certain records in place and certain records absent when delivered for testing. If these records are not as expected, the test data used can be modified.
  • The first increment will be delivered without a security layer that restricts transactions.
  • The screen layouts have not been communicated at this time. These test cases will have to be modified for later regression testing to include the necessary screen shots and field labels.
  • The user has not requested consistency as a requirement, and no UI standards are in place. Therefore, test cases A.A.2af and A.A.2ag will not be tested for this project.
  • A separate usability test will be arranged for this functionality.
  • Performance standards have not been set for these functions, and will not be tested.

SLIDE 33

Method Two: force a decision.

Test Case Identifiers: A.A.1.1f – User in multiple sessions is ___________ (denied access? Notified? Ignored? Shot?)
Risk Factor: C (Discretionary)
Associated Use Case: A.A.1.1 Access Agency System Internally
Associated Data: none
Assumptions & Dependencies: Security System has been updated with user info
Scenario: User logs in multiple times

Step | Action | Initial Screen | Data | Expected Results
1 | Type user name into username field | I-Login | “goodguy” | User name is displayed in username field
2 | Type password into password field | I-Login | “teriyaki” | Eight asterisks are displayed in the password field
3 | Submit the user info | I-Login | {Enter} or Click “Submit” | Intranet Applications Main Page is displayed
4 | Enter the Agency System | Intranet Apps Main Page | Select “Agency” | Agency Search page is displayed
5 | Repeat the four steps above this one three times in separate browser sessions | | |
6 | Type user name into username field | I-Login | “goodguy” | User name is displayed in username field
7 | Type password into password field | I-Login | “teriyaki” | Eight asterisks are displayed in the password field
8 | Submit the user info | I-Login | {Enter} or Click “Submit” | User is notified / permitted to continue / denied access / shot / electrocuted / insulted / given $1 million?
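If your team automates tests, the same forcing move works in code: pin the undecided behavior down as an explicit, loudly documented assumption that reviewers must confirm or overturn. The `SessionManager` below is a hypothetical stand-in, not the real Agency System, and the “denied” outcome is exactly the assumption being forced.

```python
class SessionManager:
    """Hypothetical stand-in for the application's session layer."""

    def __init__(self):
        self.active = {}   # username -> open session count

    def login(self, user):
        if self.active.get(user, 0) >= 1:
            return "denied"   # implements the ASSUMED behavior noted below
        self.active[user] = self.active.get(user, 0) + 1
        return "ok"

# ASSUMPTION (requirements are silent): a second concurrent login by the
# same user is DENIED. Reviewers: confirm or correct before testing begins.
def test_user_in_multiple_sessions():
    sessions = SessionManager()
    assert sessions.login("goodguy") == "ok"
    assert sessions.login("goodguy") == "denied"
```

When the team reviews this test, the assumption comment plays the same role as the blank line in the written test case: it cannot be approved without someone making the decision.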

Why does this work?

  • It makes it clear that a decision needs to be made
  • The context of a test case makes it clear why you need to know the outcome
  • It is disarming; the humor in the situation often gets past peoples’ defenses
  • It focuses the group on what happens to the user instead of how the system performs a task.

Thank you all for your time. Please feel free to send me any requirements stories, questions, or rants at tmartinm@texasmutual.com or stop by and visit me at Texas Mutual Insurance Company when you’re in Austin, Texas.