
Software Reviews

Fernando Brito e Abreu (fba@di.fct.unl.pt) Universidade Nova de Lisboa (http://www.unl.pt)

QUASAR Research Group (http://ctp.di.fct.unl.pt/QUASAR)

Software Engineering / Fernando Brito e Abreu 2 6/28/2005

ABSTRACT

  • WHAT IS A REVIEW ?
  • WHY REVIEWS ?
  • DEFECT CLASSIFICATION
  • SCHEDULE AND DURATION
  • REVIEW TYPES
  • WALKTHROUGHS
  • INSPECTIONS
  • THE FINAL REPORT
  • DEFECT PREVENTION

What is a Review ?

  • An evaluation of a deliverable, performed by team members other than the author
  • A validation technique (e.g.: detection of discrepancies with the requirements specification)
  • A verification technique (e.g.: detection of non-conformities with adopted standards)


Why Reviews ?

To counteract perceptive mismatch: our inability to deal with detailed information chunks and loosely defined complex interactions.

George Miller, "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information", Psychological Review, n.63, p.81-97, 1956.


Why Reviews ?

To share responsibility: in most engineering areas (e.g. Civil or Mechanical), projects are signed both by the author and a reviewer, who thus become co-responsible.


Why Reviews ?

  • Defects found earlier cost less to remove
  • Better visibility and control of the development process
  • Continuous learning
  • Positive impact on teamwork
  • Easier maintenance
  • Less harm when somebody leaves a project
  • Considerable reduction of testing effort and remaining defects [Freedman90]


Schedule and Duration

Schedule:

  • Immediately after the conclusion of a given stage of an identified deliverable
  • Included in the Project Plan and performed regularly and repetitively
  • No immunity!

Duration:

  • Depends on the deliverable's size and complexity
  • Hint: use a fixed percentage of the time spent developing it
  • [Britcher88] proposes that reviews should not exceed 2 hours
  • Short reviews (e.g.: 1 h), done often, are preferable
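These duration heuristics can be sketched as a small helper. Note the 10% default fraction is an illustrative assumption, not a figure from the slides; only the 2-hour ceiling comes from [Britcher88].

```python
def review_duration_hours(dev_hours, fraction=0.10, cap_hours=2.0):
    """Estimate review duration as a fixed fraction of development time,
    capped at the 2-hour ceiling proposed by [Britcher88].
    The 10% default fraction is an illustrative assumption."""
    return min(dev_hours * fraction, cap_hours)

# A deliverable that took 12 hours to develop -> short review
print(round(review_duration_hours(12), 2))  # 1.2
# A 40-hour deliverable hits the 2-hour ceiling
print(review_duration_hours(40))            # 2.0
```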


What to Review?

  • All kinds of deliverables are eligible
  • If possible we should review everything, but often we cannot afford that extra effort, so we must select a sample
  • Sampling has a statistical connotation: how should we sample the material to review?


What to Review?

Shall we select:

  • a random sample (as in other areas of industry)?
  • a representative sample:
      • of best practices?
      • of average practices?
      • of worst practices?

The aim is to maximize review efficiency!

Complexity metrics can (and should) be used ...
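A minimal sketch of such metric-driven sampling: rank the candidate deliverables by a complexity metric and review the riskiest ones first. The module names and complexity values below are made up for illustration.

```python
def select_sample(modules, sample_size):
    """Return the `sample_size` module names with the highest complexity,
    so the review effort targets the likeliest defect hot spots."""
    ranked = sorted(modules, key=lambda m: m["complexity"], reverse=True)
    return [m["name"] for m in ranked[:sample_size]]

# Hypothetical modules with e.g. McCabe cyclomatic complexity values
modules = [
    {"name": "parser.c",    "complexity": 42},
    {"name": "ui_main.c",   "complexity": 11},
    {"name": "scheduler.c", "complexity": 37},
    {"name": "logging.c",   "complexity": 5},
]
print(select_sample(modules, 2))  # ['parser.c', 'scheduler.c']
```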


Review Types

REVIEWS
  • Informal
  • Formal
      • Internal: Walkthroughs, Inspections
      • External: Demonstrations, Audits


Walkthroughs

  • Reviews in which one participant presents the analyzed deliverable
  • Usually the presenter is the author (though not compulsorily)
  • All participants try to identify defects and non-conformities
  • No distinct role is assigned to the participants (except the presenter)
  • Detail and speed depend on the presenter's preparation
  • This type of review should not be long, and therefore neither should the sample being reviewed


Walkthroughs

Bad aspects:

  • Lack of formality often leads to uncontrolled situations
  • Less effective and less efficient than Inspections
  • Defects found are often neither classified nor recorded

Good aspects:

  • Small cost (no preparation time is required ...)
  • Not very intrusive
  • Applicable in small organizations
  • Can bootstrap Inspections!


Inspections (Peer Reviews)

  • Initially proposed by M. E. Fagan at IBM [Fagan76] and matured during a decade of usage [Fagan86]
  • An effective technique for defect detection
  • A formal review technique with well-defined input and output requirements


Inspections

  • A group of participants is nominated
  • The deliverable under review, as well as related ones, must be made available beforehand (e.g.: 3 or 4 days in advance)
  • Participants must be familiar with inspection procedures: specific training is required to be able to participate
  • Each participant has a well-defined role: moderator, reader, producer and recorder


Inspections - Moderator

  • Directs the meeting, registering attendance and individual preparation times
  • Avoids deviations during the inspection (e.g.: the tendency to propose solutions)
  • Avoids or resolves conflicts
  • Must adopt a non-authoritarian role
  • Registers the total meeting duration
  • Submits results to the defect tracking system
  • Decides on the meeting follow-up


Inspections - Reader

  • Divides (before the meeting) the reviewed deliverable into small sections for presentation purposes
  • Describes each section, with adequate granularity, during the meeting
  • Sets the inspection pace, adopting a rhythm that is neither too fast nor too slow (example: 200 documented LOCs per hour)


Inspections - Producer (author)

  • Ensures that the deliverable to be reviewed, as well as inter-related ones, is available to all participants in advance
  • During the meeting, contributes his/her particular knowledge of the deliverable being reviewed to resolve emerging doubts
  • After the meeting, must guarantee that the defects found are removed (another person can be assigned)


Inspections - Recorder

Responsible for the synthesis of the opinions expressed by all participants.

Registers the following:

  • physical location of the defects found (note: that is why all material should be printed with line numbers)
  • description of each defect
  • category of each defect (from a checklist)
  • root cause of each defect (if identified)
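One possible way to structure the recorder's entries, sketched from the list above; the field names are an assumption for illustration, not a standard schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DefectRecord:
    """A single entry in the recorder's log (hypothetical field names)."""
    location: str                     # physical location: document + line number
    description: str                  # what is wrong
    category: str                     # taken from the inspection checklist
    root_cause: Optional[str] = None  # filled in only if identified

d = DefectRecord(location="design.doc:120",
                 description="interface signature disagrees with spec",
                 category="interface")
print(d.category)    # interface
print(d.root_cause)  # None
```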


Inspections - All participants

  • Before the meeting, try to identify defects, conformance to internally adopted standards, and synchronic traceability
  • Only report a given defect when the reader goes through the corresponding section of the deliverable being reviewed
  • Help to classify the defects found and identify their root causes
  • Sign the participation document


Inspection role accumulation

              Reader   Recorder   Author
  Moderator    Yes       Yes       No
  Reader        -        No        No
  Recorder      -         -        No
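The accumulation rules can be encoded as a set of allowed role pairs: only moderator+reader and moderator+recorder may be combined, and the author never accumulates another role. A minimal sketch:

```python
# Allowed role combinations, taken from the accumulation table above.
ALLOWED_PAIRS = {
    frozenset({"moderator", "reader"}),
    frozenset({"moderator", "recorder"}),
}

def can_accumulate(role_a: str, role_b: str) -> bool:
    """True if one participant may hold both roles in the same inspection."""
    return frozenset({role_a, role_b}) in ALLOWED_PAIRS

print(can_accumulate("moderator", "reader"))  # True
print(can_accumulate("author", "recorder"))   # False
print(can_accumulate("reader", "recorder"))   # False
```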


The Final Report

In all types of review a final report must be produced. It should include:

  • detected defects
  • defect classification
  • meeting date
  • meeting duration
  • participants
  • total preparation time
  • estimated time/effort for removing the detected defects


The Final Report (continued)

The final report plays a very important role for all participants in the development process:

  • project manager
  • client
  • programmer
  • quality assurance team

Data collection and report generation should be computer-supported!
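A minimal sketch of such computer-supported generation: collect the fields the final report requires and render them as plain text. The dictionary keys are an assumption for illustration.

```python
def final_report(meeting: dict) -> str:
    """Render a plain-text final report from collected review data."""
    lines = [
        f"Review date: {meeting['date']}",
        f"Duration: {meeting['duration_min']} min",
        f"Participants: {', '.join(meeting['participants'])}",
        f"Total preparation time: {meeting['prep_min']} min",
        f"Defects found: {len(meeting['defects'])}",
    ]
    for d in meeting["defects"]:
        lines.append(f"  - [{d['category']}] {d['description']}")
    return "\n".join(lines)

report = final_report({
    "date": "2005-06-28",
    "duration_min": 60,
    "participants": ["moderator", "reader", "author", "recorder"],
    "prep_min": 240,
    "defects": [{"category": "standards",
                 "description": "naming rule violated"}],
})
print(report)
```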


Inspections - Golden Rules

  • Inspections are for peers: upper management representatives should not be present!
  • Solutions for defect removal should not be sought!
  • The product, not the producer, is being reviewed!
  • Checklists should be used for defect mining (e.g. JavaCheck)
  • The inspection team becomes, as a whole, co-responsible for the quality of the deliverable
  • Remaining defects must be assumed by the team


Inspection Costs

FIXED:

  • Planning (schedule/staff)
  • Setting sampling criteria
  • Setting a defect and fault classification scheme
  • Form development
  • Building / selecting tools
  • Training

FOR EACH MEETING:

  • Organizing effort (selecting the "sample", contacting participants)
  • Meeting room
  • Preparation effort
  • Meeting effort
  • Reporting the defects found
  • Correction effort (?…)


Inspection Benefits (revisited)

  • Better teamwork / motivation
  • Exchange of experience / knowledge between the participants (acts as actual training)
  • Identification of specific training needs
  • A break in the monotony (by exchanging roles)
  • An incentive to increase performance (make fewer errors than the average)
  • Promotion of a Quality Culture


Inspection Benefits (revisited)

  • Standards enforcement
  • Reduction in the defect rate and in V&V costs
  • Increased understanding / visibility of ongoing projects
  • Reduction of personnel turnover
  • A database of inspection results, on which defect prevention actions (causal analyses) can be based

Several of these benefits are not tangible!


Cost/Benefit ratio

(with and without reviews)

  PHASE                                  Cost per   WITH REVIEWS      NO REVIEWS
                                         defect     # Def.   Cost    # Def.   Cost
  Analysis and Design                       1.5       22       33       -       -
  Coding / Modular testing                  6.5       36      234      22      143
  Pre-installation/Integration testing     15.0       21      315      82     1230
  Production                               67.0        3      201      12      804
  Totals                                              82      783     116     2177

Note: a cost reduction of 64% occurred!

Source: [Pressman97]
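The totals and the 64% cost reduction in [Pressman97]'s data can be checked with a few lines:

```python
# Per-phase (defect count, cost) pairs from the table above.
with_reviews = {
    "Analysis and Design": (22, 33),
    "Coding / Modular testing": (36, 234),
    "Pre-installation/Integration testing": (21, 315),
    "Production": (3, 201),
}
no_reviews = {
    "Coding / Modular testing": (22, 143),
    "Pre-installation/Integration testing": (82, 1230),
    "Production": (12, 804),
}

cost_with = sum(cost for _, cost in with_reviews.values())   # 783
cost_without = sum(cost for _, cost in no_reviews.values())  # 2177
reduction = 1 - cost_with / cost_without
print(f"{reduction:.0%}")  # 64%
```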


The Human Factor

Like all other Software Quality Improvement activities and techniques, reviews may be:

  • considered a threat to individual "freedom"
  • banned with the excuse of being an overhead

Countermeasures:

  • Publicize defect-detection efficiency numbers as soon as possible!
  • Recognize and reward the work of the best reviewers!


Variants of / Critiques of Fagan Inspections

  • Simple Review (author plus one other person): usually adopted in other fields of Engineering
  • Active Design Reviews [Parnas87]: group meetings can be skipped (lower cost)
  • The Meeting Gain (MG) must then be calculated:
      MG = defects found during the meeting / all defects found
      MG averages 33%, but with high variance [Porter95]
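The Meeting Gain ratio defined above is a one-line calculation; the defect counts below are invented for illustration.

```python
def meeting_gain(found_in_meeting: int, found_total: int) -> float:
    """Fraction of all detected defects that surfaced only during the
    group meeting (MG, as defined above)."""
    return found_in_meeting / found_total

# e.g. 10 of 30 defects found in the meeting itself -> MG = 1/3,
# close to the 33% average reported by [Porter95]
print(round(meeting_gain(10, 30), 2))  # 0.33
```

A low MG is the argument for skipping the meeting: most defects were already found during individual preparation.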


Variants of / Critiques of Fagan Inspections (continued)

  • Phased Inspections, using the InspeQ toolset [Knight91] (Department of Computer Science, University of Virginia)
  • Cost-benefit assessment of alternatives [McCarthy96]: AT&T Bell Labs / University of Maryland experiments. The inspection alternatives compared were:
      • PI: Preparation-Inspection (1st round is for understanding)
      • DC: Detection-Collection (1st round emphasizes detection)
      • DD: Detection-Detection (2 rounds with no meeting)
    The DD alternative had the best cost-benefit ratio!


Review Tools Web Links

  • InstantQA (http://www.reasoning.com/)
  • ReviewPro (http://www.sdtcorp.com/)
  • CheckMate (http://www.sybernet.ie)
  • Leap (http://csdl.ics.hawaii.edu/Tools/LEAP/LEAP.html)
  • CSRS (http://www.ics.hawaii.edu/~csdl/csrs.html)
  • Assist (http://www.cs.strath.ac.uk/research/efocs/assist.html)


Relevant standards

  • ANSI/IEEE Std 1012: "Standard for Software Verification and Validation Plans"
  • ANSI/IEEE Std 1028: "Standard for Software Reviews and Audits"
  • DoD Mil Std 1521B: "Technical Reviews and Audits for Systems, Equipments and Computer Software"


Bibliography (1)

[Ackerman89] Ackerman, A. Frank & Buchwald, Lynne S. & Lewski, Frank H.: "Software Inspections: An Effective Verification Process", IEEE Software, 6(3), p.31-36, May 1989.
[Britcher88] Britcher, Robert N.: "Using Inspections to Investigate Program Correctness", IEEE Computer, p.38-44, November 1988.
[Collofello88] Collofello, James S.: "The Software Technical Review Process", Curriculum Module SEI-CM-3-1.5, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, June 1988.
[Cross88] Cross, John A. (ed.): "Support Materials for the Software Technical Review Process", Support Materials SEI-SM-3-1.0, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, April 1988.
[Fagan76] Fagan, Michael E.: "Design and Code Inspections to Reduce Errors in Program Development", IBM Systems Journal, 15(3), March 1976.
[Fagan86] Fagan, Michael E.: "Advances in Software Inspections", IEEE Transactions on Software Engineering, 12(7), p.744-753, July 1986.
[Freedman90] Freedman, Daniel P. & Weinberg, Gerald M.: "Handbook of Walkthroughs, Inspections and Technical Reviews: Evaluating Programs, Projects and Products" (3rd edition), Dorset House Publishing, NY, ISBN 0-932633-19-6, 1990.
[Gale90] Gale, J. L. & Tirso, J. R. & Burchfield, C. A.: "Implementing the Defect Prevention Process in the MVS Interactive Programming Organization", IBM Systems Journal, 29(1), p.33-43, 1990.
[Gilb93] Gilb, Tom & Graham, Dorothy: "Software Inspection", Addison-Wesley, ISBN 0-201-63181-4, 1993.


Bibliography (2)

[Parnas87] Parnas, D. L. & Weiss, D. M.: "Active Design Reviews: Principles and Practices", Journal of Systems and Software, nº7, p.259-265, 1987.
[Pressman97] Pressman, Roger S.: "Software Engineering: A Practitioner's Approach" (4th edition), McGraw-Hill, 1997.
[Russell91] Russell, Glen W.: "Experience with Inspection in Ultralarge-Scale Developments", IEEE Software, 8(1), p.25-31, January 1991.
[Strauss94] Strauss, Susan H. & Ebenau, Robert G.: "Software Inspection Process", McGraw-Hill, 1994.
[Votta93] Votta, Lawrence G.: "Does Every Inspection Need a Meeting?", SIGSOFT'93 Symposium on Foundations of Software Engineering, ACM Press, 1993.
[Yourdon89] Yourdon, Edward: "Structured Walkthroughs" (4th edition), Yourdon Press, Prentice-Hall, 1989.
[Hollocker90] Hollocker, Charles P.: "Software Reviews and Audits Handbook", John Wiley, New York, 1990.
[Knight91] Knight, John C. & Myers, E. Ann: "Phased Inspections and Their Implementation", ACM SIGSOFT Software Engineering Notes, 16(3), p.29, July 1991.
[Mays90] Mays, R. G. & Jones, C. L. & Holloway, G. J. & Studinski, D. P.: "Experiences with Defect Prevention", IBM Systems Journal, 29(1), 1990.
[McCarthy96] McCarthy, Patricia & Porter, Adam & Siy, Harvey & Votta, Lawrence G. Jr.: "An Experiment to Assess Cost-Benefits of Inspection Meetings and their Alternatives: A Pilot Study", Third International Software Metrics Symposium, IEEE Computer Society Press, Berlin, March 1996.