Software Measurement: Massimo Felici and Conrad Hughes



SLIDE 1

Software Measurement

Massimo Felici and Conrad Hughes

mfelici@staffmail.ed.ac.uk
conrad.hughes@ed.ac.uk
http://www.inf.ed.ac.uk/teaching/courses/sapm/

Slides: Dr James A. Bednar

SAPM Spring 2009: Software Measurement 1

SLIDE 2

Why Measure?

If we want to make reasonable decisions about projects, we have to gather some sort of measurement data on which to base those decisions. However, making such measurements is extremely time-consuming, and therefore expensive. It is also extremely difficult (impossible?) to get unambiguous data. Thus it is crucial to decide what to measure about your projects, and what you will do with that information.


SLIDE 3

Identifying Issues

How do you figure out what to measure? Some things you obviously must measure to have any idea what you are doing:

  • Project constraints (e.g. you need to know if you are going over budget)
  • External requirements/Product acceptance criteria (you need to demonstrate that requirements are met)

Others are based on analysis of what risks you face in this project, what has gone wrong in previous projects, etc.


SLIDE 4

Issues That Can Be Measured

  1. Schedule: Can we expect it to be done on time?
  2. Cost: Can we afford to finish this project, or will it end up costing more than it is worth?
  3. Size: How big is the product so far? Is the scope stable?
  4. Quality: Is the product being made well, with few bugs?
  5. Ability: How much design/coding/debugging/etc. can this team do per month?
  6. Performance: Is the program fast enough, using reasonable resources?

Most of these interact strongly with the others.


SLIDE 5

Issues 1. Schedule

What you want to know:
  • Is progress being made?
  • Is work being done?

What you can measure:
  • Dates of milestone delivery
  • Components completed
  • Requirements met
  • Paths tested
  • Problem reports resolved
  • Reviews completed
  • Change requests completed


SLIDE 6

Issues 2. Cost

What you want to know:
  • How much is it demanding of our staff?
  • Are we getting our money's worth?
  • Is the project making good use of external resources?

What you can measure:
  • Total effort
  • Number of staff involved
  • Staff experience levels
  • Staff turnover
  • Earned value
  • Cost
  • Availability dates (too early, late?)
  • Resource utilization
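Earned value ties the cost measures above to schedule progress. As a sketch (the formulas are the standard earned-value variances and indices, not something specific to these slides, and the figures are invented for illustration):

```python
# Standard earned-value formulas; all figures here are invented.
def earned_value_metrics(bcws, bcwp, acwp):
    """bcws: budgeted cost of work scheduled (planned value)
    bcwp: budgeted cost of work performed (earned value)
    acwp: actual cost of work performed"""
    return {
        "cost_variance": bcwp - acwp,      # negative means over budget
        "schedule_variance": bcwp - bcws,  # negative means behind schedule
        "cpi": bcwp / acwp,                # cost performance index
        "spi": bcwp / bcws,                # schedule performance index
    }

# Hypothetical project status: CPI and SPI both below 1 means the
# project is over budget and behind schedule.
m = earned_value_metrics(bcws=100_000, bcwp=80_000, acwp=95_000)
```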


SLIDE 7

Issues 3. Size

What you want to know:
  • How large is this program so far?
  • How much does this program accomplish so far?

What you can measure:
  • Lines of code
  • Number of components
  • Words of memory
  • Database size
  • Requirements met
  • Function points
  • Change requests completed


SLIDE 8

Issues 4. Quality

What you want to know:
  • How reliable is the software?
  • How hard was it to fix the bugs?

What you can measure:
  • Problem reports
  • Defect density
  • Failure interval
  • Rework size
  • Rework effort
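Two of these quality measures have simple arithmetic definitions. A minimal sketch using the common conventions (defects per KLOC, mean gap between recorded failures); the numbers are invented:

```python
# Illustrative quality arithmetic; all figures are invented.
def defect_density(defects, loc):
    """Defects per thousand lines of code (KLOC)."""
    return defects / (loc / 1000)

def mean_failure_interval(failure_times):
    """Average gap between successive failure timestamps."""
    gaps = [later - earlier for earlier, later
            in zip(failure_times, failure_times[1:])]
    return sum(gaps) / len(gaps)

density = defect_density(42, 12_000)               # defects per KLOC
interval = mean_failure_interval([0, 10, 30, 90])  # hours between failures
```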


SLIDE 9

Issues 5. Ability

What you want to know:
  • Is the development process well managed?
  • How productive is this team?

What you can measure:
  • Capability Maturity Model level
  • Code size / effort
  • Functional size / effort


SLIDE 10

Issues 6. Performance

What you want to know:
  • Is the program fast enough?
  • Are the resources required by the program acceptable?

What you can measure:
  • Cycle time
  • Response time
  • CPU utilization
  • I/O utilization
  • Memory utilization


SLIDE 11

Prioritizing Issues Example

Issue                      Probability of occurrence   Relative impact   Project exposure
Aggressive schedule        1.0                         10                10
Unstable requirements      1.0                         8                 8
Staff experience           1.0                         5                 8
Reliability requirements   0.9                         3                 4
COTS performance           0.2                         9                 1

Only do significant measurement work on issues whose exposure is high; any extra work you give your coders collecting useless quantities will put all your data at risk.
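The ranking above can be automated. Note that the slide's own exposure values do not all equal probability times impact (it may be a rank), so the sketch below uses the plain product and a hypothetical threshold for "exposure is high":

```python
# Risk-exposure ranking sketch: exposure computed as probability * impact
# (the slide's exposure column may instead be a rank); threshold is hypothetical.
issues = [
    ("Aggressive schedule",      1.0, 10),
    ("Unstable requirements",    1.0,  8),
    ("Staff experience",         1.0,  5),
    ("Reliability requirements", 0.9,  3),
    ("COTS performance",         0.2,  9),
]

ranked = sorted(((name, prob * impact) for name, prob, impact in issues),
                key=lambda pair: pair[1], reverse=True)

THRESHOLD = 4.0  # hypothetical cut-off for "exposure is high"
to_measure = [name for name, exposure in ranked if exposure >= THRESHOLD]
```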


SLIDE 12

Making a Measurement Plan

  • Issues and measures
  • Data sources
  • Levels of measurement
  • Aggregation structure
  • Frequency of collection
  • Method of access
  • Communication and interfaces
  • Frequency of reporting


SLIDE 13

Limitations 1

  • Most measures are misleadingly precise, yet not very accurate
  • Size doesn't map directly to functionality, complexity, or quality
  • Incremental design requires measuring incomplete functions
  • The most meaningful software statistics are time-consuming to collect


SLIDE 14

Limitations 2

  • Many measures only apply after coding has been done
  • Performance and resource utilization may only be known after integration and testing
  • Often no distinction between work and re-work
  • Milestones don't measure effort, only give critical paths
  • Time lag between problems and their appearance in reports


SLIDE 15

Limitations 3

  • Difficult to compare relative importance of measures
  • Important measures may be spread across components
  • Hard to find reliable historical data to compare with
  • Changes suggested by one performance indicator may affect others
  • Overall Capability Maturity Model level may not predict performance on a specific project


SLIDE 16

Checking Your Data

  • Are units of measure comparable (e.g. lines of code in Java versus Python)? Normalization?
  • What are acceptable ranges for data values?
  • Can we tolerate gaps in data supplied?
  • How far does a value have to be from our plan for us to need a new plan?
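These checks can be turned into simple validation rules. A minimal sketch; the range, plan, and tolerance values are assumptions for illustration, not figures from the slides:

```python
# Validation sketch for one measurement data point; all thresholds are
# hypothetical and would come from the project's measurement plan.
def check_measurement(value, low, high, plan, replan_tolerance):
    """Return a list of issue strings (empty list means the value looks fine)."""
    if value is None:
        return ["missing value"]  # a gap in the supplied data
    issues = []
    if not (low <= value <= high):
        issues.append(f"out of acceptable range [{low}, {high}]")
    if abs(value - plan) / plan > replan_tolerance:
        issues.append("deviates enough from plan to trigger replanning")
    return issues

ok = check_measurement(90, low=0, high=200, plan=100, replan_tolerance=0.25)
bad = check_measurement(140, low=0, high=200, plan=100, replan_tolerance=0.25)
gap = check_measurement(None, low=0, high=200, plan=100, replan_tolerance=0.25)
```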


SLIDE 17

Indicator 1. Design Progress

[Chart: number of units completing design over project time, planned versus actual]

With an indicator and a plan, you can see if you are on track.
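The planned-versus-actual comparison behind such an indicator is simple arithmetic. A sketch with invented monthly figures and an assumed 20% tolerance for "off track":

```python
# Planned vs. actual units completing design, cumulative per month
# (the numbers are invented for illustration).
planned = [5, 12, 20, 30, 42]
actual  = [4, 10, 15, 22, 28]

# Variance per period: negative values mean we are behind plan.
variance = [a - p for p, a in zip(planned, actual)]

# Months where the shortfall exceeds an assumed 20%-of-plan tolerance.
behind = [i for i, (p, a) in enumerate(zip(planned, actual))
          if p - a > 0.2 * p]
```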


SLIDE 18

Indicator 2. Effort

[Chart: staff hours per month over project time, actual versus planned]


SLIDE 19

Estimator: Size-Effort

[Chart: staff-months (log scale) versus number of lines of source code (log scale), with upper and lower 95% prediction bounds]

With enough data, you can try to predict future performance.
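The size-effort estimator is a regression of effort on size with both axes on log scales. A sketch fitting ordinary least squares to log-transformed data (it omits the slide's 95% bounds); the historical project figures are invented:

```python
import math

def fit_loglog(sizes, efforts):
    """Ordinary least squares on log-transformed data:
    log(effort) = a + b * log(size)."""
    xs = [math.log(s) for s in sizes]
    ys = [math.log(e) for e in efforts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def predict(a, b, size):
    """Estimated effort for a given size, back-transformed from log space."""
    return math.exp(a + b * math.log(size))

# Hypothetical historical projects: (lines of code, staff-months).
history = [(10_000, 20), (20_000, 45), (40_000, 100), (80_000, 230)]
a, b = fit_loglog([s for s, _ in history], [e for _, e in history])
estimate = predict(a, b, 50_000)  # staff-months for a 50 KLOC project
```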


SLIDE 20

Summary

  • Measurement is time-consuming, difficult, and impossible to do perfectly
  • You need to choose what you want to find out, and how to approach measuring that
  • Always be aware of the limitations of the measurement and of how it relates to what you really want to know
  • Be careful when trying to relate past performance to the future


SLIDE 21

Required Readings

  • S.L. Pfleeger, R. Jeffery, B. Curtis, B. Kitchenham. Status Report on Software Measurement. IEEE Software, March/April 1997.
  • B. Clark. Eight Secrets of Software Measurement. IEEE Software, September/October 2002.
  • J. Boegh, S. De Panfilis, B. Kitchenham, A. Pasquini. A Method for Software Quality Planning, Control, and Evaluation. IEEE Software, March/April 1999.
  • J. Clapp. Getting Started on Software Metrics. IEEE Software, January 1993.


SLIDE 22

Suggested Readings

  • S.L. Pfleeger. Software Metrics: Progress after 25 Years? IEEE Software, November/December 2008.
  • R.J. Offen, R. Jeffery. Establishing Software Measurement Programs. IEEE Software, March/April 1997.
