WEBINAR: Solving the Process Maturity Challenge
Presented by David Consulting Group (DCG) and MKS, Inc.


SLIDE 1

WEBINAR: Solving the Process Maturity Challenge Presented by David Consulting Group (DCG) and MKS, Inc.

SLIDE 2

Establishing Effective Measurements in Support of the SEI CMMI

SLIDE 3

Terms

  • CMMI - Capability Maturity Model Integration - a set of guidelines for achieving process maturity
  • Level 1-5 - numerical designation of process maturity per the CMMI

SLIDE 4

Terms

Maturity Levels:
  • 5 - Optimizing
  • 4 - Managed
  • 3 - Defined
  • 2 - Repeatable
  • 1 - Initial

SLIDE 5

Presentation Objective

  • To present recommendations for establishing an effective measurement program compliant with the CMMI Level 3 Process Areas
  • Topics include:
  • Steps to developing an effective and practical measurement program
  • An overview of the CMMI measurement requirements for Levels 2 and 3
  • A recommended set of metrics that comply with the model requirements and promote process improvement

SLIDE 6

Establishing a Measurement Program

Steps to developing an effective and practical measurement program

  • Obtain the sponsorship and support of senior management
  • Create an organizational metrics “guru” role
    – e.g., a CMO (Chief Measurement Officer)
  • Agree on the criteria for measurement selection (see following slides) and the business goals to support

SLIDE 7

Establishing a Measurement Program

  • Think through and address implementation requirements
    – e.g., tools, training, data collection and reporting processes, etc.
  • Define and document measurements, current and new, in a Measurement Plan

SLIDE 8

Criteria for Selecting Measurements

  • Measurements should support business goals and needs. For this presentation, we assume the following business needs:
  • Meeting cost, schedule and technical commitments
  • Managing and improving software quality
  • Managing and improving software productivity
  • Managing and improving process effectiveness
  • The measurements must satisfy CMMI Level 3 practices
  • The measurements must be relatively easy to implement and be kept to an effective minimum number

SLIDE 9

Criteria for Selecting Measurements per Watts Humphrey*

  • The measures must be robust
    – i.e., precise and relatively unaffected by minor changes in tools, methods, or product characteristics.
  • The measures should suggest a norm
    – i.e., the meaning of a high or low value should be obvious.
  • The measures should relate to specific product or process properties
    – i.e., errors, size, or resources expended.
  • The measures should suggest an improvement strategy
    – i.e., should indicate what needs to be improved.

* from Managing the Software Process

SLIDE 10

Criteria for Selecting Measurements per Watts Humphrey*

  • The measures should be a natural result of the process.
    – The effort to collect measurements should be kept to a minimum.
  • The measures should be simple.
    – They should not be difficult to explain.
  • The measures should be both predictable and trackable
    – i.e., measures that provide planned versus actual comparisons.

* from Managing the Software Process

SLIDE 11

CMMI Measurement Requirements

CMMI measurement requirements for the Level 2 and 3 Process Areas

SLIDE 12

Level 2 Process Area

Measurement and Analysis

Measurement and Analysis describes the need to develop and sustain a measurement capability that is used to support management information needs.

SLIDE 13

CMMI Measurement Requirements

  • CMMI defines measurement requirements for:

Level 2
  – Measurement and Analysis
  – Project Planning (estimating project attributes)
  – Project Monitoring and Control (monitoring project performance)

Level 3
  – Integrated Project Management
  – Requirements Development
  – Product Integration
  – Verification
  – Validation

SLIDE 14

CMMI Measurement Requirements

  • Measurements are also needed to support CMMI activities:
  • For project monitoring and control, estimates and actuals for:
    – Schedule
    – Size of software work products
    – Effort
    – Cost
    – Critical computer resources (disk, memory, etc.)
  • PPQA audit results
  • Numbers and status of change requests and problem reports against configuration items
  • Historical process data from across the organization
  • Defect data from peer reviews and testing activities
  • Data on the conduct of peer reviews
SLIDE 15

Levels of Data Collection and Reporting

[Diagram: Projects A-D each collect task-level data (Planning, Tracking, CM, Reqs., Design, Code, ...); organizational tasks (EPG, PPQA, Process Def., Audits, Training, ...) are collected alongside them, and everything rolls up into Organizational Measurements.]
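The roll-up in the diagram can be sketched in a few lines. This is an illustrative assumption, not part of the CMMI itself: project and task names, and the hours, are made up for the example.

```python
# Sketch: rolling task-level project data up to organizational
# measurements, as in the diagram above (all names and hours are
# assumed sample data).
from collections import defaultdict

# hours logged per task, per project
project_hours = {
    "Project A": {"Planning": 40, "Tracking": 10, "Code": 300},
    "Project B": {"Planning": 60, "Tracking": 15, "Code": 500},
}

org_totals = defaultdict(float)
for tasks in project_hours.values():
    for task, hours in tasks.items():
        org_totals[task] += hours

print(dict(org_totals))  # {'Planning': 100.0, 'Tracking': 25.0, 'Code': 800.0}
```

The same aggregation applies to any of the task breakdowns shown above (Planning, Tracking, CM, Reqs., Design, Code, and the EPG/PPQA organizational tasks).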

SLIDE 16

CMMI Compliant Metrics

A recommended set of compliant metrics that promote process improvement

SLIDE 17

Recommended CMMI Measures

Measure: Effort – planned vs. actual, including:
  • Development tasks
  • Maintenance
  • Project mgmt. tasks
  • Intergroup tasks
  • Quality Assurance
  • EPG tasks
CMMI PAs: PP SP 1.1, PP SP 1.2, PP SP 1.4, PP SP 2.1, PMC SP 1.1, PMC SP 1.6, PMC SP 1.7, PPQA SP 2.2, OPD SP 1.4, OPF SP 1.3, IPM SP 1.2, IPM SP 1.4
Business Need Supported: Meeting cost and schedule commitments
Justification: To identify and address staffing issues; to support current & future planning
Level of Tracking: Project, Task

Measure: Cost – planned vs. actual
CMMI PAs: PP SP 2.1, PMC SP 1.1, PMC SP 1.6, PMC SP 2.2, IPM SP 1.2, OPD SP 1.4, MA SP 1.2, MA SP 2.1
Business Need Supported: Meeting cost commitments
Justification: To manage cost commitments; to identify & address cost issues as early as possible
Level of Tracking: Project

SLIDE 18

Sample Chart

Project Level Cost Measurement Example

Cost ($K)         Jan-02  Feb-02  Mar-02  Apr-02  May-02  Jun-02
$K - Plan           50     100     150     200     250     275
$K - Actual         25      80     120     175
% Work Complete    15%     30%     40%     50%
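The planned-vs.-actual comparison in the sample chart above reduces to simple arithmetic. A minimal sketch, using the chart's figures; the `cost_variance` helper is an illustrative assumption, not a CMMI-prescribed calculation:

```python
# Sketch: planned vs. actual project cost, in $K, from the sample
# chart above (the variance helper is an illustrative assumption).
plan = {"Jan-02": 50, "Feb-02": 100, "Mar-02": 150, "Apr-02": 200,
        "May-02": 250, "Jun-02": 275}
actual = {"Jan-02": 25, "Feb-02": 80, "Mar-02": 120, "Apr-02": 175}

def cost_variance(plan, actual):
    """Return {month: actual - plan} for months that have actuals."""
    return {m: actual[m] - plan[m] for m in actual}

print(cost_variance(plan, actual))
# Apr-02: 175 - 200 = -25, i.e., $25K under the planned spend to date
```

Note that raw cost variance alone can mislead: the chart also tracks % work complete, and spending under plan while work is also behind plan is not good news.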

SLIDE 19

Recommended CMMI Measures

Measure: Size & Requirements (e.g., Function Points) – estimated vs. current
CMMI PAs: RD SP 2.1, RD SP 3.3, PP SP 1.2, PP SP 1.4, PMC SP 1.1, PMC SP 1.6, OPD SP 1.4, IPM SP 1.2, IPM SP 1.4, IPM SP 1.5, IPM SP 1.6
Business Need Supported: Meeting cost, schedule, and functionality commitments
Justification: To identify and address scope issues, e.g., “reqs. creep”; to support current replanning and future planning; to support productivity
Level of Tracking: Project

Measure: Schedule – planned vs. actual, including: (same as for Effort)
CMMI PAs: PP SP 1.1, PP SP 1.2, PP SP 1.4, PP SP 2.1, PMC SP 1.1, PMC SP 1.6, PMC SP 1.7, PPQA SP 2.2, OPD SP 1.4, OPF SP 1.3, IPM SP 1.2, IPM SP 1.4
Business Need Supported: Meeting schedule commitments
Justification: To identify and address schedule issues
Level of Tracking: Project, Task

SLIDE 20

Sample Chart

Project Level Size Measurement Example

Size (Function Points)  Jan-02  Feb-02  Mar-02  Apr-02  May-02  Jun-02
Plan                      50      50      50      50      50      50
Current                   50      54      54      66
Threshold               62.5    62.5    62.5    62.5    62.5    62.5

Chart annotations: “Plan” is the original size estimate, “Current” is the updated estimate, and “Threshold” is the allowable growth threshold.
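The threshold check behind the chart above can be sketched as follows. The 25% growth allowance is inferred from the chart's values (62.5 = 50 x 1.25) and is an assumption, as are the variable names:

```python
# Sketch: flagging size growth beyond an allowable threshold.
# The 25% allowance is inferred from the chart (62.5 = 50 * 1.25).
original_estimate = 50.0           # function points (Plan)
allowed_growth = 0.25              # assumed allowable growth
threshold = original_estimate * (1 + allowed_growth)   # 62.5 FP

current = {"Jan-02": 50, "Feb-02": 54, "Mar-02": 54, "Apr-02": 66}

breaches = {m: fp for m, fp in current.items() if fp > threshold}
print(breaches)  # {'Apr-02': 66}
```

In the chart, the Apr-02 update (66 FP) crosses the 62.5 FP threshold, which is the point at which “reqs. creep” should trigger replanning.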

SLIDE 21

Recommended CMMI Measures

Measure: Productivity – planned vs. final
CMMI PAs: OPD SP 1.4, IPM SP 1.6, PP SP 1.4, PP SP 2.1, PMC SP 1.1, PMC SP 1.6
Business Need Supported: Meeting cost and schedule commitments
Justification: To support future planning; to analyze process capability
Level of Tracking: Org., Project

Measure: Critical Computer Resources – planned vs. actual
CMMI PAs: PP SP 2.4, PMC SP 1.1, IPM SP 1.2, IPM SP 1.3
Business Need Supported: Meeting technical commitments
Justification: To manage critical hardware requirements
Level of Tracking: Project

SLIDE 22

Recommended CMMI Measures

Measure: Change Requests & Problem Reports
CMMI PAs: CM SP 2.1
Business Need Supported: Managing quality
Justification: To track the backlog of software problems in delivered programs and the plan to fix them
Level of Tracking: Org., Project

Measure: Defects, Peer Review, and Test Data
CMMI PAs: VER SP 2.1, VER SP 2.3, VER SP 3.2, PP SP 1.2, OPD SP 1.4
Business Need Supported: Meeting and improving product quality needs
Justification: To manage, assess and improve product quality
Level of Tracking: Org., Project, Task

SLIDE 23

Recommended CMMI Measures

Measure: Training Effectiveness
CMMI PAs: OT SP 2.3
Business Need Supported: Managing training effectiveness
Justification: To identify and improve aspects of training courses that need improvement
Level of Tracking: Course

Measure: PPQA Audit Results (audits and % of non-compliance findings)
CMMI PAs: PPQA SP 2.2
Business Need Supported: Managing process compliance
Justification: To ensure processes are compliant with documented procedures and standards
Level of Tracking: Org., Project, Procedure and products

Measure: Training – planned vs. actual
CMMI PAs: OT SP 2.2
Business Need Supported: Managing training
Justification: To plan and track training to ensure business needs are met; to manage training cost
Level of Tracking: Org., Project

SLIDE 24

Peer Review Measures Collected

  • Program, function, and work product identifiers
  • Type and phase of review or test, e.g., design inspection or unit test
  • Who attended and how much time was spent preparing (reviews)
  • How long the review meeting lasted (reviews)
  • Size of the work product, e.g., pages of design
  • Total defects detected (by severity)
  • Time spent fixing defects (rework)

SLIDE 25

Peer Review Data

  • For each defect found:
  • Defect type, e.g., missing, wrong, or extra
  • Defect origin, i.e., the phase in which it was inserted
  • Defect severity, i.e., major or minor
  • Defect category (optional), e.g., logic, data, etc.
  • Defect location, e.g., module or program element name
  • Work product ID, e.g., change ID#
  • Type of review or test when found, e.g., Code Inspection, Unit Test, etc.
  • Date closed
  • Time to Fix – the amount of time to fix and revalidate
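The per-defect fields listed above map naturally onto a record type. A minimal sketch; the class name, field names, and sample values are illustrative assumptions, not part of the CMMI or the presenters' tooling:

```python
# Sketch of a per-defect record capturing the fields listed above
# (class and field names are illustrative assumptions).
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DefectRecord:
    defect_type: str                     # e.g., "missing", "wrong", "extra"
    origin_phase: str                    # phase in which it was inserted
    severity: str                        # "major" or "minor"
    location: str                        # module or program element name
    work_product_id: str                 # e.g., change ID#
    found_in: str                        # e.g., "Code Inspection", "Unit Test"
    date_closed: Optional[date] = None
    fix_hours: Optional[float] = None    # time to fix and revalidate
    category: Optional[str] = None       # optional, e.g., "logic", "data"

d = DefectRecord("wrong", "Design", "major", "parser", "CR-104",
                 "Code Inspection", date(2002, 4, 15), 3.5)
```

Capturing origin phase and discovery phase per defect is what makes the insertion/removal analysis on the next slides possible.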

SLIDE 26

Example of Defect Insertion & Removal Profile

Rate = Avg. Defects/KSLOC

                          Reqs   Design   Code   Unit Test   Int Test   Sys Test   Field   Total
Insertion Rate             2.5     7.2    22.7      0.9        0.3        0.1       0.0
Detection Rate             1.0     5.0    17.0      4.0        2.5        2.0       2.0     33.6
Leakage Rate               1.5     3.6     9.3      6.1        3.9        2.0       0.0
Removal Effectiveness      40%     58%     65%      40%        39%        50%      100%
Average | Best in Class*  40%|50% 70%|85% 65%|85%  35%|55%    35%|45%    40%|55%

Peer Reviews: Peer Review Effectiveness -> 71% (Best in Class -> 85+%)
Testing: Total Effectiveness -> 94% (Best in Class -> 99+%)

* from Capers Jones, Software Assessments, Benchmarks and Best Practices, Addison Wesley, 2000

Analysis of insertion rates and defect-removal effectiveness against industry benchmark data provides valuable information about an organization’s software process capability and reveals areas that need improvement.
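The profile's derived rows follow from the insertion and detection rates: each phase's removal effectiveness is defects detected divided by defects present (inserted plus leaked in), and leakage is what remains. A minimal sketch under that reading of the table; the helper name is an illustrative assumption, and small differences from the table's percentages are rounding:

```python
# Sketch: deriving removal effectiveness from the insertion and
# detection rates in the profile above (defects per KSLOC).
phases   = ["Reqs", "Design", "Code", "Unit Test", "Int Test", "Sys Test", "Field"]
inserted = [2.5, 7.2, 22.7, 0.9, 0.3, 0.1, 0.0]
detected = [1.0, 5.0, 17.0, 4.0, 2.5, 2.0, 2.0]

def removal_effectiveness():
    """Fraction of defects present in each phase that the phase removes."""
    eff, leaked_in = [], 0.0
    for ins, det in zip(inserted, detected):
        available = ins + leaked_in        # inserted here + leaked from before
        eff.append(det / available if available else 1.0)
        leaked_in = available - det        # leakage into the next phase
    return eff

eff = removal_effectiveness()              # Reqs phase: 1.0 / 2.5 = 40%

# Peer reviews span Reqs, Design, and Code: defects they detect
# divided by defects inserted through the coding phase.
peer_review_eff = sum(detected[:3]) / sum(inserted[:3])
print(f"Peer review effectiveness: {peer_review_eff:.0%}")  # ~71%
```

This reproduces the table's 71% peer review effectiveness (23.0 detected out of 32.4 inserted through Code), well short of the 85+% best-in-class figure.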

SLIDE 27

Analysis for Continual Improvement

  • At the organizational level, aggregate project data to determine overall process performance, especially for:
    – Effort distribution by task
    – Productivity
    – Defect insertion and removal effectiveness
  • Continually address data integrity issues so the data can be trusted
  • Establish a commitment to quality and process improvement
  • Report regularly to senior management
  • From analysis, create an action plan for improvement
  • Ensure the action plan is sponsored by management and carried out successfully
  • Measure to assess results of improvement actions
SLIDE 28

Questions?

Patricia A. Eglin
Consultant, David Consulting Group
p.eglin@davidconsultinggroup.com