Service Data Quality: When 99.7% Isn't Good Enough
ASQ World Conference on Quality and Improvement, Session T31
May 17, 2016
Jennifer Wilson and Dodd Starbird
SLIDE 1
SLIDE 2
Learning Objectives
- 1. Discuss the value of using quality data for process
and performance control, and understand the critical difference between defects per opportunity (DPO) and defects per unit (defective rate).
- 2. Discuss the change management challenges that
a team overcame in creating visibility and team buy-in to improve performance, and learn how to create a positive culture of human error prevention.
- 3. Integrate key concepts of Lean (visible metrics)
with key concepts of Six Sigma (sampling and process capability), all with the common foundation
of Deming Quality to tie the solution together!
SLIDE 3
Engaging the Team in Quality
Employee Engagement is:
- Every team member actively caring
about the team’s performance and
outcomes for customers
- Working together to improve
- The key to Quality!
It’s a culture, not a program! We took an Engaged Team Performance (ETP) approach…
SLIDE 4
Differentiation: Data-driven Insights
- 1,000s of Utilities
- 100,000s of Facilities
- 1,000,000s of Households
Consumption, Demand, Expense, Capital, Impact
Data-driven insights (demand to impact)
Energy Carbon Water Waste Telecom Lease
SLIDE 5
Why We Needed a Data Quality Project
Data Entry QA data from December 2014:
- 99.7% line item accuracy… only 3 defects
per 1,000 opportunities (3,000 DPMO)
- Data Entry team was only using this number
- But only 88.3% bill-level accuracy!
- 11.7% defective bills (1 or more defects)
- Data Entry leaders had stopped using this
number “because it made people feel bad”
- Estimated impact of 800 downstream people
spending 25% of their time scrubbing and fixing data: 200 FTE!
Sources: Dec 2014 QA files and data user survey Mar 2015
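The gap between these two numbers on the slide follows directly from the DPO math: with many opportunities per unit, a tiny per-opportunity defect rate still compounds into a large defective-unit rate. A minimal sketch, assuming (hypothetically; the deck does not state it) about 41 data-entry fields per bill:

```python
# Why 99.7% line-item accuracy can still mean ~12% defective bills.
# Assumption (hypothetical, not from the deck): ~41 entry fields per bill.
dpo = 0.003                      # defects per opportunity: 3 per 1,000
dpmo = dpo * 1_000_000           # = 3,000 DPMO
fields_per_bill = 41             # hypothetical opportunities per unit (bill)

# Probability a bill contains at least one defect, assuming independence:
defective_rate = 1 - (1 - dpo) ** fields_per_bill
print(f"DPMO: {dpmo:.0f}, defective bills: {defective_rate:.1%}")  # ~11.6%
```

At roughly 41 opportunities per bill, 3,000 DPMO compounds to approximately the 11.7% defective-bill rate the team observed, which is why tracking only the per-opportunity number hid the problem.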
SLIDE 6
Survey Question #1
How does your organization measure quality? Pick the closest answer…
- A. Mostly from the customer perspective (output
quality, final yield, complaints, etc.)
- B. Mostly from an internal perspective (defects
per hundred or million, individual quality, etc.)
- C. Both an internal and an external perspective
- D. We don’t measure quality all that much!
Survey Link #1
SLIDE 7
Data Collection Plan
- Designed two data collection spreadsheets (account & bill)
- Designed and ran proof of concept
- Tested repeatability and reproducibility with 4 collectors
- Re-designed and re-piloted
- Collected population stratification factors for all accounts
(all channels of data entry were in scope)
- Sampled 1,500 accounts, 650 sites, 3 months of bills
- Estimation of defect rates to precision of +/- 3%
- Purpose:
- Validate defect rate; correlate causal factors
- Timing:
- Deployed: week of March 9 (starting with pilot)
- Gathered data with 8 data collectors over a month
- Results analysis: April 13–15, 2015
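As a rough cross-check of the stated +/- 3% precision, the normal-approximation margin of error for a proportion can be computed from the 1,500-account sample. This is a standard formula, not necessarily the exact calculation the team's software used:

```python
import math

# Achieved 95%-confidence margin of error for the 1,500-account sample.
z, n = 1.96, 1500
margins = {}
for p in (0.5, 0.117):           # conservative worst case, and the observed rate
    margins[p] = z * math.sqrt(p * (1 - p) / n)
    print(f"p = {p:.3f}: margin = +/- {margins[p]:.1%}")
```

Even in the worst case (p = 0.5), 1,500 accounts give roughly +/- 2.5%, comfortably within the +/- 3% target.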
SLIDE 8
Measured Data Quality: Stratified Defects
Yields:
- 89.6% Bill Data Entry Quality overall
- 93.3% Bill Quality from EDI (Automated) Data Entry
- 43% Bill Quality on Post-Audit* resolved bills

* Post-Audit bills were a subset that the computer system identified as potentially defective, but were queued for future review with substantial backlogs
SLIDE 9
Data Entry Process Improvements
- Turned off an automated edit resolution (computer-based
data scrubbing) program that was allowing known defects to get deferred instead of fixed (52% of auto-resolved edits were defects!)
- Launched a project to use Optical Character
Recognition (OCR) technology to expand on benefits of Electronic Data Interchange (EDI)
- Trended causes of defects using Quality Control
data and delegated assignments to teams to discuss causes and solutions in daily huddles
- Example key cause and solution: Inactive accounts
- Open inactive accounts cause reporting errors downstream
- Closing inactive accounts was perceived to be a lower-
priority task and was backlogged 30 days on some teams
- Catching up on that backlog has avoided $75,000 in
rework labor costs annually
SLIDE 10
Data Entry Performance Improvements
- Started to communicate “bill-level” quality scores
- Quality Control caught up to real time to produce
weekly team quality scores vs. sharing data just monthly
- Added new quality metrics to team whiteboards
(shown below) to balance efficiency and quality focus
- Sharing "quality stories" in daily huddles
- Quantified cost of quality in same terms as production… "We probably made 3,000 defects yesterday!"
SLIDE 11
Data Entry QC Scores
Monthly Data Entry Quality Scores (chart, y-axis 85%–99%)
SLIDE 12
Data Entry Quality: Before and After
Process and Performance Improved!
- 89.5% Bill Quality in May
- 93.8% Bill Quality in August/September
Shift in performance and training!
Individual QC Scores
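To gauge whether a before/after shift like 89.5% to 93.8% reflects real change rather than sampling noise, a two-proportion z-test is one standard check. The monthly sample sizes are not stated in the deck, so n = 1,500 bills per period is purely an assumption here:

```python
import math

# Two-proportion z-test for the May -> Aug/Sept shift in bill quality.
# ASSUMED sample sizes (the deck does not state them): 1,500 bills per period.
n1 = n2 = 1500
p1, p2 = 0.895, 0.938            # bill quality before and after

p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se
print(f"z = {z:.1f}")            # well above 1.96 -> significant at 95%
```

Under that assumed sample size, the shift is far outside what chance variation would produce, supporting the "shift in performance" conclusion on the slide.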
SLIDE 13
Nik Wallenda
Who is Nik Wallenda?
SLIDE 14
Nik Wallenda
Nik is accountable for his performance.
SLIDE 15
Quality Sampling Principles
- All work should have some chance of being checked (to drive individual accountability)
- But individuals should know the work is likely to go right out the door (no safety net!)
- Sample sizes for checking should be statistically calculated to deliver an appropriate precision of resulting quality measurements by (at least) work type, client, team, and individual
SLIDE 16
Sampling Strategy
Lots of sources for sampling equations. Use software…! One for continuous data, one for discrete…
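The two textbook sample-size equations the slide alludes to, one for continuous data (estimating a mean) and one for discrete data (estimating a proportion), can be sketched as follows; statistical software adds refinements such as finite-population corrections, but these are the core formulas:

```python
import math

def n_continuous(sigma, margin, z=1.96):
    """Sample size to estimate a mean to +/- margin at ~95% confidence,
    given an assumed process standard deviation sigma."""
    return math.ceil((z * sigma / margin) ** 2)

def n_discrete(p, margin, z=1.96):
    """Sample size to estimate a proportion to +/- margin at ~95% confidence;
    p = 0.5 is the conservative worst case."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

print(n_discrete(0.5, 0.03))     # 1068 checks for +/- 3% on a pass/fail score
print(n_continuous(10, 2))       # 97 measurements for +/- 2 units, sigma = 10
```

Note how quickly the discrete (pass/fail) requirement grows: attribute data needs far larger samples than continuous measurements for the same precision, which is why the checking strategy above stratifies by work type, client, team, and individual.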
SLIDE 17
Approach
Collect the minimum sample size per month (usually…), in order to get:
- Appropriate precision by individual for
purposes of an annual review
- Appropriate precision by client for
quarterly reviews and/or required service-level reporting
- Appropriate precision by task type
every month at a team level, for:
- Trending of performance
- Ongoing monitoring
- Root cause analysis of defects
SLIDE 18
Deming’s Point
SLIDE 19
Survey Question #2
Where is your organization’s most significant opportunity to improve your quality measures?
- A. Balancing internal and external quality
measures
- B. Gathering process (root cause) data to
improve process quality at the source
- C. Collecting the right sample size to get the
most effective data as efficiently as possible
- D. All of the above!
- E. None of the above
Survey Link #2
SLIDE 20
Discussion of Learning Objectives
- 1. Discuss the value of using quality data for process
and performance control, and understand the critical difference between defects per opportunity (DPO) and defects per unit (defective rate).
- 2. Discuss the change management challenges that
a team overcame in creating visibility and team buy-in to improve performance, and learn how to create a positive culture of human error prevention.
- 3. Integrate key concepts of Lean (visible metrics)
with key concepts of Six Sigma (sampling and process capability), all with the common foundation
of Deming Quality to tie the solution together!
SLIDE 21
Presenters
Jennifer Wilson
Director, Quality and Business Process Improvement, Ecova Inc.
jgarciawilson@ecova.com

Dodd Starbird
Managing Partner, Implementation Partners LLC
dodd@implementationpartners.com
SLIDE 22