
Service Data Quality: When 99.7% Isn't Good Enough



  1. ASQ Lean and Six Sigma Conference, Session M12. Service Data Quality: When 99.7% Isn't Good Enough. February 29, 2016. Jennifer Wilson, Dodd Starbird

  2. Learning Objectives
     1. Discuss the value of using quality data for process and performance control, and understand the critical difference between defects per opportunity (DPO) and defects per unit (the defective rate).
     2. Discuss the change management challenges that a team overcame in creating visibility and team buy-in to improve performance, and learn how to create a positive culture of human error prevention.
     3. Integrate key concepts of Lean (visible metrics) with key concepts of Six Sigma (sampling and process capability), all with the common foundation of Deming Quality to tie the solution together!

  3. Differentiation: Data-driven Insights. [Slide graphic: data-driven insights spanning Energy, Carbon, Water, Waste, Telecom, and Lease data, from demand to impact (Demand, Consumption, Expense, Capital, Impact), across 1,000s of utilities, 100,000s of facilities, and 1,000,000s of households.]

  4. Why We Needed a Data Quality Project. Data Entry QA data from December 2014:
     • 99.7% line-item accuracy: only 3 defects per 1,000 opportunities (3,000 DPMO)
     • The Data Entry team was publishing only this number
     • But only 88.3% bill-level accuracy: 11.7% of bills were defective, with 1 or more defects (see the sketch below)
     • Data Entry leaders had stopped using this number "because it made people feel bad"
     • Estimated impact of downstream scrubbing and fixing of data: 200 FTE!
     Sources: Dec 2014 QA files and data user survey, Mar 2015
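
The gap between the two numbers is exactly the DPO vs. defective-rate distinction from Learning Objective 1: accuracy per opportunity compounds across every line item on a bill. A minimal sketch of that arithmetic, assuming roughly 40 line items per bill (an illustrative figure not stated in the slides, chosen because it approximately reproduces the reported 88.3%):

```python
# Sketch of the DPO vs. defective-rate gap. The ~40 line items per bill
# is an illustrative assumption, not a figure from the presentation.

line_item_accuracy = 0.997   # 3 defects per 1,000 opportunities (3,000 DPMO)
items_per_bill = 40          # assumed average opportunities per bill

# A bill is defect-free only if every one of its line items is correct.
bill_level_accuracy = line_item_accuracy ** items_per_bill
defective_bill_rate = 1 - bill_level_accuracy

print(f"Line-item accuracy:  {line_item_accuracy:.1%}")   # 99.7%
print(f"Bill-level accuracy: {bill_level_accuracy:.1%}")  # ~88.7%
print(f"Defective bills:     {defective_bill_rate:.1%}")  # ~11.3%
```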

  5. Survey Question #1: How does your organization measure quality? Pick the closest answer…
     A. Mostly from the customer perspective (output quality, final yield, complaints, etc.)
     B. Mostly from an internal perspective (defects per hundred or million, individual quality, etc.)
     C. Both an internal and an external perspective
     D. We don't measure quality all that much!

  6. Data Collection Plan
     • Designed two data collection spreadsheets (account & bill)
       - Designed and ran a proof of concept
       - Tested repeatability and reproducibility with 4 collectors
       - Re-designed and re-piloted
     • Collected population stratification factors for all accounts (all channels of data entry were in scope)
     • Sampling: 1,500 accounts, ~650 sites, with 3 months of bills each; estimation of defect rates to a precision of +/- 3% (see the sketch below)
     • Purpose: validate the defect rate; correlate causal factors
     • Timing: deployed the week of March 9 (starting with the pilot); gathered data with 8 data collectors over a month; results analysis April 13–15, 2015
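
The ±3% precision target corresponds to the standard sample-size formula for estimating a proportion, n = z²·p(1−p)/E². A minimal sketch, assuming a 95% confidence level (z = 1.96), which the slide does not state:

```python
import math

def min_sample_size(p_est: float, margin: float, z: float = 1.96) -> int:
    """Minimum n for estimating a proportion to +/- `margin`
    at ~95% confidence (normal approximation)."""
    return math.ceil(z**2 * p_est * (1 - p_est) / margin**2)

# Worst-case planning value p = 0.5 maximizes the required n:
print(min_sample_size(0.5, 0.03))    # 1068 -- consistent with sampling ~1,500 accounts
# Using the ~11.7% defective-bill rate from the December QA data:
print(min_sample_size(0.117, 0.03))  # 441
```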

  7. Measured Data Quality: Stratified Defects. [Chart: defect types.] Bill data entry quality overall: 89.6%. Bill quality from EDI (automated) data entry: 93.3%. Bill quality on Post-Audit resolved bills: 43%.

  8. Data Entry Process Improvements
     • Turned off an automated edit resolution (computer-based data scrubbing) program that was allowing known defects to be deferred instead of fixed (52% of auto-resolved edits were defects!)
     • Launched a project to use Optical Character Recognition (OCR) technology to expand on the benefits of Electronic Data Interchange (EDI)
     • Trended causes of defects using Quality Control data and delegated assignments to teams to discuss causes and solutions in daily huddles (see the tally sketch below)
     • Example key cause and solution:
       - Open inactive accounts cause reporting errors downstream
       - Closing inactive accounts was perceived as a lower-priority task and was backlogged 30 days on some teams
       - Catching up on that backlog has avoided $75,000 in rework labor costs annually
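
One way to picture the "trended causes of defects" step is a Pareto-style tally of QC defect records, ranking causes for huddle discussion. The cause labels and counts below are invented for illustration; they are not from the presentation:

```python
# Pareto-style tally of defect causes from QC records (illustrative data).
from collections import Counter

qc_defect_log = [
    "inactive_account_open", "wrong_meter_id", "inactive_account_open",
    "missed_line_item", "inactive_account_open", "wrong_meter_id",
    "transposed_digits",
]

total = len(qc_defect_log)
cumulative = 0
for cause, count in Counter(qc_defect_log).most_common():
    cumulative += count
    print(f"{cause:22s} {count:3d}  {cumulative / total:6.1%} cumulative")
```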

  9. Data Entry Performance Improvements
     • Started to communicate "bill-level" quality scores
     • Quality Control caught up to real time to produce weekly team quality scores, versus sharing data just monthly
     • Added new quality metrics to team whiteboards to balance efficiency and quality focus
     • Shared "quality stories" in daily huddles
     • Quantified the cost of quality in the same terms as production: "We probably made 3,000 defects yesterday!"

  10. Data Entry QC Scores. [Chart: monthly Data Entry quality scores; y-axis 85.0%–99.0%.]

  11. Data Entry Quality: Before and After. Process and individual QC scores improved: bill quality rose from 89.5% in May to 93.8% in August, reflecting the shift in performance and training.

  12. Nik Wallenda. Who is Nik Wallenda?

  13. Nik Wallenda Nik is accountable for his performance.

  14. Quality Sampling Principles
     • All work should have some chance of being checked (to drive individual accountability)
     • But individuals should know the work is likely to go right out the door (no safety net!)
     • Sample sizes for checking should be statistically calculated to deliver an appropriate precision of the resulting quality measurements by (at least) work type, client, team, and individual

  15. Approach. Calculate the minimum sample size per month (usually…) in order to get:
     • Appropriate precision by individual, for purposes of an annual review
     • Appropriate precision by client, for quarterly reviews and/or required service-level reporting
     • Appropriate precision by task type every month at a team level, for:
       - Trending of performance
       - Ongoing monitoring
       - Root cause analysis of defects
     (A sketch of turning these minimums into selection rates follows.)
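
A minimal sketch of how a monthly minimum n per stratum could become a QC selection rate. The task types, volumes, and the ±5% task-level margin are assumptions for illustration, not the presenters' actual figures:

```python
import math

def min_n(p: float, margin: float, z: float = 1.96) -> int:
    # Same proportion-precision formula as above; 95% confidence assumed.
    return math.ceil(z**2 * p * (1 - p) / margin**2)

# Assumed monthly volumes per task type (illustrative only).
monthly_volume = {"invoice_entry": 20_000, "account_setup": 2_500}

for task, volume in monthly_volume.items():
    n = min_n(p=0.10, margin=0.05)     # e.g. +/-5% precision at the task-type level
    rate = min(1.0, n / volume)        # fraction of completed tasks to check
    print(f"{task}: QC {n} of {volume} tasks -> {rate:.1%} selection rate")
```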

  16. Finally! Set a random generator to apply the selection percentage by task type; apply that test immediately to every completed task (systematic sampling). Conduct regular checks to make sure we have selected enough samples for appropriate client and individual reporting, and over-sample (randomly) in stratified groups as needed. Use the data for root cause analysis and performance management!
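
One way the "apply the selection percentage immediately to every completed task" step could look in code. The rates, task types, and function names here are hypothetical, sketched under the assumption that rates were derived as in the previous example:

```python
import random
from collections import Counter

# Hypothetical per-task-type selection rates (e.g. derived as sketched above).
selection_rate = {"invoice_entry": 0.007, "account_setup": 0.056}
sampled = Counter()

def select_for_qc(task_type: str) -> bool:
    """Decide at completion time whether a finished task goes to QC."""
    chosen = random.random() < selection_rate[task_type]
    if chosen:
        sampled[task_type] += 1
    return chosen

def needs_oversample(stratum: str, target_n: int) -> bool:
    """Periodic check: randomly over-sample a client/individual stratum
    that is running short of its monthly minimum."""
    return sampled[stratum] < target_n
```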

  17. Deming’s Point

  18. Survey Question #2: Where is your organization's most significant opportunity to improve your quality measures?
     A. Balancing internal and external quality measures
     B. Gathering process (root cause) data to improve process quality at the source
     C. Collecting the right sample size to get the most effective data as efficiently as possible
     D. All of the above!
     E. None of the above

  19. Discussion of Learning Objectives
     1. Discuss the value of using quality data for process and performance control, and understand the critical difference between defects per opportunity (DPO) and defects per unit (the defective rate).
     2. Discuss the change management challenges that a team overcame in creating visibility and team buy-in to improve performance, and learn how to create a positive culture of human error prevention.
     3. Integrate key concepts of Lean (visible metrics) with key concepts of Six Sigma (sampling and process capability), all with the common foundation of Deming Quality to tie the solution together!

  20. Presenters
     • Jennifer Wilson, Director, Business Process Improvement, Ecova Inc. (jgarciawilson@ecova.com)
     • Dodd Starbird, Managing Partner, Implementation Partners LLC (dodd@implementationpartners.com)
