


  1. BIO PRESENTATION PAPER W3 November 6, 2002 1:00 PM Test Automation: Reducing Time to Market Jim Dougherty and Keith Haber, LexisNexis. International Conference On Software Testing Analysis & Review, November 4-8, 2002, Anaheim, CA USA

  2. Jim Dougherty Jim Dougherty is a Consulting Software Test Analyst in the Global Electronic Product Development division at LexisNexis in Miamisburg, Ohio. He has been with LexisNexis in a test capacity for the past six years. Prior to joining LexisNexis, his testing background was as a member of the Air Force Communications Command’s Operational Test & Evaluation Center at Wright-Patterson AFB, OH. He was a Team Engineer on a Wideband Microwave evaluation team and later served as Team Chief of a Base Telecommunications evaluation team evaluating airbase telecommunications systems worldwide. During his time at LexisNexis, Jim has been the test lead for many critical projects and has led the Automation Team for the past two years. Jim is an American Society for Quality Certified Software Quality Engineer. Keith Haber Keith Haber is a Software Test Analyst in the Global Electronic Product Development division at LexisNexis in Miamisburg, Ohio. He has been a member of the GEPD Product Certification group since 1998 and has worked exclusively in the field of test automation for over three years. Keith is a junior at Wright State University pursuing a Bachelor of Science degree in Computer Science.

  3. Test Automation: Reducing Time To Market Jim Dougherty/Keith Haber

  4. Motivation for an Automation Process • Very large testing effort – 45 test analysts – 10,000 manual test cases (Summer 2000) • Limited technical experience among test analysts – Simple capture-replay scripts only • Results of Early Automation: – Expensive to maintain – Too limited in scope to provide real benefit

  5. Development of an Automation Process • Creation of the process – Role assignments • Automation Team • Software Test Analysts • Flow of the work • Mechanics – Requests for Automation/Priorities/Task Sizing • Automation request form • Queue priority and sizing tools

  6. Graphic Illustration – Flow Chart

  7. Creating the Automation Architecture • Library of functions • Change Control • Coding Standards • Documentation and Reporting • Templates

  8. Function Library • Stand-Alone Functions – SetFieldText () – ClickLink () – SelectListEntry () – CloseBrowserMessages () – EnableJavaScript () – VerifyTextPresent ()
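The library itself was written in the team's commercial test tool and is not reproduced in the slides. As a rough illustration of the stand-alone function idea, here is a minimal Python sketch that assumes Selenium WebDriver as the browser driver (Selenium, the locator strategies, and the snake_case names are stand-ins, not the authors' implementation): each helper wraps raw browser calls so application scripts never touch the driver directly.

```python
# Minimal sketch of a stand-alone function library, assuming Selenium WebDriver
# (the authors' actual tool and function signatures are not named in the slides).
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import Select


def set_field_text(driver, field_name, text):
    """SetFieldText(): clear a named input field and type new text into it."""
    field = driver.find_element(By.NAME, field_name)
    field.clear()
    field.send_keys(text)


def click_link(driver, link_text):
    """ClickLink(): click the link whose visible text matches link_text."""
    driver.find_element(By.LINK_TEXT, link_text).click()


def select_list_entry(driver, list_name, entry):
    """SelectListEntry(): choose an entry from a named drop-down list."""
    Select(driver.find_element(By.NAME, list_name)).select_by_visible_text(entry)


def verify_text_present(driver, text):
    """VerifyTextPresent(): return True if the page body contains the text."""
    return text in driver.find_element(By.TAG_NAME, "body").text
```

Centralizing these wrappers means a product UI change is absorbed by one library function rather than by every script that touches the affected control.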

  9. Change Control • Using Microsoft Visual SourceSafe – Common directory used for the function library – Application-specific scripts stored by application name and generation – Write privileges restricted to the automation team – STAs use Get Latest Version as directed in the turnover documentation

  10. Coding Standards • Standards in accordance with accepted software development practices • Standards document created – partial content: – Directory structure – Functions – Methods – Application States – Commenting – Turnover documentation

  11. Documentation • Automation Tool Users Guide • Project Results DB – Current Assignments – Project Status – Measure productivity • Data Repository Document • Implementing Data Driven Automation • Results Analysis • Third Party Code Modification - Policy

  12. Templates and Tools • Quality Control Inspection template • Turnover Document template • Automation Sizing Algorithm Document • Automation Queue Priority Algorithm Document

  13. Improving on the Automation Tool • Creation of helper scripts – MakePlanFile.inc – MakeWinDef.inc – MakeMethods.inc – TestcaseCounter.inc – MakeHelpFile.inc – TextSearch.inc
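These helper scripts are include files for the team's test tool and their contents are not shown in the slides. Purely as an illustration of what a helper such as TestcaseCounter.inc might do, the hypothetical Python sketch below scans a directory of automation scripts and counts test case declarations; the "testcase" keyword and the ".t" file extension are assumptions.

```python
# Hypothetical sketch of a TestcaseCounter-style helper: count test case
# declarations across a directory of automation scripts. The "testcase"
# keyword and the ".t" extension are illustrative assumptions.
import re
from pathlib import Path

TESTCASE_PATTERN = re.compile(r"^\s*testcase\s+\w+", re.IGNORECASE)


def count_testcases(script_dir: str, extension: str = ".t") -> int:
    """Return the number of test case declarations found under script_dir."""
    total = 0
    for script in Path(script_dir).rglob(f"*{extension}"):
        for line in script.read_text(errors="ignore").splitlines():
            if TESTCASE_PATTERN.match(line):
                total += 1
    return total


if __name__ == "__main__":
    print(count_testcases("scripts"))
```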

  14. Results Analysis • Created the Script Result Reporter – Scripts post their results to an Access DB – Ability to view results as an Excel or Word document – Currently creating an interface to the test analysts' metrics database
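The Script Result Reporter itself is not reproduced in the slides. The following is a minimal Python sketch of the pattern it describes, with SQLite standing in for the Access database and a CSV export standing in for the Excel/Word views; the table layout and function names are illustrative assumptions.

```python
# Minimal sketch of a Script Result Reporter: automated scripts post pass/fail
# results to a database, and results can later be exported for review. SQLite
# and CSV stand in for the Access DB and Excel/Word views named in the talk.
import csv
import sqlite3
from datetime import datetime


def open_results_db(path="results.db"):
    conn = sqlite3.connect(path)
    conn.execute("""CREATE TABLE IF NOT EXISTS results (
                        script TEXT, testcase TEXT, status TEXT, run_at TEXT)""")
    return conn


def post_result(conn, script, testcase, status):
    """Called by an automated script after each test case finishes."""
    conn.execute("INSERT INTO results VALUES (?, ?, ?, ?)",
                 (script, testcase, status, datetime.now().isoformat()))
    conn.commit()


def export_results(conn, out_path="results.csv"):
    """Dump all posted results to a CSV file for reporting."""
    rows = conn.execute("SELECT script, testcase, status, run_at FROM results")
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["script", "testcase", "status", "run_at"])
        writer.writerows(rows)
```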

  15. Implementing Data Driven Automation • Cost analysis indicated approximately a fifty percent per-test-case savings by going to Data Driven automation • Analysis further indicated that seventy-five to eighty percent of the test plans submitted for automation were amenable to the Data Driven approach

  16. Data Driven Automation – Value Added • Reduction in the time required to create test cases • STAs control script run time by selecting the number of spreadsheet lines to execute • Data Driver Engine and Composite Methods are less vulnerable to product changes • Significant cost reduction over the existing method used to automate

  17. Data Driven Automation - Mechanics • DataDriver.inc created (Data Driver engine) • A new test plan template used to create data driven test plans • The data driven scripts are smaller – fewer lines of code • Maintenance costs have been reduced • Automation costs have been reduced
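DataDriver.inc is written in the team's test tool and is not reproduced here. As a rough sketch of the data-driven pattern it implements, the Python below reads rows from a test plan (a CSV file stands in for the spreadsheet) and dispatches each row's action to a composite method, so adding a test case means adding a row rather than code; the column names and actions are illustrative assumptions.

```python
# Rough sketch of a data-driver engine: each spreadsheet row names an action
# and its inputs, and the engine dispatches it to a composite method. A CSV
# file stands in for the spreadsheet; column and action names are assumptions.
import csv


def do_search(row):
    print(f"searching for {row['value']} in {row['field']}")


def do_verify(row):
    print(f"verifying that '{row['value']}' is present")


ACTIONS = {"search": do_search, "verify": do_verify}


def run_data_driven(plan_path, max_rows=None):
    """Execute up to max_rows rows of a data-driven test plan."""
    with open(plan_path, newline="") as f:
        for i, row in enumerate(csv.DictReader(f)):
            if max_rows is not None and i >= max_rows:
                break  # STAs control run time by limiting the rows executed
            ACTIONS[row["action"]](row)
```

Because the engine and the composite methods are the only code, a product change touches them once, while the spreadsheets of test data remain unchanged.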

  18. Data Driven Template • Sample Data Driven Template

  19. Data Driven Summary • Scope of testing can be increased with no corresponding enlargement of the automated script • Spreadsheets created for a specific product or area of a product offer possibilities for re-use • Approximate reduction in the cost of automation after full implementation of the Data Driven process is forty percent • Approximate reduction in the cost of maintenance of the automated product is fifty percent • Cost of automation and the cost of testing are reduced • Scope of testing is increased while time to market is reduced • Quality is improved

  20. Benefits & Measurement of the Process • XML Gateway • lexis.com  • LexisNexis™ Academic & Library Services

  21. Functional Test Costs - ALS [Chart: Functional Test Cost for ALS Projects, May 21, 2000 through May 23, 2002. Plots total cost, automated & manual vs. manual only ($0 to $90,000), and the number of test cases (0 to 350) for each ALS release in the period.]

  22. Percent Payback – Automating ALS [Chart: Percent Pay Back for Automating ALS Functional Tests, May 21, 2000 through May 23, 2002. Plots total cost automated & manual, total cost manual, and percent payback (0% to 600%) for each ALS release in the period.]

  23. XML GW Test Cycle Duration [Chart: Time to Market in days for XML Gateway test cycles, February through December 2001, across releases R1 (3/8/01), R2 (8/22/01), and R3 (11/15/01). Data labels give percent automated, number of test cases, and days/resources for each cycle, from 0% automated (66 test cases, 15 days, 1 resource) down to 95% automated (22 test cases, 2 days, 1 resource), with a dashed no-automation baseline. Overall: 60% reduction.]
