BIO, PRESENTATION, PAPER
T15, 5/17/2007 3:00:00 PM
"FROM START UP TO WORLD CLASS TESTING"
Iris Trout, Bloomberg, LP
International Conference On Software Test Analysis And Review
May 14-18, 2007, Orlando, FL USA


SLIDE 1

T15, 5/17/2007 3:00:00 PM

"FROM START UP TO WORLD CLASS TESTING"

Iris Trout, Bloomberg, LP

SLIDE 2

Iris C. Trout

Iris Trout is currently the Director of Quality at Bloomberg, LP, where she is responsible for the quality assurance of trading applications globally. She has created a world-class organization by implementing basic, solid QA strategies to gain efficiency and improve productivity. Iris has been managing financial Quality Assurance teams and process improvement efforts for many years. Previously she was a Vice President at JP Morgan, overseeing the CMM and Six Sigma rollout within their retail banking division. She has also created and managed several Quality Assurance testing efforts, including teams at McKinsey & Company, MetLife, and Merrill Lynch. Iris was formerly a CMM lead assessor and holds a black belt in Six Sigma. She feels that concentrating on the implementation of QA best practices and practical solutions can yield strong team results.

SLIDE 3

May 13, 2007 Iris Trout, Bloomberg LP 1

How to Build a World-Class Quality Assurance Team

Presented by: Iris Trout of Bloomberg LP
May 13, 2007

SLIDE 4

How to Build a World-Class QA Organization

Topics to be covered during the course of this presentation:

  • Key Components to a Strong QA Team
  • Evaluating Your Staffing Model
  • Hiring a Solid Tester
  • Obtain a Commitment to QA from Your Stakeholders
  • Implement a Communication Plan for QA
  • Establish and Promote QA’s Goal and Mission Statement
  • Service Level Agreements and QA Processes
  • Review Your Processes Early and Often
  • How do Peer Reviews Relate to/Benefit QA?
  • Metrics and Reporting
  • What is Your Strategy?
  • How Will You Implement Change?
  • A Phased Approach
  • Tool Sets for Testing / Keys to a Successful Test Automation Effort
  • Wrap-up
SLIDE 5

How to Build a World-Class QA Organization

Presentation Handouts:
1. QA staffing interview questionnaire
2. Stakeholder analysis template
3. Stakeholder communication plan
4. Quality Masters training presentation
5. Metrics scorecard sample with sample metrics
6. QA organizational goal template

SLIDE 6

Do Any of These Scenarios Sound Familiar?

  • Have you been asked to create a QA group, but you don't know where to even begin?
  • Have you been given unmanageable deadlines?
  • Does your QA staff consist of an untrained hodgepodge of employees who are treated as though they are untalented or insignificant?
  • Are your testing tools, if available, inadequate or rarely used?
  • Do developers have very little time for your QA staff? Does your QA staff rarely receive adequate requirements, and do business owners frequently change requirements without advising your QA staff?
  • Does management view your QA staff as a "must have" without fully understanding why?
  • Are you being asked to automate your tests, but you can't understand how to use the tool or how to train the testers on proper automation techniques?

SLIDE 7

Key Components to a Strong QA Team

  • Hire or internally identify talented QA test resources.
  • Gain a commitment from your stakeholders to QA. Maintain constant and open lines of communication between IT, Business, and QA.
  • Establish a Quality Center of Excellence (QCOE) to operate as the hub for development and implementation of QA process improvement throughout the SDLC and consistently across the QA organization.
  • Incorporate strong processes for bug identification, tracking, resolution, and metrics reporting.
  • Employ a strategy with actionable goals to accomplish process improvements. Communicate your goals, and their results, to your stakeholders!
  • Continuously invest in your staff by giving them the right tools to do the job (i.e., training).

SLIDE 8

Evaluate Your Staffing Model

  • Identify:
    • Continually evaluate your QA staff by identifying those with the right skill sets, high performers, and agents of change.
  • Hire:
    • Hire new staff members with strong QA methodology to complement other product-savvy team members.
    • Consider starting with qualified QA consultants until your new team is fully trained. This creates a good mix of product knowledge and QA knowledge.
  • Train:
    • Evaluate vendors that can help increase your staff's knowledge of QA best practices as they relate to QA process and methodology.
    • Encourage your staff to meet with Business and IT to increase the depth of their product knowledge.
    • Promote cross-training.

SLIDE 9

Hiring a Solid Tester

  • Make sure that you ask the right questions (refer to the QA Interview checklist).
  • Where possible, involve your stakeholders in the interview process.
  • Ask for real-life examples of the applicant's work.
  • Show potential candidates around the area where they will be working, and have them meet with other testers.
  • Consider, but don't rely solely upon, industry certifications.
SLIDE 10

Obtain a Commitment to QA from Your Stakeholders

  • Who are QA's stakeholders?
  • Why does a tester need to be familiar with their stakeholders?
  • How do I assess the level of commitment I have from my stakeholders?
    • Refer to the Stakeholder Analysis form.
  • What is the tester's and/or test manager's role in communicating with the stakeholders, and how often?
    • Refer to the Stakeholder Communication Plan.
SLIDE 11

Implement a Communication Plan for QA

Keep talking…

  • Testers should meet with stakeholders.
  • QA management should implement a stakeholder communication plan.
  • Test managers should meet with Business and IT leaders to discuss goals, plans, and QA progress.
  • Management should meet with their teams on a regular basis and according to a published schedule:
    • Quarterly department meetings
    • Weekly manager meetings
    • Bi-weekly one-on-one meetings with each team leader
    • Quarterly skip-level meetings
SLIDE 12

Establish and Promote QA's Goal and Mission Statement

GOAL statement:

  • Your goal is to establish a highly workable and executable set of Service Level Agreements (SLAs) between business partners and the test team that will serve to help improve the workflow while increasing test coverage and test quality with an unbiased, independent view.

MISSION statement:

  • To increase the levels of efficiency, reliability, test coverage, quality, and communication while supporting the testing of the current releases.

SLIDE 13

Service Level Agreements and QA Processes

A Service Level Agreement (SLA) is a contract for the delivery of a specified level of service.

  • Create an SLA with your stakeholders to define the quality delivery of a service:
    • The SLA should state how, what, and when you will deliver.
    • The SLA should address what's in-scope and out-of-scope, timing issues, etc.
    • SLAs should contain one or more clearly written and measurable service level objectives.
  • Determine how the level of customer satisfaction will be measured prior to setting your SLAs.
  • Be sure that all of your processes are sustainable, repeatable, measurable, well-documented, and easily accessible for inspection.

SLIDE 14

Service Level Agreements and QA Processes (continued)

  • Your guidelines should be well understood and explicitly communicated.
  • Your SLAs should satisfy the needs of the business lines that are being supported:
    • Identify key issues of priority, investment, resources, and risk.
  • You may have to adjust your processes to manage the SLAs.
  • Understand what components support the business.
  • Be able to react to service level issues; this will be expected.
  • Track, measure, and report to ensure you are meeting your agreements.
  • Revisit and refine your SLAs as the needs of the business evolve.


SLIDE 15

Items to consider including in your SLA:

  • Projects requiring more than 5 days of testing need to be accompanied by functional specifications or requirements, or QA will not accept the project.
  • QA will produce test plans for projects that require 5 days or more of testing.
  • QA will allocate its test resources according to corporate priorities.
  • QA will implement best practices at each stage of the SDLC and provide a list of deliverables that will accompany each stage.
  • Test plans and test matrices will be submitted to IT and Business for inspection and base-lining.
  • In the event that QA is unable to obtain the time needed to fully perform a QA review, QA will test the product utilizing a risk-based testing process (critical, major, minor). This process will guarantee a certain level of test coverage based upon the test case criticality. Items to be tested will be contained in the test plan and agreed to by all stakeholders.
  • QA will implement a formal peer review process for its test script inventory, which will require a peer review before scripts can be introduced into QA's production test bed.
  • QA will publish a list of tested versus untested code daily, to be used as decision criteria for daily build/move cycle discussions.
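The risk-based process named in the SLA above (critical, major, minor) amounts to priority-ordered selection under a time budget. The sketch below is illustrative only, not part of the presentation: the case names, hour estimates, and the simple greedy strategy are all assumptions.

```python
# Hypothetical sketch of risk-based test selection: when the full test
# window cannot be obtained, spend the agreed budget on critical cases
# first, then major, then minor, skipping any case that no longer fits.

CRITICALITY_ORDER = {"critical": 0, "major": 1, "minor": 2}

def select_tests(test_cases, budget_hours):
    """test_cases: list of (name, criticality, estimated_hours) tuples.
    Returns the names of the cases chosen to run within the budget."""
    selected = []
    remaining = budget_hours
    # Stable sort keeps the original order within each criticality level.
    for name, crit, hours in sorted(test_cases,
                                    key=lambda t: CRITICALITY_ORDER[t[1]]):
        if hours <= remaining:
            selected.append(name)
            remaining -= hours
    return selected

cases = [
    ("login smoke", "critical", 2),
    ("trade entry", "critical", 4),
    ("report layout", "minor", 1),
    ("price feed failover", "major", 3),
]
print(select_tests(cases, budget_hours=8))
# -> ['login smoke', 'trade entry', 'report layout']
```

Note the greedy fallback: with 2 hours left after the critical cases, the 3-hour major case is skipped but the 1-hour minor case still fits, which matches the SLA's promise of coverage "based upon the test case criticality" rather than strict ordering.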

Service Level Agreements and QA Processes

SLIDE 16

Review Your Processes Early and Often

  • Review existing QA processes across the organization:
    • Common processes should be identified, assessed, and implemented if in line with QA best practices.
  • Identify what you test and its criticality vis-à-vis the needs of the business that you are supporting:
    • Establish a test asset inventory exercise with built-in sunsetting criteria.
  • Produce and market process and tool-related documentation, with buy-in from stakeholders:
    • Build a standalone repository with version control.
  • Review and evaluate QA techniques, and assure that recommended techniques are consistently applied throughout the department:
    • Deploy the QCOE to manage this process.
SLIDE 17

CMM Definition of Peer Review

The purpose of Peer Reviews is to remove defects from the software work products early and efficiently. An important corollary effect is to develop a better understanding of the software work products and of defects that might be prevented.

What is a Peer Review?

SLIDE 18

How do Peer Reviews relate to QA?

Reviewing test cases between tester and peer tester positions the tester to find flaws in test logic and/or missing test cases, and will reduce the number of bugs that occur in production.

Implement a scorecard that encourages a standardized, metrics-driven approach to assessing your test inventory. A simple scorecard can help you to look at scripts for:

Functional Elements:
  • Effectiveness of test cases in achieving stated business objectives

Quality Elements:
  • Repeatability factor
  • Ease of use
  • Entrance/exit criteria
  • Quality of support documentation

Automation Elements (if applicable):
  • Utilization of tool efficiencies
  • Reusability
  • Maintenance plans for scripts
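A scorecard like the one described can be as simple as averaged ratings against the functional, quality, and automation elements. This sketch is hypothetical: the 1-5 scale, the 3.5 pass threshold, and the criterion names are illustrative choices, not taken from the presentation.

```python
# Hypothetical peer review scorecard: each reviewed element gets a 1-5
# rating; the script passes review if the average meets a threshold.

CRITERIA = {
    "functional": ["meets business objectives"],
    "quality": ["repeatability", "ease of use",
                "entrance/exit criteria", "support documentation"],
    "automation": ["tool efficiencies", "reusability", "maintenance plan"],
}

def score_script(ratings, pass_threshold=3.5):
    """ratings: dict mapping criterion name -> rating (1-5).
    Returns the average rating and a 'pass'/'fail' verdict."""
    avg = sum(ratings.values()) / len(ratings)
    return avg, "pass" if avg >= pass_threshold else "fail"

ratings = {"meets business objectives": 4, "repeatability": 5,
           "ease of use": 3, "support documentation": 2}
avg, verdict = score_script(ratings)
print(f"{avg:.2f} -> {verdict}")  # 3.50 -> pass
```

Using the same criteria for every script is what makes the review comparable across the inventory, which is the point of the "like criteria for pass/fail" advice later in the paper.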

SLIDE 19

How do Peer Reviews Benefit QA?

  • Test scripts are the lifeblood of any QA organization, and are often the most easily overlooked when it comes to reinvestment.
  • A controlled peer review process is auditable, both by management and by QA's stakeholders.
  • Peer reviews ensure that your organization is moving in the same direction in terms of process improvement.
  • They can help you to establish an automation pipeline and/or a standard approach to automation and the implementation of automation efficiencies.
  • Don't shoot for the stars out of the gate! Set realistic acceptability targets as a first pass for the peer review process against your inventory:
    • 70% confidence rule in phase 1; 85% in phase 2, etc.
SLIDE 20

Metrics and Reporting

  • It's important to track your success, but don't try to overdo it.
  • Metrics are usually collectible in very mature organizations. You may not be at a stage where you're ready to collect sophisticated measures (and that's OK!).
  • Start small: count defects, measure time worked, and count the number of products tested.
  • Be sure to define your data sources.
  • Be willing to collect the data.
  • Take the time to analyze trends:
    • Report back to stakeholders to show progress and/or potential pain points in the SDLC.
  • Be willing to take ACTION!

Remember: what gets measured gets managed!

SLIDE 21

Metrics and Reporting

Here are a few good metrics to start with:

  • Work Unit Performance:
    • Number of requirements tested
    • Number of problem reports opened
    • Number of test cases attempted
    • Number of test cases that passed
    • Number of test cases that failed
    • Volume of testing completed via automated versus manual test cases
    • Number of components integrated
    • Number of functions integrated
SLIDE 22

Metrics and Reporting (...and some more):

  • Functional correctness:
    • Defects (aging status)
    • Number of defects opened per period
    • Number of defects closed per period
    • Number of defects withdrawn due to user error
    • Number of defects found (by stage of lifecycle)
    • Number of defects open (as of a specific date)
    • Number of defects closed (as of a specific date)
    • Unit the defect was found in
    • Size of the unit with the defect
    • Defect density (number found per lines of code)
    • Maintainability
    • Time to restore
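Two of the starter metrics above, test case pass rate and defect density, reduce to one-line calculations. A minimal sketch, with made-up figures:

```python
# Illustrative starter-metric calculations; all numbers are hypothetical.

def pass_rate(passed, failed):
    """Share of attempted test cases that passed."""
    attempted = passed + failed
    return passed / attempted if attempted else 0.0

def defect_density(defects_found, lines_of_code):
    """Defects per thousand lines of code (KLOC)."""
    return defects_found / (lines_of_code / 1000)

print(f"pass rate: {pass_rate(180, 20):.0%}")                        # 90%
print(f"defect density: {defect_density(42, 28000):.2f} per KLOC")   # 1.50
```

Even counts this simple can be trended per release or per application, which is the "report back to stakeholders" step on the previous slide.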
SLIDE 23

So... What is Your Strategy?

Tools:
  • Consider establishing an automation effort. Start small; build regression suites over time:
    • Document and version-control your test scripts.
    • Use a templated test plan.

Training:
  • Teach the basics of QA methodology. Build on it later.

Productivity:
  • Always measure your results (e.g., time saved through automation).
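One way to "measure your results" for automation is a break-even calculation: compare cumulative manual effort against the one-time automation setup cost plus the per-run cost. The hours in this sketch are hypothetical.

```python
# Hypothetical break-even check for a test automation effort: find the
# regression cycle at which total automated cost (setup + runs) drops
# below total manual cost.

def break_even_cycle(setup_hours, manual_hours_per_run,
                     automated_hours_per_run, max_cycles=100):
    """Return the first run count where automation is cheaper overall,
    or None if it never pays off within max_cycles."""
    for n in range(1, max_cycles + 1):
        manual_total = n * manual_hours_per_run
        automated_total = setup_hours + n * automated_hours_per_run
        if automated_total < manual_total:
            return n
    return None

# 120 h to build the suite, 40 h per manual cycle, 4 h per automated cycle:
print(break_even_cycle(120, 40, 4))  # 4
```

This kind of figure also supports the later point in the paper that automation "does not have an immediate return on investment": before the break-even cycle, the automated route costs more.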

SLIDE 24

How Will You Implement Change? Formal Approach

Adopt full life-cycle methodology with a strong emphasis on quality. Get all members of the team to participate in this roll out and require processes to be consistent and standardized across the organization.

Advantages:

  • Achieves best practices throughout the organization via a single effort

Disadvantages:

  • Disruptive
  • May require significant change for some areas
  • Hard to get buy-in
SLIDE 25

How Will You Implement Change? Moderate Approach

Review each area that requires improvement, and make subtle changes along the way. Each area gets to pick what they want to improve and how.

Advantages:

  • Little resistance; each area buys in

Disadvantages:

  • Disruptive
  • May require significant change for some areas
  • Hard to get buy-in
  • Doesn't achieve consistency
SLIDE 26

How Will You Implement Change? Phased Approach

  • Allows you to make incremental improvements over time.
  • Change is manageable as opposed to overwhelming.
  • Tasks are sometimes very complex and will require time for execution.
  • Requires support from all stakeholders.
  • Need to communicate changes to gain alignment.
  • Allows you to set realistic goals that are achievable.

Advantages:

  • Easiest to implement
  • Least disruptive, and provides immediate benefits

Disadvantages:

  • Takes time to reach your goals

SLIDE 27

A Phased Approach

[Flattened diagram: progression from Current State to End State through Script Validation, QA Process Improvement, Test Automation (Quality and Quantity), Improved Metrics and Reporting, Tools (Test Coverage), Training, Communication and Management, Productivity, and Estimating/Sizing]

SLIDE 28

Tool Sets for Testing

Test Tool Evaluation:

  • Automation is not always the quick fix, and it requires a lot of technical talent and training. Not all testers can learn automation.
  • Automation does not have an immediate return on investment.
  • Building infrastructure, training, and creating regression tests all take time.

SLIDE 29

Tool Sets for Testing

When evaluating a test tool, look for these things:

  • Ease of use, ability to customize the tool
  • Platform support
  • Tool characteristics and integration
  • Test language
  • Test tool database
  • Server requirements
  • Test suite recovery logic
  • Test management
  • Reporting capability
  • Performance, stress, and load test capability
  • Vendor reliability and support
SLIDE 30

Keys to a Successful Test Automation Effort

  • Establish stakeholder buy-in
  • Successful training of (some) manual testers on automation tools and methodology
  • Release management and QA processes adjusted incrementally towards best practices
  • Implementation of a test inventory and management system
  • Highly motivated test team primed for process changes
  • Functional support from developers and testers
  • Stable development environment: functional and GUI changes are communicated to the test automation team
  • Executive-level and senior management long-term support in terms of QA budget and resources

SLIDE 31

Everyone Plays a Part in QUALITY

Remember: Teamwork = SUCCESS!
Stay aligned... and keep in alignment!
Focus on a cross-team partnership!
Keep the lines of communication open!
Communicate often and honestly!

Think Quality!!!

SLIDE 32

How to Design a World-Class Quality Assurance Team
By Iris Trout, Bloomberg LP, 5/07

SLIDE 33

So you finally did it. You convinced your boss that your company needs a software quality assurance department. Well, congratulations. Now what? Where do you begin? Do you have enough experience for the job? Are there people who are already performing this function without much guidance or charter? No need to panic. Building a software quality assurance department is easy when you take the right steps and get the right people and processes in place. Let's take a look at what you are going to need.

1. What will the department be called? A department title. Sounds easy. Don't forget to pick one that "speaks" to your stakeholders: a name that is easy to use and says what you do. Many people choose SQA (Software QA), AQA (Application Quality Assurance), or simply QA.

2. What people will make up the department? Assess the people you have, if any. If you are starting from scratch, that may be easier than if you inherit a team that has not been organized. My preference for newly formed departments is to start with some seasoned consultants who have performed QA in many organizations like your own. These people generally will guide and advise you, and they will become the underpinning of your organization, however temporary or permanent they become. If you have people on your team already, gather them together and tell them what is happening. Everyone needs to be involved and to understand what their role will be.

3. How many people will I need? This is a more difficult question to answer when you are just starting out. Perhaps you have been given a certain amount of headcount. If that is the case, you are in a good state: take the headcount, hire strong QA methodology people, and start from there. If you need to determine headcount, your work has just begun. Meet with the stakeholders for whom you will be performing Quality Assurance testing. Determine the volume of changes going through the pipeline, the level of releases to production currently being made, and the complexity of the applications. You are never going to get it all right the first time, so you may have to take a stab at how many people you will need based on the answers you get. Until the testers start testing, you can't really determine how "buggy" the code is unless defects have been tracked in unit testing.

SLIDE 34

4. Hiring. Now the fun begins. Who is qualified for this position? What questions should I ask of them? How will I know if their responses are accurate and suitable for my environment? The first thing is to put your job description together. You will need people who think quickly on their feet, can communicate well, can think outside of the box, and can get along with others. No small feat.

You should also gather a list of interview questions. Have them handy while interviewing your candidates. Try to ask the same questions of each candidate so you can compare. Ask a series of questions that will give you a good feel for whether they love software testing or are just there for a paycheck. Questions should be a good mix of technical and scenario-type questions, so that you can hear them articulate how they handled a certain difficult client and how they describe the need for bug tracking. In the end you must go with your gut. Hire the most pleasant, enthusiastic people who aren't afraid of work or of speaking up. Strong-willed, creative people make the best QA analysts.

5. Create a Mission and Goal statement. Put together the words that represent what you and your team are all about. What is the mission of the team? What do you expect to deliver? What are your goals? Be sure your team understands them. Post them in conspicuous places for all to read and see.

6. How do we begin? Have your testers meet their stakeholders. Schedule meetings with all involved. Set up a training schedule for the testers to learn about the foundations of QA (if necessary). Have them learn about the applications that they will support. Training and communication will be the key to your staff's success. While meeting with the stakeholders, be sure they understand your mission and goals. Get their buy-in.

7. Establish Service Level Agreements (SLAs). This will be critical to your success. These are agreements that you and your stakeholders agree to work by. Your level of service, given a particular situation, will be spelled out in these SLAs, and all will need to agree to them. This is your contract for delivery! Perhaps your organization has response-time SLAs that the testers must test against. Or you can create SLAs around how you will deliver by time of day or day of week. For example: all testing will be ready for UAT by Friday at 2, or the QA department will not accept code for testing after noon each day, etc. You can develop more as you go along. They are important because they are the rules by which you will operate.

SLIDE 35

8. What are your processes? Are there any? Do they work? Your answers will depend on whether you have inherited an established QA organization or are just starting out, but analyzing process never stops regardless. Take time to look at how you process your work. Is it repeatable? Is it consistent? Can you understand the inputs and the expected outputs? Start small, like creating a consistent test plan template. Or better yet, search the web and find one already created; "us" testers love to share our work. A test plan that everyone uses is a big step in the right direction. It will be shared with your stakeholders for their approval. Once they get used to seeing a document used over and over, it becomes commonplace, and they are more likely to read and approve it. A general rule that I like to follow is to keep looking at your process and improve over time, rather than making sweeping changes to everything. It is better received by the testers, and it gives you a chance to be sure that the change is really taking and that it is a change for the better. Wherever possible, measure the change from how you did it previously to how you are doing it now. You will be amazed at the time savings or the quality improvement you will gain when the change is the right one.

9. Common language. Yes, I mean we all need to speak a common QA language in our office place. Everyone should understand your definitions of unit testing, system testing, integration testing, criticality, script, suite, scenario, etc. Your definition might be slightly different than mine, but as long as everyone in your organization uses the same test language, it will work.

10. Criticality. Is everything urgent? This can apply to bugs and to scripts. When a script is written, determine its risk to the customer experience. Ask yourself: will the customer suffer in a large way if I don't run this script and the product doesn't work the way the customer expected, or will it just be a minor annoyance? Once you determine the criticality of your scripts, you can then determine which ones need to be run and when. As you will soon see, many times you will not be asked to provide a date as to when your staff can be done with their work; rather, your stakeholders will try to decide for you. Life sometimes just doesn't go as you would have planned. So what do you do? You let your stakeholders know what you can and cannot accomplish in the given time frame. You do this by looking at what you deemed critical and starting from there. If you have time left, you can run what is marked major, and so on...

SLIDE 36

If you have inherited an existing team with existing test scripts, I highly recommend instituting a peer review process for those scripts. Peer review is a process where a tester meets with another tester to review the contents of their scripts. This process assists in weeding out scripts that are no longer functioning properly, that are too cumbersome, or that are just poorly written. Use a scorecard with like criteria for pass/fail, and you will see a tremendous improvement in the output of your scripts and the time it takes to run them.

11. Metrics. Read all about them. The only way for you to publicize the great work you and your team are doing is through metrics. Additionally, and more importantly, these numbers and trends help you to better manage your team and your goals. There are literally hundreds of metrics for you to choose from. However, when you are just starting out, there are a few basic ones that you should concentrate on. How many defects are you finding per application, per tester? How long is it taking to execute your tests? How long is it taking for your I&T department to correct the bugs, and how many are outstanding for how many days? How many requirements did you test against? How many tests passed, and how many failed? These are typical metrics that can help you view the effectiveness of your department. Trend your data, collect it, and analyze it. Use it for your decision making. You can always add to what you are collecting. Consider a dashboard or scorecard that you can share with your stakeholders.

12. Tools. Consider your options as to what test tools you may need. Do you want to introduce automation? Do you have the skills on your team to perform this task? Don't be misled by the test tool vendors: automation takes time before you save time. Most scripts should be manually written before they are automated, so you won't necessarily save time up front. The value is in your regression suite once automated; these can run unassisted. Don't ever forget to include setup and rework time in your business case. Automation is a development effort, and it requires funding and time to be successful.

SLIDE 37

If you decide that automation is for you, there are some things to consider when evaluating tools. Tool evaluation:

  • Ease of use: some are very complicated and difficult to learn.
  • Ability to customize the tool: not all fields will be to your liking.
  • Platform support: be sure you have the environment.
  • Tool characteristics: does it work standalone, does it require Java, .NET, etc.?
  • Test language: this is tricky; sometimes the testers will need to learn the tool's language (not always easy for them to do).
  • Test tool database: be sure your company can support this.
  • Server requirements: many a manager gets stuck buying the tool and then finding out they need a server (which may be expensive).
  • Test suite recovery logic: what is the recovery process? You don't want to rely on housing your tests where you can't retrieve them.
  • Test management: does the tool provide information for you and your staff about test status, failures, etc.?
  • Tool integration: will it work with other tools you are using?
  • Reporting capabilities: how easy is it to get reports for your manager and stakeholders? Are they canned reports? Customizable?
  • Performance, stress, and load test capabilities: does the tool include the ability to perform stress, performance, or load testing, or is it just a test scripting tool?
  • Vendor reliability and support: is this a vendor that has been in business awhile? What is their support and maintenance policy? Will they be available around the clock to answer technical questions?

13. How can I make automation successful? Like everything else in your world, stakeholder buy-in is critical. They may not like the fact that it will be difficult for them to understand the "script". When you do manual testing, you can walk them through the script, but when you automate, it is not as easily understood. Your stakeholders may think that automation will decrease your test time and that you will do more with less. They may not understand that this will take time to accomplish and that they should not expect immediate gains. Be patient with them, but be clear that this is a long-term commitment. You may need functional support from your developers to modify your tool (if the vendor allows). Get these commitments up front so you are not caught needing help without funding for it.

SLIDE 38

Don't forget to get training for your staff. Even what looks like the simplest of tools requires training and understanding. Don't overlook the fact that many test tools have a language of their own that is very much like learning programming. If your testers are not of a developer mindset, they may never learn how to use the tool. Did you hear that? NEVER. Not all testers have the ability to learn to automate using some tools, but that doesn't make them poor testers. You may need to hire testers who are a bit more technical, who already know the tool, or who have been developers in a former life.

Software Quality Assurance testing can be a rewarding, exciting career. It requires a lot of hard work, patience, understanding, and training. Just like all careers, it requires dedicated people who believe in it and know how to accomplish the goals. Training and keeping up with the latest tools and techniques is a must. Attending conferences, talking to others in the field, and reading trade journals are also quite beneficial. In the end, your dedication will win the trust of those around you, and the savings that you bring in terms of cost avoidance will become very noticeable. Good luck. I've been doing this for well over twenty years and have never tired of the ever-changing, challenging world of software quality assurance. I hope you find it as exciting.