BIO PRESENTATION PAPER International Conference On Software Test Analysis And Review May 14-18, 2007 Orlando, FL USA
T15
5/17/2007 3:00:00 PM
"From Start Up to World Class Testing"
Iris Trout, Bloomberg LP
Iris C. Trout
Iris Trout is currently the Director of Quality at Bloomberg LP, where she is responsible for the quality assurance of trading applications globally. She has created a world-class testing organization focused on quality and productivity. Iris has been managing financial Quality Assurance teams and process improvement efforts for many years. Previously she was a Vice President at JP Morgan, overseeing the CMM and Six Sigma rollout within their retail banking division. She has also created and managed several Quality Assurance testing efforts, including teams at McKinsey & Company, MetLife, and Merrill Lynch. Iris was formerly a CMM lead assessor and holds a black belt in Six Sigma. She feels that concentrating on the implementation of QA best practices and practical solutions can yield strong team results.
May 13, 2007 Iris Trout, Bloomberg LP 1
Topics to be covered during the course of this presentation:
Presentation Handouts:
1. QA staffing interview questionnaire
2. Stakeholder analysis template
3. Stakeholder communication plan
4. Quality Masters training presentation
5. Metrics scorecard sample with sample metrics
6. QA organizational goal template
Do any of these challenges sound familiar?
- Where do you even begin?
- Are your testers treated as though they are untalented or insignificant?
- Does your team rarely receive adequate requirements, and do business owners make changes to requirements frequently without advising your QA staff?
- Are defects reaching production without anyone understanding why?
- Have you purchased an automation tool, but no one knows how to use the tool or how to train the testers on proper automation techniques?
Building a world-class QA organization requires:
- Trust and open lines of communication between IT, Business, and QA.
- A plan for development and implementation of QA process improvement throughout the SDLC, applied consistently across the QA organization.
- Service level agreements and metrics reporting.
- Frequent communication with your stakeholders!
- Giving your staff the tools to do the job (i.e., training).
Staffing the team:
- Hire people with the right skill sets, high performers, and agents of change.
- Bring in strong QA methodology people to complement other product-savvy team members.
- Pair new hires with existing staff until the team is fully trained. This creates a good mix of product knowledge and QA knowledge.
- Train everyone in QA best practices as they relate to QA process and methodology.
- Assess existing staff for the depth of their product knowledge.
- Ask the same questions of every candidate (see the Interview checklist).
- Involve your team in the interview process.
- Once new testers start, get them working, and have them meet with other testers.
- Who are your stakeholders?
- How will you communicate with the stakeholders, and how often?
Keep talking…
- Create a stakeholder communication plan.
- Meet with business leaders to discuss goals, plans, and QA progress.
- Communicate on a regular basis and according to a published schedule:
- Send regular status updates to each stakeholder and team leader.
Establish a set of Service Level Agreements (SLAs) between business partners and the test team that will serve to help improve the workflow while increasing test coverage and test quality with an unbiased, independent view.
A Service Level Agreement (SLA) is a contract for the delivery of a service:
- It defines responsibilities, escalation of issues, etc.
- Each service is measured against a service level objective.
- Get stakeholder agreement prior to setting your SLAs.
- SLAs should be measurable, well-documented, and easily accessible for inspection.
(SLAs continued)
Define the services and applications to be supported.
Items to consider including in your SLA:
- Projects must include functional specifications or requirements, or QA will not accept the project.
- Code must be unit tested before delivery to QA for testing.
- Define a list of deliverables that will accompany each stage.
- Requirements will be subject to inspection and base-lining.
- After QA review, QA will test the product utilizing a risk-based testing process (critical, major, minor). This process will guarantee a certain level of test coverage based upon the test case criticality. Items to be tested will be contained in the test plan and agreed to by all stakeholders.
- New test scripts will be added to the test inventory, which will require a peer review before scripts can be introduced into QA's production test bed.
- Define the decision criteria for daily build/move cycle discussions.
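SLA entry criteria like those above can be expressed as a machine-checkable gate that QA runs before accepting a project. The sketch below is illustrative only; the field names (`deliverables`, `requirements_baselined`, `stakeholders_signed_off`) are assumptions, not terms from the presentation:

```python
# Hypothetical sketch: an SLA entry gate QA could run before accepting a project.
REQUIRED_DELIVERABLES = {"functional_spec", "requirements", "test_plan"}

def qa_entry_gate(project: dict) -> list:
    """Return a list of SLA violations; an empty list means QA accepts the project."""
    violations = []
    missing = REQUIRED_DELIVERABLES - set(project.get("deliverables", []))
    if missing:
        violations.append("missing deliverables: %s" % sorted(missing))
    if not project.get("requirements_baselined", False):
        violations.append("requirements not inspected and base-lined")
    if not project.get("stakeholders_signed_off", False):
        violations.append("test plan not agreed to by all stakeholders")
    return violations

project = {
    "deliverables": ["functional_spec", "requirements"],
    "requirements_baselined": True,
    "stakeholders_signed_off": False,
}
print(qa_entry_gate(project))
```

The point is less the code than the discipline: each SLA clause becomes an explicit yes/no check that both sides can inspect.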
Keep your processes in line with QA best practices:
- Understand the applications you are supporting.
- Share your processes with your stakeholders.
- Ensure testing techniques are consistently applied throughout the department.
The purpose of Peer Reviews is to remove defects from the software work products early and efficiently. An important corollary effect is to develop a better understanding of the software work products and of defects that might be prevented.
Reviewing test cases between tester and peer tester positions the tester to find flaws in test logic and/or missing test cases, and will reduce the amount of time it takes to run them.
Implement a scorecard that encourages a standardized, metrics-driven approach to assessing your test inventory. A simple scorecard can help you to look at scripts for:

Functional elements:
- Effectiveness of test cases in achieving stated business objectives.

Quality elements:
- Repeatability factor.
- Ease of use.
- Entrance/exit criteria.
- Quality of support documentation.

Automation elements (if applicable):
- Utilization of tool efficiencies.
- Reusability.
- Maintenance plans for scripts.
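A scorecard like the one above reduces to a small scoring function. In this sketch the criteria mirror the functional/quality/automation elements listed, but the weights and the 80% pass mark are assumptions chosen for illustration:

```python
# Illustrative peer-review scorecard for a single test script.
# Weights and the pass threshold are assumptions, not values from the talk.
CRITERIA = {
    "meets_business_objectives": 3,   # functional element
    "repeatable": 2,                  # quality elements
    "easy_to_use": 1,
    "entrance_exit_criteria": 1,
    "support_docs_complete": 1,
    "reuses_tool_efficiencies": 1,    # automation elements (if applicable)
    "has_maintenance_plan": 1,
}
PASS_THRESHOLD = 0.8  # assumed pass mark: 80% of applicable points

def score_script(review: dict):
    """Score one review; criteria marked None are treated as not applicable."""
    applicable = {k: w for k, w in CRITERIA.items() if review.get(k) is not None}
    earned = sum(w for k, w in applicable.items() if review[k])
    ratio = earned / sum(applicable.values())
    return round(ratio, 2), ("pass" if ratio >= PASS_THRESHOLD else "rework")

review = {
    "meets_business_objectives": True,
    "repeatable": True,
    "easy_to_use": False,
    "entrance_exit_criteria": True,
    "support_docs_complete": True,
    "reuses_tool_efficiencies": None,  # manual script: automation N/A
    "has_maintenance_plan": None,
}
print(score_script(review))
```

Scoring only the applicable criteria keeps manual and automated scripts comparable on the same card.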
Your test inventory is a valuable asset, yet the most easily overlooked when it comes to reinvestment.
- Demonstrate the value of the inventory to QA's stakeholders.
- Use the review results to set direction in terms of process improvement.
- Standardize your approach to automation and the implementation of automation efficiencies.
- Run the scorecard as a first pass for the peer review process against your inventory.
Start simple; you may not be at a stage where you're ready to collect sophisticated measures (and that's OK!).
- Count defects found per product tested.
- Track where defects are found at each point in the SDLC.
Remember: what gets measured gets managed!
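Those starting measures need nothing more than a tally over a defect log. A minimal sketch, assuming a simple list-of-dicts log whose field names (`product`, `tester`, `phase`) are illustrative:

```python
# Minimal sketch: basic QA metrics tallied from a defect log.
from collections import Counter

defect_log = [
    {"product": "trading-ui", "tester": "alice", "phase": "system test"},
    {"product": "trading-ui", "tester": "bob",   "phase": "integration"},
    {"product": "pricing",    "tester": "alice", "phase": "system test"},
    {"product": "trading-ui", "tester": "alice", "phase": "UAT"},
]

per_product = Counter(d["product"] for d in defect_log)  # defects per product tested
per_tester  = Counter(d["tester"] for d in defect_log)   # defects per tester
per_phase   = Counter(d["phase"] for d in defect_log)    # where in the SDLC they surface

print(per_product)
print(per_phase)
```

Even this crude tally answers the first questions stakeholders ask: which applications are buggiest, and how late in the cycle the bugs are found.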
Build regression suites over time:
- Identify repeatable tests (strong candidates for automation).
Big-bang approach: adopt a full life-cycle methodology with a strong emphasis on quality. Get all members of the team to participate in this rollout, and require processes to be consistent and standardized across the organization.
- Pro: consistency throughout the organization.
- Con: a large effort, and a drastic change for some areas.
Incremental approach: review each area that requires improvement, and make subtle changes along the way. Each area gets to pick what they want to improve and how.
- Pro: each area buys in.
- Con: slower, uneven change for some areas.
My recommendation: make incremental improvements over time.
- Sweeping changes can be too overwhelming.
- Improvements take planning and will require time for execution.
- Share your goals with your stakeholders.
- Meet regularly to gain alignment.
- Set goals that are achievable.
- Start with changes that provide immediate benefits.
- Publish and track your goals.
Roadmap from Current State to End State: Script Validation; QA Process Improvement; Test Automation – Quality and Quantity; Improved Metrics and Reporting; Tools – Test Coverage; Training; Communication and Management; Productivity; Estimating/Sizing.
Making automation successful:
- Standardize your automation tools and methodology.
- Move incrementally towards best practices.
- Ensure application changes are communicated to the test automation team.
- Plan for automation in terms of QA budget and resources.
Remember: Teamwork = SUCCESS!
- Stay aligned… and keep in alignment!
- Focus on a cross-team partnership!
- Keep the lines of communication open!
- Communicate often and honestly!
So you finally did it. You convinced your boss that your company needs a software quality assurance department. Well, congratulations. Now what? Where do you begin? Do you have enough experience for the job? Are there people who are already performing this function without much guidance or charter? No need to panic. Building a software quality assurance department is easy when you take the right steps and get the right people and processes in place. Let's take a look at what you are going to need.

1- What will the department be called? A department title sounds easy, but don't forget to pick one that "speaks" to your stakeholders: a name that is easy to use and says what you do. Many people choose SQA (Software QA), AQA (Application Quality Assurance), or simply QA.

2- What people will make up the department? Assess the people you have, if any; perhaps you have inherited a team that has not been organized. My preference for newly formed departments is to start with some seasoned consultants who have performed QA in many organizations like your own. These people will generally guide you and advise you, and will become the underpinning of your organization, however temporary or permanent they become. If you already have people on your team, gather them together and tell them what is happening. Everyone needs to be involved and understand what their role will be.

3- How many people will I need? This is a more difficult question to answer when you are just starting out. Perhaps you have been given a certain amount of headcount to begin with; if so, hire strong QA methodology people and start from there. If you need to determine headcount, your work has just begun. Meet with the stakeholders for whom you will be performing quality assurance testing. Determine the volume of changes going through the pipeline, the level of releases to production currently being made, and the complexity of the applications. You are never going to get it all right the first time, so you may have to take a stab at how many people you will need based on those factors; it is difficult to determine how "buggy" the code is unless defects have been tracked in unit testing.
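One way to turn those inputs (change volume, release cadence, application complexity) into a first-pass headcount is a rough workload heuristic. The formula and every coefficient below are assumptions to validate against your own data, not figures from this paper:

```python
# Rough, hypothetical sizing heuristic: estimate testers from workload.
# All default coefficients are assumptions to revisit once real data exists.
import math

def estimate_testers(changes_per_month: int,
                     releases_per_month: int,
                     complexity: float,                      # assumed 1.0 = average app
                     hours_per_change: float = 6.0,          # assumed test effort per change
                     regression_hours_per_release: float = 40.0,
                     productive_hours_per_tester: float = 120.0) -> int:
    """Ceil of estimated monthly test hours over productive hours per tester."""
    workload = (changes_per_month * hours_per_change * complexity
                + releases_per_month * regression_hours_per_release)
    return math.ceil(workload / productive_hours_per_tester)

print(estimate_testers(changes_per_month=30, releases_per_month=2, complexity=1.2))
```

Treat the answer as a starting stab, exactly as the text advises; refine the coefficients as defect and effort data accumulate.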
4- Hiring. Now the fun begins. Who is qualified for this position? What questions should I ask them? How will I know if their responses are accurate and suitable for my environment? The first thing is to put your job description together. You will need people who think quickly on their feet, communicate well, think outside of the box, and can get along with others.

You should also gather a list of interview questions and have them handy while interviewing your candidates. Try to ask the same questions of each candidate so you can compare. Ask a series of questions that will give you a good feel for whether they love software testing or are just there for a paycheck. Questions should be a good mix of technical and scenario-type questions, so that you can hear them articulate how they handled a certain difficult client and how they describe the need for bug tracking. In the end you must go with your gut. Hire the most pleasant, enthusiastic people who aren't afraid of work or of speaking up. Strong-willed, creative people make the best QA analysts.

5- Create a mission and goal statement. Put the words together that represent what you and your team are all about. What is the mission of the team? What do you expect to deliver? What are your goals? Be sure your team understands them.
6- How do we begin? Have your testers meet their stakeholders. Schedule meetings with all involved. Set up a training schedule for the testers to learn about the foundations of QA (if necessary). Have them learn about the applications they will support. Training and communication will be the key to your staff's success. While meeting with the stakeholders, be sure they understand your mission and goals. Get their buy-in.

7- Establish Service Level Agreements (SLAs). This will be critical to your success. Your stakeholders will need to review these SLAs, and all will need to agree to them. This is your contract for delivery! Perhaps your organization has response-time SLAs that the testers must test against. Others you define yourself: for example, the QA department will not accept code for testing after noon each day, etc. You can develop more as you go along. They are important because they are the rules by which you will operate.
8- What are your processes? Are there any? Do they work? Your answer will depend on whether you have inherited an established QA organization or are just starting out, but analyzing process never stops regardless. Take time to look at how you process your work. Is it repeatable? Is it consistent? Can you understand the inputs and the expected outputs? Start small, like creating a consistent test plan template. Or better yet, search the web and find one already created; "us" testers love to share our work. A test plan that everyone uses is a big step in the right direction. Share it with your stakeholders for their approval. Once they get used to seeing a document used over and over, it becomes commonplace and they are more likely to read and approve it. A general rule that I like to follow is to keep looking at your process and improve over time rather than making sweeping changes to everything. It is better received by the testers, and it gives you a chance to be sure that the change is really taking hold and that it is a change for the better. Wherever possible, measure the change from how you did it previously to how you are doing it now. You will be amazed at the time savings or the quality improvement you will gain when the change is the right one.

9- Common language. Yes, I mean we all need to speak a common QA language in our workplace. Everyone should understand your definitions of unit testing, system testing, integration testing, criticality, script, suite, scenario, and so on. When everyone in your organization uses the same test language, it will work.

10- Criticality. Is everything urgent? This can apply to bugs and to scripts. When a script is written, determine its risk to the customer experience. Ask yourself: will the customer suffer in a large way if I don't run this script and the product doesn't work the way the customer expected, or will it just be a minor annoyance? Once you determine the criticality of your scripts, you can then determine which ones need to be run and when. As you will soon see, many times you will not be asked to provide a date as to when your staff can be done with their work; rather, your stakeholders will try to decide for you. Life sometimes just doesn't go as you would have planned. So what do you do? You let your stakeholders know what you can and cannot accomplish in the given time frame. You do this by looking at what you deemed critical and starting from there. If you have time left, you can run what is marked major, and so on.
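The critical-first triage described above can be sketched as a simple greedy selection over a time budget. The script names, durations, and the greedy strategy itself are illustrative assumptions:

```python
# Sketch of risk-based selection: run critical scripts first, then major,
# then minor, skipping anything that no longer fits the time budget.
PRIORITY = {"critical": 0, "major": 1, "minor": 2}

def select_scripts(scripts, hours_available):
    """Pick scripts in criticality order within the available time (greedy)."""
    chosen, used = [], 0.0
    for s in sorted(scripts, key=lambda s: PRIORITY[s["criticality"]]):
        if used + s["hours"] <= hours_available:
            chosen.append(s["name"])
            used += s["hours"]
    return chosen

scripts = [
    {"name": "login",       "criticality": "critical", "hours": 2},
    {"name": "trade-entry", "criticality": "critical", "hours": 3},
    {"name": "reports",     "criticality": "major",    "hours": 2},
    {"name": "tooltips",    "criticality": "minor",    "hours": 1},
]
print(select_scripts(scripts, hours_available=6))
```

Note the greedy pass may skip a larger major script yet still fit a small minor one; that trade-off is exactly the kind of decision to surface to stakeholders rather than hide.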
If you have inherited an existing team with existing test scripts, I highly recommend instituting a peer review process for those scripts. Peer review is a process where a tester meets with another tester to review the contents of their scripts. This process assists in weeding out scripts that are no longer functioning properly, that are too cumbersome, or that are just poorly written. Use a scorecard with like criteria for pass/fail, and you will see a tremendous improvement in the output of your scripts and the time it takes to run them.

11- Metrics. Read all about them. The only way for you to publicize the great work you and your team are doing is through metrics. Additionally, and more importantly, these numbers and trends help you to better manage your team and your goals. There are literally hundreds of metrics for you to choose from; however, when you are just starting out, there are a few basic ones that you should concentrate on. How many defects are you finding per application, per tester? How long is it taking to execute your tests? How long is it taking for your I&T department to correct the bugs? How many are outstanding, and for how long? How many test cases passed, and how many failed? These are typical metrics that can help you view the effectiveness of your team and support your decision making. You can always add to what you are collecting. Consider a dashboard or scorecard that you can share with your stakeholders.

12- Tools. Consider your options as to what test tools you may need. Do you want to introduce automation? Do you have the skills on your team to perform this task? Don't be misled by the test tool vendors: automation takes time before you save time. Most scripts should be manually written before they are automated, so you won't necessarily save time up front. The value is in your regression suite once automated; these suites can run unattended, again and again, and that is where you build your business case. Automation is a development effort, and it requires funding and time to be successful.
If you decide that automation is for you, there are some things to consider when evaluating tools:
- Will your testers need to learn the tool's scripting language (not always easy for them to do)?
- Beware of buying a tool and then finding out you need a server (which may be expensive).
- Don't rely on housing your tests where you can't retrieve them.
- Can the tool notify your staff about test status, failures, etc.?
- What reports can you share with management and stakeholders? Are they canned reports? Customizable?
- Does the tool have the ability to perform stress, performance, or load testing, or is it just a test scripting tool?
- Has the vendor been in business awhile? What is their support and maintenance policy? Will they be available around the clock to answer technical questions?

13- How can I make automation successful? Like everything else in your world, stakeholder buy-in is critical. They may not like the fact that it will be difficult for them to understand the "script". When you do manual testing, you can walk them through the script, but when you automate, it is not as easily understood. Your stakeholders may think that automation will shrink your test time and that you will do more with less. They may not understand that this will take time to accomplish and that they should not expect immediate results. Automation also requires a funding commitment. You may need functional support from your developers to modify your tool (if the vendor allows). Get these commitments up front so you are not caught needing help without funding for it.
Don't forget to get training for your staff. Even what looks like the simplest of tools requires training and understanding. Don't overlook the fact that many test tools have a language of their own, and learning it is very much like learning programming. If your testers are not of a developer mindset, they may never learn how to use the tool. Did you hear that? NEVER. Not all testers have the ability to learn to automate using some tools, but that doesn't make them poor testers. You may need to hire testers who are a bit more technical, already know the tool, or have been developers in a former life.

Software quality assurance testing can be a rewarding, exciting career. It requires a lot of hard work, patience, understanding, and training. Like all careers, it requires dedicated people who believe in it and know how to accomplish the goals. Training and keeping up with the latest tools and techniques is a must. Attending conferences, talking to others in the field, and reading trade journals are also quite beneficial. In the end, your dedication will win the trust of those around you, and the savings you will bring in terms of cost avoidance will become very apparent. I have spent many years in this field and have never tired of the ever-changing, challenging world of software quality assurance. I hope you find it as exciting.