BIO PRESENTATION Better Software Conference & EXPO September 27-30, 2004 San Jose, CA USA
W12
September 29, 2004 3:00 PM
A MANAGER'S GUIDE TO GETTING THE MOST OUT OF TESTING AND QA
Brian Warren Ceridian Corporation
Brian Warren
Brian Warren is a QA Manager in Ceridian Corporation's Human Resource Solutions division, where he manages testing of benefits management tools and HR information systems. He has eleven years of experience in hardware and software testing and development, having worked in Dell's Enterprise Systems Group, IBM's Netfinity Group, a dot-com startup, and a niche software company that developed heads-down data entry products before joining Ceridian in 2003. In addition to his test management experience, Brian has held a wide range of development, testing, and IT roles: he served as test methodologist for Dell's Server Engineering organization, managed a software development and tools team responsible for test automation, load testing, and functional test tools, and worked as a database and web developer.
Which of the organizations below do you work in?
Quality Assurance
Quality Control
Testing
Verification & Validation
More than one of the above
None of the above
How widely understood are the roles of your Test/QA organizations?
Role clearly understood at all levels of the company
Role clearly understood within Test/QA and the teams they support
Test/QA staff understand, others may not
Some Test/QA staff still unclear
Non-issue: No Test/QA to worry about
Understanding of the role Test/QA play in the organization affects:
How much value they are perceived to add (and how …)
How other teams interact with them
How the Test/QA organizations operate
In the absence of real understanding, the labels are open to interpretation:
"Quality Police" or intelligence gatherers?
Product experts or process gurus?
Consider what happens when the following groups lack that understanding:
Executive Management
Control how much is invested in Test/QA
Evaluate performance of departments
Other teams (e.g., development, marketing)
Supply critical information to Test/QA
Primary customers of Test/QA outputs
Individuals within the organization
Perform the actual work
Customers
Final consumers of product and providers of corporate revenue
Job-seekers
Provide talent pool to support organizational growth
It depends on what the team is doing and how it operates.
There is no single "industry standard" answer.
IEEE, CMMI, RUP, and others each have their own definitions.
IEEE covers all the bases with two definitions for quality assurance:
(1) "a planned and systematic pattern of all actions necessary to provide adequate confidence that an item or product conforms to established technical requirements;
(2) a set of activities designed to evaluate the process by which products are developed or manufactured"
… and definitions for "testing" and "verification and validation":
Testing: "the process of operating a system or component under specified conditions, observing or recording the results, and making an evaluation of some aspect of the system or component." (IEEE, 1990)
[Verification and Validation] determines whether development products of a given activity conform to the requirements of that activity, and whether the software satisfies its intended use and user needs. (IEEE, 1998)
CMMI provides a definition (unsurprisingly):
"The purpose of SQA is to provide management with appropriate visibility into the process being used by the software project and of the products being built." (Paulk et al., 1995)
While RUP provides separate definitions for Test and Quality Assurance:
Test: A discipline in the software-engineering process whose purpose is to integrate and test the system. (RUP, 2003)
Quality Assurance: All those planned and systematic actions necessary to provide adequate confidence that a product or service will satisfy given requirements for quality. (RUP, 2003)
Successful communication depends on the audience:
Technical professionals
Non-technical professionals
Executives and managers
Organizations, their objectives, and their …
Johanna Rothman provides two very tangible definitions…
QA staff assess the product and the system that produced the product, to see if there are improvements the project team can make for this project or the next one.
Testers test the product, performing verification activities. They discover where the product works and where the product doesn't; that's for functionality, internationalization, performance, reliability, maintenance, usability, or some other attribute of the system.
Focus: Process or Product
Role: Advisory or Authoritarian
Reporting: Internal or External
Based on 2-D model from Christensen and Thayer, 2001
Product focused: Testing
Process focused: Quality Assurance
Production focused: Quality Control
A question of "how much" you can do, not "which"
All options are important and each contributes to the overall ability to deliver the right quality to your customers
The demand for support in each of these areas varies based on your organization and its context
Solutions are not one-size-fits-all
Each focus represents a different set of problems and may require different skill sets and structures
[Table: skill sets needed by Product-focused, Process-focused, and Production-focused teams, compared across three areas: Communications Skills; Requirements and Business Analysis Skills; and Technical Knowledge and Analytical Skills. Recoverable fragments: communicating test results and issue data to technical audiences and business audiences (product and production focus) versus communicating adherence and compliance data to primarily non-technical audiences (management) (process focus); knowledge of business processes and operations, market demands and product usage, development processes, and compliance policies; testing tools and techniques, trending analysis, troubleshooting skills, and the ability to review work products and determine process adherence or deviation.]
Advisory: makes recommendations
Authoritative: makes decisions (gatekeeper)
Advisory Pros
Non-adversarial approach fosters cooperation and communication
Places ownership of release decisions with those best suited to handle them -- the business decision makers
Advisory Cons
Can't stop the business from making a "bad" decision should it decide to do so
Success is heavily reliant upon the advocacy and data presentation skills of Test/QA staff
Authoritative Pros
Ability to halt a project prevents Test/QA from being overrun by stronger personalities from other organizations
Authoritative Cons
Sets up an adversarial relationship that can stifle communication, cooperation, and trust
Quality of release decisions dependent upon Test/QA knowledge of the product
Test/QA organization can now be blamed regardless of the decision made
"Some testers dream of having veto control …"
Kaner, Bach, and Pettichord in Lessons Learned in Software Testing
The Advisory Role
Great approach unless your Test/QA staff are unable to collect and present information well enough to sway the rest of the organization
The Authoritative Role
Perfect for companies that have development managers who make chronically poor decisions by ignoring available data and have Test/QA staff that make better business decisions than senior management
A Parting Thought:
If your testers aren't capable of succeeding in an advisory role, why would you trust them in an authoritative role?
Structural options: core function (internal), supporting function (parallel), or contractors and consultants (external)
Internal Pros
Common management between Test/QA and the teams they support eases communications and information sharing
Single point of ownership makes accountability simple and reduces opportunity for finger-pointing
Internal Cons
Test/QA may be afraid to deliver bad news if it causes problems for their management
May fragment Test/QA in large organizations, making it difficult to maintain commonality between them
Parallel Pros
Independent management chain can make it easier for Test/QA to deliver bad news
Can simplify management of very large Test/QA organizations
Parallel Cons
Greater separation can impede open communications
Opportunity for finger-pointing between management chains
External Pros
Staff not prone to organizational bias or indoctrinated with corporate culture
Contract or consulting staff do not require the same long-term investment as permanent staff
External Cons
Staff's culture and biases are a product of their company and may not match yours
External agencies may tell you what you want to hear in order to protect their revenue base
[Table: comparison of Internal, Parallel, and External structures across four areas: Start-up Ease; Scalability; Flexibility; and Communications, Trust, and Access to Information. Recoverable fragments: shared management of internal Test/QA teams makes "tuning" more difficult and can reduce the ability to quickly adapt staffing levels; parallel structures risk "Us versus Them" competition between management chains that creates barriers to trust and the flow of information, and teams may develop as independent pockets; with external structures, style and culture come from the external company, and staff reporting to a separate company limits communications, trust, and access to information; internal structures minimize the "Us versus Them" mentality.]
Methods for auditing and reporting of corporate financials
MCI WorldCom – Arthur Andersen (Corporate Library, 2002)
Enron – Merrill Lynch (BBC UK, 2003)
While corporate governance is definitely not the same as software testing and QA, both involve:
Gathering and communicating evaluations of enormously complex systems
Potentially severe impacts (economic and/or legal) to providing incorrect data
Compare calls for corporate governance reform to statements made about the "software quality crisis":
"… decade or more is at the heart of what has gone so terribly wrong in corporate America over the past few years. If significant steps are not taken to revisit and remodel corporate governance practices, corporate America will continue to attract anger and animosity not … section of American Society."
William H. Donaldson, Chairman, Securities and Exchange Commission, March 2003
Lesson: Management is ultimately accountable
Senior management is responsible for the actions of their company
Additional controls will not compensate for lack of ethics or integrity on the part of leadership
Implications to Test/QA:
No Test or QA structure will reliably halt the impacts of irresponsible management. The solution lies in developing a trustworthy management team and providing that team the data needed to make informed decisions, not in building an organization based on mistrust of management.
Lesson: Auditors must be both independent and singularly focused
Even highly successful auditing organizations could be tempted to sway results due to the presence (and revenues) of other engagements with auditing clients
Implications to Test/QA:
An organization responsible for auditing process compliance must not be the same one responsible for developing, producing, or testing product. This constitutes a conflict of interest that is not conducive to objective information.
The “right answer” for you depends on…
The size and scale of your organization
The economic, market, and regulatory constraints of your business
How your organization is structured currently
But there are some common recommendations...
Establish and communicate a clear charter and mission
Must be both accessible and understandable in plain language
Not useful in educating the company if the language is impenetrable to those outside Test/QA
Structure Test or QA as an advisory function
If your organization is not accustomed to this …
Segment testing and QA functions by focus
Allows better support of each specific area and the different skill sets each requires
Avoid combining process QA with any other function
If your organization is too small to support a separate process QA group …
Ceridian Corporation provides this information to clients and friends for general information purposes. None of this material should be construed to be offering legal advice, nor should it be relied on as specific advice to any individual or organization. Please consult your legal advisor for any specific legal advice.
Rothman, J. (2002), "What does your title say about your job?", Stickyminds.com, Orange Park, FL. [Rothman, 2002]
Paulk, M. C., Weber, C. V., Curtis, B., and Chrissis, M. B. (1995), The Capability Maturity Model: Guidelines for Improving the Software Process, Addison-Wesley, Reading, MA, p. 35. [Paulk et al., 1995]
Kulpa, M., and Johnson, K. (2003), Interpreting the CMMI: A Process Improvement Approach, Auerbach Publications, New York, NY. [Kulpa and Johnson, 2003]
Kaner, C., Bach, J., and Pettichord, B. (2002), Lessons Learned in Software Testing, Wiley and Sons, New York, NY, pp. 8-9. [Kaner et al., 2002]
IEEE (1990), "IEEE Standard Glossary of Software Engineering Terminology", New York, NY. [IEEE, 1990]
The Corporate Library (2002), "Scandal Spotlight: WorldCom Scandal", June 19-25, 2002 edition, www.thecorporatelibrary.com, Portland, ME. [Corporate Library, 2002]
Christensen, M. J., and Thayer, R. H. (2001), The Project Manager's Guide to Software Engineering's Best Practices, IEEE Computer Society Press, Los Alamitos, CA, pp. 281-285. [Christensen & Thayer, 2001]