Todd Warren – CS 394 Spring 2011 – Team Structure at Microsoft


SLIDE 1

Todd Warren – CS 394 Spring 2011

SLIDE 2

• Team structure at Microsoft
• Product Complexity and Scheduling
• Quality and software testing
• Knowing when to ship

SLIDE 3

[Diagram: a Program grows into a Programming System (3x effort) or a Programming Product (3x effort); a Programming Systems Product costs 9x.]

Source: Fred Brooks Jr., The Mythical Man-Month, "The Tar Pit"

SLIDE 4

Professional Services vs. Product Business:
• Marginal Costs: Almost Constant vs. Drive towards High Concentration
• Regional Appearance: Mainly regional, with increasing tendency to globalization vs. Highly Globalized
• Customer Relationship: One to One vs. One to Many
• Most Important Number to Watch: Capacity Utilization Rate vs. Market Share (Installed Base)

Relevance of Management Areas:
• Professional Services: 1. Human Resources, 2. Software Development, 3. Marketing and Sales, 4. Strategy
• Product Business: 1. Strategy, 2. Marketing and Sales, 3. Human Resources, 4. Software Development

Source: Hoch, Roeding, Purkert, Lindner, “Secrets of Software Success”, 1999

SLIDE 5

• Product Manager
• User-Interface Designer
• End-User Liaison
• Project Manager
• Architect
• Developers
• Tool Smith
• QA/Testers
• Build Coordinator
• Risk Officer
• End-User Documentation

Source: McConnell. Microsoft's discipline names: Program Management, Software Development Engineers, Test and Quality Assurance, User Assistance / Education.

SLIDE 6

Size Matters!
• Different team sizes call for different methodologies and approaches

Scope of features and quality matters: it affects the level of process needed and the overhead.

• 5-person teams: moderate process, shared roles
• 24-person teams (PMC): moderate process, lifecycle-oriented roles and specialization; good for "Extreme" style process
• 60-100 people (MS Project): moderate process, some loose functional specialization and lifecycle
• 100-200-person teams (Windows CE): medium to heavy process, lifecycle roles and functional specialization
• 1000+-person teams (Windows Mobile): heavy process, multiple methodologies, formal integration process

Higher quality means a more rigorous process. This is true also for open source and online projects:
• Apache is the best example of a very specified culture of contribution
SLIDE 7

[Diagram: team organizations ranging from Casual to More Formal to Very Formal; a single Feature Team at one end, and Functions A, B, and C each containing Development Teams 1-3 at the other.]

SLIDE 8

[Diagram: Office and Windows are each organized as a Core surrounded by multiple Edge teams.]

SLIDE 9

25% Developers

45% Testers

10% Program Management

10% User Education / Localization

7% Marketing

3% Overhead

SLIDE 10

• 1 UI Designer
• 5 Program Managers
• 8 Developers
• 10 Testers

SLIDE 11

30 Developers (27%)

36 Testers (33%)

15 Program Mgrs (14%)

20 UA/Localization (18%)

6 Marketing (5%)

3 Overhead (3%)

SLIDE 12

112 Developers (25.9%)

247 Testers (57.3%)

44 Program Mgrs. (10.2%)

12 Marketing (2.7%)

16 Overhead (3.7%)

SLIDE 13

• Admin/Other: 1%
• Development: 22%
• Program Mgt: 14%
• Test: 49%
• User Ed / Localization: 13%

SLIDE 14

Role: % of team (ratio to dev)
• Developers: 22.11% (1.00)
• Testers: 35.79% (1.62)
• Program Managers: 10.53% (0.48)
• UI Design: 5.26% (0.24)
• UE: 4.21% (0.19)
• Eng Services: 6.32% (0.29)
• Cust. Integration: 15.79% (0.71)

SLIDE 15

• 3 months maximum is a good rule of thumb for a stage/milestone.
  • Hard for people to focus on anything longer than 3 months.
• Never let things go un-built for longer than a week.
SLIDE 16

SLIDE 17

• 216 days of development (truthfully, probably more like 260 days)
• 284 days of "testing" in the example
  • Component tests: 188 days
  • System-wide tests: ~97 days
• Roughly a 50/50 split between design/implement and test/fix
• Some projects (e.g., operating systems, servers) have a longer integration period (more like 2:1)
• Factors: how distributed the team is, the number of "moving parts"
• Shows why some of the Extreme methodology is appealing

SLIDE 18

• 1/3 planning
• 1/6 coding
• 1/4 component test and early system test
• 1/4 system test, all components in hand
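Brooks' rule-of-thumb split can be checked and applied with a short calculation; the 240-day total below is a hypothetical example, not a figure from the slides:

```python
from fractions import Fraction

# Brooks' rule-of-thumb schedule split (The Mythical Man-Month).
SPLIT = {
    "planning": Fraction(1, 3),
    "coding": Fraction(1, 6),
    "component test and early system test": Fraction(1, 4),
    "system test, all components in hand": Fraction(1, 4),
}

# The fractions must cover the whole schedule exactly.
assert sum(SPLIT.values()) == 1

def phase_days(total_days: int) -> dict[str, float]:
    """Allocate a total schedule across Brooks' phases."""
    return {phase: float(frac * total_days) for phase, frac in SPLIT.items()}

for phase, days in phase_days(240).items():
    print(f"{phase}: {days:.0f} days")
```

Note that fully half the schedule goes to testing, which matches the 50/50 design/implement vs. test/fix split on the previous slide.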

SLIDE 19

Milestone                   Planned Date   Actual Date
M1 start                    5/12/97        5/12/97
M1 end                      8/8/97         8/22/97
M2 start                    8/11/97        8/25/97
M2 end                      11/7/97        12/12/97
M3 start                    11/10/97       12/15/97
M3 end ("code complete")    2/23/98        3/31/98
Beta 1                      3/9/98         6/22/98
Beta 2                      5/11/98        9/21/98
RTM U.S.                    7/13/98        3/25/99
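The slip at each milestone can be computed directly from the table; a quick sketch using Python's datetime, with the dates copied from the slide:

```python
from datetime import datetime

# Planned vs. actual milestone dates from the slide (M/D/YY format).
MILESTONES = [
    ("M1 start", "5/12/97", "5/12/97"),
    ("M1 end", "8/8/97", "8/22/97"),
    ("M2 start", "8/11/97", "8/25/97"),
    ("M2 end", "11/7/97", "12/12/97"),
    ("M3 start", "11/10/97", "12/15/97"),
    ('M3 end ("code complete")', "2/23/98", "3/31/98"),
    ("Beta 1", "3/9/98", "6/22/98"),
    ("Beta 2", "5/11/98", "9/21/98"),
    ("RTM U.S.", "7/13/98", "3/25/99"),
]

def slip_days(planned: str, actual: str) -> int:
    """Days of slip between a planned and an actual date."""
    fmt = "%m/%d/%y"  # %y maps 69-99 to the 1900s
    return (datetime.strptime(actual, fmt) - datetime.strptime(planned, fmt)).days

for name, planned, actual in MILESTONES:
    print(f"{name}: slipped {slip_days(planned, actual)} days")
```

The slip compounds: M1 ends two weeks late, but by RTM the project is roughly eight and a half months behind plan.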

SLIDE 20

Project Time Split (% of project time):

                 Brooks   Office 2000 plan   Office 2000 actual   Project 2002   Office 2007
System test        25%          27%                 46%                23%            30%
Component test     25%          27%                 23%                38%            26%
Coding             17%          27%                 19%                27%            18%
Planning           33%          19%                 13%                11%            26%

SLIDE 21

• Design in scenarios up front
• What is necessary for the component
  • UI is different than API
  • Server is different than client
• Set criteria and usage scenarios
• Understanding (and controlling, if possible) the environment in which the software is developed and used
• "The last bug is found when the last customer dies"
  • Brian Valentine, SVP eCommerce, Amazon
SLIDE 22

• Exchange versions: 4.0 (latest SP), 5.0 (latest SP), and 5.5
• Windows NT versions: 3.51, 4.0 (latest SPs)
• Languages (Exchange and Windows NT): USA/USA, JPN/JPN, GER/GER, FRN/FRN, JPN/Chinese, JPN/Taiwan, JPN/Korean
• Platforms: Intel, Alpha (MIPS and PPC on 4.0 only)
• Connectors, X.400: over TCP, TP4, TP0/X.25
• Connectors, IMS: over LAN, RAS, ISDN
• Connectors, RAS: over NetBEUI, IPX, TCP
• Connector interop: MS Mail, MAC Mail, cc:Mail, Notes
• News: NNTP in/out
• Admin: daily operations
• Store: Public Store >16GB and Private Store >16GB
• Replication: 29 sites, 130 servers, 200,000 users, 10 AB views
• Client protocols: MAPI, LDAP, POP3, IMAP4, NNTP, HTTP
• Telecommunication: Slow Link Simulator, Noise Simulation
• Fault tolerance: Windows NT Clustering
• Security: Exchange KMS server, MS Certificate Server
• Proxy firewall: Server-to-Server and Client-to-Server

SLIDE 23

• 5M lines of code
• 4 processor architectures
  • ARM/XScale, MIPS, x86, SH
• 20 Board Support Packages
• Over 1000 possible operating system components
• 1000s of peripherals

SLIDE 24

• 2 code instances ("standard" and "pro")
• 4 ARM chip variants
• 3 memory configuration variations
• 8 screen sizes (QVGA, VGA, WVGA, square, ...)
• 60 major interacting software components
• 3 network technologies (CDMA, GSM, WiFi)
• Some distinct features for 7 major vendors
• 100 dependent 3rd-party apps for a complete "phone"

SLIDE 25

Defects Found over Project Life

[Chart: defects per week (Open, Resolved, Closed), weekly from 3/1/98 through 2/1/00.]

SLIDE 26

Defect Activity by Phase

[Chart: defects per week by phase (Coding, Unit Testing, Releasing) and by state (Open, Resolved, Closed), weekly from 3/1/98 through 2/1/00.]

SLIDE 27

[Diagram of the test workflow: feature is specified; test design is written; feature implemented; unit tests implemented; component testing; system test; specialized testing; regression tests; bug fix; test release document.]

SLIDE 28

Types of Tests
• Black box
• White box
• "Gray" box

Stage of Cycle
• Unit test / verification test
• Component test
• Acceptance test
• System test
• Performance test
• Stress test
• External testing (Alpha/Beta/"Dogfood")
• Regression testing

SLIDE 29

ProOnGo LLC – May 2009

Catch Bugs Early
• [Chart: cost of fix rises with time]

Guard the Process
• M0 → M1 → M2 → RTM

Test with the Customer in Mind

Make it Measurable
• Ship requirement

SLIDE 30


Questions to answer at each stage:

M0 (Specs & Test Plans):
• What are we building, and why?
• What metrics and criteria summarize customer demands?
• Can we reliably measure these metrics and criteria?

M1 .. Mn (Development & Test):
• What do our bug trends say about our progress?
• Based on current trends, when will we pass all criteria?
• How close are we to satisfying agreed-upon metrics/criteria?
• How risky is this last-minute code check-in?

RTM Milestone (Confirm, Ship):
• Do we pass all criteria? If not: what, why, how?
• Are the criteria passing stably, every time we test?

SLIDE 31

• What are we building, and why?
• What metrics and criteria summarize customer demands?
• Can we reliably measure these metrics and criteria?

Any problems with this?



// An API that draws a line from x to y
VOID LineTo(INT x, INT y);
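The slide's implicit answer is that this one-line spec is underspecified, so the test criteria cannot be written down reliably. A hypothetical sketch of the boundary-value cases and open questions a tester might derive from it (the 32-bit INT range is an assumption):

```python
# Hypothetical boundary-value cases a tester might derive from the
# one-line LineTo spec. INT limits assume a 32-bit signed coordinate.
INT_MIN, INT_MAX = -2**31, 2**31 - 1

test_cases = [
    ("origin", 0, 0),
    ("typical point", 100, 50),
    ("negative coordinates", -10, -10),        # clipped? error? spec is silent
    ("maximum coordinates", INT_MAX, INT_MAX),
    ("minimum coordinates", INT_MIN, INT_MIN),
]

# Questions the one-line spec leaves open -- the slide's
# "Any problems with this?":
open_questions = [
    "Where does the line start? (No current position is mentioned.)",
    "Is the endpoint (x, y) itself drawn or excluded?",
    "What is the return value or error behavior?",
    "How are out-of-range coordinates handled?",
]

for name, x, y in test_cases:
    print(f"{name}: LineTo({x}, {y})")
```

Until the spec answers these questions, no metric or criterion built on it can be measured reliably.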

SLIDE 32



Test passes, from least frequent with complete coverage to most frequent with shallow coverage:
• Manual Test Pass (least frequent, complete coverage)
• Automated Test Pass
• Build Verification Tests
• Canary (most frequent, shallow coverage)

SLIDE 33

• Fast tests that can automatically run at check-in time
  • Static code analysis (like lint)
  • Trial build, before the check-in is committed to SCM
  • Form-field tests:
    ▪ Check-in cites a bug number?
    ▪ Code-reviewer field filled out?
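The form-field tests can be sketched as a simple check-in gate; the metadata field names ("bug_number", "reviewer") are assumptions for illustration, not any particular SCM's schema:

```python
# Sketch of a check-in gate enforcing the form-field tests above.
# Field names ("bug_number", "reviewer") are hypothetical.

def checkin_allowed(metadata: dict) -> tuple[bool, list[str]]:
    """Return (ok, problems) for a proposed check-in's metadata."""
    problems = []
    if not metadata.get("bug_number"):
        problems.append("check-in does not cite a bug number")
    if not metadata.get("reviewer"):
        problems.append("code-reviewer field not filled out")
    return (not problems, problems)

ok, problems = checkin_allowed({"bug_number": 4711, "reviewer": "alice"})
print(ok, problems)
```

The same gate would typically run the lint pass and trial build first, rejecting the check-in before it ever reaches the SCM.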



SLIDE 34


• Goal: find bugs so heinous that they could…
  • Block the ability to dogfood
  • Derail a substantial portion of a test pass (5%?)
• Unwritten contract:
  • You break the build, you fix it within an hour
  • Day or night
  • A broken build holds up the productivity of the entire team


SLIDE 35


• Example on a Microsoft product:
  • Number of test cases: 6 digits
  • Number of test runs: 7 digits
  • 14 different target device flavors
• Runs 24/7, results available via the web
  • Automatic handling of device resets / failsafe
• Requires creativity:
  • How would you automate an image editor?
  • A 3D graphics engine?


SLIDE 36


• Cost of automating vs. cost of running manually
  • A rational/quantitative way to decide whether to automate
  • Reality: few organizations maximize the benefits of automation
  • Therefore, manual testing lives on
• Tough "automated vs. manual" decisions:
  • Testing for audio glitches (does the audio crackle?)
  • Does the UI feel responsive enough?
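The rational/quantitative decision reduces to a break-even calculation: automation pays off once its up-front cost is spread over enough runs. The hour figures below are hypothetical:

```python
# Break-even sketch for the automate-vs-manual decision. Costs in hours.

def breakeven_runs(automation_cost: float, manual_cost_per_run: float,
                   automated_cost_per_run: float = 0.0) -> float:
    """Number of runs after which automating becomes cheaper than manual."""
    saving_per_run = manual_cost_per_run - automated_cost_per_run
    if saving_per_run <= 0:
        return float("inf")  # automation never pays off
    return automation_cost / saving_per_run

# Hypothetical numbers: 40 hours to automate a test that takes
# half an hour to run manually.
runs = breakeven_runs(40, 0.5)
print(runs)  # 80.0 runs to break even
```

For a regression test run weekly over a multi-year product cycle, 80 runs is easily reached; for a subjective check like "does the audio crackle?", the automation cost may be effectively unbounded, which is why manual testing lives on.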


SLIDE 37

• Who found it
• When
• What, and its severity
• How to reproduce
• What part of the product
  • Create a
• Where fixed and by whom
• State
  • Open, Resolved, Closed
• Disposition
  • Fixed, Not Fixed, Postponed, "By Design"
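These fields map naturally onto a record type. A minimal sketch in Python, with field names and types assumed for illustration rather than taken from any real tracker's schema:

```python
from dataclasses import dataclass
from datetime import date

# Minimal sketch of a defect-tracking record with the fields listed
# above; names and types are assumptions, not a real tracker's schema.

@dataclass
class Bug:
    found_by: str
    found_on: date
    summary: str
    severity: int                 # e.g. 1 (crash) .. 4 (cosmetic)
    repro_steps: list[str]
    product_area: str
    state: str = "Open"           # Open, Resolved, Closed
    disposition: str = ""         # Fixed, Not Fixed, Postponed, "By Design"
    fixed_in: str = ""            # where fixed
    fixed_by: str = ""            # and by whom

bug = Bug("tester1", date(1998, 6, 22), "crash on save", 1,
          ["open file", "edit", "save"], "Store")
bug.state, bug.disposition = "Resolved", "Fixed"
print(bug.state, bug.disposition)
```

Note that state and disposition are separate: a bug can be Closed with a disposition of Postponed or "By Design" without any code change.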
SLIDE 38

• What must be true for a release to be done or complete
• Includes a mix of criteria:
  • All features implemented and reviewed
  • Documentation complete
  • All bugs closed (not necessarily fixed)
  • Performance criteria met
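Release criteria of this kind can be evaluated mechanically: ship only when every agreed criterion holds. A minimal sketch, with criterion names taken from the list above and the pass/fail values hypothetical:

```python
# Sketch of release-criteria evaluation: ship only when every agreed
# criterion passes. The pass/fail values here are hypothetical.

criteria = {
    "all features implemented and reviewed": True,
    "documentation complete": True,
    "all bugs closed (not necessarily fixed)": False,  # still triaging
    "performance criteria met": True,
}

def ready_to_ship(criteria: dict) -> tuple[bool, list[str]]:
    """Return (ok, failing_criteria) for a proposed release."""
    failing = [name for name, passed in criteria.items() if not passed]
    return (not failing, failing)

ok, failing = ready_to_ship(criteria)
print(ok, failing)
```

Making the criteria explicit and measurable up front is what allows the later triage process to be a negotiation about criteria rather than about opinions.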


SLIDE 39

• Late in the cycle, a process for determining what to fix
• Getting people together and prioritizing impact on release criteria and overall stability goals
• Even known crashing bugs are postponed, depending on criteria

SLIDE 40

• With software products, know what to build for the customer
• Have checkpoints for progress (milestones)
• Many types of testing and structure: the right tool for the job
• Determine and measure ship criteria