"Test Automation on Large Agile Projects: It's Not a Cakewalk"

SLIDE 1

AW12

Concurrent Session 11/7/2012 3:45 PM

"Test Automation on Large Agile Projects: It's Not a Cakewalk"

Presented by: Scott Schnier Agilex Technologies

Brought to you by:

340 Corporate Way, Suite 300, Orange Park, FL 32073 888‐268‐8770 ∙ 904‐278‐0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com

SLIDE 2

Scott Schnier, Agilex Technologies

Scott Schnier is currently a senior agile practice manager at Agilex Technologies, working with other courageous and passionate people to bring agile development practices to the federal government. He has held the positions of software engineer, director of development, mentor, architect, director of quality assurance, project manager, program manager, agile coach, ScrumMaster, and proxy product owner. Scott's varied work experience has taken him to small startups, Fortune 500 firms, and government contracting. A founding member of the Cincinnati chapter of the ALN, Scott is currently active in the Washington, DC chapter. Scott takes special pleasure in, and has a passion for, helping people work better together.

SLIDE 3

9/3/2012 1

Test Automation on Large Agile Projects,

It's Not a Cakewalk

Scott Schnier
Agilex Technologies
Scott.Schnier@Agilex.com

The Story

 Test Automation
  • Growth and division of work and responsibility
  • Support of Agile Values
  • Lessons learned and victories
 There are no "best practices"

SLIDE 4

It’s not a cakewalk

The Setting

 Core group of Agile/Scrum practitioners
 Many staff new to Agile/Scrum
 Customer willing to try something new, frustrated with past failures
 Government contract, multiple vendors
 Driven by legislation

SLIDE 5

Geography

Scrum Team Evolution

[Diagram: scrum teams 1-13 plus teams R, D, and S]

After 2 ½ years

SLIDE 6

Test Automation - Why?

Keep Development on Track

SLIDE 7

Capture Knowledge

Automate repetitive work

SLIDE 8

“Working software over comprehensive documentation”

How do you know it still works today?

Why Automate Tests?

 Move discovery of defects to the left (earlier)
 Respond to emergent requirements
 Capture intellectual property (test skill)
 Enable test engineers to focus on creative work, not repetitive testing tasks
 Make specifications executable and a trusted way to understand the impact of change

SLIDE 9

Test Vision

 The scrum team is the value creation engine
 Tests are best created in the scrum team
 Test is a skill, not a role
 Need to support test while making the scrum team primary

Issues on the Journey

 Test debt accumulation
 Specialized testing tools contribute to segmentation of responsibility
 People who do functional testing straddle more than one team

SLIDE 10

Managing Test Debt

 Organization
 Definition of "Done"

How do we get test debt?

How many points is that story? Oh… 8 points plus testing

SLIDE 11

Test Debt Avoidance

 The size of a story is more accurate with an integrated discussion of all the test and product work.

Regression Test Debt

 The story is complete, but 30% of the regression tests are broken.

SLIDE 12

Definition of “Done”

 To be complete, tests must be done and running green on the Continuous Integration pipeline.

 If we make an exception, then "test fixing" stories should be estimated and in the backlog so POs can agree to the exception.

Testing work straddling teams

SLIDE 13

Traditional SDLC Workstreams

A more Agile Organization

SLIDE 14

Where is testing done?

 In the scrum team
 Recall the definition of done
 What happens to the regression tests that accumulate?

Organization for Test Management

Partitioned rotating triage/leadership
Completely partitioned autonomous teams

 Scrum teams plus System Test Integration
 Dedicated Maintenance Team

[Diagram: end-of-sprint handoff from Scrum teams 1-9 to the STI and Maintenance teams]

SLIDE 15

Lessons

 Organize test sets to support scrum team affinity.
 With more than ~5 scrum teams, an integration or "DevOps" team is necessary and good.
 Listen for and stamp out opportunities for debt to accumulate.
 A Test Community of Practice is valuable and takes work to be effective.

Key Challenge

 Design organization and responsibility so that test debt does not accumulate.

SLIDE 16

Shared Responsibility

 Functional scrum team
  • Develops tests
  • Executes acceptance tests
  • Promotes to regression
  • Monitors selective regression projects
  • Performs impact analysis of new functions/fixes on regression

 System Test Integration
  • Maintains test framework/standards
  • Executes all regression tests
  • Creates and maintains reusable test components
  • Helps functional scrum teams with massive breakages

Development Workflow

1. Update personal workspace
2. Implement change (story/task, defect, test)
3. Test locally: unit, integration, smoke
4. Update personal workspace
5. Resolve conflicts
6. Commit change to repository & test pipeline
7. Monitor key test projects
8. Revert or fix any problems

SLIDE 17

A slice of life

Skype chat snippet:

[7/2/2012 12:23:28 PM] Dan: Trunk‐Dev is broken
[7/2/2012 12:23:45 PM] Dan: anyone working for a fix?
[7/2/2012 12:24:11 PM] Dan: …. ERROR :…. ….
[7/2/2012 12:27:39 PM] Steve: I'm working with Mike on resolving it
[7/2/2012 12:30:10 PM] Dan: Thanks

Continuous Integration status

SLIDE 18

Continuous Integration

 Release Build - Integration and Unit tests (1000's) (commit package)
 Rapid Smoke Test and Regression Lite (20 min) (commit package)
 Smoke on each functional test server (>= daily)
 Regression Lite on other browsers (>= daily)
 Regression Heavy (2 hours) (>= daily)
 All other automated regression tests, 1000's (daily)
 Semi-automated (100's) & manual tests (10's) (at least each delivery)
 Ad-hoc testing, 10's of hours (each delivery)
  • Human intuition, UI (CSS & other risks)

Ongoing challenge

  • As the system gets larger, individuals are less likely to feel responsible for, or capable of, fixing broken tests.
  • Start with one, two, then three functional scrum teams.
  • Months later, a team forms that specializes in a particular component of the system.
  • A few more months, and another component team arises….
  • It becomes harder to maintain the social norms of "Stop and Fix."
  • The technical challenges also increase as system complexity grows.
  • The need for, and the risk of, integration problems also grow at the same time.

SLIDE 19

Challenges of Scale

 People are socially more distant
 Technical skills become more focused
 Accountability becomes more elusive
 One mistake can impact more people, which makes actions more conservative and slows velocity

Wisdom

 When the number of teams exceeds 7 ± 2, you need a system-level team focused on test assets/regression
 A Test Community of Practice is essential
 Organize test sets with an affinity for teams or system components
 As the program gets larger, it needs a team with gentle authority to ensure consistency
 Ultimately, with > 10 teams you will need to consider multilevel integration
 Performance/stress testing is a separate team

SLIDE 20

Managing Tool Specialists

 Common tool platform for test developers and product developers
 Increase the pool of people who can create or fix a test

Why a "new" framework?

 Conejo Test Framework motivators
  • Eliminate barriers to "everyone is a tester"
  • Enable data driving for more resilient tests
  • Integrate multi-modal testing into one coherent framework: Web UI, component, web services, manual
  • Support the workflow from acceptance test development to regression testing to obsolescence
  • Integrate all test assets
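The slides do not show what "data driving" looks like inside Conejo. As a rough sketch of the general idea only (all class and method names here are hypothetical, not taken from the framework): one piece of test logic runs against a table of data rows, so a change in the application breaks one table instead of many hand-written tests.

```java
// Sketch of data-driven testing: one check, many data rows.
// OrderCase and checkTotal are illustrative names, not Conejo API.
import java.util.List;

public class DataDrivenSketch {

    // One row of test data: inputs plus the expected result.
    record OrderCase(int quantity, double unitPrice, double expectedTotal) {}

    // The single piece of test logic, reused for every row.
    static boolean checkTotal(OrderCase c) {
        double actual = c.quantity() * c.unitPrice();
        return Math.abs(actual - c.expectedTotal()) < 1e-9;
    }

    public static void main(String[] args) {
        List<OrderCase> cases = List.of(
                new OrderCase(1, 9.99, 9.99),
                new OrderCase(3, 2.50, 7.50),
                new OrderCase(0, 4.00, 0.00));

        int failures = 0;
        for (OrderCase c : cases) {
            if (!checkTotal(c)) {
                failures++;
                System.out.println("FAIL: " + c);
            }
        }
        System.out.println(failures == 0 ? "ALL PASS" : failures + " failures");
    }
}
```

Adding a new scenario is then a one-line data change, which lowers the barrier for "everyone is a tester."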

SLIDE 21

Test Design Goals

 Minimize collateral code; focus on the test target.
 Enable tests to respond quickly to changes in the Application Under Test.
 Easy to understand when it breaks (reuse common patterns, canonical test classes).

Test Types

                                 Unit               Integration        Functional
Scope                            Class              Component(s)       System
Persistence                      No                 Maybe              Yes
Author                           Self               Anyone             Not the author
Tests system interface           No                 No                 Yes
Traceable to epic/story/defect   No                 Maybe              Yes
Execution                        Pre-release build  Pre-release build  Post-release build*

SLIDE 22

Test Architecture

[Architecture diagram: Application Under Test · Test Framework · Interface Classes · Tests · Utility Classes · Se · JUnit]
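The "Interface Classes" layer in the architecture above can be sketched as a page-object-style class: tests call a page-level interface, so when the Application Under Test changes, only this one class changes, not every test. LoginPage, FakeBrowser, and the locators are illustrative names assumed for this sketch, not the presenter's actual classes; FakeBrowser stands in for a real driver so the sketch runs without a browser.

```java
// Sketch of an "interface class": tests express intent,
// element locators live in exactly one place.
public class LoginPage {

    // Stand-in for a real driver (e.g., a Selenium WebDriver).
    static class FakeBrowser {
        void type(String locator, String text) {
            System.out.println("type " + locator + " = " + text);
        }
        boolean click(String locator) {
            System.out.println("click " + locator);
            return true;
        }
    }

    private final FakeBrowser browser;

    LoginPage(FakeBrowser browser) { this.browser = browser; }

    // Fluent methods keep collateral code in tests to a minimum.
    LoginPage enterUser(String name)   { browser.type("#user", name); return this; }
    LoginPage enterPassword(String pw) { browser.type("#pw", pw);     return this; }
    boolean submit()                   { return browser.click("#login"); }

    public static void main(String[] args) {
        boolean ok = new LoginPage(new FakeBrowser())
                .enterUser("buyer01")
                .enterPassword("secret")
                .submit();
        System.out.println(ok ? "submitted" : "failed");
    }
}
```

If the login form's markup changes, only the three locator strings move; every test that logs in keeps working unchanged, which supports the "respond quickly to changes" goal above.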

Test Execution

@Test
@TestHeaderInfo(description = "Test Buyer Sequence", …
                functionalArea = {"Buyer", "Order"})
public void testBuyerSequence() {
    // Test code goes here....
}

@Test
@TestHeaderInfo(description = "Test Seller Sequence", …
                functionalArea = {"Seller"})
public void testSellerSequence() {
    // Test code goes here....
}

mvn test -Dconejo.filter.functionalAreas="Seller"
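The slides show the annotation and the mvn filter property but not the mechanism behind them. A minimal sketch of how such annotation-based filtering could work, assuming a simplified stand-in for the Conejo annotation (the real TestHeaderInfo definition is not shown in the deck): reflect over the test methods, read the functionalArea attribute, and invoke only the matches.

```java
// Sketch of annotation-driven test selection, mimicking
// -Dconejo.filter.functionalAreas="Seller". TestHeaderInfo here is a
// simplified stand-in, not the real Conejo annotation.
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;
import java.util.Arrays;

public class AreaFilterSketch {

    @Retention(RetentionPolicy.RUNTIME)
    @interface TestHeaderInfo { String[] functionalArea(); }

    @TestHeaderInfo(functionalArea = {"Buyer", "Order"})
    public void testBuyerSequence() { System.out.println("ran buyer test"); }

    @TestHeaderInfo(functionalArea = {"Seller"})
    public void testSellerSequence() { System.out.println("ran seller test"); }

    public static void main(String[] args) throws Exception {
        // The requested functional area would come from a system property
        // in a real runner; default to "Seller" for this sketch.
        String wanted = args.length > 0 ? args[0] : "Seller";
        AreaFilterSketch suite = new AreaFilterSketch();
        for (Method m : AreaFilterSketch.class.getDeclaredMethods()) {
            TestHeaderInfo info = m.getAnnotation(TestHeaderInfo.class);
            if (info != null && Arrays.asList(info.functionalArea()).contains(wanted)) {
                m.invoke(suite);   // run only the matching tests
            }
        }
    }
}
```

With the default filter only the seller test runs; the buyer test is skipped because "Seller" is not in its functionalArea list.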

SLIDE 23

Product and “Anti-Product”

[Diagram: Product and Test as mirror images, each with its own DBA, Operations, and Architecture]

It's not a cakewalk

SLIDE 24

Wrap Up

 Multiple motivations
 Test is the Product anti-matter
 Needs to be approached as a first-class component of the solution
 Complex organizational and technical concerns
 Part of the secret sauce of a successful Agile effort

Questions? – Ideas!
