Automated Testing - Elena Laskavaia - March 2016 (PowerPoint PPT Presentation)



SLIDE 1

Winning the Battle against Automated Testing

Elena Laskavaia March 2016

SLIDE 2

Quality

People, Process, Tools: the Foundation of Quality
SLIDE 3

Development vs Testing

  • Developers don’t test
  • Testers don’t develop
  • Testers don’t have to be skilled
  • Separate Developers and Testers
  • Make the Test team responsible for quality

SLIDE 4

One Team

Quality is a team responsibility

SLIDE 5

The Process

When quality is bad, let's add more steps to the process

SLIDE 6

Story about the broken Trunk

  • Thousands of developers
  • Continuous stability is a must
  • “Trunk is broken” too often
  • Huge show stopper for R&D
  • People did root-cause analysis
  • Came up with an “Improved Process”

SLIDE 7

“Improved” Pre-Commit Process

  • Pull/Update All Source
  • Clean compile All
  • Re-build and re-deploy whole system
  • Manually execute sanity test cases
  • Repeat for all hardware variants

SLIDE 8

Trunk is still broken. Why?

Process was not followed:

  • Process is too complex
  • Process is too boring
  • Process is too time consuming
  • Environment / Hardware limitations
  • Developers don’t know about the process
  • Developers are lazy

SLIDE 9

Automated Pre-Commit Testing

SLIDE 10

Pre-Commit Tests with Source Management System

Push -> Checks -> Fix -> master

  • Peer reviews
  • Robot checks
SLIDE 11

Automation Hack

Let's slap on some automation!

  • Randomly pick a tool
  • Spend 6 months developing a testing framework
  • Need a person to run it for every build
  • Oops, our tester quit, who knows how to run it?
  • It does not work at all now!
  • Oh well, we don’t have any more budget and time, let's go back to manual testing

SLIDE 12

Continuous Testing

Continuous Quality

SLIDE 13

Cost of Automation

  • Cost of Tools
  • User Training
  • Integration and Customization
  • Writing Test Cases
  • Executing Test Cases
  • Maintaining Test Cases
SLIDE 14

Jump Start

  • Make one team responsible
  • Set up continuous integration
  • Add pre-commit hooks
  • Establish a simple self-verifying process
  • Add one automated test

SLIDE 15

Key Principles of successful automated testing
SLIDE 16

Gate Keeper

The test system must guard the gate.

SLIDE 17

100% Success

100% of tests must pass. Zero tolerance.

SLIDE 18

NO random failures

  • Remove such tests from automation
  • Use repeaters to keep intermittent tests
  • Be prepared for the noise
  • Modify AUT to remove the source of randomness for tests
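One way to remove a source of randomness from the AUT (application under test) is to let tests inject a seeded random generator instead of letting the application create its own. A minimal plain-Java sketch; the class and method names here are hypothetical illustrations, not from the deck:

```java
import java.util.Random;

// Hypothetical AUT class: it accepts an injected Random instead of
// creating one internally. Production code passes new Random();
// a test passes a seeded instance, making every run reproducible.
class ShuffledPlaylist {
    private final Random random;

    ShuffledPlaylist(Random random) {
        this.random = random;
    }

    int pickTrack(int trackCount) {
        return random.nextInt(trackCount);
    }

    // Demo: two instances built with the same seed pick the same tracks.
    static boolean sameSequence(long seed, int picks) {
        ShuffledPlaylist a = new ShuffledPlaylist(new Random(seed));
        ShuffledPlaylist b = new ShuffledPlaylist(new Random(seed));
        for (int i = 0; i < picks; i++) {
            if (a.pickTrack(10) != b.pickTrack(10)) {
                return false;
            }
        }
        return true;
    }
}
```

With a fixed seed the test becomes deterministic, so it can stay in the automated suite without generating noise.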

SLIDE 19

Fully Automated

  • No monkeys pushing buttons to start the testing
  • No monkeys watching automated UI testing
  • Hooks on code-submission (pre-commit, fast)
  • Hooks on build promotion (overnight)

SLIDE 20

Fast and Furious

  • Feedback for pre-commit <= 10 min
  • Overnight is the absolute maximum
  • More tests degrade the system response time
  • Not all tests are born equal!
  • Use tagging and filtering
  • Distribute or run in parallel
  • No sleeps
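"No sleeps" means replacing fixed Thread.sleep() pauses with condition polling under a timeout: the test continues the moment the system is ready instead of always waiting the worst case. A plain-Java sketch of such a helper; `Wait.until` is an illustrative name, not part of JUnit or the deck's toolchain:

```java
import java.util.concurrent.TimeUnit;
import java.util.function.BooleanSupplier;

// Poll a condition at a short interval with an overall timeout,
// instead of sleeping a fixed (and usually pessimistic) duration.
class Wait {
    static boolean until(BooleanSupplier condition, long timeoutMs) {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            if (condition.getAsBoolean()) {
                return true; // ready: no need to wait any longer
            }
            try {
                TimeUnit.MILLISECONDS.sleep(50); // short poll interval
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                break;
            }
        }
        return condition.getAsBoolean(); // final check before giving up
    }
}
```

A fast machine pays almost nothing; a slow machine still passes, so the same test stays both fast and stable.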
SLIDE 21

Timeouts

  • Make sure tests are not hanging!
  • Use timeouts
  • Use external monitors to kill hanging runs
  • Do not overestimate timeouts
SLIDE 22

Test Scripts are Programs

  • Automated test cases are programs
  • Treat them as source code
  • They must be in text form
  • They must go into the same version control system
  • Subject to code inspection, coding standards, build checks, etc.

SLIDE 23

Unit Tests

Part of the process:

  • Mandatory with commit
  • Use servers to run

Easy to write:

  • Use a mocking framework
  • Use a UI bot
  • Use test generators
  • Inline data sources

SLIDE 24

Unit -> Integration

  • Unit tests cannot cover all
  • Test the actual installed AUT
  • Run the program as the user would
  • Use the same language for unit and integration testing

SLIDE 25

Pick and Choose

You should not automate everything. Candidates:

  • Difficult to set-up cases
  • Rare but important scenarios
  • Check lists
  • Module is actively developed
  • Long maintenance expected

SLIDE 26

Self-Verification: test the test system

Automatically check:

  • Code submission is properly formatted (has bug id, etc.)
  • Code submission has unit tests
  • Total number of tests is increased
  • Performance has not declined
  • Code coverage has not declined
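The first check on the list, commit-message formatting, is easy to automate. A hypothetical sketch in plain Java; the "Bug &lt;number&gt;" format is an assumption for illustration, since real projects configure their own pattern:

```java
import java.util.regex.Pattern;

// Sketch of an automatic submission check: verify that the commit
// message references a bug id such as "Bug 258745". A pre-commit
// hook or CI job would reject submissions where this returns false.
class CommitChecks {
    private static final Pattern BUG_ID = Pattern.compile("\\bBug\\s+\\d+\\b");

    static boolean hasBugId(String commitMessage) {
        return BUG_ID.matcher(commitMessage).find();
    }
}
```

The point of such checks is that the process verifies itself: nobody has to remember the rule, because the gatekeeper enforces it.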

SLIDE 27

Failed Battles

SLIDE 28

Tools we used or evaluated and failed

  • WinRunner: after 3 months of writing tests, realized that it won’t work on Linux
  • WindowTester: was pretty good until it was bought, and it stopped launching with new Eclipse
  • Jubula: 4 years ago: database, no text for tests, no integration
  • Squish: slow, not debuggable, blocks on support. Python.
  • RCPTT: domain specific language

SLIDE 29

Working Solution

SLIDE 30

Continuous Integration Tools

  • Unit testing: JUnit
  • Source Control and Code Review: Git/Gerrit
  • Static Analysis: FindBugs
  • Build System: Maven/Tycho
    • maven-surefire-plugin (for unit tests)
    • maven-failsafe-plugin (for integration tests)
    • findbugs-plugin for static analysis
  • Continuous Integration Server: Jenkins
    • Gerrit Trigger plugin - pre-commit builds and voting
    • FindBugs plugin - reports and status
    • JUnit plugin - reports and status
  • GUI testing: SWTBot
  • JUnit mocking: Mockito
  • Code Coverage: EclEmma
  • Custom: lots of custom libraries, frameworks and bots

SLIDE 31

Tips and Tricks

SLIDE 32

Auto-Bots

Checks that can be added to every test:

  • App crashed during a test
  • Test timeout exceeded
  • App generated unexpected log
  • Temp files were not cleaned up
  • Resource or memory leaks
  • Runtime error detection

SLIDE 33

AutoBots: JUnit Rules

public class SomeTest {
    // Tests that we don’t leave a tmp file behind
    // (this is a custom rule, not base JUnit)
    @Rule
    public TmpDirectoryCheck tmpRule = new TmpDirectoryCheck();

    @Test
    public void testSomething() {
    }
}

// Base class with timeouts
public abstract class TestBase {
    public @Rule Timeout globalTimeout = Timeout.seconds(1); // 1 second
}
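TmpDirectoryCheck above is the deck's custom rule, so its internals aren't shown. A plain-Java sketch of what such a check might do (snapshot the temp directory before the test, report leftovers after), assuming the real rule works roughly this way:

```java
import java.io.File;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Sketch of the before/after logic a rule like TmpDirectoryCheck could
// wrap around each test: record the temp directory's contents first,
// then report any files the test left behind.
class TmpDirectorySnapshot {
    private final File dir;
    private Set<String> before;

    TmpDirectorySnapshot(File dir) {
        this.dir = dir;
    }

    void takeSnapshot() {
        before = new HashSet<>(Arrays.asList(dir.list()));
    }

    // Names of files created since the snapshot was taken.
    Set<String> leftovers() {
        Set<String> after = new HashSet<>(Arrays.asList(dir.list()));
        after.removeAll(before);
        return after;
    }

    // Self-contained demo: a "test" that leaks a file gets caught.
    static boolean demo() {
        try {
            File dir = java.nio.file.Files.createTempDirectory("tmpcheck").toFile();
            TmpDirectorySnapshot check = new TmpDirectorySnapshot(dir);
            check.takeSnapshot();
            new File(dir, "leak.txt").createNewFile(); // the leaking test body
            return check.leftovers().contains("leak.txt");
        } catch (java.io.IOException e) {
            return false;
        }
    }
}
```

Wrapped in a JUnit @Rule, this runs around every test in the class for free, which is the point of the AutoBots idea.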

SLIDE 34

Jenkins Split Verifiers

  • Regression testing Linux
  • Regression testing Windows
  • Static Analysis

+1 verify

To speed up verification for pre-commit hooks, set up multiple jobs which trigger on the same event (e.g. patch submitted).

SLIDE 35

Inline Data Sources: Comments in Java

// template<typename T2>
// struct B : public ns::A<T2> {};
// void test() {
//     B<int>::a;
// }
public void testInstanceInheritance_258745() {
    getBindingFromFirstIdentifier("a", ICPPField.class);
}

SLIDE 36

Code Coverage

  • Run tests with code coverage
  • Not during pre-commit check
  • Unless it has validation hooks
  • Good tool for unit test design (IDE)
  • Never ask for 100% coverage

Code Coverage -> Select Tests: based on the changed code, exclude tests that do not cover the changes.

SLIDE 37

Static Analysis

  • Can be run independently
  • Has to be a gatekeeper
  • Spend time to tune it (remove all noisy checkers)
  • Push to desktop (if running as you type - instantaneous feedback!)
  • Use alternative UX on desktop (e.g. code formatter)

Jenkins Plugin: Code Reviewer - post defects from static analysis as reviewer comments on the patch

SLIDE 38

Tagging and Filtering: JUnit Categories

// Tag the class with categories in the test class
@Category({PrecommitRegression.class, FastTests.class})
public class SomeClassTest {
    @Test
    public void someTest() {
    }
}

// In maven pom.xml
<build>
  <plugins>
    <plugin>
      <artifactId>maven-surefire-plugin</artifactId>
      <configuration>
        <groups>com.example.PrecommitRegression</groups>
      </configuration>
    </plugin>
  </plugins>
</build>
SLIDE 39

Runtime Filtering: JUnit Assume

// Skip the test entirely if not running in OSGi
@Before
public void setUp() {
    Assume.assumeTrue(Activator.isOsgiRunning());
}

SLIDE 40

Intermittent Test: JUnit Rule

public class SomeClassTest {
    public @Rule IntermittentRule irule = new IntermittentRule();

    // Repeat this up to 3 times if it is failing
    @Intermittent(repetition = 3)
    @Test
    public void someTest() {
    }
}

You can create a runner or define a rule which repeats a test if it fails. JUnit itself does not define either; you have to add it yourself (2 classes, 62 lines of code).
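The retry logic at the core of such a rule fits in a few lines. A plain-Java sketch, assuming the real IntermittentRule wraps something like this in a JUnit TestRule for methods tagged @Intermittent:

```java
// Run the test body; if it throws, try again up to `repetitions`
// times, and fail only when every attempt has failed.
class Retry {
    static void run(Runnable testBody, int repetitions) {
        AssertionError last = null;
        for (int attempt = 0; attempt < repetitions; attempt++) {
            try {
                testBody.run();
                return; // one passing attempt is enough
            } catch (AssertionError e) {
                last = e; // remember the failure, then retry
            }
        }
        throw last; // every attempt failed (repetitions assumed > 0)
    }
}
```

Use this sparingly: per the earlier slide, repeaters are a way to keep a known-intermittent test in the suite, not a license to tolerate flaky tests.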

SLIDE 41

The End

One Team. Simple Process. Right Tools.