Performance Testing at the Edge Alois Reitbauer, dynaTrace Software - - PowerPoint PPT Presentation



SLIDE 1

Performance Testing at the Edge

Alois Reitbauer, dynaTrace Software

SLIDE 2

10,500,000,000

3,000,000,000

SLIDE 3

The Classical Approach

SLIDE 4

Waterfalls are pretty

SLIDE 5

But might get scary

SLIDE 6

The dynaTrace Approach

SLIDE 7

Many platforms
Different usage scenarios
High number of configurations
No easy way to patch software

SLIDE 8

Our Architecture (diagram)

APPLICATION (Web Server, Java Server, .NET Server, Database)
→ DYNATRACE COLLECTOR (optional)
→ WAN
→ DYNATRACE COLLECTOR (optional)
→ DYNATRACE SERVER
→ DYNATRACE CLIENT

SLIDE 9

Lessons learned

SLIDE 10

Profiling was not enough

  • Good for finding problems, but comparing results is hard

  • Only valid until the next check-in

  • Too much work

SLIDE 11

The Life of a Log Statement

Enter the code

SLIDE 12

The Life of a Log Statement

Somebody changes something

SLIDE 13

The Life of a Log Statement

Your code gets deprecated

SLIDE 14

Methodology

SLIDE 15

Defining our strategy

Start early
Test continuously
Break into pieces

SLIDE 16

Frequency vs. Granularity

JUnit-based Tests (2× daily)
Total System Tests
Long-running Stability Tests (2-week duration)
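The JUnit-based tier can be sketched as a plain timing assertion. Everything here (the operation under test, the 500 ms threshold, the five-run median) is illustrative, not the actual dynaTrace suite:

```java
// Sketch of a JUnit-style performance check. The operation and the
// threshold are hypothetical stand-ins for a real test subject.
public class ResponseTimePerfTest {

    // Hypothetical operation under test.
    static void operationUnderTest() {
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) sum += i;
        if (sum < 0) throw new IllegalStateException();
    }

    // Takes the median of several runs to dampen measurement noise;
    // returns milliseconds.
    static long medianMillis(int runs) {
        long[] samples = new long[runs];
        for (int r = 0; r < runs; r++) {
            long start = System.nanoTime();
            operationUnderTest();
            samples[r] = (System.nanoTime() - start) / 1_000_000;
        }
        java.util.Arrays.sort(samples);
        return samples[runs / 2];
    }

    public static void main(String[] args) {
        long median = medianMillis(5);
        // The threshold is illustrative only.
        if (median > 500) throw new AssertionError("regression: " + median + " ms");
        System.out.println("median=" + median + "ms OK");
    }
}
```

Such a check is cheap enough to run twice a day; the slower total-system and stability tiers trade frequency for granularity.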

SLIDE 17

Granularity

Comparability Complexity Quality

SLIDE 18

Avoid Re-Runs

  • What could happen?
  • Which information do you want?
  • What describes your system?
  • What is different from the last run?
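The questions above can be answered without a re-run if every run records its own context. A minimal sketch; the metadata keys and the diff helper are illustrative, not dynaTrace APIs:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: record enough context with every test run that "what is
// different from the last run?" can be answered from the archive.
public class RunMetadata {

    // Captures a few properties that describe the system under test.
    static Map<String, String> capture() {
        Map<String, String> m = new LinkedHashMap<>();
        m.put("java.version", System.getProperty("java.version"));
        m.put("os.name", System.getProperty("os.name"));
        m.put("os.arch", System.getProperty("os.arch"));
        m.put("heap.max.mb",
              String.valueOf(Runtime.getRuntime().maxMemory() / (1024 * 1024)));
        m.put("timestamp", java.time.Instant.now().toString());
        return m;
    }

    // Diffs against a previous run's metadata: the "what changed" report.
    static Map<String, String> diff(Map<String, String> prev, Map<String, String> cur) {
        Map<String, String> changed = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : cur.entrySet()) {
            String old = prev.get(e.getKey());
            if (old == null || !old.equals(e.getValue())) {
                changed.put(e.getKey(), old + " -> " + e.getValue());
            }
        }
        return changed;
    }

    public static void main(String[] args) {
        System.out.println(capture());
    }
}
```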

SLIDE 19

Aim high …
… test 50% more

SLIDE 20

Create Instability

"… adding some volatility increases the likelihood of discovering problems …"

SLIDE 21

"Last Mile Testing"

SLIDE 22

Measurements

SLIDE 23

Stability of Tests

SLIDE 24

Use Dedicated Hardware

Comparability Stability Efficiency

SLIDE 25

Trends in Unstable Tests
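One way to read a trend out of unstable tests is to compare moving averages instead of single runs. A minimal sketch; the window size and tolerance are assumptions, not values from the talk:

```java
// Sketch: when individual measurements are too noisy to compare
// run-to-run, a moving average over the last N runs exposes the trend.
public class TrendDetector {

    // Simple moving average over the trailing window of the series.
    static double movingAverage(double[] series, int window) {
        int start = Math.max(0, series.length - window);
        double sum = 0;
        for (int i = start; i < series.length; i++) sum += series[i];
        return sum / (series.length - start);
    }

    // Flags a regression only when the recent average exceeds the
    // baseline average by a tolerance, so single outliers are ignored.
    static boolean regressed(double[] baseline, double[] recent,
                             int window, double tolerance) {
        return movingAverage(recent, window)
                > movingAverage(baseline, window) * (1 + tolerance);
    }
}
```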

SLIDE 26

Testing scalability

Small Dump Operations vs. Big Dump Operations

SLIDE 27

Understand your measurements

Response Time only vs. Response Time and GC
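Separating response time from GC time can be done with the JVM's standard `GarbageCollectorMXBean`; the workload below is a hypothetical allocation loop, not dynaTrace code:

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

// Sketch: report GC time alongside response time, so a slow run caused
// by garbage collection is not mistaken for a code-level regression.
public class GcAwareTimer {

    // Sums cumulative collection time (ms) across all collectors.
    static long totalGcMillis() {
        long total = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            long t = gc.getCollectionTime(); // -1 if unsupported
            if (t > 0) total += t;
        }
        return total;
    }

    public static void main(String[] args) {
        long gcBefore = totalGcMillis();
        long start = System.nanoTime();
        // Hypothetical workload that allocates enough to trigger GC.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 200_000; i++) sb.append(i);
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        long gcMs = totalGcMillis() - gcBefore;
        System.out.println("response=" + elapsedMs + "ms gc=" + gcMs
                + "ms net=" + (elapsedMs - gcMs) + "ms");
    }
}
```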

SLIDE 28

Be Specific on what to test

Throughput
Response Time
Memory Consumption
Other KPIs …

SLIDE 29

Beyond Response Time

KPI Chart: Server Throughput Over Time

SLIDE 30

Motivate your team

SLIDE 31

How to make developers write tests

#1 Heroism
#2 Boomerang
#3 The other guy
#4 Bug me not
#5 Feedback
#6 Code vs. Wine
#7 Newb vs. Noob

SLIDE 32

Test Case Complexity

1. First: start dynaTrace infrastructure
2. When ready: start n WebSphere instances on servers …
3. When ready: start the load test against the WebSphere servers
4. After the load test starts: execute the test case
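The ordered "when ready" setup above can be expressed as a tiny step runner. This is a sketch, not the actual dynaTrace tooling, and the step bodies are stubs:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the ordered setup as a minimal step runner: each step is
// assumed to block until its part of the environment is ready.
public class TestSetup {

    interface Step { void run(); }

    private final List<String> names = new ArrayList<>();
    private final List<Step> steps = new ArrayList<>();

    TestSetup then(String name, Step step) {
        names.add(name);
        steps.add(step);
        return this;
    }

    // Runs the steps in order and returns the execution log.
    List<String> execute() {
        List<String> log = new ArrayList<>();
        for (int i = 0; i < steps.size(); i++) {
            steps.get(i).run();
            log.add(names.get(i));
        }
        return log;
    }

    public static void main(String[] args) {
        List<String> log = new TestSetup()
            .then("start dynaTrace infrastructure", () -> {})
            .then("start WebSphere instances", () -> {})
            .then("start load test", () -> {})
            .then("execute test case", () -> {})
            .execute();
        System.out.println(log);
    }
}
```

Hiding the ordering behind a small builder like this is one way to make a complex setup readable to the developers who have to maintain it.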

SLIDE 33

Making complex things easy

SLIDE 34

Finding the responsible code

Version Control History Lookup

SLIDE 35

Always available

Continuous Integration Reports

SLIDE 36

E-Mail Notification

SLIDE 37

Mail: alois.reitbauer@dynatrace.com
Blog: blog.dynatrace.com
Twitter: AloisReitbauer

SLIDE 38

Charts: Performance over Time against a Threshold, across the Development, Testing, and Production phases, comparing Traditional Performance Management with Continuous Performance Management.