
Presentation E6 - Thursday, March 8, 2001, 11:30 AM
How to Evaluate and Select a High-End Load Testing Tool
Marquis Harding, Reality Test
International Conference On Software Test Automation, March 5-8, 2001


SLIDE 1

International Conference On Software Test Automation, March 5-8, 2001, San Jose, CA, USA. Presentation: Thursday, March 8, 2001, 11:30 AM

HOW TO EVALUATE AND SELECT A HIGH-END LOAD TESTING TOOL

Marquis Harding

Reality Test


SLIDE 2

A Methodology for Evaluating Alternative Load Testing Tools

Marquis Harding

Reality Test

SLIDE 3

The Selection Problem

Tool selection is a difficult choice:

  • Many alternatives
  • Costly
  • Long evaluation period
  • No standard evaluation method
  • No standard evaluation criteria

SLIDE 4

Agenda

Tool evaluation methodology:

  • The experiment
  • The results
  • Technical environment
  • Technical skill set

Customer evaluation:

  • Environment details
  • Evaluation methodology
  • Results

SLIDE 5

What Is the Objective?

Predict, diagnose, and correct problems in the system under test (SUT) before deployment.

[Chart: response time vs. number of users (10-50), marking the points of unacceptable performance and incorrect behavior]

SLIDE 6

[Charts: response time vs. number of users (10-50) for the current SUT performance and the reconfigured performance]

What Tool Characteristics Matter?

  • Must scale on production-equivalent hardware
  • Must accurately represent the real workload
  • Must be maintainable and repeatable when SUT changes are tested
  • Must be cost effective
SLIDE 7

Tool Evaluation - The Experiment

Tool evaluation is an experiment. You need to:

  • Gather information
  • Identify materials
  • Identify methodology
  • Identify metrics
  • Execute
  • Analyze

The experiment must be repeatable:

  • Refresh database
  • Refresh logs
  • Reset

SLIDE 8

Information Gathering

  • Vendor web sites
  • Vendor literature packs
  • Local user groups
  • Internet resources
  • Customer references

SLIDE 9

Identify Materials

Materials required:

  • Tool
  • Target system
  • Refresh mechanism
  • Monitoring tools
  • Analysis tools
  • Time

And most importantly:

  • Technical support
  • Management support

SLIDE 10

Determine Methodology

Determine functions to test:

  • 1 to 3 (or more) representative scenarios
  • Start with a read-only scenario, then add insert and update scenarios
  • Vary complexity
  • Create input data (see the sketch below)
  • Consider security
  • You can't test everything
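Since creating input data usually amounts to generating a parameter file that each virtual user reads during playback, a small script is often enough. A minimal sketch, assuming a hypothetical CSV layout of customer IDs and search terms (the column names, value pool, and row count are illustrative, not from the presentation):

    import csv
    import random

    # Generate a parameter data file for virtual users to read during playback.
    # Column names, values, and the 300-row count are hypothetical examples.
    random.seed(42)  # fixed seed so repeated runs use identical input data

    with open("vu_input_data.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["customer_id", "search_term"])
        for i in range(300):  # one row per planned virtual user
            writer.writerow([f"CUST{i:05d}",
                             random.choice(["billing", "orders", "support"])])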

SLIDE 11

Determine Metrics

Quantitative metrics (see the sampling sketch below):

  • Memory usage
  • CPU usage

Qualitative metrics:

  • Ease of use
  • Recording process
  • Scripting
  • Reporting
  • Protocol support
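For the quantitative metrics, the load tool or Perfmon normally does the capturing, but a small cross-check sampler is easy to script. A sketch using the psutil package (a modern convenience, not something available in the original 2001 environment; the output file name and sampling window are assumptions):

    import csv
    import time
    import psutil

    # Sample available memory and overall CPU utilization during a test run.
    DURATION_SECONDS = 600   # length of the measurement window
    SAMPLE_SECONDS = 5       # interval between samples

    with open("resource_samples.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "available_bytes", "cpu_percent"])
        start = time.time()
        while time.time() - start < DURATION_SECONDS:
            cpu = psutil.cpu_percent(interval=SAMPLE_SECONDS)  # blocks for one interval
            writer.writerow([round(time.time() - start),
                             psutil.virtual_memory().available,
                             cpu])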

SLIDE 12

Executing the Test

Some things to consider:

  • Network load - day vs. night
  • System load
  • Load imposed by the measurement tools themselves

The test must be repeatable (see the refresh sketch below):

  • Refresh database
  • Refresh logs
  • Reset
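Scripting the reset keeps runs comparable. A minimal sketch of a between-run refresh, assuming a SQL Server database restored from a staged backup via sqlcmd (the database name, backup path, and log directory are hypothetical):

    import shutil
    import subprocess
    from pathlib import Path

    DB_NAME = "ProjectX"                       # hypothetical database name
    BACKUP = r"D:\staged\ProjectX_clean.bak"   # staged backup taken before any test run
    LOG_DIR = Path(r"D:\results\logs")         # tool and server log directory

    def refresh_database() -> None:
        # Restore the staged backup so every run starts from identical data.
        sql = f"RESTORE DATABASE [{DB_NAME}] FROM DISK = N'{BACKUP}' WITH REPLACE"
        subprocess.run(["sqlcmd", "-S", "localhost", "-Q", sql], check=True)

    def archive_logs(run_id: str) -> None:
        # Keep every result file: move logs into a per-run folder instead of deleting them.
        dest = LOG_DIR.parent / f"run_{run_id}"
        dest.mkdir(parents=True, exist_ok=True)
        for log in LOG_DIR.glob("*.log"):
            shutil.move(str(log), str(dest / log.name))

    if __name__ == "__main__":
        refresh_database()
        archive_logs("baseline_50vu")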

SLIDE 13

Analyze Results

Validate the run:

  • Invalid return results
  • Dropped connections

Examine timing data:

  • Tool data
  • External reporting data

SLIDE 14

Technical Environment

  • Ample supply of driver machines
  • As much hard drive storage space as possible: keep staged database backups/dump files and all result files
  • Ample memory: budget 3 MB per VU (see the worked example below)
  • Double your worst-case time estimate: every error, omission, and oversight costs one hour, server response times slow with additional users, and user log-on time grows exponentially
  • Playback must generally run off hours
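As a worked example of the 3 MB-per-VU budget, a quick capacity check against the driver machines listed later in the deck (the OS overhead allowance is an assumption, not a figure from the presentation):

    # Rough driver-machine capacity check using the 3 MB-per-VU budget.
    MB_PER_VU = 3
    OS_OVERHEAD_MB = 64  # assumed headroom for the OS and the tool's agent

    drivers = {  # machine name -> installed RAM in MB (from the execution hardware slide)
        "Gateway P166":      128,
        "Gateway PII 233":   256,
        "Dell PII 450 (#1)": 512,
        "Dell PII 450 (#2)": 512,
        "Dell PII 450 (#3)": 512,
    }

    total = 0
    for name, ram_mb in drivers.items():
        capacity = (ram_mb - OS_OVERHEAD_MB) // MB_PER_VU
        total += capacity
        print(f"{name}: ~{capacity} virtual users")

    print(f"Fleet total: ~{total} virtual users")  # comfortably above the 300-VU test target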

SLIDE 15

Technical Skill Set

Tool implementation (PerformanceStudio) requires knowledge of:

  • System under test architecture
  • Business processes
  • Tool knowledge
  • Networking
  • Database management
  • Windows NT
  • HTTP
  • SQL
  • Project management
  • Statistics
  • Unix

SLIDE 16

Customer Evaluation

After information gathering, the decision came down to evaluating performance testing tools on a real production system!

  • Good management support
  • Fair technical support
  • Other measurement aids: Windows NT - MS Perfmon; SQL Server - SQL Trace; WinDiff

SLIDE 17

The Experiment

  • Project X: a SQL Server-driven application that tracks user maintenance for a customer
  • Evaluate performance testing tools on a real production system
  • All tools were current shipping versions
  • Qualitative and quantitative analysis

SLIDE 18

Project Time

Phase          Total time elapsed   Active time spent on project
Preparation    2 months             2 weeks
Execution      6 days               5 days
Analysis       5 days               5 days

SLIDE 19

Technical Environment

Recording Environment

  • Application: customer service
  • Client: Gateway Pentium 200, Windows NT Server
  • Server: SQL Server 6.5 on a Dell Pentium II 450, 512 MB RAM
  • Tools: current shipping versions

SLIDE 20

The Recording Process

For a fair evaluation, scripts had to be IDENTICAL

  • Three scenarios identified: two focused on specific areas of concern, one a complex business process
  • Complex business process scenario dropped: it proved redundant (the first two yielded sufficient coverage), and the script was complex enough that the additional effort would not have been justified

SLIDE 21

The Recording Process Cont.

  • Recorded the original scripts with one tool
  • Used each tool's own recording mechanism to capture the same traffic: played back one instance of the original script and captured the resulting transactions
  • Both scripts were edited for data correlation
  • Tool output and SQL Trace output were analyzed with WinDiff to ensure they were exactly the same (see the comparison sketch below)
SLIDE 22

The Execution Process

  • Executed four tests
  • User load: 1, 50, 150, and 300 virtual users
  • Scheduling difficulties
  • Tool 1 scheduling features available: random events, complex log-on patterns, user profiling
  • 5 days to execute
  • Generally off hours
SLIDE 23

Execution Hardware

Driver machines:

  • Gateway Pentium 166 MHz, 128 MB
  • Gateway Pentium II 233 MHz, 256 MB
  • Dell Pentium II 450 MHz, 512 MB
  • Dell Pentium II 450 MHz, 512 MB
  • Dell Pentium II 450 MHz, 512 MB

Controller:

  • Gateway Pentium II 233 MHz, 256 MB

SLIDE 24

Analysis - Quantitative Results

Used NT Performance Monitor:

  • Memory metric: Available Bytes
  • Processor metric: % Processor Time

Used SQL Trace to analyze the database:

  • Verify that all tools performed the same work (see the footprint sketch below)
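The per-VU footprint figures on the next slides can be derived from these counters. A sketch of the arithmetic, assuming the Perfmon counter log has been exported to CSV with one column per counter and that the first sample is a pre-run baseline (column and file names are hypothetical):

    import csv

    # Derive average per-VU memory footprint and average CPU from a Perfmon CSV export.
    VIRTUAL_USERS = 60  # number of virtual users running on this driver machine

    with open("perfmon_driver1.csv") as f:
        rows = list(csv.DictReader(f))

    available = [float(r["Available Bytes"]) for r in rows]
    cpu = [float(r["% Processor Time"]) for r in rows]

    baseline = available[0]        # first sample, captured before the VUs start
    steady_state = min(available)  # lowest available memory during the run
    footprint_mb = (baseline - steady_state) / VIRTUAL_USERS / (1024 * 1024)

    print(f"Average footprint: {footprint_mb:.2f} MB/VU")
    print(f"Average processor utilization: {sum(cpu) / len(cpu):.1f}%")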

SLIDE 25

Tool 1 Processor & Memory Stats

Dell Pentium II 450 MHz, 512 MB, 60 virtual users, Script 1

  • Average footprint: 1.60 MB/VU
  • Average processor utilization: see chart

[Chart: memory and processor utilization over the run]

SLIDE 26

Tool 1 Processor & Memory Stats

Gateway Pentium 166 MHz, 128 MB, 60 virtual users, Script 2

  • Average footprint: 0.52 MB/VU
  • Average processor utilization: see chart

[Chart: memory and processor utilization over the run]

SLIDE 27

Tool 1 and SQL Server Statistics

Dell Pentium II 450 MHz, 512 MB

[Chart: SQL Server statistics during user log-on]

SLIDE 28

Different Log-on Emulation

  • Tool 2 may not accurately emulate connections for the SUT
  • Tool 1 emulates connections as they were recorded

SLIDE 29

Surprising Differences!

Tool 1 found:

  • Database locking - verified as a problem by real user testing
  • Accurate connection modeling
  • Accurate pacing
SLIDE 30

Analysis - Qualitative Results

Ease of Use

Script length using Tool 1:

  • Script 1: 2,715 lines
  • Script 2: 2,032 lines
  • Average: 2,374 lines

Script development time:

  • Tool 1: 2 days per script
  • Note: knowledge gained by scripting in other tools saved scripting time
SLIDE 31

Analysis - Qualitative Results

Features worth mentioning

  • DataSmart recording
  • Script splitting
  • Timing of individual commands
  • Complex scheduling
  • Server error handling
  • Shared memory - ability to pass information between virtual users
  • Network recording
  • Accurate script pacing
  • Accurate connection emulation
  • On-line monitoring
  • Detailed reporting
  • Support mechanism

SLIDE 32

Lessons Learned

Tool choice matters! Performance testing works!

  • Revealed application architecture deficiencies
  • Found deadlocks
  • Found redundant database code
  • Determined optimization points

Be prepared:

  • Time estimates
  • Double your hard drive space
  • Off-hours availability

SLIDE 33

Marquis Harding

Marquis Harding has over twenty-five years of Information Technologies and Software Quality Assurance experience. His background includes development and QA of large and mid-range mainframe, client/server, and Internet systems, as well as senior management of QA and testing for large companies spanning the financial, telecommunications, and software industries. Mark has presented at international conferences on software development and testing. Marquis is a disabled Vietnam veteran. While at Microsoft Corporation, he held the positions of Group Quality Assurance/Test Manager for Windows.com, Windows Update.com, and Microsoft.com, and Test Manager for IT Sales & Marketing/Product Support Services. In six years at Charles Schwab & Co., Inc., he held the positions of Senior Test Manager, ITG, as well as Development Manager for Schwab's Financial Advisor Services division. Prior to this was a seventeen-year career at Pacific Telesis, where he was employed as Manager of Information Technology Support for the CFO and Executive Vice President of Operations.