
T18 Performance Testing
Thursday, October 3rd, 2019, 1:30 PM

Continuous Performance Testing: A Tale of Two Worlds

Presented by: Kaushal Dalvi, Ultimate Software


SLIDE 1

T18
Performance Testing
Thursday, October 3rd, 2019, 1:30 PM

Continuous Performance Testing: A Tale of Two Worlds

Presented by:
Kaushal Dalvi
Ultimate Software

Brought to you by:

888-268-8770 · 904-278-0524 · info@techwell.com · http://www.starwest.techwell.com/

SLIDE 2

Kaushal Dalvi

Since he got his first computer at age thirteen, Kaushal Dalvi has been interested in systems and software performance. He spent days researching the performance characteristics of different motherboards, CPUs, GPUs, RAM, and disks to configure and overclock them in order to squeeze the maximum frames per second out of the games he played. Kaushal built and maintained websites for local businesses, where he started learning about performance and reliability. He developed a taste for it, and now, ten years later, he continues working in the same field as the director of quality and performance engineering at Ultimate Software, leading teams that deal with workload modeling, test scripting and execution, bottleneck analysis, debugging, and much more.

SLIDE 3

9/29/19 1

  • KAUSHAL DALVI

CONTINUOUS PERFORMANCE TESTING: A TALE OF TWO WORLDS

WHAT IS PSR

The latency and utilization characteristics of systems under different types of load.

PERFORMANCE

The ability of the system to maintain performance characteristics under growing load, given more resources.

SCALABILITY

The ability of the system to maintain or restore its performance characteristics when subjected to failures of components.

RESILIENCY

P S R

A POSSIBLY BETTER NAME FOR THE UMBRELLA TERM THAN JUST "PERFORMANCE"

SLIDE 4

TYPES OF PSR TESTS

THE DIFFERENT TYPES OF TESTS UNDER THE PERFORMANCE UMBRELLA

A steady load based on regular peaks observed or expected in production, applied for a short duration (usually 1-3 hours) to understand the performance characteristics of the application in normal conditions.

LOAD

Prolonging the workload from load test to longer periods (usually 8-72 hours) to understand the behavior of the application over time.

ENDURANCE

Increasing the regular workloads by a factor of 2X, 3X and so on to understand the characteristics of the system under increased demand.

STRESS

Gradually increasing the volume of the data or in general the work objects that the system needs to process under normal workloads to understand the impact of growth on the performance characteristics of the system.

VOLUME

Any kind of load test run with the objective of understanding whether the system is capable of maintaining performance characteristics when subjected to increasing workloads while being provided additional resources.

SCALABILITY

Any kind of performance test run with the objective of understanding whether the system is capable of maintaining or restoring performance characteristics when subjected to failures of components.

RESILIENCY

IMPORTANCE OF TARGETS

With great power comes great responsibility

The extraordinary ability to waste everyone’s time!

Running the wrong test can lead to false overconfidence. Or it can lead to chasing demons that do not exist in the real world.

SLIDE 5

TARGETS

Knowing the terminology

NFR SLA SLO SLI

NON FUNCTIONAL REQUIREMENT

A requirement that specifies criteria that can be used to judge the operation of a system, rather than specific behaviors.

SERVICE LEVEL AGREEMENT

An official commitment that prevails between a service provider and a client.

SERVICE LEVEL OBJECTIVE

An internally agreed upon objective that the team is committed to meet.

SERVICE LEVEL INDICATOR

An indicator that shows the current level being achieved by the systems under observation.

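The SLI/SLO relationship above can be made concrete in code: the SLI is a measured value, and the SLO is the internal target it is checked against. A minimal Python sketch; the 500 ms threshold and the sample latencies are illustrative, not from the talk:

```python
# Minimal SLI/SLO sketch: an SLI is a measurement, an SLO is the
# target we hold it to. Threshold and data are illustrative.
import math

def percentile(samples, p):
    """Nearest-rank percentile of a list of latency samples (ms)."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

def sli_p95_latency(samples):
    """SLI: the p95 latency currently being achieved."""
    return percentile(samples, 95)

def meets_slo(samples, slo_ms=500):
    """SLO: 95% of requests complete within slo_ms."""
    return sli_p95_latency(samples) <= slo_ms

latencies = [120, 180, 200, 250, 300, 320, 410, 450, 480, 900]
print(sli_p95_latency(latencies))  # 900 for this sample set
print(meets_slo(latencies))
```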
SLIDE 6

WORKLOAD MODELLING

Characterization of the work performed by the system

Designing the right performance test to answer relevant questions about PSR characteristics

Workload modeling identifies one or more workload profiles to be simulated against the tested application. The workload model then attempts to approximate real life usage scenario and includes different user types and characteristics.

https://blog.smartbear.com/software-quality/workload-modeling-and-profiles-for-load-testing/
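A workload model like the one described can be expressed directly as data: user types, their relative shares, and how much work each performs. A small Python sketch; every number in the profile is invented for illustration:

```python
# A workload model as data: user types with relative population
# shares and per-hour action rates. All numbers are invented.
import random

WORKLOAD_PROFILE = {
    # user_type: (share of total users, actions per user per hour)
    "clerk":   (0.60, 40),
    "manager": (0.30, 15),
    "admin":   (0.10, 5),
}

def hourly_actions(total_users):
    """Expected actions per hour across the modeled population."""
    return sum(total_users * share * rate
               for share, rate in WORKLOAD_PROFILE.values())

def pick_user_type(rng=random):
    """Sample a simulated user's type according to the profile."""
    types = list(WORKLOAD_PROFILE)
    weights = [WORKLOAD_PROFILE[t][0] for t in types]
    return rng.choices(types, weights=weights, k=1)[0]

print(hourly_actions(1000))  # 24000 + 4500 + 500 actions/hour
```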

PURPOSE OF PERFORMANCE TESTS

Not all performance tests are created equal

Also known as optimization/tuning testing, to find the best solution to a design/architecture problem.

A/B TESTS

To determine the upper limits of the system in terms of load, data, and other factors.

CAPACITY

Market facing tests to show how well the software can perform at realistic loads

BENCHMARKING

To check whether the performance of the software has been degraded by the changes to the source code

REGRESSION
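The regression purpose above lends itself to automation: compare a new run's latency percentiles against a stored baseline and flag the build when degradation exceeds a tolerance. A hedged Python sketch; the 10% tolerance and the sample data are illustrative:

```python
# Performance regression check: compare a current run against a
# baseline and flag degradations beyond a tolerance. Numbers are
# illustrative, not from any real run.
def p95(samples):
    """Rough p95 of a list of latency samples (ms)."""
    ordered = sorted(samples)
    return ordered[max(0, int(round(0.95 * len(ordered))) - 1)]

def has_regressed(baseline, current, tolerance=0.10):
    """True if current p95 exceeds baseline p95 by more than tolerance."""
    return p95(current) > p95(baseline) * (1 + tolerance)

baseline_ms = [100, 110, 120, 130, 140]
current_ms  = [105, 115, 160, 170, 180]
print(has_regressed(baseline_ms, current_ms))  # True: p95 rose well past 10%
```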

SLIDE 7

MICROSERVICES vs MONOLITHS

HIGH LEVEL PSR RELEVANT DIFFERENCES

???

MICROSERVICE: SMALL SERVICES · INDEPENDENTLY DEPLOYABLE · MULTI STACK · NETWORK COMMUNICATION · SAGAS

MONOLITH: BIG BALL OF MUD · COMBINED DEPLOYMENT · SINGLE STACK · INTERNAL COMMUNICATION · TRANSACTIONS

EVERY SERVICE HAS ITS OWN ENVIRONMENT LEADING TO PROBLEMS AROUND CONFIGURATION AND SCHEDULING AMONG OTHER THINGS

TEST ENVIRONMENTS

EVERY SERVICE ALSO HAS ITS OWN DATABASE, MAKING TEST DATA MANAGEMENT A NIGHTMARE

TEST DATA

DIFFICULT TO COME UP WITH COMPLEX COMBINED TARGETS FOR PSR REQUIREMENTS

TARGETS

DIMINISHING RETURNS OBSERVED IF NOT EXPLICITLY DESIGNED FOR

AMDAHL'S LAW

EASIER TO KNOW THE CAPACITY OF EACH INDIVIDUAL COMPONENT, OVERALL SYSTEM, NOT SO MUCH

CAPACITY

PSR TEST CHALLENGES - MICROSERVICES

AS OPPOSED TO TRADITIONAL MONOLITHS
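The Amdahl's Law point above can be quantified: if only a fraction p of the work benefits from added parallelism, the speedup from n-way scaling is capped at 1/(1-p), which is exactly the diminishing-returns effect a microservice system hits when serial coordination is not explicitly designed away. A quick Python illustration:

```python
# Amdahl's Law: speedup from n workers when only a fraction p of
# the work can be parallelized. Speedup is capped at 1 / (1 - p).
def amdahl_speedup(p, n):
    """p: parallelizable fraction (0..1), n: number of workers."""
    return 1.0 / ((1.0 - p) + p / n)

# With 90% of the work parallelizable, 16 workers give well under 16x:
print(round(amdahl_speedup(0.90, 16), 2))  # 6.4
```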

SLIDE 8

SUPPORT

PAID FOR SUPPORT CAN BE CRITICAL AT CRUNCH TIME

FIX YOUR OWN ISSUES

HAVING ACCESS TO THE SOURCE CODE MEANS CUSTOM FEATURES CAN BE ADDED

READ: FREE!!!

OPEN SOURCE TOOLS OFTEN REMOVE THE FIRST BARRIER TO ENTRY, THE COST

CAPABILITY

HISTORICALLY, COMMERCIAL TOOLS HAVE HAD MORE BELLS AND WHISTLES

CONTROL

ABILITY TO DEPLOY ON PREM PROVIDES MORE CONTROL OVER DATA

OPEN SOURCE vs PROPRIETARY

HIGH LEVEL PSR RELEVANT DIFFERENCES

THE WEB EXPERIENCE

END USER PERSPECTIVE

SLIDE 9

  • BROWSER
  • OS
  • ROUTER
  • ISP
  • REQUEST SENT OVER TCP

DNS CACHE CHECKS & CONNECTION

THE WEB REQUEST LIFECYCLE

THE COMPONENTS THAT MAKE UP END-USER PERCEIVED RESPONSE TIMES

  • APP SHELL HTML
  • BASIC HTML

SERVER RENDERS PAGE

SLIDE 10

DOM TREE
CSSOM TREE
RENDER TREE

BUILDING THE TREES

How the browser begins the process of showing the web page

RENDER BLOCKS
DOWNLOAD ADDITIONAL RESOURCES

CSS, JS, WOFF files may block the parse and/or render trees from being generated

SLIDE 11

AJAX REQUESTS

Data is increasingly being populated after the app shell has bootstrapped

VISUALLY COMPLETE AND INTERACTIVE

Page is fully loaded and is usable.

SLIDE 12

FRONTEND PERFORMANCE METRICS

Each of these have an impact on the perception of performance by the user.

FIRST PAINT

When the app shows some sign of life/activity upon submitting a request.

FIRST CONTENTFUL PAINT

When the canvas of the app has taken the shape and structure of the final form.

FIRST MEANINGFUL PAINT

When the user can start to get meaningful cues from the app.

HERO ELEMENT

When the most important part of the page that the user is interested in has loaded.

TIME TO INTERACTIVE (FIRST)

When most of the elements on the page respond to input from the user.

CONSISTENTLY INTERACTIVE

When the app is no longer issuing network requests (except websockets)

100 VS 100

ARE 100 MOBILE USERS EQUIVALENT TO 100 DESKTOP USERS?

MOBILE vs DESKTOP

HIGH LEVEL PSR RELEVANT DIFFERENCES

RESOURCES: MOBILE USERS TEND TO USE MORE RESOURCES ON THE BACKEND
LICENSES: MOBILE EMULATION LICENSES HAVE BEEN MORE EXPENSIVE
EXPECTATIONS: MOBILE USERS TEND TO HAVE SHORTER ATTENTION SPANS

SLIDE 13

E2E: The ease of creating an end-to-end scenario. WINNER - UI
COMPLEXITY: Complexity in getting the script working. WINNER - API
TOOLS: Availability of tools that are up to the task. WINNER - API
REALISM: How closely the script workload reflects real life. WINNER - UI
BRITTLENESS: How reliable the scripts are when nothing changes. WINNER - API
LONGEVITY: How often the scripts need to be updated. WINNER - API

UI vs API

HIGH LEVEL PSR RELEVANT DIFFERENCES

THE GRADES

THE ABSOLUTE BEST

The benchmark that can be held by only one service of its kind at a time.

EXCEEDS EXPECTATIONS

Noticeable as performant by both external and internal consumers.

MEETS EXPECTATIONS

Meets performance expectations from internal and external consumers.

NEEDS IMPROVEMENT

Noticeable as non-performant by external consumers. Keeps internal consumers from meeting their targets.

UNSATISFACTORY

Causes external customers to be irked, prohibits internal consumers from meeting their targets.

BROKEN

Unusable by internal or external consumers.

SLIDE 14

NO SILVER BULLETS

Don’t be obsessed with a tool. Replacing a tool to fill a minor gap starts a never-ending cycle.

NOT A JOB

Monitoring is a skill, not a job, and it is not suited for someone who doesn’t know what it is they are monitoring.

NOT A CHECKBOX

Monitoring for the sake of it almost never helps; it generates noise and causes more confusion than clarity.

AUTOMATION

Deploying monitoring should never have to be a manual task; manual deployment almost guarantees that it will eventually be ignored.

USER PERSPECTIVE

When starting to monitor, start from the perspective of the user and then expand outwards.

DON’T BUILD

Building a new monitoring system from scratch is almost never a good idea unless you are a tech giant that needs a custom solution.

MONITORING TIPS

DOS AND DONTS

ENSURING PERFORMANCE

PERFORMANCE AS A FEATURE

Team treats SLOs as features to strive for and designs for them.

LAB PERFORMANCE TESTS

Providing input and confidence before shipping to production.

NON FUNCTIONAL REQUIREMENTS

At a product/feature/story level.

FEEDBACK

Continuously feed back into the system to keep improving it.

REAL WORLD MONITORING

Validating expectations in production and reinforcing base data and assumptions.

THE IDEAL SETUP

SLIDE 15

OWN PERFORMANCE ON THE DOMAIN

GIVE PERFORMANCE A SEAT AT THE TABLE
REMOVE IMPEDIMENTS

DIR/MGR

Supporting the Performance Peeps

THE PLAYERS (ON THE TEAM)

  • TEAM PERFORMANCE PEEP

Coordinates and drives performance testing activities in the domain

  • PRODUCT OWNER

Owns performance on the team, facilitates such that the TPP can execute their duties

  • TEAM MANAGERS/LEADS

Work with the TPP and their teams to incorporate performance testing activities

  • BUSINESS/PRODUCT ANALYSTS

Work with the TPP and the teams to ensure performance requirements and objectives are captured and honored

  • DEVELOPERS

Work with BA/PAs and TPPs to design and implement systems that meet performance requirements

  • TEST ENGINEERS

Work with Devs, BA/PAs and TPPs to ensure performance requirements are validated

SLIDE 16

FUN

TEAM BUILDING ACTIVITIES ARE IMPORTANT!

GOALS

CREATING A STRUCTURE WHERE SHARED GOALS ARE POSSIBLE

WORKLOAD

ALLOW FOR SCALABILITY WITHIN THE PSR RESOURCES

PRESENCE

HAVING PRESENCE SPREAD IN THE ORG ALONG WITH A CORE TEAM

KNOWLEDGE

SHARING TEST PLANS AND RESULTS HELPS COMPOUND GROWTH

PERFORMANCE GUILD

MEETINGS

DAILY – 5 minutes

DAILY HUDDLE

A quick daily huddle of all of the performance engineers.

  • To bring up significant developments

  • To bring up impediments
  • To bring up resource needs
SLIDE 17

PSR PROJECTS

Tools and frameworks to automate creation and execution of PSR tests

  • PSR TRIGGER
  • TC AGENTS BLOCKING
  • PERF CENTRE WITH ANY CI
  • DOCKER BASED LOAD GENERATORS

SLIDE 18

JSIM

AUTOMATED PSR TEST ASSET GENERATOR

RESCRIPTING – NO MORE

A SYSTEM THAT AUTOMATICALLY RECORDS AND COMPLETES LOADRUNNER SCRIPTS COMPLETE WITH PARAMETERIZATION, CORRELATION AND LOGIC

AUTO REGRESSION

SYSTEM GENERATES TEST SCRIPTS AUTOMATICALLY AND RUNS TESTS

SELF SERVE

WORKING TOWARDS ADDING SELF SERVICE CAPABILITIES

CI FRIENDLY

REST ENDPOINTS ALLOW CI SYSTEMS TO INITIATE TEST SCRIPT GENERATION

FUTURE

POSSIBILITY OF USING ML FOR PARAMETERIZATION AND CORRELATION
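A CI integration like the one described usually amounts to a small HTTP call from the build pipeline. A hypothetical Python sketch using only the standard library; the URL, endpoint path, and payload fields are invented for illustration and do not describe JSIM's actual API:

```python
# Hypothetical CI hook: ask a script-generation service to build a
# PSR test script for the service under test. The URL and payload
# fields are invented; the real JSIM API may differ.
import json
import urllib.request

def build_trigger_request(base_url, service, build_id):
    """Construct the POST request a CI job would send to kick off
    automated test-script generation."""
    payload = json.dumps({"service": service, "build": build_id}).encode()
    return urllib.request.Request(
        url=f"{base_url}/api/v1/generate-script",   # hypothetical path
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_trigger_request("http://psr.example.internal", "payroll-svc", "1234")
print(req.full_url)
print(req.get_method())  # POST
```

The request is only constructed here, not sent; a real pipeline step would pass it to `urllib.request.urlopen` and poll for results.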

SLIDE 19

EITHER TO MEET REGULATORY COMPLIANCE OR FOR CERTIFICATIONS OR BOTH.

COMPLIANCE

TO UNDERSTAND WHAT MEASURES YOU NEED TO TAKE FOR THE FUTURE.

CAPACITY PLANNING

TO NOT HAVE TO CROSS YOUR FINGERS AND HOPE FOR PERFORMANCE EVERY TIME YOU RELEASE TO PRODUCTION.

RELEASE CONFIDENCE

FIGHT FOR PSR IN YOUR ORG

THIS IS WHAT YOU GET

TO ACHIEVE BUSINESS AND FINANCIAL OBJECTIVES THROUGH BETTER PERFORMANCE

RETENTION/CONVERSION

INVEST IN YOUR PEEPS, CREATE BANDWIDTH FOR THEM TO TAKE ON PSR ACTIVITIES, FUND THEIR TRAINING, HIRE DEDICATED ENGINEERS

PEEPS

STRIVE TO GET BUY IN AT ALL DIFFERENT LEVELS IN THE ORG, FROM THE PEOPLE WITH THEIR HANDS ON THE KEYBOARD TO SENIOR LEADERSHIP

ORG WIDE BUY IN

START SMALL, BUT STRIVE TO HAVE PROCESSES IN EVERY PART OF THE ORG SO THAT PSR VALIDATION IS A PART OF THE DEFINITION OF DONE

TOOLS AND PROCESSES

SLIDE 20

ULTIMATE SOFTWARE CAREERS Our mission is to deliver unified, end-to-end HCM cloud solutions—everything from HR, to payroll, to benefits, to time & attendance, to recruitment, to talent management—to improve the personal work experience for you and your people — the power behind your business.

https://www.ultimatesoftware.com/careers

CONTACT

THANK YOU

ULTIMATE SOFTWARE IS HIRING PERFORMANCE AND QUALITY ENGINEERS

@kaushald

  • www.kaushaldalvi.net
  • kaushal_dalvi@ultimatesoftware.com

  • github.com/kaushald/

THANK YOU

  • KAUSHAL DALVI