Dealing with Data in Performance Testing - Bogdan Veres, March 2018 - PowerPoint Presentation




SLIDE 1

SLIDE 2

SLIDE 3

Dealing with Data in Performance Testing

Bogdan Veres March 2018

SLIDE 4

Hi, I’m Bogdan

Software Tester @ Softvision since 2010

  • Performance & Automation Testing
  • QA Community Lead
  • Mountain Biker
  • Computer Hardware Enthusiast
  • Traveler

SLIDE 5

Performance tests - why are they necessary?

  • To make sure that the application runs quickly, is stable, and can scale
  • To prevent users from being affected by poor performance
  • To find out how many resources are required to handle the expected load
SLIDE 6

What happens when the app is not performing?

Famous performance issues:

  • Bieber bug - too popular to handle
  • Google fail - caused by Michael Jackson’s death
  • Diablo 3 - authentication server fail on release
SLIDE 7

Dan Downing’s 5 Steps of Load Testing

1. Discover

  • a. Define use case workflows
  • b. Model production workload

2. Develop

  • a. Develop test scripts
  • b. Configure environment monitors

3. Analyze

  • a. Run tests
  • b. Monitor system resources
  • c. Analyze results

4. Fix

  • a. Diagnose
  • b. Fix
  • c. Re-test

5. Report

  • a. Interpret results
  • b. Make recommendations
  • c. Present to stakeholders
SLIDE 8

Generating the Load
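The load itself is usually generated by a dedicated tool (e.g. JMeter, mentioned in the resources), but the core idea - many virtual users issuing requests concurrently while latencies and errors are recorded - can be sketched in a few lines of Python. The `send` callable is a hypothetical stand-in for whatever request your tool would fire:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_load(send, users=10, requests_per_user=5):
    """Issue `send` calls from `users` concurrent workers and record
    a (status, latency_seconds) pair for every request."""
    def worker(_):
        out = []
        for _ in range(requests_per_user):
            start = time.perf_counter()
            try:
                send()
                out.append(("ok", time.perf_counter() - start))
            except Exception:
                out.append(("error", time.perf_counter() - start))
        return out

    results = []
    with ThreadPoolExecutor(max_workers=users) as pool:
        for per_user in pool.map(worker, range(users)):
            results.extend(per_user)
    return results

# `send` would normally be a real HTTP call; a short sleep stands in here.
samples = run_load(lambda: time.sleep(0.001), users=4, requests_per_user=3)
print(len(samples))  # 12 requests issued in total
```

Real load generators add pacing, ramp-up, and workload modeling on top of this loop, which is exactly what the "Model production workload" step above feeds into.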

SLIDE 9

Performance KPIs
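Typical KPIs are averages, percentiles, and maxima over the recorded latencies. A minimal sketch (the sample latencies are made up for illustration) shows why percentiles matter: the average hides the outlier that the slowest users actually experience.

```python
import math

def percentile(samples, p):
    """Nearest-rank percentile: smallest value covering at least p% of samples."""
    ordered = sorted(samples)
    k = max(0, math.ceil(p / 100 * len(ordered)) - 1)
    return ordered[k]

# Hypothetical response times (ms) collected during one test run.
latencies_ms = [95, 98, 99, 101, 105, 120, 130, 250, 310, 1200]

kpis = {
    "avg_ms": sum(latencies_ms) / len(latencies_ms),  # 250.8 - skewed by the outlier
    "p50_ms": percentile(latencies_ms, 50),           # 105 - the typical user
    "p95_ms": percentile(latencies_ms, 95),           # 1200 - the tail users feel
    "max_ms": max(latencies_ms),
}
print(kpis)
```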

SLIDE 10

Define required metrics for reporting

SLIDE 11

Execution Time vs Cycle

SLIDE 12

Metrics

  • System (CPU, Memory, Processes)
  • Network throughput
  • Disk IO
  • # of requests
  • # of messages in queues
  • # of connections
  • # of errors
  • # of DB connections
  • Detailed memory stats
  • Response size
  • Apache/IIS/NGINX/Tomcat
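Several of these metrics - request counts, error counts, response size - fall out of simple aggregation over access-log entries. A small sketch, using hypothetical parsed log tuples rather than any particular server's log format:

```python
from collections import Counter

# Hypothetical parsed access-log entries: (endpoint, status, response_bytes).
entries = [
    ("/login", 200, 512), ("/login", 200, 498), ("/search", 500, 87),
    ("/search", 200, 2048), ("/login", 503, 61),
]

requests_total = len(entries)
errors_by_status = Counter(status for _, status, _ in entries if status >= 500)
error_rate = sum(errors_by_status.values()) / requests_total
avg_response_bytes = sum(size for _, _, size in entries) / requests_total

print(requests_total, dict(errors_by_status), error_rate, avg_response_bytes)
```

System-level metrics (CPU, memory, disk IO, connections) come from the monitoring agents listed on the tools slide rather than from the application itself.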
SLIDE 13

Getting from this…

SLIDE 14

...to this

SLIDE 15

Tools

  • OS (CPU, Memory, Net, Netstat, Disk, Disk IO, Swap, Processes): perfmon (Windows); ps, prstat, dstat (Linux)
  • Apache (connections, requests, errors): Apache Monitor (Windows/Linux)
  • NGINX (connections, requests, errors): Telegraf with the NGINX plugin (Windows/Linux)
  • IIS (connections, requests, errors): Perfmon (Windows)
  • MSSQL (queries, heap memory, deadlocks): Perfmon (Windows); Telegraf (Windows/Linux)
  • Oracle (queries, heap memory, deadlocks): Telegraf (Windows/Linux); Oracle DB Monitor (Windows/Linux)
  • Network (TCP, HTTP traffic): Wireshark (Windows/Linux); Telegraf (Windows/Linux)

SLIDE 16

Setup Environment

Use Docker for:

  • Automation – Dockerfile for setup
  • DevOps – infrastructure as code
  • Scale – Docker Compose
  • Maintenance – entire isolated runtime environment
SLIDE 17

Docker environment

SLIDE 18

TICK Stack

  • Telegraf - agent for collecting and reporting metrics
  • InfluxDB - time series database for real-time analytics (SQL-like query language)
  • Chronograf - administrative user interface and visualization
  • Kapacitor - data processing engine
SLIDE 19

TICK scripting language: Influx Query Language (InfluxQL)

  • SQL-like query language
  • A timestamp identifies a single point in any given data series
  • InfluxDB isn’t CRUD - it is optimized for writing and querying time series rather than updating or deleting individual rows
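Agents like Telegraf write points to InfluxDB in its line protocol: measurement, comma-separated tags, fields, and a timestamp. A minimal formatter (the `cpu`/`web01` names are illustrative) shows the shape, and why a timestamp identifies a single point:

```python
def to_line_protocol(measurement, tags, fields, ts_ns):
    """Format one point in InfluxDB line protocol:
    measurement,<tag_set> <field_set> <timestamp>"""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {ts_ns}"

line = to_line_protocol("cpu", {"host": "web01"}, {"usage_idle": 92.5},
                        1520000000000000000)
print(line)  # cpu,host=web01 usage_idle=92.5 1520000000000000000

# A point is identified by measurement + tag set + timestamp, so writing the
# same combination again overwrites the field values instead of inserting a
# duplicate row - one sense in which "InfluxDB isn't CRUD".
# A matching InfluxQL read might look like:
#   SELECT MEAN("usage_idle") FROM "cpu" WHERE time > now() - 1h GROUP BY time(5m)
```

Note this sketch skips the escaping rules the real protocol requires for spaces, commas, and string field values.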
SLIDE 20

Interpreting, Objectives & Recommendations

  • Observations
  • Correlations
  • Hypotheses
  • Conclusions
  • Compare graphs
  • Results from current build vs. previous build
  • Create conclusions – tie them back to test objectives
  • Review solution
  • Quantify the benefit, cost and effort
  • Final outcome is management’s judgement
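The "current build vs. previous build" comparison can be mechanized: compute the relative change per KPI and flag anything that regressed past a threshold. A minimal sketch, assuming all KPIs are lower-is-better (latency, error rate) and the 10% tolerance is an arbitrary example:

```python
def compare_builds(current, previous, tolerance=0.10):
    """Flag lower-is-better KPIs that regressed more than `tolerance`
    relative to the previous build's value."""
    regressions = {}
    for kpi, value in current.items():
        base = previous.get(kpi)
        if base and (value - base) / base > tolerance:
            regressions[kpi] = round((value - base) / base, 2)
    return regressions

previous = {"p95_ms": 240, "error_rate": 0.01}
current = {"p95_ms": 300, "error_rate": 0.01}
print(compare_builds(current, previous))  # {'p95_ms': 0.25}
```

The flagged deltas are observations; tying them back to test objectives and quantifying cost and effort remains the analyst's job.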
SLIDE 21

Reporting

  • Summary - 3 pages max
  • Add test details (# of users, hardware configuration, build version, scenario)
  • Key graphs in order of importance
  • Annotate graphs
  • Draw conclusions
  • Make recommendations
  • Create a section for errors (add details for each error)
  • Present your report - no one is going to read it otherwise
SLIDE 22

Resources

  • https://www.soasta.com/blog/
  • https://docs.influxdata.com/
  • http://www.perftestplus.com/resources.htm
  • http://focus.forsythe.com/articles/335/The-4-Hats-of-Application-Performance-Testing
  • https://testingpodcast.com/tag/dan-downing/
  • https://www.blazemeter.com/jmeter-blog-posts
SLIDE 23