Visualization of Performance Anomalies with Kieker
Bachelor’s Thesis Sören Henning September 8, 2016
Sören Henning Visualization of Performance Anomalies September 8, 2016 1 / 41
Outline
1. Introduction
2. Foundations
3. Approach
4. Evaluation
5. Conclusion and Future Work
Introduction
◮ e.g., Amazon: 100 ms delay → 1% decrease in sales (Huang 2011)
◮ e.g., Google: 500 ms delay → 20% drop in traffic (Huang 2011)
◮ Detect performance problems as soon as possible
◮ Use monitoring (e.g., measure execution times)
◮ Investigate these measurements for anomalies
◮ G4.1. Feasibility evaluation
◮ G4.2. Scalability evaluation
Foundations
◮ Performance: time behavior and resource efficiency
◮ Time series: sequence of measurements at regular temporal intervals
◮ Anomaly: abnormal data patterns
◮ Detection: compare measured values with a reference model
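The detection idea above can be sketched in a few lines: compare each measurement against what a reference model predicts and flag large deviations. This is a minimal illustration, not the thesis' implementation; the model, threshold, and names are hypothetical.

```python
def detect_anomalies(measurements, reference_model, threshold):
    """Flag measurements whose deviation from the reference model's
    prediction exceeds a threshold (illustrative only)."""
    anomalies = []
    for t, measured in enumerate(measurements):
        expected = reference_model(t)
        if abs(measured - expected) > threshold:
            anomalies.append((t, measured, expected))
    return anomalies

# Constant reference model of 100 ms; values far from it are flagged.
flagged = detect_anomalies([101, 98, 250, 103], lambda t: 100, threshold=50)
```

In practice the reference model is not a constant but a forecast derived from past measurements, as described in the Approach section.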
[Figure: raw measurements (1 5 8 6 4 3 6 7 3 3 1) normalized into an equidistant time series (1 1 4 3 5 3 7 4 3 1 4)]
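Normalization can be sketched as bucketing raw, irregularly timed measurements into fixed-width windows and aggregating each bucket, yielding a regular time series. A minimal sketch under assumptions: the aggregate is the mean and empty windows are filled with 0.0; the thesis' Normalizer may aggregate and fill gaps differently.

```python
def normalize(raw, window):
    """Bucket (timestamp, value) measurements into equidistant windows of
    length `window` and aggregate each bucket by its mean (illustrative)."""
    buckets = {}
    for timestamp, value in raw:
        buckets.setdefault(timestamp // window, []).append(value)
    end = max(buckets)
    # Empty windows are filled with 0.0 -- an assumption for this sketch.
    return [sum(buckets[i]) / len(buckets[i]) if i in buckets else 0.0
            for i in range(end + 1)]

# Measurements at t=0, 12, 14, 31 become four 10-unit windows.
series = normalize([(0, 1), (12, 5), (14, 3), (31, 6)], window=10)
```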
Approach
[Figure: converting measurements into time series]
[Figure: overall architecture — Analysis, R-based Forecast, Database, Visualization Provider, Visualization]
Approach
Record Converter Record Reconstructor TCP Reader Anomaly Detector Flow Record Filter Record Distributor Record Converter Anomaly Detector One analysis branch per Filter Distribute by Filter
Sören Henning Visualization of Performance Anomalies September 8, 2016 17 / 41
[Figure: Anomaly Detection Stage — Time Series Loader, Normalizer, Sliding Window, Forecaster, Measurement Forecast Decorator, Anomaly Score Calculator, and Threshold Filters, with a Storager writing to an in-memory database behind an adapter interface]
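The core of the anomaly detection stage can be sketched as follows: a sliding window feeds a forecaster, the forecast is compared with the actual measurement to compute an anomaly score, and a threshold filter flags high scores. The mean-based forecast mirrors a MeanForecaster-style strategy; the score formula (a signed deviation normalized into [-1, 1]) is one plausible choice and may differ from the thesis' exact definition.

```python
from collections import deque

def mean_forecast(window):
    """MeanForecaster-style prediction: the mean of the sliding window."""
    return sum(window) / len(window)

def anomaly_score(measured, forecast):
    """Signed score in [-1, 1]; 0 means the measurement matched the
    forecast. One plausible normalization (illustrative)."""
    if measured + forecast == 0:
        return 0.0
    return (measured - forecast) / (measured + forecast)

def detect(series, window_size, threshold):
    """Slide a window over the series, forecast each next value, score the
    actual measurement, and flag indices whose score exceeds the threshold."""
    window = deque(maxlen=window_size)
    flagged = []
    for t, measured in enumerate(series):
        if len(window) == window_size:
            score = anomaly_score(measured, mean_forecast(window))
            if abs(score) > threshold:
                flagged.append(t)
        window.append(measured)
    return flagged

# A spike of 300 ms against a 100 ms baseline is flagged.
spikes = detect([100, 100, 100, 100, 300, 100], window_size=3, threshold=0.3)
```

Swapping `mean_forecast` for a regression- or ARIMA-based forecaster changes only the prediction step; the scoring and thresholding stay the same, which is the point of separating these stages in the pipeline.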
[Figure: visualization architecture — the client (web browser, running Anomaliz.js with CanvasPlot) sends an update request to the server every x seconds; the server reads new data from the database and sends it back, separating data updating from UI handling]
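The client's data-updating loop can be sketched as periodic polling: fetch new data every x seconds and hand it to the UI handler. A minimal sketch with the data source injected as a callable; in Anomaliz.js this would be a JavaScript timer issuing HTTP requests, and all names here are hypothetical.

```python
import time

def poll(fetch, handle, interval_sec, rounds):
    """Every `interval_sec` seconds, request new data via `fetch` and pass
    it to the UI handler `handle` (bounded to `rounds` iterations here so
    the sketch terminates; a real client would loop indefinitely)."""
    for _ in range(rounds):
        handle(fetch())
        time.sleep(interval_sec)

# The "server" returns one new (time, value) point per request;
# the handler appends received points to the client-side plot buffer.
received = []
poll(fetch=lambda: [(1, 0.2)], handle=received.extend,
     interval_sec=0.01, rounds=3)
```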
Evaluation
[Figure: measured response times (in ms) over time (in ms) for four workload scenarios]
[Figure: response times and resulting anomaly scores over time for the ARIMAForecaster, ExponentialWeightedForecaster, LinearWeightedForecaster, LogarithmicWeightedForecaster, MeanForecaster, and RegressionForecaster]
Conclusion and Future Work
◮ Providing infrastructure via Docker containers
◮ Immediate record processing
◮ Aggregate before analysis (ΘPAD)
◮ Cache time series operations
◮ Is or will be supported by TeeTime
References
Feasibility Evaluation
[Appendix figures: measured response times (in ms) and resulting anomaly scores over time for the evaluated forecasters (ARIMAForecaster, ExponentialWeightedForecaster, LinearWeightedForecaster, LogarithmicWeightedForecaster, MeanForecaster, RegressionForecaster), shown for several workload scenarios both at full scale and zoomed in around the injected anomaly]