On Learning Parametric Dependencies from Monitoring Data

Johannes Grohmann, Simon Eismann, Samuel Kounev
Symposium on Software Performance (SSP) 2019, 05.11.2019
https://se.informatik.uni-wuerzburg.de/


Outline: Introduction · Related Work · Approach · Evaluation · Conclusion

Software Performance Models

  • Performance models are a common approach to predict software performance
  • However, correctly modeling a software system is difficult

[Figure: a server system is mirrored by a performance model, which a consumer such as an auto-scaler uses to maximize efficiency]

Parametric Dependencies

  • Parametric dependencies are one kind of parameter in performance models

[Figure: a service composed of UI, Recommender, Image Provider, and Database; a workload of 20 users is observed, and the question is how the QoS metrics (Quality of Service) change for a workload of 50 users]

An example of such a dependency, with the resource demand in milliseconds:

ResourceDemand(Recommender) = 17 * currentItems.size() + 0 * user ID

Goal: Autonomically detect such parametric dependencies
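A dependency of this shape can be characterized from monitoring data with plain regression. A minimal sketch (not the authors' implementation; the synthetic data and variable names are made up), assuming scikit-learn is available:

```python
# Recover ResourceDemand(Recommender) = 17 * currentItems.size() from
# synthetic monitoring records; the user ID should get a coefficient near 0.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 200
items = rng.integers(1, 50, size=n).astype(float)        # currentItems.size() per invocation
user_id = rng.integers(1, 10_000, size=n).astype(float)  # irrelevant input parameter
demand_ms = 17.0 * items                                 # observed resource demand [ms]

X = np.column_stack([items, user_id])
model = LinearRegression().fit(X, demand_ms)

coef_items, coef_user = model.coef_
```

On this noise-free data, ordinary least squares recovers the coefficient 17 for `currentItems.size()` and a coefficient of (numerically) zero for the user ID.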


Related Work

  • Krogmann et al. [KKR10] and Mazkatli and Koziolek [MK18] require source code for the detection of dependencies. In contrast, our approach is based solely on monitoring data.

[Pipeline figure: Monitoring data → Model Extraction [BHK11, WS+17, HW+99, IL+05, MF11] → Performance Model → Parameterization [SC+15, BHK11, SG+19, RV95, KP+09] → Parameterized Performance Model; additionally, Dependency Identification yields Identified Dependencies, and Dependency Characterization [BH11, CW00, AG+18] yields Parameterized Dependencies]

In a nutshell

Problem: Manual identification of parametric dependencies is not always possible, and it is time-intensive and error-prone.
Idea: Learn the dependencies from standard monitoring data collected during production.
Benefit: Increased model accuracy and expressiveness; an additional step towards autonomic model learning.
Action: Use feature selection techniques for detecting the dependencies and regression for characterizing them.

APPROACH


Required monitoring information

  • Monitoring data per invocation through Kieker [vHWH12] monitoring:
  • Parameter values and types, return value and type, and resource demand (parameter-related information)
  • Method signature and entity (identification parameters)
  • Trace id, execution order index (EOI), and execution stack size (ESS) (trace reconstruction)
  • The latter enables reconstruction of the call-path trace for resolving aggregations
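Call-path reconstruction from trace id, EOI, and ESS can be sketched as follows. This is an illustrative toy, not Kieker's actual implementation; the record layout and operation names are assumptions:

```python
# Rebuild the call tree of one trace: replay records in EOI order and use the
# ESS (call depth) to keep a stack of currently active callers.
from dataclasses import dataclass

@dataclass
class Record:
    trace_id: int
    eoi: int        # execution order index: position within the trace
    ess: int        # execution stack size: call depth at invocation
    operation: str

records = [
    Record(1, 0, 0, "UI.show()"),
    Record(1, 1, 1, "Recommender.recommend()"),
    Record(1, 2, 2, "Database.query()"),
    Record(1, 3, 1, "ImageProvider.getImage()"),
]

def call_tree(trace_records):
    """Return (caller, callee) edges by replaying EOI order with an ESS stack."""
    edges, stack = [], []
    for r in sorted(trace_records, key=lambda rec: rec.eoi):
        stack = stack[:r.ess]              # unwind to the depth of the caller
        if stack:
            edges.append((stack[-1].operation, r.operation))
        stack.append(r)
    return edges

edges = call_tree(records)
```

For the toy trace this yields the edges UI→Recommender, Recommender→Database, and UI→ImageProvider, i.e. the call-path trace needed to resolve aggregated parameter values.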

Overview

[Pipeline figure as in Related Work; the following slides address the dependency-identification step]

Identification approaches

[Figure: the monitoring values form a feature matrix; the model variable is the target, the monitored parameters are the features, and identification selects a subspace of the feature space]

  • Embedded
  • Evaluate feature importance during training
  • Selection based on comparison with a "noise feature"
  • Algorithm: random forest [H95]
  • Wrapper
  • Selection based on the accuracy error for a feature subset, compared with a baseline regressor
  • Algorithms: M5 trees [Q+92] and linear regression
  • Filter: correlation-based
  • Pearson product-moment correlation coefficient (PPMCC)
  • Selection based on a threshold for the correlation
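The embedded and filter strategies can be sketched on toy monitoring data (not the authors' code; scikit-learn and SciPy are assumed, and the threshold value is made up):

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
items = rng.integers(1, 50, size=n).astype(float)         # relevant parameter
user_id = rng.integers(1, 10_000, size=n).astype(float)   # irrelevant parameter
noise = rng.normal(size=n)                                # artificial "noise feature"
demand = 17.0 * items + rng.normal(0.0, 5.0, size=n)      # target (model variable)

X = np.column_stack([items, user_id, noise])
names = ["items", "user_id"]

# Embedded: train a random forest and keep features whose importance beats
# the importance of the artificial noise feature.
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, demand)
noise_importance = rf.feature_importances_[2]
embedded = [nm for nm, imp in zip(names, rf.feature_importances_[:2])
            if imp > noise_importance]

# Filter: keep features whose absolute Pearson correlation (PPMCC) with the
# target exceeds a threshold (0.7 here, chosen arbitrarily).
filtered = [nm for nm, col in zip(names, X[:, :2].T)
            if abs(pearsonr(col, demand)[0]) > 0.7]
```

(The wrapper strategy would additionally retrain M5 trees or linear regressors per candidate feature subset and compare their error against a baseline regressor.)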

Evaluation

  • Distributed deployment of the TeaStore [vKE+18] application
  • Locust as load driver with a typical customer behavior:
  • Login & logout
  • Browse for products
  • Add products to cart
  • Checkout cart

Selection Thresholds

  • Filter approach outperforms other approaches
  • Results are threshold-independent

Filter Application

  Filtering Step                 Relevant   Irrelevant   Invalid   Total
  None                           11         94           5         110
  Identical (1)                  11         45           5         61
  (1) + Correlating (2)          11         35           1         47
  (1) + (2) + Graph-based (3)    11         8            1         20

In total, 86 irrelevant and 4 invalid dependencies are deleted. This results in a precision (11 relevant to 1 invalid) of 91.7 %.

Overview

[Pipeline figure repeated, now highlighting the Dependency Characterization step [BH11, CW00, AG+18]]

Dataset Characteristics

[Figures: datasets SortArray, SubsetSum (colors encode the defined sum), and Fibonacci (recursive, optimized recursive, and iterative variants)]

The datasets are diverse and vary in the number and types of parameters, the distribution of runtime (resource demand), and the type of dependency.

No Free Lunch

No free lunch for ML approaches! [8] We need a meta-classifier to select the appropriate algorithm.

Meta-Classifier

  • Using Classification and Regression Trees (CART) to train a decision tree on the following features:
  • Number of training instances (Size)
  • Number of parameters (NumParam)
  • Range of runtime values (RuntimeRange)
  • Coefficient of variation of the runtime (RuntimeCV)
  • Highest linear correlation between any input parameter and the runtime (HighestCorrelation)
  • Lowest linear correlation between any input parameter and the runtime (LowestCorrelation)
  • Coefficient of determination (R²) of a linear regression (R2LinReg)
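A minimal sketch of such a meta-classifier (illustrative only: the meta-feature values and labels below are invented, whereas the real classifier is trained on the evaluation datasets):

```python
from sklearn.tree import DecisionTreeClassifier

# One row of meta-features per dataset, in the slide's order:
# [Size, NumParam, RuntimeRange, RuntimeCV, HighestCorrelation, LowestCorrelation, R2LinReg]
meta_features = [
    [1000, 1, 50.0, 0.10, 0.99, 0.99, 0.98],   # near-linear dataset
    [1000, 3, 900.0, 0.80, 0.60, 0.05, 0.40],  # strongly non-linear dataset
    [200, 2, 30.0, 0.15, 0.95, 0.20, 0.95],
    [500, 4, 1200.0, 1.20, 0.50, 0.01, 0.30],
]
# Label: which regressor performed best on that dataset (invented here)
best_algorithm = ["LinReg", "SVR", "LinReg", "SVR"]

meta_clf = DecisionTreeClassifier(max_depth=3, random_state=0)
meta_clf.fit(meta_features, best_algorithm)

# Pick the regressor for a previously unseen dataset
choice = meta_clf.predict([[800, 2, 40.0, 0.12, 0.97, 0.50, 0.96]])[0]
```

The decision tree learns thresholds on the meta-features (e.g. on R2LinReg) and selects an algorithm per dataset instead of committing to one regressor globally.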

Meta-Classifier II

Improves the overall MAE (mean absolute error) by 30 % in comparison to always using SVR.

OPEN CHALLENGES


Monitoring Challenges

The required monitoring information (see the Approach section) raises open questions:

  • What are the important features of each parameter? How can the features be extracted?
  • We can only observe the response time. How can the resource demands be measured?
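For the resource-demand question, one standard estimator from the demand-estimation literature the talk cites [SC+15] is the service demand law; a toy sketch with invented numbers:

```python
# Service demand law: D = U / X, i.e. the resource demand per request equals
# the measured utilization divided by the system throughput. Values made up.
utilization = 0.60   # measured CPU utilization of the service's host
throughput = 40.0    # completed requests per second
demand_seconds = utilization / throughput   # CPU seconds per request
demand_ms = demand_seconds * 1000.0
```

With 60 % utilization at 40 requests per second, this yields 15 ms of CPU demand per request; more elaborate estimators exist for contended, multi-service hosts.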

Stability in higher load scenarios

  • Higher loads lead to less accuracy; however, the effect is slight
  • All relevant dependencies are still found

Evaluation Challenges

  • Setup as in the evaluation: distributed TeaStore [vKE+18] deployment with Locust as load driver
  • We need dependencies as a gold standard. How can they be obtained? Is a comparison with other paradigms required?

Integration Challenges

[Pipeline figure repeated; open question: how to integrate the identified and characterized dependencies into the extraction and parameterization pipeline]

Conclusion

Problem: Manual identification of parametric dependencies is not always possible, and it is time-intensive and error-prone.
Idea: Learn the dependencies from standard monitoring data collected during production.
Benefit: Increased model accuracy and expressiveness; an additional step towards autonomic model learning.
Action: Use feature selection techniques for detecting the dependencies and regression for characterizing them.

References

  • [KKR10] K. Krogmann, M. Kuperberg, and R. Reussner, "Using genetic search for reverse engineering of parametric behavior models for performance prediction," IEEE Transactions on Software Engineering, vol. 36, no. 6, 2010, pp. 865–877.
  • [MK18] M. Mazkatli and A. Koziolek, "Continuous Integration of Performance Model," in Companion of the 2018 ACM/SPEC International Conference on Performance Engineering (ICPE '18), Berlin, Germany: ACM, 2018, pp. 153–158.
  • [BHK11] F. Brosig, N. Huber, and S. Kounev, "Automated extraction of architecture-level performance models of distributed component-based systems," in 26th IEEE/ACM International Conference on Automated Software Engineering (ASE 2011), Lawrence, Kansas, November 2011.
  • [WS+17] J. Walter, C. Stier, H. Koziolek, and S. Kounev, "An Expandable Extraction Framework for Architectural Performance Models," in Proceedings of the 3rd International Workshop on Quality-Aware DevOps (QUDOS '17), ACM, April 2017.
  • [HW+99] C. E. Hrischuk, C. M. Woodside, J. A. Rolia, and R. Iversen, "Trace-based load characterization for generating performance software models," IEEE Transactions on Software Engineering, vol. 25, no. 1, pp. 122–135, Jan. 1999.
  • [IL+05] T. A. Israr, D. H. Lau, G. Franks, and M. Woodside, "Automatic generation of layered queuing software performance models from commonly available traces," in Proceedings of the 5th International Workshop on Software and Performance (WOSP '05), New York, USA: ACM, 2005, pp. 147–158.
  • [MF11] A. Mizan and G. Franks, "An automatic trace based performance evaluation model building for parallel distributed systems," SIGSOFT Software Engineering Notes, vol. 36, no. 5, pp. 61–72, Sep. 2011.
  • [CW00] M. Courtois and M. Woodside, "Using regression splines for software performance analysis," in Proceedings of the 2nd International Workshop on Software and Performance, 2000, pp. 105–114.
  • [AG+18] V. Ackermann, J. Grohmann, S. Eismann, and S. Kounev, "Black-box learning of parametric dependencies for performance models," in 13th International Workshop on Models@run.time (MRT), co-located with MODELS 2018, CEUR Workshop Proceedings, October 2018.
  • [SG+19] S. Spinner, J. Grohmann, S. Eismann, and S. Kounev, "Online model learning for self-aware computing infrastructures," Journal of Systems and Software, vol. 147, pp. 1–16, 2019.
  • [SC+15] S. Spinner, G. Casale, F. Brosig, and S. Kounev, "Evaluating Approaches to Resource Demand Estimation," Performance Evaluation, vol. 92, pp. 51–71, October 2015.
  • [RV95] J. Rolia and V. Vetland, "Parameter estimation for performance models of distributed application systems," in CASCON '95, IBM Press, 1995, p. 54.
  • [KP+09] S. Kraft, S. Pacheco-Sanchez, G. Casale, and S. Dawson, "Estimating service resource consumption from response time measurements," in VALUETOOLS '09, 2009, pp. 1–10.
  • [vHWH12] A. van Hoorn, J. Waller, and W. Hasselbring, "Kieker: A framework for application performance monitoring and dynamic software analysis," in Proceedings of the 3rd Joint ACM/SPEC International Conference on Performance Engineering, 2012, pp. 247–248.
  • [WF+16] I. H. Witten, E. Frank, M. A. Hall, and C. J. Pal, Data Mining: Practical Machine Learning Tools and Techniques, 4th ed., San Francisco, CA, USA: Morgan Kaufmann Publishers Inc., 2016.
  • [H95] T. K. Ho, "Random decision forests," in Proceedings of the Third International Conference on Document Analysis and Recognition (ICDAR '95), vol. 1, Washington, DC, USA: IEEE Computer Society, 1995.
  • [Q+92] J. R. Quinlan et al., "Learning with continuous classes," in 5th Australian Joint Conference on Artificial Intelligence, vol. 92, Singapore, 1992, pp. 343–348.
  • [vKE+18] J. von Kistowski, S. Eismann, N. Schmitt, A. Bauer, J. Grohmann, and S. Kounev, "TeaStore: A micro-service reference application for benchmarking, modeling and resource management research," in Proceedings of the 26th IEEE International Symposium on the Modelling, Analysis, and Simulation of Computer and Telecommunication Systems (MASCOTS '18), September 2018.
  • [GE+19] J. Grohmann, S. Eismann, S. Elflein, M. Mazkatli, J. von Kistowski, and S. Kounev, "Detecting Parametric Dependencies for Performance Models Using Feature Selection Techniques," in Proceedings of the 27th IEEE International Symposium on the Modelling, Analysis, and Simulation of Computer and Telecommunication Systems (MASCOTS '19), Rennes, France, October 2019.

https://se.informatik.uni-wuerzburg.de/

Thank you for your attention! 