

  1. BUNGEE: An Elasticity Benchmark for Self-Adaptive IaaS Cloud Environments. Andreas Weber, Nikolas Herbst, Henning Groenda, Samuel Kounev. Symposium on Software Performance 2015, Munich, November 5th, 2015. http://descartes.tools/bungee

  2. Elasticity

  3. Characteristics of Rubber Bands vs. IaaS Clouds
     ■ Base length ↔ Performance (1 resource unit)
     ■ Width / thickness / force ↔ Quality criteria / SLA contract
     ■ Stretchability ↔ Scalability
     ■ Elasticity ↔ Elasticity
     [Figure: size over time for a rubber band; resources over time for a cloud]

  4. Comparing Elastic Behavior of Rubber Bands and IaaS Clouds
     ■ Measure elasticity independent of performance
     [Figure: demand and supply over time for bands of equal base length (2 cm), illustrating elasticity; and for bands of different base lengths (2 cm and 4 cm), illustrating scalability]

  5. Motivation & Related Work

  6. Why measure Elasticity?
     ■ Elasticity is a major quality attribute of clouds [Gartner09]
     ■ Many scaling strategies exist, in industry and academia [Galante12, Jennings14]
     ⇒ A benchmark is needed for comparability!
     "You can't control what you can't measure." (DeMarco)
     "If you cannot measure it, you cannot improve it." (Lord Kelvin)

  7. Related Work: Elasticity Benchmarking Approaches
     ■ Specialized approaches [Binning09, Li10, Dory11, Almeida13]
       ■ Measure technical provisioning time
       ■ Measure SLA compliance
       ■ Focus on scale-up/out
     ■ Business perspective [Weimann11, Folkerts12, Islam12, Moldovan13, Tinnefeld14]
       ■ What is the financial impact?
       ■ Disadvantage: mixes up elasticity technique and business model

  8. Concept & Implementation

  9. Elasticity Benchmarking Concept
     [Process diagram; step 1, System Analysis, highlighted]
     ■ System Analysis: analyze the performance of underlying resources & scaling behavior
     ■ Benchmark Calibration
     ■ Measurement
     ■ Elasticity Evaluation

  10. Analyze System
      ■ Approach
        ■ Evaluate the system separately at each scale (resource amount)
        ■ Find the maximal load intensity that the system can withstand without violating SLOs (binary search)
        ■ Derive the demand step function: resourceDemand = f(intensity)
      ■ Benefit
        ■ Derive the resource demand for arbitrary load intensity variations (see the sketch below)
      [Figure: a load intensity profile over time is mapped through f(intensity) to a resource demand profile over time]
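      The per-scale binary search can be sketched as follows. This is a minimal illustration in Java, not BUNGEE's actual code: the `Sut.sloSatisfied` hook (run a constant-load experiment and check SLO compliance), the search bounds, and the resolution are assumptions.

      ```java
      import java.util.Map;
      import java.util.TreeMap;

      /** Sketch: derive the demand step function resourceDemand = f(intensity). */
      public class SystemAnalysis {

          /** Hypothetical hook: run a constant-load experiment at the given
           *  intensity with the given number of resource units and report
           *  whether all SLOs (e.g. response-time bounds) were met. */
          interface Sut {
              boolean sloSatisfied(int resourceUnits, double intensity);
          }

          /** For each scale, binary-search the maximal SLO-compliant intensity. */
          static TreeMap<Double, Integer> analyze(Sut sut, int maxResources,
                                                  double upperIntensityBound) {
              TreeMap<Double, Integer> maxIntensityPerScale = new TreeMap<>();
              for (int units = 1; units <= maxResources; units++) {
                  double lo = 0, hi = upperIntensityBound;
                  while (hi - lo > 1.0) {              // search resolution: 1 request/s
                      double mid = (lo + hi) / 2;
                      if (sut.sloSatisfied(units, mid)) lo = mid; else hi = mid;
                  }
                  maxIntensityPerScale.put(lo, units); // step boundary of f
              }
              return maxIntensityPerScale;
          }

          /** f(intensity): smallest scale whose maximal intensity covers the load. */
          static int resourceDemand(TreeMap<Double, Integer> f, double intensity) {
              Map.Entry<Double, Integer> e = f.ceilingEntry(intensity);
              if (e == null) throw new IllegalArgumentException("beyond analyzed range");
              return e.getValue();
          }
      }
      ```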

  11. Elasticity Benchmarking Concept
      [Process diagram; step 2, Benchmark Calibration, highlighted]
      ■ System Analysis: analyze the performance of underlying resources & scaling behavior
      ■ Benchmark Calibration: adjust load profile
      ■ Measurement
      ■ Elasticity Evaluation

  12. Benchmark Calibration
      ■ Approach: adjust the load intensity profile (see the sketch below) to overcome
        ■ different performance of underlying resources
        ■ different scalability
      [Figure: the same target demand curve, mapped through each system's f(intensity), yields a system-specific intensity profile]
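      Conceptually, calibration inverts each system's demand function so that the same platform-independent demand curve is induced on every system under test. The sketch below builds on the analysis sketch above; the 0.99 safety factor and the inversion strategy are assumptions of this illustration, not BUNGEE's exact procedure.

      ```java
      import java.util.Map;
      import java.util.TreeMap;

      /** Sketch: calibrate the load profile so that every system under test is
       *  driven through the same resource-demand curve over time. */
      public class BenchmarkCalibration {

          /** Inverse lookup: maximal intensity that 'units' resources sustain,
           *  taken from the analysis result (max intensity -> resource units). */
          static double maxIntensityFor(TreeMap<Double, Integer> f, int units) {
              for (Map.Entry<Double, Integer> e : f.entrySet()) {
                  if (e.getValue() == units) return e.getKey();
              }
              throw new IllegalArgumentException("scale not analyzed: " + units);
          }

          /** Map a platform-independent demand profile (resource units per time
           *  step) to a system-specific intensity profile inducing that demand. */
          static double[] adjustProfile(int[] demandProfile, TreeMap<Double, Integer> f) {
              double[] intensity = new double[demandProfile.length];
              for (int t = 0; t < demandProfile.length; t++) {
                  // Slightly below the capacity of demand[t] units, so that
                  // demand[t] units suffice but fewer units would violate SLOs.
                  intensity[t] = 0.99 * maxIntensityFor(f, demandProfile[t]);
              }
              return intensity;
          }
      }
      ```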

  13. Elasticity Benchmarking Concept
      [Process diagram; step 3, Measurement, highlighted]
      ■ System Analysis: analyze the performance of underlying resources & scaling behavior
      ■ Benchmark Calibration: adjust load profile
      ■ Measurement: expose the cloud system to varying load & monitor resource supply and demand
      ■ Elasticity Evaluation

  14. Measurement
      ■ Requirement: stress the SUT in a representative manner
        ■ Realistic variability of load intensity
        ■ Adaptability of load profiles to suit different domains
      ■ Approach: http://descartes.tools/limbo
        ■ Open workload model [Schroeder06]
        ■ Model load variations with the LIMBO toolkit [SEAMS15Kistowski], which facilitates the creation of new load profiles
          ■ derived from existing traces
          ■ with desired properties (e.g. seasonal patterns, bursts)
        ■ Execute the load profile using JMeter: a timer plugin (https://github.com/andreaswe/JMeterTimestampTimer) delays requests according to the timestamp file created by LIMBO (see the sketch below)
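      The open-workload execution can be illustrated with a small timestamp-driven driver: requests are fired at predefined instants, independent of earlier responses. This is not the JMeter plugin itself, just a sketch of the same idea; the timestamp file format (one offset in seconds per line) and the target URL are assumptions.

      ```java
      import java.io.IOException;
      import java.net.URI;
      import java.net.http.HttpClient;
      import java.net.http.HttpRequest;
      import java.net.http.HttpResponse;
      import java.nio.file.Files;
      import java.nio.file.Path;
      import java.util.List;

      /** Sketch: open-workload driver that fires one request per timestamp. */
      public class TimestampDrivenLoadDriver {
          public static void main(String[] args) throws IOException {
              List<String> lines = Files.readAllLines(Path.of("limbo-timestamps.txt"));
              HttpClient client = HttpClient.newHttpClient();
              HttpRequest request = HttpRequest.newBuilder(
                      URI.create("http://sut.example.org/work")).build();

              long start = System.nanoTime();
              for (String line : lines) {
                  long dueNanos = (long) (Double.parseDouble(line.trim()) * 1e9);
                  long wait = dueNanos - (System.nanoTime() - start);
                  if (wait > 0) {
                      try {
                          Thread.sleep(wait / 1_000_000, (int) (wait % 1_000_000));
                      } catch (InterruptedException e) {
                          Thread.currentThread().interrupt();
                          return;
                      }
                  }
                  // Send asynchronously so slow responses never delay later arrivals
                  // (this is what makes the workload "open").
                  client.sendAsync(request, HttpResponse.BodyHandlers.discarding());
              }
          }
      }
      ```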

  15. Elasticity Benchmarking Concept
      [Process diagram; step 4, Elasticity Evaluation, highlighted; CloudStack shown as example SUT]
      ■ System Analysis: analyze the performance of underlying resources & scaling behavior
      ■ Benchmark Calibration: adjust load profile
      ■ Measurement: expose the cloud system to varying load & monitor resource supply and demand
      ■ Elasticity Evaluation: evaluate elasticity aspects (accuracy & timing) with metrics

  16. Metrics: Accuracy (1/3)
      [Figure: resource demand and resource supply curves over the measurement period T; O_1, O_2, O_3 mark overprovisioned areas, U_1, U_2, U_3 underprovisioned areas]
      $\mathrm{accuracy}_O = \frac{\sum_i O_i}{T}$   $\mathrm{accuracy}_U = \frac{\sum_i U_i}{T}$

  17. Metrics: Timeshare (2/3)
      [Figure: the same demand and supply curves; B_1, B_2, B_3 mark time intervals of overprovisioning, A_1, A_2, A_3 intervals of underprovisioning]
      $\mathrm{timeshare}_O = \frac{\sum_i B_i}{T}$   $\mathrm{timeshare}_U = \frac{\sum_i A_i}{T}$

  18. Metrics: Jitter (3/3)
      [Figure: two demand/supply plots, one with fewer and one with more supply adaptations than demand adaptations]
      $\mathrm{jitter} = \frac{E_S - E_D}{T}$, where $E_D$ = # demand adaptations and $E_S$ = # supply adaptations (negative values mean supply adapts less often than demand)
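      Taken together, the three metrics can be computed directly from the demand and supply curves. The sketch below assumes both curves are sampled at a fixed interval, which is a simplification made for brevity; a faithful implementation would integrate the actual step functions. Units follow the case-study tables: accuracy in resource units, timeshare in percent, jitter in adaptations per minute.

      ```java
      /** Sketch: compute the accuracy, timeshare and jitter metrics from
       *  demand/supply curves sampled every dtMinutes. */
      public class ElasticityMetrics {

          public static void compute(int[] demand, int[] supply, double dtMinutes) {
              double T = demand.length * dtMinutes;  // measurement duration [min]
              double areaOver = 0, areaUnder = 0, timeOver = 0, timeUnder = 0;
              int demandAdaptations = 0, supplyAdaptations = 0;

              for (int t = 0; t < demand.length; t++) {
                  int diff = supply[t] - demand[t];
                  if (diff > 0) { areaOver  += diff * dtMinutes; timeOver  += dtMinutes; }
                  if (diff < 0) { areaUnder -= diff * dtMinutes; timeUnder += dtMinutes; }
                  if (t > 0 && demand[t] != demand[t - 1]) demandAdaptations++;
                  if (t > 0 && supply[t] != supply[t - 1]) supplyAdaptations++;
              }
              System.out.printf("accuracy_O  = %.3f res. units%n", areaOver / T);
              System.out.printf("accuracy_U  = %.3f res. units%n", areaUnder / T);
              System.out.printf("timeshare_O = %.1f %%%n", 100 * timeOver / T);
              System.out.printf("timeshare_U = %.1f %%%n", 100 * timeUnder / T);
              System.out.printf("jitter      = %.3f adap./min%n",
                      (supplyAdaptations - demandAdaptations) / T);
          }
      }
      ```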

  19. Elasticity Benchmarking Concept
      [Process diagram; all four steps]
      ■ System Analysis: analyze the performance of underlying resources & scaling behavior
      ■ Benchmark Calibration: adjust load profile
      ■ Measurement: expose the cloud system to varying load & monitor resource supply and demand (CloudStack as example SUT)
      ■ Elasticity Evaluation: evaluate elasticity aspects (accuracy & timing) with metrics

  20. BUNGEE Implementation
      ■ Java-based elasticity benchmarking framework
      ■ Components
        ■ Harness (benchmark node)
        ■ Cloud-side load generation application (CSUT)
      ■ Extensible with respect to
        ■ new cloud management software
        ■ new resource types
        ■ new metrics
      ■ Automates the four benchmarking activities: System Analysis, Benchmark Calibration, Measurement, Elasticity Evaluation
      ■ Analysis of horizontally scaling clouds based on CloudStack and AWS
      ■ http://descartes.tools/bungee
        ■ Code is open source
        ■ Quick Start Guide available
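      Extensibility of this kind typically comes down to a small provider interface per concern. The following is a hypothetical sketch of what such an extension point for new cloud management software might look like; the `CloudController` name and its methods are invented for illustration and are not BUNGEE's actual API.

      ```java
      /** Hypothetical extension point for supporting new cloud management
       *  software; the interface and methods are illustrative only. */
      public interface CloudController {
          /** Current number of allocated resource units (the supply curve). */
          int currentSupply();

          /** Ask the platform to scale to the given number of resource units. */
          void scaleTo(int resourceUnits);
      }

      /** A CloudStack-flavored implementation would wrap the platform's API;
       *  the bodies are left unimplemented in this sketch. */
      class CloudStackController implements CloudController {
          @Override public int currentSupply() {
              throw new UnsupportedOperationException("sketch only");
          }
          @Override public void scaleTo(int resourceUnits) {
              throw new UnsupportedOperationException("sketch only");
          }
      }
      ```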

  21. Evaluation & Case Study

  22. Evaluation & Case Study
      ■ Evaluation (private cloud)
        ■ Reproducibility of system analysis: Err_rel < 5% at 95% confidence for the first scaling stage
        ■ Simplified system analysis: linearity assumption holds for the test system
        ■ Consistent ranking by metrics: separate evaluation for each metric, min. 4 configurations per metric
      ■ Case Study (private & public cloud)
        ■ Applicability in a real scenario
        ■ Different performance of underlying resources
        ■ Metric aggregation

  23. Case Study: Configuration F – 1Core

      Configuration   accuracy_O    accuracy_U    timeshare_O  timeshare_U  jitter       elastic  violations
                      [res. units]  [res. units]  [%]          [%]          [adap./min]  speedup  [%]
      F – 1Core       2.423         0.067         66.1         4.8          -0.067       1.046    7.6

  24. Case Study: Config. F – 2Core, not adjusted

      Configuration             accuracy_O    accuracy_U    timeshare_O  timeshare_U  jitter       elastic  violations
                                [res. units]  [res. units]  [%]          [%]          [adap./min]  speedup  [%]
      F – 1Core                 2.423         0.067         66.1         4.8          -0.067       1.046    7.6
      F – 2Core, no adjustment  1.811         0.001         63.8         0.1          -0.033       1.291    2.1

  25. Case Study: Config. F – 2Core, adjusted

      Configuration             accuracy_O    accuracy_U    timeshare_O  timeshare_U  jitter       elastic  violations
                                [res. units]  [res. units]  [%]          [%]          [adap./min]  speedup  [%]
      F – 1Core                 2.423         0.067         66.1         4.8          -0.067       1.046    7.6
      F – 2Core, no adjustment  1.811         0.001         63.8         0.1          -0.033       1.291    2.1
      F – 2Core, adjusted       2.508         0.061         67.1         4.5          -0.044       1.025    8.2

  26. Conclusion
      ■ Goal: evaluate elastic behavior independent of
        ■ performance of underlying resources and scaling behavior
        ■ business model
      ■ Contribution:
        ■ Elasticity benchmark concept for IaaS cloud platforms
        ■ Refined set of elasticity metrics
        ■ Concept implementation: BUNGEE, a framework for elasticity benchmarking
      ■ Evaluation:
        ■ Consistent ranking of elastic behavior by metrics
        ■ Case study on AWS and CloudStack
      ■ Future Work:
        ■ BUNGEE: distributed load generation, vertical scaling, different resource types
        ■ Experiments: tuning of elasticity parameters, evaluation of proactive controllers
      http://descartes.tools/bungee

  27. Literature (1/2)
      Gartner09: D. C. Plume, D. M. Smith, T. J. Bittman, D. W. Cearley, D. J. Cappuccio, D. Scott, R. Kumar, and B. Robertson, "Five Refining Attributes of Public and Private Cloud Computing", Tech. rep., Gartner, 2009.
      Galante12: G. Galante and L. C. E. de Bona, "A Survey on Cloud Computing Elasticity", in Proceedings of the 2012 IEEE/ACM Fifth International Conference on Utility and Cloud Computing, Washington, 2012.
      Jennings14: B. Jennings and R. Stadler, "Resource management in clouds: Survey and research challenges", Journal of Network and Systems Management, pp. 1-53, 2014.
      Binning09: C. Binnig, D. Kossmann, T. Kraska, and S. Loesing, "How is the weather tomorrow?: towards a benchmark for the cloud", in Proceedings of the Second International Workshop on Testing Database Systems, 2009.
      Li10: A. Li, X. Yang, S. Kandula, and M. Zhang, "CloudCmp: Comparing Public Cloud Providers", in Proceedings of the 10th ACM SIGCOMM Conference on Internet Measurement, 2010.
      Dory11: T. Dory, B. Mejías, P. V. Roy, and N.-L. Tran, "Measuring Elasticity for Cloud Databases", in Proceedings of the Second International Conference on Cloud Computing, GRIDs, and Virtualization, 2011.
      Almeida13: R. F. Almeida, F. R. C. Sousa, S. Lifschitz, and J. C. Machado, "On defining metrics for elasticity of cloud databases", Simpósio Brasileiro de Banco de Dados (SBBD) 2013, http://www.lbd.dcc.ufmg.br/colecoes/sbbd/2013/0012.pdf, last consulted Nov. 2015.
      Weimann11: J. Weinman, "Time is Money: The Value of 'On-Demand'", 2011, http://www.joeweinman.com/resources/Joe_Weinman_Time_Is_Money.pdf, last consulted Nov. 2015.
