Performance Benchmarking with Cloud Workbench (CWB)
  1. Performance Benchmarking with Cloud Workbench (CWB). Presenters: Joel Scheuner, Philipp Leitner

  2. https://icet-lab.eu @IcetLab

  3. Performance Matters

  4. Benchmarking IaaS Clouds

  5. Capacity Planning in the Cloud is hard: it is impractical to test all instance types. (chart: number of EC2 instance types by year, 2006-2018, y-axis 0-180) Examples range from t2.nano (0.05-1 vCPU, 0.5 GB RAM, $0.006/h) to x1e.32xlarge (128 vCPUs, 3904 GB RAM, $26.688/h). Source: https://aws.amazon.com/blogs/aws/ec2-instance-history/

  6. Capacity Planning in the Cloud is hard: "The instance type itself is a very major tunable parameter" (@brendangregg, re:Invent '17, https://youtu.be/89fYOo1V2pA?t=5m4s)

  7. Which cloud provider should I choose? Should I go for many small or few large instances? General-purpose or *-optimized? Pay for better IOPS or not? …… ➡ Need for Benchmarking

  8. Basic Cloud Benchmarking Approach (diagram): a Benchmark Manager calls the Provider API to provision an instance, starts the benchmark on it, collects the results, and destroys the instance.
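A minimal sketch of this provision/run/destroy loop in Ruby (the language of the CWB ecosystem); `FakeProviderAPI` and `BenchmarkManager` are illustrative stand-ins, not CWB classes:

```ruby
# Minimal sketch of the basic benchmarking loop from the slide:
# provision an instance, start the benchmark, collect results, destroy.
# FakeProviderAPI is a hypothetical stand-in for a real provider client.
class FakeProviderAPI
  def provision_instance
    { id: "i-0001", state: :running } # pretend the VM is up
  end

  def destroy_instance(instance)
    instance[:state] = :terminated
  end
end

class BenchmarkManager
  def initialize(api)
    @api = api
  end

  def run
    instance = @api.provision_instance
    results = start_benchmark(instance) # blocks until the benchmark finishes
    @api.destroy_instance(instance)     # always release paid resources
    results
  end

  private

  def start_benchmark(instance)
    # A real implementation would SSH into the VM and run e.g. sysbench.
    { instance: instance[:id], metric: "io_bandwidth", value: 42.0 }
  end
end

manager = BenchmarkManager.new(FakeProviderAPI.new)
puts manager.run
```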

  9. Basic Cloud Benchmarking Approach, extended (diagram): the CWB Server's Scheduler acquires IaaS resources via Vagrant; a Chef Server provisions Chef Clients on all VMs; a JMeter Master with several JMeter Slaves acts as the load driver, sending requests from a Test Plan to the SUT (the AcmeAir web application with MongoDB) and returning results to the CWB Client. CCGrid 2017 "An Approach and Case Study of Cloud Instance Type Selection for Multi-Tier Web Applications"

  10. Benchmark Types: Micro Benchmarks (CPU, Memory, I/O, Network; generic, artificial, resource-specific) vs. Application Benchmarks (overall performance, e.g., response time; specific, real-world, resource-heterogeneous)

  11. Micro Benchmark Examples. File I/O: 4k random read (1. Prepare, 2. Run, 3. Extract Result, 4. Cleanup), e.g. result 3.5793 MiB/sec. Network: client/server bandwidth measurement, e.g. result 972 Mbits/sec.
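Step 3 (extract result) usually means parsing the tool's raw text output. A small Ruby sketch; the sample output format is an illustrative assumption loosely modeled on sysbench fileio, chosen to match the value on the slide:

```ruby
# Extract a throughput metric from raw benchmark output (step 3 of the
# prepare/run/extract/cleanup cycle). The sample output mimics sysbench
# fileio; the exact format is an assumption for illustration.
raw_output = <<~OUT
  File operations:
      reads/s: 916.45
  Throughput:
      read, MiB/s: 3.5793
OUT

# Pull the number after "read, MiB/s:" out of the text.
match = raw_output.match(%r{read, MiB/s:\s*([\d.]+)})
io_bandwidth = match[1].to_f

puts io_bandwidth # prints 3.5793
```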

  12. Application Benchmark Examples: Molecular Dynamics Simulation (MDSim) and WordPress Benchmark (WPBench). (chart: number of concurrent threads, 0-100, over elapsed time, 00:00-08:00 min) WPBench runs multiple short blogging session scenarios (read, search, comment).

  13. Cloud Workbench: a tool for scheduling cloud experiments. Demo: https://www.youtube.com/watch?v=0yGFGvHvobk Code: https://github.com/sealuzh/cloud-workbench CloudCom 2014 "Cloud WorkBench - Infrastructure-as-Code Based Cloud Benchmarking"

  14. Planned Schedule: My First CWB Benchmark [~30 mins]; CWB Architecture and Selected Previous Results [~30 mins]; ~ Coffee Break ~ 🎊; Building and Running a Benchmark from Ground Up [~90 mins]; Wrap-Up and Outlook [5 mins]

  15. My First CWB Benchmark: Interactive Session

  16. Online Material: http://bit.ly/cwb-tutorial

  17. Benchmarking with CWB: 1. Write benchmark config to set up the environment (optional for simple benchmarks) 2. Declare IaaS resources and parametrize the benchmark config 3. Trigger execution or define a periodic schedule 4. Download metrics as a CSV file 5. Analyze results

  18. 1. Write benchmark config: set up the environment and write the CWB execution hook
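What such an execution hook might look like, as a hedged Ruby sketch: `CwbClient` here is a local stand-in for the real CWB client library, not its actual API, and the workload is a trivial placeholder:

```ruby
# Hypothetical sketch of a CWB execution hook: run a workload, measure it,
# and report the metric. CwbClient is a stand-in, NOT the real CWB client.
class CwbClient
  attr_reader :submitted

  def initialize
    @submitted = []
  end

  def submit_metric(name, value)
    @submitted << [name, value] # a real client would POST this via REST
  end
end

def run_workload
  # Measure wall-clock time of a placeholder workload with a monotonic clock.
  start = Process.clock_gettime(Process::CLOCK_MONOTONIC)
  100_000.times { |i| i * i }
  Process.clock_gettime(Process::CLOCK_MONOTONIC) - start
end

client = CwbClient.new
client.submit_metric("workload_runtime_seconds", run_workload)
puts client.submitted.length # prints 1
```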

  19. 2. Declare IaaS resources and parametrize the benchmark config

  20. 3. Trigger or schedule execution

  21. 4. Download metrics as CSV file

  22. CWB Architecture

  23. (architecture build-up) The Experimenter uploads a configuration via REST to the Provisioning Service, which stores the Configurations.

  24. The CWB Server consists of a Web Interface, Scheduler, Business Logic, and a Relational Database, exposed via REST; the Experimenter accesses the Web Interface and uploads configurations to the Provisioning Service.

  25. IaaS Providers are attached: a Provider Plugin in the CWB Server manages VMs through each Provider API (REST).

  26. The Execution Environment is added: the CWB Server provisions Cloud VMs and executes commands over SSH; the Cloud VMs run the Benchmark and fetch their configuration from the Provisioning Service via REST.

  27. A CWB Client Library on each Cloud VM notifies state changes and submits metrics back to the CWB Server via REST.

  28. Two Ruby DSLs underpin provisioning: Vagrant, a Ruby DSL for defining infrastructure (mostly VMs), and Chef, a Ruby DSL for configuring machines.
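For illustration, minimal sketches of the two DSLs; the box name, AWS settings, and resource names are placeholders, and actually running them requires Vagrant (with the vagrant-aws plugin) and Chef respectively:

```ruby
# Vagrantfile sketch: Ruby DSL for defining infrastructure (mostly VMs).
Vagrant.configure("2") do |config|
  config.vm.box = "dummy"              # placeholder box
  config.vm.provider :aws do |aws|     # assumes the vagrant-aws plugin
    aws.instance_type = "t2.nano"
    aws.region        = "eu-west-1"
  end
end

# Chef recipe sketch (recipes/default.rb): Ruby DSL for configuring machines.
package "sysbench"                     # install the benchmark tool

directory "/opt/benchmark" do          # create a working directory
  mode "0755"
end
```

Both fragments are declarative: Vagrant reads the first to decide what to create, Chef converges the machine toward the state the second describes.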

  29. Benchmark Execution Lifecycle (sequence across Experimenter/Scheduler, CWB Server, Provisioning Service, Provider API, and Cloud VM): Trigger Execution, Acquire Resources, Provision VM, Fetch VM Configurations, Apply VM Configurations, Start Benchmark Run, Run Benchmark, Notify Benchmark Completed, Postprocess Results, Submit Metric(s), Notify Postprocessing Completed, Release Resources.
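The lifecycle can be written down as an ordered actor/step sequence; a Ruby sketch, where the grouping into actors is a simplification of the diagram:

```ruby
# The benchmark execution lifecycle from the slide, modeled as an ordered
# list of (actor, step) pairs. Purely illustrative.
LIFECYCLE = [
  [:scheduler,  "Trigger Execution"],
  [:cwb_server, "Acquire Resources"],
  [:provider,   "Provision VM"],
  [:cloud_vm,   "Fetch VM Configurations"],
  [:cloud_vm,   "Apply VM Configurations"],
  [:cwb_server, "Start Benchmark Run"],
  [:cloud_vm,   "Run Benchmark"],
  [:cloud_vm,   "Notify Benchmark Completed"],
  [:cloud_vm,   "Postprocess Results"],
  [:cloud_vm,   "Submit Metric(s)"],
  [:cloud_vm,   "Notify Postprocessing Completed"],
  [:cwb_server, "Release Resources"]
].freeze

LIFECYCLE.each { |actor, step| puts format("%-11s %s", actor, step) }
```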

  30. Selected Previous Results

  31. Example Study 1 - Performance Testing of the Cloud. Study setup: benchmarked 22 cloud configurations using 5 benchmarks; two types of experiments: Isolated (300-500 repetitions) and Continuous (15 repetitions per configuration). TOIT 2016 "Patterns in the Chaos - A Study of Performance Variation and Predictability in Public IaaS Clouds"

  32. (figure only)

  33. Results Summary. TOIT 2016 "Patterns in the Chaos - A Study of Performance Variation and Predictability in Public IaaS Clouds"

  34. Observed CPU Models (for m1.small and Azure Small in North America). TOIT 2016 "Patterns in the Chaos - A Study of Performance Variation and Predictability in Public IaaS Clouds"

  35. Impact of Different Days / Times (for m3.large in Europe). (charts: IO Bandwidth [Mb/s], 0-50, over Time of the Day, 00:00-20:00, and Day of the Week, Mon-Sun) TOIT 2016 "Patterns in the Chaos - A Study of Performance Variation and Predictability in Public IaaS Clouds"

  36. Recent Results (Feb 2019) vs. 2015 (unpublished data)

  37. Instance Runtime (Feb 2019) vs. 2015. (chart: Continuous io benchmark on Azure D2s; Benchmark Value, 3.5-5.5, over Benchmark Runtime [h], 0-60) (unpublished data)
