The M³ (Measure-Measure-Model) Tool-Chain for Performance Prediction of Multi-tier Applications


  1. The M³ (Measure-Measure-Model) Tool-Chain for Performance Prediction of Multi-tier Applications. Devidas Gawali, Varsha Apte. Department of Computer Science and Engineering, Indian Institute of Technology Bombay, India. {devidas,varsha}@cse.iitb.ac.in. QUDOS 2016, Saarland University, Saarbrücken, Germany. July 21, 2016.

  2. Overview
     1. Problem Statement
     2. Approach
     3. The M³ Toolchain
     4. Validation
     5. Experiments
     6. Conclusion and Future Work

  3. Application Performance Prediction: production hardware is most often different from testbed hardware.

  4. Problem Statement
     Given the results of performance measurement of an application A on a testbed platform X, predict the resource utilization, response time, and throughput of application A on a target platform Y.
     Challenges:
     - It may be difficult or impossible to deploy the entire original application on the target and load-test it.
     - Modeling also requires the resource service demands of the application on the target.

  5. Key Idea: Performance "Clone"
     - Generate a simple "clone": a program that mimics the performance of the application on the testbed.
     - Deploy and measure the clone (instead of the original application) on the target.
     - Measure the resource demand of the clone on the target.
     - Use this as an estimate of the resource demand in a queueing model of the application on the target.
     - In this work, the focus is on the CPU-intensive web tier.

  6. The Clone: Architecture
     - Mimics the application; matches its CPU service demand.
     - Works with only a "dummy" back-end tier.

  7. The Clone Code. Figure: Clone Template
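The clone template itself is not reproduced in this transcript. The following is a hypothetical sketch of what such a template might look like, based on the inputs described later (a CPU section tuned via a loop value, plus one dummy back-end exchange of the measured byte counts). All names here are illustrative assumptions, not the actual CloneGen output.

```python
import socket

LOOP_VALUE = 50_000     # tuned so the CPU time matches the measured service demand
BYTES_WEB_TO_DB = 120   # byte counts taken from the CloneGen input file
BYTES_DB_TO_WEB = 400

def cpu_section():
    """CPU-intensive part of the clone, using 'string' instructions."""
    s = ""
    for _ in range(LOOP_VALUE):
        s = ("ab" * 8).upper()
    return s

def backend_call(sock):
    """One dummy back-end call: send and receive the measured byte counts."""
    sock.sendall(b"x" * BYTES_WEB_TO_DB)
    received = b""
    while len(received) < BYTES_DB_TO_WEB:
        chunk = sock.recv(BYTES_DB_TO_WEB - len(received))
        if not chunk:
            break
        received += chunk
    return received
```

The dummy back-end only has to echo the right number of bytes, so it imposes negligible load of its own.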

  8. The M³ Toolchain
     We use a "chain" of measurement and modeling tools developed in-house to implement the clone-based performance prediction approach.
     Figure: Three-step hybrid measurement and modeling approach (AutoPerf, CloneGen, AutoPerf, Model)

  9. The M³ Toolchain: Tool Pipeline
     1. AutoPerf: profile the application's performance on the testbed platform.
     2. CloneGen: generate clones of the application's server-side request code that are easier to run on the target platform.
     3. AutoPerf: measure the clones' performance on the target platform.
     4. PerfCenter: produce application performance metrics on the target.

  10. Toolchain: Measure Application Service Demand on Target. Figure: Profile the application

  11. Generating the Clone. Figure: The desired service demand is achieved by tuning loopvalue
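The slide shows loopvalue being tuned until the clone's measured CPU time per request matches the target service demand. Since CPU demand grows monotonically with the loop count, one plausible way to do this (a sketch, not the actual CloneGen algorithm) is a binary search over the loop value:

```python
import time

def clone_cpu_section(loop_value):
    """CPU-intensive section of the clone ('string' instruction type)."""
    s = ""
    for _ in range(loop_value):
        s = ("ab" * 8).upper()
    return s

def measured_demand(loop_value, reps=30):
    """Average CPU seconds consumed per clone request."""
    start = time.process_time()
    for _ in range(reps):
        clone_cpu_section(loop_value)
    return (time.process_time() - start) / reps

def tune_loop_value(target_demand, lo=1, hi=1 << 22, tol=0.05):
    """Binary-search the loop value until the clone's measured CPU demand
    is within tol (relative error) of the target service demand."""
    while lo < hi:
        mid = (lo + hi) // 2
        d = measured_demand(mid)
        if abs(d - target_demand) / target_demand <= tol:
            return mid
        if d < target_demand:
            lo = mid + 1
        else:
            hi = mid
    return lo
```

Using `time.process_time()` rather than wall-clock time keeps the calibration insensitive to other load on the machine.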

  12. The M³ Toolchain: CloneGen Input
     1. Number of back-end (DB) calls.
     2. Bytes sent and received between the web and DB servers.
     3. Service time of the application on the web and DB servers.
     4. Type of instructions in the code snippet used in the clone benchmark.

  13. Inputs to CloneGen: sample input file
     number of BackEnd Calls = 4
     serviceDemand of App on Web = 0.0016
     serviceDemand of App on db = 0.0019
     bytes from web to db = 120
     bytes from db to web = 400
     bytes from web to client = 1100
     type of instructions = string
     client machine server address = 10.129.X.X
     testbed machine server address = 10.129.X.X
     target machine server address = 10.129.X.X
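The sample input is a simple key = value format, so reading it is straightforward. A minimal sketch of a reader for this format (the real CloneGen parser is not shown in the talk and may differ):

```python
def parse_clonegen_input(text):
    """Parse a CloneGen-style key = value input file into a dict."""
    cfg = {}
    for line in text.splitlines():
        if "=" not in line:
            continue                      # skip blank or malformed lines
        key, _, value = line.partition("=")
        cfg[key.strip()] = value.strip()
    return cfg

sample = """\
number of BackEnd Calls = 4
serviceDemand of App on Web = 0.0016
serviceDemand of App on db = 0.0019
bytes from web to db = 120
bytes from db to web = 400
type of instructions = string
"""
```

Numeric fields such as the service demands would then be converted with `int()` or `float()` as needed.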

  14. Toolchain: Measuring the Clone's Service Demand on the Target, and Using It in a Model. Figure: Measure the clone's service demand. Figure: Predict application performance on the target

  15. The M³ Toolchain: PerfCenter: Modeling and Prediction
     Input:
     1. Measured service demand: the clone's CPU service demand (from the toolchain).
     2. Hardware details of the target: number of devices, number of CPUs (manually specified).
     3. Message-flow details: bytes exchanged between servers (measured separately and specified).
     Output:
     1. Modeled application throughput.
     2. Modeled utilization of the host CPU.
     3. Modeled application response time.
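PerfCenter's internals are not shown in the talk; as a rough illustration of the kind of queueing computation that turns per-station service demands into throughput and response time for a closed system of N users, here is exact Mean Value Analysis (a standard technique, not necessarily what PerfCenter uses):

```python
def mva(demands, n_users, think_time=0.0):
    """Exact MVA for a closed product-form queueing network.

    demands: service demand (seconds) at each queueing station,
             e.g. [web CPU, DB CPU].
    Returns (throughput, total response time) at n_users.
    """
    q = [0.0] * len(demands)              # mean queue length per station
    for n in range(1, n_users + 1):
        # Arrival theorem: residence time seen by an arriving customer
        r = [d * (1.0 + qi) for d, qi in zip(demands, q)]
        total_r = sum(r)
        x = n / (total_r + think_time)    # system throughput
        q = [x * ri for ri in r]          # Little's law per station
    return x, total_r
```

With the service demands from the sample input (0.0016 s web, 0.0019 s DB), throughput saturates near 1/0.0019 ≈ 526 req/s as users increase, matching the bottleneck-law upper bound.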

  16. The M³ Toolchain: PerfCenter: input file

  17. Validation
     We used two standard Web benchmarks (DellDVD, RUBiS) with various combinations of testbed and target platforms, and compared measured vs. modeled metrics using our measure-measure-model approach.

  18. Experiment Combinations

  19. Measured vs. Modelled: Throughput, Utilization. Figures: Combinations 1-4, plotting measured and modelled throughput (req/sec) and utilization (average number of cores used) against number of users.

  20. Measured vs. Modelled: Response Time. Figures: Combinations 1-4, plotting measured and modelled response time (ms) against number of users.

  21. Measured vs. Modeled: Error Frequency Distribution. Figure: Histogram of measured-vs-modeled error % for utilization and throughput (bins: 0-5, 5-10, 10-15, 15-20, >20).

  22. Conclusion and Future Work
     Conclusion:
     - Proposed the Measure-Measure-Model (M³) methodology for application performance prediction.
     - Demonstrated how a tool-chain of measurement, clone-generation, and modeling tools can be built to (partially) automate this methodology.
     - Validated the approach on two standard Web benchmarks with various combinations of testbed and target platforms.
     Future work:
     - Support a range of resource demands over a given frequency distribution.
     - Validate, and if required extend, the approach for Web applications written in Java.
     - Predict application performance in a virtualized environment.

  23. Thank you.
