How to Waste Time and Money Testing the Performance of a Software Product


  1. How to Waste Time and Money Testing the Performance of a Software Product
     David Daly | Lead Engineer -- Performance | @daviddaly44 | https://daviddaly.me/

  2. [Image-only slide; only visible text: "-22%"]

  3. [Image-only slide; no recoverable text]

  4. Understand the performance of our software and when it changes.

  5. How NOT to: machines with personality; run tests by hand; wait until release time; have a dedicated team separate from dev.
     How to: automate everything; minimize noise; involve everyone; always be testing.

  6. Performance Use Cases
     • Detect performance-impacting commits (Waterfall)
     • Test the impact of a proposed code change (Patch Test)
     • Diagnose performance regressions (Diagnostics, Profiling)
     • Release support (how do we compare to the previous stable release?)
     • Add test coverage
     • Performance exploration

  7. (Repeat of slide 6.)

  8. Detect performance-impacting commits (Waterfall)

  9. Performance Testing in Continuous Integration
     1. Set up a system under test
     2. Run a workload
     3. Report the results
     4. Visualize the results
     5. Decide (and alert) if the performance changed
     Automate everything / keep noise down

  10. (Repeat of slide 9.)
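
  Slides 9-10 walk through the CI loop but show no code. As a stand-in, here is a minimal, hypothetical Python sketch of that loop; every function name is invented for illustration, and none of this is DSI's actual API.

```python
"""Hypothetical sketch of the CI loop from slide 9 (not DSI's real API)."""
import random
import statistics

def run_workload(commit: str) -> list[float]:
    # Stands in for "set up a system under test" + "run a workload":
    # here we simply fabricate throughput samples for the given commit.
    random.seed(commit)
    return [10_000 * random.gauss(1.0, 0.02) for _ in range(20)]

def performance_changed(history: list[float], latest: float,
                        threshold: float = 0.05) -> bool:
    # "Decide (and alert) if the performance changed": a naive check
    # against the historical median. Real systems need something more
    # robust, e.g. the change point detection on slide 21.
    baseline = statistics.median(history)
    return abs(latest - baseline) / baseline > threshold

history = [statistics.median(run_workload(f"commit-{i}")) for i in range(10)]
latest = statistics.median(run_workload("commit-new"))
print("ALERT: performance changed" if performance_changed(history, latest)
      else "no change detected")
```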

  11. Levels of performance testing:
      • System-Level Tests (Sys-perf): multi-node clusters in the cloud with end-to-end tests. Expensive ($s and hours); run least frequently.
      • Microbenchmarks: single-node, CPU-bound tests on dedicated hardware.
      • Unit-Level Performance Tests: Google Benchmark framework; some dedicated hardware. Least expensive ($s and hours).
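
  Slide 11 names Google Benchmark, a C++ framework, for the unit level; the deck itself contains no code. The sketch below only illustrates the general microbenchmark pattern (repeat the operation, amortize timer overhead, keep the least-noisy sample) in Python; it is not Google Benchmark.

```python
"""Illustrative microbenchmark harness; shows the pattern only, not the
Google Benchmark framework named on slide 11 (which is C++)."""
import time

def bench_ns(fn, repeats: int = 30, inner: int = 10_000) -> float:
    # Call fn `inner` times per sample to amortize timer overhead,
    # then report the fastest sample (least noise) in ns per call.
    samples = []
    for _ in range(repeats):
        start = time.perf_counter_ns()
        for _ in range(inner):
            fn()
        samples.append((time.perf_counter_ns() - start) / inner)
    return min(samples)

d = {"k": 1}
print(f"dict lookup: {bench_ns(lambda: d['k']):.1f} ns/call")
```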

  12. The focus for DSI (Distributed Systems Infrastructure) was serving the more complex requirements of end-to-end system performance tests on real clusters: automating every step, including hardware provisioning, and generating consistent, repeatable results.

  13. DSI Goals
      • Full end-to-end automation
      • Support both CI and manual testing
      • Elastic, public cloud infrastructure
      • Everything configurable
      • All configuration via YAML
      • Diagnosability
      • Repeatability

  14. DSI Modules
      • Bootstrap
      • Infrastructure provisioning
      • System setup
      • Workload setup
      • MongoDB setup
      • Test control
      • Analysis
      • Infrastructure teardown

  15. Configuration Files <Put Henrik’s examples here>
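
  Slide 15's placeholder for Henrik's examples was never filled in, so the actual configuration files are not in this deck. As a stand-in, here is a hypothetical sketch of what a DSI-style YAML file might look like, using the module names from slide 14; every key below the top level is invented for illustration, so consult the open-source DSI code for the real schema.

```python
"""Hypothetical DSI-style YAML (top-level names echo slide 14's modules;
everything nested under them is invented). Requires: pip install pyyaml."""
import yaml

CONFIG = """
infrastructure_provisioning:
  cluster_name: perf-3-node
  instance_type: c5.4xlarge
mongodb_setup:
  topology: replica_set
  mongod_config:
    storage:
      engine: wiredTiger
test_control:
  task_name: industry_benchmarks
  timeout_s: 7200
"""

config = yaml.safe_load(CONFIG)
print(config["test_control"]["task_name"])  # -> industry_benchmarks
```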

  16. (Repeat of slide 9.)

  17.-20. [Image-only slides; no recoverable text]

  21. Performance Testing in Continuous Integration
      1. Set up a system under test
      2. Run a workload
      3. Report the results
      4. Visualize the results
      5. Decide (and alert) if the performance changed
         • See the ICPE paper "Change Point Detection in Software Performance Testing" (video, slides)
      Automate everything / keep noise down
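
  MongoDB's production implementation of change point detection (E-Divisive means, per the ICPE paper) is open source in the signal-processing-algorithms package mentioned on slide 32. The sketch below is not that algorithm; it is a deliberately naive mean-shift detector, included only to illustrate the idea of scoring every possible split of a noisy series and picking the most likely change point.

```python
"""Naive change-point sketch (NOT the E-Divisive implementation from the
ICPE paper or the signal-processing-algorithms package; illustration only)."""
import statistics

def most_likely_change_point(series: list[float]) -> tuple[int, float]:
    """Return (index, score): the split that maximizes the gap between the
    means of the two halves, normalized by the overall standard deviation."""
    sd = statistics.pstdev(series) or 1.0
    best_idx, best_score = 0, 0.0
    for i in range(2, len(series) - 2):
        left, right = series[:i], series[i:]
        score = abs(statistics.fmean(left) - statistics.fmean(right)) / sd
        if score > best_score:
            best_idx, best_score = i, score
    return best_idx, best_score

# 30 builds at ~1000 ops/s, then a ~22% regression (echoing slide 2).
series = [1000.0] * 30 + [780.0] * 10
idx, score = most_likely_change_point(series)
print(f"change point at build {idx}, score {score:.2f}")  # build 30
```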

  22.-24. [Image-only slides; no recoverable text]

  25. Release support

  26. Can we release?
      • How is the performance compared to the last release?
      • How many open issues are there? Are they getting fixed? Are they stuck?
      • Do we have coverage for new features?

  27.-29. [Image-only slides; no recoverable text]

  30. Humans in the loop: periodically review everything (weekly, monthly).
      • Is everything important ticketed?
      • Are the top issues being worked?
      • Surface trade-offs that need to be addressed (e.g., new feature X makes everything else 3% slower)
      Put people on the hard parts, then see what can be automated next.

  31. Ongoing Work

  32. Work with Us. We have real-world problems and would love to work with the community:
      • Noise-reduction work
      • Dbtest.io: "Automated System Performance Testing at MongoDB"
      • ICPE paper: "The Use of Change Point Detection to Identify Software Performance Regressions in a Continuous Integration System" (video)
      Our code is open source: signal-processing-algorithms and our infrastructure code. Our regression environment is open, and the platform is open source. Our performance data is not open source, but we're working to share it with academics.

  33. Thank you
      David Daly | Lead Engineer -- Performance | @daviddaly44 | https://daviddaly.me/
