  1. VDBench: A Benchmarking Toolkit for Thin-client based Virtual Desktop Environments
     Alex Berryman, berryman@oar.net
     In collaboration with: Dr. Prasad Calyam (OSC/OARnet), Prof. Albert Lai (OSUMC), Matt Honigford (VMware)
     IEEE CloudCom, Indianapolis, IN, December 2nd, 2010

  2. Topics of Discussion
     • Background and Motivation
     • VDBench Components and Data Flows
       – Architecture, Metrics, Techniques
     • VDBench Experiment Results
       – User Load Simulation based Benchmarking
       – Slow-motion Application Interaction based Benchmarking
     • VDBench Use Cases
       – Performance Mapping to User Application and Group Profiles
       – Resource Location Selection for Performance Balancing

  3. Virtual Desktop Infrastructure
     • VMware View is one of the most widely used VDI solutions
       – Personalized user desktops, applications, and data access while maintaining centralized control and security

  4. Research Context
     • Recent advances in thin clients, and the numerous benefits of transitioning user desktops to cloud environments
       – Convenience, cost savings, Green IT, security, …
     • Need for "system-aware", "network-aware", "human-aware" frameworks and tools to deploy virtual desktop clouds
       – Existing work focuses mainly on system (i.e., CPU and memory) measurements for server-side resource adaptation
       – Our focus is to couple client-and-server resource adaptation with measurements of network health and user experience
         • Minimize cloud resource over-commitment
         • Avoid guesswork in configuring thin-client protocols
         • Deliver optimum user experience of virtual applications

  5. VDBench Components and Data Flows

  6. Application Tasks Progression

  7. User-Load Simulation Metrics
     • Controlled Variables:
       – Number of VMs concurrently running to simulate user load
       – "Ceiling Response Time" is set to a 30% increase over the ideal "Application Response Time"
     • Metrics:
       – Application Response Time
         • Application Open Time
         • Aggregate inter-application task times – Matlab surface visualization
         • Atomic tasks – Alt+Tab, 'Save As', web-page loads
       – Memory Availability
         • (Memory Allocated - Memory Balloon) / Memory Allocated
       – Available Memory Utilization
         • Memory Used / (Memory Allocated - Memory Balloon)
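The memory and response-time metrics on the slide above reduce to simple arithmetic. The following Python sketch shows how they could be computed from the slide's formulas; the function names, parameter names, and example numbers are illustrative assumptions, not part of VDBench itself.

```python
# Illustrative computation of the user-load simulation metrics on this slide.
# Function/variable names and the example numbers are placeholders, not VDBench code.

def memory_availability(mem_allocated_mb: float, mem_balloon_mb: float) -> float:
    """(Memory Allocated - Memory Balloon) / Memory Allocated."""
    return (mem_allocated_mb - mem_balloon_mb) / mem_allocated_mb

def available_memory_utilization(mem_used_mb: float,
                                 mem_allocated_mb: float,
                                 mem_balloon_mb: float) -> float:
    """Memory Used / (Memory Allocated - Memory Balloon)."""
    return mem_used_mb / (mem_allocated_mb - mem_balloon_mb)

def ceiling_response_time(ideal_response_time_s: float,
                          tolerance: float = 0.30) -> float:
    """Ceiling Response Time: a 30% increase over the ideal Application Response Time."""
    return ideal_response_time_s * (1.0 + tolerance)

if __name__ == "__main__":
    # Example numbers chosen only to show the arithmetic.
    print(memory_availability(2048, 512))                  # 0.75
    print(available_memory_utilization(1024, 2048, 512))   # ~0.667
    print(ceiling_response_time(4.0))                      # 5.2 s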

  8. Slow-Motion Benchmarking Metrics
     • Controlled Variables:
       – Network Health: latency, loss, available bandwidth
       – Codec choice: RDP, RGS, PCoIP
     • Metrics:
       – Render Time: time taken for a screen update to complete
         • Assumes the time between the first and last packet of a screen update equals the time taken to display the update
       – Coding Efficiency: amount of data needed to transfer the screen information
         • Atomic: a single screen update
         • Aggregate: many screen updates (e.g., video playback)
       – Video Quality: ratio of actual video playback to ideal playback
         • Measures the information loss when a codec transmits a video file
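As a companion to the definitions above, here is a minimal Python sketch of how the per-update metrics could be derived from a captured packet trace. The `Packet` structure and the reading of Video Quality as a delivered-versus-expected ratio are assumptions made for illustration; the slide does not prescribe an implementation.

```python
# Sketch of the slow-motion metrics on this slide, computed from a packet trace
# of one screen update. Data structures here are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class Packet:
    timestamp_s: float   # capture time of the packet
    size_bytes: int      # payload bytes belonging to the screen update

def render_time(update: List[Packet]) -> float:
    """Approximate render time as the span between the first and last packet
    of the screen update (the assumption stated on this slide)."""
    return update[-1].timestamp_s - update[0].timestamp_s

def coding_efficiency(update: List[Packet]) -> int:
    """Total bytes the protocol needed to transfer this screen update
    (atomic case; sum over many updates for the aggregate case)."""
    return sum(p.size_bytes for p in update)

def video_quality(updates_delivered: int, updates_expected: int) -> float:
    """One plausible reading of the 'actual vs. ideal playback' ratio:
    screen updates actually delivered over those expected at full rate."""
    return updates_delivered / updates_expected
```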

  9. Slow-Motion Benchmarking Technique

  10. VDBench Control Logic

  11. Research Goals and Contributions
     • Develop VDBench techniques to simulate realistic user workflows under synthetic system loads and network health impairments
       – Useful to measure the corresponding user-perceived 'interactive response times' (e.g., application launch time, web-page download time, video quality)
     • Correlate thin-client user events with server-side resource performance events and develop novel performance metrics
       – Proposed the use of 'marker packets' that leverage and extend earlier research on slow-motion benchmarking of thin-client performance
     • Generate resource utilization profiles of different applications and user groups based on VDBench measurements
       – Useful to intelligently map pools of desktops to resources such that user satisfaction is ensured with minimal resource over-provisioning
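To make the 'marker packets' idea above concrete, the sketch below brackets each scripted client task with small UDP markers so that a packet capture and server-side logs can later be aligned to task boundaries. The port number, collector address, and payload format are assumptions for illustration; the slide does not specify the marker format.

```python
# Hedged sketch of the 'marker packet' idea: bracket each thin-client task with
# small, easily identifiable packets so a capture and server-side logs can be
# synchronized to task boundaries. Port/address/payload are assumed values.
import socket
import time

MARKER_PORT = 54321          # assumed port for marker traffic
MARKER_HOST = "10.0.0.10"    # assumed measurement-collector address

def send_marker(label: str) -> float:
    """Emit a UDP marker carrying a label and a local timestamp; return the timestamp."""
    ts = time.time()
    payload = f"VDBENCH-MARKER {label} {ts:.6f}".encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (MARKER_HOST, MARKER_PORT))
    return ts

def timed_task(label: str, task) -> float:
    """Run one scripted GUI task between a start marker and an end marker."""
    start = send_marker(f"{label}-start")
    task()                     # e.g. a scripted application interaction
    end = send_marker(f"{label}-end")
    return end - start
```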

  12. Related Work
     • Slow-motion benchmarking of thin-client display protocols was developed by [Nieh et al.]
       – Focus is on measuring user-perceived performance by monitoring the client side of a remote desktop session
       – Lai et al. used slow-motion benchmarking to investigate thin-client display protocol characteristics over WAN connections
     • Existing virtual desktop benchmarking tools such as "Login VSI", developed by [Spruijt et al.], simulate realistic user workflows
       – They neglect the distinction between client-side rendering and server-side processing and hence cannot measure thin-client user experience
     • VNCPlay [Zeldovich et al.] and Deskbench [Rhee et al.] focus on thin-client workflow replay
       – They do not address event synchronization with server-side performance measurements

  13. VDBench Experiments Overview
     • VDI Scalability
       – Stress-test VMs under various user application workloads
         • User Applications: MS Excel, IE Browser, Windows Media Player, Matlab
       – We used AutoIt user application workloads (scripts for repeatable and automated GUI interactions with key presses and mouse movements)
     • VDI Reliability
       – Evaluate performance of remote desktop protocols
         • Protocols Evaluated: Microsoft Remote Desktop Protocol (RDP), HP Remote Graphics Software (RGS), Teradici PC-over-IP (PCoIP)
       – We used netem for network (i.e., bandwidth, delay, loss) emulation
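For the reliability experiments, netem is driven through the standard Linux `tc` command. Below is a minimal sketch of applying and clearing a delay/loss impairment from Python; the interface name and impairment values are placeholders for whatever the testbed uses, and bandwidth limiting would be layered on separately (e.g., with a tbf qdisc).

```python
# Minimal sketch of driving Linux netem (mentioned on this slide) from Python to
# impair the thin-client network path. Interface name and values are assumptions;
# must be run with sufficient privileges on the impairment node.
import subprocess

IFACE = "eth0"   # assumed interface carrying the thin-client session

def set_impairment(delay_ms: int, loss_pct: float) -> None:
    """Install a netem qdisc that adds fixed delay and random packet loss."""
    subprocess.run(
        ["tc", "qdisc", "replace", "dev", IFACE, "root", "netem",
         "delay", f"{delay_ms}ms", "loss", f"{loss_pct}%"],
        check=True)

def clear_impairment() -> None:
    """Remove the netem qdisc and restore the default queueing discipline."""
    subprocess.run(["tc", "qdisc", "del", "dev", IFACE, "root"], check=True)

if __name__ == "__main__":
    set_impairment(delay_ms=100, loss_pct=1.0)   # e.g. emulate a 100 ms, 1% loss WAN
    # ... run the RDP/RGS/PCoIP benchmark pass here ...
    clear_impairment()
```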

  14. User Load Simulation based Benchmarking
     • Memory utilization results show a linear increase in Memory Balloon size as the number of VMs increases
     • Increasing the Memory Balloon size on each VM leads to an increase in user-perceived application open and task times

  15. Slow-motion Application based Benchmarking
     • Slow-motion benchmarking results give the bandwidth consumption of each protocol for a specific type of screen content and network health condition

  16. VDBench Use Cases
     • Performance Mapping to User Application and Group Profiles
     • Resource Location Selection for Performance Balancing

  17. Thank you for your attention!
