Performance Evaluation and Modeling of SaaS Web Services in the Cloud (PowerPoint PPT Presentation)

SLIDE 1

Performance Evaluation and Modeling of SaaS Web Services in the Cloud

Abdallah Ali Zainelabden Abdallah Ibrahim PhD Defense / University of Luxembourg (UL)

January 10th, 2020

Dissertation Defense Committee:

  • Chairman: A-Prof. Dr. Ulrich Sorger, University of Luxembourg, Luxembourg
  • Vice-Chairman: Prof. El-Ghazali Talbi, INRIA, University of Lille, France
  • Jury Member: Dr. Dzmitry Kliazovich, Oply Mobility, Luxembourg
  • Ph.D. supervisor: Prof. Dr. Pascal Bouvry, University of Luxembourg, Luxembourg
  • Ph.D. advisor: Dr. Sébastien Varrette, University of Luxembourg, Luxembourg

1 / 65 Abdallah Ibrahim (PhD Defense) Performance Evaluation and Modeling of SaaS Web Services in the Cloud

SLIDE 2

Summary

1 Context & Motivations
2 PRESEnCE: PeRformance Evaluation of SErvices on the Cloud
    Modeling Module
    Stealth Module
    SLA Checker Module
3 Conclusion and Perspectives

SLIDE 3

Context & Motivations

Summary

1 Context & Motivations
2 PRESEnCE: PeRformance Evaluation of SErvices on the Cloud
    Modeling Module
    Stealth Module
    SLA Checker Module
3 Conclusion and Perspectives


SLIDE 5

Context & Motivations

Introduction

[Timeline figure, 1960s-2020s: MainFrame, Super Computers, PC / Network Computing, High Performance Computing, Service Application Providers, Virtualization Frameworks, Cloud Computing, Fog / Edge, IoT / Sensors]

Cloud Computing (CC)

[Source: NIST (National Institute of Standards & Technology)]

Network access to a shared pool of configurable computing resources* which is:

↪ ubiquitous
↪ convenient
↪ on-demand

* e.g., networks, servers, storage, applications, and services

SLIDE 6

Context & Motivations

Cloud Computing Deployment Models

Infrastructure as a Service (IaaS)

Amazon WS, Google Cloud, Microsoft Azure...


SLIDE 7

Context & Motivations

Cloud Computing Deployment Models

Platform as a Service (PaaS)

Microsoft Azure, Google AppEngine, Heroku, SalesForce


SLIDE 8

Context & Motivations

Cloud Computing Deployment Models

Software as a Service (SaaS)

GMail, Office 365, GoogleApps, MongoDB, ...


SLIDE 9

Context & Motivations

Cloud Computing Deployment Models

Whatever as a Service (<x>aaS)

... as soon as it runs in a pay-per-use model over Cloud resources



SLIDE 11

Context & Motivations

Public Cloud Market Share: SaaS

[Bar chart: public cloud market (US$ billions), axis 100-400, years 2018-2022, segments SaaS, BPaaS, IaaS, PaaS]

[GAR19] K. Costello et al. Gartner Forecasts Worldwide Public Cloud Revenue to Grow in 2020, Gartner, 2019.

  • SaaS Share: 45% in 2022

SLIDE 12

Context & Motivations

Service Level Agreement (SLA)

[Diagram: Cloud Services Provider (CSP) ↔ Customers, negotiation process over Cloud Services]

Service Level Agreement (SLA)

A contract which defines exactly what services a Cloud Services Provider (CSP) provides

↪ and the required level or standard for those services [SLA09]

[SLA09] P. Patel et al. Service level agreement in cloud computing, 2009.

SLIDE 13

Context & Motivations

Service Level Agreement (SLA)

[Diagram: the Cloud Services Provider (CSP) and the Cloud Services Customer (CSC) negotiate an SLA Document covering Service Levels, Service Penalties, SLA Metrics, Service Credits, and Service Level Objectives (SLOs); quality metrics include Throughput, Response Time, and Latency]

[ICDECT17] S. Anithakumari et al. Negotiation and Monitoring of Service Level Agreements in Cloud Computing Services, Springer.
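The SLA structure above (SLOs defined over quality metrics such as throughput, response time, and latency) can be expressed as a machine-checkable document. A minimal Python sketch; the metric names, thresholds, and provider name are illustrative assumptions, not PRESEnCE's actual SLA format:

```python
# A minimal, machine-checkable sketch of an SLA document with Service Level
# Objectives (SLOs) over quality metrics. Metric names, thresholds, and the
# provider name are illustrative assumptions, not PRESEnCE's actual format.

sla = {
    "provider": "ExampleCSP",              # hypothetical provider
    "slos": {
        # metric: (comparison, threshold)
        "response_time_ms": ("<=", 200),   # lower is better
        "throughput_rps":   (">=", 1000),  # higher is better
        "availability_pct": (">=", 99.9),
    },
}

def check_sla(sla, measured):
    """Return the list of SLO metrics violated by the measured values."""
    ops = {"<=": lambda v, t: v <= t, ">=": lambda v, t: v >= t}
    return [m for m, (op, t) in sla["slos"].items()
            if m in measured and not ops[op](measured[m], t)]

measured = {"response_time_ms": 250, "throughput_rps": 1200, "availability_pct": 99.95}
print(check_sla(sla, measured))  # ['response_time_ms']
```

Encoding the SLOs as data rather than prose is what makes the later "SLA checker" step automatable.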


SLIDE 19

Context & Motivations

Motivations

[Diagram: pay-as-you-go Cloud Services between the Cloud Services Provider (CSP) and Customers, with no guarantee for SLOs]

Problem Statement

The quality of the provided services is defined using SLAs, YET there is no standard mechanism to verify and assure that the delivered services satisfy the signed SLA agreement in

↪ an automatic way
↪ outside of the Cloud Service Provider's awareness

i.e., measuring the Quality of Service (QoS) accurately . . . without giving the CSP the chance to change the allocated resources


SLIDE 20

Context & Motivations

Motivations: SLA Violations

On a simple (easy-to-detect) Key Performance Indicator: Downtime

Cloud Vendor     | Availability (%) | Downtime (H) | Avg. Downtime (H) | Cost ($/H) | Downtime cost ($)
YouTube          | 99.999  | 0.17    | 0.024  | 200 k | 34 k
Cisco            | 99.97   | 5.33    | 0.761  | 200 k | 1066 k
Facebook         | 99.951  | 8.5     | 1.214  | 200 k | 1700 k
VMware           | 99.943  | 10      | 1.429  | 336 k | 3360 k
Dropbox          | 99.903  | 17      | 2.429  | 200 k | 3400 k
Twitter          | 99.871  | 22.68   | 3.24   | 200 k | 4536 k
Netflix          | 99.863  | 24      | 3.429  | 200 k | 4800 k
Google           | 99.661  | 59.31   | 8.473  | 300 k | 17739 k
Apple            | 99.583  | 73.05   | 10.436 | 200 k | 14610 k
Yahoo            | 99.475  | 92      | 13.143 | 200 k | 18400 k
SalesForce       | 99.32   | 119.08  | 17.012 | 200 k | 23816 k
OVH              | 98.963  | 181.63  | 25.947 | 336 k | 61027 k
IBM              | 98.727  | 223     | 31.857 | 336 k | 74928 k
Amazon           | 98.382  | 292.893 | 41.841 | 336 k | 98411 k
Microsoft Azure  | 97.811  | 383.54  | 54.791 | 336 k | 128869 k

[IWGCCR13] C. Cerin et al. Downtime statistics of current cloud solutions. IWGCCR, 2013.
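The Downtime and Downtime-cost columns follow from simple arithmetic on the Availability column. A small sketch of that arithmetic; the roughly two-year observation window (about 17,520 hours) is inferred from the table's own figures, not stated on the slide:

```python
# Reproduces the arithmetic behind the downtime table: downtime is the
# unavailable fraction of an observation window, and its cost is downtime
# times the hourly cost. The ~two-year window (17,520 hours) is inferred
# from the table's own figures, not stated on the slide.

PERIOD_HOURS = 2 * 365 * 24  # = 17,520 hours

def downtime_hours(availability_pct, period_hours=PERIOD_HOURS):
    """Hours of downtime implied by an availability percentage."""
    return (1.0 - availability_pct / 100.0) * period_hours

def downtime_cost(availability_pct, cost_per_hour, period_hours=PERIOD_HOURS):
    """Dollar cost of that downtime at a given hourly cost."""
    return downtime_hours(availability_pct, period_hours) * cost_per_hour

# YouTube row: 99.999% availability at $200k/h -> ~0.18 h, ~$35k
# (the table shows 0.17 h / $34k, consistent with a slightly shorter window)
print(downtime_hours(99.999), downtime_cost(99.999, 200_000))
```

The point of the exercise: even "five nines" of availability translates into real money once the hourly cost of an outage is large.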


SLIDE 22

Context & Motivations

SLA Violations: SaaS

[Google19] Google. G Suite Status Dashboard, 2019.


SLIDE 25

Context & Motivations

Research Motivations

State-of-the-Art: SaaS Performance Evaluation

Relatively few research work done:

֒ → mostly focus on quality of software services [SP18] ֒ → quality models: rough sets, MCDA, prediction and fuzzy logic, Mean Opinion Score ֒ → different attributes of software quality:

functionality, reliability, usability, efficiency, maintainability, and portability [ISO01]

֒ → OR report easy to detect KPIs (Ex: downtime)

⇒ no actual automated way to check QoS

[SP18] D. Jagli et al., A quality model for evaluating SaaS on the cloud computing environment, Springer, 2018.
[ISO01] ISO/IEC 9126-1:2001 Software engineering - Product quality - Part 1: Quality model.


SLIDE 27

Context & Motivations

Research Motivations

State-of-the-Art: SLA Assurance

There are few works on SLA metrics monitoring/measurement:

↪ based on black-box metrics evaluation [CloudCom17]:
    CloudHarmony, Monitis, CloudWatch, CloudStatus, . . .
    Test-as-a-Service (TaaS) on the cloud
↪ other frameworks, e.g., CLOUDQUAL [TI14]

⇒ no automated and standard way of measuring the SLA compliance

[CloudCom17] S. Wagle et al., Service performance pattern analysis and prediction of available providers, 2017.
[TI14] X. Zheng et al., CLOUDQUAL: A quality model for cloud services, IEEE Trans. on Industrial Informatics, 2014.


SLIDE 31

Context & Motivations

Research Motivations

Ph.D. Objectives

Propose a systematic & optimized framework for evaluating:

↪ the QoS and SLA compliance of the offered cloud SaaS services
↪ across several CSPs (allowing to propose a pertinent ranking between them)

The framework should assess SaaS services:

↪ with pertinent benchmarking/monitoring involving multiple metrics, using distributed agents
↪ in an automatic and stealth (i.e., obfuscated) way: prevent CSPs from improving their results (by adapting the allocated resources) upon detecting the evaluation
↪ defeating benchmarking detection: hidden as a "normal" client behaviour

⇒ PRESEnCE framework

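The "hidden as a normal client behaviour" requirement can be illustrated with a scheduler that spaces benchmark probes the way ordinary user traffic arrives, instead of firing them back-to-back. A hedged sketch: the exponential think-time model and its mean are assumptions for illustration, not the deck's actual stealth algorithm:

```python
import random

# Illustrative sketch of the "stealth" requirement: disguise benchmark probes
# as ordinary client traffic by drawing inter-request gaps from a think-time
# distribution instead of firing requests back-to-back. The exponential model
# and its mean are assumptions, not PRESEnCE's actual workload model.

def stealth_schedule(n_requests, mean_think_time_s=1.5, seed=42):
    """Return increasing send offsets (seconds) for n benchmark requests."""
    rng = random.Random(seed)        # fixed seed keeps the sketch reproducible
    t, schedule = 0.0, []
    for _ in range(n_requests):
        t += rng.expovariate(1.0 / mean_think_time_s)  # Poisson-like arrivals
        schedule.append(t)
    return schedule

print(stealth_schedule(5))  # five irregular, strictly increasing offsets
```

A tool that sends requests at fixed, tight intervals is easy for a provider to fingerprint; randomized, human-like pacing is the simplest counter-measure.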

SLIDE 32

Context & Motivations

Summary of Ph.D. Contributions

[Contributions diagram: KPIs and SLAs of CSPs' SaaS Cloud Web Services, analyzed by PRESEnCE through four pillars]

  • Evaluating & Monitoring: Stealth Testing, Metrics
  • Modeling: Prediction Model for Metrics, Sensitivity Analysis
  • Assurance & Verification: SLOs / Metrics Analysis, Probability-based Model for detecting Breaches
  • QoS Analysis: MCDA-based Ranking, Service-Levels-based Ranking



SLIDE 37

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Summary

1 Context & Motivations
2 PRESEnCE: PeRformance Evaluation of SErvices on the Cloud
    Modeling Module
    Stealth Module
    SLA Checker Module
3 Conclusion and Perspectives

SLIDE 38

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Proposed Framework

PRESEnCE Framework Objective

Evaluate the QoS and SLA compliance of the offered Web Services

↪ across several Cloud Service Providers (CSPs).

Methodology

Quantify in a fair & stealth way the SaaS WS performance

↪ including scalability of the delivered Web Services.

Assess the claimed SLA and the corresponding QoS

↪ using a set of relevant performance metrics (e.g., response time).

Provide a multi-objective analysis of the gathered performance metrics

↪ to be able to classify cloud brokers.

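The multi-objective analysis step (ranking providers over several gathered metrics) can be illustrated with a basic weighted-sum MCDA. The weights, provider names, and figures below are illustrative assumptions; the thesis's actual MCDA method is not detailed on this slide:

```python
# Basic weighted-sum MCDA sketch for ranking CSPs on gathered metrics.
# Weights, provider names, and figures are illustrative assumptions; the
# thesis's actual multi-objective method is not detailed on this slide.

def rank_providers(providers, weights, higher_is_better):
    """Min-max normalize each metric across providers, weight, sum, rank."""
    names = list(providers)
    scores = {n: 0.0 for n in names}
    for metric, w in weights.items():
        vals = [providers[n][metric] for n in names]
        lo, hi = min(vals), max(vals)
        for n in names:
            x = (providers[n][metric] - lo) / (hi - lo) if hi > lo else 1.0
            if not higher_is_better[metric]:
                x = 1.0 - x            # lower-is-better metrics are inverted
            scores[n] += w * x
    return sorted(names, key=scores.get, reverse=True)

providers = {
    "CSP-A": {"throughput_rps": 1200, "response_time_ms": 180},
    "CSP-B": {"throughput_rps": 900,  "response_time_ms": 120},
}
weights = {"throughput_rps": 0.6, "response_time_ms": 0.4}
higher = {"throughput_rps": True, "response_time_ms": False}
print(rank_providers(providers, weights, higher))  # ['CSP-A', 'CSP-B']
```

The normalization step matters: without it, a metric measured in large units (requests/s) would silently dominate one measured in small units (ms).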

SLIDE 39

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE: Framework

[Framework diagram] PRESEnCE: on-demand evaluation of SaaS Web Services across multi-cloud providers, based on workload / SLA analysis, performance evaluation, SLA/QoS validation, predictive analytics, analysis, and ranking.

  • Stealth module: dynamic load adaptation
  • Modeling module: predictive monitoring
  • SLA checker module: virtual QoS aggregator
  • One agent per metric (Agent / metric 1 . . . Agent / metric k)

Example target services: Redis, Memcached, MongoDB, PostgreSQL, etc. Web Service A runs on Cloud Provider 1 . . . Cloud Provider n, exercised by clients cA1, cA2, . . . cAn and cB1, cB2, . . . cBm, with a [distributed] PRESEnCE client c' acting as Auditor.




SLIDE 47

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE: Framework

[Framework diagram recap: PRESEnCE on-demand evaluation of SaaS Web Services across multi-cloud providers, with Stealth Module (dynamic load adaptation), Modeling Module (monitoring/modeling), and SLA checker Module (virtual QoS aggregator)]

Modeling Module

↪ monitoring & modeling the Cloud services performance metrics

Stealth Module

↪ providing obfuscated and optimized benchmarking scenarios

SLA checker Module

↪ assessing & assuring SLA metrics


SLIDE 48

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Summary

1 Context & Motivations
2 PRESEnCE: PeRformance Evaluation of SErvices on the Cloud
    Modeling Module
    Stealth Module
    SLA Checker Module
3 Conclusion and Perspectives

SLIDE 49

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE: Modeling Module

[Framework diagram recap: PRESEnCE modules, with the Modeling Module (monitoring/modeling) highlighted]

PRESEnCE modeling module objectives

↪ Analysis of SaaS metrics
↪ Evaluating & monitoring SaaS Web Services
↪ Collecting data for the metrics
↪ Modeling the performance metrics

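The last two objectives, collecting metric data and then modeling it, can be illustrated with the simplest possible predictive model: a least-squares trend line fitted to a monitored series and extrapolated one step. The thesis's actual models are richer, so treat this purely as a stand-in:

```python
# Simplest stand-in for the "modeling the performance metrics" step: fit a
# least-squares trend line to a collected series and extrapolate one step.
# The thesis uses richer models; this is only an illustrative sketch.

def fit_line(ys):
    """Ordinary least squares for y = a*x + b over x = 0..n-1."""
    n = len(ys)
    mx, my = (n - 1) / 2.0, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in range(n))
    sxy = sum((x - mx) * (y - my) for x, y in zip(range(n), ys))
    a = sxy / sxx
    return a, my - a * mx

def predict_next(ys):
    """Extrapolate the fitted line to the next sample index."""
    a, b = fit_line(ys)
    return a * len(ys) + b

latency_ms = [100, 102, 104, 106, 108]   # synthetic monitored samples
print(predict_next(latency_ms))          # 110.0
```

Even this toy model captures the workflow: agents collect a time series per metric, a model is fitted to it, and the prediction feeds the later SLA-breach detection.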


SLIDE 51

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Cloud Critical KPIs

KPI                        | Metrics
Availability               | Response time, Up time, Down time
Scalability                | Avg. assigned resources, Avg. number of users, Capacity
Reliability                | Accuracy of Service, Fault Tolerance, Maturity
Efficiency                 | Utilization of Resource, Ratio of waiting time
Reusability                | Readability, Publicity, Coverage of variability
Composability              | Service Modularity, Service interoperability
Adaptability               | Completeness of Variant Set, Coverage of Variability
Usability                  | Operability, Attractiveness, Learnability
Elasticity                 | Suspend Time, Delete Time, Provision Time
Network and Communication  | Packet Loss Frequency, Connection Error Rate, Throughput, Latency
Security                   | Security Standards, Data Integrity, Sensitivity, Confidentiality
Cost                       | Total Cost, FLOP Cost (cent/FLOP, GFLOP)

[CC12] S. Sinung et al. Performance measurement of cloud computing services, 2012.
[TR10] Guiding Metrics, The cloud service industry's 10 most critical metrics, 2019.


SLIDE 52

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE: Agents

[Diagram: WS Performance Evaluation with Stealth module (dynamic load adaptation), Modeling module (predictive monitoring), SLA checker module (virtual QoS aggregator), and one agent per metric: Agent / metric 1, Agent / metric 2, . . . Agent / metric k]

Benchmark Tool    | Version   | Targeted SaaS Web Services
YCSB              | 0.12.0    | Redis, MongoDB, Memcached, DynamoDB, etc.
Memtier-Bench     | 1.2.8     | Redis, Memcached
Redis-Bench       | 2.4.2     | Redis
Twitter RPC-Perf  | 2.0.3-pre | Redis, Memcached, Apache
PgBench           | 9.4.12    | PostgreSQL, MySQL, SQL Server, Oracle DB
Apache AB         | 2.3       | Apache, Nginx, Jexus
HTTP Load         | 1         | Apache, Nginx, Jexus
Iperf             | v1, v3    | Iperf Server

[MIS18] A. Ibrahim et al. Monitoring and modelling the performance metrics of mobile cloud SaaS web services. Mobile Information Systems, 2018.
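Each agent wraps one of the benchmark tools above and must turn its textual report into metric values. A sketch for Apache AB; the sample output is a hand-written excerpt in ab's usual report format (an assumption for illustration), not a real run:

```python
import re

# Sketch of an agent's parsing step: extract metric values from the textual
# report a benchmark tool prints. Shown for Apache AB; the sample output is a
# hand-written excerpt in ab's usual report format, not a real run.

SAMPLE_AB_OUTPUT = """\
Concurrency Level:      10
Time taken for tests:   3.432 seconds
Requests per second:    2913.65 [#/sec] (mean)
Time per request:       3.432 [ms] (mean)
"""

def parse_ab(output):
    """Pull throughput and mean response time out of an ab report."""
    metrics = {}
    m = re.search(r"Requests per second:\s+([\d.]+)", output)
    if m:
        metrics["throughput_rps"] = float(m.group(1))
    m = re.search(r"Time per request:\s+([\d.]+)\s+\[ms\]\s+\(mean\)", output)
    if m:
        metrics["response_time_ms"] = float(m.group(1))
    return metrics

print(parse_ab(SAMPLE_AB_OUTPUT))
```

In a full agent, a similar parser would exist per tool, so that every benchmark feeds the same metric dictionary consumed by the modeling and SLA-checker modules.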


slide-62
SLIDE 62

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Identified Performance Metrics

Benchmark Bi INPUTs (parameters): #Transactions, #Requests, #Operations, #Records, #Fetches, #Parallel Clients, #Pipes, #Threads, Workload Size

Benchmark Bi OUTPUTs (measured metrics): Throughput, Latency, Read Latency, Update Latency, CleanUp Latency, Transfer Rate, Response Time, Miss Hits

Benchmarks Bi: YCSB, Memtier-Bench, Redis-Bench, Twitter RPCPerf, PgBench, Apache AB, HTTP Load, Iperf

Performance Metrics Coverage: each benchmark Bi covers a subset of the measured metrics above.

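The inputs/outputs mapping above lends itself to a small coverage matrix. As a sketch in Python: the benchmark-to-metric assignments below are illustrative assumptions, not the deck's actual coverage figure, which is shown as a slide graphic.

```python
# Illustrative benchmark-to-metrics coverage matrix (hypothetical assignments).
COVERAGE = {
    "YCSB":          {"Throughput", "Read Latency", "Update Latency", "CleanUp Latency"},
    "Memtier-Bench": {"Throughput", "Latency", "Miss Hits"},
    "Redis-Bench":   {"Throughput", "Latency"},
    "PgBench":       {"Throughput", "Response Time"},
    "Apache AB":     {"Transfer Rate", "Response Time"},
    "HTTP Load":     {"Throughput", "Latency"},
    "Iperf":         {"Transfer Rate"},
}

def benchmarks_for(metric):
    """Return the benchmarks whose outputs cover the given metric."""
    return sorted(b for b, metrics in COVERAGE.items() if metric in metrics)

def coverage_ratio(benchmark, all_metrics):
    """Fraction of the identified metrics measured by one benchmark."""
    return len(COVERAGE[benchmark] & set(all_metrics)) / len(all_metrics)
```

A query such as `benchmarks_for("Throughput")` then tells the auditor which agents can be scheduled to evaluate a given metric.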


slide-64
SLIDE 64

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Deployed Web Services

SaaS Service  | Type           | Version   | Used by
Redis         | NoSQL Database | 2.8.17    | GitHub, Twitter, Pinterest
MongoDB       | NoSQL Database | 3.4       | Google, Facebook, Cisco, eBay, Uber
Memcached     | NoSQL Database | 1.5.0     | Amazon, Netflix, Instagram, Slack, Dropbox
PostgreSQL    | SQL Database   | 9.4       | Nokia, BMW, Netflix, Skype, Apple
Apache        | HTTP           | 2.2.22.13 | LinkedIn, Slack, Accenture
Iperf server  | Network        | v1, v3    | –



slide-65
SLIDE 65

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Monitoring

[Architecture: the PRESENCE Auditor replays a normal trace (workload) through benchmarking scenarios executed by PRESENCE Agents against the services deployed at cloud service providers (CSP 1 … CSP n), and reports the monitoring metrics evaluations to the cloud service customers (CSC 1 … CSC n).]



slide-66
SLIDE 66

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE WS Monitoring Results

[Figure: throughput (ops/sec) for Redis, MongoDB and Memcached as the number of operations and the number of records grow from 2,000 to 10,000.]


slide-67
SLIDE 67

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE WS Monitoring Results

[Figure: update latency (10,000–50,000 µs) and read latency (20,000–60,000 µs) for Redis, MongoDB and Memcached as the number of operations and the number of records grow from 2,000 to 10,000.]


slide-68
SLIDE 68

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE WS Monitoring Results

[Figure: HTTP Load — normalized throughput (fetches/sec) and normalized latency versus the number of fetches (50,000–200,000) and the number of parallel clients (300–1,000); average latency and average connection latency (ms) versus parallel clients (200–1,000) and fetches (30,000–190,000).]


slide-69
SLIDE 69

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE WS Monitoring Results

[Figure: Pgbench — normalized TPS and normalized response time (latency) versus the number of transactions per client (2,000–10,000) and the number of parallel clients (20–80).]


slide-70
SLIDE 70

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE WS Performance Modeling

[Workflow: Collecting Data — the PRESENCE Agents monitor the deployed services at the CSPs with benchmarking scenarios driven by the normal trace (workload); Modeling — the collected monitoring metrics evaluations are fed to the Arena Input Analyser, which fits an appropriate distribution.]




slide-73
SLIDE 73

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Models Validation

Necessity to validate the PRESEnCE models generated from:

  ↪ the monitoring data from the PRESEnCE agents
  ↪ the data generated from the obtained models

Normality test (Kolmogorov-Smirnov test) on the data sets:

  • True → normal variables (mean comparisons, parametric tests): Student t-test (2 data sets), ANOVA — Analysis of Variance (> 2 data sets)
  • False → non-normal variables (median comparisons, non-parametric tests): Wilcoxon test (2 data sets), Friedman test (> 2 data sets)

[ITOR13] E. Alba & al. Parallel meta-heuristics: recent advances and new trends. ITOR 2013

Statistically significant: confidence level > 95% (p-value < 0.05)

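This validation workflow can be sketched in self-contained Python (a real analysis would use a statistics package): a Kolmogorov-Smirnov check of the sample against a normal fitted to it, then the parametric/non-parametric branch from the decision tree. The 1.36/√n cutoff is the classical 95% KS critical-value approximation; strictly, fitted parameters call for the Lilliefors correction, so this is a simplifying assumption.

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2) via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(sample):
    """KS distance between the empirical CDF and a normal fitted to the sample."""
    n = len(sample)
    mu = sum(sample) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in sample) / (n - 1))
    xs = sorted(sample)
    d = 0.0
    for i, x in enumerate(xs):
        f = normal_cdf(x, mu, sigma)
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d

def pick_tests(datasets):
    """Choose the comparison test per the decision tree (95% level)."""
    normal = all(ks_statistic(s) < 1.36 / math.sqrt(len(s)) for s in datasets)
    if normal:   # mean comparisons, parametric tests
        return "Student t-test" if len(datasets) == 2 else "ANOVA"
    # median comparisons, non-parametric tests
    return "Wilcoxon test" if len(datasets) == 2 else "Friedman test"

# Demo: a bell-shaped sample takes the parametric branch; a skewed one does not.
import random
random.seed(42)
normal_like = [random.gauss(0.0, 1.0) for _ in range(400)]
skewed = [-math.log(1 - (i + 0.5) / 400) for i in range(400)]  # exponential grid
choice_parametric = pick_tests([normal_like, normal_like])
choice_nonparametric = pick_tests([skewed, skewed])
```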

slide-74
SLIDE 74

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE WS Modeling Results

Ex: Redis SaaS Web service

Metric         | Distribution | Model Expression
Throughput     | Beta         | $-0.001 + 1 \cdot \mathrm{BETA}(3.63, 3.09)$ with $\beta = 3.63$, $\alpha = 3.09$, offset $= -0.001$
Latency Read   | Gamma        | $-0.001 + \mathrm{GAMM}(0.0846, 2.39)$ with $\beta = 0.0846$, $\alpha = 2.39$, offset $= -0.001$
Latency Update | Erlang       | $-0.001 + \mathrm{ERLA}(0.0733, 3)$ with $\beta = 0.0733$, $k = 3$, offset $= -0.001$

The fitted probability density functions:

$$\mathrm{BETA}(\beta, \alpha): \quad f(x) = \frac{x^{\beta-1}(1-x)^{\alpha-1}}{B(\beta, \alpha)} \;\text{ for } 0 < x < 1 \text{ (0 otherwise)}, \quad\text{where } B(\beta, \alpha) = \int_0^1 t^{\beta-1}(1-t)^{\alpha-1}\,dt$$

$$\mathrm{GAMM}(\beta, \alpha): \quad f(x) = \frac{\beta^{-\alpha}\, x^{\alpha-1}\, e^{-x/\beta}}{\Gamma(\alpha)} \;\text{ for } x > 0 \text{ (0 otherwise)}, \quad\text{where } \Gamma(\alpha) = \int_0^{\infty} t^{\alpha-1} e^{-t}\,dt$$

$$\mathrm{ERLA}(\beta, k): \quad f(x) = \frac{\beta^{-k}\, x^{k-1}\, e^{-x/\beta}}{(k-1)!} \;\text{ for } x > 0 \text{ (0 otherwise)}$$

[CLOUD18] A. Ibrahim & al. Performance metrics models for cloud SaaS web services, IEEE CLOUD 2018.
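The fitted expressions can be evaluated directly. A sketch implementing the three densities with Python's `math.gamma` (using B(β, α) = Γ(β)Γ(α)/Γ(β + α)) and sanity-checking that each integrates to ≈ 1, using the Redis parameters from the table; the −0.001 offset only shifts the support, so the unshifted densities are checked:

```python
import math

def beta_pdf(x, b, a):
    """BETA(beta, alpha) density on (0, 1); B(b, a) via the gamma function."""
    if not 0.0 < x < 1.0:
        return 0.0
    B = math.gamma(b) * math.gamma(a) / math.gamma(b + a)
    return x ** (b - 1) * (1 - x) ** (a - 1) / B

def gamma_pdf(x, b, a):
    """GAMM(beta, alpha) density: beta is the scale, alpha the shape."""
    if x <= 0.0:
        return 0.0
    return b ** (-a) * x ** (a - 1) * math.exp(-x / b) / math.gamma(a)

def erlang_pdf(x, b, k):
    """ERLA(beta, k): a Gamma density with integer shape k."""
    if x <= 0.0:
        return 0.0
    return b ** (-k) * x ** (k - 1) * math.exp(-x / b) / math.factorial(k - 1)

def integrate(f, lo, hi, n=20000):
    """Midpoint-rule numerical integration, enough for a sanity check."""
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

# Redis models from the table above.
throughput_mass = integrate(lambda x: beta_pdf(x, 3.63, 3.09), 0.0, 1.0)
read_latency_mass = integrate(lambda x: gamma_pdf(x, 0.0846, 2.39), 0.0, 5.0)
update_latency_mass = integrate(lambda x: erlang_pdf(x, 0.0733, 3), 0.0, 5.0)
```

Each mass comes out at ≈ 1, confirming the reconstructed densities are properly normalized.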

slide-75
SLIDE 75

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE WS Modeling Results

Ex: MongoDB SaaS Web service

Metric         | Distribution | Model Expression
Throughput     | Beta         | $-0.001 + 1 \cdot \mathrm{BETA}(3.65, 2.11)$ with $\beta = 3.65$, $\alpha = 2.11$, offset $= -0.001$
Latency Read   | Beta         | $-0.001 + 1 \cdot \mathrm{BETA}(1.6, 2.48)$ with $\beta = 1.6$, $\alpha = 2.48$, offset $= -0.001$
Latency Update | Erlang       | $-0.001 + \mathrm{ERLA}(0.0902, 2)$ with $\beta = 0.0902$, $k = 2$, offset $= -0.001$

The BETA and ERLA density functions are as defined for the Redis models above.


slide-76
SLIDE 76

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE WS Modeling Results


SaaS WS Performance Models Summary

19 models were generated:

  ↪ they represent the performance metrics of the SaaS Web Services

15 out of 19 models proved accurate:

  ↪ i.e., 78.9% of the analyzed models have a confidence level > 95%


slide-77
SLIDE 77

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Modeling Module Summary

PRESEnCE Modeling Module (monitoring/modeling) — on-demand performance evaluation of SaaS Web Services across multi-cloud providers, based on:

  1. Analysis of the performance metrics
  2. Evaluation & monitoring of the metrics
  3. Collection of data for the metrics
  4. Generation of distribution models


slide-79
SLIDE 79

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Summary

1 Context & Motivations
2 PRESEnCE: PeRformance Evaluation of SErvices on the Cloud (Modeling Module, Stealth Module, SLA Checker Module)
3 Conclusion and Perspectives



slide-82
SLIDE 82

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE: Stealth Module


PRESEnCE stealth module objectives — provide benchmark scenarios which ensure:

  ↪ accurate and stealth (i.e., obfuscated) testing
      the CSP should not adapt the allocated resources, e.g. to improve the evaluation results
  ↪ defeating potential benchmarking detection
      the testing is hidden as a "normal" client behaviour
  ↪ exploiting the PRESEnCE models previously generated


slide-83
SLIDE 83

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Stealth Module Overview

[Workflow: against the services under evaluation, the Stealth Module derives testing models (1 … n) from the fitted distributions and generates benchmarking scenarios executed by the PRESENCE Agents. An ORACLE compares the expected normal trace with the testing scenario trace: if the testing scenario is distinguishable (NO), it is not stealth and cannot be used for testing.]



slide-84
SLIDE 84

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Stealth Module Overview

[Workflow, continued: if the ORACLE cannot distinguish the testing scenario trace from the expected normal trace (YES), the scenario is stealth and can be used for testing.]




slide-86
SLIDE 86

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

The Stealth Problem

Given an [estimated] aggregated SaaS customer behaviour

  ↪ find the best benchmarking scenario matching this behaviour:
      a time-sequence of carefully selected benchmarks, with adaptation/optimisation of their input parameters

Ex: A possible solution (benchmarking scenario)

  ↪ based on the benchmark models $\hat{B}_i$, for the time period $[T_0 = 5, T_{end} = 340]$:

Time start t_start | Time end t_end | Benchmark B_i or $\hat{B}_i$ | Input parameters
5   | 120 | Bench1 | X1
45  | 220 | Bench2 | X2
130 | 280 | Bench3 | X3
190 | 340 | Bench4 | X4

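A scenario like the one in the table is just a time-indexed sequence of benchmark runs. A minimal Python sketch, using the table's values (the "X1" … "X4" strings stand in for the actual input-parameter sets):

```python
# A benchmarking scenario: a time-sequence of benchmark runs with input parameters.
SCENARIO = [
    (5, 120, "Bench1", "X1"),
    (45, 220, "Bench2", "X2"),
    (130, 280, "Bench3", "X3"),
    (190, 340, "Bench4", "X4"),
]

def active_benchmarks(scenario, t):
    """Benchmarks whose [t_start, t_end] window covers time t (runs may overlap)."""
    return [name for (t0, t1, name, _x) in scenario if t0 <= t <= t1]

def scenario_span(scenario):
    """Global time period [T0, Tend] covered by the scenario."""
    return (min(t0 for (t0, _, _, _) in scenario),
            max(t1 for (_, t1, _, _) in scenario))
```

For instance, at t = 200 three benchmarks overlap (Bench2, Bench3 and Bench4), which is what lets the aggregated testing trace approximate a continuous client workload.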


slide-87
SLIDE 87

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Stealth Problem Illustration

[Figure: throughput (2,000–6,000) over time for the normal trace versus the testing trace composed of Bench1{X1} … Bench4{X4}, with starting times 50–250 and termination times 100–350.]


slide-88
SLIDE 88

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Stealth Problem Illustration

For a given estimated benchmark $\hat{B}_i$:

  ↪ find the optimized input parameters $X_i^*$ minimizing the RSS distance

Obj: defeat the ORACLE detection scheme

Time start t_start | Time end t_end | Benchmark B_i or $\hat{B}_i$ | Input parameters
5   | 120 | Bench1 | X1 → X1*
45  | 220 | Bench2 | X2 → X2*
130 | 280 | Bench3 | X3 → X3*
190 | 340 | Bench4 | X4 → X4*

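The per-benchmark step reduces to: compute the residual sum of squares (RSS) between the normal trace and the trace a benchmark produces for given inputs, and keep the inputs with the lowest RSS. A toy sketch — the linear load model inside `simulate_trace` is purely illustrative, not the deck's benchmark model:

```python
def rss(normal_trace, testing_trace):
    """Residual sum of squares between two equal-length throughput traces."""
    return sum((y - y_hat) ** 2 for y, y_hat in zip(normal_trace, testing_trace))

def simulate_trace(load, n):
    """Hypothetical benchmark model: throughput grows linearly with the load input."""
    return [load * (1 + 0.01 * t) for t in range(n)]

def optimize_inputs(normal_trace, candidate_loads):
    """Grid search for the input value X_i* minimizing the RSS distance."""
    return min(candidate_loads,
               key=lambda load: rss(normal_trace,
                                    simulate_trace(load, len(normal_trace))))

target = simulate_trace(2000, 50)                        # pretend normal trace
best = optimize_inputs(target, [500, 1000, 2000, 4000])  # X* should recover 2000
```

In PRESEnCE the search over the input parameters is done by the metaheuristics presented later, not by an exhaustive grid.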

slide-89
SLIDE 89

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Stealth Problem Illustration

[Figure: distance minimization between the normal trace and the testing trace, minimizing the gap contributed by each of Bench1{X1} … Bench4{X4}.]


slide-90
SLIDE 90

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Stealth Problem Illustration

[Figure: after distance minimization, the optimized testing trace composed of Bench1{X1*} … Bench4{X4*} closely follows the normal trace.]


slide-91
SLIDE 91

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Stealth Problem Summary

Multi-layer optimisation — optimizing the benchmarking scenario:

  ↪ finding appropriate parameters for each benchmark $\hat{B}_i$:
      minimizing the RSS distance, the underlying detection heuristic of the Oracle
  ↪ for each ∆t:
      find the best estimated benchmark $\hat{B}_i$
  ↪ for the global time period:
      derive an optimized benchmarking scenario, i.e. an optimized sequence of benchmarks, incl. input parameters, start & end times

[Oracle flowchart: emulating the CSP view, the Oracle calculates the RSS distance between the expected normal usage model and the PRESENCE benchmark traces. If RSS < Threshold: the traces are non-distinguishable (the scenario is stealth); otherwise, PRESENCE optimizes the distance and tries again.]

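The Oracle loop reduces to a threshold test on the RSS distance; a sketch in Python — the traces and the threshold value are illustrative numbers, not measurements from the deck:

```python
def rss(normal, benchmark):
    """RSS distance between the expected normal trace and a benchmark trace."""
    return sum((y - y_hat) ** 2 for y, y_hat in zip(normal, benchmark))

def oracle_is_stealth(normal, benchmark, threshold):
    """Emulate the CSP view: traces are non-distinguishable iff RSS < threshold."""
    return rss(normal, benchmark) < threshold

normal = [2000, 2500, 3200, 4100, 5000]   # expected normal usage model
close = [2010, 2480, 3230, 4080, 5025]    # optimized testing trace
far = [6000, 6000, 6000, 6000, 6000]      # naive full-load benchmark

stealth_ok = oracle_is_stealth(normal, close, threshold=10_000)
stealth_bad = oracle_is_stealth(normal, far, threshold=10_000)
```

The optimized trace passes the Oracle while the naive constant-load benchmark is flagged, which is exactly the behaviour the multi-layer optimisation aims for.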


slide-93
SLIDE 93

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Stealth Problem Resolution

Solving approaches:

  • Exact methods — best fit, but scalability issues
  • Metaheuristics — approximate fit, scalable

[Wiley09] E-G. Talbi, Metaheuristics: From Design to Implementation, Wiley, 2009.

  • Proposed approaches:
      ↪ Genetic Algorithm (GA)
      ↪ Hybrid Algorithm (GA + ML)

slide-94
SLIDE 94

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Optimisation Model

Stealth problem objective

For each time period and each $\hat{B}_i$:

  ↪ optimize the set of inputs $X_i$
  ↪ Obj: defeat the oracle detection

Optimize the benchmark set over time $\{\hat{B}_i\}_t$:

  ↪ Note: benchmarks may overlap
  ↪ yet, without loss of generality: no overlap between benchmarks

$$\min_{\Theta \in \boldsymbol{\Theta}} \; \max_i \; \Delta(Y, \hat{Y}) \quad \text{where} \quad \Delta(Y, \hat{Y}) = \sum_{i,k}^{n} \bigl| Y - \hat{Y} \bigr| \qquad (1)$$

$$\min_{\Theta \in \boldsymbol{\Theta}} z \quad \text{s.t.} \quad z \ge |y_1 - \hat{y}_1(\Theta)|, \;\; z \ge |y_2 - \hat{y}_2(\Theta)|, \;\; z \ge |y_3 - \hat{y}_3(\Theta)|, \;\; \ldots, \;\; z \ge |y_i - \hat{y}_i(\Theta)| \qquad (2)$$

$$i \in \{1, 2, 3, \ldots, n\} \qquad (3)$$

where $\hat{y}_i(\Theta)$ → Prediction Model

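A compact, self-contained GA sketch for this minimax formulation, using the operator choices from the experimental setup later in the deck (bi-tournament selection, 2-point crossover at rate 0.8, uniform mutation at rate 0.01). The prediction model (active-window load aggregation), the parameter bounds and the target trace are toy assumptions:

```python
import random

random.seed(7)

WINDOWS = [(5, 120), (45, 220), (130, 280), (190, 340)]  # benchmark windows
TIMES = list(range(0, 341, 10))

def predict(theta, t):
    """Aggregated testing trace: each active benchmark contributes its load x_i."""
    return sum(x for (t0, t1), x in zip(WINDOWS, theta) if t0 <= t <= t1)

# Toy "expected normal trace": produced by a known parameter set, so the GA
# should be able to drive the minimax residual towards zero.
TRUE_THETA = [2000.0, 1500.0, 1800.0, 1200.0]
TARGET = [predict(TRUE_THETA, t) for t in TIMES]

def fitness(theta):
    """Minimax objective (eq. (1)-(3)): the largest residual |y_i - yhat_i(theta)|."""
    return max(abs(y - predict(theta, t)) for t, y in zip(TIMES, TARGET))

def bi_tournament(pop):
    a, b = random.sample(pop, 2)
    return a if fitness(a) < fitness(b) else b

def two_point_crossover(p1, p2, rate=0.8):
    if random.random() > rate:
        return p1[:]
    i, j = sorted(random.sample(range(1, len(p1)), 2))
    return p1[:i] + p2[i:j] + p1[j:]

def uniform_mutation(theta, rate=0.01):
    return [random.uniform(0.0, 3000.0) if random.random() < rate else x
            for x in theta]

def ga(pop_size=20, generations=200):
    pop = [[random.uniform(0.0, 3000.0) for _ in WINDOWS]
           for _ in range(pop_size)]
    best = min(pop, key=fitness)
    start = fitness(best)
    for _ in range(generations):
        pop = [uniform_mutation(two_point_crossover(bi_tournament(pop),
                                                    bi_tournament(pop)))
               for _ in range(pop_size)]
        challenger = min(pop, key=fitness)
        if fitness(challenger) < fitness(best):  # elitism: keep the incumbent
            best = challenger
    return start, best

start_fitness, best_theta = ga()
```

Elitism guarantees the returned residual never exceeds the initial population's best, mirroring the convergence indicator reported in the results slides.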


slide-95
SLIDE 95

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Application: FIFA Web Services

Deployed during one of the most popular worldwide events (the 1998 FIFA World Cup):

  ↪ squad and venue information, live matches, etc.

[NET] M. Arlitt & al. Workload Characterization of the 1998 World Cup Web Site. IEEE Network.



slide-99
SLIDE 99

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Experimental Setup

Application of PRESEnCE stealth module against FIFA WS traces

  ↪ comparison of the two proposed approaches (GA and Hybrid)

Parameter             | Configurations [1, 2, 3] | Configurations [4, 5, 6]
Expected normal trace | FIFA                     | FIFA
Number of generations | 1000                     | 10000
Population size       | [20, 50, 100]            | [20, 50, 100]
Number of evaluations | [50, 20, 10]             | [500, 200, 100]
Selection process     | Bi-Tournament            | Bi-Tournament
Crossover operator    | 2-point crossover        | 2-point crossover
Crossover rate        | 0.8                      | 0.8
Mutation operator     | uniform                  | uniform
Mutation rate         | 0.01                     | 0.01
Number of executions  | 30                       | 30

Performance Indicator for PRESEnCE stealth module ⇒ Convergence



SLIDE 102

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Results - Convergence

[Boxplots: Convergence (Residual), range 20000-30000, for Configs 1-6 under GA and Hybrid; 30 executions (Ex) each, evaluations Ev in {10, 20, 50, 100, 200, 500}; error bars show STD, StdErr and the 95% confidence interval; a dashed line marks the ORACLE threshold.]


Lower convergence (residual) is better

SLIDE 104

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

GA vs. Hybrid

Necessity to validate the produced benchmarking scenarios

↪ stealth results from the PRESEnCE GA and Hybrid approaches

DATA SETS → NORMALITY TEST (Kolmogorov-Smirnov test)
  True  → NORMAL VARIABLES (mean comparisons, parametric tests):
            2 data sets   → Student t-test
            > 2 data sets → ANOVA (Analysis of Variance)
  False → NON-NORMAL VARIABLES (median comparisons, non-parametric tests):
            2 data sets   → Wilcoxon test
            > 2 data sets → Friedman test

[ITOR13] E. Alba et al. Parallel meta-heuristics: recent advances and new trends. ITOR, 2013

Statistically significant: confidence level > 99% (p-value < 0.01)
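The decision tree above can be sketched with SciPy for the two-data-set case (a sketch; for two independent samples the "Wilcoxon test" is applied here as its rank-sum variant, and the z-scoring before the Kolmogorov-Smirnov check is an assumption about how the normality test is set up):

```python
from scipy import stats

def compare_samples(a, b, alpha=0.01):
    """Slide's decision tree for 2 data sets: Kolmogorov-Smirnov
    normality check, then Student t-test (normal) or rank-sum
    Wilcoxon test (non-normal). For > 2 data sets, ANOVA / Friedman
    would be used instead."""
    normal = all(
        stats.kstest(stats.zscore(x), "norm").pvalue > alpha for x in (a, b)
    )
    if normal:
        name, res = "t-test", stats.ttest_ind(a, b)
    else:
        name, res = "rank-sum", stats.ranksums(a, b)
    # significant at > 99% confidence when p-value < 0.01
    return name, float(res.pvalue), bool(res.pvalue < alpha)
```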


SLIDE 106

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Compare: GA & Hybrid Approach

                   GA                      Hybrid
Configuration      Mean       Std          Mean       Std          p-value
Configuration 1    29043.10   276.6091     25619.51   272.6941     4.114e-05
Configuration 2    29306.89   167.5696     26643.73   332.9266     1.455e-07
Configuration 3    30021.77   255.2807     25818.35   293.3159     2.2e-16
Configuration 4    28675.97    94.29658    20898.48    83.75769    2.2e-16
Configuration 5    29044.13   146.8183     25619.51   272.6941     2.2e-16
Configuration 6    29095.63   314.8841     26733.43   184.4088     2.2e-16

[Boxplots: Convergence (Residual) for Configs 1-3 (range 26000-31000) and Configs 4-6 (range 22000-30000), comparing GA against the hybrid R+GA approach.]


SLIDE 108

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Stealth Module Summary

PRESEnCE overview: workload/SLA analysis and performance evaluation, i.e. on-demand evaluation of SaaS Web Services across multi-cloud providers, based on an SLA/QoS validator, predictive analytics, analysis and ranking.

Modules: Stealth Module (dynamic load adaptation), Modeling Module (monitoring/modeling), SLA Checker Module (virtual QoS aggregator).

Stealth workflow: testing trace → black-box optimization → stealth testing; the ORACLE compares the result against the expected normal trace (YES → stealthy; NO → generate a new scenario).


SLIDE 110

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Summary

1 Context & Motivations
2 PRESEnCE: PeRformance Evaluation of SErvices on the Cloud
    Modeling Module
    Stealth Module
    SLA Checker Module
3 Conclusion and Perspectives


SLIDE 111

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE: SLA checker Module

SLA checker Module (virtual QoS aggregator)

PRESEnCE SLA checker module objectives

↪ analysis of the SLOs and QoS metrics
↪ SLA verification & assurance
↪ service-level-based ranking of the CSPs


SLIDE 112

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Service Level Objectives (SLO)

[UCC14] S. Wagle et al., SLA assured brokering (SAB) and CSP certification in cloud computing, UCC, 2014
[CCGrid16] A. Ibrahim et al., SLA Assurance between Cloud Services Providers and Cloud Customers, IEEE CCGrid, 2016


SLIDE 114

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

SLA Assurance

[Architecture: services deployed by CSP 1..n and consumed by CSC 1..n are placed under evaluation; the PRESEnCE auditor monitors and models the cloud services, derives testing models (1..n) from the normal workload trace, and runs stealth testing validated by the ORACLE; the real quality metrics are then checked against the SLAs for QoS assurance.]

[CLOUD16] A. Ibrahim et al., On SLA Assurance in Cloud Computing Data Centers, IEEE CLOUD, 2016
[CCGrid16] A. Ibrahim et al., SLA Assurance between Cloud Services Providers and Cloud Customers, IEEE CCGrid, 2016


SLIDE 115

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

SLA Breaches

Probability-based model for detecting SLA breaches

PRESEnCE monitoring of the SLA metrics

↪ identifies the possibility of a breach:
    Ex: read latency > average read latency
    Ex: throughput < average throughput

PRESEnCE modeling of the SLA metrics

↪ the models are used to find the probability P(x) of a breach

If the probability of a breach P(x) > threshold:

↪ assumed SLA violation
↪ legitimate request for penalization/compensation from the CSP: PRESEnCE enables an argued complaint
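A minimal sketch of such a probability-based breach check, assuming a Beta model fitted to a standardized metric as in the modeling module; the SLA bound and the policy threshold below are hypothetical, and the mapping of the model's BETA(β, α) argument order onto SciPy's beta(a, b) is an assumption:

```python
from scipy import stats

# Hypothetical SLA term on a standardized metric (both values assumed):
SLA_LATENCY_MAX = 0.8         # read latency must stay below this bound
BREACH_PROB_THRESHOLD = 0.05  # policy threshold on the breach probability

def breach_probability(beta_a, beta_b, sla_max):
    """P(metric > sla_max) under a Beta model fitted to the metric."""
    return stats.beta.sf(sla_max, beta_a, beta_b)  # survival function

def is_violation(beta_a, beta_b, sla_max=SLA_LATENCY_MAX,
                 threshold=BREACH_PROB_THRESHOLD):
    """Assume an SLA violation when the breach probability exceeds
    the policy threshold, as described on the slide."""
    return breach_probability(beta_a, beta_b, sla_max) > threshold
```

With the read-latency model of the appendix, BETA(1.64, 3.12), this yields a small but non-zero probability of exceeding the assumed bound.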


SLIDE 118

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

SLA Breaches

[Plots: Memcached read latency (standardized, 0.0-1.0) over 2000-10000 operations; the second panel marks the region constituting a breach of contract.]

[HONET-ICT17] A. Ibrahim et al., Law-as-a-service (LaaS): Enabling legal protection over a blockchain network, IEEE HONET-ICT, 2017


SLIDE 119

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

SLA Breaches

[Plots: Redis latency and MongoDB throughput (standardized, 0.0-1.0) over 2000-10000 operations, with the breach-of-contract regions annotated.]

[HONET-ICT17] A. Ibrahim et al., Law-as-a-service (LaaS): Enabling legal protection over a blockchain network, IEEE HONET-ICT, 2017


SLIDE 120

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

SLA Penalization

[Plots: Redis read latency (standardized, 2000-10000 operations) and MongoDB throughput (standardized 0.2-0.6, 6000-10000 operations), annotated with the breach, the probability of breach and the resulting penalization.]

[HONET-ICT17] A. Ibrahim et al., Law-as-a-service (LaaS): Enabling legal protection over a blockchain network, IEEE HONET-ICT, 2017


SLIDE 122

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

SLA Penalization

[Plot: Memcached latency (standardized, 0.0-1.0) over 2000-10000 operations, annotated with the breach in latency, the probability of breach and the penalization.]

Probability-based model for detecting SLA breaches

Penalization was issued for:

↪ Redis Web Services
↪ MongoDB Web Services

No penalization was issued for:

↪ Memcached Web Services

[HONET-ICT17] A. Ibrahim et al., Law-as-a-service (LaaS): Enabling legal protection over a blockchain network, IEEE HONET-ICT, 2017

SLIDE 123

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

CSPs Ranking

PRESEnCE: Service-level-based Ranking

Use QoS criteria to rank the Cloud Service Providers (CSPs), based on the assured SLAs vs. the PRESEnCE-assessed SLAs:

↪ comparative analysis between CSPs
↪ enabled by the PRESEnCE SaaS performance evaluation

[Architecture: the PRESEnCE auditor tests the services deployed by CSP 1..n (under evaluation), assures the SLAs against the measured QoS, and produces a QoS-based ranking of the CSPs: Test → Assure → Rank.]


SLIDE 124

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

CSPs Ranking

PRESEnCE: Service-level-based Ranking

Use QoS criteria to rank the Cloud Service Providers (CSPs), based on the assured SLAs vs. the PRESEnCE-assessed SLAs:

↪ comparative analysis between CSPs
↪ enabled by the PRESEnCE SaaS performance evaluation

MCDA objective:
  AHP:    pair-wise comparison of elements structured in a hierarchical relationship
  TOPSIS: criteria-based selection of the alternative that is closest to the ideal solution
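A compact TOPSIS sketch for ranking CSPs over QoS criteria; the weights and the two-criteria example below are illustrative assumptions, not the thesis' actual data:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) over criteria (columns).
    benefit[j] is True when larger is better (e.g. throughput),
    False when smaller is better (e.g. latency)."""
    ncols = len(weights)
    # 1. vector-normalize each column, then apply the criterion weight
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) or 1.0
             for j in range(ncols)]
    v = [[row[j] / norms[j] * weights[j] for j in range(ncols)]
         for row in matrix]
    # 2. ideal and anti-ideal points per criterion
    best = [max(col) if benefit[j] else min(col)
            for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col)
             for j, col in enumerate(zip(*v))]
    # 3. relative closeness to the ideal solution (higher = better rank)
    scores = []
    for row in v:
        d_best, d_worst = math.dist(row, best), math.dist(row, worst)
        scores.append(d_worst / (d_best + d_worst))
    return scores
```

For two hypothetical CSPs measured on (throughput, latency), a CSP that dominates on both criteria receives the higher closeness score.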

[CloudCom17] A. Ibrahim et al., Self-regulated MCDA: An autonomous brokerage-based approach for service provider ranking in the cloud, IEEE CloudCom, 2017


SLIDE 125

Conclusion and Perspectives

Summary

1 Context & Motivations
2 PRESEnCE: PeRformance Evaluation of SErvices on the Cloud
    Modeling Module
    Stealth Module
    SLA Checker Module
3 Conclusion and Perspectives


SLIDE 127

Conclusion and Perspectives

Conclusion

Summary: Cloud Computing (CC), SaaS and SLAs

CC: a successful and easy-to-use distributed computing paradigm

↪ typical service models: SaaS, PaaS and IaaS
↪ SaaS is the most used model in the cloud market

Cloud services are offered through Service Level Agreements (SLAs)

↪ SLAs define both the service levels and the service penalties

YET: there is no standard mechanism to verify and assure that the delivered services satisfy the signed SLA

↪ in an automatic way
↪ outside of the Cloud Service Provider's awareness

i.e., measuring the Quality of Service (QoS) accurately, without giving the CSP the chance to change the allocated resources


SLIDE 129

Conclusion and Perspectives

In this Ph.D Thesis

Extensive study of SaaS performance and SLA compliance

Proposed the PRESEnCE framework:

↪ for monitoring, modeling and evaluating the performance of cloud SaaS web services

All modules implemented and validated on real-case scenarios

Summary of Contributions

Multi-criteria evaluation of SaaS Web Services

↪ distributed PRESEnCE agents tied to KPIs
↪ accurate models of the Web Service metrics

Optimization model obfuscating the CSP evaluation campaigns

↪ using meta-heuristics and a hybrid GA/machine-learning approach

Probability-based model for SLA breach detection

Service-level-based ranking of the CSPs


SLIDE 130

Conclusion and Perspectives

Conclusion

[Architecture: distributed PRESEnCE clients (auditor c' plus clients cA1..cAn and cB1..cBm) evaluate Web Services (e.g. Redis, Memcached, MongoDB, PostgreSQL) deployed on Cloud Providers 1..n; the framework combines the Stealth module (dynamic load adaptation), the Modeling module (predictive monitoring) and the SLA checker module (virtual QoS aggregator), with one agent per metric.]


SLIDE 131

Conclusion and Perspectives

Conclusion

[Workflow: the PRESEnCE auditor derives a stealth testing trace from the normal workload trace, the ORACLE validates it, and the measured QoS of the services deployed by CSP 1..n (under evaluation) is used for SLA assurance and a QoS-based ranking: Test → Assure → Rank.]


SLIDE 134

Conclusion and Perspectives

Perspectives

PRESEnCE Framework

Extend testing to other cloud service models

↪ PaaS, IaaS

Evaluate and monitor other KPIs (not only performance):

↪ Security KPIs
↪ Cost KPIs

SLAs

Smart SLA

↪ smart detection of SLA breaches or violations
↪ instant penalization & compensation

⇒ Launch PRESEnCE as a commercial product


SLIDE 135

Conclusion and Perspectives

List of Publications I

Publication category   Quantity
Journal                1
Intl. Conferences      7
Book Chapters          1
Intl. Workshops        3
Technical Reports      1

Journal (1)

1. A. A.Z.A. Ibrahim, M. U. Wasim, S. Varrette, and P. Bouvry. PRESENCE: Monitoring and modelling the performance metrics of mobile cloud SaaS web services. Mobile Information Systems, 2018.

International Conferences (7)

2. A. A.Z.A. Ibrahim, D. Kliazovich, and P. Bouvry. Service Level Agreement Assurance between Cloud Services Providers and Cloud Customers. 16th IEEE/ACM Intl. Symp. on Cluster, Cloud, and Grid Computing (CCGrid 2016), pages 588-591, 2016.

3. A. A.Z.A. Ibrahim, D. Kliazovich, and P. Bouvry. On Service Level Agreement Assurance in Cloud Computing Data Centers. 9th IEEE Intl. Conf. on Cloud Computing (CLOUD 2016), pages 921-926, 2016.


SLIDE 136

Conclusion and Perspectives

List of Publications II

4. A. A.Z.A. Ibrahim, D. Kliazovich, P. Bouvry, and A. Oleksiak. Using virtual desktop infrastructure to improve power efficiency in Grinfy system. 8th IEEE Intl. Conf. on Cloud Computing Technology and Science (CloudCom'16), pages 85-89, 2016.

5. M. U. Wasim, A. A.Z.A. Ibrahim, P. Bouvry, and T. Limba. Law as a Service (LaaS): Enabling legal protection over a blockchain network. 14th Intl. Conf. on Smart Cities: Improving Quality of Life Using ICT & IoT (HONET-ICT'17), pages 110-114, 2017.

6. M. U. Wasim, A. A.Z.A. Ibrahim, P. Bouvry, and T. Limba. Self-regulated multi-criteria decision analysis: An autonomous brokerage-based approach for service provider ranking in the cloud. 9th IEEE Intl. Conf. on Cloud Computing Technology and Science (CloudCom'17), pages 33-40, 2017.

7. A. A.Z.A. Ibrahim, S. Varrette, and P. Bouvry. PRESEnCE: Toward a Novel Approach for Performance Evaluation of Mobile Cloud SaaS Web Services. 32nd IEEE Intl. Conf. on Information Networking (ICOIN 2018), 2018. Best Paper Award

8. A. A.Z.A. Ibrahim, M. U. Wasim, S. Varrette, and P. Bouvry. Performance metrics models for cloud SaaS web services. 11th IEEE Intl. Conf. on Cloud Computing (CLOUD'18), pages 936-940, 2018.

Book Chapters (1)

9. P. Bouvry, S. Varrette, M.U. Wasim, A. A.Z.A. Ibrahim, X. Besseron, and T.A. Trinh. Security, reliability and regulation compliance in ultrascale computing system. Chapter 3, Ultrascale Computing Systems, pages 65-83, 2018.

SLIDE 137

Conclusion and Perspectives

List of Publications III

Workshops (3)

10. A. A.Z.A. Ibrahim. PRESEnCE: A framework for monitoring, modelling and evaluating the performance of cloud SaaS web services. 48th Annual IEEE/IFIP Intl. Conf. on Dependable Systems and Networks Workshops (DSN-W'18), pages 83-86, 2018.

11. A. A.Z.A. Ibrahim, S. Varrette, and P. Bouvry. On verifying and assuring the cloud SLA by evaluating the performance of SaaS web services across multi-cloud providers. DSN-W'18, pages 69-70, 2018.

12. A. A.Z.A. Ibrahim, D. Kliazovich, P. Bouvry, and A. Oleksiak. Virtual desktop infrastructures: architecture, survey and green aspects proof of concept. 7th Intl. Green and Sustainable Computing Conf. (IGSC'16), pages 1-8, 2016.

Technical Reports (1)

13. A. A.Z.A. Ibrahim. Best Practices for Cloud Migration and Service Level Agreement Compliances. Dell EMC (internal), 2019.


SLIDE 138

Thank you for your attention

Abdallah Ali Zainelabden Abdallah Ibrahim
University of Luxembourg, Belval Campus
Maison du Nombre, 4th floor
2, avenue de l'Université
L-4365 Esch-sur-Alzette
mail: abdallah.ibrahim@uni.lu


SLIDE 139

Backup Slides / Appendix


SLIDE 140

Backup Slides / Appendix

PRESEnCE Monitoring: Memtier

[Plots: Memtier-Bench on Redis and Memcached over 2000-10000 requests: latency (s), throughput (ops/sec) and transfer rate (KB/sec), with server-restart events marked in each panel.]

SLIDE 141

Backup Slides / Appendix

PRESEnCE Models : ycsb

Memcached server (YCSB):

Metric             Distribution   Model expression
Throughput         Beta           -0.001 + 1 * BETA(4.41, 2.48)   (β = 4.41, α = 2.48, offset = -0.001)
Latency (read)     Beta           -0.001 + 1 * BETA(1.64, 3.12)   (β = 1.64, α = 3.12, offset = -0.001)
Latency (update)   Normal         NORM(0.311, 0.161)              (µ = 0.311, σ = 0.161)

Beta density: f(x) = x^(β-1) (1-x)^(α-1) / B(β, α) for 0 < x < 1, and 0 otherwise,
where B(β, α) = ∫₀¹ t^(β-1) (1-t)^(α-1) dt is the complete beta function.
Normal density: f(x) = (1 / (σ √(2π))) e^(-(x-µ)² / (2σ²)) for all real x.
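These fitted models can be instantiated with scipy.stats for sampling or for breach-probability computations; mapping the table's BETA(β, α) argument order onto SciPy's beta(a, b), and dropping the small -0.001 offset, are simplifying assumptions:

```python
from scipy import stats

# Standardized-metric models from the table above (offset dropped;
# BETA(beta, alpha) mapped onto scipy's beta(a, b) as an assumption).
MODELS = {
    "throughput":     stats.beta(4.41, 2.48),
    "read_latency":   stats.beta(1.64, 3.12),
    "update_latency": stats.norm(loc=0.311, scale=0.161),
}

def expected_value(metric):
    """Model mean, usable as the 'average' reference in breach checks."""
    return MODELS[metric].mean()
```

For instance, the Beta model mean a / (a + b) gives the expected standardized throughput.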
SLIDE 142

Backup Slides / Appendix

PRESEnCE Models : pgBench

PostgreSQL server (pgBench):

Metric                Distribution   Model expression
Throughput, Latency   Log-Normal     -0.001 + LOGN(0.212, 0.202)   (µ = 0.212, σ = 0.202, offset = -0.001)

Log-normal density: f(x) = (1 / (σ x √(2π))) e^(-(ln(x)-µ)² / (2σ²)) for x > 0, and 0 otherwise.
SLIDE 143

Backup Slides / Appendix

PRESEnCE Models : HTTP load

Apache server (HTTP load):

Metric                Distribution   Model expression
Throughput, Latency   Beta           -0.001 + 1 * BETA(1.55, 3.46)   (β = 1.55, α = 3.46, offset = -0.001)

Beta density: f(x) = x^(β-1) (1-x)^(α-1) / B(β, α) for 0 < x < 1, and 0 otherwise,
where B(β, α) = ∫₀¹ t^(β-1) (1-t)^(α-1) dt is the complete beta function.
slide-144
SLIDE 144

Backup Slides / Appendix

PRESEnCE Sensitivity Analysis

[Figure: sensitivity analysis: main effects and interactions of No_oper, No_rec and Threads on Throughput, Read Latency, Update Latency and Clean-up Latency]


slide-145
SLIDE 145

Backup Slides / Appendix

PRESEnCE: Testing Campaign

Start time  Disruption after  Benchmark/Service      # Operations  # Records  # Threads
4           79                ycsb-redis             100           1000       2
13          69                ycsb-Mongodb           100           500        2
23          27                ycsb-memcached         200           500        2
29          61                memtier-redis          200           300        2
35          19                ycsb-Mongodb           500           1000       2
53          75                ycsb-redis             500           300        2
62          29                memtier-Memcached      200           1000       2
65          23                twitter rpc-redis      100           300        2
68          30                twitter rpc-memcached  500           500        2

[Figure: throughput (operations/s) vs. starting time and latency (ms) vs. termination time for ycsb-redis, ycsb-memcached and ycsb-mongodb]


slide-146
SLIDE 146

Backup Slides / Appendix

Generated Workload: dstat CPU

[Figure: dstat CPU usage (%) over time (s), comparing the test load with the normal load]


slide-147
SLIDE 147

Backup Slides / Appendix

PRESEnCE: The Oracle

The Stealth module's Oracle compares two traces: the testing trace model, produced by the benchmarking scenario / testing campaign, and the expected normal trace model (the behaviour of the CSP of SaaS web services), driven by black-box optimization (meta-heuristics).

Oracle question: is it stealth?
֒→ NO: the two traces are distinguishable from the expected usage ⇒ more optimization
֒→ YES: non-distinguishable


PRESEnCE modules: Modeling Module (monitoring/modeling), Stealth Module (dynamic load adaptation), SLA Checker Module (virtual QoS aggregator).

slide-148
SLIDE 148

Backup Slides / Appendix

PRESEnCE: The Oracle

Oracle loop, emulating the CSP view:
֒→ run the PRESENCE benchmarks alongside the expected normal-usage model
֒→ calculate the distance (RSS) between the normal and benchmark traces
֒→ IF RSS < threshold: the traces are non-distinguishable
֒→ otherwise: optimize the distance and repeat
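The Oracle's decision step can be sketched as follows (the threshold and function names are illustrative, not taken from the PRESEnCE implementation):

```python
def rss(actual, predicted):
    """Residual sum of squares between two equally long traces."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted))

def oracle(normal_trace, benchmark_trace, threshold):
    """Emulate the CSP view: the benchmark is 'stealth' (non-distinguishable)
    when its distance to the expected normal-usage trace is below threshold."""
    return rss(normal_trace, benchmark_trace) < threshold
```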


slide-149
SLIDE 149

Backup Slides / Appendix

Curve-Fitting Problem

Curve-fitting problem

The main objective

֒→ find and construct a curve or a mathematical model ֒→ that best fits a series of raw data points, possibly subject to constraints [CUP12]

Optimization problem

֒ → minimizing the distance between two curves

Fitting a curve can involve:
1. interpolation,
2. smoothing, or
3. regression: Yi = f(Xi, β) + ei
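The regression form Yi = f(Xi, β) + ei can be illustrated with a minimal least-squares fit; the linear model and the helper name below are our own illustration, not the thesis' fitting procedure:

```python
def fit_line(xs, ys):
    """Closed-form least-squares fit of y_i = b0 + b1*x_i + e_i,
    choosing (b0, b1) to minimise the residual sum of squares."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    b0 = my - b1 * mx
    return b0, b1
```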

[CUP12] P. George & al. Numerical methods of curve fitting. Cambridge University Press, 2012


slide-150
SLIDE 150

Backup Slides / Appendix

MinMax problem

Minimax

A rule in decision theory for minimizing the maximum loss

֒ → to define robust solutions [OR09]

Used in decision theory, game theory, statistics and optimization

֒→ for minimizing the possible loss in a worst-case (maximum-loss) scenario ֒→ constructing solutions with the best possible performance in the worst case

For solution x ∈ X and scenario s ∈ S:  min_{x∈X} max_{s∈S} F(x, s)
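A direct sketch of the min-max rule over finite sets of solutions and scenarios (illustrative helper, not part of PRESEnCE):

```python
def minimax(solutions, scenarios, loss):
    """Return the solution x minimising the worst-case loss:
    argmin over x in X of (max over s in S of loss(x, s))."""
    return min(solutions, key=lambda x: max(loss(x, s) for s in scenarios))
```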

[OR09] H. Aissi & al. Min-max and min-max regret versions of combinatorial optimization problems: A survey. Journal of operational research, 2009


slide-151
SLIDE 151

Backup Slides / Appendix

Curve Definition

Y: set of normal trace data, Y = {y1, y2, y3, ..., yn}
Ŷ: set of predicted data, Ŷ = {ŷ1, ŷ2, ŷ3, ..., ŷn}
X: set of variables (PRESEnCE input parameters), X = {x1, x2, x3, ..., xn}
Θ: set of coefficients, Θ = {θ1, θ2, θ3, ..., θm}


slide-152
SLIDE 152

Backup Slides / Appendix

Meta-heuristics Algorithm

Genetic Algorithm (GA)

Genetic Algorithms (GAs) are among the most widely used nature-inspired algorithms in black-box optimization. A GA run proceeds through:

֒→ population initialization,
֒→ fitness-function evaluation and selection of parents,
֒→ crossover and mutation, yielding a new population of solutions.

GA: Convergence

how the residual between the two curves is reduced in each generation
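The steps above can be sketched as a toy GA minimising the residual to a target trace; all parameter values and names here are illustrative, not the PRESEnCE implementation (which uses the settings tuned later in these slides):

```python
import random

def evolve(target, pop_size=20, generations=50, cxpb=0.8, mutpb=0.01, seed=1):
    """Toy GA: initialise a population of real-valued vectors, select parents
    by binary tournament, apply one-point crossover and uniform mutation, and
    track the residual (sum of squared differences) to the target trace.
    Assumes len(target) >= 2."""
    rng = random.Random(seed)
    n = len(target)

    def residual(ind):
        return sum((a - b) ** 2 for a, b in zip(ind, target))

    pop = [[rng.random() for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        # binary-tournament selection of parents
        parents = [min(rng.sample(pop, 2), key=residual) for _ in range(pop_size)]
        nxt = []
        for i in range(0, pop_size, 2):
            a, b = parents[i][:], parents[(i + 1) % pop_size][:]
            if rng.random() < cxpb:          # one-point crossover
                cut = rng.randrange(1, n)
                a[cut:], b[cut:] = b[cut:], a[cut:]
            for ind in (a, b):               # uniform mutation
                for j in range(n):
                    if rng.random() < mutpb:
                        ind[j] = rng.random()
            nxt.extend([a, b])
        pop = nxt[:pop_size]
    best = min(pop, key=residual)
    return best, residual(best)
```

Plotting the residual of the best individual per generation gives exactly the convergence curves shown in the following slides.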


slide-153
SLIDE 153

Backup Slides / Appendix

Learning-heuristics Algorithm

Regression

Improve the performance by:

֒→ feeding the GA with high-fitness solutions obtained from the regression

RSS: the sum of squared differences between the actual and the predicted data

[Figure: Gaussian Process regression (GPR ± STD) fitted to the testing-trace throughput over time; RSS = 0.1302359]


slide-154
SLIDE 154

Backup Slides / Appendix

Distance (RSS)

[Figure: residual sum of squares (RSS) vs. GPR iterations]


slide-155
SLIDE 155

Backup Slides / Appendix

Stealth: Methodology

⇒ Proposed approaches for the Stealth module:
- Instances: single-objective evolutionary algorithms
- Configurations: {1, 2, 3, 4, 5, 6, 7, 8, 9}
- Meta-heuristics algorithm: Genetic Algorithm (GA)
- Learning-heuristics algorithm: hybrid algorithm
- Performance indicator: convergence


slide-156
SLIDE 156

Backup Slides / Appendix

GA Convergence: Means configs 1:6

[Figure: GA convergence, mean residual vs. number of evaluations, configurations 1–6]


slide-157
SLIDE 157

Backup Slides / Appendix

GA Convergence: STD configs 1:6

[Figure: GA convergence, residual STD vs. number of evaluations, configurations 1–6]


slide-158
SLIDE 158

Backup Slides / Appendix

Hybrid Convergence: Means configs 1:6

[Figure: hybrid-algorithm convergence, mean residual vs. number of evaluations, configurations 1–6]


slide-159
SLIDE 159

Backup Slides / Appendix

Hybrid Convergence: STD configs 1:6

[Figure: hybrid-algorithm convergence, residual STD vs. number of evaluations, configurations 1–6]


slide-160
SLIDE 160

Backup Slides / Appendix

Parameters Tuning: Means & STD, configuration 7

[Figure: mean and STD convergence (residual) vs. evaluations for configuration 7, CXPB ∈ {0.5, 0.6, 0.7, 0.8}]


slide-161
SLIDE 161

Backup Slides / Appendix

Parameters Tuning: Means & STD, configuration 8

[Figure: mean and STD convergence (residual) vs. evaluations for configuration 8, CXPB ∈ {0.5, 0.6, 0.7, 0.8}]


slide-162
SLIDE 162

Backup Slides / Appendix

Parameters Tuning: Means & STD, configuration 9

[Figure: mean and STD convergence (residual) vs. evaluations for configuration 9, CXPB ∈ {0.5, 0.6, 0.7, 0.8}]


slide-163
SLIDE 163

Backup Slides / Appendix

Parameters Tuning

                       Configuration 7       Configuration 8       Configuration 9
Expected normal trace  FIFA                  FIFA                  FIFA
Number of generations  10000                 10000                 10000
Population size        20                    50                    100
Number of evaluations  500                   200                   10
Selection process      Bi-Tour               Bi-Tour               Bi-Tour
Crossover operator     2-point               2-point               2-point
Crossover rate         [0.5, 0.6, 0.7, 0.8]  [0.5, 0.6, 0.7, 0.8]  [0.5, 0.6, 0.7, 0.8]
Mutation operator      uniform               uniform               uniform
Mutation rate          0.001                 0.001                 0.001
Number of executions   30                    30                    30


slide-164
SLIDE 164

Backup Slides / Appendix

Parameters Tuning


Performance Indicator for PRESEnCE stealth module ⇒ Convergence


slide-165
SLIDE 165

Backup Slides / Appendix

Tuning: Results

[Figure: convergence (residual) per crossover rate CXPB ∈ {0.5, 0.6, 0.7, 0.8} for configurations 7, 8 and 9; Ex = 30 executions each, Ev = 100/200/500 evaluations; error bars show STD, StdErr and the 95% confidence interval]


Lower Convergence is better


slide-167
SLIDE 167

Backup Slides / Appendix

Tuning: Results

                 CXPB = 0.8           CXPB = 0.7           CXPB = 0.6           CXPB = 0.5
Configurations   Mean      Std        Mean      Std        Mean      Std        Mean      Std        p-value
Configuration 7  28675.97  94.29658   29032.91  103.4181   28833.34  94.97001   29193.87  78.38976   2.2e−16
Configuration 8  29044.13  146.8183   29323.41  114.111    29182.42  120.0906   29570.19  110.5912   2.2e−16
Configuration 9  29095.63  314.8841   29771.38  167.7833   30179.23  242.6208   30528.43  214.5603   2.2e−16


slide-168
SLIDE 168

Backup Slides / Appendix

Tuning: Results


[Figure: boxplots of convergence (residual) per crossover rate CXPB ∈ {0.5, 0.6, 0.7, 0.8} for configurations 7, 8 and 9]


slide-169
SLIDE 169

Backup Slides / Appendix

Tuning: Results


Crossover Rate ⇒ 0.8
