Queuing Analysis – Gregory (Grisha) Chockler, Zinovi Rabinovich, Ittai Abraham (PowerPoint presentation)


SLIDE 1

Queuing Analysis

Gregory (Grisha) Chockler, Zinovi Rabinovich, Ittai Abraham
Operating Systems Course, Spring 2003
Hebrew University of Jerusalem, Israel

Queuing Analysis – p.1

SLIDE 2

Plan

  • Review of basic probability.
  • Markov processes and Poisson processes.
  • Models in Queuing Theory.
  • Analysis of the M/M/1 model.
  • Analysis of the M/M/1/b model.
  • Proof of Poisson process definition equivalence.

SLIDE 3

Random Variables

  • A discrete random variable X can take values in some countable set S = {x1, x2, x3, . . .}.
  • A function f : R → [0, 1] such that Σ_{x∈S} f(x) = 1 and ∀x ∉ S, f(x) = 0 defines the probability function of X: Pr[X = x] = f(x).
  • A continuous random variable X can take values in some intervals of numbers.
  • A function f : R → R⁺ such that ∫_{−∞}^{∞} f(x) dx = 1 defines the probability density function of X. Thus Pr[X ≥ x] = ∫_{x}^{∞} f(t) dt.

SLIDE 4

Conditional Probability, Independence and Sums

  • The conditional probability that A occurs given that B occurs is defined to be Pr[A|B] = Pr[A ∩ B] / Pr[B].
  • Bayes’ theorem: Pr[A|B] = Pr[B|A] Pr[A] / Pr[B].
  • Events A and B are independent if Pr[A ∩ B] = Pr[A] Pr[B].
  • X, Y are independent if X ∈ x and Y ∈ y are independent events for all x, y.
  • If X, Y are independent with probability functions fX, fY, then for the sum Z = X + Y we have fZ(z) = Σ_{x} fX(x) fY(z − x) in the discrete case and fZ(z) = ∫_{−∞}^{∞} fX(x) fY(z − x) dx in the continuous case.
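As a quick sanity check (not part of the original slides), the discrete convolution formula can be evaluated directly; the two-dice distribution below is our own example, not one from the deck.

```python
# Probability function of Z = X + Y for independent discrete X, Y
# via the convolution f_Z(z) = sum_x f_X(x) * f_Y(z - x).
# Example distribution (ours): two fair six-sided dice.

def convolve(fX, fY):
    """fX, fY: dicts mapping value -> probability. Returns f_Z for Z = X + Y."""
    fZ = {}
    for x, px in fX.items():
        for y, py in fY.items():
            fZ[x + y] = fZ.get(x + y, 0.0) + px * py
    return fZ

die = {i: 1 / 6 for i in range(1, 7)}
fZ = convolve(die, die)

total = sum(fZ.values())   # should be 1
p7 = fZ[7]                 # should be 6/36, the most likely sum
```

The same nested loop works for any pair of finite distributions; the continuous case replaces the inner sum with an integral.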

SLIDE 5

Expectation

  • The expectation of a random variable X with probability function f(x) is defined as E(X) = Σ_{x : f(x)>0} x f(x) in the discrete case, and E(X) = ∫_{−∞}^{∞} x f(x) dx in the continuous case.
  • Linearity of expectation: E(aX + bY) = aE(X) + bE(Y).
  • Example: the exponential distribution with parameter θ has f(x) = θe^{−θx} for x ≥ 0, and f(x) = 0 for x < 0.
  • Recall integration by parts: ∫ v du = vu − ∫ u dv; take v = x and du = θe^{−θx} dx, so u = −e^{−θx}.
  • E(X) = ∫_{0}^{∞} x θe^{−θx} dx = [−x e^{−θx}]_{0}^{∞} + ∫_{0}^{∞} e^{−θx} dx = 0 + [−(1/θ)e^{−θx}]_{0}^{∞} = 1/θ.
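The integration-by-parts result E(X) = 1/θ can be checked numerically (our own check, not from the slides); θ = 2 and the grid size are arbitrary choices.

```python
# Numeric check of E(X) = 1/theta for X ~ exp(theta), f(x) = theta*e^(-theta*x).
# Simple Riemann sum over [0, 50/theta]; the tail beyond that is negligible.
import math

theta = 2.0
dx = 1e-4
upper = 50 / theta
n = int(upper / dx)
mean = sum(x * theta * math.exp(-theta * x) * dx
           for x in (i * dx for i in range(n)))
# mean should be close to 1/theta = 0.5
```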

SLIDE 6

Variance

  • Variance is defined as var(X) = E[(X − E(X))²] = E(X²) − E(X)² (if it converges).
  • Standard deviation: σX = √var(X); coefficient of variation: cv(X) = σX / E(X).
  • Covariance: cov(X, Y) = E(XY) − E(X)E(Y); cov(X, X) = var(X); if X and Y are independent then cov(X, Y) = 0.
  • var(aX) = a² var(X), and var(X + Y) = var(X) + var(Y) + 2 cov(X, Y).
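The scaling and additivity identities above can be verified on a tiny hand-built distribution (our own example, not from the slides); a fair coin and the factor a = 3 are arbitrary choices.

```python
# Check var(aX) = a^2 var(X) and, for independent X and Y,
# var(X + Y) = var(X) + var(Y), using E(X^2) - E(X)^2.

def moments(dist):
    """dist: dict value -> probability. Returns (E(X), var(X))."""
    e = sum(v * p for v, p in dist.items())
    e2 = sum(v * v * p for v, p in dist.items())
    return e, e2 - e * e

X = {0: 0.5, 1: 0.5}                 # fair coin
a = 3
aX = {a * v: p for v, p in X.items()}

_, varX = moments(X)                 # 0.25 for a fair coin
_, var_aX = moments(aX)

# Distribution of the sum of two independent coins.
Z = {}
for x, px in X.items():
    for y, py in X.items():
        Z[x + y] = Z.get(x + y, 0.0) + px * py
_, varZ = moments(Z)
```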

SLIDE 7

Markov Processes

  • A stochastic process is a series of random variables X1, X2, . . . .
  • A Markov process is a stochastic process in which the probability of Xt is determined only by Xt−1.
  • Formally, Pr(Xt = it | Xt−1 = it−1) = Pr(Xt = it | Xt−1 = it−1, Xt−2 = it−2, . . . , X1 = i1).
  • A Markov process can be viewed as a weighted directed graph in which ∀u ∈ V, Σ_{(u→v)∈E} d(u, v) = 1.

SLIDE 8

Example of a Markov Process

[Figure: a five-state chain over A, B, C, D, E with edge weights 1/2, 1/2, 1/2, 1/2, 1/2, 1/2, 1, 1/2, 1/3, 1/3, 1/3.]

SLIDE 9

Stationary Distribution

  • Under certain conditions, a Markov process (V, E, d) has a stationary distribution π: lim_{t→∞} Xt ∼ π.
  • If Xt ∼ π then Xt+1 ∼ π (with Σ_{v∈V} πv = 1).
  • Formally, Pr(Xt+1 = v) = Σ_{(u→v)} Pr(Xt = u) d(u, v), so πv = Σ_{(u→v)} πu d(u, v).
  • Balance property: given a cut (U, Ū), the flow of probability π across the cut is balanced.
  • Formally, Σ_{(u→v) : u∈U, v∈Ū} πu d(u, v) = Σ_{(v→u) : u∈U, v∈Ū} πv d(v, u).
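A stationary distribution can be found by repeatedly applying π ← πP (power iteration). The two-state chain below is our own toy example, not the one drawn on the slides; its transition probabilities are arbitrary.

```python
# Power iteration to a stationary distribution, then a balance check
# across the cut ({0}, {1}): pi_0 * P[0][1] should equal pi_1 * P[1][0].
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = [1.0, 0.0]                      # any starting distribution works
for _ in range(1000):
    pi = [sum(pi[u] * P[u][v] for u in range(2)) for v in range(2)]

# Exact stationary distribution for this chain: (5/6, 1/6).
flow_out = pi[0] * P[0][1]           # probability flow 0 -> 1
flow_in = pi[1] * P[1][0]            # probability flow 1 -> 0
```

The balance equality is exactly the cut property stated on the slide, specialized to a single-state cut.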

SLIDE 10

Example of a Stationary Distribution

[Figure: the chain from slide 8 annotated with the stationary distribution A: 1/4, B: 1/4, C: 1/6, D: 1/6, E: 1/6.]

Balance check across a cut: (1/6)(1/2) + (1/6)(1/2) + (1/6)(1/2) = (1/4)(1/3) + (1/4)(1/3) + (1/4)(1/3) = 1/4.

SLIDE 11

Poisson Process

  • Consider a stochastic process {N(t) | t ≥ 0} with N(0) = 0, where N(t) counts the number of occurrences of an event; thus N(t) ∈ ℕ.
  • N(t) = n means that there have been n occurrences by time t. If N(t) = n now, from there it can only go to state n + 1. This happens as soon as the next event takes place.
  • Let Tn denote the time the process spends in state n.
  • Poisson process: the Tn are exponentially distributed with the same mean, say 1/θ, for all n, and all the Tn’s are independent.

SLIDE 12

Poisson Process - Equivalent Definitions

  • Poisson process 1: All Tn ∼ exp(θ), and all the Tn’s are independent.
  • Poisson process 2: N(t) ∼ Poisson(θt), i.e. Pr[N(t) = k] = ((θt)^k / k!) e^{−θt}, and the numbers of events in non-overlapping intervals are independent.
  • Poisson process 3: N(0) = 0, and for any k, as t → 0:
    Pr[N(k + t) = N(k)] = 1 − θt + o(t),
    Pr[N(k + t) = 1 + N(k)] = θt + o(t),
    Pr[N(k + t) ≥ 2 + N(k)] = o(t),
    and increments in non-overlapping intervals are independent. (Recall f(x) = o(g(x)) means lim_{x→0} f(x)/g(x) = 0.)
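The step from definition 1 to definition 2 can be seen empirically (our own check, not from the slides): generate exponential(θ) gaps and count arrivals in [0, t]; the mean count should be near θt. The values θ = 3, t = 2, and the trial count are arbitrary.

```python
# Simulate a Poisson process from definition 1 (i.i.d. exponential gaps)
# and check that the count in [0, t] matches definition 2 in the mean.
import random
random.seed(0)

theta, t, trials = 3.0, 2.0, 20000
counts = []
for _ in range(trials):
    clock, n = 0.0, 0
    while True:
        clock += random.expovariate(theta)   # T_n ~ exp(theta)
        if clock > t:
            break
        n += 1
    counts.append(n)
mean_count = sum(counts) / trials            # should be near theta*t = 6
```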

SLIDE 13

Queuing Models

Things we need to decide about our model:

  • Job arrival, job service time.
  • Number of processors and their qualities.
  • Queue size, dispatch discipline, number of queues.
  • Kendall’s notation: B/D/n/k/dd, where B = arrival, D = service time, n = processors, k = queue length, dd = dispatch discipline.

SLIDE 14

The M/M/1 model

  • Single FIFO queue, single processor.
  • Jobs arrive at rate λ, so job births form a Poisson process with parameter λ.
  • The service time of a job is exponentially distributed with parameter µ, so job deaths form a Poisson process with parameter µ.
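A minimal discrete-event simulation of this model (our own sketch, not from the slides) time-averages the number of jobs in the system; the analysis later in the deck predicts ρ/(1 − ρ). The rates λ = 1, µ = 2 (so ρ = 0.5) and the horizon are arbitrary test values.

```python
# M/M/1 simulation: Poisson arrivals at rate lam, exponential service at
# rate mu, one FIFO queue. We accumulate the integral of n(t) dt and
# divide by elapsed time to get the average number of jobs in the system.
import random
random.seed(1)

lam, mu, horizon = 1.0, 2.0, 200000.0
clock, n_jobs, area = 0.0, 0, 0.0
next_arrival = random.expovariate(lam)
next_departure = float("inf")        # server idle: no departure scheduled
while clock < horizon:
    t = min(next_arrival, next_departure)
    area += n_jobs * (t - clock)     # integral of n(t) over [clock, t]
    clock = t
    if t == next_arrival:
        n_jobs += 1
        next_arrival = clock + random.expovariate(lam)
        if n_jobs == 1:              # server was idle: start service now
            next_departure = clock + random.expovariate(mu)
    else:
        n_jobs -= 1
        next_departure = (clock + random.expovariate(mu)
                          if n_jobs > 0 else float("inf"))
avg_jobs = area / clock              # should be near rho/(1-rho) = 1.0
```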

SLIDE 15

Analysis of M/M/1

  • Examine the time scale at small intervals δ, 2δ, 3δ, . . . .
  • Analyze as δ → 0, so omit o(δ) factors.
  • Apply the balance property: Σ_{(u→v) : u∈U, v∈Ū} πu d(u, v) = Σ_{(v→u) : u∈U, v∈Ū} πv d(v, u).

SLIDE 16

Analysis of M/M/1 (part 2)

  • Let p be the stationary distribution. For any i, pi is the probability of having i jobs in the system at the steady state.
  • By the balance property, at the steady state pj(µδ + o(δ)) = pj−1(λδ + o(δ)).
  • Taking δ → 0: p1 = lim_{δ→0} ((λδ + o(δ)) / (µδ + o(δ))) p0 = lim_{δ→0} ((λ + o(δ)/δ) / (µ + o(δ)/δ)) p0 = (λ/µ) p0.
  • In the same way, for every j > 0, pj = (λ/µ) pj−1.
  • Denote the traffic intensity ρ = λ/µ.
  • Since pj = ρ pj−1, we get pj = ρ^j p0.

SLIDE 17

Analysis (part 3)

  • Recall Σ_{i=0}^{k} a^i = (a^{k+1} − 1)/(a − 1), Σ_{i=0}^{∞} a^i = 1/(1 − a), and Σ_{i=0}^{∞} i a^i = a/(1 − a)² for a < 1.
  • Now 1 = Σ_{i=0}^{∞} pi. Using pj = ρ^j p0 we have 1 = Σ_{i=0}^{∞} ρ^i p0 = p0/(1 − ρ), so p0 = 1 − ρ.
  • The expected number of jobs in the system is E[n] = Σ_{i=0}^{∞} i(1 − ρ)ρ^i = (1 − ρ) Σ_{i=0}^{∞} i ρ^i = ρ/(1 − ρ).
  • The expected number of waiting jobs in the system is E[w] = Σ_{i=0}^{∞} i(1 − ρ)ρ^{i+1} = ρ²/(1 − ρ).
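The geometric-series algebra above is easy to cross-check by brute force (our own check, not from the slides); ρ = 0.7 and the truncation at 10000 terms are arbitrary, the tail being negligible for ρ < 1.

```python
# Truncated sums versus the closed forms E[n] = rho/(1-rho) and
# E[w] = rho^2/(1-rho).
rho = 0.7
expected_n = sum(i * (1 - rho) * rho**i for i in range(10000))
closed_n = rho / (1 - rho)

expected_w = sum(i * (1 - rho) * rho**(i + 1) for i in range(10000))
closed_w = rho**2 / (1 - rho)
```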

SLIDE 18

Analysis (part 4)

  • Response time r = time waiting + time served.
  • Little’s law: E[r] = E[n]/λ (exercise).
  • So E[r] = ρ/((1 − ρ)λ) = (λ/µ) / ((1 − λ/µ)λ) = 1/(µ − λ).
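The chain of equalities above reduces to simple arithmetic and can be checked for any stable pair of rates (our own check); λ = 3 and µ = 5 (so ρ = 0.6) are arbitrary.

```python
# E[r] via Little's law versus the simplified closed form 1/(mu - lam).
lam, mu = 3.0, 5.0
rho = lam / mu
E_n = rho / (1 - rho)            # expected jobs in the system
E_r_little = E_n / lam           # Little's law: E[r] = E[n]/lam
E_r_closed = 1 / (mu - lam)      # simplified form
```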

SLIDE 19

The M/M/1/b model

  • Same as M/M/1, but now the queue has bounded size b.
  • When the queue is full, arriving jobs are dropped.
  • This makes the model a bit more realistic.

SLIDE 20

Analysis of M/M/1/b

  • As for M/M/1, we have pj = ρ^j p0.
  • Recall Σ_{i=0}^{k} a^i = (a^{k+1} − 1)/(a − 1).
  • 1 = Σ_{i=0}^{b} ρ^i p0 = p0 (ρ^{b+1} − 1)/(ρ − 1).
  • So p0 = (1 − ρ)/(1 − ρ^{b+1}).
  • Recall Σ_{i=0}^{k} i a^{i−1} = (k + 1)a^k/(a − 1) − (a^{k+1} − 1)/(a − 1)².
  • The expected number of jobs:
    E[n] = Σ_{i=0}^{b} i ((1 − ρ)/(1 − ρ^{b+1})) ρ^i
         = ((1 − ρ)/(1 − ρ^{b+1})) ρ [ (b + 1)ρ^b/(ρ − 1) − (ρ^{b+1} − 1)/(ρ − 1)² ]
         = ρ/(1 − ρ) − (b + 1)ρ^{b+1}/(1 − ρ^{b+1}).
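The closed forms for the bounded queue can be checked against direct sums over the b + 1 states (our own check, not from the slides); ρ = 1.5 (overload is fine here, since the chain is finite) and b = 5 are arbitrary test values.

```python
# M/M/1/b: p_j = rho^j * p_0 with p_0 = (1-rho)/(1-rho^(b+1)),
# states j = 0..b. Compare the direct sum for E[n] with the closed form.
rho, b = 1.5, 5
p0 = (1 - rho) / (1 - rho**(b + 1))
p = [p0 * rho**j for j in range(b + 1)]

total = sum(p)                                  # should be 1
E_n_direct = sum(j * pj for j, pj in enumerate(p))
E_n_closed = rho / (1 - rho) - (b + 1) * rho**(b + 1) / (1 - rho**(b + 1))
```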

SLIDE 21

Poisson Process - Equivalent Definitions

  • Poisson process 1: All Tn ∼ exp(θ), and all the Tn’s are independent.
  • Poisson process 2: N(t) ∼ Poisson(θt), i.e. Pr[N(t) = k] = ((θt)^k / k!) e^{−θt}, and the numbers of events in non-overlapping intervals are independent.
  • Poisson process 3: N(0) = 0, and for any k, as t → 0:
    Pr[N(k + t) = N(k)] = 1 − θt + o(t),
    Pr[N(k + t) = 1 + N(k)] = θt + o(t),
    Pr[N(k + t) ≥ 2 + N(k)] = o(t),
    and increments in non-overlapping intervals are independent. (Recall f(x) = o(g(x)) means lim_{x→0} f(x)/g(x) = 0.)

SLIDE 22

Poisson Process - Proof of (1 ⇒ 3)

  • Suppose for all n, Tn ∼ exp(θ) and the Tn are independent.
  • Note that Pr[N(k + t) = b + N(k)] = Pr[N(t) = b] due to independence.
  • We need to find Pr[N(t) = 0], Pr[N(t) = 1], Pr[N(t) > 1].
  • Pr[N(t) = 0] = Pr[T0 ≥ t] = ∫_{t}^{∞} θe^{−θx} dx = e^{−θt}.
  • Using L’Hôpital’s rule: lim_{t→0} (e^{−θt} − 1 + θt)/t = lim_{t→0} (−θe^{−θt} + θ) = 0.
  • So Pr[N(t) = 0] = 1 − θt + o(t).

SLIDE 23

Poisson Process - Proof of (1 ⇒ 3) (part 2)

  • Let Z be the sum of two exponentially distributed random variables.
  • Z = T0 + T1, and for z ≥ 0 we have fZ(z) = ∫_{0}^{z} fT0(x) fT1(z − x) dx = ∫_{0}^{z} θe^{−θx} θe^{−θ(z−x)} dx = θ²z e^{−θz}.
  • Pr[N(t) ≥ 2] = Pr[T0 + T1 ≤ t] = ∫_{0}^{t} fZ(z) dz = θ² ∫_{0}^{t} z e^{−θz} dz = 1 − (θt + 1)e^{−θt} (integration by parts).
  • So Pr[N(t) ≥ 2] = o(t), since lim_{t→0} (1 − (θt + 1)e^{−θt})/t = 0 (L’Hôpital’s rule).
  • Using Pr[N(t) = 0] = 1 − θt + o(t) we get Pr[N(t) = 1] = θt + o(t).

SLIDE 24

Poisson Process - Proof of (3 ⇒ 2)

  • Recall the binomial distribution: Pr(B^p_n = k) = C(n, k) p^k (1 − p)^{n−k}.
  • How does N(t) behave?
  • Divide t into small parts of size t/n. When the parts are small enough, the probability of one event in a part is close to θt/n, so the number of events up to time t has a binomial distribution with n trials and p = θt/n.
  • Pr[N(t) = k] equals the limit of the binomial probability of k successes as n → ∞ and p → 0 while np = θt.

SLIDE 25

Poisson Process - Proof of (3 ⇒ 2) (part 2)

So Pr[N(t) = k] = lim_{n→∞} C(n, k) (θt/n)^k (1 − θt/n)^{n−k}

  = lim_{n→∞} (n!/(k!(n − k)!)) ((θt)^k/n^k) (1 − θt/n)^{n−k}

  = lim_{n→∞} ((θt)^k/k!) (n!/((n − k)! n^k)) (1 − θt/n)^n (1 − θt/n)^{−k}

  = lim_{n→∞} ((θt)^k/k!) Π_{i=0}^{k−1} ((n − i)/n) · (1 − θt/n)^n · (1 − θt/n)^{−k}

  = ((θt)^k/k!) e^{−θt},

since Π_{i=0}^{k−1} (n − i)/n → 1, (1 − θt/n)^n → e^{−θt}, and (1 − θt/n)^{−k} → 1.
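The limit on this slide can be checked numerically (our own check, not from the slides): for large n, the Binomial(n, θt/n) probabilities are already very close to the Poisson(θt) probabilities. The values θt = 4, k = 3, and n = 10⁶ are arbitrary.

```python
# Binomial(n, lam/n) probability of k successes versus the Poisson(lam)
# probability of k events, for a large n.
import math

lam, k, n = 4.0, 3, 10**6
p = lam / n
binom = math.comb(n, k) * p**k * (1 - p)**(n - k)
poisson = lam**k * math.exp(-lam) / math.factorial(k)
# The two probabilities agree to several decimal places.
```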

SLIDE 26

Poisson Process - Proof of (2 ⇒ 1)

  • Given Pr[N(t) = k] = ((θt)^k/k!) e^{−θt}.
  • We have Pr[T0 ≥ t] = Pr[N(t) = 0] = e^{−θt}, so T0 ∼ exp(θ).
  • For Ti, denote zi = Σ_{0≤j<i} Tj.
  • So generally, Pr[Ti ≥ t] = Pr[N(zi + t) = N(zi)] = Pr[N(t) = 0] = e^{−θt}.
  • And Ti ∼ exp(θ).