A Lasserre-based (1 + ε)-approximation for Makespan Scheduling - PowerPoint PPT Presentation



slide-1
SLIDE 1

A Lasserre-based (1 + ε)-approximation for Makespan Scheduling with Precedence Constraints

Elaine Levey and Thomas Rothvoss

slide-2
SLIDE 2

Makespan Scheduling w. Precedence Constraints

Input:

◮ Unit-size jobs J with precedence constraints
◮ Number m of identical machines

Goal: Schedule the jobs so as to minimize the makespan

[Figure: time slots 1-13 on machines 1, 2, ..., m; the makespan is the completion time of the last job]

slide-4
SLIDE 4

List Scheduling (Graham ’66)

Algorithm:
(1) FOR slots at time t = 1 TO ∞ DO:
(2)   Select any job that has all predecessors completed

[Animation: jobs 1-13 are filled in one per frame on machines 1, 2, ..., m]

Analysis:

◮ There is a chain active at all non-busy times!
◮ Length of the chain ≤ OPT
◮ Length of the busy times ≤ OPT

⇒ 2-apx (2 − 2/m for pj = 1; 2 − 1/m for arbitrary pj)
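List scheduling is simple enough to sketch directly (an illustrative sketch for unit-size jobs; the encoding of jobs as indices and precedences as (i, j) pairs is my own, and the precedence graph is assumed acyclic):

```python
def list_schedule(n_jobs, prec, m):
    """Greedy list scheduling: at each time step, fill up to m machine slots
    with jobs whose predecessors have all completed. prec is a list of (i, j)
    pairs meaning job i must finish before job j starts (assumed acyclic).
    Returns (job -> start time, makespan) for unit-size jobs."""
    preds = {j: set() for j in range(n_jobs)}
    for i, j in prec:
        preds[j].add(i)
    done, start, t = set(), {}, 0
    while len(done) < n_jobs:
        ready = [j for j in range(n_jobs)
                 if j not in done and preds[j] <= done]
        for j in ready[:m]:           # schedule up to m available jobs
            start[j] = t
        done |= set(ready[:m])        # unit jobs finish before the next slot
        t += 1
    return start, t

start, makespan = list_schedule(5, prec=[(0, 2), (1, 2), (2, 3), (2, 4)], m=2)
print(makespan)  # 3: {0, 1} at t = 0, {2} at t = 1, {3, 4} at t = 2
```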

slide-22
SLIDE 22

A linear programming formulation

  Σ_{t=1}^{T} x_{jt} = 1                        ∀ j ∈ J
  Σ_{j∈J} x_{jt} ≤ m                            ∀ t ∈ [T]
  Σ_{t′≤t+1} x_{jt′} ≤ Σ_{t′≤t} x_{it′}         ∀ i ≺ j, ∀ t ∈ [T]
  0 ≤ x_{jt} ≤ 1                                ∀ j ∈ J, ∀ t ∈ [T]

Interpretation: x_{jt} = 1 if job j runs at time t, and 0 otherwise.

[Figure: a fractional schedule on machines M1, M2 over time slots 1-6]

◮ Integrality gap: 2 − Θ(1/m)
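To make the constraints concrete, here is a small pure-Python feasibility checker (an illustrative sketch; the precedence inequality is written so that a successor j can be finished by time t + 1 only to the extent its predecessor i is finished by time t):

```python
def lp_feasible(x, m, prec, T, eps=1e-9):
    """Check a fractional schedule x[j][t] (time slots 0..T-1) against the LP:
    each job fully scheduled, at most m machine slots per time, and for each
    precedence pair i ≺ j: sum_{t'<=t+1} x[j][t'] <= sum_{t'<=t} x[i][t']."""
    jobs = range(len(x))
    if any(abs(sum(x[j]) - 1) > eps for j in jobs):
        return False
    if any(sum(x[j][t] for j in jobs) > m + eps for t in range(T)):
        return False
    for (i, j) in prec:                          # i must precede j
        for t in range(T - 1):
            if sum(x[j][:t + 2]) > sum(x[i][:t + 1]) + eps:
                return False
    return True

# Two unit jobs, 0 ≺ 1, one machine, horizon T = 2.
print(lp_feasible([[1.0, 0.0], [0.0, 1.0]], m=1, prec=[(0, 1)], T=2))  # True
print(lp_feasible([[0.5, 0.5], [0.5, 0.5]], m=1, prec=[(0, 1)], T=2))  # False
```

The second point violates the precedence constraint: at t = 0, job 1 would be half finished by time 1 while job 0 is only half finished by time 0.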

slide-24
SLIDE 24

Main Theorem

Unit-size job makespan minimization:

◮ (2 − Θ(1/m))-apx in poly-time [Graham ’66]
◮ No (2 − ε)-apx under a variant of the UGC [Svensson ’10]

Open problems:

◮ Is P3 | pj = 1, prec | Cmax NP-hard? [Garey, Johnson ’79]
◮ Is there a PTAS for P3 | pj = 1, prec | Cmax?
  [Part of #1 of “Open Problems in Scheduling”; Schuurman, Woeginger ’99]

Theorem (Levey, Rothvoss ’15)
The integrality gap of the LP is 1 + ε after (log n)^{O(log log n)} Lasserre rounds (for constant m and ε > 0).

◮ Running time n^{(log n)^{O(log log n)}}
◮ Sherali-Adams suffices

slide-29
SLIDE 29

Lift-and-project methods

[Figure: a polytope K shrinking toward its integer hull along the hierarchy: K ⊇ Las1(K) ⊇ Las2(K) ⊇ Las3(K) ⊇ ... ⊇ Lasn(K) = integer hull]

slide-34
SLIDE 34

Round-t Lasserre relaxation

◮ Given: K = {x ∈ R^n | Ax ≥ b}.
◮ Introduce variables y_I for I ⊆ [n], |I| ≤ t, with intended meaning

    y_I ≡ ∏_{i∈I} (x_i = 1)

◮ y_{{i}} = x_i
◮ y_∅ = 1

Round-t Lasserre relaxation:

    (y_{I∪J})_{|I|,|J|≤t} ⪰ 0                                       (moment matrix)
    (Σ_{i∈[n]} A_{ℓi} y_{I∪J∪{i}} − b_ℓ y_{I∪J})_{|I|,|J|≤t} ⪰ 0    ∀ ℓ ∈ [m]   (slack moment matrices)
    y_∅ = 1

◮ Solvable in time n^{O(t)} · m^{O(1)}
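A quick sanity check of the intended meaning (a sketch, not from the slides): for an integral point x ∈ {0,1}^n, the moment matrix (y_{I∪J}) equals y yᵀ for the subset-indexed vector y, hence it is positive semidefinite and has rank 1:

```python
from itertools import combinations
import numpy as np

def moment_matrix(x, t):
    """Moment matrix (y_{I∪J})_{|I|,|J|<=t} of an integral point x in {0,1}^n,
    where y_I = prod_{i in I} x_i (empty product = 1)."""
    n = len(x)
    subsets = [frozenset(c) for k in range(t + 1)
               for c in combinations(range(n), k)]
    y = lambda I: float(all(x[i] == 1 for i in I))
    return np.array([[y(I | J) for J in subsets] for I in subsets])

M = moment_matrix([1, 0, 1], t=2)
# For an integral point, M = y y^T: PSD (min eigenvalue ~ 0) and rank 1.
print(np.linalg.eigvalsh(M).min() >= -1e-9, np.linalg.matrix_rank(M))
```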

slide-42
SLIDE 42

Inducing on one variable

Lemma
For y ∈ Las_t(K), t ≥ 1, i ∈ [n]: y ∈ conv{z ∈ Las_{t−1}(K) | z_{{i}} ∈ {0, 1}}.

[Figure: in R^{2^n − 1}, y ∈ Las_t is a convex combination of z^{(0)} with z^{(0)}_{{i}} = 0 and z^{(1)} with z^{(1)}_{{i}} = 1, both in Las_{t−1}]

Operation: “Induce on x_i = 1”: replace y by the point z ∈ Las_{t−1} with z_{{i}} = 1.

◮ Observation: Conditioning only shrinks support!
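Concretely, conditioning uses the standard formula z_I = y_{I∪{i}} / y_{{i}} (background knowledge, not spelled out on the slide): any moment that was 0 stays 0, so the support only shrinks. A toy example on n = 2 variables:

```python
from itertools import combinations

def integral_moments(x, t):
    """All moments y_I = prod_{i in I} x_i, |I| <= t, of x in {0,1}^n."""
    return {frozenset(c): float(all(x[i] for i in c))
            for k in range(t + 1) for c in combinations(range(len(x)), k)}

def convex_comb(lam, y0, y1):
    return {I: (1 - lam) * y0[I] + lam * y1[I] for I in y0}

def condition(y, i):
    """Induce on x_i = 1: z_I = y_{I ∪ {i}} / y_{{i}} (requires y_{{i}} > 0)."""
    yi = y[frozenset({i})]
    return {I: y[I | frozenset({i})] / yi
            for I in y if I | frozenset({i}) in y}

# y mixes the integral points (1, 0) and (1, 1); conditioning on x_1 = 1
# isolates (1, 1).
y = convex_comb(0.5, integral_moments([1, 0], 2), integral_moments([1, 1], 2))
z = condition(y, 1)
print(z[frozenset({0})], z[frozenset({1})])  # 1.0 1.0
```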

slide-46
SLIDE 46

Partition

◮ Partition the time horizon [T] into a binary family of nested intervals with log(T) levels
◮ Assign each job to the minimal interval containing its support supp(j)

[Figure: dyadic intervals over the horizon T, with supports supp(j), supp(j′), supp(j′′) at different levels]
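The assignment of a job's support to its minimal interval of the binary family can be sketched as follows (illustrative; the half-open interval convention and power-of-two horizon are my own assumptions):

```python
def minimal_dyadic_interval(lo, hi, T):
    """Smallest interval [a, a + length) of the binary family over [0, T)
    containing the support {lo, ..., hi}. T is assumed a power of two."""
    a, length = 0, T
    while True:
        half = length // 2
        if half == 0:
            return (a, a + length)
        if hi < a + half:            # support fits in the left child
            length = half
        elif lo >= a + half:         # support fits in the right child
            a, length = a + half, half
        else:                        # support straddles the midpoint: stop
            return (a, a + length)

T = 16
print(minimal_dyadic_interval(0, 15, T))  # (0, 16): top level
print(minimal_dyadic_interval(4, 6, T))   # (4, 8)
print(minimal_dyadic_interval(7, 8, T))   # (0, 16): straddles the midpoint
```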

slide-53
SLIDE 53

The algorithm

[Figure: the first O(log log n)^2 levels of the dyadic partition, split into top, middle and bottom levels; I ranges over the intervals directly below the middle levels]

(1) Apply conditioning until the max chain length of jobs in the first O(log log n)^2 levels is down to ε′T
(2) Among the first O(log log n)^2 levels pick O(log log n) consecutive middle levels and discard their jobs
(3) For each interval I directly below the middle levels: recursively call the algorithm on {j | supp(j) ⊆ I} → valid schedule for the bottom jobs
(4) Schedule the top jobs

slide-65
SLIDE 65

Reducing chain length

Lemma
After conditioning 2^{O(log log n)^2} · (1/ε′) times, the maximum length of a chain among the top jobs is ≤ ε′ · T.

[Figure: interval I with children I1, I2; a job j with ≥ δ|I| successors]

◮ Pick an interval I and consider the jobs assigned to it.
◮ Consider a job j with ≥ δ|I| successors ⇒ condition on x_{j,I2} = 1
◮ Needed ≤ 1/δ times per interval; 2^{O(log log n)^2} intervals
◮ Lose ≤ (1/δ) · 2^{O(log log n)^2} Lasserre rounds
◮ Longest chain in the top jobs: δT · O(log log n)^2

slide-74
SLIDE 74

Scheduling top jobs (I)

◮ For each top job j set
  [release time(j), deadline(j)] := the fully contained bottom intervals
◮ Set Q := polylog(n) as the top/bottom gap
◮ Hall’s Theorem: the loss for jobs J′ is a 2/Q-fraction of the slots

Lemma
The top jobs can be assigned respecting release times/deadlines, ignoring prec. constraints and discarding ≤ 2Tm/Q jobs.

[Figure: a top job j above the bottom intervals; release time(j) and deadline(j) span the bottom intervals fully contained in j’s window, with gap Q]

slide-80
SLIDE 80

EDF (1)

Setting:

◮ Jobs with precedence constraints
◮ Maximum chain length C
◮ Release times and deadlines only at the beginning/end of p blocks
◮ Capacity cap(t) ∈ {0, . . . , m}
◮ There exists a schedule ignoring precedence constraints

[Figure: the time axis split into blocks 1, 2, ..., p; a chain of length C]

Earliest Deadline First:
(1) FOR t = 1 TO T DO
(2)   Process available jobs with minimum deadline
(3)   If any job is not done by its deadline, discard it
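A minimal EDF sketch (illustrative: precedence constraints are ignored here for brevity, and the (release, deadline) encoding of unit jobs is my own):

```python
import heapq

def edf(jobs, cap, T):
    """Earliest Deadline First on unit jobs given as (release, deadline) pairs,
    with cap[t] machine slots at time t. A job scheduled at time t finishes by
    t + 1; a job whose deadline has passed is discarded without using a slot.
    Returns (job -> scheduled time, list of discarded jobs)."""
    by_release = {}
    for j, (r, d) in enumerate(jobs):
        by_release.setdefault(r, []).append(j)
    heap, sched, discarded = [], {}, []
    for t in range(T):
        for j in by_release.get(t, []):
            heapq.heappush(heap, (jobs[j][1], j))   # keyed by deadline
        slots = cap[t]
        while slots > 0 and heap:
            d, j = heapq.heappop(heap)
            if d <= t:                  # cannot finish by its deadline anymore
                discarded.append(j)
            else:
                sched[j] = t
                slots -= 1
    discarded += [j for _, j in heap]   # never got a slot in time
    return sched, discarded

jobs = [(0, 2), (0, 2), (1, 3)]         # (release, deadline)
sched, disc = edf(jobs, cap=[1, 1, 1], T=3)
print(sched, disc)                      # {0: 0, 1: 1, 2: 2} []
```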

slide-87
SLIDE 87

EDF (2)

Analysis:

◮ Suppose in some interval I we discard K jobs
◮ Assume w.l.o.g. the last discarded job had the highest deadline
◮ Take a maximal interval [t1, t2] ⊇ I such that each block inside has ≤ C idle times
◮ All jobs scheduled or discarded in [t1, t2] have both release time and deadline in [t1, t2]
◮ K ≤ #idle slots in [t1, t2] ≤ pmC (feasibility w/o prec. constraints)

[Figure: blocks between t1 and t2 with ≤ C idle times each; > C idle times just outside; > K jobs discarded in I]

slide-93
SLIDE 93

The end

Open problems:

◮ Are c(ε, m) Lasserre rounds enough?
◮ ... or at least (log n)^{c(ε,m)}?
◮ What about Pm | prec | Cmax (i.e. arbitrary running times)?

Thanks for your attention