

SLIDE 1

CSE101: Algorithm Design and Analysis

Russell Impagliazzo Sanjoy Dasgupta Ragesh Jaiswal (Thanks for slides: Miles Jones)

Week-06 Lecture 22: Divide and Conquer (Master Theorem)

SLIDE 2

Master Theorem

  • How do you solve a recurrence of the form T(n) = aT(n/b) + O(n^d)?

We will use the Master Theorem.

SLIDE 3

Summation Lemma

Consider the summation Ξ£_{j=0}^{n} r^j. It behaves differently for different values of r.

SLIDE 4

Summation Lemma

Consider the summation Ξ£_{j=0}^{n} r^j. It behaves differently for different values of r.

If r < 1 then this sum converges. This means that the sum is bounded above by some constant c. Therefore:

if r < 1, then Ξ£_{j=0}^{n} r^j < c for all n, so Ξ£_{j=0}^{n} r^j ∈ O(1)

SLIDE 5

Summation Lemma

Consider the summation Ξ£_{j=0}^{n} r^j. It behaves differently for different values of r.

If r = 1 then this sum is just summing 1 over and over, n + 1 times (once for each j from 0 to n). Therefore:

if r = 1, then Ξ£_{j=0}^{n} r^j = Ξ£_{j=0}^{n} 1 = n + 1 ∈ O(n)

SLIDE 6

Summation Lemma

Consider the summation Ξ£_{j=0}^{n} r^j. It behaves differently for different values of r.

If r > 1 then this sum is exponential with base r:

if r > 1, then Ξ£_{j=0}^{n} r^j < c Β· r^n for all n, where c > r/(r - 1), so Ξ£_{j=0}^{n} r^j ∈ O(r^n)

SLIDE 7

Summation Lemma

Consider the summation Ξ£_{j=0}^{n} r^j. It behaves differently for different values of r:

  • Ξ£_{j=0}^{n} r^j ∈ O(1) if r < 1
  • Ξ£_{j=0}^{n} r^j ∈ O(n) if r = 1
  • Ξ£_{j=0}^{n} r^j ∈ O(r^n) if r > 1
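These three regimes can be sanity-checked numerically. The sketch below is illustrative, not part of the lecture; `geometric_sum` is a hypothetical helper name.

```python
# Numeric sanity check of the summation lemma: sum_{j=0}^{n} r^j is
# O(1), O(n), or O(r^n) depending on whether r < 1, r = 1, or r > 1.

def geometric_sum(r, n):
    """Sum r^j for j = 0 .. n."""
    return sum(r ** j for j in range(n + 1))

# r < 1: bounded above by the constant 1 / (1 - r) for every n.
assert all(geometric_sum(0.5, n) < 1 / (1 - 0.5) for n in range(1, 50))

# r = 1: exactly n + 1 terms of 1, so Theta(n).
assert geometric_sum(1, 100) == 101

# r > 1: within the constant factor c = r / (r - 1) of the last term r^n.
r, n = 2, 30
assert geometric_sum(r, n) < (r / (r - 1)) * r ** n
```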

SLIDE 8

Master Theorem

Master Theorem: If T(n) = aT(n/b) + O(n^d) for some constants a > 0, b > 1, d β‰₯ 0, then

  • T(n) ∈ O(n^d) if a < b^d
  • T(n) ∈ O(n^d log n) if a = b^d
  • T(n) ∈ O(n^{log_b a}) if a > b^d
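To make the case analysis concrete, here is a small helper (my own sketch, not from the lecture; the function name and output format are invented):

```python
import math

# Apply the three Master Theorem cases to T(n) = a*T(n/b) + O(n^d).
def master_theorem(a, b, d):
    if a < b ** d:
        return f"O(n^{d})"
    if a == b ** d:
        return f"O(n^{d} log n)"
    return f"O(n^{round(math.log(a, b), 2):g})"

# Mergesort-style recurrence T(n) = 2T(n/2) + O(n): steady-state case.
assert master_theorem(2, 2, 1) == "O(n^1 log n)"
# Karatsuba's T(n) = 3T(n/2) + O(n): bottom-heavy case, exponent log_2 3.
assert master_theorem(3, 2, 1) == "O(n^1.58)"
```

The `round(..., 2)` simply keeps irrational exponents like log_2 3 readable.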

SLIDE 9

Master Theorem: Solving the recurrence

π‘ˆ(π‘œ) = π‘π‘ˆ(π‘œ/𝑐) + 𝑃(π‘œ:)

Size π‘œ Size π‘œ/𝑐 Size π‘œ/𝑐% Size 1 Depth log& π‘œ 1 subproblem 𝑏 subproblems 𝑏% subproblems 𝑏'()! $ subproblems

SLIDE 10

Master Theorem: Solving the recurrence

After k levels, there are a^k subproblems, each of size n/b^k.

So, during the k-th level of recursion, the time complexity is

O((n/b^k)^d) Β· a^k = O(n^d (a/b^d)^k)

SLIDE 11

Master Theorem: Solving the recurrence

After k levels, there are a^k subproblems, each of size n/b^k. So, during the k-th level, the time complexity is O((n/b^k)^d) Β· a^k = O(n^d (a/b^d)^k).

After log_b n levels, the subproblem size is reduced to 1, which usually is the size of the base case.

So the runtime of the entire algorithm is the sum over all levels:

T(n) = O(n^d Ξ£_{k=0}^{log_b n} (a/b^d)^k)
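This level-by-level sum can be checked against the unrolled recurrence. The sketch below assumes the cost at size m is exactly m^d and T(1) = 1 (simplifications, not stated on the slide), with n a power of b:

```python
# Compare the recurrence T(n) = a*T(n/b) + n^d (with T(1) = 1) against
# the level sum n^d * sum_{k=0}^{log_b n} (a / b^d)^k.

def T(n, a, b, d):
    if n == 1:
        return 1
    return a * T(n // b, a, b, d) + n ** d

def level_sum(n, a, b, d):
    levels = 0
    while b ** levels < n:          # levels = log_b n for n a power of b
        levels += 1
    return n ** d * sum((a / b ** d) ** k for k in range(levels + 1))

for a, b, d in [(2, 2, 1), (4, 2, 1), (3, 2, 1)]:
    n = b ** 10
    assert abs(T(n, a, b, d) - level_sum(n, a, b, d)) < 1e-9 * T(n, a, b, d)
```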

SLIDE 12

Master Theorem: Proof

π‘ˆ π‘œ = 𝑃 π‘œ" &

#$% &'(! )

𝑏 𝑐"

#

Case 1: 𝑏 < 𝑐" Then we have that

* +" < 1 and the series converges to a constant so

π‘ˆ π‘œ = 𝑃 π‘œ"

SLIDE 13

Master Theorem: Proof

π‘ˆ π‘œ = 𝑃 π‘œ" &

#$% &'(! )

𝑏 𝑐"

#

Case 2: 𝑏 = 𝑐" Then we have that

* +" = 1 and so each term is equal to 1

π‘ˆ π‘œ = 𝑃 π‘œ" log+ π‘œ

SLIDE 14

Master Theorem: Proof

π‘ˆ π‘œ = 𝑃 π‘œ" &

#$% &'(! )

𝑏 𝑐"

#

Case 2: 𝑏 > 𝑐" Then the summation is exponential and grows proportional to its last term

* +" &'(! )

so π‘ˆ π‘œ = 𝑃 π‘œ" 𝑏 𝑐"

&'(! )

= 𝑃 π‘œ&'(! *

SLIDE 15

Master Theorem

Theorem: If T(n) = aT(n/b) + O(n^d) for some constants a > 0, b > 1, d β‰₯ 0, then

  • T(n) ∈ O(n^d) if a < b^d (top-heavy)
  • T(n) ∈ O(n^d log n) if a = b^d (steady-state)
  • T(n) ∈ O(n^{log_b a}) if a > b^d (bottom-heavy)

SLIDE 16

Master Theorem Applied to Multiply

The recurrence for the runtime of Multiply is T(n) = 4T(n/2) + cn. So we have a = 4, b = 2, and d = 1. In this case, a > b^d, so T(n) ∈ O(n^{log_2 4}) = O(n^2). No improvement over the grade-school method.

T(n) ∈ O(n^d) if a < b^d, O(n^d log n) if a = b^d, O(n^{log_b a}) if a > b^d

SLIDE 17

Master Theorem Applied to MultiplyKS

The recurrence for the runtime of MultiplyKS is T(n) = 3T(n/2) + cn. So we have a = 3, b = 2, and d = 1. In this case, a > b^d, so T(n) ∈ O(n^{log_2 3}) = O(n^{1.58}). An improvement over the grade-school method!

T(n) ∈ O(n^d) if a < b^d, O(n^d log n) if a = b^d, O(n^{log_b a}) if a > b^d

SLIDE 18

Poll: What is the fastest known integer multiplication time?

  • 𝑃 π‘œ/012
  • 𝑃(π‘œ π‘šπ‘π‘•π‘œ (log π‘šπ‘π‘•π‘œ)

3)

  • 𝑃(π‘œ π‘šπ‘π‘•π‘œ 2^{logβˆ— π‘œ})
  • 𝑃(π‘œ log π‘œ)
  • O(n)
SLIDE 19

Poll: What is the fastest known integer multiplication time? All have/will be correct.

  • O(n^{1.58}): Karatsuba
  • O(n log n log log n): SchΓΆnhage-Strassen, 1971
  • O(n log n 2^{c log* n}): FΓΌrer, 2007
  • O(n log n): Harvey and van der Hoeven, 2019
  • O(n): you, tomorrow?
SLIDE 20

Can we do better than n^{1.58}?

  • Could any multiplication algorithm have a faster asymptotic runtime than O(n^{1.58})?
  • Any ideas?
SLIDE 21

Can we do better than n^{1.58}?

  • What if instead of splitting the number in half, we split it into thirds?
  • x = xL xM xR
  • y = yL yM yR

SLIDE 22

Can we do better than n^{1.58}?

  • What if instead of splitting the number in half, we split it into thirds?
  • x = 2^{2n/3} xL + 2^{n/3} xM + xR
  • y = 2^{2n/3} yL + 2^{n/3} yM + yR
SLIDE 23

Multiplying trinomials

  • (a_2 x^2 + a_1 x + a_0)(b_2 x^2 + b_1 x + b_0)

SLIDE 24

Multiplying trinomials

  • (a_2 x^2 + a_1 x + a_0)(b_2 x^2 + b_1 x + b_0) = a_2 b_2 x^4 + (a_2 b_1 + a_1 b_2) x^3 + (a_2 b_0 + a_1 b_1 + a_0 b_2) x^2 + (a_1 b_0 + a_0 b_1) x + a_0 b_0

9 multiplications means 9 recursive calls. Each multiplication is 1/3 the size of the original.
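The expansion can be verified numerically at sample points (an illustrative check; the coefficient values are arbitrary):

```python
# Check (a2 x^2 + a1 x + a0)(b2 x^2 + b1 x + b0)
#   = a2*b2 x^4 + (a2*b1 + a1*b2) x^3 + (a2*b0 + a1*b1 + a0*b2) x^2
#   + (a1*b0 + a0*b1) x + a0*b0   -- 9 pairwise products in total.
a2, a1, a0 = 2, 3, 5
b2, b1, b0 = 7, 11, 13
for x in range(-5, 6):
    lhs = (a2 * x**2 + a1 * x + a0) * (b2 * x**2 + b1 * x + b0)
    rhs = (a2*b2) * x**4 + (a2*b1 + a1*b2) * x**3 \
        + (a2*b0 + a1*b1 + a0*b2) * x**2 + (a1*b0 + a0*b1) * x + a0*b0
    assert lhs == rhs
```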

SLIDE 25

Multiplying trinomials

  • (a_2 x^2 + a_1 x + a_0)(b_2 x^2 + b_1 x + b_0) = a_2 b_2 x^4 + (a_2 b_1 + a_1 b_2) x^3 + (a_2 b_0 + a_1 b_1 + a_0 b_2) x^2 + (a_1 b_0 + a_0 b_1) x + a_0 b_0

9 multiplications means 9 recursive calls. Each multiplication is 1/3 the size of the original.

T(n) = 9T(n/3) + O(n)

SLIDE 26

Multiplying trinomials

  • (a_2 x^2 + a_1 x + a_0)(b_2 x^2 + b_1 x + b_0) = a_2 b_2 x^4 + (a_2 b_1 + a_1 b_2) x^3 + (a_2 b_0 + a_1 b_1 + a_0 b_2) x^2 + (a_1 b_0 + a_0 b_1) x + a_0 b_0

T(n) = 9T(n/3) + O(n)

T(n) ∈ O(n^d) if a < b^d, O(n^d log n) if a = b^d, O(n^{log_b a}) if a > b^d

Here a = 9, b = 3, d = 1, and 9 > 3^1, so T(n) = O(n^{log_3 9}) = O(n^2).

SLIDE 27

Multiplying trinomials

  • (a_2 x^2 + a_1 x + a_0)(b_2 x^2 + b_1 x + b_0) = a_2 b_2 x^4 + (a_2 b_1 + a_1 b_2) x^3 + (a_2 b_0 + a_1 b_1 + a_0 b_2) x^2 + (a_1 b_0 + a_0 b_1) x + a_0 b_0

  • There is a way to reduce from 9 multiplications down to just 5!
  • Then the recurrence becomes T(n) = 5T(n/3) + O(n)
  • So by the master theorem
SLIDE 28

Multiplying trinomials

  • (a_2 x^2 + a_1 x + a_0)(b_2 x^2 + b_1 x + b_0) = a_2 b_2 x^4 + (a_2 b_1 + a_1 b_2) x^3 + (a_2 b_0 + a_1 b_1 + a_0 b_2) x^2 + (a_1 b_0 + a_0 b_1) x + a_0 b_0

  • There is a way to reduce from 9 multiplications down to just 5!
  • Then the recurrence becomes T(n) = 5T(n/3) + O(n)
  • So by the master theorem T(n) = O(n^{log_3 5}) = O(n^{1.46})
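A quick numeric check (illustrative) of the claimed exponent:

```python
import math

# log_3 5 is about 1.46, which beats Karatsuba's exponent log_2 3 ~ 1.58.
exponent = math.log(5) / math.log(3)
assert 1.46 < exponent < 1.47
assert exponent < math.log(3) / math.log(2)
```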
SLIDE 29

Dividing into k subproblems

  • What happens if we divide into k subproblems, each of size n/k?
  • (a_{k-1} x^{k-1} + a_{k-2} x^{k-2} + β‹― + a_1 x + a_0)(b_{k-1} x^{k-1} + b_{k-2} x^{k-2} + β‹― + b_1 x + b_0)
  • How many terms (multiplications) are there?
SLIDE 30

Dividing into k subproblems

  • What happens if we divide into k subproblems, each of size n/k?
  • (a_{k-1} x^{k-1} + a_{k-2} x^{k-2} + β‹― + a_1 x + a_0)(b_{k-1} x^{k-1} + b_{k-2} x^{k-2} + β‹― + b_1 x + b_0)
  • How many terms (multiplications) are there?
  • There are k^2 multiplications. The recurrence is

T(n) = k^2 T(n/k) + O(n), with a = k^2, b = k, d = 1, so T(n) = O(n^{log_k k^2}) = O(n^2)

SLIDE 31

Cook-Toom algorithm

  • In fact, if you split up your number into k equally sized parts, then you can combine them with 2k - 1 multiplications instead of the k^2 individual multiplications.
  • This means that you can get an algorithm that runs in
  • T(n) = (2k - 1)T(n/k) + O(n)
SLIDE 32

Cook-Toom algorithm

  • In fact, if you split up your number into k equally sized parts, then you can combine them with 2k - 1 multiplications instead of the k^2 individual multiplications.
  • This means that you can get an algorithm that runs in
  • T(n) = (2k - 1)T(n/k) + O(n)
  • T(n) = O(n^{log(2k - 1)/log k}) time!

SLIDE 33

Cook-Toom algorithm

π‘ˆ π‘œ = (2𝑙 βˆ’ 1)π‘ˆ(π‘œ/𝑙) + O(n)

  • π‘ˆ π‘œ = 𝑃 π‘œ

!"# $%&' !"# %

time.

  • So we can have a near-linear time algorithm if we take

k to be sufficiently large. The O(n) term in the recursion takes a lot of time the bigger k gets. So is it worth it to make k very large?
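The trade-off can be tabulated (an illustrative sketch, with an invented helper name): the exponent log(2k - 1)/log k falls toward 1 as k grows, even though the hidden constant grows.

```python
import math

# Exponent of the Cook-Toom bound O(n^{log(2k-1)/log k}).
def cook_toom_exponent(k):
    return math.log(2 * k - 1) / math.log(k)

# k = 2 recovers Karatsuba's exponent log_2 3; larger k approaches 1.
assert abs(cook_toom_exponent(2) - math.log(3) / math.log(2)) < 1e-12
assert cook_toom_exponent(3) < cook_toom_exponent(2)   # Toom-3 beats Karatsuba
assert cook_toom_exponent(100) < 1.2                   # approaching linear
```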