Asymptotic Analysis


SLIDE 1

Asymptotic Analysis

If T(n) is some algorithm’s running time function, i.e., given input of size n, it runs T(n) steps, we are interested in asymptotic behaviour (read: as n approaches infinity) rather than the exact function. We want to compare the growth of T(n) with the growth of some simple function f(n). We’re going to formalise this notion in the following.

SLIDE 2

Θ-notation (read: Theta)

For a given function g(n), Θ(g(n)) denotes the set

Θ(g(n)) = {f(n) : there exist positive constants c1, c2, n0 such that 0 ≤ c1 · g(n) ≤ f(n) ≤ c2 · g(n) for all n ≥ n0}

Intuition: f(n) belongs to Θ(g(n)) if there exist positive constants c1, c2 such that f(n) can be “sandwiched” between c1 · g(n) and c2 · g(n), for n sufficiently large. Correct notation: f(n) ∈ Θ(g(n)). Usually used: f(n) = Θ(g(n)).
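The sandwich condition lends itself to a quick numerical spot check. The sketch below (my own illustration, not from the slides) tests 0 ≤ c1 · g(n) ≤ f(n) ≤ c2 · g(n) over a finite range of n; passing it is evidence, not a proof.

```python
def sandwiched(f, g, c1, c2, n0, n_max=10_000):
    """Finite check of the Theta sandwich:
    0 <= c1*g(n) <= f(n) <= c2*g(n) for all n in [n0, n_max]."""
    return all(0 <= c1 * g(n) <= f(n) <= c2 * g(n)
               for n in range(n0, n_max + 1))

# f(n) = 2n^2 = Theta(n^2) with c1 = 1, c2 = 2, n0 = 1:
print(sandwiched(lambda n: 2 * n * n, lambda n: n * n, 1, 2, 1))  # True
```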

SLIDE 3

Examples: f(n) = 2n² = Θ(n²), because with

  • g(n) = n²,
  • c1 = 1, and
  • c2 = 2

we have 0 ≤ c1 · g(n) ≤ f(n) = 2n² ≤ c2 · g(n).

f(n) = 8n⁵ + 17n⁴ − 25 = Θ(n⁵):

  • f(n) ≥ 1 · n⁵ for n large enough:

    n | 8n⁵ + 17n⁴ − 25          | n⁵
    1 | 8 · 1 + 17 · 1 − 25 = 0    | 1
    2 | 8 · 32 + 17 · 16 − 25 = 503 | 32

  • f(n) ≤ 8n⁵ + 17n⁵ = 25n⁵.

Thus c1 = 1, c2 = 25 and n0 = 2 are good enough.
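These constants can be checked directly (a small sketch; the upper loop bound is arbitrary):

```python
def f(n):
    return 8 * n**5 + 17 * n**4 - 25

# c1 = 1, c2 = 25, n0 = 2 from the slide:
assert all(1 * n**5 <= f(n) <= 25 * n**5 for n in range(2, 2001))

# n0 = 1 would not do: the lower bound fails there.
print(f(1))  # 0, which is < 1 * 1**5
```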

SLIDE 4

More intuition: for all n ≥ n0, the function f(n) is equal to g(n) to within a constant factor. We say that g(n) is an asymptotically tight bound for f(n).

SLIDE 5

However, n³ ≠ Θ(n²). Recall: for n³ = Θ(n²) we would have to find constants c1, c2, n0 with 0 ≤ c1 · n² ≤ n³ ≤ c2 · n² for n ≥ n0. Intuition: there’s a factor of n between the two functions, thus we cannot find a constant c2! Suppose, for the purpose of contradiction, that there are constants c2 and n0 with n³ ≤ c2 · n² for n ≥ n0. Dividing by n² yields n ≤ c2, which cannot possibly hold for arbitrarily large n (c2 must be a constant).
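The contradiction can also be made concrete: for any proposed constant c2, some n breaks the bound. A small sketch (the function name is my own):

```python
def counterexample(c2):
    """Return an n with n^3 > c2 * n^2; any n > c2 works,
    since dividing by n^2 reduces the claim to n > c2."""
    n = int(c2) + 1
    assert n**3 > c2 * n**2
    return n

print(counterexample(1_000))  # 1001: 1001^3 > 1000 * 1001^2
```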

SLIDE 6

We’ve just seen one of the most important (and sometimes most annoying) gaps between theory and practice: in theory a factor of 1,000 doesn’t make a bit of difference (just choose your c2 accordingly), whereas in practice it does. Hence, sometimes the runtime of an algorithm is given without using asymptotic notation.

SLIDE 7

O-notation (read: big-O)

We’ve seen: Θ-notation asymptotically bounds from above and below. When we’re interested in asymptotic upper bounds only, we use O-notation (read: “big-O”).

For a given function g(n), define O(g(n)) as follows:

O(g(n)) = {f(n) : there exist positive constants c, n0 such that 0 ≤ f(n) ≤ c · g(n) for all n ≥ n0}

We write f(n) = O(g(n)) to indicate that f(n) is a member of the set O(g(n)). Obviously, f(n) = Θ(g(n)) implies f(n) = O(g(n)); we just drop the left-hand inequality in the definition of Θ(g(n)).
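Dropping the lower half of the sandwich gives a one-sided check (again a finite-range sketch, not a proof):

```python
def big_o_witness(f, g, c, n0, n_max=10_000):
    """Finite check of 0 <= f(n) <= c*g(n) for n in [n0, n_max]."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# The earlier Theta(n^5) example is, in particular, O(n^5):
print(big_o_witness(lambda n: 8 * n**5 + 17 * n**4 - 25,
                    lambda n: n**5, 25, 1))  # True
```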

SLIDE 8

Intuition: O-notation is used to denote upper bounds on running times, memory requirements, etc. Saying “the running time is O(n log n)” means: the running time is not greater than n log n times some constant factor, for n large enough.

SLIDE 9

Ω-notation

Like O-notation, but for lower bounds. For a given function g(n), Ω(g(n)) denotes the set

Ω(g(n)) = {f(n) : there exist positive constants c, n0 such that 0 ≤ c · g(n) ≤ f(n) for all n ≥ n0}

Saying T(n) = Ω(n²) means: the growth of T(n) is at least that of n². Clearly, f(n) = Θ(g(n)) iff f(n) = Ω(g(n)) and f(n) = O(g(n)).
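The “Θ iff Ω and O” equivalence can be illustrated on T(n) = n² + n (my own example; the finite range is only evidence):

```python
def omega_witness(f, g, c, n0, n_max=10_000):
    """Finite check of the lower bound 0 <= c*g(n) <= f(n)."""
    return all(0 <= c * g(n) <= f(n) for n in range(n0, n_max + 1))

T = lambda n: n * n + n          # candidate running time
g = lambda n: n * n
print(omega_witness(T, g, 1, 1))                        # Omega(n^2): True
print(all(T(n) <= 2 * g(n) for n in range(1, 10_001)))  # O(n^2):     True
# Both bounds holding together is exactly T(n) = Theta(n^2).
```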

SLIDE 10

o-notation (read: little-o)

Similar to O-notation: f(n) = O(g(n)) means we can upper-bound the growth of f by the growth of g (up to a constant factor). f(n) = o(g(n)) is the same, except we require the growth of f to be strictly smaller than the growth of g. For a given function g(n), o(g(n)) denotes the set

o(g(n)) = {f(n) : for any positive constant c there exists a positive constant n0 such that 0 ≤ f(n) < c · g(n) for all n ≥ n0}

SLIDE 11

Intuition: f(n) becomes insignificant relative to g(n) as n approaches infinity:

lim_{n→∞} f(n)/g(n) = 0

In other words, f is o(something) if there is no constant factor between f and something. Examples:

  • n = o(n²)
  • log n = o(n)
  • n = o(2ⁿ)
  • n¹⁰⁰⁰ = o(1.0001ⁿ)
  • 1 = o(log n)
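The limit intuition can be eyeballed numerically for most of these examples (a sketch; a few printed ratios only suggest the limit, they don’t prove it):

```python
import math

# Ratios f(n)/g(n) for some of the examples above; each should
# shrink toward 0 as n grows. (n^1000 vs 1.0001^n is omitted:
# its crossover happens far beyond float range.)
pairs = [
    ("n / n^2",   lambda n: n / n**2),
    ("log n / n", lambda n: math.log(n) / n),
    ("n / 2^n",   lambda n: n / 2**n),
    ("1 / log n", lambda n: 1 / math.log(n)),
]
for name, ratio in pairs:
    print(name, [ratio(10**k) for k in (1, 2, 3)])
```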

SLIDE 12

ω-notation

ω is to Ω what o is to O: f(n) = ω(g(n)) iff g(n) = o(f(n)). For a given function g(n), ω(g(n)) denotes the set

ω(g(n)) = {f(n) : for any positive constant c there exists a positive constant n0 such that 0 ≤ c · g(n) < f(n) for all n ≥ n0}

In other words: lim_{n→∞} f(n)/g(n) = ∞, if the limit exists. I.e., f(n) becomes arbitrarily large relative to g(n).
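The duality f(n) = ω(g(n)) iff g(n) = o(f(n)) shows up in the two reciprocal ratios (a small sketch with f(n) = n², g(n) = n):

```python
def ratios(n):
    """For f(n) = n^2 and g(n) = n, return (f/g, g/f)."""
    f, g = n * n, n
    return f / g, g / f

for n in (10, 100, 1000):
    up, down = ratios(n)
    # f/g grows like n (so f = omega(g)); g/f shrinks like 1/n (so g = o(f))
    print(n, up, down)
```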

SLIDE 13

So we have

  • Θ: asymptotically “equal”
  • O: asymptotically “at most”
  • Ω: asymptotically “at least”
  • o: asymptotically “strictly smaller”
  • ω: asymptotically “strictly greater”

SLIDE 14

This implies an analogy between asymptotic comparison of functions f and g and comparison of real numbers a and b:

  • f(n) = O(g(n)) ≈ a ≤ b
  • f(n) = Ω(g(n)) ≈ a ≥ b
  • f(n) = Θ(g(n)) ≈ a = b
  • f(n) = o(g(n)) ≈ a < b
  • f(n) = ω(g(n)) ≈ a > b
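The analogy suggests a rough empirical classifier (entirely my own sketch: a single finite ratio test with arbitrary thresholds, which can misclassify slowly diverging ratios):

```python
def compare(f, g, n=10**7, eps=1e-6):
    """Crudely classify f vs g by the ratio f(n)/g(n) at one large n."""
    r = f(n) / g(n)
    if r < eps:
        return "f = o(g)      ~  a < b"
    if r > 1 / eps:
        return "f = omega(g)  ~  a > b"
    return "f = Theta(g)  ~  a = b"

print(compare(lambda n: n, lambda n: n * n))          # o
print(compare(lambda n: 2 * n * n, lambda n: n * n))  # Theta
print(compare(lambda n: n**3, lambda n: n * n))       # omega
```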

SLIDE 15

Θ(g(n)) = {f(n) : there exist c1, c2, n0 such that 0 ≤ c1 · g(n) ≤ f(n) ≤ c2 · g(n) for all n ≥ n0}

O(g(n)) = {f(n) : there exist c, n0 such that 0 ≤ f(n) ≤ c · g(n) for all n ≥ n0}

Ω(g(n)) = {f(n) : there exist c, n0 such that 0 ≤ c · g(n) ≤ f(n) for all n ≥ n0}

o(g(n)) = {f(n) : for any positive constant c there exists a positive constant n0 such that 0 ≤ f(n) < c · g(n) for all n ≥ n0}

ω(g(n)) = {f(n) : for any positive constant c there exists a positive constant n0 such that 0 ≤ c · g(n) < f(n) for all n ≥ n0}
