Orders of Growth
6/30/2016
61A Summer 2016
Guest lecturer: Jongmin Jerome Baek (jbaek080@berkeley.edu)

There's more than one way to do it

For the past two weeks, we talked about how to solve a problem correctly. But what if there are multiple ways to solve a problem correctly? Are some ways "better" than others?

Usually, there are many ways to solve the same problem. In jargon: there are many ways to implement the same functional abstraction. For example, square may be implemented in two ways:

    # 1.
    def square(n):
        return n * n

    # 2.
    def square(n):
        a, b, total = 0, n, 0
        while b:
            a, b = a + 1, b - 1
            total = total + a + b
        return total

e.g. Factoring N:

    def factor_naive(N):
        factors = []
        i = 1
        while i <= N:
            if N % i == 0:
                factors += [i]
            i += 1
        return factors

    def factor_clever(N):
        factors = []
        i = 1
        while i <= N ** 0.5:
            if N % i == 0:
                factors += [i, N // i]
            i += 1
        return factors

The cleverness: proof by picture

[Picture: rows of the numbers 0 through 16, one row per divisor pair of 16; each divisor i no larger than √16 is connected to its partner 16/i, so every divisor larger than √16 is found together with one smaller than √16.]

The cleverness: dense math proof

1. Checking if i divides N is the same as checking if N/i divides N.
2. If √N + 1 divides N, then N/(√N + 1) divides N.
3. N/(√N + 1) is evidently smaller than √N.
4. We've already checked all numbers smaller than √N, so there's no need to check √N + 1.
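The saving is easy to see by counting loop iterations directly. The sketch below (my own, not from the lecture) counts how many divisibility tests each version above performs; the function names are hypothetical.

```python
# Count the divisibility tests done by each factoring strategy above.

def count_divisions_naive(N):
    count, i = 0, 1
    while i <= N:
        count += 1          # one N % i test per iteration
        i += 1
    return count

def count_divisions_clever(N):
    count, i = 0, 1
    while i <= N ** 0.5:
        count += 1          # one N % i test per iteration
        i += 1
    return count

print(count_divisions_naive(10000))   # 10000 tests
print(count_divisions_clever(10000))  # 100 tests
```

For N = 10000 the clever version does only √10000 = 100 tests, a hundredfold saving that grows as N grows.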

Why do we care?

Speed! The naïve way divides N times. The clever way divides √N times. This may not seem like a big deal, but…

"Amazon calculated that a page load slowdown of just one second could cost it $1.6 billion in sales each year. Google has calculated that by slowing its search results by just four tenths of a second they could lose 8 million searches per day."
http://www.fastcompany.com/1825005/how-one-second-could-cost-amazon-16-billion-sales

To take an extreme example, consider the game of Go. Go is a very complicated game; it is probably the most complex board game that is widely played by humans. Two players play each other. Each turn, a player can make one of approximately 200 choices, and there are about 150 turns in a game.

Naïve way: try all sequences of moves and find the best sequence of moves. With approximately 200 possible choices per turn and 150 turns, that is 200^150 possible games. If each atom in the Universe did one step of computation each nanosecond, the Universe would need to start over about four times to try all sequences of moves and find the best one.

Clever way: as far as we know, there is no clever way to solve the game of Go! There are some ways to generate approximate solutions, such as using Google's AlphaGo, or using the human brain. More on this at the end of lecture, if we have time. We simply do not know how to solve some really hard but really important problems. (We can find approximations.)

How do we care?

How do we claim mathematically that one function runs faster than another? A function is executed on some input. For example: in the factoring function, the input is the number we want to factor. How long a function takes to run depends on the size of the input.
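The size of 200^150 is easy to check with Python's arbitrary-precision integers. This is my own back-of-the-envelope arithmetic, not part of the slides; the 10^80 atom count and ~4.3 × 10^17 second age of the Universe are the usual rough figures.

```python
# How large is the Go game-tree estimate of 200 choices over 150 turns?
games = 200 ** 150
print(len(str(games)))  # the number has 346 decimal digits

# Rough comparison: ~10**80 atoms, each doing one step per nanosecond,
# over the Universe's ~4.3 * 10**17 seconds.
steps_per_universe_lifetime = 10 ** 80 * (4.3 * 10 ** 17) * 10 ** 9
print(games > steps_per_universe_lifetime)  # True: one lifetime is not enough
```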

How do we care?

What we want to know: given some function, such as factor, how long will this function take to run? Roughly, how many lines of code will this function need to execute? How does this depend on the size of the input?

More precisely: for every function f(N), we can find a function g(N) that describes how long f(N) takes to run as a function of N.

Mole's eye view vs. bird's eye view

We don't care about…
- Constant factors
- Any term that is not the largest term

For example:
- g1(N) = 3N² + N vs. g2(N) = N² + 6: we treat both functions as about the same; we say both take quadratic time and denote this as Θ(N²).
- g3(N) = 35N vs. g4(N) = N + 49: we treat both functions as about the same; we say both take linear time and denote this as Θ(N).

We don't need to know exactly how the run-time grows as the size of the input increases. We only want a rough idea. To get a rough idea, we look from a bird's-eye view: that is, we look at the shape of g(N) as N gets really, really big.

Why the lack of care?
- Mathematical rigor: it saves you the headache of trivialities, and it's useful to think of g(n) as n approaches infinity.
- Moore's law: computers are always getting exponentially faster.
- Constant factors/small terms are easier to reduce: even if you reduce constant factors, that doesn't tell you much about the nature of the problem.
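A quick numerical sketch (my own, not from the slides) shows why the lower-order terms wash out: dividing each g by its largest term, the ratio settles toward a constant as N grows, which is exactly what Θ ignores.

```python
# Both g1 and g2 grow like N**2; their ratios to N**2 flatten out.

def g1(N):
    return 3 * N ** 2 + N

def g2(N):
    return N ** 2 + 6

for N in (10, 1000, 100000):
    print(N, g1(N) / N ** 2, g2(N) / N ** 2)
# g1(N)/N**2 approaches 3; g2(N)/N**2 approaches 1 as N grows
```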

Mathematical definition

N: size of the input.
R(N): how long it takes to run the function on input of size N.

R(N) = Θ(g(N)) means that there are positive constants k1 and k2 such that

    k1 · g(N) ≤ R(N) ≤ k2 · g(N)

for all N larger than some minimum m.

Constant time: Θ(1). Fastest; runtime doesn't depend on the size of the input.

    def square(x):
        return x * x

    def add(x, y):
        return x + y

Logarithmic time: Θ(log N). Very fast; runtime increases with input, but very little.

    def exp_decay(x):
        if x == 0:
            return 1
        return exp_decay(x // 2) + 1

Linear time: Θ(N). Pretty fast; runtime increases linearly with input.

    def sum_all(lst):
        result = 0
        for e in lst:
            result += e
        return result

Quadratic time: Θ(N²). Not fast; runtime increases quadratically with input.

    def print_all_pairs(lst):
        for i in lst:
            for j in lst:
                print(i, j)

Exponential time: Θ(2^N). Intractable; runtime increases extremely fast with input.

    def fork_bomb(n):
        if n == 0:
            return 1
        return fork_bomb(n - 1) + fork_bomb(n - 1)

A call to fork_bomb(n) makes two calls on n - 1, each of which makes two calls on n - 2, and so on: the call tree doubles at every level.
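One way to make these classes concrete is to count steps instead of timing. The instrumented versions below are my own sketch (the `_steps` names are hypothetical), mirroring the linear, quadratic, and logarithmic examples above.

```python
# Count "steps" to see the growth rates directly.

def sum_all_steps(lst):
    steps = 0
    for e in lst:
        steps += 1             # Θ(N): one step per element
    return steps

def pairs_steps(lst):
    steps = 0
    for i in lst:
        for j in lst:
            steps += 1         # Θ(N²): one step per pair
    return steps

def exp_decay_steps(x):
    if x == 0:
        return 1
    return exp_decay_steps(x // 2) + 1   # Θ(log N): input halves each call

lst = list(range(100))
print(sum_all_steps(lst))      # 100
print(pairs_steps(lst))        # 10000
print(exp_decay_steps(1024))   # 12: about log2(1024) calls
```

Doubling the list doubles the linear count, quadruples the quadratic count, and adds only one call to the logarithmic count.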

How to determine orders of growth

In CS61A, we particularly care about orders of growth when writing iterative or recursive code.

Iteration:
- How long does each iteration take?
- How many times do we loop?

Recursion:
- How long does each call to the function take?
- How many times do we call the function?

Why do we care?

What seems harder: factoring n, or finding the nth Fibonacci number? Counter to intuition, factoring is a much harder problem! Some problems are inherently harder than others. Theoretical computer science is in the business of classifying problems by how hard they are.

P vs. NP and the Computational Complexity Zoo:
https://www.youtube.com/watch?v=YX40hbAHx3s

Examples

    def mystery1(n):        # Θ(N²)
        x = 0
        for i in range(n):
            for j in range(i):
                x += 1
        return x

    def mystery2(n):        # Θ(log N)
        x = 1
        while x < n:
            x *= 2
        return x

    def mystery3(n):        # Θ(N)
        x = 0
        for i in range(n):
            x += 1
        return x

    def mystery4(n):        # Θ(N)
        if n == 0:
            return 0
        return mystery4(n - 1) + 1

    def mystery5(n):        # Θ(1)
        if n > 0:
            return mystery5(-n) + mystery5(-n)
        else:
            return 0

    def mystery6(n):        # Θ(1)
        i = 0
        while i > 0:
            i += 1
        return n

    def mystery7(n):        # Θ(2^N)
        if n == 0:
            return 1
        return mystery8(n - 1) + mystery8(n - 1)

    def mystery8(n):        # Θ(2^N)
        if n == 0:
            return 1
        return mystery7(n - 1) + mystery7(n - 1)
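Two of these answers can be checked by counting work directly. The sketch below is my own (the helper names are hypothetical): it counts mystery2's loop iterations, and counts calls in the mystery7/mystery8-style mutual recursion, here collapsed into a single helper since both halves have the same body.

```python
# Empirically verify the Θ(log N) and Θ(2**N) answers above.

def mystery2_iterations(n):
    x, iterations = 1, 0
    while x < n:
        x *= 2
        iterations += 1     # x doubles, so this runs about log2(n) times
    return iterations

def count_calls(n):
    # Count every call in a mystery7/mystery8-style recursion:
    # each call on k makes two calls on k - 1.
    calls = 0
    def helper(k):
        nonlocal calls
        calls += 1
        if k == 0:
            return 1
        return helper(k - 1) + helper(k - 1)
    helper(n)
    return calls

print(mystery2_iterations(1024))  # 10, which is log2(1024)
print(count_calls(10))            # 2047, which is 2**11 - 1
```

The full binary call tree of depth n has 2^(n+1) - 1 nodes, which is why the pair is Θ(2^N).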
