


  1. CS Lunch: RABBIT TRACKS, an animated film by Luke Jaeger. A journey through a mortality-infused landscape populated by mysterious chickens, inconvenient frogs, and other cartoonish creatures. Today, 12:15 PM, Kendade 307.

     Visitor candidates:
     - Lee Spector (Hampshire College): Thursday, 4:30 PM, Kendade 303
     - Jaime Davila (Hampshire College): Tuesday, April 2, 4:30 PM, Kendade 107

     Alfred Spector: CTO at Two Sigma; led Google Research for 8 years and IBM Software Research for 5 years; Fellow of the ACM and IEEE; member of the National Academy of Engineering; co-recipient of the 2016 ACM Software Systems Award for the Andrew File System.
     - April 4, 7:00 PM: "Data Science - Challenges & Opportunities," Gamble Auditorium
     - April 5, CS lunch: "Research Challenges in Computer Science," Kendade 307

     Slides16 - Segmented Least Squares.key - March 27, 2019

  2. Midterm 2: Monday, April 8, in class. Covers greedy algorithms and divide and conquer. Closed book.

     Dynamic programming formula:
     - Divide a problem into a polynomial number of smaller subproblems.
     - Solve each subproblem, recording its answer.
     - Build up the answer to the bigger problem using the stored answers of smaller subproblems.

     Weighted Interval Scheduling. Job j starts at s_j, finishes at f_j, and has weight (value) v_j. Two jobs are compatible if they don't overlap. Goal: find a maximum-weight subset of mutually compatible jobs. [Figure: weighted jobs laid out on a timeline from 0 to 11.]

  3. Optimal Substructure. OPT(j) = value of an optimal solution to the problem consisting of job requests 1, 2, ..., j. Let p(j) be the largest index i < j such that job i is compatible with job j. Case 1: OPT selects job j (it earns v_j plus the optimum over jobs 1..p(j)). Case 2: OPT does not select job j (the optimum over jobs 1..j-1).

         OPT(j) = 0                                      if j = 0
                = max{ v_j + OPT(p(j)), OPT(j-1) }       otherwise
                       (Case 1)         (Case 2)

     Straightforward Recursive Algorithm. Input: n, s_1,...,s_n, f_1,...,f_n, v_1,...,v_n.

     Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n.
     Compute p(1), p(2), ..., p(n)

     Compute-Opt(j) {
         if (j = 0)
             return 0
         else
             return max(v_j + Compute-Opt(p(j)), Compute-Opt(j-1))
     }

     Iterative Solution. Bottom-up dynamic programming: unwind the recursion.

     Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n.
     Compute p(1), p(2), ..., p(n)

     Iterative-Compute-Opt {
         M[0] = 0
         for j = 1 to n
             M[j] = max(v_j + M[p(j)], M[j-1])
     }
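The recurrence and the iterative solution above can be sketched in Python. This is a minimal sketch, not the slides' own code: the function name is mine, and I compute p(j) with a binary search over finish times (p(j) = largest index i < j whose job is compatible with job j).

```python
import bisect

def weighted_interval_scheduling(jobs):
    """jobs: list of (start, finish, value) tuples.
    Returns the maximum total value of mutually compatible jobs."""
    # Sort jobs by finish time so that f_1 <= f_2 <= ... <= f_n.
    jobs = sorted(jobs, key=lambda job: job[1])
    finishes = [f for (_, f, _) in jobs]
    n = len(jobs)

    # p[j] = number of jobs among 1..j-1 whose finish time is <= s_j,
    # i.e. the largest index compatible with job j (0 if none).
    p = [0] * (n + 1)
    for j in range(1, n + 1):
        s = jobs[j - 1][0]
        p[j] = bisect.bisect_right(finishes, s, 0, j - 1)

    # Bottom-up DP: M[j] = max(v_j + M[p(j)], M[j-1]).
    M = [0] * (n + 1)
    for j in range(1, n + 1):
        v = jobs[j - 1][2]
        M[j] = max(v + M[p[j]], M[j - 1])
    return M[n]
```

For example, with jobs (0,3,2), (1,5,4), (4,6,4), (5,8,7), the optimum picks the second and fourth jobs for a total value of 11.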

  4. Memoization: running Iterative-Compute-Opt on an example. [Figure: eight weighted jobs laid out on a timeline from 0 to 11.] The table fills in as M = [3, 3, 4, 4, 6, 8, 9, 9].

     Great! The value is 9, but which jobs should we select? Trace back through the memoized table M:

     Find-Solution(j) {
         if (j = 0)
             return
         else if (M[j] == M[j-1])
             Find-Solution(j-1)
         else {
             Find-Solution(p(j))
             include j
         }
     }

  5. [Slides 12-14 step through Find-Solution on the same example: since M[8] == M[7], job 8 is skipped; since M[7] > M[6], job 7 is included and the trace continues from p(7); the recursion proceeds until j = 0.]
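The traceback can also be written iteratively. A minimal Python sketch, assuming M and p are the 1-indexed tables produced by Iterative-Compute-Opt (the function name is mine):

```python
def find_solution(M, p, n):
    """Recover one optimal job set from the filled DP table.

    M[j] = optimal value over jobs 1..j; p[j] = largest index
    compatible with job j. Both are 1-indexed lists with M[0] = 0.
    """
    selected = []
    j = n
    while j > 0:
        if M[j] == M[j - 1]:   # job j is not needed in an optimal solution
            j -= 1
        else:                  # job j is selected; continue from p(j)
            selected.append(j)
            j = p[j]
    return selected[::-1]      # report jobs in increasing order
```

For instance, with M = [0, 2, 4, 6, 11] and p = [0, 0, 0, 1, 2], the traceback selects jobs 2 and 4.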

  6. Dynamic programming formula, revisited:
     - Divide a problem into a polynomial number of smaller subproblems. We often think recursively to identify the subproblems.
     - Solve each subproblem, recording its answer.
     - Build up the answer to the bigger problem using the stored answers of smaller subproblems. We develop an algorithm that builds up the answer iteratively.

     Mapping this onto the iterative solution:

     Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n.
     Compute p(1), p(2), ..., p(n)

     Iterative-Compute-Opt {
         M[0] = 0                                // the entries M[0..n] are the subproblems
         for j = 1 to n
             M[j] = max(v_j + M[p(j)], M[j-1])   // each assignment records a subproblem answer
     }

  7. The max in Iterative-Compute-Opt is where subproblem answers are combined: M[j] = max(v_j + M[p(j)], M[j-1]).

     Least Squares. A foundational problem in statistics and numerical analysis. Given n points in the plane (x_1, y_1), (x_2, y_2), ..., (x_n, y_n), find a line y = ax + b that minimizes the sum of squared errors:

         SSE = sum_{i=1}^{n} (y_i - a*x_i - b)^2

     Least Squares Solution. A result from calculus: the least squares error is achieved when

         a = ( n * sum_i x_i*y_i - (sum_i x_i)(sum_i y_i) ) / ( n * sum_i x_i^2 - (sum_i x_i)^2 )
         b = ( sum_i y_i - a * sum_i x_i ) / n
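The closed-form solution translates directly into code. A small Python sketch (the function name is mine; it assumes the points do not all share one x value, so the denominator is nonzero):

```python
def least_squares_fit(pts):
    """Closed-form least-squares line y = a*x + b for pts = [(x, y), ...]."""
    n = len(pts)
    sx = sum(x for x, _ in pts)        # sum of x_i
    sy = sum(y for _, y in pts)        # sum of y_i
    sxy = sum(x * y for x, y in pts)   # sum of x_i * y_i
    sxx = sum(x * x for x, _ in pts)   # sum of x_i^2
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b
```

On points that already lie on a line, such as (0, 1), (1, 3), (2, 5), the fit recovers that line exactly (a = 2, b = 1).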

  8. Least Squares. Sometimes a single line does not work very well. [Figure: points tracking several distinct linear trends.]

     Segmented Least Squares. The points lie roughly on a sequence of several line segments. Given n points in the plane (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) with x_1 < x_2 < ... < x_n, find a sequence of lines that fits them well.

     Find a sequence of lines that minimizes both the errors E in each segment and the number of lines L. Tradeoff function: E + cL, for some constant c > 0.

  9. Calculating Line Segment Error. First calculate the slope a and y-intercept b of the least squares line for the segment's points (using the closed-form formulas above), then calculate the error associated with that line: SSE = sum_i (y_i - a*x_i - b)^2.

     Multiway Choice. OPT(j) = minimum cost for points p_1, p_2, ..., p_j. e(i, j) = minimum error of a single segment through points p_i, p_{i+1}, ..., p_j. To compute OPT(j): the last segment uses points p_i, p_{i+1}, ..., p_j for some i, at cost e(i, j) + c + OPT(i-1). Minimizing over i:

         OPT(j) = 0                                             if j = 0
                = min_{1 <= i <= j} { e(i, j) + c + OPT(i-1) }  otherwise

     Segmented Least Squares: Algorithm

     Segmented-Least-Squares() {
         sort p_1..p_n by their x values
         for j = 1 to n
             for i = 1 to j
                 compute the least squares error e[i][j] for the segment p_i, ..., p_j
         M[0] = 0
         for j = 1 to n
             M[j] = min_{1 <= i <= j} (e[i][j] + c + M[i-1])
         return M[n]
     }

     Slides16 - Segmented Least Squares.key - March 27, 2019

  10. Running time. Annotating the algorithm with costs:
      - Sorting p_1..p_n by x value: O(n log n).
      - Computing the least squares error e[i][j] for all O(n^2) pairs (i, j): O(n^3).
      - Filling M[1..n], where each M[j] minimizes over j candidates: O(n^2).
      Total: O(n^3).
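Putting the pieces together, the full O(n^3) algorithm can be sketched in Python. This is my own rendering, not the slides' code: the names are mine, the running sums accumulate each segment's statistics incrementally, and a single-point segment (zero denominator) is treated as zero error.

```python
def segmented_least_squares(pts, c):
    """Minimum of (total segment SSE) + c * (number of segments).

    pts: list of (x, y) points; c: per-segment penalty, c > 0.
    """
    pts = sorted(pts)  # sort points by x value
    n = len(pts)

    # e[i][j] = SSE of the best single line through points p_i..p_j (1-indexed).
    e = [[0.0] * (n + 1) for _ in range(n + 1)]
    for j in range(1, n + 1):
        sx = sy = sxy = sxx = 0.0
        for i in range(j, 0, -1):
            x, y = pts[i - 1]
            sx += x; sy += y; sxy += x * y; sxx += x * x
            m = j - i + 1
            denom = m * sxx - sx * sx
            a = (m * sxy - sx * sy) / denom if denom != 0 else 0.0
            b = (sy - a * sx) / m
            e[i][j] = sum((pts[k][1] - a * pts[k][0] - b) ** 2
                          for k in range(i - 1, j))

    # M[j] = min over i of e[i][j] + c + M[i-1].
    M = [0.0] * (n + 1)
    for j in range(1, n + 1):
        M[j] = min(e[i][j] + c + M[i - 1] for i in range(1, j + 1))
    return M[n]
```

Three collinear points cost exactly one segment penalty (c), while points like (0,0), (1,1), (2,0) with c = 0.5 are better served by two exact segments (cost 1.0) than by one approximate line (cost 2/3 + 0.5).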
