Scientific Programming: Algorithms (part B) Programming paradigms - continued -


  1. Scientific Programming: Algorithms (part B) Programming paradigms - continued - Luca Bianco - Academic Year 2019-20 luca.bianco@fmach.it [credits: thanks to Prof. Alberto Montresor]

  2. Greedy algorithms

  3. Independent intervals

  4. Independent intervals: these three intervals are not maximal! Intervals are open on the right, hence these are disjoint.

  5. Independent intervals

  6. Path to the solution

  7. Optimal substructure: add two fictitious intervals, one (index 0) that ends at -∞ and one (index n+1) that starts at +∞. S[i,j] denotes the set of intervals that start after interval i ends and end before interval j starts.

  8. Optimal substructure: once we have found an interval k that belongs to the optimal solution A[i,j], we need to solve the two smaller subproblems S[i,k] and S[k,j].

  9. Optimal substructure: we want to prove that if A[i,j] is an optimal solution of S[i,j] and k is in A[i,j], then A[i,j] also optimally solves S[i,k] and S[k,j]. By contradiction: if, for example, the optimal solution of S[i,k] were better than the corresponding intervals in A[i,j], then A[i,j] would not be optimal.


  11. Recursive formula: DP[i,j] = 0 if S[i,j] is empty, otherwise DP[i,j] = max over k in S[i,j] of ( DP[i,k] + DP[k,j] + 1 ), where the +1 accounts for the interval k we chose.

  12. Dynamic programming top-down: DP[0,n]
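
As a concrete illustration, here is a minimal top-down (memoized) Python sketch of this DP. The slide's own code is not reproduced in this transcript; the function name, the (start, end) pair representation and the sentinel handling are assumptions of the sketch.

```python
from functools import lru_cache

def max_independent_dp(intervals):
    """Maximum number of pairwise disjoint intervals, top-down DP.

    intervals: list of (start, end) pairs, open on the right.
    Two sentinel intervals are added, one ending at -inf (index 0) and one
    starting at +inf (index n+1), so the full problem is dp(0, n + 1).
    """
    n = len(intervals)
    a = ([(float("-inf"), float("-inf"))]
         + sorted(intervals, key=lambda x: x[1])
         + [(float("inf"), float("inf"))])

    @lru_cache(maxsize=None)
    def dp(i, j):
        # S[i, j]: intervals that start after a[i] ends and end before a[j] starts
        best = 0
        for k in range(i + 1, j):
            if a[k][0] >= a[i][1] and a[k][1] <= a[j][0]:
                best = max(best, dp(i, k) + dp(k, j) + 1)  # +1: interval k is chosen
        return best

    return dp(0, n + 1)

print(max_independent_dp([(1, 3), (3, 5), (4, 7), (6, 8)]))  # 3
```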

  13. Complexity

  14. Greedy choice Proof


  16. Greedy choice (intervals m and m' in the figure); consequences of the theorem.

  17. Greedy algorithm (#first greedy choice, #other greedy choices). Complexity: if the input is not sorted, O(n log n + n) = O(n log n); if the input is sorted, O(n).

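Since the slides' code is not reproduced in this transcript, here is a minimal Python sketch of the greedy algorithm and its cost; the function and variable names are illustrative.

```python
def greedy_independent(intervals):
    """Greedy selection of a maximum set of pairwise disjoint intervals.

    Greedy choice: always take the interval that finishes first among those
    compatible with the last one chosen. Intervals are (start, end), open on
    the right, so an interval may start exactly where the previous one ends.
    """
    ordered = sorted(intervals, key=lambda x: x[1])  # O(n log n); skip if already sorted
    chosen = []
    last_end = float("-inf")
    for start, end in ordered:                       # O(n) greedy scan
        if start >= last_end:                        # compatible with the last chosen interval
            chosen.append((start, end))
            last_end = end
    return chosen

print(greedy_independent([(1, 3), (3, 5), (4, 7), (6, 8)]))
# [(1, 3), (3, 5), (6, 8)]
```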

  26. Genome rearrangements: [3, 5, 2, 4, 1] → [3, 2, 5, 4, 1] → [3, 2, 1, 4, 5] → [1, 2, 3, 4, 5]. Transformation of mouse gene order into human gene order on Chr X (biggest synteny blocks).

  27. Genome rearrangements

  28. Greedy solution

  29. Greedy solution. First example, in list: [2, 4, 1, 3, 0] → [0, 3, 1, 4, 2] → [0, 1, 3, 4, 2] → [0, 1, 2, 4, 3] → [0, 1, 2, 3, 4]. Second example, in list: [5, 0, 1, 2, 3, 4]: the greedy strategy needs five reversals, [5, 0, 1, 2, 3, 4] → [0, 5, 1, 2, 3, 4] → [0, 1, 5, 2, 3, 4] → [0, 1, 2, 5, 3, 4] → [0, 1, 2, 3, 5, 4] → [0, 1, 2, 3, 4, 5], whereas two are enough: [5, 0, 1, 2, 3, 4] → [4, 3, 2, 1, 0, 5] → [0, 1, 2, 3, 4, 5]. Simple but not optimal! Approximated algorithms exist...
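
A short Python sketch of this greedy strategy (bring element i into position i with one reversal per step); the function name is an assumption, not the slide's own code.

```python
def greedy_reversal_sort(perm):
    """Greedy 'sorting by reversals': at step i, reverse the block that brings
    element i into position i. Simple, but not always optimal (see the
    [5, 0, 1, 2, 3, 4] example, solved greedily in 5 reversals instead of 2)."""
    perm = list(perm)
    reversals = 0
    for i in range(len(perm)):
        j = perm.index(i)                            # where element i currently sits
        if j != i:
            perm[i:j + 1] = reversed(perm[i:j + 1])  # one reversal
            reversals += 1
            print(perm)
    return reversals

print("reversals:", greedy_reversal_sort([2, 4, 1, 3, 0]))
# [0, 3, 1, 4, 2], [0, 1, 3, 4, 2], [0, 1, 2, 4, 3], [0, 1, 2, 3, 4] -> reversals: 4
```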

  30. Backtracking

  31. Typical problems: we explore all possible solutions, building/enumerating them, and either counting them or stopping when we find one.

  32. Typical problems

  33. Typical problems

  34. Build all solutions

  35. Backtracking approach
  ● Try to build a solution: if it works you are done, else undo it and try again (“keep trying, you’ll get luckier”)
  ● Needs a systematic way to explore the search space looking for the admissible solution(s)

  36. General scheme

  37. Partial solutions

  38. Decision tree. Note: the decision tree is “virtual”, we do not need to store it all...


  41. Decision tree: process or ignore the solution. Note: the decision tree is “virtual”, we do not need to store it all...

  42. Decision tree: solution ignored. Note: the decision tree is “virtual”, we do not need to store it all...


  46. Decision tree: solution processed. Note: the decision tree is “virtual”, we do not need to store it all...

  47. Pruning: even though the tree might be exponential, with pruning we might not need to explore it all. Note: the decision tree is “virtual”, we do not need to store it all...

  48. Pruning: even though the tree might be exponential, with pruning we might not need to explore it all.

  49. General schema to find a solution (modify as you like). S is the list of choices, n is the maximum number of choices, i is the index of the choice I am currently making, plus other inputs. The recursive calls test all solutions unless they return True.
  1. We build the next choice with choices(...) based on the previous choices S[0:i-1]: the logic of the code goes here.
  2. For each possible choice, we memorize the choice in S[i].
  3. If S[i] is admissible then we process it, and we can either stop (if we needed at least one solution) or continue to the next one (return False).
  4. In the latter case we keep going, calling enumeration again to compute choice i+1.
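
A possible Python rendering of this general scheme; choices, admissible and process are the problem-specific hooks described above, and their exact signatures here are assumptions of this sketch.

```python
def enumeration(S, n, i, choices, admissible, process):
    """General backtracking scheme (a possible rendering of the slide's outline).

    S: list of choices, n: maximum number of choices, i: current choice index.
    choices(S, i)    -> iterable of candidate values for S[i]
    admissible(S, i) -> True when S[0:i+1] is a complete admissible solution
    process(S)       -> True to stop the enumeration, False to keep going
    """
    for c in choices(S, i):                 # 1. build the next choices from S[0:i]
        S[i] = c                            # 2. memorize the choice
        if admissible(S, i):                # 3. admissible solution: process it
            if process(S):                  #    True -> stop at this solution
                return True
        elif i + 1 < n:                     # 4. otherwise compute choice i+1
            if enumeration(S, n, i + 1, choices, admissible, process):
                return True
    return False

# Tiny usage example: enumerate the permutations of {0, 1, 2}
n = 3
enumeration([None] * n, n, 0,
            choices=lambda S, i: [v for v in range(n) if v not in S[:i]],
            admissible=lambda S, i: i == n - 1,
            process=lambda S: print(S) or False)   # False: we want all solutions
```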

  50. Enumeration

  51. Subsets problem. The choice at step i is: keep or discard element i; an admissible solution has decided whether to keep or discard all elements. False: we want all solutions (the enumeration never stops at the first one). Trace:
  subsets([0, 0, 0, 0, 0], 5, 0)
  Calling: subsets([1, 0, 0, 0, 0], 5, 1)
  subsets([1, 0, 0, 0, 0], 5, 1)
  Calling: subsets([1, 1, 0, 0, 0], 5, 2)
  subsets([1, 1, 0, 0, 0], 5, 2)
  Calling: subsets([1, 1, 1, 0, 0], 5, 3)
  subsets([1, 1, 1, 0, 0], 5, 3)
  Calling: subsets([1, 1, 1, 1, 0], 5, 4)
  subsets([1, 1, 1, 1, 0], 5, 4)
  S:[1, 1, 1, 1, 1] c:1 i:4 → 1 1 1 1 1
  S:[1, 1, 1, 1, 0] c:0 i:4 → 1 1 1 1 0
  Calling: subsets([1, 1, 1, 0, 0], 5, 4)
  subsets([1, 1, 1, 0, 0], 5, 4)
  S:[1, 1, 1, 0, 1] c:1 i:4 → 1 1 1 0 1
  S:[1, 1, 1, 0, 0] c:0 i:4 → 1 1 1 0 0
  Calling: subsets([1, 1, 0, 0, 0], 5, 3)
  subsets([1, 1, 0, 0, 0], 5, 3)
  ...
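
The code producing this trace is not included in the transcript; the following sketch is consistent with the calls shown above (the function and parameter names match the trace, the body is reconstructed).

```python
def subsets(S, n, i):
    """Enumerate all subsets of {0, ..., n-1}: S[i] = 1 keeps element i, 0 discards it.
    Reconstructed from the trace above; it always returns False because we want
    all solutions (the enumeration never stops early)."""
    for c in (1, 0):                  # choice: keep or discard element i
        S[i] = c
        if i == n - 1:                # admissible: every element has been decided
            print("S:", S, "c:", c, "i:", i)
        else:
            subsets(S, n, i + 1)
    return False

subsets([0, 0, 0, 0, 0], 5, 0)        # 11111 first, then the values decrease
```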

  52. Subsets problem ( → i.e. 2^n sets, printing each costs n)

  53. Subsets problem ( → i.e. 2^n sets, printing each costs n, hence O(n·2^n) overall) ( → 11111 is printed first and then the values decrease...)

  54. Subsets problem: iterative code (n=3, k=1)

  55. Subsets problem: iterative code (n=3, k=1). What is the complexity of this iterative code? We generate all subsets (cost: O(2^n)); creating each subset costs O(n) and printing it costs O(n), so the total is O(n·2^n). How many solutions are we testing? No pruning… can we improve this?
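
Since the iterative code itself is not in the transcript, here is one possible iterative version matching the description (all 2^n bit-vectors via binary counting, keeping those with k ones); names are illustrative.

```python
def subsets_iterative(n, k):
    """Iterative enumeration: generate all 2**n bit-vectors and print those
    with exactly k ones (a possible rendering of the slide's iterative code)."""
    for x in range(2 ** n):                      # all subsets: O(2^n)
        S = [(x >> j) & 1 for j in range(n)]     # build the bit-vector: O(n)
        if sum(S) == k:                          # keep only subsets of size k
            print(S)                             # printing costs O(n)

subsets_iterative(3, 1)   # [1, 0, 0], [0, 1, 0], [0, 0, 1]
```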

  56. Subsets problem: backtracking (n=3, k=1). Still generates 2^n subsets; for each it counts how many 1s are present and finally prints only the ones having the correct number of 1s (admissible solutions have k 1s; we want all solutions). What is the complexity of this backtracking code? How many solutions are we testing? No pruning… can we improve this?
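
A backtracking sketch consistent with the slide's description: it still generates all 2^n bit-vectors and filters by counting the 1s at the end; the names are assumptions.

```python
def subsets_k(S, n, i, k):
    """Backtracking without pruning: still generates all 2**n bit-vectors, then
    counts the 1s and prints only the admissible solutions (exactly k ones)."""
    for c in (1, 0):                  # keep or discard element i
        S[i] = c
        if i == n - 1:
            if sum(S) == k:           # count how many 1s: admissible solutions have k
                print(S)
        else:
            subsets_k(S, n, i + 1, k)
    return False

subsets_k([0] * 3, 3, 0, 1)           # [1, 0, 0], [0, 1, 0], [0, 0, 1]
```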

  57. Subsets problem: backtracking & pruning (n=3, k=1). Generate only solutions that can potentially be admissible! What is the complexity of this pruned backtracking code?
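
And a pruned variant, again an illustrative sketch rather than the slide's own code: partial solutions that can no longer reach exactly k ones are abandoned immediately, so only admissible solutions are generated.

```python
def subsets_k_pruned(S, n, i, k, ones=0):
    """Backtracking with pruning: abandon a partial solution as soon as it can
    no longer lead to exactly k ones (too many already, or too few positions left)."""
    if i == n:
        print(S)                                   # only admissible solutions get here
        return False                               # keep enumerating
    for c in (1, 0):
        new_ones = ones + c
        # prune: too many ones, or not enough positions left to reach k
        if new_ones > k or new_ones + (n - i - 1) < k:
            continue
        S[i] = c
        subsets_k_pruned(S, n, i + 1, k, new_ones)
    return False

subsets_k_pruned([0] * 3, 3, 0, 1)    # [1, 0, 0], [0, 1, 0], [0, 0, 1]
```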

  58. Sudoku

  59. Sudoku: pseudocode


  61. Sudoku: python code
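
The Python code from the slide is not reproduced in this transcript; what follows is a compact backtracking sketch under the assumption that the grid is a 9x9 list of lists with 0 marking empty cells.

```python
def valid(grid, r, c, v):
    """Check whether value v can be placed at (r, c): not already in the row,
    the column, or the 3x3 box."""
    if v in grid[r]:
        return False
    if any(grid[i][c] == v for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != v for i in range(3) for j in range(3))

def sudoku(grid, cell=0):
    """Backtracking Sudoku solver: grid is a 9x9 list of lists, 0 = empty cell.
    Fills grid in place and returns True as soon as one complete solution is found."""
    if cell == 81:
        return True                      # all cells filled: solution found
    r, c = divmod(cell, 9)
    if grid[r][c] != 0:                  # cell given in the puzzle: skip it
        return sudoku(grid, cell + 1)
    for v in range(1, 10):               # try every value
        if valid(grid, r, c, v):
            grid[r][c] = v               # make the choice
            if sudoku(grid, cell + 1):
                return True
            grid[r][c] = 0               # undo it and try the next value
    return False                         # no value works: backtrack
```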

  62. 8 queens puzzle

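A backtracking sketch for the 8 queens puzzle (counting solutions); the representation, with rows[i] holding the row of the queen placed in column i, is an assumption of this sketch.

```python
def queens(col=0, rows=()):
    """Count the solutions of the 8-queens puzzle with backtracking.
    rows[i] is the row of the queen placed in column i."""
    n = 8
    if col == n:
        return 1                                     # all queens placed: one solution
    count = 0
    for r in range(n):
        # prune: same row, or same diagonal, as a previously placed queen
        if all(r != pr and abs(r - pr) != col - pc for pc, pr in enumerate(rows)):
            count += queens(col + 1, rows + (r,))
    return count

print(queens())   # 92
```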
