CMSC 641: Algorithms




  1. CMSC 641: Algorithms
     NP Completeness
     Koustuv Dasgupta

     Review: Dynamic Programming
     ● When applicable:
       ■ Optimal substructure: an optimal solution to the problem consists of optimal solutions to subproblems
       ■ Overlapping subproblems: few subproblems in total, many recurring instances of each
     ● Basic approach:
       ○ Build a table of solved subproblems that are used to solve larger ones
     ● What is the difference between memoization and dynamic programming?
       ○ Memoization employs a top-down strategy
       ○ Advantage: only the subproblems actually needed are solved
       ○ Disadvantage: the overhead of recursion
       (a sketch contrasting the two strategies follows this page)

     Review: Greedy Algorithms
     ● A greedy algorithm always makes the choice that looks best at the moment
       ■ The hope: a locally optimal choice will lead to a globally optimal solution
       ■ For some problems, it works
         ○ Yes: the fractional knapsack problem
         ○ No: playing a bridge hand
     ● Dynamic programming can be overkill; greedy algorithms tend to be easier to code

     Review: Activity-Selection Problem
     ● The activity-selection problem: get your money's worth out of a carnival
       ■ Buy a wristband that lets you onto any ride
       ■ Lots of rides, starting and ending at different times
       ■ Your goal: get onto as many rides as possible
     ● Naïve first-year CS major strategy:
       ■ Ride the first ride, get off, get on the very next ride possible, repeat until the carnival ends
     ● What is the sophisticated third-year strategy? (see the greedy sketch after this page)

     Review: Activity-Selection
     ● Formally:
       ■ Given a set S of n activities
         ○ s_i = start time of activity i
         ○ f_i = finish time of activity i
       ■ Find a max-size subset A of compatible activities
       ■ Assume activities are sorted by finish time
     ● What is the optimal substructure for this problem?
       ■ If k is the activity in A with the earliest finish time, then A − {k} is an optimal solution to S* = { i ∈ S : s_i ≥ f_k }
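To make the memoization versus dynamic-programming contrast on the first review slide concrete, here is a minimal sketch that is not part of the original deck; it uses the Fibonacci numbers as a stand-in subproblem structure, and the function names fib_memo and fib_dp are illustrative only.

```python
from functools import lru_cache

# Top-down memoization: recursion plus a cache. Only the subproblems the
# recursion actually reaches are solved ("selective" subproblems), at the
# cost of recursive-call overhead.
@lru_cache(maxsize=None)
def fib_memo(n):
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

# Bottom-up dynamic programming: fill a table of solved subproblems and
# use them to solve larger ones, with no recursion at all.
def fib_dp(n):
    table = [0, 1] + [0] * max(0, n - 1)
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

# Both compute the same values, e.g. fib_memo(10) == fib_dp(10) == 55.
```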
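The "sophisticated third-year strategy" asked about above is the greedy choice analyzed on the next page: always take the compatible activity with the earliest finish time. A minimal sketch under the assumption that activities are given as (start, finish) pairs; the function name select_activities is mine, not the deck's.

```python
def select_activities(activities):
    """Greedy activity selection: repeatedly pick the compatible activity
    with the earliest finish time.

    activities: list of (start, finish) pairs.
    Returns a maximum-size list of mutually compatible activities.
    """
    # The slides assume activities are sorted by finish time; sort here so
    # the sketch is self-contained.
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:   # compatible with everything chosen so far
            chosen.append((start, finish))
            last_finish = finish
    return chosen

# Example: rides (1, 4), (3, 5), (0, 6), (5, 7), (5, 9), (8, 11)
# -> picks (1, 4), (5, 7), (8, 11).
```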

  2. Review: Greedy Choice Property for Activity Selection
     ● Dynamic programming? Memoize? Yes, but…
     ● The activity-selection problem also exhibits the greedy choice property:
       ■ A locally optimal choice ⇒ a globally optimal solution
       ■ Pick the activity a* with the earliest finish time
       ■ If S is an activity-selection problem sorted by finish time, then there exists an optimal solution A ⊆ S such that a* ∈ A
         ○ Sketch of proof: if there is an optimal solution B that does not contain a*, we can always replace the first activity b* in B with a* (why?). Same number of activities, thus optimal.

     Review: The Knapsack Problem
     ● The 0-1 knapsack problem:
       ■ A thief must choose among n items, where the i-th item is worth v_i dollars and weighs w_i pounds
       ■ Carrying at most W pounds, maximize the value taken
     ● A variation, the fractional knapsack problem:
       ■ The thief can take fractions of items
       ■ Think of the items in the 0-1 problem as gold ingots, and in the fractional problem as buckets of gold dust
     ● What greedy choice algorithm works for the fractional problem but not the 0-1 problem?
       ■ Value per pound: v_i / w_i (see the sketch after this page)

     NP-Completeness
     ● Some problems are intractable: as they grow large, we are unable to solve them in reasonable time
     ● What constitutes reasonable time? Standard working definition: polynomial time
       ■ On an input of size n, the worst-case running time is O(n^k) for some constant k
       ■ Polynomial time: O(n^2), O(n^3), O(1), O(n lg n)
       ■ Not polynomial time: O(2^n), O(n^n), O(n!)

     Polynomial-Time Algorithms
     ● Are some problems solvable in polynomial time?
       ■ Of course: every algorithm you have studied provides a polynomial-time solution to some problem
       ■ We define P to be the class of problems solvable in polynomial time
     ● Are all problems solvable in polynomial time?
       ■ No: Turing's "Halting Problem" is not solvable by any computer, no matter how much time is given
         ○ Undecidable problems
       ■ Such problems are clearly intractable, and not in P

     NP-Complete Problems
     ● The NP-Complete problems are an interesting class of problems whose status is unknown
       ■ No polynomial-time algorithm has been discovered for any NP-Complete problem
       ■ No superpolynomial lower bound has been proved for any NP-Complete problem, either
     ● We call this the P = NP question
       ■ The biggest open problem in CS?
       ■ Maybe … who am I to say

     An NP-Complete Problem: Hamiltonian Cycles
     ● An example of an NP-Complete problem:
       ■ A Hamiltonian cycle of an undirected graph is a simple cycle that contains every vertex
       ■ The Hamiltonian-cycle problem: given a graph G, does it have a Hamiltonian cycle?
       ■ Describe a naïve algorithm for solving the Hamiltonian-cycle problem. Running time? (see the brute-force sketch after this page)
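The greedy choice named on the knapsack slide, taking items in decreasing order of value per pound, solves the fractional problem but can fail on the 0-1 problem. A minimal sketch assuming items are (value, weight) pairs with positive weights; the function name is illustrative.

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: take items in decreasing order of
    value per pound, splitting the last item taken if needed.

    items: list of (value, weight) pairs; capacity: maximum total weight.
    Returns the maximum total value attainable.
    """
    total = 0.0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)       # gold dust: any fraction is allowed
        total += value * (take / weight)
        capacity -= take
    return total

# With ingots (0-1 items) the same rule can fail: for capacity 50 and items
# (60, 10), (100, 20), (120, 30), greedy-by-ratio takes 60 + 100 = 160,
# but the optimal 0-1 choice is 100 + 120 = 220.
```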
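One naïve answer to the Hamiltonian-cycle question above is brute force over vertex orderings, which takes roughly O(n!) time. A sketch under the assumption that the graph is given as a dictionary mapping each vertex to a set of its neighbors; this representation and the function name are mine, not the deck's.

```python
from itertools import permutations

def has_hamiltonian_cycle(graph):
    """Brute-force Hamiltonian-cycle test.

    graph: dict mapping each vertex to the set of its neighbors
    (undirected, so adjacency is symmetric).
    Tries every ordering of the vertices: roughly O(n!) orderings,
    each checked in O(n) time.
    """
    vertices = list(graph)
    if len(vertices) < 3:                  # a simple cycle needs at least 3 vertices
        return False
    first, rest = vertices[0], vertices[1:]
    for perm in permutations(rest):        # fix the first vertex to skip rotations
        cycle = [first, *perm]
        edges_ok = all(cycle[i + 1] in graph[cycle[i]]
                       for i in range(len(cycle) - 1))
        if edges_ok and cycle[0] in graph[cycle[-1]]:
            return True
    return False

# Example: a 4-cycle {0:{1,3}, 1:{0,2}, 2:{1,3}, 3:{2,0}} -> True;
# a star graph (all edges through one center) -> False.
```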

  3. P and NP
     ● As mentioned, P is the set of problems that can be solved in polynomial time
     ● NP (nondeterministic polynomial time) is the set of problems that can be solved in polynomial time by a nondeterministic computer
       ■ What is that?

     Nondeterminism
     ● Think of a nondeterministic computer as a computer that magically "guesses" a solution, then has to verify that it is correct
       ■ If a solution exists, the computer always guesses it
       ■ One way to imagine it: a parallel computer that can freely spawn an infinite number of processes
         ○ Have one processor work on each possible solution
         ○ All processors attempt to verify that their solution works
         ○ If a processor finds one, we have a working solution
       ■ So: NP = problems verifiable in polynomial time

     P and NP
     ● Summary so far:
       ■ P = problems that can be solved in polynomial time
       ■ NP = problems for which a solution can be verified in polynomial time
       ■ Unknown whether P = NP (most suspect not)
     ● The Hamiltonian-cycle problem is in NP:
       ■ No polynomial-time algorithm for solving it is known
       ■ It is easy to verify a solution in polynomial time (how? see the verifier sketch after this page)

     NP-Complete Problems
     ● We will see that NP-Complete problems are the "hardest" problems in NP:
       ■ If any one NP-Complete problem can be solved in polynomial time…
       ■ …then every NP-Complete problem can be solved in polynomial time…
       ■ …and in fact every problem in NP can be solved in polynomial time (which would show P = NP)
       ■ Thus: solve the Hamiltonian-cycle problem in O(n^100) time and you have proved that P = NP. Retire rich & famous.

     Reduction
     ● The crux of NP-Completeness is reducibility
       ■ Informally, a problem P can be reduced to another problem Q if any instance of P can be "easily rephrased" as an instance of Q, the solution to which provides a solution to the instance of P
         ○ What do you suppose "easily" means?
         ○ This rephrasing is called a transformation
       ■ Intuitively: if P reduces to Q, then P is "no harder to solve" than Q

     Reducibility
     ● An example:
       ■ P: Given a set of Booleans, is at least one TRUE?
       ■ Q: Given a set of integers, is their sum positive?
       ■ Transformation: (x_1, x_2, …, x_n) → (y_1, y_2, …, y_n) where y_i = 1 if x_i = TRUE and y_i = 0 if x_i = FALSE (a small sketch follows this page)
     ● Another example:
       ■ Solving linear equations is reducible to solving quadratic equations
         ○ We can easily use a quadratic-equation solver to solve linear equations
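For the "how?" on the P and NP summary slide: a claimed Hamiltonian cycle serves as a certificate that can be checked quickly. A minimal polynomial-time verifier sketch, assuming the same adjacency-set representation as the brute-force sketch above; the function name is illustrative.

```python
def verify_hamiltonian_cycle(graph, cycle):
    """Check a claimed Hamiltonian cycle in polynomial time.

    graph: dict mapping each vertex to the set of its neighbors.
    cycle: proposed ordering of the vertices (the certificate).
    Runs in O(n) time given constant-time adjacency lookups.
    """
    # The certificate must visit every vertex exactly once.
    if len(cycle) != len(graph) or set(cycle) != set(graph):
        return False
    # Consecutive vertices, including the wrap-around pair, must be adjacent.
    return all(cycle[(i + 1) % len(cycle)] in graph[cycle[i]]
               for i in range(len(cycle)))
```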
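The transformation on the reducibility slide, from "is at least one Boolean TRUE?" to "is the sum of a set of integers positive?", is small enough to write out. A sketch; the function name is hypothetical.

```python
def any_true_via_sum(booleans):
    """Reduce 'is at least one Boolean TRUE?' to 'is the sum of a set of
    integers positive?': map TRUE -> 1 and FALSE -> 0, then ask the second
    question about the transformed instance.
    """
    integers = [1 if x else 0 for x in booleans]   # the transformation y_i
    return sum(integers) > 0                       # solve the instance of Q

# any_true_via_sum([False, False, True]) -> True
```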
