

  1. Applied Algorithm Design, Lecture 6. Pietro Michiardi, Institut Eurécom

  2. Local Search Algorithms

  3. Introduction
     Local search (LS) is a very general technique. It describes an algorithm that “explores” the solution space in a sequential fashion, moving in each step from a current solution to a “nearby” one.
     The main advantage of this technique is that it is very easy to design and implement; furthermore, LS can address almost any kind of computationally hard problem.

  4. Introduction
     The main problem that derives from this flexibility is that it is often very difficult to say anything precise or provable about the quality of the solutions that a LS algorithm finds. As a consequence, it is very difficult to distinguish a good LS heuristic from a bad one.
     Today we will discuss a number of LS algorithms that find good, but not necessarily optimal, solutions. It is useful to proceed by relating LS to energy minimization principles in physics.

  5. Introduction
     Note that there are cases in which it is possible to prove properties of LS algorithms, and to bound their performance relative to an optimal solution. In fact, we will focus on a set of problems, Facility Location problems, for which it is possible to design LS algorithms that appear trivial. Hence, a major effort will go into:
     ◮ Proving that these algorithms achieve a solution close to the optimal one
     ◮ Finding a relation between approximation and LS algorithms

  6. The Landscape of an Optimization Problem
     Much of the core of local search was developed by people thinking in terms of analogies with physics. Looking at a range of hard computational problems, they reasoned as follows:
     ◮ Physical systems are performing minimization all the time
     ◮ They do so when they try to minimize their potential energy
     ◮ What can we learn from the ways nature performs minimization?

  7. Potential Energy
     Example: we now focus on a simple example, the movement of a sphere (ball). Why do balls roll downhill? From Newtonian mechanics, the ball is trying to minimize its potential energy: if a ball has a mass m and falls a distance h, it loses an amount of potential energy proportional to mh.
     (Figure: a funnel)
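     To make the proportionality explicit (a standard fact from Newtonian mechanics, not spelled out on the slide): the constant is the gravitational acceleration g, assumed constant near the surface, so the potential energy lost in the fall is

         \Delta U = m g h

     which is indeed proportional to mh for fixed g.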

  8. Potential Energy
     If we make the landscape a bit more complicated, some extra issues creep in.
     (Figure: a jagged funnel)
     There is a lower point in the figure, where it would be more desirable for the ball to come to rest. But if we start rolling the ball from the wrong side, it will not be able to get over the barrier and settle in the lowest point.

  9. Potential Energy
     We say that the ball is trapped in a local minimum: a local minimum looks like the lowest point from neighboring locations. However, if we look at the whole landscape we know that there is a global minimum.

  10. The Connection to Optimization
     The physical system can be in one of a large number of possible states, and its energy is a function of its current state. From a given state, a small perturbation leads to a “neighboring” state. The way these neighboring states are linked together, along with the structure of the energy function on them, defines the underlying energy landscape.

  11. The Connection to Optimization
     Relation to optimization problems: in an optimization problem, we have a large (typically exponential-size) set C of possible solutions, and a cost function c(·) that measures the quality of each solution. For a solution S ∈ C we write its cost as c(S).
     The goal: find a solution S* ∈ C for which c(S*) is as small as possible.

  12. The Connection to Optimization
     Neighborhoods: we now define precisely what we mean by neighboring solutions. We capture the idea that S′ can be obtained by a small modification of another solution S. We write S′ ∼ S to denote that S′ is a neighboring solution of S, and we use N(S) to denote the neighborhood of S:

         N(S) = {S′ : S ∼ S′}

  13. The Connection to Optimization
     The set C of possible solutions and the cost function c(·) are provided by the specification of the problem; the neighbor relation, instead, is up to us to define.
     Local Search Algorithm: a local search algorithm takes this setup, including a neighbor relation, and works according to the following scheme (see the sketch after this slide):
     ◮ At all times it maintains a current solution S ∈ C
     ◮ In a given step, it chooses a neighbor S′ of S, declares S′ to be the new current solution, and iterates
     ◮ Throughout its execution, it remembers the minimum-cost solution seen so far
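     A minimal Python sketch of this scheme. Everything here is illustrative: the slides do not fix a rule for choosing the neighbor, so this sketch picks one uniformly at random and stops after a fixed step budget.

        import random

        def local_search(initial, cost, neighbors, steps=1000):
            """Generic local search: keep a current solution, move to some
            neighboring solution at each step, and remember the
            minimum-cost solution seen so far."""
            current = best = initial
            for _ in range(steps):
                candidates = list(neighbors(current))
                if not candidates:
                    break
                current = random.choice(candidates)  # one possible choice rule
                if cost(current) < cost(best):
                    best = current
            return best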

  14. The Connection to Optimization
     The crux of a LS algorithm is the choice of the neighbor relation, and the design of the rule for choosing a neighboring solution at each step. One can think of a neighbor relation as defining a graph on the set of all possible solutions, where edges join neighboring pairs of solutions. A LS algorithm can then be viewed as performing a walk on this graph, trying to move toward a good solution.

  15. The Vertex Cover Problem
     The Vertex Cover Problem: you are given a graph G = (V, E). A set S ⊆ V is a vertex cover if each edge e ∈ E has at least one end in S.
     The Weighted Vertex Cover Problem: you are given a graph G = (V, E), where each node i ∈ V has a weight w_i ≥ 0. The weight of a set S of vertices is w(S) = Σ_{i∈S} w_i. A set S ⊆ V is a vertex cover if each edge e ∈ E has at least one end in S. Find a vertex cover S of minimum weight w(S).
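     A small Python sketch of these two definitions; the function names and the tiny example graph are mine, not from the slides.

        def is_vertex_cover(S, edges):
            """S covers the graph if every edge has at least one end in S."""
            return all(u in S or v in S for (u, v) in edges)

        def weight(S, w):
            """w(S) = sum of w_i over i in S; with all weights equal to 1
            this reduces to the unweighted cover size |S|."""
            return sum(w[i] for i in S)

        # Example: on the path a-b-c, {b} is a cover but {a} is not.
        edges = [("a", "b"), ("b", "c")]
        assert is_vertex_cover({"b"}, edges) and not is_vertex_cover({"a"}, edges)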

  16. The Vertex Cover Problem
     So, we are given a graph G. The set C of possible solutions consists of all subsets S of V that form vertex covers:
     ◮ We always have that V ∈ C
     The cost c(S) of a vertex cover is simply its size if no weights are given, or its weight otherwise. With such a model, minimizing the cost of a vertex cover is the same as finding one of minimum size (or weight).

  17. The Vertex Cover Problem
     We focus on local search algorithms that use a particularly simple neighbor relation: we say that S ∼ S′ if S′ can be obtained from S by adding or deleting a single node. Thus, our local search algorithm walks through the space of possible vertex covers, adding or deleting a node from its current solution in each step, and trying to find as small a vertex cover as possible.

  18. The Vertex Cover Problem
     Proposition: each vertex cover S has at most n neighboring solutions.
     Proof: each neighboring solution of S is obtained by adding or deleting a distinct node, and there are at most n nodes in the graph G.
     A consequence of this proposition is that we can efficiently examine all neighboring solutions of S (see the sketch below).
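     The neighbor relation in Python, as one possible sketch: one neighbor per node, which makes the proposition concrete because the generator yields exactly |V| solutions.

        def neighbors(S, V):
            """Yield every solution obtained from S by adding or deleting
            a single node: exactly one neighbor per node of the graph."""
            for v in V:
                yield S - {v} if v in S else S | {v}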

  19. Gradient Descent Method
     Gradient descent: start with S = V. If there is a neighbor S′ that is a vertex cover and has lower cardinality, replace S with S′, and repeat.
     Proposition: the above algorithm terminates after at most n steps.
     Proof: each update decreases the size of the cover by one.
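     A minimal sketch of this procedure in Python. Since adding a node can only increase the cardinality, only deletions are tried; the slides do not fix the order in which nodes are tried, so this sketch takes the first improving deletion it finds.

        def is_vertex_cover(S, edges):
            return all(u in S or v in S for (u, v) in edges)

        def gradient_descent_cover(V, edges):
            """Start from S = V and move strictly downhill: delete a node
            whenever the smaller set is still a vertex cover."""
            S = set(V)
            improved = True
            while improved:
                improved = False
                for v in list(S):
                    if is_vertex_cover(S - {v}, edges):
                        S.remove(v)       # each update shrinks S by one,
                        improved = True   # so there are at most n updates
                        break
            return S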

  20. Gradient Descent Method
     Informally, gradient descent moves strictly “downhill” as long as it can; once this is no longer possible, it stops.
     Note: the gradient descent algorithm terminates precisely at solutions that are local minima, that is, solutions S such that for all neighboring S′ we have c(S) ≤ c(S′).

  21. Gradient Descent Method
     Example 1: think about an empty graph, that is, a graph with no edges. The empty set is the optimal solution (there are no edges to cover), and gradient descent does very well: it starts with the full vertex set V and keeps deleting nodes until there are none left.
     → There is a unique local minimum, which is also the unique global minimum.

  22. Gradient Descent Method
     Example 2
     (Figure: a star graph. Optimum = the center node only; local optimum = all the other nodes)

  23. Gradient Descent Method
     The minimum vertex cover for this G is the singleton containing only the center node. Gradient descent can reach this solution by successively deleting all the other nodes, in any order. But if the algorithm starts by deleting the center first, it is immediately stuck: it reaches a local minimum.
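     A self-contained Python check of this trap (the concrete star graph and its node names are mine): after deleting the center of a 5-leaf star, no single further deletion leaves a cover, so gradient descent is stuck at a cover of size 5 while the optimum has size 1.

        def is_vertex_cover(S, edges):
            return all(u in S or v in S for (u, v) in edges)

        V = {"c", 1, 2, 3, 4, 5}                 # center "c" plus 5 leaves
        edges = [("c", i) for i in range(1, 6)]  # star: c joined to every leaf

        S = V - {"c"}                            # bad first move: delete the center
        assert is_vertex_cover(S, edges)         # the leaves still cover every edge
        assert not any(is_vertex_cover(S - {v}, edges) for v in S)  # stuck: local minimum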
