SLIDE 1
Complexity Theory

So far in this course, almost all algorithms had polynomial running time, i.e., on inputs of size n, the worst-case running time is O(n^k) for some constant k. Is this always the case? Obviously no (just consider basic FF).
SLIDE 2
SLIDE 3
Obviously, P ⊆ NP. Study "NP-complete" problems. They are in NP and, in a sense, they are "hard" among all problems in NP (or: as hard as any other); this will be formalised later. We will prove later: if one can show for just one NP-complete problem that it is in fact in P, then P = NP. This seems to be difficult; so far nobody has succeeded ;-) Sometimes slight modifications to a problem make a huge difference: Euler tour vs Hamiltonian cycle. Euler tour: does a given directed graph have a cycle that traverses each edge exactly once? Easy, O(E). Hamiltonian cycle: does a given directed graph have a simple cycle that contains each vertex? NP-complete!
Complexity 3
SLIDE 4
Back to this verification business. Example: Hamiltonian cycle. As mentioned, NP-complete, so apparently hard; perhaps there is no polynomial-time algorithm that can compute a solution (for all instances). But: verifying a solution/certificate is trivial! The certificate is a sequence (v_1, v_2, ..., v_|V|) (some ordering of the vertices); we just have to check whether it is a proper cycle. Techniques for proving NP-completeness differ from the "usual" techniques for designing or analysing algorithms. In the following, a few key concepts.
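The certificate check just described can be sketched in a few lines (a sketch; the representation of `graph` as a dict of adjacency sets and the function name are our own choices):

```python
def verify_hamiltonian_cycle(graph, order):
    """Poly-time certificate check: is `order` (a permutation of the
    vertices) a Hamiltonian cycle in the directed graph `graph`?"""
    vertices = set(graph)
    # The certificate must list every vertex exactly once.
    if len(order) != len(vertices) or set(order) != vertices:
        return False
    # Every consecutive pair (including the wrap-around) must be an edge.
    return all(order[(i + 1) % len(order)] in graph[order[i]]
               for i in range(len(order)))

g = {1: {2}, 2: {3}, 3: {4}, 4: {1}}
print(verify_hamiltonian_cycle(g, [1, 2, 3, 4]))  # True: a proper cycle
print(verify_hamiltonian_cycle(g, [1, 3, 2, 4]))  # False: (1,3) is no edge
```

Note the check runs in time polynomial in |V|, even though finding such an ordering is (apparently) hard.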
SLIDE 5
1) Decision problems vs optimisation problems

Many problems are optimisation problems: compute shortest paths, maximum matching, maximum clique, maximum independent set, etc. The concept of NP-completeness does not apply (directly) to those, but to decision problems, where the answer is just "yes" or "no": Does this given graph have a Hamiltonian cycle? Given this graph G, two vertices u and v, and some k, does G contain a path of length at most k from u to v? However, there is usually a close relationship. In a sense, decision is "easier" than optimisation (or at least not harder). Example: we can solve the SP (shortest path) decision problem by solving the SP optimisation problem: just compare the length of the computed shortest path with k. General idea: evidence that the decision problem is hard implies evidence that the related optimisation problem is hard.
SLIDE 6
2) Encodings

When we want an algorithm to solve some problem, problem instances must be encoded. For instance, encode natural numbers as strings {0, 1, 10, 11, 100, 101, ...}: encoding e : ℕ → {0, 1}* with e(i) = the binary representation of i. Concrete problem: a problem whose instance set is the set of binary strings (read: encodings of "real" instances). We say an algorithm solves a concrete decision problem in time O(T(n)) if, when given a problem instance i of length n = |i|, it can produce the solution in time O(T(n)). A concrete problem is poly-time solvable if there is an algorithm that solves it in time O(n^k) for some constant k. P is the set of all poly-time solvable concrete decision problems.
SLIDE 7
Would like to talk about abstract problems rather than concrete ones, independent of any particular encoding.
- But. . . efficiency depends heavily on encoding.
Example: suppose integer k is the input to an algorithm whose running time is Θ(k). If k is given in unary, then the running time is O(n) on length-n inputs: polynomial. If k is given in binary, then the input length is n = ⌊log k⌋ + 1, and the running time Θ(k) = Θ(2^n) is exponential. In practice, rule out "expensive" encodings like unary.
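A tiny script makes the contrast concrete (illustrative names; the Θ(k) "algorithm" is a stand-in for any algorithm whose step count is proportional to the value of k):

```python
def algorithm_theta_k(k):
    """A toy algorithm with running time Theta(k): it counts to k."""
    steps = 0
    for _ in range(k):
        steps += 1
    return steps

k = 1000
steps = algorithm_theta_k(k)
unary_length = k                # |unary encoding of k| = k symbols
binary_length = k.bit_length()  # |binary encoding of k| = floor(log2 k) + 1

# In unary the step count equals the input length (linear in n).
# In binary the input has only 10 symbols, yet the algorithm still
# performs ~2^10 steps: exponential in the input length n.
print(steps, unary_length, binary_length)
```

The same algorithm is "polynomial" or "exponential" depending purely on the encoding, which is why unary is ruled out.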
SLIDE 8
A formal-language framework
An alphabet Σ is a finite set of symbols. A language L over Σ is any set of strings made up of symbols from Σ. The empty string is ε, the empty language is ∅. The language of all strings over Σ is Σ*. Ex: Σ = {0, 1} ⇒ Σ* = {ε, 0, 1, 00, 01, 10, 11, 000, ...}, the set of all binary strings. Every language L over Σ is a subset of Σ*. Union L1 ∪ L2 and intersection L1 ∩ L2 work just as with ordinary sets; the complement is L̄ = Σ* − L. The concatenation of L1 and L2 is {xy : x ∈ L1, y ∈ L2}. The closure of L is {ε} ∪ L ∪ L² ∪ L³ ∪ ⋯, with L^k being L concatenated with itself k times.
SLIDE 9
The set of instances for any decision problem Q is Σ*, where Σ = {0, 1}. Q is entirely characterised by those instances that produce 1 (yes), so we view Q as a language L over Σ = {0, 1} with L = {x ∈ Σ* : Q(x) = 1}. Ex (<something> meaning the "standard" encoding of something):
HAMILTON = {<G> : G contains a Hamiltonian cycle}
SAT = {<F> : formula F is satisfiable}
SLIDE 10
Encoding: using formal languages is "fine"
We say a function f : {0, 1}* → {0, 1}* is polynomially computable if there is a polynomial-time TM M that, given x ∈ {0, 1}*, computes f(x). For a set I of problem instances, two encodings e1 and e2 are polynomially related if there are two poly-time computable functions f12 and f21 such that ∀i ∈ I, f12(e1(i)) = e2(i) and f21(e2(i)) = e1(i). If two encodings of some abstract problem are polynomially related, then whether the problem is in P is independent of which encoding we use. Note: the length can increase only by a polynomial factor! It can be shown that the "standard" encodings (graph as an adjacency matrix, ...) are polynomially related.
SLIDE 11
- Lemma. Let Q be an abstract decision problem on an instance set I, and let e1, e2 be polynomially related encodings on I. Then e1(Q) ∈ P iff e2(Q) ∈ P.
- Proof. We show one direction (e1(Q) ∈ P ⇒ e2(Q) ∈ P), other is symmetric.
Suppose e1(Q) ∈ P, i.e., e1(Q) can be solved in time O(n^k) for some constant k, and that for any instance i, e1(i) can be computed from e2(i) in time O(n^c) for some constant c, where n = |e2(i)|. To solve e2(Q) on input e2(i), first compute e1(i), then run the algorithm for e1(Q) on e1(i). Time: the conversion takes O(n^c), and therefore |e1(i)| = O(n^c) (it can't write more). Solving e1(i) takes O(|e1(i)|^k) = O(n^(ck)), polynomial. Thus e2(Q) ∈ P.
SLIDE 12
3) Machine Model

Our goal is to say that problem A cannot be solved in polynomial time. But on which machine? A parallel machine with 1000 processors? A modern computer? Can we solve more in polynomial time if computers get faster? In complexity theory one uses a very simple machine model, the so-called Turing machine. One can show that everything that can be solved by a modern computer in polynomial time can also be solved on a TM in polynomial time.
SLIDE 13
Machine Model
A TM (Turing machine) consists of a tape of infinite length (in one direction) and a pointer. The tape consists of cells, and every cell can store one symbol. The pointer points to one of the memory cells. In the beginning the input is in cells 1, ..., n and the pointer position is cell 1. The TM has a finite control: in every step the control is in one of a finite number of states.
In every step the TM does the following.
- It reads the symbol at the current pointer position.
- It writes a new symbol into that position (which one depends on the state).
- It can move the pointer one step to the left or to the right (or leave it where it is).
SLIDE 14
Formally, a Turing Machine M = (Q, Σ, Γ, δ, q0, B, F) is defined as follows.
- Q is the (finite) set of states.
- Γ is the set of tape symbols.
- Σ is the set of input symbols (Σ ⊂ Γ).
- B ∈ Γ − Σ is the blank symbol.
- δ : Q × Γ → Q × Γ × {L, R, N} is the transition function (L/R: move left/right, N: no move).
- q0 ∈ Q is the initial state.
- F ⊆ Q is the set of final (accepting) states.
SLIDE 15
Configuration of the TM:
α1 q α2 with q ∈ Q and α1, α2 ∈ Γ*; here α = α1α2 is the contents of the tape. If now α2 = a α′2 (a ∈ Γ is the symbol under the pointer), the TM looks up δ(q, a) = (q′, β, M).
- If M = N, the next configuration will be α1 q′ β α′2.
- If M = R, the next configuration will be α1 β q′ α′2.
- If M = L and α1 = α′1 γ, the next configuration will be α′1 q′ γ β α′2.
We write α1 q α2 ⇒ α′1 q′ α′2 if a one-step transition from α1 q α2 to α′1 q′ α′2 exists. We write α1 q α2 ⇒* α′1 q′ α′2 if a (possibly long) sequence of transitions from α1 q α2 to α′1 q′ α′2 exists. Examples for TMs: see homework.
SLIDE 16
The language L accepted by TM M is the set of words from Σ* on which M reaches a state in F. To make our life easier, we will say in the following that M outputs 1. In all other cases M rejects the input: it can stop in a state not in F and output 0, or M can go into an endless loop.
SLIDE 17
We say TM M accepts string x ∈ {0, 1}* if, given input x, M's output is M(x) = 1. The language accepted by M is L = {x ∈ Σ* : M(x) = 1}. M rejects x if M(x) = 0. Important: even if L is accepted by some TM M, M will not necessarily reject x ∉ L (it may loop forever). A language L is decided by TM M if L is accepted by M and every string not in L is rejected. L is accepted/decided in polynomial time by TM M if it is accepted/decided by M in time O(n^k) for some constant k. Now an alternative definition: P = {L ⊆ {0, 1}* : ∃ TM M that decides L in poly-time}.
SLIDE 18
Moreover:
- Theorem. P = {L : L is accepted by a TM in poly-time}.
- Proof. Clearly decided⇒accepted, so we need only show accepted⇒decided.
Let L be accepted by M in time O(n^k); thus there is a constant c s.t. M accepts L in at most T = cn^k steps. Construct M′: for any input string x, M′ simulates M for T steps. If M has accepted, then M′ accepts x by outputting 1; otherwise M′ rejects by outputting 0. q.e.d. Note: the proof is nonconstructive; we may not know the runtime bound for a given L ∈ P (but it exists!).
SLIDE 19
Poly-time verification
Look at TMs that verify membership in languages. Ex: for an instance <G> of HAMILTON, we are also given an ordering c of the vertices on the cycle. We can easily check whether c is a proper cycle on all vertices of G, so c is a certificate that G indeed belongs to HAMILTON. As we will see later, HAMILTON is NP-complete, and thus most likely not in P, but verification is easy. Define a verification TM: a two-argument TM M, where one argument is the ordinary input string x and the other is a binary string y called certificate. M verifies input string x if ∃ a certificate y s.t. M(x, y) = 1. The language verified by M is L = {x ∈ {0, 1}* : ∃y ∈ {0, 1}* s.t. M(x, y) = 1}. Intuition: M verifies L if ∀x ∈ L there is a y that M can use to prove x ∈ L. Moreover, for any x ∉ L, there must be no certificate "proving" that x ∈ L. Ex: if G ∉ HAMILTON, there must be no permutation of the vertices that can fool the verifier into believing G is Hamiltonian.
SLIDE 20
Def: a language L belongs to NP if and only if ∃ a two-input poly-time TM M and a constant c s.t. L = {x ∈ {0, 1}* : ∃ certificate y with |y| = O(|x|^c) such that M(x, y) = 1}. Historically, NP = "non-deterministic poly-time": all the problems that are accepted by a poly-time non-deterministic Turing machine (NTM), which
1. non-deterministically "guesses" a solution (certificate) if there is one, and
2. deterministically verifies it
in poly-time. See the similarity? Facts: NP ≠ ∅ (HAMILTON ∈ NP), P ⊆ NP.
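The guess-and-verify view can be simulated deterministically, at exponential cost, by enumerating all candidate certificates (a sketch; `verifier` plays the role of the two-input TM M, and the toy COMPOSITE verifier below is our own example):

```python
from itertools import product

def decide_by_guessing(x, verifier, max_len):
    """Deterministic simulation of "guess a certificate, then verify":
    try every binary certificate y with |y| <= max_len and accept iff
    the verifier accepts some (x, y). Exponential time, of course."""
    for length in range(max_len + 1):
        for bits in product('01', repeat=length):
            if verifier(x, ''.join(bits)):
                return True
    return False

# Toy verifier for COMPOSITE = {binary n : n has a nontrivial divisor};
# the certificate y encodes a candidate divisor of n.
def divisor_verifier(x, y):
    n, d = int(x, 2), int(y, 2) if y else 0
    return 1 < d < n and n % d == 0

print(decide_by_guessing('1111', divisor_verifier, 4))  # True: 15 = 3 * 5
print(decide_by_guessing('111', divisor_verifier, 3))   # False: 7 is prime
```

The certificate length bound |y| = O(|x|^c) is exactly what keeps the inner verification polynomial while the outer enumeration is what a deterministic machine must pay for the "guess".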
SLIDE 21
Open questions: P = NP or P ≠ NP? Is NP closed under complement, i.e., L ∈ NP ⇒ L̄ ∈ NP? (With co-NP = {L : L̄ ∈ NP}, this is equivalent to NP = co-NP.) Since P is closed under complement, P ⊆ NP ∩ co-NP. Thus four possibilities:

[Figure: four Venn diagrams of P, NP, and co-NP]

TL: NP = co-NP and P = NP, most unlikely of the four
TR: NP = co-NP and P ≠ NP
BL: NP ≠ co-NP and P = NP ∩ co-NP
BR: NP ≠ co-NP and P ≠ NP ∩ co-NP, most likely
SLIDE 22
NP-completeness
Remember: the class of NP-complete problems has the property that if one of them is in P, then so is all of NP.

Reducibility. Intuition: problem Q can be reduced to Q′ if any instance of Q can be "easily rephrased" as an instance of Q′. We say that language L1 is poly-time reducible to language L2, written L1 ≤p L2, if there exists a poly-time computable function f : {0, 1}* → {0, 1}* s.t. ∀x ∈ {0, 1}*: x ∈ L1 if and only if f(x) ∈ L2. f is called the reduction function.
SLIDE 23
NP-completeness II
A language L ⊆ {0, 1}∗ is called NP-complete if
- 1. L ∈ NP, and
- 2. L′ ≤p L for every L′ ∈ NP.
If L satisfies only (2) it is called NP-hard.
SLIDE 24
Idea of a poly-time reduction from L1 to L2:
- L1, L2 ⊆ {0, 1}*
- f provides a poly-time mapping s.t.
  – if x ∈ L1 then f(x) ∈ L2
  – if x ∉ L1 then f(x) ∉ L2
- Thus f maps any instance x of the decision problem represented by L1 to an instance f(x) of the problem represented by L2
- The answer to whether f(x) ∈ L2 directly provides the answer to whether x ∈ L1
SLIDE 25
[Figure: f maps {0,1}* to {0,1}*, sending exactly the members of L1 into L2]
SLIDE 26
Proving that some language is in P is easy now:
- Lemma. If L1, L2 ⊆ {0, 1}∗ with L1 ≤p L2, then L2 ∈ P implies L1 ∈ P.
- Proof. Let M2 be a poly-time TM that decides L2, and let F be a poly-time TM that computes the reduction function f. Construction of a poly-time TM M1 that decides L1:
[Figure: M1 pipes input x through F to obtain f(x), then runs M2 on f(x); the answer "yes/no, f(x) ∈ L2" is output directly as "yes/no, x ∈ L1"]
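The construction is just function composition; as a sketch, with the TM components modelled by ordinary Python functions (all names and the toy languages are our own):

```python
def decider_from_reduction(f, decide_L2):
    """The lemma's construction: M1 runs F on x, then M2 on f(x)."""
    def decide_L1(x):
        return decide_L2(f(x))
    return decide_L1

# Toy instance: L1 = decimal strings of even numbers,
# L2 = binary strings ending in '0';
# f rewrites decimal into binary (clearly poly-time).
decide_L1 = decider_from_reduction(lambda x: bin(int(x))[2:],
                                   lambda s: s.endswith('0'))

print(decide_L1('6'))  # True:  6 -> '110', ends in '0'
print(decide_L1('7'))  # False: 7 -> '111'
```

Since f runs in poly-time, |f(x)| is polynomial in |x|, so the composed decider is poly-time whenever decide_L2 is.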
SLIDE 27
The set of NP-complete languages is called NPC. NP-completeness may be helpful for showing P = NP or P ≠ NP:
- Theorem. (i) If any NP-complete problem is in P, then P = NP. Equivalently, (ii) if any problem in NP is not in P, then no NP-complete problem is in P.
- Proof. (i) Suppose L ∈ P and L ∈ NPC. For any L′ ∈ NP, we have L′ ≤p L by the definition of NP-completeness. By the last lemma, L′ ∈ P, and thus NP = P. (ii) is the contrapositive of (i) [à la (A ⇒ B) ⇔ (¬B ⇒ ¬A)].
SLIDE 28
Most people believe P ≠ NP, thus:

[Figure: P and NPC as disjoint subsets of NP]
Anyways, how does one prove NP-completeness of some particular problem? Two ways. . .
- "By hand" (master reduction): show that L ∈ NP (easy part) and L′ ≤p L for every L′ ∈ NP (hard part). Most famous (and first): Cook's proof that SAT ∈ NPC.
- Or (usually significantly) easier, given that you already know some other problem to be NP-complete:
SLIDE 29
A helpful lemma:
- Lemma. (Transitivity) "≤p" is a transitive relation on languages, i.e., if L1 ≤p L2 and L2 ≤p L3, then L1 ≤p L3.
- Proof. By definition, there are poly-time computable functions f and g such that x ∈ L1 ⇔ f(x) ∈ L2 and y ∈ L2 ⇔ g(y) ∈ L3; thus x ∈ L1 ⇔ f(x) ∈ L2 ⇔ g(f(x)) ∈ L3. Obviously, g(f(·)) is poly-time computable (since |f(x)| is polynomial in |x|).
SLIDE 30
- Theorem. If L is a language s.t. L′ ≤p L for some L′ ∈ NPC, then L is NP-hard. Moreover, if L ∈ NP, then L ∈ NPC.
- Proof. L′ is NP-complete, thus we have L′′ ≤p L′ for all L′′ ∈ NP. With L′ ≤p L and by transitivity (last lemma) we have L′′ ≤p L, so L is NP-hard. If additionally L ∈ NP, we have L ∈ NPC by definition.

[Figure: every language in NP reduces to L′, which in turn reduces to L]
Method for proving that L ∈ NPC:
- prove L ∈ NP
- select known NP-complete L′
- describe an alg. F that computes f mapping every instance x ∈ {0, 1}* of L′ to an instance f(x) of L
- prove that f satisfies x ∈ L′ ⇔ f(x) ∈ L ∀x ∈ {0, 1}∗
- prove that F runs in poly time
SLIDE 31
Formula satisfiability
Was the first problem to be proven NP-complete; among the most popular problems for reductions. Definition in terms of the language SAT. An instance of SAT is a boolean formula F composed of
1. n boolean variables x1, x2, ..., xn
2. m boolean operators: any boolean function with one or two inputs and one output, such as ∧ (AND), ∨ (OR), ¬ (NOT), → (implication), ↔ (if and only if)
3. parentheses (WLOG assume no redundant parentheses)
It is easy to encode F in length polynomial in n + m. A truth assignment for F is a set of values for the variables. A satisfying truth assignment is a t.a. that causes F to evaluate to 1.
SLIDE 32
A formula with a satisfying t.a. is called satisfiable. SAT = {< F >: F is a satisfiable boolean formula}.
SLIDE 33
Example: F = ((x1 → x2) ∨ ¬((¬x1 ↔ x3) ∨ x4)) ∧ ¬x2 has the satisfying assignment x1 = 0, x2 = 0, x3 = 1, x4 = 1:
F = ((0 → 0) ∨ ¬((¬0 ↔ 1) ∨ 1)) ∧ ¬0
  = (1 ∨ ¬((1 ↔ 1) ∨ 1)) ∧ 1
  = (1 ∨ ¬(1 ∨ 1)) ∧ 1
  = (1 ∨ ¬1) ∧ 1
  = 1 ∧ 1 = 1
Note the truth tables:
a b | a → b        a b | a ↔ b
0 0 |   1          0 0 |   1
0 1 |   1          0 1 |   0
1 0 |   0          1 0 |   0
1 1 |   1          1 1 |   1
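The evaluation above can be replayed mechanically (the operator helpers and the function name F are our own):

```python
def implies(a, b):
    # a -> b is false only when a is true and b is false
    return (not a) or b

def iff(a, b):
    # a <-> b is true exactly when both sides agree
    return a == b

def F(x1, x2, x3, x4):
    # F = ((x1 -> x2) OR NOT((NOT x1 <-> x3) OR x4)) AND NOT x2
    return (implies(x1, x2) or not (iff(not x1, x3) or x4)) and not x2

print(F(False, False, True, True))  # True: the assignment from the slide
print(F(False, True, True, True))   # False: x2 = 1 falsifies the last AND
```

This is also exactly the poly-time verification step used below to show SAT ∈ NP: plug in the certificate (the assignment) and evaluate.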
SLIDE 34
Thus F ∈ SAT.
SLIDE 35
- Theorem. SAT is NP-complete.
- Proof. (i) SAT ∈ NP: easy, replace each variable with its corresponding value and evaluate; polynomial time. (ii) SAT is NP-hard, i.e., L ≤p SAT for all L ∈ NP: not quite so easy, proof not shown.
Now we could use SAT to show NP-completeness of other languages. First, consider CNF-SAT and 3-CNF-SAT. A literal is an occurrence of a variable (xi) or its negation (¬xi). A boolean formula is in conjunctive normal form (CNF) if it is an AND of clauses, each of which is an OR of literals. Ex (three clauses):
(x1 ∨ x2 ∨ ¬x3) ∧ (x2 ∨ x4) ∧ (x2 ∨ ¬x3 ∨ ¬x4 ∨ ¬x5)
A formula is in 3CNF if each clause contains exactly three distinct literals.
SLIDE 36
CNF-SAT = {< F >: F is in CNF and is satisfiable} 3-CNF-SAT = {< F >: F is in 3CNF and is satisfiable}
- Theorem. CNF-SAT is NP-complete.
- Proof. Obviously in NP (since SAT ∈ NP). NP-hardness can be shown e.g. by
reduction from SAT, i.e., SAT ≤p CNF-SAT. Proof not shown.
- Theorem. 3-CNF-SAT is NP-complete.
- Proof. Obviously in NP (since SAT ∈ NP and CNF-SAT ∈ NP). NP-hardness by reduction from CNF-SAT. Let F = F1 ∧ F2 ∧ ⋯ ∧ Fk be a CNF formula. Suppose clause Fi has more than three literals, e.g., Fi = α1 ∨ α2 ∨ ⋯ ∨ αℓ with ℓ > 3.
SLIDE 37
Note: the α's are literals rather than variables. We want to construct new clauses with exactly three literals each, but with the same satisfiability property.
SLIDE 38
Introduce new variables y1, y2, ..., yℓ−3 and replace Fi with
F′i = (α1 ∨ α2 ∨ y1) ∧ (α3 ∨ ¬y1 ∨ y2) ∧ (α4 ∨ ¬y2 ∨ y3) ∧ ⋯ ∧ (αℓ−2 ∨ ¬yℓ−4 ∨ yℓ−3) ∧ (αℓ−1 ∨ αℓ ∨ ¬yℓ−3)
Now, an assignment satisfies Fi if and only if it (rather, an extension of it) satisfies F′i.
"⇒" Suppose Fi is satisfied by some assignment. It must be that αj = 1 for some j. Suppose α1 = ⋯ = αj−1 = 0 and αj = 1. Then y1 = ⋯ = yj−2 = 1 and yj−1 = ⋯ = yℓ−3 = 0 is an extension of the original assignment that satisfies F′i (all new clauses in F′i are satisfied).
SLIDE 39
"⇐" Suppose some assignment satisfies F′i. We need to show that for Fi, ∃j with αj = 1 when restricting to the assignment for the α's. Suppose that's not true, i.e., αi = 0 for i = 1, ..., ℓ. Then, by construction, it must be that y1 = 1 (1st clause). Also, since y1 = 1 ⇔ ¬y1 = 0, it must be that y2 = 1 (2nd clause). By induction, it must be that ym = 1 ∀m ∈ {1, ..., ℓ − 3}. But wait, hang on, don't move! The last clause in F′i would be 0, <blink> contradiction! </blink> Conclusion: each assignment that satisfies F′i also satisfies Fi.
SLIDE 40
What about size-1 and size-2 clauses?
(α1) → (α1 ∨ y1 ∨ y2) ∧ (α1 ∨ y1 ∨ ¬y2) ∧ (α1 ∨ ¬y1 ∨ y2) ∧ (α1 ∨ ¬y1 ∨ ¬y2)
If the LHS is satisfied, then obviously also the RHS. Suppose the RHS is satisfied but α1 = 0. Then in each of the four RHS clauses the y's must take care of the business. But: impossible! (Whatever values y1, y2 take, one of the four clauses has both y-literals false.)
(α1 ∨ α2) → (α1 ∨ α2 ∨ y) ∧ (α1 ∨ α2 ∨ ¬y)
Similar reasoning. The size of the formula remains polynomial in what it was; the time is also polynomial. (q.e.d.)
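All the clause rewritings of this proof fit in one short routine (a sketch, not a full CNF converter: literals are ints, +i for x_i and -i for ¬x_i, and clauses are lists of literals; these conventions are our own):

```python
def to_three_literals(clauses):
    """Rewrite each clause to exactly three literals, following the
    construction in the proof; fresh y-variables get unused indices."""
    next_var = max((abs(l) for c in clauses for l in c), default=0) + 1
    out = []
    for c in clauses:
        if len(c) == 1:                       # (a) -> four clauses
            a, y1, y2 = c[0], next_var, next_var + 1
            next_var += 2
            out += [[a, y1, y2], [a, y1, -y2], [a, -y1, y2], [a, -y1, -y2]]
        elif len(c) == 2:                     # (a or b) -> two clauses
            y = next_var
            next_var += 1
            out += [c + [y], c + [-y]]
        elif len(c) == 3:
            out.append(list(c))
        else:                                 # chain with l-3 fresh vars
            ys = list(range(next_var, next_var + len(c) - 3))
            next_var += len(c) - 3
            new = [[c[0], c[1], ys[0]]]
            for i in range(1, len(c) - 3):
                new.append([c[i + 1], -ys[i - 1], ys[i]])
            new.append([c[-2], c[-1], -ys[-1]])
            out += new
    return out
```

For example, the input [[1], [1, 2], [1, 2, 3, 4, 5]] yields 4 + 2 + 3 = 9 clauses of exactly three literals each, equisatisfiable with the original formula.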
SLIDE 41
Another NP-complete problem: CLIQUE
A clique in an undirected graph G = (V, E) is a subset V′ ⊆ V, each pair of which is connected by an edge in E, i.e., a complete subgraph. A k-clique is a clique of size k. Optimisation problem: given G, find a clique of maximum size. Decision problem: given G and k ∈ ℕ, does G contain a k-clique?
CLIQUE = {<G, k> : G is a graph with a k-clique}
Naive solution: list all k-subsets of V, check each one. The running time is Ω(k² · C(|V|, k)) (time for checking × number of k-subsets, with C(·,·) the binomial coefficient). Polynomial if k is constant, but super-polynomial if e.g. k ≈ |V|/2.
SLIDE 42
- Theorem. CLIQUE is NP-complete.
- Proof. To show CLIQUE ∈ NP, use V′ ⊆ V as certificate; just check whether (u, v) ∈ E for all u, v ∈ V′.
Now we show 3-CNF-SAT ≤p CLIQUE. Given an instance F = C1 ∧ C2 ∧ ⋯ ∧ Ck in 3CNF, each clause Ci contains three distinct literals α^i_1, α^i_2, α^i_3. We shall construct G s.t. G contains a k-clique if and only if F is satisfiable. For each clause Cr = (α^r_1 ∨ α^r_2 ∨ α^r_3) we place a triple of vertices v^r_1, v^r_2, v^r_3 into V. There is an edge between v^r_i and v^s_j if
1. v^r_i and v^s_j are in different triples, i.e., r ≠ s, and
2. their corresponding literals are consistent: α^r_i is not the negation of α^s_j.
SLIDE 43
Ex: F = (x1 ∨ ¬x2 ∨ ¬x3) ∧ (¬x1 ∨ x2 ∨ x3) ∧ (x1 ∨ x2 ∨ x3) becomes

[Figure: three triples of vertices, labelled x1, ¬x2, ¬x3 (1st clause), ¬x1, x2, x3 (2nd clause), and x1, x2, x3 (3rd clause), with edges between all consistent literals in different triples]
Obviously, construction possible in polynomial time.
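The construction is easy to write down explicitly (a sketch with our own conventions: literals are ints, +i / -i for x_i / ¬x_i; a vertex is the pair (clause index, position)):

```python
from itertools import combinations

def clique_graph(clauses):
    """Build the CLIQUE instance from a 3-CNF formula, as in the proof:
    one vertex per literal occurrence, edges between consistent literals
    in different triples."""
    vertices = [(r, i) for r, clause in enumerate(clauses)
                for i in range(len(clause))]
    edges = set()
    for (r, i), (s, j) in combinations(vertices, 2):
        if r != s and clauses[r][i] != -clauses[s][j]:
            edges.add(((r, i), (s, j)))
    return vertices, edges

# The slide's example: (x1 v ~x2 v ~x3) ^ (~x1 v x2 v x3) ^ (x1 v x2 v x3)
V, E = clique_graph([[1, -2, -3], [-1, 2, 3], [1, 2, 3]])
```

Here the true literals of a satisfying assignment, e.g. x1 in clauses 1 and 3 and x2 in clause 2, pick out a 3-clique: the pairs are consistent, so all three edges exist.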
SLIDE 44
"⇒" Suppose F has a satisfying assignment. Then each clause Cr contains ≥ 1 true literal α^r_i = 1. Picking one such true literal from each clause corresponds to picking k vertices of G, yielding V′. Claim: V′ is a k-clique. For any v^r_i, v^s_j ∈ V′ with r ≠ s, both corresponding literals are true, so they cannot be complements of each other. Thus the edge (v^r_i, v^s_j) is in E.
"⇐" Suppose G has a k-clique V′. There are no inner-triple edges, thus exactly one vertex per triple is in V′. There are no edges between vertices corresponding to inconsistent literals. Thus, we can assign 1 to each literal α^r_i s.t. v^r_i ∈ V′. One true literal per clause, thus F is satisfied. (q.e.d.)
SLIDE 45
Yet another example: INDEPENDENT-SET
An independent set (IS) in an undirected graph G = (V, E) is a subset V′ ⊆ V, no pair of which is connected by an edge in E, i.e., an "edge-less" subgraph. A k-IS is an independent set of size k. Optimisation problem: given G, find an IS of maximum size. Decision problem: given G and k ∈ ℕ, does G contain a k-IS?
INDEPENDENT-SET = {<G, k> : G is a graph with a k-IS}
Naive solution: list all k-subsets, check each one. The running time is Ω(k² · C(|V|, k)) (time for checking × number of k-subsets). Polynomial if k is constant, but super-polynomial if e.g. k ≈ |V|/2.
SLIDE 46
Why superpolynomial? Let n = |V| and k = n/2. Then
k² · C(n, k) = (n²/4) · C(n, n/2) = (n²/4) · n! / ((n/2)! · (n − n/2)!) = (n²/4) · n! / ((n/2)!)²
By Stirling's famous formula x! ≈ √(2πx) · (x/e)^x,
k² · C(n, k) ≈ (n²/4) · √(2πn) · (n/e)^n / (2π(n/2) · (n/(2e))^n)
            = (n²/4) · √(2πn) · 2^n / (πn)
            = (n^(3/2) / (2√(2π))) · 2^n
            = Ω(2^n)
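The estimate can be sanity-checked numerically with a throwaway script (our own, for illustration only):

```python
from math import comb, sqrt, pi

# Compare the exact count k^2 * C(n, k) (with k = n/2) against the
# Stirling-based estimate n^(3/2) / (2*sqrt(2*pi)) * 2^n derived above.
for n in (10, 20, 40):
    k = n // 2
    exact = k * k * comb(n, k)
    estimate = n ** 1.5 / (2 * sqrt(2 * pi)) * 2 ** n
    print(n, round(exact / estimate, 4))  # the ratio approaches 1
```

Already at n = 40 the ratio is within about 1% of 1, confirming the Ω(2^n) growth.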
SLIDE 47
- Theorem. INDEPENDENT-SET is NP-complete.
- Proof. To show INDEPENDENT-SET ∈ NP, use V′ ⊆ V as certificate; just check whether (u, v) ∉ E for all u, v ∈ V′.
Now we show CLIQUE ≤p INDEPENDENT-SET. Given an instance (G, k), construct G′ = (V, E′) with
E′ = V × V − {(v, v) : v ∈ V} − E    (all non-edges, without self-loops)
Claim: G contains a k-clique if and only if G′ contains a k-IS. Trivially true, since
- a clique is a subset of vertices with an edge between any pair of vertices, and
- an IS is a subset of vertices without an edge between any pair of vertices.
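The reduction is one line of set arithmetic (a sketch; representing undirected edges as frozensets of two vertices is our own choice):

```python
from itertools import combinations

def complement_graph(vertices, edges):
    """The reduction's G': connect exactly the pairs NOT connected in G
    (no self-loops)."""
    all_pairs = {frozenset(p) for p in combinations(vertices, 2)}
    return all_pairs - set(edges)

# A triangle {1, 2, 3} plus an isolated vertex 4: the 3-clique in G
# becomes a 3-IS in G', whose only edges now touch vertex 4.
E = {frozenset((1, 2)), frozenset((1, 3)), frozenset((2, 3))}
print(complement_graph([1, 2, 3, 4], E))
```

Since the same vertex set V′ is a k-clique in G exactly when it is a k-IS in G′, correctness of the reduction is immediate, and building G′ clearly takes polynomial time.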