Computer Science CPSC 322
Lecture 10: Stochastic Local Search
(4.8)
Slide 1
Lecture Overview
– Recap
– Domain Splitting for Arc Consistency
– Local Search
– Stochastic Local Search (SLS)
– Comparing SLS
2
3
“Consistent before” means each element yi in Y must have an element xi in X that satisfies the constraint. Those xi would not be pruned from Dom(X), so arc 〈Y,c〉 stays consistent
The domains of Zi have not been touched
[Diagram: constraint network with variables Z1, Z2, Z3, Y, X and constraints c1..c4 (crossword example)]
4
Complexity of the arc consistency procedure (compare with DFS: O(d^n)):
– How many binary constraints can there be? n(n-1)/2
– How many times can each arc be inserted in the ToDoArcs list? O(d)
– How expensive is checking the consistency of an arc? O(d^2)
– Overall complexity: O(n^2 d^3)
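The counting above can be read off a direct implementation. Below is a minimal AC-3 sketch (names like `ac3` and `revise` are mine, not from the slides; constraints are given per directed arc as binary predicates): each arc check is O(d^2), and an arc re-enters the to-do list only when a neighbouring domain shrinks, which can happen at most O(d) times.

```python
from collections import deque

def revise(domains, constraints, x, y):
    """Remove values of x with no support in y: the O(d^2) arc check."""
    rel = constraints[(x, y)]
    pruned = {vx for vx in domains[x]
              if not any(rel(vx, vy) for vy in domains[y])}
    domains[x] -= pruned
    return bool(pruned)

def ac3(domains, constraints):
    """domains: {var: set}; constraints: {(X, Y): predicate(x_val, y_val)}."""
    todo = deque(constraints)                 # all arcs start on ToDoArcs
    while todo:
        x, y = todo.popleft()
        if revise(domains, constraints, x, y):
            # dom(X) shrank: re-check every arc <Z, c(Z,X)> with Z != Y
            todo.extend((z, w) for (z, w) in constraints
                        if w == x and z != y)
    return domains

# Example: A < B and B < C over {1, 2, 3} forces A=1, B=2, C=3
doms = {'A': {1, 2, 3}, 'B': {1, 2, 3}, 'C': {1, 2, 3}}
cons = {('A', 'B'): lambda a, b: a < b, ('B', 'A'): lambda b, a: a < b,
        ('B', 'C'): lambda b, c: b < c, ('C', 'B'): lambda c, b: b < c}
ac3(doms, cons)
```

Here arc consistency alone solves the example, which is the "each domain has a single value" outcome discussed next.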
5
Possible outcomes of arc consistency:
– Each domain has a single value: we have a (unique) solution. E.g., the built-in AIspace example "Scheduling problem 1".
– Some domain is empty: no solution! All values are ruled out for that variable. E.g., try a graph generated by modifying "Simple Problem 2".
– Some domains have multiple values: there may be one solution, multiple ones, or none. We need to solve this new (usually simpler) CSP: same constraints, but reduced domains.
6
7
For instance, a CSP with dom(X) = {x1, x2, x3, x4} becomes CSP1 with dom(X) = {x1, x2} and CSP2 with dom(X) = {x3, x4}.
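In code, a split just produces two sub-CSPs that differ in one domain only. A sketch (the name `split_csp` and the half-and-half split are my own choices):

```python
def split_csp(domains, var):
    """Split dom(var) in half, returning the domains of CSP1 and CSP2."""
    vals = sorted(domains[var])
    mid = len(vals) // 2
    return ({**domains, var: set(vals[:mid])},
            {**domains, var: set(vals[mid:])})

# dom(X) = {x1, x2, x3, x4}, encoded as strings:
csp = {'X': {'x1', 'x2', 'x3', 'x4'}}
csp1, csp2 = split_csp(csp, 'X')
```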
8
9
10
C. arcs < Zi, r(Z,X)> and <Y, r(X,Y)>
11
C. arcs < Zi, r(Z,X)> and <Y, r(X,Y)>
12
If some domains have multiple values: split on one.
13
14
3 variables A, B, C; domains all {1,2,3,4}; constraints A=B, B=C, A≠C
({1,2,3,4}, {1,2,3,4}, {1,2,3,4}) – AC (arc consistency) prunes nothing
  A ∈ {1,3}: ({1,3}, {1,2,3,4}, {1,2,3,4}) – AC: ({1,3}, {1,3}, {1,3})
    B ∈ {1}: ({1,3}, {1}, {1,3}) – AC: ({}, {}, {}) – no solution
    B ∈ {3}: ({1,3}, {3}, {1,3}) – AC: ({}, {}, {}) – no solution
  A ∈ {2,4}: ({2,4}, {1,2,3,4}, {1,2,3,4}) – AC: ({2,4}, {2,4}, {2,4})
    B ∈ {2}: ({2,4}, {2}, {2,4}) – AC: ({}, {}, {}) – no solution
    B ∈ {4}: ({2,4}, {4}, {2,4}) – AC: ({}, {}, {}) – no solution
15
3 variables A, B, C; domains all {1,2,3,4}; constraints A=B, B=C, A=C
({1,2,3,4}, {1,2,3,4}, {1,2,3,4}) – AC (arc consistency) prunes nothing
  A ∈ {1,3}: ({1,3}, {1,2,3,4}, {1,2,3,4}) – AC: ({1,3}, {1,3}, {1,3})
    B ∈ {1}: ({1,3}, {1}, {1,3}) – AC: ({1}, {1}, {1}) – solution
    B ∈ {3}: ({1,3}, {3}, {1,3}) – AC: ({3}, {3}, {3}) – solution
  A ∈ {2,4}: ({2,4}, {1,2,3,4}, {1,2,3,4}) – AC: ({2,4}, {2,4}, {2,4})
    B ∈ {2}: ({2,4}, {2}, {2,4}) – AC: ({2}, {2}, {2}) – solution
    B ∈ {4}: ({2,4}, {4}, {2,4}) – AC: ({4}, {4}, {4}) – solution
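The whole tree can be reproduced by a small recursive solver: make the CSP arc consistent, stop on an empty or all-singleton set of domains, otherwise split one domain and recurse. This is a sketch with my own names; it uses a naive fixpoint pruning loop instead of AC-3's work list, and splits sorted halves ({1,2}/{3,4} rather than the slide's {1,3}/{2,4}), but the outcomes are the same.

```python
def make_arc_consistent(domains, constraints):
    """Naive fixpoint pruning (same result as AC-3, less book-keeping)."""
    changed = True
    while changed:
        changed = False
        for (x, y), rel in constraints.items():
            supported = {vx for vx in domains[x]
                         if any(rel(vx, vy) for vy in domains[y])}
            if supported != domains[x]:
                domains[x] = supported
                changed = True
    return domains

def solve(domains, constraints):
    """DFS by domain splitting, simplifying with arc consistency first."""
    domains = make_arc_consistent(dict(domains), constraints)
    if any(not d for d in domains.values()):
        return []                                   # empty domain: dead end
    if all(len(d) == 1 for d in domains.values()):
        return [{v: min(d) for v, d in domains.items()}]
    var = next(v for v in domains if len(domains[v]) > 1)
    vals = sorted(domains[var])
    mid = len(vals) // 2
    sols = []
    for half in (set(vals[:mid]), set(vals[mid:])):  # the two sub-CSPs
        sols += solve({**domains, var: half}, constraints)
    return sols

# The example with A=B, B=C, A=C: four solutions, A=B=C=k for k in 1..4
eq = lambda u, v: u == v
ne = lambda u, v: u != v
doms = {'A': {1, 2, 3, 4}, 'B': {1, 2, 3, 4}, 'C': {1, 2, 3, 4}}
cons_eq = {arc: eq for arc in [('A', 'B'), ('B', 'A'), ('B', 'C'),
                               ('C', 'B'), ('A', 'C'), ('C', 'A')]}
sols = solve(doms, cons_eq)
# With A≠C instead, every branch prunes to empty domains: no solution
cons_ne = {**cons_eq, ('A', 'C'): ne, ('C', 'A'): ne}
no_sols = solve(doms, cons_ne)
```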
16
If some domains have multiple values: split on one.
How many CSPs do we need to keep around at a time? Assume the solution is at depth m and each split produces b children.
17
If some domains have multiple values: split on one.
How many CSPs do we need to keep around at a time? Assume the solution is at depth m and each split produces b children: O(bm), since this is DFS (we only store the current path and the siblings along it).
18
19
20
Systematic search vs. arc consistency. Branching factor b = 10^4, solution depth d = 10^5.
– Time complexity of search = O((10^4)^(10^5))
– Constraint network size = O(10^5 + 10^5 × 10^5)
– Time complexity of AC = O((10^5)^2 × (10^4)^3)
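The arithmetic behind those numbers, assuming n = 10^5 variables (the solution depth) each with domain size d = 10^4 (the branching factor):

```python
from math import log10

n = 10 ** 5      # number of variables (solution depth)
d = 10 ** 4      # domain size (branching factor)

# Search explores up to d^n assignments: a number with about 400,001 digits.
log_search = n * log10(d)

# The constraint network has n variable nodes and up to ~n^2 binary constraints.
network_size = n + n * n

# Arc consistency is O(n^2 * d^3): 10^22 elementary steps. Huge, but finite
# and incomparably smaller than d^n.
ac_time = n ** 2 * d ** 3
```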
21
Learning Goals:
– Verify whether a possible world satisfies a set of constraints (i.e., whether it is a model, a solution).
– Explain the generate-and-test algorithm and its disadvantages.
– Formulate a CSP as search (define its start state, successor function, and goal state). Compare strategies for CSP search. Implement pruning for DFS search in a CSP.
– Define and trace arc consistency, compute its complexity and assess its possible outcomes.
– Define domain splitting and its integration with arc consistency.
22
23
Local search does not explore the search space systematically: it is not guaranteed to find a solution even if one exists (and thus cannot prove that there is no solution).
24
Local search idea: search the space of complete assignments (all possible worlds), with a scoring function that says how good each assignment is.
– Best available method for many constraint satisfaction and constraint optimization problems.
25
26
The neighbours of an assignment A are the nodes whose assignments differ from A in one value only.
27
Current assignment A: (V1 = v1, V2 = v1, .., Vn = v1)
Some neighbours of A (one value changed):
  (V1 = v2, V2 = v1, .., Vn = v1)
  (V1 = v4, V2 = v1, .., Vn = v1)
  (V1 = v1, V2 = vn, .., Vn = v1)
Some neighbours of (V1 = v4, V2 = v1, .., Vn = v1):
  (V1 = v4, V2 = v2, .., Vn = v1)
  (V1 = v4, V2 = v3, .., Vn = v1)
  (V1 = v4, V2 = v1, .., Vn = v2)
– a different kind of search space from the ones we have seen so far!
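Enumerating the neighbours in code, a sketch (variable and value names are illustrative):

```python
def neighbours(assignment, domains):
    """All assignments that differ from `assignment` in exactly one value."""
    for var, current in assignment.items():
        for other in domains[var]:
            if other != current:
                yield {**assignment, var: other}

# Two variables with four values each: every variable contributes
# |dom| - 1 neighbours, so here 3 + 3 = 6 in total.
domains = {'V1': {'v1', 'v2', 'v3', 'v4'}, 'V2': {'v1', 'v2', 'v3', 'v4'}}
A = {'V1': 'v1', 'V2': 'v1'}
ns = list(neighbours(A, domains))
```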
29
30
Greedy descent uses the evaluation function h(n): the number of constraint violations in state n. At each step, move to the neighbour with minimal h(n), i.e., minimize the number of unsatisfied constraints.
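With constraints represented as predicates over a complete assignment, h is one line. A sketch (the representation is my own choice, not from the slides):

```python
def h(assignment, constraints):
    """Number of violated constraints in `assignment` (0 means solved)."""
    return sum(1 for ok in constraints if not ok(assignment))

# Example constraints: X != Y and Y < Z
constraints = [lambda a: a['X'] != a['Y'], lambda a: a['Y'] < a['Z']]
```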
31
[Plot: evaluation/scoring function (number of satisfied constraints) over the current state/possible world, with axes X1 and X2; assume both variables have integer domains]
32
33
One variable per column; the value of Qi is the row in which the queen of the ith column sits.
34
For each column, randomly assign its queen to a row (a number between 1 and N).
Repeat:
– For each queen (column) and each row, count how many constraint violations changing the assignment to that row would yield.
– Pick a move that minimizes the number of unsatisfied constraints; make the change.
Until solved.
Each cell lists h (i.e., the number of unsatisfied constraints) that would result from moving the queen in that column into that cell.
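The loop above is essentially the min-conflicts heuristic. A sketch (all names are mine; this variant repairs a randomly chosen conflicted column rather than scanning every column/row pair, a common simplification):

```python
import random

def conflicts(rows, col, row):
    """Violations if the queen of `col` sat in `row` (same row or diagonal)."""
    return sum(1 for c, r in enumerate(rows)
               if c != col and (r == row or abs(r - row) == abs(c - col)))

def min_conflicts_queens(n, max_steps=10_000, seed=0):
    rng = random.Random(seed)
    rows = [rng.randrange(n) for _ in range(n)]        # random row per column
    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(rows, c, rows[c])]
        if not conflicted:
            return rows                                # solved
        col = rng.choice(conflicted)
        # move that queen to a row minimizing violations, ties at random
        scores = [conflicts(rows, col, r) for r in range(n)]
        best = min(scores)
        rows[col] = rng.choice([r for r in range(n) if scores[r] == best])
    return None                                        # give up
```

This local-search approach handles N-queens instances far larger than systematic search could.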
35
36
37
Local-Search pseudocode (lines 9–11: random initialization; lines 13–15: local search step):
1: Procedure Local-Search(V, dom, C)
2: Inputs
3:   V: a set of variables
4:   dom: a function such that dom(X) is the domain of variable X
5:   C: set of constraints to be satisfied
6: Output: complete assignment that satisfies the constraints
7: Local
8:   A[V]: an array of values indexed by V
9: repeat
10:   for each variable X do
11:     A[X] ← a random value in dom(X)
12:
13:   while (stopping criterion not met & A is not a satisfying assignment)
14:     Select a variable Y and a value V ∈ dom(Y)
15:     Set A[Y] ← V
16:
17:   if (A is a satisfying assignment) then
18:     return A
19:
20: until termination
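A direct Python transcription of the pseudocode, as a sketch (the stopping criterion is a simple step budget here, and `select` is a plug-in so the same skeleton covers random sampling, random walk, and greedy descent):

```python
import random

def local_search(variables, dom, satisfies, select,
                 max_restarts=100, max_steps=1000, seed=0):
    rng = random.Random(seed)
    for _ in range(max_restarts):                       # repeat ... until
        # lines 10-11: random initialization
        A = {X: rng.choice(sorted(dom[X])) for X in variables}
        steps = 0
        # line 13: stopping criterion (a step budget) or satisfied
        while steps < max_steps and not satisfies(A):
            Y, v = select(A, dom, rng)                  # lines 14-15
            A[Y] = v
            steps += 1
        if satisfies(A):                                # lines 17-18
            return A
    return None

# Toy CSP: X != Y over {0, 1}, with a purely random local step
def random_step(A, dom, rng):
    Y = rng.choice(sorted(A))
    return Y, rng.choice(sorted(dom[Y]))

sol = local_search(['X', 'Y'], {'X': {0, 1}, 'Y': {0, 1}},
                   lambda A: A['X'] != A['Y'], random_step)
```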
38
Random initialization: lines 9–11 of the Local-Search pseudocode above.
39
Local search step: based on local information. E.g., for each neighbour, evaluate how many constraints are unsatisfied. Greedy descent: select Y and V to minimize the number of unsatisfied constraints at each step.
40
41
Greedy descent cannot improve on this state (h ≥ 2): it is stuck in a local minimum that is not a solution.
42
44
45
Mix greedy descent with random moves – i.e., move to a neighbour with some randomness.
46
Extreme case 1: random sampling. Restart at every step: the stopping criterion is always "true", so each iteration is just a fresh random restart.
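Restarting at every step reduces the procedure to independent random guesses. A standalone sketch (names are mine):

```python
import random

def random_sampling(variables, dom, satisfies, max_tries=10_000, seed=0):
    """Extreme case 1: each iteration is a fresh random total assignment."""
    rng = random.Random(seed)
    for _ in range(max_tries):
        A = {X: rng.choice(sorted(dom[X])) for X in variables}
        if satisfies(A):
            return A
    return None

# Toy CSP: X > Y over {0, 1, 2} (a third of random guesses satisfy it)
sol = random_sampling(['X', 'Y'], {'X': {0, 1, 2}, 'Y': {0, 1, 2}},
                      lambda A: A['X'] > A['Y'])
```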
47
Extreme case 2: greedy descent. Select the neighbour with the best h value (choosing at random among neighbours with the same h).
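Greedy descent's selection rule in code, as a sketch (h counts unsatisfied constraints, and ties are broken at random as the slide says; names are mine):

```python
import random

def greedy_step(A, domains, h, rng):
    """Pick the (variable, value) change with the lowest h, ties at random."""
    best_h, best_moves = None, []
    for Y in sorted(A):
        for v in sorted(domains[Y]):
            if v == A[Y]:
                continue
            new_h = h({**A, Y: v})
            if best_h is None or new_h < best_h:
                best_h, best_moves = new_h, [(Y, v)]
            elif new_h == best_h:
                best_moves.append((Y, v))
    return rng.choice(best_moves), best_h

# Toy CSP: X != Y over {1, 2}, starting from the conflicting assignment
constraints = [lambda a: a['X'] != a['Y']]
h = lambda a: sum(1 for ok in constraints if not ok(a))
move, new_h = greedy_step({'X': 1, 'Y': 1}, {'X': {1, 2}, 'Y': {1, 2}},
                          h, random.Random(0))
```

Either single change resolves the conflict, so the step reaches h = 0.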
48
Random restart
49