
SLIDE 1

Soft Constraint Processing

16.412J/6.834J Cognitive Robotics Martin Sachenbacher (Using material from Thomas Schiex)

Example: Bioinformatics

RNA is a single-strand molecule composed of the bases A, U, G, C.
The function of RNA depends on its structure (3-D folding).
Structure is induced by base pairing: Watson-Crick (A-U, G-C) and Wobble (G-U).
Problem: find the RNA structure that maximizes base pairings.
Cumbersome to frame as an Optimal CSP!
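The base-pairing objective can nonetheless be solved directly by dynamic programming. A minimal Nussinov-style sketch (the function name and the absence of a minimum loop length are illustrative assumptions, not the lecture's formulation):

```python
# Watson-Crick pairs (A-U, G-C) plus the Wobble pair (G-U), both directions.
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
         ("G", "U"), ("U", "G")}

def max_pairings(seq):
    """Maximum number of non-crossing (nested) base pairs in seq."""
    n = len(seq)
    if n == 0:
        return 0
    best = [[0] * n for _ in range(n)]  # best[i][j]: optimum on seq[i..j]
    for span in range(1, n):
        for i in range(n - span):
            j = i + span
            best[i][j] = best[i][j - 1]          # case 1: j stays unpaired
            for k in range(i, j):                # case 2: j pairs with k
                if (seq[k], seq[j]) in PAIRS:
                    left = best[i][k - 1] if k > i else 0
                    inner = best[k + 1][j - 1] if k + 1 < j else 0
                    best[i][j] = max(best[i][j], left + 1 + inner)
    return best[0][n - 1]
```

The O(n^3) recursion considers, for each subsequence, whether its last base is unpaired or paired with some earlier base.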

Overview

– Soft Constraints Framework
– Algorithms: Search (Branch-and-Bound)
– Algorithms: Inference (Dynamic Programming)
– Applications: Frequency Assignment Problems

From Optimal CSP to Soft CSP

Optimal CSP: minimize a utility function f(y), subject to the constraints C(x) being satisfiable.

From Optimal CSP to Soft CSP

Soft CSP: extend the notion of constraints to include preferences (soft constraints).

Notation

An n-tuple is a sequence of objects ⟨a1, …, an⟩; the i-th component of a tuple t is denoted t[i].
The projection of a tuple onto a subset of its components is denoted t↓S.
The cartesian product of sets S1, …, Sn, denoted S1 × … × Sn, is the set of all n-tuples ⟨a1, …, an⟩ such that ai ∈ Si.

SLIDE 2

Classical CSP

A constraint network (X, D, C):
– X: set of variables
– D: set of domains
– C: set of constraints
A constraint c is a relation on the variables var(c), with arity |var(c)|. A complete assignment t is allowed if t↓var(c) ∈ c for every constraint c.

Valued CSP

For each constraint/tuple: a valuation that reflects preference (e.g. cost, weight, priority, probability, …).
The valuation of an assignment is the combination of the valuations expressed by each constraint, using a binary operator ⊕ (with special axioms).
Assignments can be compared using a total order on valuations.
The problem is to produce an assignment of minimum valuation.

Formally: Valuation Structure

  • E: set of valuations, used to assess assignments
  • ⊥: minimum element of E, corresponds to totally consistent assignments
  • ⊤: maximum element of E, corresponds to totally inconsistent assignments
  • ≺: total order on E, used to compare two valuations
  • ⊕: operator used to combine two valuations

Valued CSP

A constraint network (X, D, C, S):
– X: set of variables; D: set of domains; C: set of constraints; S: valuation structure
A constraint c is a function mapping tuples over var(c) to valuations. The valuation of a complete assignment t is ⊕c∈C c(t↓var(c)).

Required Properties

  • Commutativity
  • Associativity
  • Monotonicity
  • Neutral element (⊥)
  • Annihilator (⊤)
Exercise: justify these properties.

Instances of the Framework

Fuzzy, probabilistic, weighted, classical; many others in the literature.
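These instances share one interface: each is a choice of (⊕, ⊥, ⊤) over its own set E. A hedged sketch of that idea; the Python encoding below is an illustrative assumption, not the lecture's notation:

```python
import math

# Each structure is (⊕ combine, ⊥ neutral, ⊤ annihilator).
# Encodings are illustrative; e.g. fuzzy values are read as violation
# degrees so that ⊥ = 0 means fully consistent.
CLASSICAL     = (lambda a, b: a and b, True, False)   # E = {t, f}, ⊕ = and
WEIGHTED      = (lambda a, b: a + b,   0, math.inf)   # E = N ∪ {∞}, ⊕ = +
PROBABILISTIC = (lambda a, b: a * b,   1.0, 0.0)      # E = [0, 1], ⊕ = ×
FUZZY         = (max,                  0.0, 1.0)      # E = [0, 1], ⊕ = max

def valuation(costs, structure):
    """Combine the per-constraint valuations with ⊕, starting from ⊥."""
    combine, bottom, _top = structure
    total = bottom
    for c in costs:
        total = combine(total, c)
    return total
```

Swapping the structure changes the semantics of the same combination loop, which is the point of the framework.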

SLIDE 3

From Valued CSP to Optimal CSP

Introduce a decision variable for each constraint; its values correspond to the different valuations.
A (multiattribute) utility function maps values to valuations, and the constraints become relations.

Overview

– Soft Constraints Framework
– Algorithms: Search (Branch-and-Bound)
– Algorithms: Inference (Dynamic Programming)
– Applications: Frequency Assignment Problems

Branch-and-Bound Search

Lower bound (lb): optimistic estimate of the best solution in the subtree.
Upper bound (ub): best solution found so far.
Each search node is a soft-constraint subproblem.
Prune if lb ≥ ub.

Branch-and-Bound Algorithm

Function

( : assignment, : value): value if then if then return let be an unassigned variable for each do return return Time: O(exp(n)) Space: O(n)
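A runnable sketch of this scheme for a weighted CSP (⊕ = +). The data structures (constraints as scope/cost-function pairs) and the distance-style lower bound over fully assigned constraints are illustrative assumptions:

```python
def branch_and_bound(variables, domains, constraints):
    """Minimize the summed cost of a weighted CSP by depth-first BnB.

    constraints: list of (scope, cost_fn), scope a tuple of variables,
    cost_fn taking a tuple of values in scope order.
    """
    best = [float("inf")]  # ub: value of the best solution found so far

    def cost_of_assigned(assignment):
        # lb: sum the costs of constraints whose scope is fully assigned.
        total = 0
        for scope, fn in constraints:
            if all(v in assignment for v in scope):
                total += fn(tuple(assignment[v] for v in scope))
        return total

    def search(assignment, remaining):
        lb = cost_of_assigned(assignment)
        if lb >= best[0]:              # prune: lb >= ub
            return
        if not remaining:
            best[0] = lb               # complete assignment: new incumbent
            return
        x, rest = remaining[0], remaining[1:]
        for val in domains[x]:
            assignment[x] = val
            search(assignment, rest)
            del assignment[x]

    search({}, list(variables))
    return best[0]
```

The time is exponential in n while the space stays linear, matching the slide's complexity claim.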

SLIDE 4

Lower Bound Procedure

Must be:
– Strong: the closer to the real value, the better.
– Efficient: as easy to compute as possible.
This creates a trade-off; the choice is often a matter of compromise and experimental evaluation.

Distance Lower Bound

At each node, let Cpast be the set of constraints all of whose variables have been assigned. Use the bound lb = ⊕c∈Cpast c(t).
Problem: often weak, as it takes into account only the past variables.

Improvement: Russian Doll Search

Idea: add the value of the optimal solution of the subproblem over the future variables to the distance lower bound, to obtain a stronger lower bound.
This requires solving the subproblem over the future variables beforehand.
Yields a recursive procedure that solves increasingly large subproblems.
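A minimal sketch of this recursion, assuming a static variable order and non-negative additive costs; the data layout mirrors the branch-and-bound sketch above and is an illustrative assumption:

```python
def russian_doll(order, domains, constraints):
    """Russian Doll Search: solve subproblems over order[i:] for i = n-1..0."""
    n = len(order)
    opt = [0] * (n + 1)  # opt[i]: optimum of the subproblem over order[i:]

    def assigned_cost(assignment, allowed):
        # Cost of constraints fully assigned and entirely within `allowed`.
        total = 0
        for scope, fn in constraints:
            if set(scope) <= allowed and all(v in assignment for v in scope):
                total += fn(tuple(assignment[v] for v in scope))
        return total

    def search(j, assignment, ub, allowed):
        # RDS lower bound: assigned cost plus the precomputed optimum of
        # the subproblem over the still-unassigned variables order[j:].
        lb = assigned_cost(assignment, allowed) + opt[j]
        if lb >= ub:
            return ub
        if j == n:
            return lb               # complete: lb is the exact value
        x = order[j]
        for val in domains[x]:
            assignment[x] = val
            ub = search(j + 1, assignment, ub, allowed)
            del assignment[x]
        return ub

    for i in range(n - 1, -1, -1):  # increasingly large subproblems
        opt[i] = search(i, {}, float("inf"), set(order[i:]))
    return opt[0]
```

Each subproblem's optimum feeds the lower bound of the next, larger one, which is where the speedup comes from.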


Russian Doll Search

[Lemaitre, Verfaillie, Schiex 96]: experiments with Earth Observation Satellite scheduling problems (a maximization problem).
Example: 105 variables, 403 constraints.
– Branch-and-bound with distance lower bound: aborted after 30 min, best solution so far = 8095.
– Russian Doll Search: optimal solution = 9096 found in 2.5 sec.

Overview

– Soft Constraints Framework
– Algorithms: Search (Branch-and-Bound)
– Algorithms: Inference (Dynamic Programming)
– Applications: Frequency Assignment Problems

Inference

Inference produces new constraints that are implied by the problem.
It makes the problem more explicit and easier to solve.
Operations on constraints: combination and projection.
(VCSP → VCSP′: equivalent, simpler to solve.)

SLIDE 5

Combination

c1 ⊗ c2 is a constraint on var(c1) ∪ var(c2) s.t. (c1 ⊗ c2)(t) = c1(t↓var(c1)) ⊕ c2(t↓var(c2)).

Projection

c ⇓ V is a constraint on V ⊆ var(c) s.t. (c ⇓ V)(t) = min over all tuples t′ with t′↓V = t of c(t′).
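A hedged sketch of the two operations for the weighted case (⊕ = +, projection by min), representing a constraint as a (scope, cost-table) pair; this table representation is an illustrative assumption:

```python
from itertools import product

def combine(c1, c2, domains):
    """c1 ⊗ c2: constraint on the union of scopes; costs add (⊕ = +)."""
    (s1, f1), (s2, f2) = c1, c2
    scope = tuple(dict.fromkeys(s1 + s2))   # scope union, order-preserving
    table = {}
    for t in product(*(domains[v] for v in scope)):
        a = dict(zip(scope, t))
        table[t] = f1[tuple(a[v] for v in s1)] + f2[tuple(a[v] for v in s2)]
    return scope, table

def project(c, keep):
    """c ⇓ keep: minimize over the eliminated variables."""
    scope, table = c
    idx = [scope.index(v) for v in keep]
    out = {}
    for t, cost in table.items():
        key = tuple(t[i] for i in idx)
        out[key] = min(out.get(key, float("inf")), cost)
    return tuple(keep), out
```

Projecting onto the empty scope (keep = []) yields the zero-arity constraint whose single value is the optimum, as used below.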

Inferring Solutions

For a constraint network, the value of the optimal solution is obtained by combining all constraints and eliminating (projecting out) all variables: (⊗c∈C c) ⇓ ∅.
Problem: very costly. Time O(exp(n)), Space O(exp(n)).

Improvement: Bucket Elimination

Idea: eliminate a variable as soon as it no longer occurs in the remaining set of constraints.
For a variable x, let Kx = {c ∈ C : x ∈ var(c)}.
Compute the combination f = ⊗c∈Kx c of all constraints in Kx.
Now eliminate x from f: g = f ⇓ (var(f) \ {x}).
Remove Kx from C and add g to C.
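The elimination loop above can be sketched as follows for weighted constraints; the (scope, cost-table) representation is an illustrative assumption:

```python
from itertools import product

def eliminate(order, domains, constraints):
    """Bucket elimination: return the optimal summed cost."""
    pool = list(constraints)        # each constraint: (scope, {tuple: cost})
    for x in order:
        bucket = [c for c in pool if x in c[0]]     # Kx: constraints on x
        pool = [c for c in pool if x not in c[0]]
        if not bucket:
            continue
        # Combine the bucket into one table over its joint scope.
        scope = tuple(dict.fromkeys(v for s, _ in bucket for v in s))
        joint = {}
        for t in product(*(domains[v] for v in scope)):
            a = dict(zip(scope, t))
            joint[t] = sum(f[tuple(a[v] for v in s)] for s, f in bucket)
        # Project x out: keep the minimum over x's values.
        keep = tuple(v for v in scope if v != x)
        i = scope.index(x)
        msg = {}
        for t, cost in joint.items():
            key = tuple(v for j, v in enumerate(t) if j != i)
            msg[key] = min(msg.get(key, float("inf")), cost)
        pool.append((keep, msg))
    # All variables eliminated: only zero-arity constraints remain.
    return sum(f[()] for _, f in pool)
```

Only the constraints in the current bucket are ever joined, which is why the cost is exponential in the width rather than in n.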

Induced Graph

When processing a node (variable), connect all neighboring nodes not yet processed. (Figure: graph and induced graph.)

SLIDE 6

Bucket Elimination: Complexity Bucket Elimination: Complexity

Let width be the maximum number of successors in the induced graph. Then:

Time dominated by computation of largest :

O(exp(width+1))

Space dominated by storage of largest :

O(exp(width))

Impact of Variable Ordering

(Example orderings with widths 4, 2, and 2.) Finding an ordering with minimal width is NP-hard.

Min-Fill Ordering Heuristic

Function MinFill(G: graph with edges E and nodes V): order
  for i = 1 to n do
    let v be a node in G with the minimal number of edges required to connect its neighbors
    put v in position i of the order

Often finds good orderings in practice.
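A runnable sketch of this loop on an adjacency-set graph; the graph encoding and the implicit tie-breaking (first minimal node in insertion order) are illustrative assumptions:

```python
def min_fill_order(graph):
    """Min-fill elimination order for a graph given as {node: set(neighbors)}."""
    g = {v: set(ns) for v, ns in graph.items()}  # work on a copy
    order = []
    while g:
        def fill(v):
            # Number of fill edges needed to make v's neighbors a clique.
            ns = list(g[v])
            return sum(1 for i in range(len(ns))
                       for j in range(i + 1, len(ns))
                       if ns[j] not in g[ns[i]])
        v = min(g, key=fill)
        order.append(v)
        # Connect v's neighbors, then remove v from the graph.
        for u in g[v]:
            for w in g[v]:
                if u != w:
                    g[u].add(w)
        for u in g[v]:
            g[u].discard(v)
        del g[v]
    return order
```

On a path graph the heuristic peels off endpoints first, recovering a width-1 ordering.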

Example

(Full adder circuit and its hypergraph.)

Example

(Min-fill order and the computational scheme, a “bucket tree”.)

Tree Decomposition

A tree decomposition for a problem is a triple ⟨T, χ, λ⟩, where T is a rooted tree and χ, λ are labeling functions associating with each node vi two sets χ(vi) (variables) and λ(vi) (constraints) such that:
– For each constraint c, there is exactly one vi such that c ∈ λ(vi). For this vi, var(c) ⊆ χ(vi) (covering condition);
– For each variable x, the set of vertices vi labeled with x (x ∈ χ(vi)) induces a connected subtree of T (connectedness condition).
SLIDE 7

Example

Time: O(exp(maxi |χi|)), Space: O(exp(maxi,j |χi-χj|)). (Figure: tree decomposition with labels λ, χ.)

Overview

– Soft Constraints Framework
– Algorithms: Search (Branch-and-Bound)
– Algorithms: Inference (Dynamic Programming)
– Applications: Frequency Assignment Problems

Frequency Assignment

(Animation: Site A, Site B, and the link between them.)

SLIDE 8


Frequency Assignment

Several instances are available from CELAR (200 to 916 variables, 1200 to 5000 constraints, domain size > 30).
These are very hard instances of valued CSPs. Good results reported for dynamic programming.

RNA Structure Example

(Bucket tree.)

Tree Decomposition Example

(Hypergraph and tree decomposition.)

Dynamic Programming

Inference on the tree: dynamic programming. (Figure: tree decomposition.)

SLIDE 9

Example

AND-gates broken with 1% probability; OR- and XOR-gates broken with 5% probability. Probabilistic valuation structure.

Russian Doll Search

– Static variable ordering
– Solves increasingly large subproblems
– Uses previous lower bounds recursively
– May speed up search by several orders of magnitude

Soft Constraint Framework

A network (X, D, C):
– X = {x1, …, xn} variables
– D = {D1, …, Dn} finite domains
– C = {c1, …, ce} cost functions, each with scope var(ci) and ci(t) → E (an ordered cost domain with ⊤ and ⊥)

  • Objective function: F(X) = ⊕ ci(X)
  • Solution: F(t) ≠ ⊤
  • Soft CN: find a minimal-cost solution

⊕ is commutative, associative, and monotonic; ⊥ is its identity and ⊤ its annihilator.

Valued CSPs and Semiring CSPs

Valued CSP (total order); Semiring-based CSP (partial order).

Specific Frameworks

– E = {t, f}, ⊕ = and: Classical CSP
– E = N ∪ {∞}, ⊕ = +: Weighted CSP
– E = [0, 1], ⊕ = ×: Probabilistic CSP
– Many others: lexicographic CSP, fuzzy CSP, …

SLIDE 10

From VCSPs to OCSPs

– Introduce a decision variable for each constraint
– Introduce a domain value for each different value of a tuple’s constraint

Basic Operations on Constraints

– Assignment (Conditioning)
– Combination (Join)
– Projection (Elimination)

Assignment (Conditioning)

(Example tables: Assign(c, xi, b) restricts the binary cost function c(xi, xj) to xi = b, yielding a unary function f(xj); Assign(f, xj, g) then yields the zero-arity constraint h∅.)

Combination (Join)

(Example tables: joining c(xi, xj) and c(xj, xk) yields f(xi, xj, xk), with tuple values combined by ⊕, e.g. 0 ⊕ 6 = 6.)

Projection (Elimination)

(Example tables: Elim(c, xj) minimizes c(xi, xj) over xj, yielding a unary function f(xi); Elim(f, xi) yields the zero-arity constraint h∅ with value 2.)

Solutions

F(X) = (⊕ ci(X)) ⇓ ∅

SLIDE 11

Weighted CSP Example

Variables x1, …, x5. F(X): number of non-blue vertices.
– For each vertex, a unary cost c(xi): r → 1, g → 1, b → 0.
– For each edge, a binary cost c(xi, xj): ⊤ if both endpoints have the same color, 0 otherwise.
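A brute-force check of this kind of objective. Since the slide's graph figure is not reproduced here, the edge set below is an assumed path over x1…x5, not the lecture's graph:

```python
from itertools import product

COLORS = ["r", "g", "b"]
VARS = ["x1", "x2", "x3", "x4", "x5"]
EDGES = [("x1", "x2"), ("x2", "x3"), ("x3", "x4"), ("x4", "x5")]  # assumed

def cost(assignment):
    """F(X): count of non-blue vertices; ⊤ (= ∞) on any monochrome edge."""
    if any(assignment[u] == assignment[v] for u, v in EDGES):
        return float("inf")                       # hard constraint violated
    return sum(1 for v in VARS if assignment[v] != "b")

best = min(cost(dict(zip(VARS, t)))
           for t in product(COLORS, repeat=len(VARS)))
```

On a 5-vertex path, adjacent vertices cannot both be blue, so at most three vertices are blue and the optimum is 2 non-blue vertices.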

Probabilistic CSP Example

AND-gates broken with 1% probability; OR- and XOR-gates broken with 5% probability. Probabilistic valuation structure.

Application: Bioinformatics

Multiple sequence alignment (DNA) Given k homologous sequences…

– AATAATGTTATTGGTGGATCGATGA – ATGTTGTTCGCGAAGGATCGATAA

… find the best alignment (sum)

– AATAATGTTATTGGTG---GATCGATGATTA – ----ATGTTGTTCGCGAAGGATCGATAA---

Application: Resource Allocation

Given a telecommunication network, find a frequency for each communication link such that total interference is minimized.

SLIDE 12

Overview

Introduction and Definitions
Solving soft constraints
– By Search
– By Inference

Depth-First Search

BT(X, D, C)
  if (X = ∅) then Top := c∅
  else
    xj := selectVar(X)
    forall a ∈ Dj do
      ∀c ∈ C s.t. xj ∈ var(c): c := Assign(c, xj, a)
      c∅ := Σ {c ∈ C : var(c) = ∅}
      if (LB < Top) then BT(X − {xj}, D − {Dj}, C)

Improving the LB

  • Assigned constraints (c∅)
  • Original constraints over the future variables: solve this subproblem (can be done beforehand)
  • Its optimal cost ⊕ c∅ gives a stronger LB

Importance of Bounds

Example: frequency assignment problem
– Instance: CELAR6-sub4 (#var: 22, #val: 44, optimum: 3230)
– Depth-first branch-and-bound search
– UB initialized to 100000: 3 hours
– UB initialized to 3230: 1 hour
Stochastic local search (SLS) can find UB = 3230 in a few minutes.

Overview

Introduction and Definitions
Solving soft constraints
– By Search
– By Inference

Synthesis

Join all constraints, then project. Limitation: very costly (Time: exp(n), Space: exp(n)).

SLIDE 13

Bucket Elimination

  • Select a variable Xi
  • Compute the set Ki of constraints that involve this variable
  • Add Elim(⊗c∈Ki c, Xi)
  • Remove the variable Xi and the set Ki
  • Time: Θ(exp(degi+1))
  • Space: Θ(exp(degi))

Tree Decomposition

Min-Fill Heuristic

Input: a graph G = (V, E), V = {v1, …, vn}
Output: an ordering of the nodes
For j = 1 to n do
– r ← a node in V with the smallest number of fill edges
– Put r in position j
– Connect r’s neighbors
– Remove r from the resulting graph

Induced Graph

Tree-structured Problems

BnB with Variable Elimination

A hybrid method. At each node:
– Select an unassigned variable Xi
– If degi ≤ k then eliminate Xi
– Else branch on the values of Xi
Properties:
– BB-VE(-1) is BB
– BB-VE(w*) is VE
– BB-VE(1) is like cycle-cutset

SLIDE 14

BnB with Variable Elimination

Example: BB-VE(2)

(Search-tree figure with c∅ labels.)

BnB with VE: Results

Example: Still-life (academic problem)
– Instance: n = 14 (#var: 196, #val: 2)
– Branch-and-Bound: 5 days
– Variable Elimination: 1 day
– BB-VE(18): 2 seconds

Background

– Domain splitting (e.g. Hentenryck’s book)
– Bucket elimination, and its extension to super-bucket elimination/tree decomposition (Dechter’s book)
– Backtracking combined with tree decompositions (algorithm BTD, Jégou and Terrioux 03)
– Dynamic programming on tree decompositions (algorithm CTE, Dechter’s book)
– Decision diagrams (Bryant 86, Bahar 93)
– Soft constraints [A* search]

CSPs

Variables, domains, constraints
  • Constraint: scope, function

Example: 4-Queens

Variables: rows. Domains: columns. Constraints: no two queens attack each other.

SLIDE 15

Backtracking Search

– Fix an order on the variables
– Choose a value for an unassigned variable
– Check all completely assigned constraints
– If inconsistent, prune and backtrack

Example

(Search tree: solution and constraints involved.)

Generalization to Domain Splitting

– Partition domains into sets
– Choose a subset for an unassigned variable
– Check all completely assigned constraints
  – Combine (join) the relevant parts of the constraints
  – If inconsistent, prune and backtrack

Example

(Partition; e.g., checking an assignment shows the constraint is satisfied.)

Example

(Search tree: solution.)

Example

(Search tree: solution and constraints involved.)

SLIDE 16

Cases

– Partition with |Pi| = |di|: limiting case of backtrack search (single assignments are tested, as before)
– Partition with |Pi| = 1: limiting case of constraint synthesis (a single constraint is inferred)
– Partition with 1 < |Pi| < |di|: hybrid of search and inference (search on subsets of tuples)

Example

(Synthesis: solution and constraints involved.)

Example

(Synthesis.)

Exploiting Structure

Problem: search is uninformed about the CSP structure
– |Pi| = |di|: leads to an unnecessarily large search tree (thrashing)
– |Pi| = 1: leads to unnecessarily large constraints
We can do better by considering the structure of the graph
– |Pi| = |di|: structure can be used to reduce the size of the search tree
– |Pi| = 1: structure can be used to reduce the size of the constraints

Bucket Elimination

Define a variable order and eliminate the variables one by one:
– Combine the constraints mentioning the variable in their scope (its “bucket”)
– Project the variable out from the result
That is, variables disappear as soon as they no longer influence (cannot constrain) the result. Instead of joining all constraints at once, bucket elimination needs to compute only one combination per bucket.

Super-bucket Elimination

A generalization of bucket elimination: eliminate variables in groups (i.e., following a partial order). (Figure: tree decomposition.)

SLIDE 17

Combination with Search

To exploit a decomposition in search, the order in which variables are assigned must be “compatible” with the order in which variables are eliminated.
More precisely, if variables are assigned in a given order, they have to be eliminated in the reverse (partial) order.
Construct the (super-)buckets (tree decomposition scheme) from this reverse order.

Combination with Search (Cont’d)

A tree decomposition with a compatible elimination order can be exploited during search as follows:
– Let separator(vj) denote the separator of tree node vj (the set of variables that vj shares with its parent, vi)
– Once a complete assignment has been found for a subtree, record it as a good at the separator (same for nogoods)
– By checking the goods/nogoods during search, we can avoid descending into the same subtree again and again
This algorithm is called BTD (backtracking with tree decompositions) (Jégou and Terrioux, AIJ 03).

BTD (Jégou Terrioux AIJ03)

Input: (partial) assignment A, tree node vi, set of variables Xvi to be assigned
Output: “true” if the assignment is consistent with all constraints in the subtree of vi, “false” otherwise
Initial call: BTD(∅, v1, χ(v1))

BTD Pseudocode

Function BTD(A, vi, Xvi)
  If Xvi = ∅ Then
    Consistent ← True
    F ← children(vi)
    While F ≠ ∅ And Consistent Do
      Choose vj ∈ F
      F ← F \ {vj}
      If A ↓ separator(vj) is a good of vi/vj Then
        Consistent ← True
      Else If A ↓ separator(vj) is a nogood of vi/vj Then
        Consistent ← False
      Else
        Consistent ← BTD(A, vj, vars(vj) \ separator(vj))
        If Consistent Then
          Record the good A ↓ separator(vj) for vi/vj
        Else
          Record the nogood A ↓ separator(vj) for vi/vj
        End If
      End If
    End While
    Return Consistent
  (continued on next slide)

BTD BTD Pseudocode Pseudocode (Cont (Cont’ ’d) d)

  • (Continued)

Else Choose x ∈ Xvi dom ← Dx Consistent ← False While dom ≠ ∅ And Not Consistent Do Choose val ∈ dom dom ← dom \ {val} If (A ∧ {x ← val}) semijoin {c: c ∈ C ∧ var(c) ⊆ (var(A)∪{x})} ≠ ∅ Then Consistent ← BTD(A ∧ {x ← val}, vi, Xvi \ {x}) End If End While Return Consistent End If

Note: Computes a full assignment, but returns only true/false.

Generalization to Domain Splitting

Incorporate domain splitting into BTD, that is, search over sets of assignments.
Yields the new algorithm BTDS (backtracking with tree decompositions and domain splitting).
Like BTD, BTDS records a set of good tuples and nogood tuples for each separator.
Unlike BTD, BTDS maintains only assignments to χ(vi) (instead of a full assignment).
Unlike BTD, BTDS returns assignments to the separator of vi (instead of only true/false).

SLIDE 18

BTDS

Input: set of (partial) assignments (a constraint) A, tree node vi, set of variables Xvi
Output: assignments to the separator of vi that are consistent with all constraints in the subtree of vi
Initial call: BTDS({∅}, v1, χ(v1))

BTDS Pseudocode

Function BTDS(A, vi, Xvi)
  If Xvi = ∅ Then
    F ← children(vi)
    While F ≠ ∅ And A ≠ ∅ Do
      Choose vj ∈ F
      F ← F \ {vj}
      Asep ← A ⇓ separator(vj)
      Aseprest ← Asep \ (goods(vj) ∪ nogoods(vj))
      If Aseprest ≠ ∅ Then
        Aseprestcons ← BTDS(Aseprest, vj, χ(vj) \ separator(vj))
        goods(vj) ← goods(vj) ∪ Aseprestcons
        nogoods(vj) ← nogoods(vj) ∪ (Aseprest \ Aseprestcons)
      End If
      A ← A semijoin goods(vj)
    End While
    Return A ⇓ separator(vi)
  (continued on next slide)

BTDS Pseudocode (Cont’d)

  Else
    Choose x ∈ Xvi
    PartitionElements ← Px
    Aextended ← A
    While PartitionElements ≠ ∅ And Aextended = ∅ Do
      Choose p ∈ PartitionElements
      PartitionElements ← PartitionElements \ {p}
      Aextended ← (A ∧ {x ← p}) semijoin {c : c ∈ C ∧ var(c) ⊆ (var(A) ∪ {x})}
      If Aextended ≠ ∅ Then
        Aextended ← BTDS(Aextended, vi, Xvi \ {x})
      End If
    End While
    Return Aextended
  End If

Cases

– Partition with |Pi| = |di|: yields the backtracking algorithm BTD (Jégou Terrioux AIJ03)
– Partition with |Pi| = 1: yields the dynamic programming algorithm CTE (Dechter 03)
– Partition with 1 < |Pi| < |di|: hybrid of BTD and CTE

Note: for the case |Pi| > 1, the algorithm has higher space complexity than CTE (exp(width) instead of exp(sep)). But it should be possible to reduce the space complexity to exp(sep).

BTDS applied to 4-Queens

(Variable order, partition, separators; tree decomposition and constraint graph.)

BTDS applied to 4-Queens

(Search tree.)

SLIDE 19

BTDS applied to 4-Queens

(Search-tree animation: v1 traversed, descend into v2; v2 traversed, descend into v3; solution found; record the nogood <x1=2, x2=4, x3=2> for v2/v3.)

Granularity of Domain Splitting

Empirical observation and theoretical considerations (Jégou Terrioux AIJ03): BTD outperforms CTE (cluster tree elimination, i.e. dynamic programming on the tree decomposition)
– BTD is a “lazy” variant of CTE (dynamic programming)
Therefore the finest granularity, |Pi| = |di|, is the optimal granularity of partitions in BTDS
– Best to perform dynamic programming as lazily as possible
But: this assumes that tuples are handled explicitly
– More efficient, implicit data structures are possible when manipulating whole sets of tuples

SLIDE 20

Symbolic Encoding

Decision diagrams (Bryant 86): a graph-based, canonical representation of (boolean) functions.
Time and space complexity depend on the graph size rather than on the number of tuples of the represented function. (Figure: a function and its decision diagram, ROBDD.)

BTDS with Symbolic Encoding

In many practical cases, decision diagrams are much more compact than representing tuples explicitly.
– Can make operations on sets of tuples (inference) more efficient
– But won’t make operations on single tuples (search) more efficient
Therefore, in BTDS, larger partition elements become more advantageous (this shifts the optimal granularity towards |Pi| = 1).
In many practical cases, the optimal granularity for partitions in BTDS becomes 1 < |Pi| < |di|: exploit both the structure in the graph and the structure in the tuples.

Generalization to Optimization

Variables, domains, constraints
  • Constraint: scope, function
Valuation structure (soft constraints), with ⊥ the best and ⊤ the worst valuation.

Example: Full Adder Diagnosis

Variables describe the modes of the gates. Gates are either in good or broken mode.

Example: Full Adder Diagnosis

AND-gates broken with 1% probability; OR- and XOR-gates broken with 5% probability. Probabilistic valuation structure.

Example: Soft Constraints

For details, see the ECAI’04 paper.

SLIDE 21

Example: Tree Decomposition

(Elimination order.)

Depth-First Branch and Bound

Recursive algorithm BTDval (Terrioux Jégou CP03), backtracking with tree decompositions for valued constraints, which extends BTD to soft constraints.
Records tuples and their values for each separator (“valued goods” instead of goods and nogoods).

BTDval (Terrioux Jégou CP03)

Input: (partial) assignment A, tree node vi, set of variables Xvi, lower bound lvi (value of the assignment so far), upper bound uvi (value of the best solution found so far)
Output: value of the best extension of A to the subtree of vi with value < uvi, or some value ≥ uvi if no such extension exists
Initial call: BTDval(∅, v1, χ(v1), ⊥, ⊤)

BTDval Pseudocode

Function BTDval(A, vi, Xvi, lvi, uvi)
  If Xvi = ∅ Then
    F ← children(vi)
    While F ≠ ∅ And lvi < uvi Do
      Choose vj ∈ F
      F ← F \ {vj}
      If ⟨A ↓ separator(vj), v⟩ is a good of vi/vj Then
        lvi ← lvi ⊕ v
      Else
        v ← BTDval(A, vj, vars(vj) \ separator(vj), ⊥, uvi)
        lvi ← lvi ⊕ v
        Record the good ⟨A ↓ separator(vj), v⟩ for vi/vj
      End If
    End While
    Return lvi
  (continued on next slide)

BTDval Pseudocode (Cont’d)

  Else
    Choose x ∈ Xvi
    dom ← Dx
    While dom ≠ ∅ And lvi < uvi Do
      Choose val ∈ dom
      dom ← dom \ {val}
      lval ← ((A ∧ {x ← val}) semijoin {c : c ∈ C ∧ var(c) ⊆ (var(A) ∪ {x})}) ↓ ∅
      If lvi ⊕ lval < uvi Then
        uvi ← min(uvi, BTDval(A ∧ {x ← val}, vi, Xvi \ {x}, lvi ⊕ lval, uvi))
      End If
    End While
    Return uvi
  End If

Note: computes a full assignment, but returns only a value.

Generalization to Domain Splitting

Incorporate domain splitting into BTDval, that is, search over a whole set of valued assignments.
Yields BTDSval (backtracking with tree decompositions and domain splitting for valued constraints).
Like BTDval, BTDSval records valued goods. Unlike BTDval, BTDSval maintains only assignments to χ(vi), and returns assignments to the separator of vi.

SLIDE 22

Sinking Operation

The sinking operation (Bistarelli et al. SOFT03, Morris AAAI93) returns a new constraint in which all tuple values that cannot beat a given bound have been replaced by ⊤. (Figure: a constraint and sink(fe2, 0.05).)
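A hedged sketch of sinking in the weighted (minimization) reading, where tuples whose value reaches the bound are sunk to ⊤ and thereby pruned; the exact semantics in the cited papers may differ:

```python
TOP = float("inf")  # ⊤ of an additive (weighted) valuation structure

def sink(constraint, bound):
    """Replace every tuple value >= bound by ⊤ (it cannot beat the bound)."""
    scope, table = constraint
    return scope, {t: (TOP if v >= bound else v) for t, v in table.items()}
```

Sinking against the current upper bound turns bound checks into ordinary emptiness checks on the constraint, which is how BTDSval below uses it.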

Generalized Sinking Operation

The generalized sinking operation sink(c, f) returns a new constraint in which all values of tuples of c that cannot beat the values of the corresponding tuples of f have been replaced by ⊤.
Generalizes the emptiness check (≠ ∅) to soft constraints. (Figure: constraints and sink(fe2, f).)

BTDSval

Input: set of (partial) assignments (a constraint) fa, tree node vi, variables Xvi, upper bound fu
– No explicit lower bound (it is contained in the valued assignments)
– Note that the bounds (lower and upper) are now functions
Output: best assignments to the separator of vi with values < fu, or values ≥ fu if no such assignments exist
Notation: f⊥ is the constraint with value ⊥ for all tuples, f⊤ the constraint with value ⊤ for all tuples
Initial call: BTDSval(f⊥, v1, χ(v1), f⊤)

BTDSval Pseudocode

Function BTDSval(fa, vi, Xvi, fu)
  If Xvi = ∅ Then
    F ← children(vi)
    fa ← sink(fa, fu)
    While F ≠ ∅ And fa ≠ f⊤ Do
      Choose vj ∈ F
      F ← F \ {vj}
      fasep ← fa ⇓ separator(vj)
      faseprest ← tuples of fasep that are not goods of vi/vj
      If faseprest ≠ f⊤ Then
        farestval ← BTDSval(faseprest, vj, χ(vj) \ separator(vj), fu)
        Record tuples in farestval as goods of vi/vj
      End If
      fa ← fa semijoin goods(vj)
      fa ← sink(fa, fu)
    End While
    Return fa ⇓ separator(vi)
  (continued on next slide)

BTDSval Pseudocode (Cont’d)

  Else
    Choose x ∈ Xvi
    PartitionElements ← Px
    fa ← sink(fa, fu)
    Aextended ← fa
    While PartitionElements ≠ ∅ And Aextended ≠ f⊤ Do
      Choose p ∈ PartitionElements
      PartitionElements ← PartitionElements \ {p}
      Aextended ← (fa ∧ {x ← p}) semijoin {c : c ∈ C ∧ var(c) ⊆ (var(fa) ∪ {x})}
      Aextended ← sink(Aextended, fu)
      If Aextended ≠ f⊤ Then
        Aextended ← BTDSval(Aextended, vi, Xvi \ {x}, fu)
        fu ← min(fu, Aextended)
      End If
    End While
    Return fu
  End If

Cases

– Partition with |Pi| = |di|: yields the backtracking algorithm BTDval (Jégou Terrioux CP03)
– Partition with |Pi| = 1: yields the dynamic programming algorithm CTE with soft constraints (Dechter 03)
– Partition with 1 < |Pi| < |di|: hybrid of BTDval and CTE

Note: for the case |Pi| > 1, the algorithm has higher space complexity than CTE (exp(width) instead of exp(sep)). But it should be possible to reduce the space complexity to exp(sep).

SLIDE 23

BTDSval applied to Full Adder

(Partition; constraint hypergraph and tree decomposition.)

BTDSval applied to Full Adder

(Search-tree animation, upper bound initially 0: goods <u=0, y=0> .047, <u=0, y=1> .902, <v=0, w=0> .950 are recorded; the upper bound becomes .044; goods recorded at v2 are exploited as a “forward jump”.)

SLIDE 24

BTDSval applied to Full Adder

(Animation continues: good <v=0, w=1> .050 is recorded; further branches are cut by the bound.)

SLIDE 25

BTDSval applied to Full Adder

(Search tree, remaining branches cut by the bound.) Finished. Best solution = .044, # nodes = 45.

BTDSval with Symbolic Encoding

Algebraic Decision Diagrams (ADDs, Bahar 93): graph-based, canonical representation of functions with non-binary values.
When encoding constraints as DDs in BTDSval, then, as for BTDS, larger partition elements become more advantageous (this shifts the optimal granularity towards |Pi| = 1).
In many practical cases, the optimal granularity for partitions in BTDSval becomes 1 < |Pi| < |di|.

Experimental Results

Best-First Search

Replace the depth-first branch-and-bound search in BTDval by best-first (A*) search.
Yields the algorithm ATDval (A* search with tree decompositions for valued constraints).
One search queue per tree node. Search-queue entries contain:
– A: assignment
– v: value
– vi: tree node
– Xvi: set of variables
– F: set of children of vi

Best-First Search

Problem: the search has to be performed for a particular assignment, and the values depend on this assignment.
Therefore, different search queues would have to be maintained for each different assignment!
Possible solution: switch to the dual problem (unary soft constraints, n-ary hard equality constraints).
– See the SOFT-04 paper.

Related Work

Set-based search (Jörg Denzinger, U Calgary)

SLIDE 26

Material

BTD applied to 4-Queens

(Variable order and separators; tree decomposition and constraint graph.)

BTD applied to 4-Queens

(Search-tree animation: v1 traversed, descend into v2.)

SLIDE 27

BTD applied to 4-Queens

(Animation: record nogood <1,3> for v1/v2; v1 traversed, descend into v2; v2 traversed, descend into v3; record nogood <1,4,2> for v2/v3; record nogood <1,4> for v1/v2; v1 traversed, descend into v2.)

SLIDE 28

BTD applied to 4-Queens

(Animation continues: v2 traversed, descend into v3; solution found.)

Example: “Soft” Graph Coloring

Variables, domains, and constraints:
– Adjacent colors must be different
– The combinations red/blue and red/green have a penalty

Example: “Soft” Graph Coloring

(Tree decomposition.)

Example: “Soft” Graph Coloring

(Partitions.)