Lecture 6: Constraint Satisfaction Problems
Why CSPs Matter¶
Scheduling, assignment, design, configuration
Same abstraction: variables, domains, constraints
Inference (arc consistency) + search (backtracking)
Local search (min-conflicts) often wins in practice
Learning Objectives¶
Define CSPs: variables, domains, constraints
Apply constraint propagation: arc consistency, path consistency
Implement backtracking search with heuristics
Use local search for CSPs
Exploit problem structure
CSP Formulation¶
Variables: X₁, X₂, ..., Xₙ
Domains: D₁, D₂, ..., Dₙ (possible values)
Constraints: Restrictions on variable combinations
Solution: Assignment satisfying all constraints
Example: Map Coloring¶

Variables: Regions (e.g., WA, NT, SA, Q, NSW, V, T)
Domain: {red, green, blue}
Constraints: Adjacent regions different colors
Solution: Valid coloring
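The map-coloring CSP above can be written down directly. A minimal sketch, assuming a simple dict-based representation (variables mapped to domains, plus a neighbor map); the `consistent` helper is illustrative, not from aima-python:

```python
# Australia map-coloring CSP: variables, domains, and binary != constraints
variables = ["WA", "NT", "SA", "Q", "NSW", "V", "T"]
domains = {v: {"red", "green", "blue"} for v in variables}
neighbors = {
    "WA": {"NT", "SA"}, "NT": {"WA", "SA", "Q"},
    "SA": {"WA", "NT", "Q", "NSW", "V"}, "Q": {"NT", "SA", "NSW"},
    "NSW": {"Q", "SA", "V"}, "V": {"SA", "NSW"}, "T": set(),
}

def consistent(assignment):
    """Check that no two assigned adjacent regions share a color."""
    return all(assignment[x] != assignment[y]
               for x in assignment for y in neighbors[x] if y in assignment)
```

Note that a partial assignment (e.g., only WA and NT colored) can already be checked, which is exactly what backtracking search needs.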
Example: N-Queens¶
Variables: One per column (or row), one queen each
Domain: {1, 2, ..., N} (the queen's row)
Constraints: No two queens attack each other (same row or diagonal)
Binary constraints: All constraints are pairwise
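With one variable per column, the attack constraint reduces to a row check and a diagonal check. A minimal sketch (function names are illustrative):

```python
def queens_conflict(row1, col1, row2, col2):
    """Two queens attack iff they share a row or a diagonal
    (columns differ by construction of the formulation)."""
    return row1 == row2 or abs(row1 - row2) == abs(col1 - col2)

def valid(assignment):
    """assignment: dict column -> row; check every pairwise constraint."""
    cols = list(assignment)
    return not any(queens_conflict(assignment[c1], c1, assignment[c2], c2)
                   for i, c1 in enumerate(cols) for c2 in cols[i + 1:])
```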
Constraint Types¶
Unary: Single variable (e.g., SA ≠ red)
Binary: Two variables (e.g., WA ≠ NT)
Higher-order: 3+ variables
Global: All-different, etc.
Constraint Propagation¶
Inference: Reduce domains before search
Node consistency: Unary constraints
Arc consistency: For each arc (X,Y), every x in D(X) has some y in D(Y) satisfying the constraint between X and Y
AC-3 Algorithm¶

Maintain queue of arcs to check
When revising (X,Y) reduces D(X), re-enqueue arcs (Z,X) for every neighbor Z of X (except Y)
Repeat until no changes
O(n²d³) for n variables, domain size d
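The steps above can be sketched as follows, assuming binary constraints given as a predicate `constraint(x, vx, y, vy)` and mutable set-valued domains (a simplification of the AC-3 in aima-python):

```python
from collections import deque

def revise(domains, constraint, x, y):
    """Remove values of x that have no supporting value in D(y)."""
    removed = False
    for vx in set(domains[x]):
        if not any(constraint(x, vx, y, vy) for vy in domains[y]):
            domains[x].discard(vx)
            removed = True
    return removed

def ac3(domains, neighbors, constraint):
    """Propagate until every arc (X, Y) is consistent.
    Returns False if some domain is wiped out (no solution)."""
    queue = deque((x, y) for x in domains for y in neighbors[x])
    while queue:
        x, y = queue.popleft()
        if revise(domains, constraint, x, y):
            if not domains[x]:
                return False
            # D(x) shrank: arcs pointing at x must be rechecked
            queue.extend((z, x) for z in neighbors[x] if z != y)
    return True
```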
Backtracking Search¶
Depth-first: Assign variables one by one
Backtrack: When constraint violated, try next value
Improvement: Use inference when assigning
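A minimal sketch of plain backtracking (static variable order, no inference), assuming a `consistent` predicate that accepts partial assignments:

```python
def backtrack(assignment, variables, domains, consistent):
    """Depth-first backtracking: assign one variable at a time;
    on a constraint violation, undo and try the next value."""
    if len(assignment) == len(variables):
        return assignment
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        assignment[var] = value
        if consistent(assignment):
            result = backtrack(assignment, variables, domains, consistent)
            if result is not None:
                return result
        del assignment[var]
    return None
```

The heuristics below plug into the two choice points: which `var` to pick, and in what order to try its values.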
Variable Ordering¶
MRV (Minimum Remaining Values): Choose variable with fewest choices
Degree heuristic: Choose variable involved in most constraints
Fail-first: Detecting dead ends early reduces the effective branching factor
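MRV with the degree heuristic as tie-breaker fits in one `min` call. A sketch (the function name is illustrative, mirroring AIMA's pseudocode):

```python
def select_unassigned_variable(variables, domains, neighbors, assignment):
    """MRV, ties broken by degree: prefer the fewest remaining values,
    then the most constraints on other unassigned variables."""
    unassigned = [v for v in variables if v not in assignment]
    return min(unassigned,
               key=lambda v: (len(domains[v]),
                              -sum(1 for n in neighbors[v]
                                   if n not in assignment)))
```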
Value Ordering¶
Least Constraining Value: Choose value that rules out fewest values for neighbors
Fail-last on values: leaves maximum flexibility for subsequent assignments
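For binary inequality constraints (as in map coloring), "values ruled out" is just a membership count over the unassigned neighbors' domains. A sketch under that assumption:

```python
def order_domain_values(var, domains, neighbors, assignment):
    """Least-constraining value: try first the value that removes the
    fewest options from unassigned neighbors (assumes != constraints)."""
    def ruled_out(value):
        return sum(value in domains[n]
                   for n in neighbors[var] if n not in assignment)
    return sorted(domains[var], key=ruled_out)
```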
Interleaving Search and Inference¶
Forward checking: After assigning X, remove inconsistent values from neighbors
MAC (Maintaining Arc Consistency): Run AC-3 after each assignment, starting from the arcs into the newly assigned variable
MAC: Stronger pruning than forward checking, at higher inference cost per node
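Forward checking is the cheap end of this spectrum. A minimal sketch, again assuming binary inequality constraints and set-valued domains; the pruned list lets the caller restore domains on backtrack:

```python
def forward_check(var, value, domains, neighbors, assignment):
    """After assigning var=value, prune inconsistent values from the
    domains of unassigned neighbors (assumes != constraints).
    Returns the pruned (variable, value) pairs for undoing on
    backtrack, or None if some neighbor's domain is wiped out."""
    pruned = []
    for n in neighbors[var]:
        if n not in assignment and value in domains[n]:
            domains[n].discard(value)
            pruned.append((n, value))
            if not domains[n]:
                # dead end: restore what we removed and fail early
                for m, v in pruned:
                    domains[m].add(v)
                return None
    return pruned
```

MAC would additionally propagate each such removal with AC-3, starting from the arcs into the pruned neighbors.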
Intelligent Backtracking¶
Backjumping: Jump back to the most recent variable responsible for the failure, not just the chronological predecessor
Conflict-directed backjumping: Track a conflict set for each variable to decide where to jump
Constraint learning: Record no-goods (partial assignments that cannot be extended) to avoid repeating failures
Local Search for CSPs¶

State: Complete assignment (may violate constraints)
Min-conflicts: Pick variable in conflict, assign value that minimizes conflicts
Very effective for many CSPs (e.g., N-queens, scheduling)
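Min-conflicts on N-queens is a few lines. A sketch (function name and step limit are illustrative; ties between equally good rows are broken randomly to escape plateaus):

```python
import random

def min_conflicts_nqueens(n, max_steps=10000, seed=1):
    """Min-conflicts local search: rows[c] = row of the queen in
    column c. Repeatedly pick a conflicted queen and move it to a
    row minimizing its conflicts."""
    rng = random.Random(seed)
    rows = [rng.randrange(n) for _ in range(n)]

    def conflicts(col, row):
        return sum(1 for c in range(n) if c != col and
                   (rows[c] == row or abs(rows[c] - row) == abs(c - col)))

    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(c, rows[c]) > 0]
        if not conflicted:
            return rows                       # complete, conflict-free
        col = rng.choice(conflicted)
        # move to a min-conflict row, breaking ties randomly
        rows[col] = min(range(n),
                        key=lambda r: (conflicts(col, r), rng.random()))
    return None
```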
Problem Structure¶
Tree-structured CSP: No cycles → O(nd²) with arc consistency
Cutset conditioning: Instantiate cutset, solve tree
Tree decomposition: Decompose into tree of subproblems
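The O(nd²) tree algorithm is: order variables with parents before children, make each parent arc-consistent with its children bottom-up, then assign top-down with no backtracking. A sketch, assuming the tree is given as a `children` map rooted at `root` and a directed binary `constraint(parent_value, child_value)`:

```python
def solve_tree_csp(root, children, domains, constraint):
    """Solve a tree-structured CSP: backward pass enforces directed
    arc consistency parent->child; forward pass assigns greedily."""
    # Topological order via BFS: parents appear before children.
    order, parent = [root], {root: None}
    for node in order:
        for c in children.get(node, []):
            parent[c] = node
            order.append(c)
    # Backward pass: keep only parent values supported by the child.
    for node in reversed(order[1:]):
        p = parent[node]
        domains[p] = {vp for vp in domains[p]
                      if any(constraint(vp, vc) for vc in domains[node])}
        if not domains[p]:
            return None                       # inconsistent
    # Forward pass: any supported value works; no backtracking needed.
    assignment = {root: next(iter(domains[root]))}
    for node in order[1:]:
        assignment[node] = next(v for v in domains[node]
                                if constraint(assignment[parent[node]], v))
    return assignment
```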
Summary¶
CSP: Variables, domains, constraints
Propagation: Arc consistency (AC-3)
Backtracking: MRV, LCV, forward checking, MAC
Local search: Min-conflicts
Structure: Trees, cutsets, decomposition
References¶
Russell & Norvig, AIMA 4e, Ch. 6
Chapter PDF: chapters/chapter-06.pdf
aima-python: csp.ipynb