Lecture 13: Probabilistic Reasoning
Learning Objectives¶
Represent knowledge with Bayesian networks
Perform exact inference: variable elimination
Apply approximate inference: sampling
Understand causal networks
Bayesian Networks¶

DAG: Directed acyclic graph
Nodes: Random variables
Edges: Direct influence
CPT: Conditional probability table per node
BN Semantics¶
Joint: P(X₁,...,Xₙ) = Πᵢ P(Xᵢ|Parents(Xᵢ)) (see the sketch below)
Compact: O(n × d^k) numbers (k = max parents, d = domain size) vs. O(d^n) for the full joint
Conditional independence: Read off the graph via d-separation
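A minimal sketch of the product form, using the chapter's burglary/earthquake/alarm network and its textbook CPT values (the Python names are just for illustration):

```python
# Sketch: a joint probability as a product of CPT entries, using the
# burglary network from AIMA Ch. 13 (CPT values are the textbook's).
P_B = {True: 0.001, False: 0.999}
P_E = {True: 0.002, False: 0.998}
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}      # P(a | B, E)
P_J = {True: 0.90, False: 0.05}                          # P(j | A)
P_M = {True: 0.70, False: 0.01}                          # P(m | A)

def joint(b, e, a, j, m):
    """P(b, e, a, j, m) = P(b) P(e) P(a|b,e) P(j|a) P(m|a)."""
    pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    pj = P_J[a] if j else 1 - P_J[a]
    pm = P_M[a] if m else 1 - P_M[a]
    return P_B[b] * P_E[e] * pa * pj * pm

# P(j, m, a, ¬b, ¬e) ≈ 0.000628, as in the chapter's worked example
print(joint(False, False, True, True, True))
```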
Constructing BN¶
Order: Choose a variable ordering (causes before effects keeps the network compact; see the sketch below)
For each Xᵢ: Add as parents the already-added variables that “directly influence” Xᵢ
CPT: Specify P(Xᵢ|Parents(Xᵢ))
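A rough illustration of why the ordering matters: parameter counts for the same five Boolean variables under a causes-first ordering versus a reversed ordering. The reversed-order parent sets follow the chapter's discussion but should be read as illustrative:

```python
# Sketch: CPT parameter counts for two node orderings of the burglary
# network (all nodes Boolean).
def num_params(parents):
    # A Boolean node with k Boolean parents needs 2**k CPT rows.
    return sum(2 ** len(ps) for ps in parents.values())

causal_order = {"B": [], "E": [], "A": ["B", "E"], "J": ["A"], "M": ["A"]}
reversed_order = {"M": [], "J": ["M"], "A": ["M", "J"], "B": ["A"], "E": ["A", "B"]}

print(num_params(causal_order))    # 10 independent numbers
print(num_params(reversed_order))  # 13 -- a worse ordering costs more
```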
d-Separation¶
Blocked path: A chain (→ B →) or fork (← B →) is blocked when B is in the evidence Z; a collider (→ B ←) is blocked when neither B nor any descendant of B is in Z
Active path: A path with no blocked triple
X ⊥ Y | Z: Every path between X and Y is blocked by Z (see the sketch below)
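A minimal sketch of the triple rules on the burglary network's edges. The full collider rule also checks descendants of the middle node, which this sketch omits for brevity:

```python
# Sketch: is the middle node B of a path triple A - B - C blocked given Z?
# edges is a set of directed (parent, child) pairs.
def triple_blocked(A, B, C, Z, edges):
    collider = (A, B) in edges and (C, B) in edges   # A -> B <- C
    if collider:
        return B not in Z    # colliders block unless B (or a descendant) is observed
    return B in Z            # chains and forks block when B is observed

edges = {("B", "A"), ("E", "A"), ("A", "J"), ("A", "M")}
print(triple_blocked("B", "A", "E", set(), edges))    # True: collider, unobserved
print(triple_blocked("B", "A", "E", {"A"}, edges))    # False: observing A activates it
print(triple_blocked("J", "A", "M", {"A"}, edges))    # True: fork blocked by evidence A
```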
Exact Inference: Enumeration¶
P(X|e) = α P(X,e), where α = 1/P(e) normalizes
P(X,e) = Σᵧ P(X,e,y) (sum over the hidden variables y)
Cost: Exponential in the number of hidden variables (see the sketch below)
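A sketch of enumeration for P(Burglary | JohnCalls = true, MaryCalls = true), reusing the joint() function from the factorization sketch above:

```python
from itertools import product

# Sketch: enumerate and sum out the hidden variables E and A, then normalize.
def enumerate_burglary(j=True, m=True):
    unnorm = {}
    for b in (True, False):
        unnorm[b] = sum(joint(b, e, a, j, m)
                        for e, a in product((True, False), repeat=2))
    alpha = 1 / sum(unnorm.values())
    return {b: alpha * p for b, p in unnorm.items()}

print(enumerate_burglary())   # roughly {True: 0.284, False: 0.716}, as in the chapter
```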
Variable Elimination¶
Eliminate variables one by one
Factors: Functions over variable sets
Operations: Pointwise multiplication of factors; summing a variable out of a factor (see the sketch below)
Ordering: Elimination order strongly affects efficiency
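A minimal sketch of the two factor operations, representing a factor as a pair (variable list, table mapping value tuples to numbers); the example numbers are made up:

```python
# Sketch: factor operations for variable elimination.
def multiply(f1, f2):
    vars1, t1 = f1
    vars2, t2 = f2
    out_vars = vars1 + [v for v in vars2 if v not in vars1]
    table = {}
    for a1, p1 in t1.items():
        asg = dict(zip(vars1, a1))
        for a2, p2 in t2.items():
            # combine only rows that agree on shared variables
            if all(asg.get(v, x) == x for v, x in zip(vars2, a2)):
                full = dict(asg, **dict(zip(vars2, a2)))
                table[tuple(full[v] for v in out_vars)] = p1 * p2
    return out_vars, table

def sum_out(var, f):
    vars_, t = f
    i = vars_.index(var)
    table = {}
    for a, p in t.items():
        key = a[:i] + a[i + 1:]
        table[key] = table.get(key, 0.0) + p
    return vars_[:i] + vars_[i + 1:], table

# Example: sum A out of P(J|A) * P(A) to get P(J)  (made-up P(A))
f_a = (["A"], {(True,): 0.1, (False,): 0.9})
f_ja = (["J", "A"], {(True, True): 0.9, (False, True): 0.1,
                     (True, False): 0.05, (False, False): 0.95})
print(sum_out("A", multiply(f_ja, f_a)))   # P(J=true) = 0.135
```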
Complexity of Exact Inference¶
NP-hard in general
Polytrees (singly connected networks): Time linear in network size
Clustering: Join-tree (junction-tree) algorithm for multiply connected networks
Approximate Inference: Sampling¶
Direct sampling: Sample from prior
Rejection sampling: Sample from the prior, discard samples inconsistent with the evidence (see the sketch below)
Likelihood weighting (importance sampling): Fix the evidence variables and weight each sample by the likelihood of that evidence
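A sketch of rejection sampling for P(Burglary | JohnCalls = true, MaryCalls = true), reusing the CPT dictionaries from the factorization sketch. Because this evidence is rare, most samples are thrown away, which is exactly what motivates likelihood weighting:

```python
import random

# Sketch: sample every variable from the prior in topological order,
# then keep only samples consistent with j = m = True.
def prior_sample():
    b = random.random() < P_B[True]
    e = random.random() < P_E[True]
    a = random.random() < P_A[(b, e)]
    j = random.random() < P_J[a]
    m = random.random() < P_M[a]
    return b, e, a, j, m

def rejection_estimate(n=200_000):
    kept = [s for s in (prior_sample() for _ in range(n)) if s[3] and s[4]]
    return sum(s[0] for s in kept) / len(kept) if kept else float("nan")

print(rejection_estimate())   # noisy estimate of P(b | j, m) ≈ 0.284
```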
Markov Chain Monte Carlo¶
Gibbs sampling: Resample one non-evidence variable at a time, conditioned on the others (its Markov blanket); see the sketch below
Stationary distribution: The posterior P(X|e)
Mixing: How quickly the chain converges to the stationary distribution
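A sketch of Gibbs sampling for the same query. For brevity, each conditional is computed from the full joint() of the factorization sketch rather than from the Markov blanket alone; the resulting distribution is the same:

```python
import random

# Sketch: Gibbs sampling for P(B | j, m) with evidence j = m = True.
def gibbs_estimate(n=20_000, j=True, m=True):
    state = {"B": False, "E": False, "A": False}
    count_b = 0
    for _ in range(n):
        for var in ("B", "E", "A"):
            def p(val):
                s = dict(state, **{var: val})
                return joint(s["B"], s["E"], s["A"], j, m)
            pt, pf = p(True), p(False)
            # resample var from its conditional given everything else
            state[var] = random.random() < pt / (pt + pf)
        count_b += state["B"]
    return count_b / n

print(gibbs_estimate())   # should also hover around 0.284
```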
Causal Networks¶
Causation: Edges represent direct causal influence, not mere correlation
do-operator: P(Y|do(X=x)) describes an intervention, which can differ from the observational P(Y|X=x)
Back-door criterion: Condition on a set Z that blocks all back-door (confounding) paths; then P(y|do(x)) = Σ_z P(y|x,z) P(z) (see the sketch below)
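A small sketch of the back-door adjustment with a single Boolean confounder Z that satisfies the criterion; all numbers are made up for illustration:

```python
# Sketch: back-door adjustment with one confounder Z (Z -> X, Z -> Y).
P_Z = {True: 0.3, False: 0.7}
P_Y_given_XZ = {(True, True): 0.8, (True, False): 0.6,
                (False, True): 0.5, (False, False): 0.2}   # P(Y=1 | x, z)

def p_y_do_x(x):
    """P(Y=1 | do(X=x)) = sum over z of P(Y=1 | x, z) P(z)."""
    return sum(P_Y_given_XZ[(x, z)] * P_Z[z] for z in (True, False))

print(p_y_do_x(True), p_y_do_x(False))   # 0.66 vs. 0.29
# The purely observational P(Y=1 | X=x) would also depend on P(X | Z)
# and can differ, because Z confounds X and Y.
```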
Summary¶
BN: Compact representation, product form
Exact: Variable elimination, join tree
Approximate: Rejection sampling, likelihood weighting, Gibbs sampling (MCMC)
Causal: do-operator, back-door
References¶
Russell & Norvig, AIMA 4e, Ch. 13
Chapter PDF:
chapters/chapter-13.pdf
Questions?¶
Next lecture: Probabilistic Reasoning over Time (Chapter 14)