Lecture 12: Quantifying Uncertainty
Why Probability?
Logic is brittle: we need degrees of belief
Probability: axioms, conditioning, Bayes
Same math for diagnosis, prediction, and decision
Foundation for learning and decision making
Learning Objectives
Represent uncertainty with probability
Apply Bayes’ rule
Use full joint distributions for inference
Build naive Bayes models
Acting Under Uncertainty
Deterministic: One outcome per action
Uncertain: Multiple outcomes, probabilities
Rational: Maximize expected utility (sketch below)
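A minimal sketch of choosing by expected utility, EU(a) = Σ P(outcome|a) U(outcome); the action names, probabilities, and utilities are invented for illustration:

```python
# Expected utility: EU(a) = sum over outcomes of P(outcome|a) * U(outcome).
# Actions, outcome probabilities, and utilities are invented for illustration.
actions = {
    "take_umbrella": {"dry": (1.0, 70)},                   # outcome: (P(outcome|a), U(outcome))
    "go_without":    {"dry": (0.6, 100), "wet": (0.4, 0)},
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes.values())

for a in actions:
    print(a, expected_utility(actions[a]))                 # 70.0 vs 60.0
print("rational choice:", max(actions, key=lambda a: expected_utility(actions[a])))
```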
Probability Basics
P(a): Probability of proposition a
P(a) ∈ [0, 1]
P(True) = 1, P(False) = 0
P(a ∨ b) = P(a) + P(b) - P(a ∧ b) (inclusion-exclusion; checked below)
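A quick numeric check of these axioms on a toy joint over two propositions (the four entries are made up; they only need to be nonnegative and sum to 1):

```python
# Toy joint distribution over propositions a and b.
joint = {(True, True): 0.30, (True, False): 0.20,
         (False, True): 0.25, (False, False): 0.25}

P_a       = sum(p for (a, b), p in joint.items() if a)        # 0.50
P_b       = sum(p for (a, b), p in joint.items() if b)        # 0.55
P_a_and_b = joint[(True, True)]                               # 0.30
P_a_or_b  = sum(p for (a, b), p in joint.items() if a or b)   # 0.75

assert abs(sum(joint.values()) - 1.0) < 1e-9                  # P(True) = 1
assert abs(P_a_or_b - (P_a + P_b - P_a_and_b)) < 1e-9         # inclusion-exclusion
```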
Conditional Probability
P(a|b): Probability of a given b
P(a|b) = P(a ∧ b) / P(b), defined when P(b) > 0
Product rule: P(a ∧ b) = P(a|b) P(b) (sketch below)
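Continuing the toy joint from the previous sketch: conditioning restricts attention to the worlds where b holds and renormalizes:

```python
# P(a|b) = P(a ∧ b) / P(b), computed from the same toy joint as above.
joint = {(True, True): 0.30, (True, False): 0.20,
         (False, True): 0.25, (False, False): 0.25}

P_b         = sum(p for (a, b), p in joint.items() if b)   # 0.55
P_a_and_b   = joint[(True, True)]                          # 0.30
P_a_given_b = P_a_and_b / P_b                              # ~0.545

assert abs(P_a_given_b * P_b - P_a_and_b) < 1e-9           # product rule
print(round(P_a_given_b, 3))
```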
Full Joint Distribution
Joint: P(X₁,...,Xₙ) for all variables
Marginalization: P(X) = Σᵧ P(X,y)
Inference: P(query|evidence) by summing matching entries of the joint (worked example below)
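Inference by enumeration on the Toothache/Catch/Cavity joint from AIMA (Fig. 12.3): sum out everything the query and evidence do not mention, then normalize:

```python
# Full joint P(Toothache, Catch, Cavity) from AIMA Fig. 12.3.
# Keys are (toothache, catch, cavity).
joint = {
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def P(pred):
    """Marginalization: sum the joint entries where pred holds."""
    return sum(p for world, p in joint.items() if pred(world))

# P(cavity | toothache) = P(cavity ∧ toothache) / P(toothache)
print(round(P(lambda w: w[2] and w[0]) / P(lambda w: w[0]), 3))   # 0.6
```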
Applying Bayes’ Rule
Bayes’ rule: P(a|b) = P(b|a) P(a) / P(b)
Diagnosis: P(disease|symptom) from P(symptom|disease) and the prior
Update: Prior → Posterior with evidence (worked example below)
Combining evidence: chain rule, P(a ∧ b ∧ c) = P(a) P(b|a) P(c|a ∧ b)
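The numbers below are AIMA’s stiff-neck/meningitis example, run directly through Bayes’ rule; note how a strong likelihood still yields a small posterior when the prior is tiny:

```python
# Bayes' rule: P(m|s) = P(s|m) P(m) / P(s)   (AIMA's meningitis example)
P_s_given_m = 0.7        # P(stiff neck | meningitis)
P_m = 1 / 50000          # prior P(meningitis)
P_s = 0.01               # P(stiff neck)

print(P_s_given_m * P_m / P_s)   # 0.0014
```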
Independence
P(a|b) = P(a) if a independent of b
P(a ∧ b) = P(a) P(b)
Reduces joint size: n independent Boolean variables need n numbers instead of 2ⁿ joint entries (check below)
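A numeric check on two coin flips (assumed fair, so the joint is uniform), plus the payoff in parameter count:

```python
# For independent a, b: P(a ∧ b) = P(a) P(b). Two fair coin flips:
joint = {(h1, h2): 0.25 for h1 in (True, False) for h2 in (True, False)}

P_a = sum(p for (a, b), p in joint.items() if a)   # 0.5
P_b = sum(p for (a, b), p in joint.items() if b)   # 0.5
assert abs(joint[(True, True)] - P_a * P_b) < 1e-9

# n independent Boolean variables: n numbers instead of 2**n joint entries.
n = 30
print(n, "numbers vs", 2 ** n, "joint entries")
```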
Naive Bayes
Assumption: Features independent given class
P(Class|X₁,...,Xₙ) ∝ P(Class) Πᵢ P(Xᵢ|Class)
Efficient: O(n) parameters
Text classification: Bag of words (classifier sketch below)
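A minimal bag-of-words naive Bayes classifier with add-one (Laplace) smoothing; the four-document training corpus is invented purely for illustration:

```python
import math
from collections import Counter, defaultdict

# Tiny invented corpus: (document, class) pairs.
train = [("win money now", "spam"), ("free money offer", "spam"),
         ("meeting schedule today", "ham"), ("project meeting notes", "ham")]

class_counts = Counter(c for _, c in train)
word_counts = defaultdict(Counter)            # class -> word -> count
for doc, c in train:
    word_counts[c].update(doc.split())
vocab = {w for doc, _ in train for w in doc.split()}

def log_posterior(doc, c):
    """log P(c) + Σᵢ log P(wᵢ|c), with add-one smoothing."""
    lp = math.log(class_counts[c] / len(train))
    total = sum(word_counts[c].values())
    for w in doc.split():
        lp += math.log((word_counts[c][w] + 1) / (total + len(vocab)))
    return lp

print(max(class_counts, key=lambda c: log_posterior("free money meeting", c)))
```

Working in log space avoids underflow when documents are long; dropping the shared denominator P(X₁,...,Xₙ) is what the ∝ in the formula above licenses.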
Wumpus World Revisited
P(Pit|breeze): Bayesian update over pit configurations
Squares: each square hides a pit independently (prior 0.2 in AIMA)
Update: with each observation (enumeration sketch below)
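A minimal enumeration sketch for the pit case: breezes observed at visited squares [1,2] and [2,1], frontier squares [1,3], [2,2], [3,1], pit prior 0.2, as in AIMA Fig. 12.7. Weighting the consistent pit configurations by their priors recovers AIMA’s answers: about 0.31 for the corner squares and 0.86 for [2,2]:

```python
from itertools import product

frontier = ["[1,3]", "[2,2]", "[3,1]"]    # unvisited squares that may hide pits
PRIOR = 0.2                               # P(pit) for any square
adjacent = {"[1,2]": ["[1,3]", "[2,2]"],  # visited square -> neighboring frontier squares
            "[2,1]": ["[2,2]", "[3,1]"]}
breeze = {"[1,2]": True, "[2,1]": True}   # breeze observed at both visited squares

def consistent(pits):
    """Breeze at v iff some neighboring frontier square has a pit."""
    return all(any(pits[s] for s in adjacent[v]) == b for v, b in breeze.items())

posterior = dict.fromkeys(frontier, 0.0)
total = 0.0
for bits in product([True, False], repeat=len(frontier)):
    pits = dict(zip(frontier, bits))
    if not consistent(pits):
        continue                          # configuration ruled out by observations
    w = 1.0
    for s in frontier:
        w *= PRIOR if pits[s] else 1 - PRIOR
    total += w
    for s in frontier:
        if pits[s]:
            posterior[s] += w

for s in frontier:
    print(s, round(posterior[s] / total, 2))  # [1,3] 0.31, [2,2] 0.86, [3,1] 0.31
```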
Summary
Probability: Uncertainty, conditioning, Bayes
Joint: Full distribution, marginalization
Naive Bayes: Independence given class
Next: Bayesian networks for compact representation
References
Russell & Norvig, AIMA 4e, Ch. 12
Chapter PDF: chapters/chapter-12.pdf
aima-python: probability4e.ipynb