Lecture 7: Logical Agents
Why Logic Matters¶
Explicit knowledge: tell the agent facts and rules
Inference: derive new facts without enumerating worlds
Soundness and completeness: we can prove correctness
Foundation for planning and diagnosis
Learning Objectives¶
Design knowledge-based agents
Understand the Wumpus world
Represent knowledge in propositional logic
Apply inference: resolution, forward/backward chaining
Build agents for planning and state estimation
Knowledge-Based Agents¶
Knowledge base (KB): Set of sentences
Tell: Add sentence to KB
Ask: Query KB (inference)
Agent: Tell(percepts), Ask(what action?)
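The Tell/Ask loop above can be sketched as follows. This is a minimal illustration: the `KBAgent` class and its toy `entails` procedure are invented for this example, not taken from aima-python.

```python
# Minimal sketch of a knowledge-based agent (illustrative names).
# The KB is a set of sentences; Tell adds a sentence, Ask queries
# the KB via whatever inference procedure `entails` implements.
class KBAgent:
    def __init__(self, entails):
        self.kb = set()          # knowledge base: a set of sentences
        self.entails = entails   # inference procedure: (kb, query) -> bool

    def tell(self, sentence):
        self.kb.add(sentence)

    def ask(self, query):
        return self.entails(self.kb, query)

# Toy inference procedure: a query is "entailed" if it is literally in the KB.
agent = KBAgent(lambda kb, q: q in kb)
agent.tell("Breeze_1_2")
print(agent.ask("Breeze_1_2"))  # True
```

A real agent would plug in one of the inference procedures from later in the lecture (resolution, chaining, SAT) in place of the membership test.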
The Wumpus World¶

Grid: 4×4 grid of rooms
Pits: Bottomless; breeze in adjacent squares
Wumpus: Stench in adjacent squares
Gold: Goal
Agent: Perceive stench, breeze, glitter, bump, scream
Logic¶
Syntax: Formal structure of sentences
Semantics: Meaning (truth in a world)
Entailment: α ⊨ β means α entails β (every model of α is model of β)
Inference: Derive new sentences from KB
Propositional Logic¶
Atomic sentences: True, False, or proposition symbols (P, Q, ...)
Connectives: ¬, ∧, ∨, ⇒, ⇔
Complex sentences: Built from atoms and connectives
Propositional Logic: Semantics¶
Model: Assignment of truth values to symbols
m ⊨ P: P is true in model m
KB ⊨ α: α is true in every model of KB
Inference¶
Sound: Only derives entailed sentences
Complete: Derives all entailed sentences
Model checking: Enumerate models (exponential)
Theorem proving: Apply inference rules
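Model checking can be illustrated with a truth-table entailment check; a minimal sketch, with sentences represented as Python functions from a model (a dict of truth values) to a bool:

```python
from itertools import product

def tt_entails(kb, query, symbols):
    """Model checking: KB |= query iff query is true in every model of KB.
    kb and query are functions model -> bool; symbols are proposition names.
    Enumerates all 2^n assignments -- exponential, as noted above."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not query(model):
            return False    # found a model of KB where query fails
    return True

# Example: {P, P => Q} |= Q
kb = lambda m: m["P"] and ((not m["P"]) or m["Q"])   # P AND (P => Q)
query = lambda m: m["Q"]
print(tt_entails(kb, query, ["P", "Q"]))  # True
```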
Resolution¶
Rule: (A ∨ B) ∧ (¬A ∨ C) ⊨ (B ∨ C)
Resolvent: (B ∨ C)
CNF: Conjunctive Normal Form — conjunction of clauses
Resolution: Add resolvents until empty clause (contradiction)
Resolution Algorithm¶
Convert KB ∧ ¬α to CNF
Repeatedly apply resolution
If empty clause derived → KB ⊨ α
Refutation-complete for propositional logic
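A minimal sketch of the resolution procedure, with clauses as frozensets of string literals and `~` marking negation (the helper names here are illustrative):

```python
from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of two clauses (clauses = frozensets of literals)."""
    return [(c1 - {lit}) | (c2 - {negate(lit)})
            for lit in c1 if negate(lit) in c2]

def pl_resolution(clauses):
    """Return True iff the clause set is unsatisfiable,
    i.e. the empty clause is derivable by resolution."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:            # empty clause: contradiction found
                    return True
                new.add(frozenset(r))
        if new <= clauses:           # no new clauses: satisfiable
            return False
        clauses |= new

# To decide KB |= Q with KB = {P, P => Q}: convert KB AND NOT Q to CNF,
# giving {P}, {~P, Q}, {~Q}, and test for unsatisfiability.
print(pl_resolution([{"P"}, {"~P", "Q"}, {"~Q"}]))  # True, so KB |= Q
```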
Horn Clauses¶
Horn clause: At most one positive literal
Definite clause: Exactly one positive (P₁ ∧ ... ∧ Pₙ ⇒ Q)
Fact: Single positive literal
Efficient inference for Horn KBs
Forward Chaining¶
Start with facts
Apply rules whose premises are satisfied
Add conclusions to KB
Repeat until no new facts
Data-driven; runs in time linear in KB size for Horn KBs
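The steps above can be sketched directly. This naive version re-scans the rules on each pass; the agenda-based version in AIMA achieves linear time:

```python
def fc_entails(facts, rules, query):
    """Forward chaining over definite clauses.
    rules: list of (premises, conclusion) pairs; facts: set of symbols."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            # Fire any rule whose premises are all known.
            if conclusion not in known and all(p in known for p in premises):
                known.add(conclusion)
                if conclusion == query:
                    return True
                changed = True
    return query in known

# P AND Q => R,  R => S
rules = [({"P", "Q"}, "R"), ({"R"}, "S")]
print(fc_entails({"P", "Q"}, rules, "S"))  # True
```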
Backward Chaining¶
Start with query
Find rules whose conclusion matches query
Recursively prove premises
Goal-driven
Logic programming (Prolog)
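A matching backward-chaining sketch over the same (premises, conclusion) rule representation, with a `seen` set to guard against cyclic rules:

```python
def bc_entails(facts, rules, query, seen=None):
    """Backward chaining: prove query by recursively proving the
    premises of some rule that concludes it."""
    seen = seen or set()
    if query in facts:
        return True
    if query in seen:          # already trying to prove this goal: avoid loop
        return False
    seen = seen | {query}
    for premises, conclusion in rules:
        if conclusion == query and all(
            bc_entails(facts, rules, p, seen) for p in premises
        ):
            return True
    return False

# P AND Q => R,  R => S
rules = [({"P", "Q"}, "R"), ({"R"}, "S")]
print(bc_entails({"P", "Q"}, rules, "S"))  # True
```

Unlike forward chaining, only rules relevant to the goal are ever touched, which is the goal-driven behavior Prolog builds on.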
SAT and Model Checking¶
SAT: Is formula satisfiable?
DPLL: Backtracking search for model
Local search: WalkSAT, GSAT
Random SAT: Phase transition in satisfiability near clause/symbol ratio ≈ 4.3
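A bare-bones DPLL sketch: backtracking over truth assignments with clause simplification, omitting the unit-propagation and pure-literal heuristics of the full algorithm. Literals are nonzero ints; negative means negated:

```python
def dpll(clauses, model=None):
    """Backtracking search for a satisfying model.
    clauses: lists of int literals; returns a model dict or None."""
    model = model or {}
    # Simplify the clause set under the partial model.
    simplified = []
    for clause in clauses:
        if any(model.get(abs(l)) == (l > 0) for l in clause):
            continue                        # clause already satisfied
        rest = [l for l in clause if abs(l) not in model]
        if not rest:
            return None                     # clause falsified: backtrack
        simplified.append(rest)
    if not simplified:
        return model                        # every clause satisfied
    var = abs(simplified[0][0])             # split on an unassigned variable
    for value in (True, False):
        result = dpll(simplified, {**model, var: value})
        if result is not None:
            return result
    return None

# (P OR Q) AND (NOT P OR Q), with 1 = P, 2 = Q
print(dpll([[1, 2], [-1, 2]]))  # {1: True, 2: True}
```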
Logical State Estimation¶
Belief state: Set of possible worlds
Successor-state axiom: “Fluent holds at t iff an action caused it at t−1, or it held at t−1 and no action made it false”
Frame problem: What stays the same?
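The successor-state axiom pattern, written out (the HaveArrow instance appears in AIMA Ch. 7):

```latex
% Schema: F holds at t+1 iff some action at t caused it,
% or it already held and nothing made it false.
F^{t+1} \iff \mathit{ActionCausesF}^{t} \lor
  \left(F^{t} \land \lnot\,\mathit{ActionCausesNotF}^{t}\right)

% Wumpus-world instance: the arrow persists unless it is shot.
\mathit{HaveArrow}^{t+1} \iff
  \left(\mathit{HaveArrow}^{t} \land \lnot\,\mathit{Shoot}^{t}\right)
```

Writing one such axiom per fluent answers the frame problem: persistence is stated once, inside the axiom, rather than by enumerating everything each action leaves unchanged.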
Planning by Propositional Inference¶
Planning as SAT: Encode planning problem as propositional formula
Variables: At(Location, t), Have(Arrow, t), etc.
SAT solvers: Very efficient
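A sketch of what such an encoding might look like for a one-step plan. The fluent names, the `var` helper, and the clause set are all hypothetical, chosen only to show the shape of a SATPlan-style encoding:

```python
# Map (fluent, time) pairs to integer proposition symbols for a SAT solver.
symbols = {}
def var(name, t):
    key = (name, t)
    if key not in symbols:
        symbols[key] = len(symbols) + 1
    return symbols[key]

# One-step plan in CNF (negative int = negated literal):
clauses = [
    [var("At_1_1", 0)],                                         # initial state
    [-var("Forward", 0), -var("At_1_1", 0), var("At_1_2", 1)],  # action axiom:
                                # Forward_0 AND At_1_1_0 => At_1_2_1
    [var("Forward", 0)],                                        # try this action
    [var("At_1_2", 1)],                                         # goal at t = 1
]
```

Any SAT solver's satisfying model then yields the plan: read off which action symbols are true at each time step.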
Summary¶
KB agents: Tell, Ask
Propositional logic: Syntax, semantics
Resolution: Refutation-complete inference
Forward/backward chaining: For Horn clauses
Planning: Encode as SAT
References¶
Russell & Norvig, AIMA 4e, Ch. 7
Chapter PDF: chapters/chapter-07.pdf
aima-python: logic.ipynb