
Lecture 19: Learning from Examples

AIMA Chapter 19 — 1 hour

Learning Objectives

  • Define supervised, unsupervised, and reinforcement learning

  • Build decision trees with ID3

  • Apply linear regression and logistic regression

  • Use k-NN, SVMs, ensemble methods

  • Understand model selection and regularization

Forms of Learning

  • Supervised: Learn from labeled (x, y) pairs

  • Unsupervised: Find structure in unlabeled x

  • Reinforcement: Learn from rewards given by the environment

Decision Trees

  • Nodes: Test on attribute

  • Leaves: Class label

  • ID3: Information gain

Information Gain

  • Entropy: H(S) = −Σᵢ pᵢ log₂ pᵢ, over the class proportions pᵢ in S

  • Gain(S, A) = H(S) − Σᵥ |Sᵥ|/|S| · H(Sᵥ), where Sᵥ is the subset of S with A = v

  • Choose: Attribute with maximum gain
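
The entropy and gain formulas above can be computed directly. A minimal sketch in plain Python (the attribute names and toy data are illustrative, not from AIMA):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """H(S) = -sum_i p_i log2 p_i over the class distribution of labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(examples, labels, attribute):
    """Gain(S, A) = H(S) - sum_v |S_v|/|S| * H(S_v), where S_v is the
    subset of examples whose attribute A takes value v."""
    n = len(labels)
    subsets = {}
    for x, y in zip(examples, labels):
        subsets.setdefault(x[attribute], []).append(y)
    remainder = sum(len(s) / n * entropy(s) for s in subsets.values())
    return entropy(labels) - remainder

# Toy data (hypothetical): an attribute that splits the labels cleanly
X = [{"outlook": "sunny"}, {"outlook": "sunny"},
     {"outlook": "rain"}, {"outlook": "rain"}]
y = ["no", "no", "yes", "yes"]
print(information_gain(X, y, "outlook"))  # 1.0: a perfect split of 2 classes
```

ID3 calls this at each node, splitting on the attribute with maximum gain and recursing on each subset.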

Overfitting

  • Bias-variance tradeoff

  • Pruning: Remove branches

  • Regularization: Penalize complexity

Linear Regression

  • Model: y = w·x + b

  • Loss: MSE

  • Optimization: Gradient descent
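
The three bullets above fit together in a few lines: a sketch of fitting y = w·x + b by gradient descent on MSE, for 1-D inputs (learning rate and toy data are illustrative choices):

```python
def fit_linear(xs, ys, lr=0.05, epochs=500):
    """Fit y = w*x + b by gradient descent on the MSE loss.
    With error e = (w*x + b) - y, the gradients are
    dL/dw = (2/n) * sum(e * x) and dL/db = (2/n) * sum(e)."""
    w, b, n = 0.0, 0.0, len(xs)
    for _ in range(epochs):
        errs = [w * x + b - y for x, y in zip(xs, ys)]
        w -= lr * (2 / n) * sum(e * x for e, x in zip(errs, xs))
        b -= lr * (2 / n) * sum(errs)
    return w, b

# Data generated from y = 2x + 1 (illustrative)
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
w, b = fit_linear(xs, ys)
print(round(w, 2), round(b, 2))  # close to 2.0 and 1.0
```

Because the MSE loss is convex in (w, b), gradient descent with a small enough learning rate converges to the unique minimizer.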

Logistic Regression

  • Classification: P(y=1|x) = σ(w·x + b)

  • Sigmoid: σ(z) = 1/(1+e⁻ᶻ)

  • Cross-entropy loss
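
The same gradient-descent loop handles classification: a sketch of logistic regression on separable 1-D data (learning rate and data are illustrative), using the fact that the cross-entropy gradient simplifies to (p − y) terms:

```python
from math import exp

def sigmoid(z):
    """sigma(z) = 1 / (1 + e^-z)."""
    return 1.0 / (1.0 + exp(-z))

def fit_logistic(xs, ys, lr=0.5, epochs=1000):
    """Fit P(y=1|x) = sigmoid(w*x + b) by gradient descent on the
    cross-entropy loss L = -sum(y*log(p) + (1-y)*log(1-p)).
    The gradient simplifies to sum((p - y) * x) for w, sum(p - y) for b."""
    w, b, n = 0.0, 0.0, len(xs)
    for _ in range(epochs):
        preds = [sigmoid(w * x + b) for x in xs]
        w -= lr / n * sum((p - y) * x for p, y, x in zip(preds, ys, xs))
        b -= lr / n * sum(p - y for p, y in zip(preds, ys))
    return w, b

# Separable 1-D data (illustrative): negatives below 0, positives above
xs = [-2.0, -1.0, 1.0, 2.0]
ys = [0, 0, 1, 1]
w, b = fit_logistic(xs, ys)
print(sigmoid(w * 2.0 + b), sigmoid(w * -2.0 + b))  # near 1 and near 0
```

Note the structural similarity to linear regression: only the prediction (squashed through the sigmoid) and the loss change.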

k-Nearest Neighbors

  • Nonparametric: Store training data

  • Predict: Majority vote of k nearest

  • Distance: Euclidean, etc.
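
Since k-NN has no training phase beyond storing the data, prediction is the whole algorithm. A minimal sketch with Euclidean distance and majority vote (the toy points are illustrative):

```python
from collections import Counter
from math import dist  # Euclidean distance between coordinate tuples

def knn_predict(train, query, k=3):
    """Classify query by majority vote among the k nearest training points.
    train: list of (point, label) pairs; points are coordinate tuples."""
    nearest = sorted(train, key=lambda pl: dist(pl[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy 2-D data (illustrative): two well-separated clusters
train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
print(knn_predict(train, (1, 1)))  # "a": all 3 nearest are class a
print(knn_predict(train, (5, 5)))  # "b"
```

Choosing k trades off noise sensitivity (small k) against over-smoothing (large k); odd k avoids ties in binary problems.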

Support Vector Machines

  • Max margin: Separate with maximum margin

  • Kernel trick: Implicit high-dimensional space

  • Soft margin: Allow misclassification
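
The soft-margin objective can be illustrated without a kernel: a sketch of a linear SVM fit by subgradient descent on the hinge loss (this is the primal formulation on 1-D toy data, not the kernelized dual that standard SVM libraries solve):

```python
def fit_linear_svm(xs, ys, lam=0.01, lr=0.01, epochs=2000):
    """Soft-margin linear SVM via subgradient descent on the objective
    lam/2 * w^2 + (1/n) * sum(max(0, 1 - y*(w*x + b))),
    with labels y in {-1, +1}. The hinge term is zero for points
    outside the margin, so only margin violators contribute."""
    w, b, n = 0.0, 0.0, len(xs)
    for _ in range(epochs):
        gw, gb = lam * w, 0.0          # gradient of the regularizer
        for x, y in zip(xs, ys):
            if y * (w * x + b) < 1:    # inside margin or misclassified
                gw -= y * x / n
                gb -= y / n
        w -= lr * gw
        b -= lr * gb
    return w, b

# 1-D separable data (illustrative), labels in {-1, +1}
xs = [-3.0, -2.0, 2.0, 3.0]
ys = [-1, -1, 1, 1]
w, b = fit_linear_svm(xs, ys)
print(all((w * x + b) * y > 0 for x, y in zip(xs, ys)))  # True: all separated
```

The regularization weight lam controls the softness of the margin: larger lam shrinks w, widening the margin at the cost of more hinge violations.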

Ensemble Learning

  • Bagging: Bootstrap, aggregate

  • Random forest: Random features + bagging

  • Boosting: Reweight misclassified examples

  • Stacking: Meta-learner
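
Bagging in particular is short enough to sketch end to end: bootstrap replicates of the training set, one base learner per replicate, majority vote at prediction time. The base learner here is a one-split decision stump on 1-D data (an illustrative choice; any classifier works):

```python
import random
from collections import Counter

def bootstrap(data):
    """Sample |data| examples uniformly with replacement."""
    return [random.choice(data) for _ in data]

def fit_stump(data):
    """Base learner (illustrative): a decision stump that picks the
    threshold and sign minimizing training error on (x, label) pairs,
    labels in {-1, +1}."""
    best = None
    for t in sorted({x for x, _ in data}):
        for sign in (1, -1):
            err = sum(1 for x, y in data
                      if (1 if sign * (x - t) >= 0 else -1) != y)
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda x: 1 if sign * (x - t) >= 0 else -1

def bagged_predict(stumps, x):
    """Aggregate: majority vote over the ensemble."""
    return Counter(s(x) for s in stumps).most_common(1)[0][0]

random.seed(0)  # reproducible bootstrap samples
data = [(-3, -1), (-2, -1), (-1, -1), (1, 1), (2, 1), (3, 1)]
stumps = [fit_stump(bootstrap(data)) for _ in range(11)]
print([bagged_predict(stumps, x) for x in (-2.5, 2.5)])  # expect [-1, 1]
```

A random forest follows the same recipe with trees as base learners, additionally restricting each split to a random subset of attributes to decorrelate the ensemble.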

Summary

  • Decision trees: ID3, gain

  • Linear models: Regression, logistic

  • k-NN, SVM: Nonparametric

  • Ensembles: Bagging, boosting

References

  • Russell & Norvig, AIMA 4e, Ch. 19

  • Chapter PDF: chapters/chapter-19.pdf

  • aima-python: learning.ipynb

Questions?

Next lecture: Learning Probabilistic Models (Chapter 20)