10: Probabilistic Reasoning
In real-world environments, information is often incomplete or uncertain. Probabilistic reasoning allows agents to make informed decisions by quantifying and managing uncertainty.

10.1 The Need for Probabilistic Reasoning
Logical systems require complete and accurate information, which is often unavailable in practical scenarios. Probabilistic reasoning provides a framework to:
Represent uncertainty.
Make predictions based on partial evidence.
Update beliefs when new information becomes available.
Example:
A medical diagnostic system:
Uncertainty: Symptoms may correspond to multiple diseases.
Solution: Use probabilities to estimate the likelihood of each disease based on observed symptoms.
10.2 Basic Probability Concepts
10.2.1 Random Variables
A random variable represents an uncertain quantity with a range of possible values.
Example: Let Weather be a random variable with values {Sunny, Rainy, Cloudy}.
10.2.2 Probability Distributions
Assign probabilities to the possible values of a random variable.
Example: P(Weather = Sunny) = 0.7, P(Weather = Rainy) = 0.2, P(Weather = Cloudy) = 0.1.
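As a minimal sketch of this idea in Python (using the Weather example above), a discrete distribution can be stored as a dictionary from values to probabilities:

```python
# Discrete distribution for the random variable Weather,
# using the probabilities from the example above.
weather_dist = {"Sunny": 0.7, "Rainy": 0.2, "Cloudy": 0.1}

# A valid distribution must sum to 1 over all possible values.
assert abs(sum(weather_dist.values()) - 1.0) < 1e-9

print(weather_dist["Sunny"])  # P(Weather = Sunny) -> 0.7
```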
10.2.3 Joint Probability Distribution
Represents the probabilities of combinations of values of multiple variables.
Example: P(Weather = Sunny ∧ Traffic = Heavy) = 0.3.
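One way to sketch a joint distribution in code is a table keyed by value pairs. Only the Sunny/Heavy entry below comes from the example; the remaining numbers are invented so the table sums to 1:

```python
# Joint distribution P(Weather, Traffic) keyed by (weather, traffic).
# Only P(Sunny, Heavy) = 0.3 is from the text; the rest are illustrative.
joint = {
    ("Sunny", "Heavy"): 0.30, ("Sunny", "Light"): 0.40,
    ("Rainy", "Heavy"): 0.15, ("Rainy", "Light"): 0.05,
    ("Cloudy", "Heavy"): 0.05, ("Cloudy", "Light"): 0.05,
}
assert abs(sum(joint.values()) - 1.0) < 1e-9

# Marginalizing out Traffic recovers P(Weather = Sunny).
p_sunny = sum(p for (w, _), p in joint.items() if w == "Sunny")
print(p_sunny)  # 0.7, consistent with the distribution in 10.2.2
```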
10.3 Conditional Probability and Bayes’ Rule
10.3.1 Conditional Probability
Represents the probability of one event given that another event has occurred.
Formula: P(A | B) = P(A ∧ B) / P(B).
Example: P(Traffic = Heavy | Weather = Rainy) represents the probability of heavy traffic given that it's raining.
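Using the illustrative joint table from the previous sketch, the formula translates directly into code:

```python
# Same illustrative joint table P(Weather, Traffic) as in 10.2.3.
joint = {
    ("Sunny", "Heavy"): 0.30, ("Sunny", "Light"): 0.40,
    ("Rainy", "Heavy"): 0.15, ("Rainy", "Light"): 0.05,
    ("Cloudy", "Heavy"): 0.05, ("Cloudy", "Light"): 0.05,
}

def p_traffic_given_weather(traffic, weather):
    """P(Traffic = traffic | Weather = weather) = P(A and B) / P(B)."""
    p_a_and_b = sum(p for (w, t), p in joint.items()
                    if t == traffic and w == weather)
    p_b = sum(p for (w, _), p in joint.items() if w == weather)
    return p_a_and_b / p_b

# With these numbers: 0.15 / (0.15 + 0.05) = 0.75
print(p_traffic_given_weather("Heavy", "Rainy"))
```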
10.3.2 Bayes’ Rule
Bayes’ Rule provides a way to update probabilities when new evidence is observed.
Formula: P(H | E) = P(E | H) · P(H) / P(E), where:
H: Hypothesis.
E: Evidence.
Example:
Hypothesis: A patient has the flu (H).
Evidence: The patient has a fever (E).
P(H | E): Updated probability of the flu given the fever.
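A small numeric sketch of this update; the prior and likelihoods below are invented for illustration:

```python
# Hypothetical numbers for the flu example (not from the text).
p_flu = 0.05                 # prior P(H)
p_fever_given_flu = 0.90     # likelihood P(E | H)
p_fever_given_no_flu = 0.10  # P(E | not H)

# Total probability of the evidence:
# P(E) = P(E|H) P(H) + P(E|not H) P(not H)
p_fever = (p_fever_given_flu * p_flu
           + p_fever_given_no_flu * (1 - p_flu))

# Bayes' Rule: P(H | E) = P(E | H) * P(H) / P(E)
p_flu_given_fever = p_fever_given_flu * p_flu / p_fever
print(round(p_flu_given_fever, 3))  # 0.321: fever raises P(flu) from 0.05
```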
10.4 Bayesian Networks
A Bayesian Network is a graphical model that represents probabilistic relationships among variables. It combines:
Graph Structure: Nodes represent variables; edges indicate dependencies.
Conditional Probability Tables (CPTs): Quantify the strength of dependencies.
10.4.1 Constructing Bayesian Networks
Identify variables.
Determine dependencies between variables.
Assign probabilities to CPTs.
Example: A Bayesian Network for diagnosing car issues:
Variables: Battery, Fuel, Engine.
Dependencies: The engine starting depends on the battery and fuel.
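A minimal way to sketch this network's parameters in code is with plain dictionaries; all probability values below are invented for illustration:

```python
# Priors for the root nodes Battery and Fuel (illustrative values).
p_battery_ok = 0.95  # P(Battery = OK)
p_fuel_ok = 0.90     # P(Fuel = OK)

# CPT for the child node Engine: P(Engine = starts | Battery, Fuel),
# keyed by (battery_ok, fuel_ok). These numbers are made up.
cpt_engine_starts = {
    (True, True): 0.98,   # both fine: engine almost always starts
    (True, False): 0.01,  # no fuel: essentially never starts
    (False, True): 0.02,  # dead battery: essentially never starts
    (False, False): 0.00,
}
```

The graph structure is implicit here: Engine's CPT is indexed by its two parents, while Battery and Fuel, having no parents, need only priors.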
10.4.2 Inference in Bayesian Networks
Inference involves computing probabilities for specific variables given evidence.
Example: Given that the engine doesn’t start, infer the likelihood of a dead battery.
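With the illustrative numbers from the previous sketch, inference by enumeration (summing the full joint over the unobserved variables) answers this query directly:

```python
from itertools import product

# Illustrative parameters from the 10.4.1 sketch.
p_battery_ok, p_fuel_ok = 0.95, 0.90
cpt_engine_starts = {(True, True): 0.98, (True, False): 0.01,
                     (False, True): 0.02, (False, False): 0.00}

def joint(battery_ok, fuel_ok, engine_starts):
    """P(B, F, E) = P(B) * P(F) * P(E | B, F), the network's factored form."""
    p = p_battery_ok if battery_ok else 1 - p_battery_ok
    p *= p_fuel_ok if fuel_ok else 1 - p_fuel_ok
    p_start = cpt_engine_starts[(battery_ok, fuel_ok)]
    return p * (p_start if engine_starts else 1 - p_start)

# P(Battery = dead | Engine = doesn't start), by enumeration.
p_no_start = sum(joint(b, f, False) for b, f in product([True, False], repeat=2))
p_dead_and_no_start = sum(joint(False, f, False) for f in (True, False))
print(round(p_dead_and_no_start / p_no_start, 3))  # about 0.306 here
```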
10.5 Probabilistic Reasoning in Time
10.5.1 Hidden Markov Models (HMMs)
HMMs model systems whose state is only partially observable and whose transitions between states follow probabilistic rules.
Components:
States: Possible conditions of the system.
Observations: Evidence related to the state.
Transition Probabilities: Likelihood of moving between states.
Emission Probabilities: Likelihood of each observation given the current state.
Example: Speech recognition uses HMMs to infer spoken words from acoustic signals.
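A compact sketch of the core HMM computation, the forward algorithm, on a toy two-state example (all probabilities invented):

```python
# Toy HMM: hidden weather state, observed umbrella use (illustrative numbers).
states = ("Rainy", "Sunny")
initial = {"Rainy": 0.5, "Sunny": 0.5}
transition = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
              "Sunny": {"Rainy": 0.3, "Sunny": 0.7}}
emission = {"Rainy": {"umbrella": 0.9, "no_umbrella": 0.1},
            "Sunny": {"umbrella": 0.2, "no_umbrella": 0.8}}

def forward(observations):
    """Return P(state_t | obs_1..t): filtering via the forward algorithm."""
    belief = {s: initial[s] * emission[s][observations[0]] for s in states}
    for obs in observations[1:]:
        belief = {s: emission[s][obs] * sum(belief[prev] * transition[prev][s]
                                            for prev in states)
                  for s in states}
    total = sum(belief.values())  # normalize into a proper distribution
    return {s: p / total for s, p in belief.items()}

print(forward(["umbrella", "umbrella", "no_umbrella"]))
```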
10.5.2 Dynamic Bayesian Networks
Extend Bayesian Networks to represent sequences over time, enabling agents to reason about temporal changes.
Example: Predicting weather patterns over multiple days.
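As a minimal temporal sketch (a plain Markov chain rather than a full DBN, with invented numbers), prediction rolls a day-to-day transition model forward:

```python
# Illustrative transition model P(Weather_tomorrow | Weather_today).
transition = {"Sunny": {"Sunny": 0.8, "Rainy": 0.2},
              "Rainy": {"Sunny": 0.4, "Rainy": 0.6}}

def predict(belief, days):
    """Push today's weather distribution forward the given number of days."""
    for _ in range(days):
        belief = {s: sum(belief[prev] * transition[prev][s] for prev in belief)
                  for s in transition}
    return belief

today = {"Sunny": 1.0, "Rainy": 0.0}  # observed: it is sunny today
print(predict(today, days=3))         # distribution three days out
```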
10.6 Applications of Probabilistic Reasoning
10.6.1 Medical Diagnosis
Estimate the probability of diseases based on observed symptoms and test results.
10.6.2 Robotics
Robots use probabilistic reasoning to localize themselves and navigate uncertain environments.
10.6.3 Natural Language Processing
Language models predict the likelihood of words or phrases in text, enabling tasks like machine translation and autocomplete.
10.7 Summary
In this chapter, we explored:
The need for probabilistic reasoning in uncertain environments.
Fundamental probability concepts, including conditional probability and Bayes’ Rule.
Bayesian Networks for representing and reasoning about dependencies.
Temporal reasoning with Hidden Markov Models and Dynamic Bayesian Networks.
Applications in medical diagnosis, robotics, and natural language processing.