

  1. CSE 473: Artificial Intelligence - Probability
  Dieter Fox, University of Washington
  [These slides were created by Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley. All CS188 materials are available at http://ai.berkeley.edu.]

  Topics from 30,000’
  § We’re done with Part I: Search and Planning!
  § Part II: Probabilistic Reasoning
    § Diagnosis
    § Speech recognition
    § Tracking objects
    § Robot mapping
    § Genetics
    § Error-correcting codes
    § … lots more!
  § Part III: Machine Learning

  Outline
  § Probability
    § Random Variables
    § Joint and Marginal Distributions
    § Conditional Distribution
    § Product Rule, Chain Rule, Bayes’ Rule
    § Inference
    § Independence
  § You’ll need all this stuff A LOT for the next few weeks, so make sure you go over it now!

  Uncertainty
  § General situation:
    § Observed variables (evidence): the agent knows certain things about the state of the world (e.g., sensor readings or symptoms)
    § Unobserved variables: the agent needs to reason about other aspects (e.g., where an object is or what disease is present)
    § Model: the agent knows something about how the known variables relate to the unknown variables
  § Probabilistic reasoning gives us a framework for managing our beliefs and knowledge

  What is….?
  § An example distribution over the weather variable W:
      W       P
      sun     0.6
      rain    0.1
      fog     0.3
      meteor  0.0
    Here W is a random variable, sun is a value, 0.6 is a probability, and the table as a whole is a distribution.

  Joint Distributions
  § A joint distribution over a set of random variables specifies a probability for each assignment (or outcome):
      T     W     P
      hot   sun   0.4
      hot   rain  0.1
      cold  sun   0.2
      cold  rain  0.3
  § Must obey: every entry is non-negative, and all entries sum to 1
  § Size of the joint distribution if there are n variables with domain sizes d? d^n entries
  § For all but the smallest distributions, impractical to write out!
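To make the joint-table idea concrete, here is a minimal Python sketch (not from the slides; the dict representation and names like P_TW are my own) that stores the P(T, W) table above, checks the two requirements, and shows why d^n entries quickly become impractical:

```python
# Joint distribution P(T, W) from the slides, stored as a dict keyed by outcome tuples.
P_TW = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

# A valid distribution: non-negative entries that sum to 1.
assert all(p >= 0 for p in P_TW.values())
assert abs(sum(P_TW.values()) - 1.0) < 1e-9

# With n variables of domain size d, the full table has d**n entries.
n, d = 30, 2
print(d ** n)  # 1073741824 rows for just 30 binary variables
```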

  2. Probabilistic Models
  § A probabilistic model is a joint distribution over a set of random variables
  § Probabilistic models:
    § (Random) variables with domains
    § Joint distributions: say whether assignments (called “outcomes”) are likely
    § Normalized: sum to 1.0
    § Ideally: only certain variables directly interact
  § Constraint satisfaction problems:
    § Variables with domains
    § Constraints: state whether assignments are possible
    § Ideally: only certain variables directly interact

    Distribution over T,W            Constraint over T,W
      T     W     P                    T     W     P
      hot   sun   0.4                  hot   sun   T
      hot   rain  0.1                  hot   rain  F
      cold  sun   0.2                  cold  sun   F
      cold  rain  0.3                  cold  rain  T

  Events
  § An event is a set E of outcomes
  § From a joint distribution, we can calculate the probability of any event by summing the probabilities of the outcomes in it
    § Probability that it’s hot AND sunny?
    § Probability that it’s hot?
    § Probability that it’s hot OR sunny?
  § Typically, the events we care about are partial assignments, like P(T=hot)

  Quiz: Events
  § Using the table below: P(+x, +y)?  P(+x)?  P(-y OR +x)?
      X    Y    P
      +x   +y   0.2
      +x   -y   0.3
      -x   +y   0.4
      -x   -y   0.1

  Marginal Distributions
  § Marginal distributions are sub-tables which eliminate variables
  § Marginalization (summing out): combine collapsed rows by adding
  § From the P(T, W) table above:
      T     P            W     P
      hot   0.5          sun   0.6
      cold  0.5          rain  0.4

  Quiz: Marginal Distributions
  § From the X, Y table above, compute the marginals P(X) and P(Y)

  Conditional Probabilities
  § A simple relation between joint and marginal probabilities: P(a | b) = P(a, b) / P(b)
  § In fact, this is taken as the definition of a conditional probability
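The event, marginalization, and conditioning computations above can be sketched in a few lines of Python (a hedged illustration; the dict layout and variable names are my own, the numbers come from the P(T, W) table):

```python
# The same P(T, W) table as a dict keyed by (t, w) outcome tuples.
P_TW = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

# Probability of an event = sum of the probabilities of the outcomes in it.
p_hot_and_sunny = P_TW[("hot", "sun")]                           # 0.4
p_hot = sum(p for (t, w), p in P_TW.items() if t == "hot")       # 0.5
p_hot_or_sunny = sum(p for (t, w), p in P_TW.items()
                     if t == "hot" or w == "sun")                # 0.7

# Marginalization (summing out T) combines rows that agree on W.
P_W = {}
for (t, w), p in P_TW.items():
    P_W[w] = P_W.get(w, 0.0) + p
print(P_W)   # sun 0.6, rain 0.4 (up to float rounding)

# Conditional probability from the definition P(a | b) = P(a, b) / P(b).
p_sun_given_hot = P_TW[("hot", "sun")] / p_hot                   # 0.8
```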

  3. Quiz: Conditional Probabilities
  § Using the X, Y table above: P(+x | +y)?  P(-x | +y)?  P(-y | +x)?

  Conditional Distributions
  § Conditional distributions are probability distributions over some variables given fixed values of others
  § From the joint distribution P(T, W):
      T     W     P
      hot   sun   0.4
      hot   rain  0.1
      cold  sun   0.2
      cold  rain  0.3
    we get the conditional distributions
      P(W | T=hot):   sun 0.8, rain 0.2
      P(W | T=cold):  sun 0.4, rain 0.6

  Conditional Distribs - The Slow Way…
  § Apply the definition directly: compute the joint probability of each assignment together with the evidence, and divide by the marginal probability of the evidence

  Normalization Trick
  § SELECT the joint probabilities matching the evidence, then NORMALIZE the selection (make it sum to one)
  § Example, for P(W | T=cold): select the rows with T=cold (cold sun 0.2, cold rain 0.3), then normalize to get sun 0.4, rain 0.6
  § Why does this work? The sum of the selection is P(evidence)! (P(T=cold), here)

  Quiz: Normalization Trick
  § P(X | Y=-y)? SELECT the joint probabilities matching the evidence from the X, Y table above, then NORMALIZE the selection (make it sum to one)
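A minimal sketch of the select-then-normalize trick, assuming the same dict representation of P(T, W) as before (the function name condition_on_cold is my own):

```python
P_TW = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

def condition_on_cold(joint):
    """P(W | T=cold): SELECT the rows with T=cold, then NORMALIZE them."""
    selected = {w: p for (t, w), p in joint.items() if t == "cold"}
    z = sum(selected.values())        # z = P(T=cold), the evidence probability
    return {w: p / z for w, p in selected.items()}

print(condition_on_cold(P_TW))        # {'sun': 0.4, 'rain': 0.6}
```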

  4. To Normalize
  § Dictionary: “To bring or restore to a normal condition”
  § Here: make all entries sum to ONE
  § Procedure:
    § Step 1: Compute Z = sum over all entries
    § Step 2: Divide every entry by Z
  § Example 1 (Z = 0.5):
      W     P                     W     P
      sun   0.2     Normalize     sun   0.4
      rain  0.3                   rain  0.6
  § Example 2 (Z = 50):
      T     W     P                     T     W     P
      hot   sun   20                    hot   sun   0.4
      hot   rain  5      Normalize      hot   rain  0.1
      cold  sun   10                    cold  sun   0.2
      cold  rain  15                    cold  rain  0.3

  Probabilistic Inference
  § Probabilistic inference = “compute a desired probability from other known probabilities (e.g. conditional from joint)”
  § We generally compute conditional probabilities
    § P(on time | no reported accidents) = 0.90
    § These represent the agent’s beliefs given the evidence
  § Probabilities change with new evidence:
    § P(on time | no accidents, 5 a.m.) = 0.95
    § P(on time | no accidents, 5 a.m., raining) = 0.80
    § Observing new evidence causes beliefs to be updated

  Inference by Enumeration
  § General case: the evidence variables, the query* variable, and the hidden variables together make up all the variables
  § We want the distribution of the query variable given the evidence
    § Step 1: Select the entries consistent with the evidence
    § Step 2: Sum out the hidden variables H to get the joint of the query and the evidence
    § Step 3: Normalize the entries (multiply by 1/Z)
  § * Works fine with multiple query variables, too
  § Examples, using the season/temperature/weather table below: P(W)?  P(W | winter)?  P(W | winter, hot)?
      S       T     W     P
      summer  hot   sun   0.30
      summer  hot   rain  0.05
      summer  cold  sun   0.10
      summer  cold  rain  0.05
      winter  hot   sun   0.10
      winter  hot   rain  0.05
      winter  cold  sun   0.15
      winter  cold  rain  0.20
  § Computational problems?
    § Worst-case time complexity O(d^n)
    § Space complexity O(d^n) to store the joint distribution
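As a hedged sketch of the three steps above (the function enumerate_query and the dict layout are my own; the table is the one from the slide), inference by enumeration can be written directly:

```python
P_STW = {
    ("summer", "hot", "sun"): 0.30, ("summer", "hot", "rain"): 0.05,
    ("summer", "cold", "sun"): 0.10, ("summer", "cold", "rain"): 0.05,
    ("winter", "hot", "sun"): 0.10, ("winter", "hot", "rain"): 0.05,
    ("winter", "cold", "sun"): 0.15, ("winter", "cold", "rain"): 0.20,
}
VARS = ("S", "T", "W")   # column order in each outcome tuple

def enumerate_query(joint, query_var, evidence):
    """P(query_var | evidence): select, sum out hidden variables, normalize."""
    qi = VARS.index(query_var)
    answer = {}
    for outcome, p in joint.items():
        # Step 1: keep only the entries consistent with the evidence.
        if all(outcome[VARS.index(v)] == val for v, val in evidence.items()):
            # Step 2: adding entries that agree on the query value sums out the hidden variables.
            answer[outcome[qi]] = answer.get(outcome[qi], 0.0) + p
    z = sum(answer.values())          # Step 3: normalize by Z = P(evidence)
    return {val: p / z for val, p in answer.items()}

print(enumerate_query(P_STW, "W", {}))                           # sun 0.65, rain 0.35
print(enumerate_query(P_STW, "W", {"S": "winter"}))              # sun 0.5, rain 0.5
print(enumerate_query(P_STW, "W", {"S": "winter", "T": "hot"}))  # sun ~0.67, rain ~0.33
```

This brute-force loop touches every row of the joint, which is exactly the O(d^n) time and space cost noted above.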

  5. The Product Rule
  § Sometimes we have conditional distributions but want the joint: P(x, y) = P(x | y) P(y)
  § Example: from P(W) and P(D | W), build the joint P(D, W)
      P(W)               P(D | W)                P(D, W)
      W     P            D     W     P           D     W     P
      sun   0.8          wet   sun   0.1         wet   sun   0.08
      rain  0.2          dry   sun   0.9         dry   sun   0.72
                         wet   rain  0.7         wet   rain  0.14
                         dry   rain  0.3         dry   rain  0.06

  The Chain Rule
  § More generally, we can always write any joint distribution as an incremental product of conditional distributions, e.g. P(x1, x2, x3) = P(x1) P(x2 | x1) P(x3 | x1, x2)

  Independence
  § Two variables are independent in a joint distribution if the joint factors into a product of the two marginals: P(X, Y) = P(X) P(Y), i.e. for all x, y: P(x, y) = P(x) P(y)
  § Usually variables aren’t independent!
  § Can use independence as a modeling assumption
    § Independence can be a simplifying assumption
    § Empirical joint distributions: at best “close” to independent
    § What could we assume for {Weather, Traffic, Cavity}?
  § Independence is like something from CSPs: what?

  Example: Independence?
  § Compare the actual joint P(T, W) with the product of its marginals, P2(T, W) = P(T) P(W):
      P(T, W)               P(T)            P(W)            P2(T, W)
      T     W     P         T     P         W     P         T     W     P
      hot   sun   0.4       hot   0.5       sun   0.6       hot   sun   0.3
      hot   rain  0.1       cold  0.5       rain  0.4       hot   rain  0.2
      cold  sun   0.2                                       cold  sun   0.3
      cold  rain  0.3                                       cold  rain  0.2

  Example: Independence
  § N fair, independent coin flips: each flip has P(H) = 0.5 and P(T) = 0.5, and the joint over all N flips is the product of the individual distributions
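A brief Python sketch of both ideas above (the dict representation is my own; the numbers come from the slides): the product rule builds P(D, W) from P(W) and P(D | W), and a direct check shows T and W are not independent in the earlier joint.

```python
# Product rule example: marginal P(W) and conditional P(D | W) from the slide.
P_W = {"sun": 0.8, "rain": 0.2}
P_D_given_W = {("wet", "sun"): 0.1, ("dry", "sun"): 0.9,
               ("wet", "rain"): 0.7, ("dry", "rain"): 0.3}

# P(d, w) = P(d | w) * P(w)
P_DW = {(d, w): P_D_given_W[(d, w)] * P_W[w] for (d, w) in P_D_given_W}
print(P_DW)   # wet/sun 0.08, dry/sun 0.72, wet/rain 0.14, dry/rain 0.06

# Independence check: does P(T, W) equal P(T) * P(W) for every entry?
P_TW = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
        ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}
P_T = {"hot": 0.5, "cold": 0.5}
P_W2 = {"sun": 0.6, "rain": 0.4}
independent = all(abs(P_TW[(t, w)] - P_T[t] * P_W2[w]) < 1e-9 for (t, w) in P_TW)
print(independent)   # False: 0.4 != 0.5 * 0.6, so T and W are not independent here
```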

  6. Conditional Independence
  § P(Toothache, Cavity, Catch)
  § If I have a cavity, the probability that the probe catches in it doesn’t depend on whether I have a toothache:
    § P(+catch | +toothache, +cavity) = P(+catch | +cavity)
  § The same independence holds if I don’t have a cavity:
    § P(+catch | +toothache, -cavity) = P(+catch | -cavity)
  § Catch is conditionally independent of Toothache given Cavity:
    § P(Catch | Toothache, Cavity) = P(Catch | Cavity)
  § Equivalent statements:
    § P(Toothache | Catch, Cavity) = P(Toothache | Cavity)
    § P(Toothache, Catch | Cavity) = P(Toothache | Cavity) P(Catch | Cavity)
    § One can be derived from the other easily

  Conditional Independence
  § Unconditional (absolute) independence is very rare (why?)
  § Conditional independence is our most basic and robust form of knowledge about uncertain environments.
  § X is conditionally independent of Y given Z if and only if:
      P(x, y | z) = P(x | z) P(y | z)   for all x, y, z
    or, equivalently, if and only if:
      P(x | y, z) = P(x | z)   for all x, y, z

  Conditional Independence
  § What about this domain: Traffic, Umbrella, Raining?

  Conditional Independence
  § What about this domain: Fire, Smoke, Alarm?

  Bayes Rule

  Pacman – Sonar (P4)
  [Demo: Pacman – Sonar – No Beliefs (L14D1)]
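The Bayes Rule slide above carries only its title in this transcript, so as a hedged illustration here is the standard rule, P(w | d) = P(d | w) P(w) / P(d), applied to the wet/dry numbers from the product-rule example (the function name bayes and the dict layout are my own):

```python
P_W = {"sun": 0.8, "rain": 0.2}
P_D_given_W = {("wet", "sun"): 0.1, ("dry", "sun"): 0.9,
               ("wet", "rain"): 0.7, ("dry", "rain"): 0.3}

def bayes(d, w):
    """P(W=w | D=d): Bayes' rule, with P(D=d) computed by total probability."""
    p_d = sum(P_D_given_W[(d, w2)] * P_W[w2] for w2 in P_W)
    return P_D_given_W[(d, w)] * P_W[w] / p_d

print(bayes("wet", "rain"))   # 0.14 / 0.22, roughly 0.64
```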
