CS 188: Artificial Intelligence
Probability


Instructors: Dan Klein and Pieter Abbeel, University of California, Berkeley
[These slides were created by Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley. All CS188 materials are available at http://ai.berkeley.edu.]

CS188 Outline
- We're done with Part I: Search and Planning!
- Part II: Probabilistic Reasoning
  - Diagnosis
  - Speech recognition
  - Tracking objects
  - Robot mapping
  - Genetics
  - Error correcting codes
  - ... lots more!
- Part III: Machine Learning

Today
- Probability
  - Random Variables
  - Joint and Marginal Distributions
  - Conditional Distribution
  - Product Rule, Chain Rule, Bayes' Rule
  - Inference
  - Independence
- You'll need all this stuff A LOT for the next few weeks, so make sure you go over it now!

Inference in Ghostbusters
- A ghost is in the grid somewhere
- Sensor readings tell how close a square is to the ghost
  - On the ghost: red
  - 1 or 2 away: orange
  - 3 or 4 away: yellow
  - 5+ away: green
- Sensors are noisy, but we know P(Color | Distance), e.g. for Distance = 3:
    P(red | 3)   P(orange | 3)   P(yellow | 3)   P(green | 3)
    0.05         0.15            0.5             0.3
  (a small code sketch of sampling from this sensor model appears at the end of this page)
[Demo: Ghostbuster - no probability (L12D1)]

Uncertainty
- General situation:
  - Observed variables (evidence): Agent knows certain things about the state of the world (e.g., sensor readings or symptoms)
  - Unobserved variables: Agent needs to reason about other aspects (e.g., where an object is or what disease is present)
  - Model: Agent knows something about how the known variables relate to the unknown variables
- Probabilistic reasoning gives us a framework for managing our beliefs and knowledge

Random Variables
- A random variable is some aspect of the world about which we (may) have uncertainty
  - R = Is it raining?
  - T = Is it hot or cold?
  - D = How long will it take to drive to work?
  - L = Where is the ghost?
- We denote random variables with capital letters
- Like variables in a CSP, random variables have domains
  - R in {true, false} (often write as {+r, -r})
  - T in {hot, cold}
  - D in [0, ∞)
  - L in possible locations, maybe {(0,0), (0,1), ...}
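The sensor model above is just a conditional distribution table. As a minimal sketch (not part of the slides), the snippet below stores the Distance = 3 row as a Python dictionary and draws one noisy color reading from it; the function name sample_reading and the use of Python's random module are assumptions made here for illustration.

    import random

    # P(Color | Distance = 3), taken from the table above.
    sensor_model_d3 = {"red": 0.05, "orange": 0.15, "yellow": 0.5, "green": 0.3}

    def sample_reading(color_dist):
        """Draw one noisy color reading from a distribution over colors."""
        r = random.random()              # uniform in [0, 1)
        cumulative = 0.0
        for color, p in color_dist.items():
            cumulative += p
            if r < cumulative:
                return color
        return color                     # guard against floating-point round-off

    print(sample_reading(sensor_model_d3))   # 'yellow' about half the time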

Probability Distributions
- Unobserved random variables have distributions
    Temperature:
      T     P
      hot   0.5
      cold  0.5
    Weather:
      W       P
      sun     0.6
      rain    0.1
      fog     0.3
      meteor  0.0
- A distribution is a TABLE of probabilities of values
- A probability (lower case value) is a single number
- Must have: P(X = x) ≥ 0 for every value x, and Σ_x P(X = x) = 1
- Associate a probability with each value
- Shorthand notation: write P(hot) for P(T = hot), P(rain) for P(W = rain), etc.; OK if all domain entries are unique

Joint Distributions
- A joint distribution over a set of random variables specifies a real number for each assignment (or outcome)
- Assignments are called outcomes
    Distribution over T, W:
      T     W     P
      hot   sun   0.4
      hot   rain  0.1
      cold  sun   0.2
      cold  rain  0.3
- Must obey: every entry is ≥ 0, and the entries sum to 1.0 (normalized)
- Size of distribution if n variables with domain sizes d?
- For all but the smallest distributions, impractical to write out!

Probabilistic Models
- A probabilistic model is a joint distribution over a set of random variables
- Probabilistic models:
  - (Random) variables with domains
  - Joint distributions: say whether assignments (outcomes) are likely
  - Normalized: sum to 1.0
  - Ideally: only certain variables directly interact
- Constraint satisfaction problems:
  - Variables with domains
  - Constraints: state whether assignments are possible
  - Ideally: only certain variables directly interact
    Constraint over T, W:
      T     W     P
      hot   sun   T
      hot   rain  F
      cold  sun   F
      cold  rain  T

Events
- An event is a set E of outcomes
- From a joint distribution, we can calculate the probability of any event (see the code sketch below)
  - Probability that it's hot AND sunny?
  - Probability that it's hot?
  - Probability that it's hot OR sunny?
- Typically, the events we care about are partial assignments, like P(T = hot)

Quiz: Events
    X   Y   P
    +x  +y  0.2
    +x  -y  0.3
    -x  +y  0.4
    -x  -y  0.1
- P(+x, +y)?
- P(+x)?
- P(-y OR +x)?
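As a small sketch of the "events are sets of outcomes" idea (not from the slides), the snippet below stores the joint over T, W as a dictionary and answers the three hot/sunny questions by summing matching entries; the helper name event_prob is an assumption for illustration.

    # Joint distribution over (T, W) from the table above: outcome -> probability.
    joint_TW = {
        ("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
        ("cold", "sun"): 0.2, ("cold", "rain"): 0.3,
    }

    def event_prob(joint, event):
        """P(E): sum the probabilities of the outcomes that belong to the event."""
        return sum(p for outcome, p in joint.items() if event(outcome))

    print(event_prob(joint_TW, lambda o: o == ("hot", "sun")))              # 0.4 (hot AND sunny)
    print(event_prob(joint_TW, lambda o: o[0] == "hot"))                    # 0.5 (hot)
    print(event_prob(joint_TW, lambda o: o[0] == "hot" or o[1] == "sun"))   # 0.7 (hot OR sunny)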

Marginal Distributions
- Marginal distributions are sub-tables which eliminate variables
- Marginalization (summing out): combine collapsed rows by adding, e.g. P(T = t) = Σ_w P(T = t, W = w)  (see the code sketch at the end of this page)
    Joint over T, W:
      T     W     P
      hot   sun   0.4
      hot   rain  0.1
      cold  sun   0.2
      cold  rain  0.3
    Marginal over T:
      T     P
      hot   0.5
      cold  0.5
    Marginal over W:
      W     P
      sun   0.6
      rain  0.4

Quiz: Marginal Distributions
  (same X, Y joint as in Quiz: Events)
- P(X)?  (+x: ?, -x: ?)
- P(Y)?  (+y: ?, -y: ?)

Conditional Probabilities
- A simple relation between joint and conditional probabilities; in fact, this is taken as the definition of a conditional probability:
    P(a | b) = P(a, b) / P(b)
  (applied below to the joint over T, W above)

Quiz: Conditional Probabilities
  (same X, Y joint as in Quiz: Events)
- P(+x | +y)?
- P(-x | +y)?
- P(-y | +x)?

Conditional Distributions
- Conditional distributions are probability distributions over some variables given fixed values of others
    Joint distribution: the T, W table above
    Conditional distributions:
      P(W | T = hot):  sun 0.8, rain 0.2
      P(W | T = cold): sun 0.4, rain 0.6

Normalization Trick
- From the joint over T, W, we can read off the conditional P(W | T = cold): sun 0.4, rain 0.6 (procedure on the next slides)
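A minimal sketch (not from the slides; the helper names marginal and condition are assumptions) of summing out a variable and of conditioning via the normalization trick, using the T, W joint above:

    joint_TW = {
        ("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
        ("cold", "sun"): 0.2, ("cold", "rain"): 0.3,
    }

    def marginal(joint, keep):
        """Sum out everything except the variable at position `keep` (0 = T, 1 = W)."""
        out = {}
        for outcome, p in joint.items():
            out[outcome[keep]] = out.get(outcome[keep], 0.0) + p
        return out

    def condition(joint, var, value):
        """Normalization trick: SELECT entries matching the evidence, then NORMALIZE."""
        selected = {o: p for o, p in joint.items() if o[var] == value}
        z = sum(selected.values())            # sum of the selection is P(evidence)
        return {o: p / z for o, p in selected.items()}

    print(marginal(joint_TW, 0))            # P(T): hot 0.5, cold 0.5
    print(marginal(joint_TW, 1))            # P(W): sun ~0.6, rain ~0.4 (float round-off)
    print(condition(joint_TW, 0, "cold"))   # P(W | T=cold): ('cold','sun') 0.4, ('cold','rain') 0.6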

Normalization Trick
- SELECT the joint probabilities matching the evidence
- NORMALIZE the selection (make it sum to one)
    Joint over T, W:
      T     W     P
      hot   sun   0.4
      hot   rain  0.1
      cold  sun   0.2
      cold  rain  0.3
    SELECT T = cold:
      T     W     P
      cold  sun   0.2
      cold  rain  0.3
    NORMALIZE (make it sum to one):
      W     P
      sun   0.4
      rain  0.6
- Why does this work? Sum of selection is P(evidence)! (P(T = cold), here)

Quiz: Normalization Trick
- P(X | Y = -y)?  (same X, Y joint as in Quiz: Events: SELECT the joint probabilities matching the evidence, then NORMALIZE the selection)

To Normalize
- (Dictionary) To bring or restore to a normal condition; here: all entries sum to ONE
- Procedure:
  - Step 1: Compute Z = sum over all entries
  - Step 2: Divide every entry by Z
- Example 1: W P: sun 0.2, rain 0.3; Z = 0.5; normalized: sun 0.4, rain 0.6
- Example 2: T W P: hot sun 20, hot rain 5, cold sun 10, cold rain 15; Z = 50; normalized: hot sun 0.4, hot rain 0.1, cold sun 0.2, cold rain 0.3

Probabilistic Inference
- Probabilistic inference: compute a desired probability from other known probabilities (e.g. conditional from joint)
- We generally compute conditional probabilities
  - P(on time | no reported accidents) = 0.90
  - These represent the agent's beliefs given the evidence
- Probabilities change with new evidence:
  - P(on time | no accidents, 5 a.m.) = 0.95
  - P(on time | no accidents, 5 a.m., raining) = 0.80
  - Observing new evidence causes beliefs to be updated

Inference by Enumeration
- General case:
  - Evidence variables: E_1 ... E_k = e_1 ... e_k
  - Query* variable: Q
  - Hidden variables: H_1 ... H_r
  (together, these are all the variables X_1 ... X_n)
- We want: P(Q | e_1 ... e_k)
- Step 1: Select the entries consistent with the evidence
- Step 2: Sum out H to get the joint of the Query and the evidence
- Step 3: Normalize
- * Works fine with multiple query variables, too (see the sketch below)
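A minimal sketch of the three-step procedure (the function name enumerate_query and the variable-name tuple are bookkeeping assumed here, not from the slides), run on the T, W joint from the earlier slides:

    vars_TW = ("T", "W")
    joint_TW = {
        ("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
        ("cold", "sun"): 0.2, ("cold", "rain"): 0.3,
    }

    def enumerate_query(query, evidence, variables, joint):
        """P(query | evidence) by enumeration: select, sum out hidden, normalize."""
        qi = variables.index(query)
        unnormalized = {}
        for outcome, p in joint.items():
            # Step 1: keep only entries consistent with the evidence.
            if all(outcome[variables.index(v)] == val for v, val in evidence.items()):
                # Step 2: accumulating by query value sums out the hidden variables.
                q_val = outcome[qi]
                unnormalized[q_val] = unnormalized.get(q_val, 0.0) + p
        # Step 3: normalize.
        z = sum(unnormalized.values())
        return {v: p / z for v, p in unnormalized.items()}

    print(enumerate_query("W", {}, vars_TW, joint_TW))             # P(W): sun ~0.6, rain ~0.4
    print(enumerate_query("W", {"T": "cold"}, vars_TW, joint_TW))  # P(W | cold): sun 0.4, rain 0.6

The same call works unchanged on larger joints, such as the S, T, W table on the next page, by passing evidence like {"S": "winter"}.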

Inference by Enumeration: Example
    S       T     W     P
    summer  hot   sun   0.30
    summer  hot   rain  0.05
    summer  cold  sun   0.10
    summer  cold  rain  0.05
    winter  hot   sun   0.10
    winter  hot   rain  0.05
    winter  cold  sun   0.15
    winter  cold  rain  0.20
- P(W)?
- P(W | winter)?
- P(W | winter, hot)?

Inference by Enumeration
- Obvious problems:
  - Worst-case time complexity O(d^n)
  - Space complexity O(d^n) to store the joint distribution

The Product Rule
- Sometimes we have conditional distributions but want the joint:
    P(x, y) = P(x | y) P(y)
- Example (a short code check appears at the end of this page):
    P(W):
      W     P
      sun   0.8
      rain  0.2
    P(D | W):
      D    W     P
      wet  sun   0.1
      dry  sun   0.9
      wet  rain  0.7
      dry  rain  0.3
    P(D, W) = P(D | W) P(W):
      D    W     P
      wet  sun   0.08
      dry  sun   0.72
      wet  rain  0.14
      dry  rain  0.06

The Chain Rule
- More generally, we can always write any joint distribution as an incremental product of conditional distributions:
    P(x_1, x_2, x_3) = P(x_1) P(x_2 | x_1) P(x_3 | x_1, x_2)
- Why is this always true?

Bayes' Rule
    P(a | b) = P(b | a) P(a) / P(b)
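As a quick check of the product rule with the numbers above (a sketch, not part of the slides; the dictionary names are made up here), multiplying each P(D = d | W = w) entry by P(W = w) reproduces the joint table:

    # Product rule: P(d, w) = P(d | w) * P(w), using the tables above.
    P_W = {"sun": 0.8, "rain": 0.2}
    P_D_given_W = {                      # keyed as (d, w)
        ("wet", "sun"): 0.1, ("dry", "sun"): 0.9,
        ("wet", "rain"): 0.7, ("dry", "rain"): 0.3,
    }

    P_DW = {(d, w): P_D_given_W[(d, w)] * P_W[w] for (d, w) in P_D_given_W}
    print(P_DW)   # wet/sun 0.08, dry/sun 0.72, wet/rain 0.14, dry/rain 0.06 (up to round-off)

The chain rule is the same idea applied repeatedly: multiply P(x_1) by P(x_2 | x_1), then by P(x_3 | x_1, x_2), and so on.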
