
SLIDE 1

Planning and Acting

Chapter 11, Section 3

Artificial Intelligence, spring 2013, Peter Ljunglöf; based on AIMA Slides © Stuart Russell and Peter Norvig, 2004

SLIDE 2

Outline

♦ The real world
♦ Sensorless/contingent planning (conditional planning)
♦ Online replanning (monitoring and replanning)


SLIDE 3

The real world

[Figure: partial-order plan for the flat-tire problem. START: On(Tire1), Flat(Tire1), ¬Flat(Spare), Intact(Spare), Off(Spare). Operators: Remove(x) (pre: On(x); effects: Off(x), ClearHub, ¬On(x)); Puton(x) (pre: Off(x), ClearHub; effects: On(x), ¬ClearHub); Inflate(x) (pre: Intact(x), Flat(x); effect: ¬Flat(x)). FINISH: On(x), ¬Flat(x).]

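The operator graph above can be written down as STRIPS-style records. The encoding below is an illustrative sketch (the slide only shows the graph); under the closed-world assumption, the effect ¬Flat(x) is modeled by deleting Flat(x) rather than asserting a negative literal.

```python
from typing import NamedTuple

class Op(NamedTuple):
    """STRIPS-style operator: preconditions, add list, delete list."""
    name: str
    pre: frozenset
    add: frozenset
    delete: frozenset

def remove(x):
    # pre: On(x); effects: Off(x), ClearHub, not On(x)
    return Op(f"Remove({x})", frozenset({f"On({x})"}),
              frozenset({f"Off({x})", "ClearHub"}), frozenset({f"On({x})"}))

def puton(x):
    # pre: Off(x), ClearHub; effects: On(x), not ClearHub
    return Op(f"Puton({x})", frozenset({f"Off({x})", "ClearHub"}),
              frozenset({f"On({x})"}), frozenset({f"Off({x})", "ClearHub"}))

def inflate(x):
    # pre: Intact(x), Flat(x); effect: not Flat(x) (closed world: delete it)
    return Op(f"Inflate({x})", frozenset({f"Intact({x})", f"Flat({x})"}),
              frozenset(), frozenset({f"Flat({x})"}))

def apply_op(state, op):
    assert op.pre <= state, f"precondition of {op.name} not met"
    return (state - op.delete) | op.add

# swap the flat tire for the spare
state = {"On(Tire1)", "Flat(Tire1)", "Intact(Spare)", "Off(Spare)"}
for op in (remove("Tire1"), puton("Spare")):
    state = apply_op(state, op)
print(sorted(state))  # -> ['Flat(Tire1)', 'Intact(Spare)', 'Off(Tire1)', 'On(Spare)']
```

This works only as long as the world cooperates; the next slides show what happens when it does not.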

SLIDE 4

Things go wrong

Incomplete information
♦ Unknown preconditions, e.g., Intact(Spare)?
♦ Disjunctive effects, e.g., Inflate(x) causes Inflated(x) ∨ SlowHiss(x) ∨ Burst(x) ∨ BrokenPump ∨ . . .
Incorrect information
♦ Current state incorrect, e.g., spare NOT intact
♦ Missing/incorrect postconditions in operators
Qualification problem: can never finish listing all the required preconditions and possible conditional outcomes of actions


SLIDE 5

Solutions

Conformant or sensorless planning
♦ Devise a plan that works regardless of state or outcome
♦ Such plans may not exist
Conditional planning
♦ Plan to obtain information (observation actions)
♦ Subplan for each contingency, e.g., [Check(Tire1), if Intact(Tire1) then Inflate(Tire1) else CallAAA]
♦ Expensive because it plans for many unlikely cases
Monitoring/replanning
♦ Assume normal states and outcomes
♦ Check progress during execution, replan if necessary
♦ Unanticipated outcomes may lead to failure (e.g., no AAA card)
(Really need a combination: plan for likely/serious eventualities, deal with others when they arise, as they must eventually)


SLIDE 6

Conformant planning

Search in space of belief states (sets of possible actual states)

[Figure: belief-state search space for the sensorless vacuum world; the actions L(eft), R(ight), and S(uck) map whole belief states to belief states.]
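A conformant planner searches this belief-state space directly. The sketch below is a minimal Python rendering, assuming a deterministic sensorless vacuum world with squares A and B; it finds one plan that works from all eight possible initial states.

```python
from collections import deque

# Sensorless vacuum world: a state is (loc, dirtA, dirtB), loc in {"A","B"}.
def result(state, action):
    loc, d_a, d_b = state
    if action == "Left":
        return ("A", d_a, d_b)
    if action == "Right":
        return ("B", d_a, d_b)
    # Suck cleans the square the robot is currently on
    return (loc, False if loc == "A" else d_a, False if loc == "B" else d_b)

def predict(belief, action):
    """Belief-state transition: apply the action to every possible state."""
    return frozenset(result(s, action) for s in belief)

def conformant_plan(initial_belief, goal_test, actions=("Left", "Right", "Suck")):
    """BFS in belief-state space for a plan that works in ALL possible states."""
    frontier = deque([(initial_belief, [])])
    explored = {initial_belief}
    while frontier:
        belief, plan = frontier.popleft()
        if all(goal_test(s) for s in belief):
            return plan
        for a in actions:
            b2 = predict(belief, a)
            if b2 not in explored:
                explored.add(b2)
                frontier.append((b2, plan + [a]))
    return None

# The agent knows nothing: any location, any dirt configuration (8 states).
init = frozenset((loc, d_a, d_b) for loc in "AB"
                 for d_a in (True, False) for d_b in (True, False))
clean = lambda s: not s[1] and not s[2]
plan = conformant_plan(init, clean)
print(plan)  # a 4-step plan, e.g. move to one end, Suck, cross, Suck
```

The first move localizes the robot (the belief state collapses to one location), after which each Suck has a guaranteed effect; no shorter plan can cover all eight states.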

SLIDE 7

Conditional planning

If the world is nondeterministic or partially observable, then percepts usually provide information, i.e., they split up the belief state

[Figure: in the search tree, an ACTION node leads to a node whose PERCEPT branches split the belief state.]


SLIDE 8

Conditional planning contd.

Conditional plans check (any consequence of KB +) percept
[. . . , if C then PlanA else PlanB, . . .]
Execution: check C against current KB, execute “then” or “else”
Need some plan for every possible percept
(Cf. game playing: some response for every opponent move)
(Cf. backward chaining: some rule such that every premise is satisfied)
AND–OR tree search (very similar to the backward chaining algorithm)

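The AND–OR search itself is only a few lines. The sketch below follows AIMA's AND-OR-GRAPH-SEARCH (OR over the agent's actions, AND over the environment's possible outcomes); the "erratic vacuum" world used to exercise it is an illustrative assumption standing in for the Murphy worlds on the next slides, not the slides' exact model.

```python
class ErraticVacuum:
    """Nondeterministic vacuum world: Suck on a dirty square cleans it and may
    also clean the other; Suck on a clean square may deposit dirt."""

    def goal_test(self, s):
        _, dirt_a, dirt_b = s
        return not dirt_a and not dirt_b

    def actions(self, s):
        return ["Suck", "Right", "Left"]

    def results(self, s, a):
        loc, d_a, d_b = s
        if a == "Left":
            return {("A", d_a, d_b)}
        if a == "Right":
            return {("B", d_a, d_b)}
        if loc == "A":
            return ({("A", False, d_b), ("A", False, False)} if d_a
                    else {("A", False, d_b), ("A", True, d_b)})
        return ({("B", d_a, False), ("B", False, False)} if d_b
                else {("B", d_a, False), ("B", d_a, True)})

def and_or_search(state, problem, path=()):
    """Return a conditional plan [action, {outcome: subplan}], [] at the
    goal, or None on failure."""
    if problem.goal_test(state):
        return []
    if state in path:
        return None                      # cycle: this branch fails (OR node)
    for action in problem.actions(state):
        branches = {}
        for outcome in problem.results(state, action):  # AND node: must
            sub = and_or_search(outcome, problem, path + (state,))
            if sub is None:              # cover EVERY possible outcome
                break
            branches[outcome] = sub
        else:
            return [action, branches]
    return None

def verify(plan, state, problem):
    """Check that the plan reaches the goal on every outcome branch."""
    if plan == []:
        return problem.goal_test(state)
    action, branches = plan
    return all(verify(branches[o], o, problem)
               for o in problem.results(state, action))

plan = and_or_search(("A", True, True), ErraticVacuum())
print(plan[0])  # first action of the conditional plan
```

The returned plan is a tree, not a sequence: each possible outcome of an action indexes its own subplan, exactly the [. . . , if C then PlanA else PlanB, . . .] structure above.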

SLIDE 9

Example

Double Murphy: sucking or arriving may dirty a clean square

[Figure: AND–OR search tree for the double-Murphy vacuum world; alternating Left/Suck/Right action nodes and percept branches, with leaves labelled GOAL or LOOP.]


SLIDE 10

Example

Triple Murphy: also sometimes stays put instead of moving

[Figure: AND–OR tree for the triple-Murphy world; the Left action can loop back to the same state before the Suck branch reaches GOAL.]

[L1 : Left, if AtR then L1 else [if CleanL then [ ] else Suck]]

or [while AtR do [Left], if CleanL then [ ] else Suck]

“Infinite loop” but will eventually work unless action always fails

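Executing the cyclic plan can be simulated directly. In the sketch below, the 50% chance that Left stays put is an illustrative assumption (the slides give no probabilities); the while loop terminates with probability 1, matching the "infinite loop but will eventually work" remark.

```python
import random

random.seed(0)  # deterministic run for the example

def flaky_left(loc):
    """Triple-Murphy Left: the robot sometimes stays put (assumed 50%)."""
    return loc if random.random() < 0.5 else "L"

def execute_cyclic_plan(loc, clean_l):
    """Executes [while AtR do [Left], if CleanL then [ ] else Suck]."""
    while loc == "R":          # keep retrying Left until it succeeds
        loc = flaky_left(loc)
    if not clean_l:
        clean_l = True         # Suck the left square
    return loc, clean_l

print(execute_cyclic_plan("R", False))  # -> ('L', True)
```

The only way this fails to terminate is if Left fails on every attempt forever, an event of probability zero.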

SLIDE 11

Execution Monitoring

“Failure” = preconditions of remaining plan not met
Preconditions of remaining plan
= all preconditions of remaining steps not achieved by remaining steps
= all causal links crossing the current time point
On failure, resume POP to achieve open conditions from the current state
IPEM (Integrated Planning, Execution, and Monitoring):
♦ keep updating Start to match the current state
♦ links from actions are replaced by links from Start when done

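The monitoring loop can be sketched as below. Everything here is an illustrative stand-in (the slide describes POP/IPEM, not this code): actions are STRIPS-style tuples, the replanner is a tiny forward BFS rather than resumed POP, and a one-shot flaky Go action shows a failed precondition triggering a replan.

```python
from collections import deque

def applicable(state, action):
    return action[1] <= state                      # preconditions hold?

def apply_action(state, action):
    _, _, add, delete = action
    return (state - delete) | add

def replan(state, goal, actions):
    """Tiny forward BFS planner, standing in for resuming POP."""
    frontier, seen = deque([(frozenset(state), [])]), {frozenset(state)}
    while frontier:
        s, plan = frontier.popleft()
        if goal <= s:
            return plan
        for a in actions:
            if applicable(s, a):
                s2 = frozenset(apply_action(s, a))
                if s2 not in seen:
                    seen.add(s2)
                    frontier.append((s2, plan + [a]))
    return None

def monitor_and_replan(state, goal, actions, execute):
    plan = replan(state, goal, actions)
    while not goal <= state:
        if not plan or not applicable(state, plan[0]):
            plan = replan(state, goal, actions)    # failure: replan from here
        state = execute(state, plan.pop(0))        # act, then observe
    return state

# one-shot failure: the first Go(SM) leaves the agent at home
failures = [True]
def flaky(state, action):
    if action[0] == "Go(SM)" and failures and failures.pop():
        return state
    return apply_action(state, action)

go = ("Go(SM)", {"At(Home)"}, {"At(SM)"}, {"At(Home)"})
buy = ("Buy(Milk)", {"At(SM)"}, {"Have(Milk)"}, set())
print(sorted(monitor_and_replan({"At(Home)"}, {"Have(Milk)"}, [go, buy], flaky)))
# -> ['At(SM)', 'Have(Milk)']
```

After the failed Go(SM), the next step's precondition At(SM) does not hold, so the agent replans from its observed state and simply tries the trip again.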

SLIDE 12

Example

[Figure: partial-order shopping plan. Start (At(Home), Sells(HWS,Drill), Sells(SM,Milk), Sells(SM,Ban.)) → Go(HWS) → Buy(Drill) → Go(SM) → Buy(Milk), Buy(Ban.) → Go(Home) → Finish (Have(Drill), Have(Milk), Have(Ban.), At(Home)); causal links carry the At(...) and Sells(...) preconditions.]


SLIDE 13

Example

[Figure: the same shopping plan during execution; Start is updated to the current state, now supplying At(HWS) along with Sells(SM,Ban.) and Sells(HWS,Drill).]


SLIDE 14

Example

[Figure: the plan after Buy(Drill); Start now supplies At(HWS), Have(Drill), Sells(SM,Ban.), Sells(SM,Milk).]


SLIDE 15

Example

[Figure: the plan after Go(SM); Start now supplies At(SM), Have(Drill), Sells(SM,Ban.), Sells(SM,Milk).]


SLIDE 16

Example

[Figure: the plan after Buy(Milk) and Buy(Ban.); Start now supplies At(SM), Have(Drill), Have(Ban.), Have(Milk).]


SLIDE 17

Example

[Figure: the plan after Go(Home); Start now supplies At(Home), Have(Drill), Have(Ban.), Have(Milk) — all of Finish's preconditions are satisfied.]


SLIDE 18

Emergent behavior

[Figure: plan START (Color(Chair,Blue), ¬Have(Red)) → Get(Red) → Paint(Red) (pre: Have(Red)) → FINISH (Color(Chair,Red)).]

FAILURE (unmet precondition): Have(Red) → RESPONSE: fetch more red paint


SLIDE 19

Emergent behavior

[Figure: the same Get(Red)/Paint(Red) plan as on the previous slide.]

FAILURE (unmet precondition): Color(Chair,Red) → RESPONSE: apply an extra coat of paint


SLIDE 20

Emergent behavior

[Figure: the same Get(Red)/Paint(Red) plan as on the previous slides.]

FAILURE (unmet precondition): Color(Chair,Red) → RESPONSE: apply an extra coat of paint

“Loop until success” behavior emerges from the interaction between the monitor/replan agent design and an uncooperative environment

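The emergent loop can be seen in a toy simulation. The 50% chance that a coat of paint fails to take is an illustrative assumption; note that the plan itself contains no loop — retrying emerges from re-checking the open precondition Color(Chair,Red) of FINISH after each execution step.

```python
import random

random.seed(1)  # deterministic run for the example

def monitor_paint():
    """Monitor/replan agent for FINISH: Color(Chair,Red). The while loop is
    the monitoring cycle, not a loop written into the plan."""
    state, coats = {"Color(Chair,Blue)"}, 0
    while "Color(Chair,Red)" not in state:    # monitor: precondition open?
        if "Have(Red)" not in state:
            state.add("Have(Red)")            # replanned step: Get(Red)
        state.discard("Have(Red)")            # Paint(Red) uses up the paint
        coats += 1
        if random.random() < 0.5:             # the coat may not take
            state.add("Color(Chair,Red)")
    return coats

print(monitor_paint(), "coat(s) applied")
```

Each failed coat also re-triggers Get(Red), so both failure responses from the slides — fetch more paint and apply an extra coat — arise from the same mechanism.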