CS 680: Game AI, Week 8: Story Generation (3/5/2012), Santiago Ontañón


SLIDE 1

CS 680: GAME AI

WEEK 8: STORY GENERATION

3/5/2012 Santiago Ontañón santi@cs.drexel.edu https://www.cs.drexel.edu/~santi/teaching/2012/CS680/intro.html

SLIDE 2

Reminders

  • This is the last lecture! Next week will be a special project assistance session.
  • I’ll have slides on the AI of some of the games that you asked about, and a 2-3 slide summary of the course. But it will be short.
  • Progress self-check indicator:
  • Your progress is good if you have:
  • Project 1 completed.
  • Project 2 completed.
  • Progress self-check indicator (next week):
  • Your progress is good if you have:
  • Project 1 completed.
  • Project 2 completed.
  • A prototype (even if it doesn’t work) for Project 3.
  • Started writing your paper.
SLIDE 3

Outline

  • Student Presentation: “Towards Automated Game Design”
  • Student Presentation: “Game AI as Storytelling”

  • Introduction to Computational Narrative
  • (brief introduction to automated planning)
  • Planning-based Story Generation
  • (very brief introduction to CBR)
  • Analogy/CBR-based Story Generation
  • Story Generation in Computer Games
  • Project Discussion
SLIDE 4

Outline

  • Student Presentation: “Towards Automated Game Design”
  • Student Presentation: “Game AI as Storytelling”

  • Introduction to Computational Narrative
  • (brief introduction to automated planning)
  • Planning-based Story Generation
  • (very brief introduction to CBR)
  • Analogy/CBR-based Story Generation
  • Story Generation in Computer Games
  • Project Discussion
SLIDE 5

Computational Narrative

  • Algorithmically analyze, structure and generate stories.
  • Generating stories is an “AI-complete” problem. It involves many yet-unsolved problems:

  • believable characters / emotional modeling
  • natural language generation
  • common sense reasoning
  • narrative aesthetics
SLIDE 6

One Hundred Thousand Billion Poems

SLIDE 7
SLIDE 8

The Policeman’s Beard is Half Constructed

Bill sings to Sarah. Sarah sings to Bill. Perhaps they will do other dangerous things together. They may eat lamb or stroke each other. They may chant of their difficulties and their happiness. They have love but they also have typewriters. That is interesting.

SLIDE 9

Story Generation

  • Computers are far from generating novels at the level of human authors
  • However, many useful techniques have emerged from the story generation research community, especially applicable to video games
  • Videogames and story generation or storytelling are closely related

SLIDE 10

Choose Your Own Adventure

SLIDE 11

Choose Your Own Adventure

SLIDE 12

Outline

  • Student Presentation: “Towards Automated Game Design”
  • Student Presentation: “Game AI as Storytelling”

  • Introduction to Computational Narrative
  • (brief introduction to automated planning)
  • Planning-based Story Generation
  • (very brief introduction to CBR)
  • Analogy/CBR-based Story Generation
  • Story Generation in Computer Games
  • Project Discussion
SLIDE 13

Automated Planning

  • Planning:
  • Find the sequence of actions that will take us from an initial state

to a target state

  • Automated Planning:
  • Typically solved with specialized search algorithms
SLIDE 14

Automated Planning: Example

  • Blocks world

[Figure: blocks world. Initial state: A on the table with B on top of it; C on the table. Target state: B on the table, A on B, C on A. Possible actions: Take(X), Put(X,Y).]

SLIDE 15

Automated Planning: Example

  • Action definition:
  • Take(X)
  • Preconditions:
  • We have nothing in our hands
  • X is a block
  • Nothing on top of X
  • Postconditions:
  • X is not on top of anything
  • X is in our hands
  • Put(X,Y)
  • Preconditions:
  • X is in our hands
  • Y is the table, or Y is a block with nothing on top

  • Postconditions:
  • X is not in our hands
  • X is on top of Y
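The Take/Put definitions above can be written down as STRIPS-style operators. A minimal sketch, assuming a particular fact vocabulary (handempty, clear, on, holding) and an `Action` tuple that are illustrative choices, not from the slides:

```python
from typing import NamedTuple

class Action(NamedTuple):
    name: str
    preconditions: frozenset  # facts that must hold in the current state
    add: frozenset            # facts the action makes true
    delete: frozenset         # facts the action makes false

def take(x, y):
    """Take block x, which currently rests on y (a block or the Table)."""
    return Action(
        name=f"Take({x})",
        preconditions=frozenset({("handempty",), ("clear", x), ("on", x, y)}),
        add=frozenset({("holding", x)} | ({("clear", y)} if y != "Table" else set())),
        delete=frozenset({("handempty",), ("clear", x), ("on", x, y)}),
    )

def put(x, y):
    """Put the held block x onto y (the Table, or a block with nothing on top)."""
    return Action(
        name=f"Put({x},{y})",
        preconditions=frozenset({("holding", x)} | ({("clear", y)} if y != "Table" else set())),
        add=frozenset({("handempty",), ("clear", x), ("on", x, y)}),
        delete=frozenset({("holding", x)} | ({("clear", y)} if y != "Table" else set())),
    )

def apply_action(action, state):
    """Postconditions only apply when the preconditions hold."""
    assert action.preconditions <= state, f"{action.name}: preconditions not met"
    return (state - action.delete) | action.add
```

Applying the next slide's six-step solution to the initial state produces exactly the target configuration.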
SLIDE 16

Automated Planning: Example

[Figure: same blocks world as before. Initial state: A on the table with B on top; C on the table. Target state: B on the table, A on B, C on A.]

Solution:

  • Take(B)
  • Put(B,Table)
  • Take(A)
  • Put(A,B)
  • Take(C)
  • Put(C,A)
SLIDE 17

Automated Planning

  • Many approaches to solve the problem exist:
  • The simplest is known as “Forward Search”, and it amounts to using A*
  • Forward Search:
  • Each possible configuration of the world is a state in the A* search
  • The heuristic measures how many of the conditions in the target state are not satisfied. For example:

Current State:

  • on(A,Table)
  • on(B,A)
  • on(C,Table)

Target State:

  • on(B,Table)
  • on(A,B)
  • on(C,A)

h(s) = 3
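The forward search just described can be sketched as plain A* over world states, with the unsatisfied-goal-count heuristic. The state encoding (sets of on(x, y) facts) and the single `Move` action, which collapses the slides' Take/Put pair for brevity, are illustrative assumptions:

```python
import heapq
import itertools

BLOCKS = ("A", "B", "C")

def successors(state):
    """All legal single moves; a block is clear if nothing rests on it."""
    on = {x: y for (_, x, y) in state}
    supports = set(on.values())
    clear = [b for b in BLOCKS if b not in supports]
    for x in clear:
        for y in clear + ["Table"]:
            if y != x and on[x] != y:
                rest = {("on", a, b) for (_, a, b) in state if a != x}
                yield f"Move({x},{y})", frozenset(rest | {("on", x, y)})

def h(state, goal):
    """Number of target-state conditions not yet satisfied."""
    return len(goal - state)

def astar_plan(start, goal):
    tie = itertools.count()  # tiebreaker so states are never compared directly
    frontier = [(h(start, goal), 0, next(tie), start, [])]
    best_g = {start: 0}
    while frontier:
        _, g, _, state, plan = heapq.heappop(frontier)
        if goal <= state:
            return plan
        for name, nxt in successors(state):
            if g + 1 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g + 1
                heapq.heappush(frontier,
                               (g + 1 + h(nxt, goal), g + 1, next(tie), nxt, plan + [name]))
    return None
```

On the slide's instance this finds the three-move plan Move(B,Table), Move(A,B), Move(C,A), i.e. the six Take/Put steps of the previous slide with each Take/Put pair fused.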

SLIDE 18

Planning with A*:

[Figure: A* search tree. Root node S0 = initial state, h = 3.]

OPEN = [S0] CLOSED = []

SLIDE 19

Planning with A*:

[Figure: A* search tree. S0 is expanded: Take(B) leads to S1 and Take(C) leads to S2, both with h = 3.]

OPEN = [S1,S2] CLOSED = [S0]

SLIDE 20

Planning with A*:

[Figure: A* search tree. S1 is expanded: Put(B,C) leads to S3 (h = 3) and Put(B,Table) leads to S4 (h = 2).]

OPEN = [S2,S4,S3] CLOSED = [S0,S1]

SLIDE 21

Automated Planning

  • The example I showed is what is known as “classical planning”
  • There are other variants of the planning problem:
  • Temporal planning (actions take time)
  • Probabilistic planning (actions have probabilistic effects)
  • Nonlinear planning (plans might have parallel actions)
  • Many other algorithms:
  • Means-ends-analysis
  • Graph-plan
  • FF
  • HTN
  • Many heuristics:
  • Relaxation
SLIDE 22

Automated Planning

  • The idea that has to stick is:
  • Planning can be used to find a combination of actions that takes us from an initial state to a target state
  • Many real-life problems can be approached this way, and in particular story generation

SLIDE 23

Outline

  • Student Presentation: “Towards Automated Game Design”
  • Student Presentation: “Game AI as Storytelling”

  • Introduction to Computational Narrative
  • (brief introduction to automated planning)
  • Planning-based Story Generation
  • (very brief introduction to CBR)
  • Analogy/CBR-based Story Generation
  • Story Generation in Computer Games
  • Project Discussion
SLIDE 24

Tale-spin

  • James Meehan 1976
  • “The program, simply described, simulates a small world of characters who are motivated to act by having problems to solve. When an event occurs, it is expressed in English, thus forming the text of the story. Central to the simulation, therefore, are the techniques for solving problems”

SLIDE 25

Tale-spin

  • A story is generated in the following way. Given:
  • An initial state (entered by the user):
  • Characters
  • Setting (relationships between the characters, locations, etc.)
  • One character with a goal (hungry, thirsty, etc.)
  • A set of possible actions to perform (defined in the system)
  • Find a plan that makes the problem disappear:
  • Both the initial state and each event (action or goal) will be translated to natural language, thus forming the story

SLIDE 26

Tale-spin: Example

Once upon a time George ant lived near a patch of ground. There was a nest in an ash tree. Wilma bird lived in the nest. There was some water in a river. Wilma knew that the water was in the river. George knew that the water was in the river. One day Wilma was very thirsty. Wilma wanted to get near some water. Wilma flew from her nest across a meadow through a valley to the river. Wilma drank the water. Wilma was not thirsty any more.

George was very thirsty. George wanted to get near some water. George walked from his patch of ground across the meadow through the valley to a river bank. George fell into the water. George wanted to get near the valley. George couldn't get near the valley. George wanted to get near the meadow. George couldn't get near the meadow. Wilma wanted George to get near the meadow. Wilma wanted to get near George. Wilma grabbed George with her claw. Wilma took George from the river through the valley to the meadow. George was devoted to Wilma. George owed everything to Wilma. Wilma let go of George. George fell to the meadow. The end.

SLIDE 27

Tale-spin: Example

(Same text as on the previous slide.)

[Annotations: the opening sentences are the initial state; the two paragraphs are Story 1 and Story 2, each triggered by its own goal.]

SLIDE 28

Tale-spin

  • Tale-spin uses a form of means-ends analysis planning
  • Means-ends analysis starts with the conditions in the target state not yet satisfied in the current state, and tries to find an action to satisfy them.
  • For example: “John bear is not hungry” can be satisfied with the action “John bear ate X”.
  • But for John bear to eat something, he must have it. Etc.
  • The difference with A* planning (explained before) is:
  • A* searches forwards (from initial state to target state)
  • Means-ends analysis searches backwards (from target state to initial state)
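The backward reasoning above can be sketched in a few lines. The goal names, the action table, and its one-action-per-goal format (chosen for brevity; the real system picks among several strategies per goal) are invented for illustration:

```python
# For each goal condition: an action that achieves it, plus that action's
# own preconditions, which become subgoals.
ACHIEVES = {
    "not-hungry(john)": ("eat(john, food)",  ["has(john, food)"]),
    "has(john, food)":  ("take(john, food)", ["near(john, food)"]),
    "near(john, food)": ("go(john, food)",   []),
}

def means_ends(goal, state, plan):
    """Reason backwards from the goal; emit actions in forward order."""
    if goal in state:
        return plan
    action, preconditions = ACHIEVES[goal]
    for p in preconditions:       # satisfy the preconditions first...
        plan = means_ends(p, state, plan)
    plan.append(action)           # ...then perform the action itself
    return plan
```

`means_ends("not-hungry(john)", set(), [])` chains backwards (eat needs has, has needs near), yet returns the forward-ordered plan go, take, eat.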

SLIDE 29

Tale-spin: Example

  • Initial state (user defined):
  • Once upon a time Sam bear lived in a cave. Sam knew that Sam was in his cave. There was a beehive in an apple tree. Betty bee knew that the beehive was in the apple tree. Betty was in her beehive. Betty knew that Betty was in her beehive. There was some honey in Betty's beehive. Betty knew that the honey was in Betty's beehive. Betty had the honey. Betty knew that Betty had the honey. Sam knew that Betty was in her beehive. Sam knew that Betty had the honey. There was a rose flower in a flowerbed. Sam knew that the rose flower was in the flowerbed.
  • Problem:
  • Sam bear is hungry.
SLIDE 30

Tale-spin: Example

(Initial state: same as on the previous slide.)

Goal: Sam bear not hungry

SLIDE 31

Tale-spin: Example

(Initial state: same as above.)

Goal: Sam bear not hungry

Tale-spin knows, for each type of goal, the set of actions that can satisfy it. In this case:

  • “go towards something edible”
  • “have something edible”

One is picked at random.
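The random choice among satisfying strategies amounts to table lookups; both tables below are hypothetical reconstructions following the slides, not Tale-spin's actual data:

```python
import random

# Goal type -> strategies that can satisfy it (from the slide).
STRATEGIES = {
    "not-hungry": ["go towards something edible", "have something edible"],
}
# Character type -> foods it eats (the deck lists honey, salmon, etc. for bears).
FOODS = {
    "bear": ["honey", "salmon"],
}

def pick_strategy(goal, rng=random):
    return rng.choice(STRATEGIES[goal])   # one strategy is picked at random

def pick_food(kind, rng=random):
    return rng.choice(FOODS[kind])        # likewise for the concrete food
```

This random choice is where the variation between runs comes from: the same initial state can yield different stories.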

SLIDE 32

Tale-spin: Example

(Initial state: same as above.)

Goal: Sam bear not hungry
Goal: have something edible

SLIDE 33

Tale-spin: Example

(Initial state: same as above.)

Goal: Sam bear not hungry
Goal: have honey

Tale-spin knows, for each type of character, the set of things they eat. In this case, a bear:

  • Honey
  • Salmon
  • etc.

One is picked at random.

SLIDE 34

Tale-spin: Example

(Initial state: same as above.)

Goal: Sam bear not hungry
Goal: have honey

Every time a goal is posted, Tale-spin generates text to motivate the actions of the characters. In this case: “Sam bear wanted to get some honey”

SLIDE 35

Tale-spin: Example

(Initial state: same as above.)

Goal: Sam bear not hungry
Goal: have honey

SLIDE 36

Tale-spin: Example

(Initial state: same as above.)

Goal: Sam bear not hungry
Goal: have honey

Tale-spin searches among the possible actions that can satisfy “have honey”:

  • Take(honey)
  • Persuade-to-give(Betty,honey)
  • Persuade-to-abandon(Betty,honey)
  • etc.

Take(honey) cannot be used, since its precondition is that “honey” is not owned by someone else. So, Tale-spin selects one of the others at random.
SLIDE 37

Tale-spin: Example

(Initial state: same as above.)

Goal: Sam bear not hungry
Goal: have honey
Goal: persuade Betty to abandon honey

SLIDE 38

Tale-spin: Example

  • Tale-spin generates a lot of sentences (one per goal, action, effect of action, etc.)
  • When the final text is produced, those sentences that do not contribute to the story are removed. For example:
  • Raw text:
  • Sam TOOK THE HONEY. Sam HAD THE HONEY. Sam KNEW THAT Sam TOOK THE HONEY. Sam KNEW THAT Sam HAD THE HONEY. Sam ATE THE HONEY. THE HONEY WAS GONE. Sam WAS NOT HUNGRY. Sam THOUGHT THAT THE HONEY WAS GONE. THE END.
  • Final text:
  • “Sam took the honey. Sam ate the honey. Sam was not hungry. The end.”
SLIDE 39

Tale-spin

  • Goals: hungry, have something, persuade someone, go somewhere, etc.
  • For each goal: a collection of strategies and preconditions
  • Domain knowledge:
  • Which actions certain characters can execute
  • Which food different animals like
  • Inter-character relations and how they affect the actions they can execute
  • etc.
  • Using those 3 things, stories are generated at random, from the set of possible ways to solve goals.

SLIDE 40

Tale-spin

  • Planning-based story generation
  • Generated stories are about solving problems
  • Each individual character plans on its own (no joint behaviors between characters):

  • Tale-spin is “character centric”
  • Stories are always “coherent” (actions of characters are motivated)
SLIDE 41

Universe

  • Michael Lebowitz 1985
  • Tale-spin generates stories by simulating individual characters’ behavior
  • Universe tries to go a different route:
  • Planning at the author level
  • A single planner coordinates the behavior of all agents, to make sure that the story that comes out is interesting
  • In Tale-spin: characters plan
  • In Universe: the author plans
SLIDE 42

Universe

  • Three basic elements:
  • Characters
  • Goals: author goals (not character goals)
  • Plot-fragments: similar to “actions” in planning, but they might involve multiple characters
  • Universe generates stories like this (soap-opera-like):
  • Generates a random pool of characters
  • The author sets a goal
  • Universe generates a story fragment to reach that goal
  • The author sets another goal
  • Universe generates another story fragment to reach the second goal
  • Etc.
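The loop above can be sketched as recursive fragment expansion. The fragment library here is invented for illustration (loosely following the Liz/Neil example later in the deck), not Universe's actual data:

```python
import random

# Author goal -> plot fragments that can satisfy it; each fragment may
# post further subgoals.
FRAGMENTS = {
    "churn(Liz,Neil)": [
        {"name": "forced-apart",
         "subgoals": ["threaten(Stephano,Liz)", "breakup(Liz,Neil)"]},
    ],
    "threaten(Stephano,Liz)": [{"name": "parental-threat", "subgoals": []}],
    "breakup(Liz,Neil)":      [{"name": "lovers-quarrel",  "subgoals": []}],
}

def expand(goal, rng=random):
    """Pick a fragment satisfying the goal at random; recurse on its subgoals."""
    fragment = rng.choice(FRAGMENTS[goal])
    events = [fragment["name"]]
    for sub in fragment["subgoals"]:
        events += expand(sub, rng)
    return events
```

Expanding the author goal yields a linear sequence of plot events; subgoal-free fragments terminate each branch.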
SLIDE 43

Universe: Plot Fragment

SLIDE 44

Universe: Example

  • Characters:
  • Neil, Liz, Stephano, Marlena, Renee, Tony
  • Liz married to Tony, but in love with Neil
  • Stephano father of Liz (doesn’t like Neil)
  • Author sets the goal: (churn Neil Liz)
  • Universe generates:
  • Stephano threatens Liz: “forget him!”
  • Liz tells Neil she doesn’t love him
  • Marlena is worried about Neil
  • Marlena seduces Neil
  • Renee tries to kill Stephano
  • Liz and Tony got divorced
SLIDE 45

Universe

Initial state: characters and their relationships.
Goal: (churn Liz Neil)

SLIDE 46

Universe

[Diagram: from the initial state (characters and their relationships), the goal (churn Liz Neil) is satisfied by a plot fragment that posts two subgoals.]

SLIDE 47

Universe

[Diagram: same as on the previous slide, with the goal (churn Liz Neil) satisfied by a plot fragment that posts two subgoals.]

Universe selects at random one of the plot fragments from the library that can satisfy the goal, and posts any subgoals it might have.

SLIDE 48

Universe

[Diagram: as before, and one subgoal is satisfied by a second plot fragment, which posts a further subgoal.]

SLIDE 49

Universe

[Diagram: as before, and a third plot fragment satisfies another subgoal.]

SLIDE 50

Universe

[Diagram: as before; the third plot fragment posts no subgoals.]

Some goals can be satisfied by plot fragments that do not have subgoals, ending the search for that branch.

SLIDE 51

Universe

  • The approach generates complex plots where characters interact in interesting ways
  • Unlike Tale-spin, where characters only act self-interestedly, and any interesting interaction is pure chance
  • However:
  • In Tale-spin all actions of the characters are well motivated (characters have their own goals)
  • In Universe characters might do things for no reason
  • Plot-fragments have preconditions on the personality of the characters, but that is not enough to guarantee character believability

SLIDE 52

Other Approaches

  • Fabulist:
  • Character-centric stories are believable, but plots are not coherent (bad stories): Tale-spin
  • Author-centric stories have good plots, but characters are not believable (they are forced to perform actions for the sake of the plot)
  • Fabulist tries to combine the two
  • More details in the student presentation earlier in this class :)
SLIDE 53

Outline

  • Student Presentation: “Towards Automated Game Design”
  • Student Presentation: “Game AI as Storytelling”

  • Introduction to Computational Narrative
  • (brief introduction to automated planning)
  • Planning-based Story Generation
  • (very brief introduction to CBR)
  • Analogy/CBR-based Story Generation
  • Story Generation in Computer Games
  • Project Discussion
SLIDE 54

Case-Based Reasoning

  • AI paradigm based on solving new problems by reusing past solutions
  • Past experiences: cases
  • Problem
  • Solution
  • Past experience is captured in a “case base” (collection of cases)
  • To solve a problem: retrieve a relevant past case (similar problem) and adapt its solution (assess the result).

SLIDE 55

Case-Based Reasoning

SLIDE 56

Case-Based Reasoning

  • This sounds very similar to “nearest neighbor” (recall 2 weeks ago)

  • Nearest Neighbor:
  • Given a new problem P and a training set T (collection of examples)
  • Find the most similar example (X,Y) (most similar X to P)
  • Return Y
  • Case-Based Reasoning:
  • Given a new problem P and a case-base CB (collection of cases)
  • Find the most similar case (X,Y) (most similar X to P)
  • Adapt Y to solve P -> Y’
  • Return Y’
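The two procedures above differ only in the adapt step. A toy sketch, with a made-up numeric domain in the usage note:

```python
def retrieve(problem, case_base, distance):
    """Both methods start the same way: find the most similar stored case."""
    return min(case_base, key=lambda case: distance(problem, case[0]))

def nearest_neighbor(problem, case_base, distance):
    _, y = retrieve(problem, case_base, distance)
    return y                              # reuse the stored solution unchanged

def cbr_solve(problem, case_base, distance, adapt):
    x, y = retrieve(problem, case_base, distance)
    return adapt(x, y, problem)           # adapt the stored solution first
```

For example, with (problem, solution) cases [(2, 4), (10, 20)] for "double the number", problem 3, and distance abs(a - b): nearest neighbor returns the raw stored 4, while CBR with adapt = lambda x, y, p: y * p / x rescales it to 6.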
SLIDE 57

Case-Based Reasoning

  • CBR has typically been studied in domains with complex data representations
  • Nearest neighbor (and standard machine learning in general) has typically focused on simple data representations (feature vectors)
  • CBR has addressed complex representations (graphs, multimedia, natural language, etc.)
  • Adaptation in CBR is a very hard (and not very well understood) problem
  • CBR works!
  • Almost all major companies use CBR at some point in their production line (e.g. engine failure detection in airplanes by Boeing, ink generation for phones by Nokia, etc.)

SLIDE 58

Outline

  • Student Presentation: “Towards Automated Game Design”
  • Student Presentation: “Game AI as Storytelling”

  • Introduction to Computational Narrative
  • (brief introduction to automated planning)
  • Planning-based Story Generation
  • (very brief introduction to CBR)
  • Analogy/CBR-based Story Generation
  • Story Generation in Computer Games
  • Project Discussion
SLIDE 59

Analogy/CBR-based Story Generation

  • Automated planning can be used to generate stories, but:
  • Limited affordance: only goal-oriented stories; stories written by humans are not like that.
  • Alternative models for story generation have been explored (hoping to increase the range of story types computers can generate):
  • Analogy
  • CBR
SLIDE 60

Analogy/CBR-based Story Generation

  • Automated planning can be used to generate stories, but:
  • Limited affordance: only goal-oriented stories; stories written by humans are not like that.
  • Alternative models for story generation have been explored (hoping to increase the range of story types computers can generate):
  • Analogy
  • CBR

Student presentation next week: SAM Algorithm

SLIDE 61

Minstrel

  • Scott Turner 1993
  • Story generation as problem solving with creativity
  • Minstrel is a model of creative problem solving applied to the problem of story generation.
  • Turner observed that Tale-spin was just “shuffling the pieces” to generate stories. Nothing new was being created, other than what the system already knew.
  • Thus, Minstrel is an effort at modeling creativity, and at how a computer system can generate stories that are creative and interesting.

SLIDE 62

Minstrel

  • Minstrel is a CBR model, so it contains a case base:
  • A collection of small story snippets
  • The case base (or “memory”) contains all the domain-specific knowledge known to Minstrel
  • Additionally, Minstrel has general knowledge about “plans”, “goals”, and “states of the world”.
  • To generate a story, the author gives Minstrel a problem: e.g. “Knight does something and kills himself”
  • Minstrel takes that as a problem, and tries to generate a creative solution as to how that happened. The result is a story.

SLIDE 63

Minstrel: Failure Driven Creativity

  • Creativity is triggered when the CBR method cannot find a solution to a problem:
  • e.g.: “give me a scene for a knight killing a dragon”
  • Minstrel then modifies the original problem:
SLIDE 64

Minstrel: Failure Driven Creativity

  • Creativity is triggered when the CBR method cannot find a solution to a problem:
  • e.g.: “give me a scene for a knight killing a dragon”
  • Minstrel then modifies the original problem:

If any of these problems can be solved, the solution can be transformed back to the original problem: a creative solution.

SLIDE 65

Minstrel: Failure Driven Creativity

  • Creativity is triggered when the CBR method cannot find a solution to a problem:
  • e.g.: “give me a scene for a knight killing a dragon”
  • Minstrel then modifies the original problem:

[Diagram: a TRAM, a TRANSFORM-RECALL-ADAPT cycle applied to the problem.]

SLIDE 66

MINSTREL: TRAMS

  • Example TRAM: Change domain
  • TRANSFORM: Find another domain and map all the objects in the problem (keep the mapping)
  • RECALL: …
  • ADAPT: Use the mapping from the transform step to adapt the solution to the original domain
  • Example: “A knight saves a princess” to “A businessman saves a friend”
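The three steps can be sketched as follows; the one-scene library, the domain mapping, and the string-based adaptation are all invented for illustration:

```python
# Known scenes, indexed by (actor, verb, object) in their own domain.
LIBRARY = {
    ("businessman", "saves", "friend"):
        "The businessman pulled his friend from the wreck.",
}
# Domain mapping used by the TRANSFORM step.
MAPPING = {"knight": "businessman", "princess": "friend"}

def tram_change_domain(problem):
    # TRANSFORM: map the problem's objects into the other domain.
    transformed = tuple(MAPPING.get(term, term) for term in problem)
    # RECALL: look up a known scene for the transformed problem.
    scene = LIBRARY.get(transformed)
    if scene is None:
        return None
    # ADAPT: use the kept mapping, inverted, to bring the scene back.
    for source, target in MAPPING.items():
        scene = scene.replace(target, source)
    return scene
```

Asking for a scene where a knight saves a princess recalls the businessman scene and maps it back into the knight's domain.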

SLIDE 67

MINSTREL

SLIDE 68

MINSTREL

This is where MINSTREL solves problems. It can be recursive. When a TRAM transforms a problem, the resulting problem can be solved using another TRAM, etc.

SLIDE 69

MINSTREL: Example

Once upon a time there was a lady of the court named Jennifer. Jennifer loved a knight named Grunfeld. Grunfeld loved Jennifer. Jennifer wanted revenge on a lady of the court name Darlene because she had the berries which she picked in the woods and Jennifer wanted to have the berries. Jennifer wanted to scare Darlene. Jennifer wanted a dragon to move towards Darlene so that Darlene believed it would eat her. Jennifer wanted to appear to be a dragon so that a dragon would move towards Darlene. Jennifer drank a magic potion. Jennifer transformed into a dragon. A dragon move towards Darlene. A dragon was near Darlene.

Grunfeld wanted to impress the king. Grunfeld wanted to move towards the woods so that he could fight a dragon. Grunfeld moved towards the woods. Grunfeld was near the woods. Grunfeld fought a dragon. The dragon died. The dragon was Jennifer. Jennifer wanted to live. Jennifer tried to drink a magic potion but failed. Grunfeld was filled with grief. Jennifer was buried in the woods. Grunfeld became a hermit.

SLIDE 70

MINSTREL: Example

(Same story as on the previous slide.)

A “princess” attempts a revenge, and fails.

SLIDE 71

Outline

  • Student Presentation: “Towards Automated Game Design”
  • Student Presentation: “Game AI as Storytelling”

  • Introduction to Computational Narrative
  • (brief introduction to automated planning)
  • Planning-based Story Generation
  • (very brief introduction to CBR)
  • Analogy/CBR-based Story Generation
  • Story Generation in Computer Games
  • Project Discussion
SLIDE 72

Story Generation in Computer Games

  • Games can be seen as interactive stories
  • Story generation has several applications to games:
  • Drama management (story modification)
  • Quest generation (generate parts of the story, sub-plots)
  • Full-story generation (generate the complete story for a new game)
  • Very hyped, but very few games implement any of it (it’s a hard problem!):
  • Façade (drama management)
  • Skyrim (quest generation)
  • We’ll see more of this in the future!
SLIDE 73

Skyrim: Radiant Quest

SLIDE 74

Skyrim: Radiant Quest

Quests are generated using patterns and parameters. E.g. the “kill-enemy” pattern:

  • Start location
  • Enemy location
  • Reward

By selecting semi-random enemies and rewards, you can generate an uninteresting, but almost infinite, number of quests.
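The pattern-plus-parameters idea amounts to filling a template from semi-random tables; every name below is a placeholder, not actual Skyrim data:

```python
import random

ENEMIES   = ["bandit chief", "necromancer", "frost troll"]
LOCATIONS = ["an abandoned mine", "a ruined watchtower", "a mountain cave"]
REWARDS   = ["100 gold", "an enchanted sword"]

def kill_enemy_quest(rng=random):
    """Instantiate the 'kill-enemy' pattern with semi-random parameters."""
    enemy = rng.choice(ENEMIES)
    where = rng.choice(LOCATIONS)
    reward = rng.choice(REWARDS)
    return f"Go to {where}, kill the {enemy}, and return for {reward}."
```

Each call yields a structurally identical, individually forgettable quest, which is exactly the trade-off the slide describes.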

SLIDE 75

Plot Generation through Story Generation

  • Skyrim’s patterns are not different from Universe’s Plot Fragments
  • Skyrim uses simple non-recursive patterns (a sub-quest is completely generated by a single pattern)
  • More complex quests can be generated by using complex patterns, like in Universe:
  • Patterns with subgoals
SLIDE 76

Plot Fragments / Patterns

  • Roles:
  • List of items/characters/props/locations involved in the pattern
  • Preconditions:
  • Conditions that must be satisfied for the pattern to apply
  • Subgoals:
  • List of events or subgoals that the pattern has
  • Postconditions:
  • Which goals/effects/side-effects does this pattern accomplish
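The four fields above map directly onto a small data structure. This sketch, names included, is one possible encoding, not from the slides:

```python
from dataclasses import dataclass

@dataclass
class PlotPattern:
    name: str
    roles: dict           # role name -> bound item/character/location
    preconditions: list   # conditions that must hold for the pattern to apply
    subgoals: list        # events or subgoals the pattern posts
    postconditions: list  # goals/effects/side-effects the pattern accomplishes

def applicable(pattern, world):
    """A pattern applies when all its preconditions hold in the world."""
    return all(condition in world for condition in pattern.preconditions)
```

A quest generator would filter its library with `applicable` before choosing a pattern to instantiate.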
SLIDE 77

Plot Fragments / Patterns

  • Roles:
  • Item: X1
  • Thief: X2
  • Location: X3
  • Owner: X4
  • Person: X5
  • Reward: X6
  • Preconditions:
  • X2 and X4 alive, X3 is far away from location(X4), X2 is stronger than X4
  • Subgoals:
  • X4 contacts player and asks for a secret meeting
  • X4 asks player to retrieve X1 which was stolen by X2 and hidden at X3
  • X4 tells player that to enter in X3 there is a secret password, only known by X5
  • Player finds X5
  • Player asks password from X5
  • Player goes to X3 and uses password to enter
  • Player kills X2
  • Player retrieves X1
  • Player returns X1 to X4
  • Player gets reward X6
  • Postconditions:
  • Obtain reward X6, friends with X4, killed X2
SLIDE 78

Plot Fragments / Patterns

(Same plot fragment as on the previous slide.)

Many of these subgoals could be solved through additional plot fragments, making the plot more complex

SLIDE 79

Plot Fragments / Patterns

  • Roles:
  • Item: X1
  • Location: X2
  • Person: X3
  • Preconditions:
  • X3 alive
  • Subgoals:
  • Player asks X3 about X1
  • X3 tells player that X1 is in X2
  • Player goes to X2
  • Player finds X1
  • Postconditions:
  • Obtain X1

Some patterns should be simple, and not decomposable into subgoals, to ensure that the task of plot generation can end.
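One way to guarantee that termination, sketched with an invented pattern library: bound the recursion depth and, at the bound, allow only subgoal-free patterns:

```python
import random

# Goal -> candidate patterns; a pattern is "simple" when it has no subgoals.
PATTERNS = {
    "obtain-item": [
        {"name": "stolen-item-quest", "subgoals": ["learn-location", "defeat-thief"]},
        {"name": "simple-fetch",      "subgoals": []},
    ],
    "learn-location": [
        {"name": "informant-favor",   "subgoals": ["obtain-item"]},  # note the recursion
        {"name": "ask-directly",      "subgoals": []},
    ],
    "defeat-thief": [{"name": "duel", "subgoals": []}],
}

def generate(goal, depth, rng=random):
    options = PATTERNS[goal]
    if depth <= 0:  # budget exhausted: only simple patterns may be chosen
        options = [p for p in options if not p["subgoals"]]
    pattern = rng.choice(options)
    events = [pattern["name"]]
    for sub in pattern["subgoals"]:
        events += generate(sub, depth - 1, rng)
    return events
```

This needs every goal to have at least one simple pattern; with that invariant, generation always terminates even though the library is mutually recursive.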

SLIDE 80

Plot Generation through Story Generation

  • Having a large enough library of plot fragments,

Universe’s or Minstrel’s techniques can be used to generate complex plots in RPG/Adventure games

  • Story generation can also be applied to Drama

Management (talk earlier today):

  • If DM detects current plot will not work with current player: generate

a new plot automatically

  • Story generation better suited for optional quests:
  • Game companies need to guarantee that main plot-line is strong

(human authored)

  • Sub-plots can be automatically generated
SLIDE 81

Project 3 Idea

  • Create a small library of patterns (5 – 10)
  • Generate complex sub-plots in Anchorhead
  • Only if Project 3 is your focus: story generation is more complex than it seems!

SLIDE 82

Outline

  • Student Presentation: “Towards Automated Game Design”
  • Student Presentation: “Game AI as Storytelling”

  • Introduction to Computational Narrative
  • (brief introduction to automated planning)
  • Planning-based Story Generation
  • (very brief introduction to CBR)
  • Analogy/CBR-based Story Generation
  • Story Generation in Computer Games
  • Project Discussion
SLIDE 83

Project Discussion

  • Project 1:
  • Questions? Technical Problems? Assistance?
  • Project 2:
  • Questions? Technical Problems? Assistance?
  • Project 3:
  • Questions? Technical Problems? Assistance?