SLIDE 1: Outline
  • Read chapter; suggested exercises
  • Learning from examples
  • General-to-specific ordering over hypotheses
  • Version spaces and candidate elimination algorithm
  • Picking new examples
  • The need for inductive bias

Note: this simple approach (assuming no noise) illustrates key concepts.

lecture slides for textbook Machine Learning, T. Mitchell, McGraw Hill
SLIDE 2: Training Examples for EnjoySport

  Sky    AirTemp  Humid   Wind    Water  Forecast | EnjoySport
  Sunny  Warm     Normal  Strong  Warm   Same     | Yes
  Sunny  Warm     High    Strong  Warm   Same     | Yes
  Rainy  Cold     High    Strong  Warm   Change   | No
  Sunny  Warm     High    Strong  Cool   Change   | Yes

What is the general concept?
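The table above can be encoded directly as a tiny dataset. A minimal sketch (the name `TRAINING` is our own, not from the slides): each example is an attribute tuple plus its EnjoySport label.

```python
# EnjoySport training set from the slide: one (attributes, label) pair per
# row, with attribute order Sky, AirTemp, Humid, Wind, Water, Forecast.
TRAINING = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]

# Three rows are positive examples of the concept, one is negative.
positives = [x for x, label in TRAINING if label]
print(len(positives))  # 3
```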
SLIDE 3: Representing Hypotheses
Many possible representations. Here, h is a conjunction of constraints on attributes. Each constraint can be:
  • a specific value (e.g., Water = Warm)
  • don't care (e.g., "Water = ?")
  • no value allowed (e.g., "Water = Ø")

For example (Sky, AirTemp, Humid, Wind, Water, Forecast):
  <Sunny, ?, ?, Strong, ?, Same>
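A conjunction of constraints like this is mechanical to evaluate. A minimal sketch (the function name `matches` is our own): a hypothesis is a 6-tuple whose entries are a specific value, "?" (don't care), or "Ø" (no value allowed).

```python
def matches(h, x):
    """True iff instance x satisfies every constraint of hypothesis h.
    '?' accepts any value; any other symbol must equal the attribute value,
    so the null constraint 'Ø' never matches a real value."""
    return all(c == "?" or c == v for c, v in zip(h, x))

h = ("Sunny", "?", "?", "Strong", "?", "Same")
print(matches(h, ("Sunny", "Warm", "High", "Strong", "Cool", "Same")))  # True
print(matches(h, ("Rainy", "Warm", "High", "Strong", "Cool", "Same")))  # False
```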
SLIDE 4: Prototypical Concept Learning Task
  • Given:
      – Instances X: possible days, each described by the attributes Sky, AirTemp, Humidity, Wind, Water, Forecast
      – Target function c: EnjoySport : X → {0, 1}
      – Hypotheses H: conjunctions of literals, e.g. <?, Cold, High, ?, ?, ?>
      – Training examples D: positive and negative examples of the target function, <x1, c(x1)>, …, <xm, c(xm)>
  • Determine: a hypothesis h in H such that h(x) = c(x) for all x in D
SLIDE 5: The Inductive Learning Hypothesis
Any hypothesis found to approximate the target function well over a sufficiently large set of training examples will also approximate the target function well over other unobserved examples.
SLIDE 6: Instances, Hypotheses, and More-General-Than

[Figure: instances X (left) mapped to hypotheses H (right), ordered from specific to general]
  x1 = <Sunny, Warm, High, Strong, Cool, Same>
  x2 = <Sunny, Warm, High, Light, Warm, Same>
  h1 = <Sunny, ?, ?, Strong, ?, ?>
  h2 = <Sunny, ?, ?, ?, ?, ?>
  h3 = <Sunny, ?, ?, ?, Cool, ?>
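The more-general-than-or-equal ordering in the figure can be tested constraint by constraint. A sketch (naming is our own) that works for the pure conjunctive hypotheses used here:

```python
def more_general_or_equal(g, h):
    """g >= h iff every constraint of g is at least as permissive as the
    corresponding constraint of h ('?' covers anything; a specific value
    covers only itself)."""
    return all(a == "?" or a == b for a, b in zip(g, h))

h1 = ("Sunny", "?", "?", "Strong", "?", "?")
h2 = ("Sunny", "?", "?", "?", "?", "?")
print(more_general_or_equal(h2, h1))  # True: h2 is more general than h1
print(more_general_or_equal(h1, h2))  # False
```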
SLIDE 7: Find-S Algorithm
1. Initialize h to the most specific hypothesis in H
2. For each positive training instance x:
     For each attribute constraint a_i in h:
       If the constraint a_i in h is satisfied by x, then do nothing;
       else replace a_i in h by the next more general constraint that is satisfied by x
3. Output hypothesis h
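The steps above fit in a few lines of Python — a sketch under our own tuple encoding ("Ø" marks the null constraint, "?" don't-care). On the EnjoySport data it reproduces the trace shown on the next slide.

```python
def find_s(examples):
    """Find-S: start from the most specific hypothesis and minimally
    generalize it on each positive example; negatives are ignored."""
    h = ["Ø"] * 6                     # most specific hypothesis in H
    for x, positive in examples:
        if not positive:
            continue                  # Find-S skips negative examples
        for i in range(len(h)):
            if h[i] == "Ø":           # first positive: adopt its value
                h[i] = x[i]
            elif h[i] != x[i]:        # conflicting value: generalize to '?'
                h[i] = "?"
    return tuple(h)

TRAINING = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]
print(find_s(TRAINING))  # ('Sunny', 'Warm', '?', 'Strong', '?', '?')
```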
SLIDE 8: Hypothesis Space Search by Find-S

[Figure: Find-S search from the specific end of H toward the general end]
  h0 = <Ø, Ø, Ø, Ø, Ø, Ø>
  x1 = <Sunny, Warm, Normal, Strong, Warm, Same>, +  →  h1 = <Sunny, Warm, Normal, Strong, Warm, Same>
  x2 = <Sunny, Warm, High, Strong, Warm, Same>, +    →  h2 = <Sunny, Warm, ?, Strong, Warm, Same>
  x3 = <Rainy, Cold, High, Strong, Warm, Change>, −  →  h3 = <Sunny, Warm, ?, Strong, Warm, Same>
  x4 = <Sunny, Warm, High, Strong, Cool, Change>, +  →  h4 = <Sunny, Warm, ?, Strong, ?, ?>
SLIDE 9: Complaints about Find-S
  • Can't tell whether it has learned the concept
  • Can't tell when the training data is inconsistent
  • Picks a maximally specific h (why?)
  • Depending on H, there might be several
SLIDE 10: Version Spaces
A hypothesis h is consistent with a set of training examples D of target concept c if and only if h(x) = c(x) for each training example <x, c(x)> in D:

  Consistent(h, D) ≡ (∀<x, c(x)> ∈ D) h(x) = c(x)

The version space, VS_{H,D}, with respect to hypothesis space H and training examples D, is the subset of hypotheses from H consistent with all training examples in D:

  VS_{H,D} ≡ {h ∈ H | Consistent(h, D)}
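The Consistent(h, D) predicate translates directly. A sketch reusing the tuple encoding from the earlier sketches (function names are ours):

```python
def matches(h, x):
    # h(x): '?' accepts any value, otherwise the constraint must equal it
    return all(c == "?" or c == v for c, v in zip(h, x))

def consistent(h, examples):
    """Consistent(h, D): h(x) = c(x) for every <x, c(x)> in D."""
    return all(matches(h, x) == label for x, label in examples)

D = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
]
print(consistent(("Sunny", "?", "?", "?", "?", "?"), D))  # True
print(consistent(("?", "?", "?", "?", "?", "?"), D))      # False: matches the negative
```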
SLIDE 11: The List-Then-Eliminate Algorithm
1. VersionSpace ← a list containing every hypothesis in H
2. For each training example <x, c(x)>:
     remove from VersionSpace any hypothesis h for which h(x) ≠ c(x)
3. Output the list of hypotheses in VersionSpace
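For a small finite H this is directly runnable. The sketch below (our own code) enumerates conjunctive hypotheses whose constraints are either "?" or a value seen in the data, then filters by consistency; on the four EnjoySport examples, six hypotheses survive, matching the version space shown on the next slide.

```python
from itertools import product

def matches(h, x):
    return all(c == "?" or c == v for c, v in zip(h, x))

def list_then_eliminate(examples):
    """Enumerate a finite conjunctive H and keep only consistent members.
    Hypotheses containing the null constraint Ø are omitted: they match
    nothing, so they cannot be consistent with a positive example."""
    domains = [sorted({x[i] for x, _ in examples} | {"?"}) for i in range(6)]
    return [h for h in product(*domains)
            if all(matches(h, x) == label for x, label in examples)]

TRAINING = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]
vs = list_then_eliminate(TRAINING)
print(len(vs))  # 6
```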
SLIDE 12: Example Version Space

  S: { <Sunny, Warm, ?, Strong, ?, ?> }
     <Sunny, ?, ?, Strong, ?, ?>   <Sunny, Warm, ?, ?, ?, ?>   <?, Warm, ?, Strong, ?, ?>
  G: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }
SLIDE 13: Representing Version Spaces
The General boundary, G, of version space VS_{H,D} is the set of its maximally general members.
The Specific boundary, S, of version space VS_{H,D} is the set of its maximally specific members.
Every member of the version space lies between these boundaries:

  VS_{H,D} = {h ∈ H | (∃s ∈ S)(∃g ∈ G) g ≥ h ≥ s}

where x ≥ y means x is more general than or equal to y.
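The bracketing property gives a cheap membership test: rather than storing all of VS_{H,D}, keep only S and G and check that h lies between them. A sketch (our own code), using the more-general-or-equal test from earlier and the S and G boundaries of the EnjoySport example:

```python
def more_general_or_equal(g, h):
    return all(a == "?" or a == b for a, b in zip(g, h))

S = [("Sunny", "Warm", "?", "Strong", "?", "?")]
G = [("Sunny", "?", "?", "?", "?", "?"), ("?", "Warm", "?", "?", "?", "?")]

def in_version_space(h):
    """h is in VS iff g >= h >= s for some g in G and s in S."""
    return (any(more_general_or_equal(g, h) for g in G) and
            any(more_general_or_equal(h, s) for s in S))

print(in_version_space(("Sunny", "?", "?", "Strong", "?", "?")))  # True
print(in_version_space(("Rainy", "?", "?", "?", "?", "?")))       # False
```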
SLIDE 14: Candidate Elimination Algorithm
G ← maximally general hypotheses in H
S ← maximally specific hypotheses in H
For each training example d, do:
  • If d is a positive example:
      – Remove from G any hypothesis inconsistent with d
      – For each hypothesis s in S that is not consistent with d:
          * Remove s from S
          * Add to S all minimal generalizations h of s such that h is consistent with d, and some member of G is more general than h
          * Remove from S any hypothesis that is more general than another hypothesis in S
  • If d is a negative example:
SLIDE 15
      – Remove from S any hypothesis inconsistent with d
      – For each hypothesis g in G that is not consistent with d:
          * Remove g from G
          * Add to G all minimal specializations h of g such that h is consistent with d, and some member of S is more specific than h
          * Remove from G any hypothesis that is less general than another hypothesis in G
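Putting both branches together, here is a compact executable sketch (our own code and naming, specialized to conjunctive hypotheses over finite attribute domains; it assumes the first training example is positive, as in the EnjoySport trace, so the all-Ø hypothesis never has to be specialized against). On the EnjoySport data it reproduces the S and G boundaries shown on the Example Version Space slide.

```python
def matches(h, x):
    return all(c == "?" or c == v for c, v in zip(h, x))

def more_general(g, h):
    return all(a == "?" or a == b for a, b in zip(g, h))

def generalize(s, x):
    """Minimal generalization of s covering the positive example x
    (leaves s unchanged if it already matches x)."""
    return tuple(v if c == "Ø" else (c if c == v else "?")
                 for c, v in zip(s, x))

def specializations(g, x, domains):
    """Minimal specializations of g excluding the negative example x:
    replace one '?' by any domain value other than x's value there."""
    return [g[:i] + (v,) + g[i + 1:]
            for i, c in enumerate(g) if c == "?"
            for v in domains[i] if v != x[i]]

def candidate_elimination(examples):
    domains = [sorted({x[i] for x, _ in examples}) for i in range(6)]
    S = [("Ø",) * 6]        # stays a singleton for conjunctive H
    G = [("?",) * 6]
    for x, positive in examples:
        if positive:
            G = [g for g in G if matches(g, x)]
            S = [generalize(s, x) for s in S]
            S = [s for s in S if any(more_general(g, s) for g in G)]
        else:
            S = [s for s in S if not matches(s, x)]
            newG = []
            for g in G:
                if matches(g, x):   # inconsistent with the negative: specialize
                    newG += [h for h in specializations(g, x, domains)
                             if any(more_general(h, s) for s in S)]
                else:
                    newG.append(g)
            # drop any member less general than another member of G
            G = [g for g in newG
                 if not any(h != g and more_general(h, g) for h in newG)]
    return S, G

TRAINING = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]
S, G = candidate_elimination(TRAINING)
print(S)  # [('Sunny', 'Warm', '?', 'Strong', '?', '?')]
print(G)  # [('Sunny', '?', '?', '?', '?', '?'), ('?', 'Warm', '?', '?', '?', '?')]
```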
SLIDE 16: Example Trace

  S0: { <Ø, Ø, Ø, Ø, Ø, Ø> }
  G0: { <?, ?, ?, ?, ?, ?> }
SLIDE 17: What Next Training Example?

  S: { <Sunny, Warm, ?, Strong, ?, ?> }
     <Sunny, ?, ?, Strong, ?, ?>   <Sunny, Warm, ?, ?, ?, ?>   <?, Warm, ?, Strong, ?, ?>
  G: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }
SLIDE 18: How Should These Be Classified?

  S: { <Sunny, Warm, ?, Strong, ?, ?> }
     <Sunny, ?, ?, Strong, ?, ?>   <Sunny, Warm, ?, ?, ?, ?>   <?, Warm, ?, Strong, ?, ?>
  G: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }

  <Sunny, Warm, Normal, Strong, Cool, Change>
  <Rainy, Cold, Normal, Light, Warm, Same>
  <Sunny, Warm, Normal, Light, Warm, Same>
SLIDE 19: What Justifies this Inductive Leap?

  + <Sunny, Warm, Normal, Strong, Cool, Change>
  + <Sunny, Warm, Normal, Light, Warm, Same>
  S: <Sunny, Warm, Normal, ?, ?, ?>

Why believe we can classify the unseen <Sunny, Warm, Normal, Strong, Warm, Same>?
SLIDE 20: An UNBiased Learner
Idea: choose H that expresses every teachable concept (i.e., H is the power set of X).
Consider H′ = disjunctions, conjunctions, negations over the previous H. E.g.,

  <Sunny, Warm, Normal, ?, ?, ?> ∨ ¬<?, ?, ?, ?, ?, Change>

What are S, G in this case?
  S ←
  G ←
SLIDE 21: Inductive Bias
Consider:
  • a concept learning algorithm L
  • instances X, target concept c
  • training examples D_c = {<x, c(x)>}
  • let L(x_i, D_c) denote the classification assigned to the instance x_i by L after training on data D_c

Definition: The inductive bias of L is any minimal set of assertions B such that for any target concept c and corresponding training examples D_c,

  (∀x_i ∈ X) [ (B ∧ D_c ∧ x_i) ⊢ L(x_i, D_c) ]

where A ⊢ B means A logically entails B.
SLIDE 22: Inductive Systems and Equivalent Deductive Systems

[Figure: an inductive system (the candidate elimination algorithm using hypothesis space H) takes training examples and a new instance and outputs a classification of the new instance, or "don't know". The equivalent deductive system is a theorem prover given the same inputs plus the assertion "H contains the target concept" — the inductive bias made explicit.]
SLIDE 23: Three Learners with Different Biases
1. Rote learner: store examples; classify x_i iff it matches a previously observed example
2. Version space candidate elimination algorithm
3. Find-S
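For contrast with the biased learners above, the rote learner's empty bias is visible in code — a minimal sketch with our own naming; it answers only for instances it has literally seen.

```python
def rote_classify(memory, x):
    """Classify x only if it exactly matches a stored training example;
    otherwise answer None ("don't know").  The inductive bias is empty:
    this learner never generalizes beyond the data."""
    for seen, label in memory:
        if seen == x:
            return label
    return None

memory = [(("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True)]
print(rote_classify(memory, ("Sunny", "Warm", "Normal", "Strong", "Warm", "Same")))  # True
print(rote_classify(memory, ("Rainy", "Cold", "High", "Strong", "Warm", "Change")))  # None
```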
SLIDE 24: Summary Points
  • Concept learning as search through H
  • General-to-specific ordering over H
  • Version space candidate elimination algorithm
  • S and G boundaries characterize the learner's uncertainty
  • Learner can generate useful queries
  • Inductive leaps possible only if the learner is biased
  • Inductive learners can be modelled by equivalent deductive systems