Let the AI do the Talk
Adventures with Natural Language Generation
@MarcoBonzanini
PyParis 2018
PyData London Conference, 12-14 July 2019, @PyDataLondon

Natural Language Generation

Natural Language Processing
https://en.wikipedia.org/wiki/Infinite_monkey_theorem
from random import choice
from string import printable

def monkey_hits_keyboard(n):
    # body not shown on the slide; assumed: pick n random printable characters
    output = [choice(printable) for _ in range(n)]
    print("The monkey typed:")
    print(''.join(output))
>>> monkey_hits_keyboard(30)
The monkey typed:
% a9AK^YKx OkVG)u3.cQ,31("!ac%
>>> monkey_hits_keyboard(30)
The monkey typed:
fWE,ou)cxmV2IZ l}jSV'XxQ**9'|
Language models predict the most likely next word.
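This idea can be sketched with a tiny bigram model (the corpus and function names here are illustrative, not from the talk):

```python
from collections import Counter, defaultdict

def train_bigrams(tokens):
    """Count, for each word, which words follow it."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def most_likely_next(counts, word):
    """Return the word that most often follows `word`, or None."""
    followers = counts.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

tokens = "the cat sat on the mat the cat ran".split()
model = train_bigrams(tokens)
print(most_likely_next(model, "the"))  # 'cat' follows 'the' twice, 'mat' once
```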
[Diagram: feed-forward neural network with input layer (x1, x2), hidden layer (h1, h2, h3), output layer (y1)]
[Diagram: a single neuron with inputs x1, x2 and weights w1, w2]
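The diagram corresponds to a weighted sum of the inputs passed through an activation; a minimal sketch of such a neuron (a step activation is used here purely for illustration):

```python
def neuron(x1, x2, w1, w2, bias=0.0):
    """Single artificial neuron: weighted sum of inputs, then a step activation."""
    total = w1 * x1 + w2 * x2 + bias
    return 1 if total > 0 else 0

# With equal positive weights the neuron fires only when the inputs are large enough:
print(neuron(1.0, 1.0, 0.5, 0.5, bias=-0.7))  # 0.5 + 0.5 - 0.7 = 0.3 > 0, so it fires
```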
http://colah.github.io/posts/2015-08-Understanding-LSTMs/
https://en.wikipedia.org/wiki/Long_short-term_memory
Define the network

model = Sequential()
model.add(LSTM(128, input_shape=(maxlen, len(chars))))
model.add(Dense(len(chars), activation='softmax'))
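The snippet above assumes `maxlen` and `chars` (and, later, `x` and `y`) are already defined. A common way to prepare them, following the standard Keras char-RNN recipe (the toy text here is only a stand-in for a real corpus):

```python
import numpy as np

text = "hello world " * 200          # stand-in for a real training corpus
chars = sorted(set(text))            # the character vocabulary
char_indices = {c: i for i, c in enumerate(chars)}
indices_char = {i: c for c, i in char_indices.items()}

maxlen, step = 40, 3                 # window length and stride
sentences, next_chars = [], []
for i in range(0, len(text) - maxlen, step):
    sentences.append(text[i:i + maxlen])
    next_chars.append(text[i + maxlen])

# One-hot encode: x has shape (samples, maxlen, vocab), y has shape (samples, vocab)
x = np.zeros((len(sentences), maxlen, len(chars)), dtype=bool)
y = np.zeros((len(sentences), len(chars)), dtype=bool)
for i, sentence in enumerate(sentences):
    for t, char in enumerate(sentence):
        x[i, t, char_indices[char]] = 1
    y[i, char_indices[next_chars[i]]] = 1
```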
Configure the network

model.compile(
    loss='categorical_crossentropy',
    optimizer='adam',  # optimizer not shown on the slide; 'adam' is a placeholder
)
Train the network

model.fit(x, y, batch_size=128, epochs=60, callbacks=[print_callback])
model.save('char_model.h5')
Generate text

for i in range(output_size):
    ...
    preds = model.predict(x_pred, verbose=0)[0]
    next_index = sample(preds, diversity)
    next_char = indices_char[next_index]
    generated += next_char
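The `sample` helper used above is not shown on the slides; a common implementation (as in the Keras text-generation example) rescales the predicted distribution by a temperature before drawing the next index:

```python
import numpy as np

def sample(preds, temperature=1.0):
    """Draw an index from a probability array, reweighted by temperature.

    Low temperature -> conservative (close to argmax); high -> more surprising.
    """
    preds = np.asarray(preds).astype('float64')
    preds = np.log(preds) / temperature
    exp_preds = np.exp(preds)
    preds = exp_preds / np.sum(exp_preds)
    probas = np.random.multinomial(1, preds, 1)
    return int(np.argmax(probas))
```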
Seed text: the same loop starts from a seed sequence (x_pred) and extends it one character at a time.
Readings & Credits

https://www.youtube.com/watch?v=WCUNPb-5EYI
http://colah.github.io/posts/2015-08-Understanding-LSTMs/
http://karpathy.github.io/2015/05/21/rnn-effectiveness/

Pics:
Microsoft_Cortana_light.svg.png