SLIDE 22 Multi-agent learning Fictitious Play
Coding theory and entropy (continued)
Another encoding could be

Code 2:
    m1 ↔ 0
    m2 ↔ 10
    m3 ↔ 110
    m4 ↔ 111

To prevent ambiguity, no code word may be a prefix of some other code word. A useless coding would be
Code 3:
    m1 ↔ 0
    m2 ↔ 1
    m3 ↔ 00
    m4 ↔ 01

Under Code 3, the sequence 0101 may mean different things, such as m1, m2, m1, m2, or m1, m2, m4. (There are still other possibilities.)
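This ambiguity can be checked mechanically. The sketch below (a hypothetical helper, not part of the slides) enumerates every way a bit string decomposes into code words: Code 3 yields several readings of 0101, whereas the prefix-free Code 2 admits at most one reading of any bit string.

```python
# Prefix-free Code 2 versus the ambiguous Code 3 from the slide.
CODE2 = {"0": "m1", "10": "m2", "110": "m3", "111": "m4"}  # prefix-free
CODE3 = {"0": "m1", "1": "m2", "00": "m3", "01": "m4"}     # not prefix-free

def parses(bits, code):
    """Return every decomposition of `bits` into code words of `code`."""
    if not bits:
        return [[]]            # the empty string has exactly one (empty) parse
    results = []
    for word, message in code.items():
        if bits.startswith(word):
            for rest in parses(bits[len(word):], code):
                results.append([message] + rest)
    return results

print(len(parses("0101", CODE3)))       # 4 different readings: ambiguous
print(parses("010110111", CODE2))       # [['m1', 'm2', 'm3', 'm4']]: unique
```

Under a prefix-free code, greedy left-to-right reading is enough: as soon as the bits read so far match a code word, that code word is the only possible choice.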
- The objective is to search for an efficient encoding, i.e., an encoding that minimises the number of bits per message.
- If the relative frequency of messages is known, we can, for every code, compute the expected number of bits per message, and hence its efficiency.
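For instance, assuming the messages occur with relative frequencies 1/2, 1/4, 1/8, 1/8 (illustrative values, not given on the slide), the expected number of bits per message under Code 2 can be computed as:

```python
# Sketch: expected bits per message for Code 2 under assumed
# relative frequencies (illustrative values, not from the slides).
freq   = {"m1": 0.5, "m2": 0.25, "m3": 0.125, "m4": 0.125}
length = {"m1": 1,   "m2": 2,    "m3": 3,     "m4": 3}  # Code 2 word lengths

expected_bits = sum(freq[m] * length[m] for m in freq)
print(expected_bits)  # 1.75
```

So under these frequencies, Code 2 needs 1.75 bits per message on average, better than the 2 bits of a fixed-length encoding of four messages.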
Gerard Vreeswijk. Last modified on February 27th, 2012 at 18:35 Slide 22