SLIDE 21 Multi-agent learning Fictitious Play
Coding theory and entropy (continued)
Another encoding could be Code 2:

    m1 ↔ 0
    m2 ↔ 10
    m3 ↔ 110
    m4 ↔ 111

To prevent ambiguity, no code word may be a prefix of another code word. A useless coding would be Code 3:

    m1 ↔ 0
    m2 ↔ 1
    m3 ↔ 00
    m4 ↔ 01

Under Code 3, the sequence 0101 may mean different things, such as m1, m2, m1, m2, or m1, m2, m4. (There are still other possibilities.)
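The prefix condition and the resulting ambiguity can be checked mechanically. The sketch below restates Code 2 and Code 3 as dictionaries (taking m1 ↔ 0 in both, consistent with the tables above); `is_prefix_free` and `decodings` are hypothetical helper names, not part of the slides.

```python
code2 = {"m1": "0", "m2": "10", "m3": "110", "m4": "111"}
code3 = {"m1": "0", "m2": "1", "m3": "00", "m4": "01"}

def is_prefix_free(code):
    """True iff no code word is a prefix of another code word."""
    words = list(code.values())
    return not any(a != b and b.startswith(a) for a in words for b in words)

def decodings(bits, code, prefix=()):
    """Enumerate every message sequence that encodes exactly to `bits`."""
    if not bits:
        yield prefix
    for msg, word in code.items():
        if bits.startswith(word):
            yield from decodings(bits[len(word):], code, prefix + (msg,))

print(is_prefix_free(code2))                    # True: decoding is unambiguous
print(is_prefix_free(code3))                    # False: "0" is a prefix of "00" and "01"
print(list(decodings("0101", code3)))           # four distinct decodings of 0101
```

Running this shows that 0101 has four decodings under Code 3 (the two named on the slide, plus m4, m4 and m4, m1, m2), whereas a prefix-free code such as Code 2 admits at most one decoding of any bit string.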
- The objective is to search for an efficient encoding, i.e., an encoding that minimises the number of bits per message.
- If the relative frequency of the messages is known, we can compute, for every code, the expected number of bits per message, and hence its efficiency.
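As a worked sketch of that computation: the expected number of bits per message is the frequency-weighted average of the code word lengths. The frequencies below are hypothetical (they are not given on this slide), and `code1` is assumed to be a fixed-length two-bit encoding for comparison.

```python
# Hypothetical relative frequencies of the four messages (not from the slide).
freq = {"m1": 0.5, "m2": 0.25, "m3": 0.125, "m4": 0.125}

code1 = {"m1": "00", "m2": "01", "m3": "10", "m4": "11"}   # assumed fixed-length code
code2 = {"m1": "0", "m2": "10", "m3": "110", "m4": "111"}  # Code 2 from the slide

def expected_bits(code, freq):
    """Frequency-weighted average code word length, in bits per message."""
    return sum(freq[m] * len(word) for m, word in code.items())

print(expected_bits(code1, freq))   # 2.0 bits per message
print(expected_bits(code2, freq))   # 1.75 bits per message
```

Under these (assumed) frequencies, Code 2 is the more efficient encoding: frequent messages get short code words, so it averages 1.75 bits per message against 2 for the fixed-length code.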
Gerard Vreeswijk. Slides last processed on Tuesday 2nd March, 2010 at 13:53h. Slide 21