SLIDE 5: Entropy and Conditional Entropy; Mutual Information and Kullback–Leibler Divergence
Observations
1 The amount of information is related to the number of possible
outcomes: message B is a result of two possible outcomes, while
message T is a result of eight possible outcomes.
2 The amount of information obtained in learning the two messages
should be additive, while the number of possible outcomes is multiplicative. Let f(·) be a function that measures the amount of information:

f(# of possible outcomes of B) = amount of info. from learning B
f(# of possible outcomes of T) = amount of info. from learning T

f(# of outcomes of B × # of outcomes of T) = f(# of outcomes of B) + f(# of outcomes of T)
What function produces additive outputs from multiplicative inputs? The logarithmic function.
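The observation above can be checked numerically. A minimal sketch, assuming (as in the slide) that message B has 2 possible outcomes and message T has 8, and taking f to be the base-2 logarithm so that information is measured in bits:

```python
import math

# Assumed outcome counts from the slide: B has 2 outcomes, T has 8.
n_b, n_t = 2, 8

info_b = math.log2(n_b)           # info from learning B: 1 bit
info_t = math.log2(n_t)           # info from learning T: 3 bits
info_both = math.log2(n_b * n_t)  # info from learning both: 4 bits

# Multiplicative inputs, additive outputs: log(a*b) = log(a) + log(b)
assert info_both == info_b + info_t
print(info_b, info_t, info_both)  # 1.0 3.0 4.0
```

The base of the logarithm only fixes the unit (base 2 gives bits, base e gives nats); the additivity property holds for any base.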
5 / 42 I-Hsiang Wang IT Lecture 2