Information Theory
Lecture 1
- Course introduction
- Entropy, relative entropy and mutual information:
Cover & Thomas (CT) 2.1–5
- Important inequalities: CT 2.6–8, 2.10
Mikael Skoglund, Information Theory 1/26
Information Theory
- Founded by Claude Shannon in 1948.
- C. E. Shannon, “A mathematical theory of communication,”
Bell System Technical Journal, vol. 27, pp. 379–423, 623–656, 1948.
- “The fundamental problem of communication is that of
reproducing at one point either exactly or approximately a message selected at another point.”
- Information theory is concerned with
- communication, information, entropy, coding, achievable
performance, performance bounds, limits, inequalities, . . .
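As a small preview of the entropy quantity listed above (a sketch, not part of the slides; the function name is illustrative), Shannon entropy measures the average uncertainty of a discrete source in bits:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits,
    for a discrete distribution given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))   # about 0.469
```

Entropy is defined and developed in CT 2.1, along with relative entropy and mutual information.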