Computing and Communications
2. Information Theory - Entropy
Ying Cui, Department of Electronic Engineering, Shanghai Jiao Tong University, China, 2017 Autumn
– Theorem 1. The minimum compression rate of a source is its entropy rate H
– Theorem 2. The maximum reliable rate over a channel is its mutual information I
– Theorem 3. End-to-end reliable communication is possible if and only if H < I, i.e., there is no loss in performance by using a digital interface between source and channel coding
– after almost 70 years, all communication systems are still designed based on this framework
– the limits not only serve as benchmarks for evaluating communication schemes, but also provide insights on designing good ones
– the basic information-theoretic limits in Shannon's theorems have now been successfully achieved using efficient algorithms and codes
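Theorem 3 can be illustrated with a small numerical sketch (not from the slides). It uses the binary entropy function and the capacity formula C = 1 - H(p) for a binary symmetric channel, a standard result assumed here since the slides have not yet introduced channel capacity; the example source and channel parameters are hypothetical:

```python
import math

def h2(p):
    """Binary entropy function H(p) = -p log2 p - (1-p) log2 (1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0  # convention: 0 log 0 = 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Entropy rate H of a Bernoulli(0.11) source (hypothetical example value).
H = h2(0.11)
# Capacity I of a binary symmetric channel with crossover probability 0.05
# (C = 1 - H(p) is a standard result, assumed here, not derived in this section).
I = 1 - h2(0.05)

# Theorem 3: end-to-end reliable communication is possible iff H < I.
print(H < I)  # True: this source can be sent reliably over this channel
```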
– log is to the base 2, and entropy is expressed in bits
– define 0 log 0 = 0, since x log x → 0 as x → 0
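This definition, together with the 0 log 0 = 0 convention, translates directly into a short sketch (the function name is mine, not from the slides):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
    # Skip zero-probability outcomes: 0 log 0 = 0, since x log x -> 0 as x -> 0.
    return sum(-px * math.log2(px) for px in p if px > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
print(entropy([1.0, 0.0]))   # 0.0 bits: a deterministic source
print(entropy([0.25] * 4))   # 2.0 bits: uniform over four outcomes
```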
– conventions: 0 log (0/0) = 0, 0 log (0/q) = 0, and p log (p/0) = ∞
– if there exists x such that p(x) > 0 and q(x) = 0, then D(p||q) = ∞
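These conventions can be made concrete in a short sketch of relative entropy (the function name is mine, not from the slides):

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p||q) = sum_x p(x) log2 (p(x)/q(x)), in bits."""
    d = 0.0
    for px, qx in zip(p, q):
        if px == 0:
            continue            # 0 log (0/q) = 0 for any q (and 0 log (0/0) = 0)
        if qx == 0:
            return math.inf     # p log (p/0) = infinity when p > 0
        d += px * math.log2(px / qx)
    return d

print(kl_divergence([0.5, 0.5], [0.5, 0.5]))  # 0.0: identical distributions
print(kl_divergence([0.5, 0.5], [1.0, 0.0]))  # inf: q assigns 0 where p > 0
```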
– convex functions: x^2, |x|, e^x, and x log x (for x ≥ 0)
– concave functions: log x and √x (for x ≥ 0)
– linear functions ax + b are both convex and concave