Lecture 2 Lossless Source Coding
I-Hsiang Wang
Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw
October 2, 2016
1 / 50 I-Hsiang Wang IT Lecture 2
The engineering problem motivating the study of this subject: given a source sequence of N symbols, how small can the compression ratio/rate K/N be made while the sequence remains reconstructible?
1 Encoder: represent the source sequence s[1 : N] by a binary source codeword w of length K bits.
2 Decoder: from the source codeword w, reconstruct the source sequence either losslessly or lossily.
3 Efficiency: determined by the code rate R ≜ K/N bits/symbol time.

Two reconstruction criteria:
1 Exact (lossless): the reconstructed sequence must coincide with the source sequence (with probability approaching 1).
2 Lossy: the reconstructed sequence need only approximate the source sequence to within a prescribed distortion.
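As a first numerical anchor (a sketch of mine, not from the slides): an uncompressed binary source costs 1 bit per symbol, and the entropy quantifies how far below that the rate R = K/N can in principle be pushed.

```python
import math

def binary_entropy(p: float) -> float:
    """H_b(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Uncompressed, a binary source costs 1 bit/symbol; the entropy is the
# benchmark that the compression rate K/N can approach.
for p in (0.5, 0.25, 0.1):
    print(f"p = {p}: H_b(p) = {binary_entropy(p):.4f} bits/symbol")
```

The more biased the source, the lower the entropy, and hence the more room there is for compression.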
(Another reason to use a random source model is due to engineering reasons, as mentioned in Lecture 1.)

Two coding paradigms:
1 Variable-length (lossless) codes: the codeword length varies with the source realization; efficiency is measured by the expected codeword length per symbol.
2 Fixed-length codes: the codeword length is K for every realization; efficiency is measured by the rate K/N, given that the error probability P_e^(N) vanishes as N → ∞.
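To make the variable-length option concrete, here is a toy prefix code (my example, not from the lecture) for a dyadic four-symbol source; for dyadic probabilities the expected codeword length matches the entropy exactly.

```python
import math

# Hypothetical dyadic source and a prefix (instantaneous) code for it:
# no codeword is a prefix of another, so decoding is unambiguous.
probs = {"a": 1/2, "b": 1/4, "c": 1/8, "d": 1/8}
code  = {"a": "0", "b": "10", "c": "110", "d": "111"}

expected_len = sum(p * len(code[s]) for s, p in probs.items())
entropy = -sum(p * math.log2(p) for p in probs.values())
print(expected_len, entropy)  # both equal 1.75 bits/symbol
```

Note the more probable symbols get shorter codewords; this is the design principle behind Huffman coding.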
Plan of this lecture:
1 First, focusing on memoryless (i.i.d.) sources, introduce a powerful tool called typical sequences, and prove the lossless source coding theorem.
2 Second, extend the typical-sequence framework to sources with memory, and prove a similar coding theorem in terms of the entropy rate.
Typical Sequences and a Lossless Source Coding Theorem
Typicality and AEP

(cf. Lecture 2 "Operational Meaning of Entropy")
Definition (empirical frequency and typical set): for xⁿ ∈ Xⁿ, the empirical frequency of a symbol a ∈ X is π(a | xⁿ) ≜ (1/n) Σ_{i=1}^n 1{xi = a}. For a DMS with Xi ~ PX i.i.d., the ε-typical set is
T_ε^(n) ≜ { xⁿ : |π(a | xⁿ) − PX(a)| ≤ ε PX(a), ∀ a ∈ X }.
Let P(xⁿ) ≜ ∏_{i=1}^n PX(xi), that is, the probability that the DMS generates the sequence xⁿ. Properties of the typical set:
1 ∀ xⁿ ∈ T_ε^(n): 2^{−n(1+ε)H(X)} ≤ P(xⁿ) ≤ 2^{−n(1−ε)H(X)} (by definition of typical sequences and entropy)
2 lim_{n→∞} P{Xⁿ ∈ T_ε^(n)} = 1 (by the law of large numbers (LLN))
3 |T_ε^(n)| ≤ 2^{n(1+ε)H(X)} (by summing up the lower bound in property 1 over the typical set)
4 |T_ε^(n)| ≥ (1 − ε) 2^{n(1−ε)H(X)} for n sufficiently large (by properties 1 and 2)
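The concentration in property 2 can be checked empirically; a quick Monte Carlo sketch, assuming a Bernoulli(0.3) source and ε = 0.2 (parameters mine):

```python
import random

def empirical_freq(x, a):
    return x.count(a) / len(x)

def is_typical(x, p, eps):
    # robust typicality: |pi(a|x) - p(a)| <= eps * p(a) for every symbol a
    return all(abs(empirical_freq(x, a) - pa) <= eps * pa for a, pa in p.items())

random.seed(0)
p, eps, trials = {0: 0.7, 1: 0.3}, 0.2, 500

def hit_rate(n):
    hits = 0
    for _ in range(trials):
        x = [0 if random.random() < p[0] else 1 for _ in range(n)]
        hits += is_typical(x, p, eps)
    return hits / trials

print(hit_rate(10), hit_rate(1000))  # the hit rate climbs toward 1 as n grows
```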
In words:
1 The typical set has probability close to 1 for large n.
2 Within the typical set, all sequences are roughly equiprobable, each with probability ≈ 2^{−nH(X)}.
Lossless Source Coding Theorem
1 A fixed-length source code consists of an encoding function enc : S^N → {0, 1}^K and a decoding function dec : {0, 1}^K → S^N, with rate R ≜ K/N.
2 The error probability for a code is P_e^(N) ≜ P{Ŝ[1 : N] ≠ S[1 : N]}.
3 A rate R is said to be achievable if there exists a sequence of codes with rate at most R such that P_e^(N) → 0 as N → ∞.

Theorem (lossless source coding theorem for DMS): for a DMS {Si} with entropy H(S), every rate R > H(S) is achievable, and conversely every achievable rate satisfies R ≥ H(S).
Proof of achievability (sketch): fix ε > 0 and encode as follows.
1 If s[1 : N] ∈ T_ε^(N), send a flag bit 1 followed by the index of s[1 : N] in an enumeration of T_ε^(N); since |T_ε^(N)| ≤ 2^{N(1+ε)H(S)}, the index takes at most ⌈N(1+ε)H(S)⌉ bits.
2 If s[1 : N] ∉ T_ε^(N), send a flag bit 0 followed by an arbitrary string; this is the only error event.
Hence P_e^(N) ≤ P{S[1 : N] ∉ T_ε^(N)} → 0 as N → ∞ by property 2 of the typical set, while the rate is R = (⌈N(1+ε)H(S)⌉ + 1)/N → (1 + ε)H(S). Since ε > 0 is arbitrary, every R > H(S) is achievable.
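The achievability construction can be simulated at toy scale (parameters mine): enumerate the typical set, count how many index bits it needs, and compare the resulting rate to H(S).

```python
import math
from itertools import product

# Toy typical-set code for an assumed Bernoulli(0.2) source at blocklength 16:
# index the eps-typical sequences, plus one flag bit for the atypical case.
p = {0: 0.8, 1: 0.2}
n, eps = 16, 0.25

def typical(x):
    return all(abs(x.count(a) / n - pa) <= eps * pa for a, pa in p.items())

T = [x for x in product((0, 1), repeat=n) if typical(x)]
K = math.ceil(math.log2(len(T))) + 1    # +1 flag bit for the atypical case
H = -sum(q * math.log2(q) for q in p.values())
print(len(T), K / n, H)   # the rate K/n sits below 1 bit/symbol, approaching H(X)
```

At this tiny blocklength the rate overhead over H(X) is still visible; it shrinks as n grows and ε shrinks.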
Exercise 1
Show that Lemma 1 can be sharpened as follows, if U, V both take values in U.
Proof of converse (sketch): since Ŝ[1 : N] is a function of the codeword W, Fano's inequality gives H(S[1 : N] | W) ≤ 1 + N P_e^(N) log |S|. Hence
N H(S) = H(S[1 : N]) = I(S[1 : N]; W) + H(S[1 : N] | W) ≤ NR + 1 + N P_e^(N) log |S|.
Dividing by N and letting N → ∞ with P_e^(N) → 0 yields R ≥ H(S).

Exercise 2
Prove the lossless source coding theorem for DMS by using Theorem 1 in Lecture 2.
Weakly Typical Sequences and Sources with Memory
Entropy Rate of Random Processes

1 A measure of uncertainty for random processes
2 A general AEP for random processes with memory
For a random process {Xi}, the total entropy H(X1, X2, …) is likely to be ∞, so we consider per-symbol quantities. Define, when the limits exist:
H({Xi}) ≜ lim_{n→∞} (1/n) H(X1, X2, …, Xn)
H′({Xi}) ≜ lim_{n→∞} H(Xn | X1, …, Xn−1)
Exercise 3 (H and H′ need not both exist)
Consider a random process {Xi} where X1, X3, … are i.i.d. and X2k = X2k−1 for all k ∈ N. Show that H({Xi}) exists, but H′({Xi}) does not.
Lemma (Cesàro mean): if lim_{n→∞} bn = c, then lim_{n→∞} an = c, where an ≜ (1/n) Σ_{k=1}^n bk.
Take an ≜ (1/n) H(X1, …, Xn) and bn ≜ H(Xn | X1, …, Xn−1); then an = (1/n) Σ_{k=1}^n bk due to the chain rule. Hence, if H′({Xi}) exists, so does H({Xi}), and the two are equal.
Exercise 4
Show that for a stationary {Xi}, (1/n) H(X1, …, Xn) is decreasing in n, and H({Xi}) = inf_{n≥1} (1/n) H(X1, …, Xn).
For a stationary {Xi}:
1 By the chain rule, (1/n) H(X1, …, Xn) = (1/n) Σ_{k=1}^n H(Xk | X1, …, Xk−1).
2 By stationarity, H(Xn | X1, …, Xn−1) is non-increasing in n and bounded below by 0, so H′({Xi}) = lim_{n→∞} H(Xn | X1, …, Xn−1) exists.
3 By the Cesàro mean lemma, H({Xi}) also exists, and H({Xi}) = H′({Xi}).
Example (binary Markov source): let {Xi} be a stationary Markov chain on {0, 1} with P{Xn+1 = 1 | Xn = 0} = α and P{Xn+1 = 0 | Xn = 1} = β. Its stationary distribution is (β/(α+β), α/(α+β)), and its entropy rate is
H({Xi}) = H(X2 | X1) = (β/(α+β)) Hb(α) + (α/(α+β)) Hb(β),
where Hb denotes the binary entropy function.
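The closed-form entropy rate of the binary Markov source can be sketched in code (helper names mine):

```python
import math

def hb(p):
    """Binary entropy function, in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def markov_entropy_rate(alpha, beta):
    # Stationary distribution of the two-state chain with flip
    # probabilities alpha (0 -> 1) and beta (1 -> 0).
    pi0, pi1 = beta / (alpha + beta), alpha / (alpha + beta)
    # Entropy rate H' = H(X2 | X1) = pi0*Hb(alpha) + pi1*Hb(beta)
    return pi0 * hb(alpha) + pi1 * hb(beta)

print(markov_entropy_rate(0.5, 0.5))  # flips are fair coin tosses: 1.0 bit/symbol
print(markov_entropy_rate(0.1, 0.1))  # a "sticky" chain has a much lower rate
```

Memory reduces uncertainty: the sticky chain produces long runs, so each new symbol carries well under 1 bit.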
Typicality for Sources with Memory
Definition (weakly typical set): for a DMS with entropy H(X), the weakly typical set is
A_ε^(n) ≜ { xⁿ : | −(1/n) log P(xⁿ) − H(X) | ≤ ε },
where P(xⁿ) = ∏_{i=1}^n PX(xi), that is, the probability that the DMS generates the sequence xⁿ. Two immediate properties:
1 ∀ xⁿ ∈ A_ε^(n): 2^{−n(H(X)+ε)} ≤ P(xⁿ) ≤ 2^{−n(H(X)−ε)} (definition)
2 lim_{n→∞} P{Xⁿ ∈ A_ε^(n)} = 1 (by LLN)
Note that −(1/n) log P(xⁿ) = (1/n) Σ_{i=1}^n log(1/PX(xi)) = Σ_{a∈X} π(a | xⁿ) log(1/PX(a)), which ties the empirical frequencies to weak typicality.

Exercise 5
Show that T_ε^(n) ⊆ A_δ^(n), where δ = εH(X), whereas in general for some ε > 0, there is no δ′ > 0 such that A_{δ′}^(n) ⊆ T_ε^(n).
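The second claim of Exercise 5 can be seen concretely (my example): for a uniform binary DMS, every sequence is weakly typical, yet the all-zeros sequence fails robust typicality.

```python
import math

# For a uniform DMS on {0,1}, every x^n has P(x^n) = 2^(-n), so
# -(1/n) log2 P(x^n) = 1 = H(X): EVERY sequence is weakly typical for
# any eps > 0. But the all-zeros sequence is far from robustly typical.
n = 20
x = [0] * n                      # all-zeros sequence
p = {0: 0.5, 1: 0.5}

neg_log_prob_rate = -sum(math.log2(p[xi]) for xi in x) / n
print(neg_log_prob_rate)         # exactly 1.0 = H(X): weakly typical

freq0 = x.count(0) / n
print(abs(freq0 - p[0]))         # empirical freq. off by 0.5: not robustly typical
```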
Properties of the weakly typical set (AEP):
1 ∀ xⁿ ∈ A_ε^(n): 2^{−n(H(X)+ε)} ≤ P(xⁿ) ≤ 2^{−n(H(X)−ε)} (by definition)
2 lim_{n→∞} P{Xⁿ ∈ A_ε^(n)} = 1 (by LLN)
3 |A_ε^(n)| ≤ 2^{n(H(X)+ε)} (by 1 and 2)
4 |A_ε^(n)| ≥ (1 − ε) 2^{n(H(X)−ε)} for n sufficiently large (by 1 and 2)
Property 2 holds because −(1/n) log P(Xⁿ) = (1/n) Σ_{i=1}^n log(1/PX(Xi)) → E[log(1/PX(X))] = H(X) in probability, by the weak LLN.
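The weak-LLN step can be visualized numerically; a sketch assuming a Bernoulli(0.3) DMS (parameters mine):

```python
import math
import random

# Empirical check that -(1/n) log2 P(X^n) concentrates around H(X).
random.seed(1)
p = {0: 0.7, 1: 0.3}
H = -sum(q * math.log2(q) for q in p.values())   # entropy, about 0.8813 bits

def neg_log_prob_rate(n):
    x = [0 if random.random() < p[0] else 1 for _ in range(n)]
    return -sum(math.log2(p[xi]) for xi in x) / n

for n in (10, 100, 10000):
    print(n, abs(neg_log_prob_rate(n) - H))  # deviation shrinks as n grows
```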
For sources with memory, the summands log(1/P(Xi | X1, …, Xi−1)) are neither independent nor identically distributed, so the weak LLN does not apply directly. For a stationary ergodic source, however, time averages of the form (1/n) Σ_{l=0}^{n−1} (·) still converge to the corresponding ensemble averages.

Theorem (AEP for stationary ergodic sources): for a stationary ergodic source {Xi},
−(1/n) log P(Xⁿ) → H({Xi}) in probability as n → ∞.
With this, the four properties of A_ε^(n) hold with H(X) replaced by the entropy rate H({Xi}).
Consequently, the achievability and converse arguments carry over to stationary ergodic sources, with T_ε^(N) replaced by the weakly typical set A_ε^(N) and the entropy H(S) replaced by the entropy rate H({Si}) = lim_{N→∞} (1/N) H(S[1 : N]).
Summary
A rate R is achievable for lossless source coding if R > H, and not achievable if R < H, where
1 H = H(S) for DMS {Si}.
2 H = H({Si}) for ergodic DSS {Si}.
The key tool is the typical set T_ε^(n), with the four properties:
1 ∀ xⁿ ∈ T_ε^(n): 2^{−n(1+ε)H} ≤ P(xⁿ) ≤ 2^{−n(1−ε)H}
2 lim_{n→∞} P{Xⁿ ∈ T_ε^(n)} = 1
3 |T_ε^(n)| ≤ 2^{n(1+ε)H}
4 |T_ε^(n)| ≥ (1 − ε) 2^{n(1−ε)H} for n sufficiently large