

1. On a Redundancy of AIFV-m Codes for m = 3, 5
   Ryusei Fujita†, Ken-ichi Iwata†, Hirosuke Yamamoto††
   † University of Fukui   †† The University of Tokyo
   2020 IEEE International Symposium on Information Theory, 21-26 June 2020

2. The setting of the research
   Source S → Encoder ϕ → uniquely decodable noiseless channel → Decoder ϕ⁻¹,
   where ϕ(s) ∈ {0, 1}* and ϕ⁻¹(ϕ(s)) = s for every source symbol s ∈ S,
   allowing a decoding delay of at most m bits.
   S: stationary memoryless source with probability distribution {p_s, s ∈ S}
   H := −Σ_{s∈S} p_s log2 p_s   (entropy of the source S)
   L_ϕ: average codeword length of a code ϕ
   R_ϕ := L_ϕ − H   (redundancy of a code ϕ for a given source S)
   We derive a new upper bound on the redundancy of AIFV-m codes for m = 3, 5.
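To make these definitions concrete (this example is mine, not from the slides), the following Python sketch computes the entropy H, the per-symbol average codeword length L_ϕ of an ordinary FV prefix code, and the redundancy R_ϕ = L_ϕ − H for a hypothetical 4-symbol source; the probabilities and codewords are made up for illustration.

```python
import math

# Hypothetical stationary memoryless source: symbol -> probability p_s
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Hypothetical prefix code phi: symbol -> binary codeword
phi = {"a": "0", "b": "10", "c": "110", "d": "111"}

# Entropy of the source: H := -sum_s p_s * log2(p_s)
H = -sum(ps * math.log2(ps) for ps in p.values())

# Average codeword length of an ordinary FV code: L_phi = sum_s p_s * |phi(s)|
L = sum(p[s] * len(phi[s]) for s in p)

# Redundancy: R_phi := L_phi - H
print(f"H = {H:.4f} bits, L = {L:.4f} bits, R = {L - H:.4f} bits")
# This source is dyadic, so the code above achieves R = 0.
```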

3. A brief history of redundancy of the optimal FV codes
   Evaluate upper bounds on the redundancy R_ϕ when p1 := max_{s∈S} p_s is known.
   In 1978, Gallager [1] proved, for Huffman codes,
     R_Huffman ≤ p1 + 0.086            if p1 < 1/2,
     R_Huffman ≤ 2 − p1 − h(p1)        if p1 ≥ 1/2,
   where h(x) := −x log2 x − (1 − x) log2(1 − x) for 0 < x < 1.
   In 2017, Hu, Yamamoto, and Honda [2] proposed AIFV-m codes and proved, for AIFV-2 codes,
     R_AIFV-2 ≤ 2 − 2p1 + p1² − h(p1)              if 1/2 ≤ p1 ≤ (√5 − 1)/2,
     R_AIFV-2 ≤ (2 + p1 − 2p1²)/(1 + p1) − h(p1)   if (√5 − 1)/2 ≤ p1 < 1,
   and that the worst-case redundancy R_AIFV-m = 1/m for m ∈ {2, 3, 4}.
   They conjectured that R_AIFV-m ≤ 1/m for any positive integer m.
   This research evaluates an upper bound on R_AIFV-3 when p1 is known,
   and proves that the worst-case redundancy of AIFV-5 codes is R_AIFV-5 = 1/5.
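A small numerical sketch of the two bounds as reconstructed above (my own code, not part of the presentation; the names gallager_bound and aifv2_bound are introduced here). It simply evaluates Gallager's Huffman bound and the AIFV-2 bound of [2, Theorem 4] at a few values of p1.

```python
import math

def h(x):
    # Binary entropy function, 0 < x < 1
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def gallager_bound(p1):
    # Gallager [1]: R_Huffman <= p1 + 0.086 if p1 < 1/2, else 2 - p1 - h(p1)
    return p1 + 0.086 if p1 < 0.5 else 2 - p1 - h(p1)

def aifv2_bound(p1):
    # Hu-Yamamoto-Honda [2, Theorem 4], valid for 1/2 <= p1 < 1
    golden = (math.sqrt(5) - 1) / 2
    if p1 <= golden:
        return 2 - 2 * p1 + p1 ** 2 - h(p1)
    return (2 + p1 - 2 * p1 ** 2) / (1 + p1) - h(p1)

for p1 in (0.55, 0.65, 0.75, 0.85, 0.95):
    print(f"p1={p1:.2f}  Huffman <= {gallager_bound(p1):.4f}  AIFV-2 <= {aifv2_bound(p1):.4f}")
```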

4. Upper bounds on redundancy R_Huffman when p1 is known
   R_Huffman := L_Huffman − H ≤ 1,   p1 := max_{s∈S} p_s
   Gallager [1]:   R_Huffman ≤ 2 − p1 − h(p1)   if 1/2 ≤ p1 ≤ 1
   Manstetten [6]: tight bounds on R_Huffman    if 1/127 ≤ p1 ≤ 1/2
   [Figure: upper bounds on R_Huffman plotted against p1 ∈ (0, 1)]
   [1] R. Gallager, "Variations on a Theme by Huffman," IEEE Trans. Inf. Theory, 1978.
   [6] D. Manstetten, "Tight Bounds on the Redundancy of Huffman Codes," IEEE Trans. Inf. Theory, 1992.

5. Upper bounds on redundancy R_AIFV-2 when p1 is known
   Theorem [2, Theorem 4]:
     R_AIFV-2 ≤ 2 − 2p1 + p1² − h(p1)              if 1/2 ≤ p1 ≤ (√5 − 1)/2,
     R_AIFV-2 ≤ (2 + p1 − 2p1²)/(1 + p1) − h(p1)   if (√5 − 1)/2 ≤ p1 < 1.
   [Figure: Gallager [1], R_Huffman ≤ 2 − p1 − h(p1), and Hu et al. [2],
    R_AIFV-2 ≤ 1/2 for 0 < p1 < 1, plotted against p1 := max_{s∈S} p_s]
   [2] W. Hu, H. Yamamoto, and J. Honda, "Worst-case Redundancy of Optimal Binary AIFV
   Codes and Their Extended Codes," IEEE Trans. Inf. Theory, 2017.

6. Upper bounds on redundancy R_AIFV-3 when p1 is known
   Theorem 3 in the paper [New]:
     R_AIFV-3 ≤ (3 − 3p1 + p1²)/(2 − p1) − h(p1)                    if 1/2 ≤ p1 ≤ (√5 − 1)/2,
     R_AIFV-3 ≤ (4 − p1 − 3p1² + p1³)/((2 − p1)(1 + p1)) − h(p1)    if (√5 − 1)/2 < p1 ≤ β1,
     R_AIFV-3 ≤ (3 − p1 − 2p1² + p1³)/(1 + p1) − h(p1)              if β1 ≤ p1 ≤ 3/4,
     R_AIFV-3 ≤ (9 − 5p1²)/(4(1 + p1)) − h(p1)                      if 3/4 < p1 ≤ β2,
     R_AIFV-3 ≤ (23 + 24p1 − 35p1³)/(12(1 + p1 + p1²)) − h(p1)      if β2 ≤ p1 < 1,
   where β1 ≈ 0.6889 and β2 ≈ 0.8287.
   [Figure: the new bound on R_AIFV-3 plotted against p1, below the R_AIFV-2 bound]
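Since the piecewise bound above is reconstructed from garbled slide text, a sketch like the following (my own) is useful as a check: it evaluates the five cases and verifies that adjacent cases agree at the breakpoints (√5 − 1)/2, β1, 3/4, and β2.

```python
import math

def h(x):
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

GOLDEN = (math.sqrt(5) - 1) / 2
BETA1, BETA2 = 0.6889, 0.8287        # approximate breakpoints from the slide

def aifv3_bound(p1):
    # Theorem 3 [New], as reconstructed: upper bound on R_AIFV-3 for 1/2 <= p1 < 1
    if p1 <= GOLDEN:
        val = (3 - 3 * p1 + p1 ** 2) / (2 - p1)
    elif p1 <= BETA1:
        val = (4 - p1 - 3 * p1 ** 2 + p1 ** 3) / ((2 - p1) * (1 + p1))
    elif p1 <= 0.75:
        val = (3 - p1 - 2 * p1 ** 2 + p1 ** 3) / (1 + p1)
    elif p1 <= BETA2:
        val = (9 - 5 * p1 ** 2) / (4 * (1 + p1))
    else:
        val = (23 + 24 * p1 - 35 * p1 ** 3) / (12 * (1 + p1 + p1 ** 2))
    return val - h(p1)

# The two one-sided values at each breakpoint should (almost) coincide.
for b in (GOLDEN, BETA1, 0.75, BETA2):
    print(f"p1 = {b:.4f}: {aifv3_bound(b - 1e-6):.5f} vs {aifv3_bound(b + 1e-6):.5f}")
```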

7. The worst-case redundancy R_AIFV-3 of AIFV-3 codes
   Theorem [Hu et al., Theorem 7]:
     If p1 ≥ 1/2, then R_AIFV-3 ≤ min{R_AIFV-2, f3(p1)} ≤ 1/3,
   where
     f3(p1) := 2 − p1 + (−p1³ − 2p1² − p1 + 2)/(p1² + p1 + 1) − h(p1)
   is given by an AIFV-3 code.
   [Figure: f3(p1) and the R_AIFV-2 bound plotted against p1]
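As a numerical sanity check of [Hu et al., Theorem 7] with the reconstructed formulas (again my own sketch), the code below evaluates min{R_AIFV-2 bound, f3(p1)} on a grid of p1 ≥ 1/2 and confirms that the value stays below 1/3, approaching 1/3 only as p1 → 1.

```python
import math

def h(x):
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def aifv2_bound(p1):
    # [2, Theorem 4] bound on R_AIFV-2 for 1/2 <= p1 < 1
    golden = (math.sqrt(5) - 1) / 2
    if p1 <= golden:
        return 2 - 2 * p1 + p1 ** 2 - h(p1)
    return (2 + p1 - 2 * p1 ** 2) / (1 + p1) - h(p1)

def f3(p1):
    # f_3 from [Hu et al., Theorem 7], achieved by an explicit AIFV-3 code
    return 2 - p1 + (-p1 ** 3 - 2 * p1 ** 2 - p1 + 2) / (p1 ** 2 + p1 + 1) - h(p1)

worst = 0.0
for k in range(10001):
    p1 = 0.5 + k * 0.4999 / 10000     # grid over [1/2, 0.9999]
    worst = max(worst, min(aifv2_bound(p1), f3(p1)))
print(f"max of min over the grid: {worst:.5f}  (stays below 1/3 = {1/3:.5f})")
```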

8. The worst-case redundancy R_AIFV-3 of AIFV-3 codes
   Theorem [Hu et al., Theorem 7]:
     If p1 ≥ 1/2, then R_AIFV-3 ≤ min{R_AIFV-2, f3(p1)} ≤ 1/3.
   Theorem [Hu et al., Theorem 5]:
     If p1 ≤ 1/2, then R_AIFV-2 ≤ 1/4, and hence R_AIFV-3 ≤ R_AIFV-2 ≤ 1/4 ≤ 1/3.
   [Figure: f3(p1) and the R_AIFV-2 bound plotted against p1]

9. The worst-case redundancy R_AIFV-3 of AIFV-3 codes
   Theorem [Hu et al., Theorem 7]:
     If p1 ≥ 1/2, then R_AIFV-3 ≤ min{R_AIFV-2, f3(p1)} ≤ 1/3.
   Theorem [Hu et al., Theorem 5]:
     If p1 ≤ 1/2, then R_AIFV-2 ≤ 1/4, and hence R_AIFV-3 ≤ R_AIFV-2 ≤ 1/4 ≤ 1/3.
   Therefore R_AIFV-3 ≤ 1/3 for 0 < p1 < 1.
   [Figure: f3(p1) and the R_AIFV-2 bound plotted against p1]

10. The worst-case redundancy R_AIFV-3 of AIFV-3 codes
    Theorem [Hu et al., Theorem 7]:
      If p1 ≥ 1/2, then R_AIFV-3 ≤ min{R_AIFV-2, f3(p1)} ≤ 1/3.
    Theorem [Hu et al., Theorem 5]:
      If p1 ≤ 1/2, then R_AIFV-2 ≤ 1/4, and hence R_AIFV-3 ≤ R_AIFV-2 ≤ 1/4 ≤ 1/3.
    R_AIFV-3 ≤ 1/3 and lim_{p1→1} R_AIFV-3 = 1/3:
    the worst-case redundancy of AIFV-3 codes is 1/3.
    [Figure: f3(p1) and the R_AIFV-2 bound plotted against p1]

11. The worst-case redundancy R_AIFV-3 of AIFV-3 codes
    Theorem [Hu et al., Theorem 7]:
      If p1 ≥ 1/2, then R_AIFV-3 ≤ min{R_AIFV-2, f3(p1)} ≤ 1/3.
    Theorem [Hu et al., Theorem 5]:
      If p1 ≤ 1/2, then R_AIFV-2 ≤ 1/4, but 1/4 > 1/5.
    We need a new lemma to prove R_AIFV-5 ≤ 1/5.
    [Figure: f3(p1) and the R_AIFV-2 bound plotted against p1]

12. The worst-case redundancy R_AIFV-3 of AIFV-3 codes
    In our research, Theorem 3 in the paper shows R_AIFV-3 ≤ 1/3 if p1 ≥ 1/2,
    using the new upper bounds on R_AIFV-3.
    [Figure: the new R_AIFV-3 bound and the R_AIFV-2 bound plotted against p1]

13. The worst-case redundancy R_AIFV-3 of AIFV-3 codes
    In our research, Theorem 3 in the paper shows R_AIFV-3 ≤ 1/3 if p1 ≥ 1/2,
    using the new upper bounds on R_AIFV-3.
    Lemma 1 in the paper shows R_AIFV-3 ≤ 16/9 − log2 3 if p1 ≤ 1/2.
    [Figure: the new R_AIFV-3 bound and the R_AIFV-2 bound plotted against p1]

14. The worst-case redundancy R_AIFV-3 of AIFV-3 codes
    In our research, Theorem 3 in the paper shows R_AIFV-3 ≤ 1/3 if p1 ≥ 1/2,
    using the new upper bounds on R_AIFV-3.
    Lemma 1 in the paper shows R_AIFV-3 ≤ 16/9 − log2 3 ≤ 1/5 ≤ 1/3 if p1 ≤ 1/2.
    [Figure: the new R_AIFV-3 bound and the R_AIFV-2 bound plotted against p1]

15. The worst-case redundancy R_AIFV-3 of AIFV-3 codes
    In our research, Theorem 3 in the paper shows R_AIFV-3 ≤ 1/3 if p1 ≥ 1/2,
    using the new upper bounds on R_AIFV-3.
    Lemma 1 in the paper shows R_AIFV-3 ≤ 16/9 − log2 3 ≤ 1/5 ≤ 1/3 if p1 ≤ 1/2.
    Lemma 1 shows R_AIFV-5 ≤ R_AIFV-3 ≤ 1/5 if p1 ≤ 1/2.
    [Figure: the new R_AIFV-3 bound and the R_AIFV-2 bound plotted against p1]
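The only numerical fact needed from Lemma 1 on this slide is that 16/9 − log2 3 lies below 1/5; a one-line check (mine):

```python
import math
c = 16 / 9 - math.log2(3)
print(c, c <= 1 / 5)   # 0.1928... <= 0.2, so R_AIFV-3 <= 1/5 whenever p1 <= 1/2
```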

16. The worst-case redundancy R_AIFV-4 of AIFV-4 codes
    Theorem [2, Theorem 8]:
      R_AIFV-4 ≤ min{R_AIFV-2, f4(p1)} ≤ 1/4   if p1 ≥ 1/2,
      R_AIFV-4 ≤ R_AIFV-2 ≤ 1/4                if p1 ≤ 1/2,
    where
      f4(x) := 2 − x − h(x) + ((x + 1)(3 − 4x) + x²(2 − 3x) + x³(1 − x))/(x³ + x² + x + 1)
    is given in [2, Eq. (52)] using an AIFV-4 code.
    [Figure: f4(p1) and the R_AIFV-2 bound plotted against p1]
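A short sketch (mine) of the reconstructed f4 from [2, Eq. (52)]; the printed values illustrate that f4(p1) approaches 1/4 as p1 → 1, which is where the worst-case redundancy of AIFV-4 codes is attained.

```python
import math

def h(x):
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def f4(x):
    # f_4 from [2, Eq. (52)], achieved by an explicit AIFV-4 code
    num = (x + 1) * (3 - 4 * x) + x ** 2 * (2 - 3 * x) + x ** 3 * (1 - x)
    den = x ** 3 + x ** 2 + x + 1
    return 2 - x - h(x) + num / den

for p1 in (0.6, 0.8, 0.99, 0.9999):
    print(f"f4({p1}) = {f4(p1):.5f}")   # tends to 1/4 = 0.25 as p1 -> 1
```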

17. The worst-case redundancy R_AIFV-5 of AIFV-5 codes
    Theorem [New]:
      R_AIFV-5 ≤ min{R_AIFV-3, f5(p1)} ≤ 1/5   if p1 ≥ 1/2,
      R_AIFV-5 ≤ R_AIFV-3 ≤ 1/5                if p1 ≤ 1/2,
    where
      f5(x) := 2 − x − h(x)
               + ((x + 1)(4 − 5x) + x²(3 − 4x) + x³(2 − 3x) + x⁴(1 − x))/(x⁴ + x³ + x² + x + 1)
    is given using an AIFV-5 code.
    [Figure: f5(p1) and the R_AIFV-3 bound plotted against p1]
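Analogously, a sketch (mine, relying on the reconstructions above) of f5 together with a grid check that min{Theorem 3 bound on R_AIFV-3, f5(p1)} stays below 1/5 for p1 ≥ 1/2, mirroring the new theorem.

```python
import math

def h(x):
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def f5(x):
    # f_5 achieved by an explicit AIFV-5 code (Theorem [New])
    num = ((x + 1) * (4 - 5 * x) + x ** 2 * (3 - 4 * x)
           + x ** 3 * (2 - 3 * x) + x ** 4 * (1 - x))
    den = x ** 4 + x ** 3 + x ** 2 + x + 1
    return 2 - x - h(x) + num / den

def aifv3_bound(p1):
    # Theorem 3 bound on R_AIFV-3 (same reconstruction as in the sketch after slide 6)
    golden = (math.sqrt(5) - 1) / 2
    if p1 <= golden:
        val = (3 - 3 * p1 + p1 ** 2) / (2 - p1)
    elif p1 <= 0.6889:
        val = (4 - p1 - 3 * p1 ** 2 + p1 ** 3) / ((2 - p1) * (1 + p1))
    elif p1 <= 0.75:
        val = (3 - p1 - 2 * p1 ** 2 + p1 ** 3) / (1 + p1)
    elif p1 <= 0.8287:
        val = (9 - 5 * p1 ** 2) / (4 * (1 + p1))
    else:
        val = (23 + 24 * p1 - 35 * p1 ** 3) / (12 * (1 + p1 + p1 ** 2))
    return val - h(p1)

worst = 0.0
for k in range(10001):
    p1 = 0.5 + k * 0.4999 / 10000     # grid over [1/2, 0.9999]
    worst = max(worst, min(aifv3_bound(p1), f5(p1)))
print(f"max of min over the grid: {worst:.5f}  (stays below 1/5 = {1/5:.5f})")
```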

18. Proof of Gallager's bound (1. Notation)
    n := |S|, the size of the source alphabet S.
    Each source probability p_i, i = 1, 2, ..., n, is assigned to a leaf of t_Huffman.
    Simple example of a Huffman tree t_Huffman with n = 4:
      S ~ (a, b, c, d) with probabilities (p1, p2, p3, p4) = (1/2, 1/4, 1/8, 1/8).
    [Figure: Huffman tree whose leaves carry the probabilities 1/2, 1/4, 1/8, 1/8]

19. Proof of Gallager's bound (1. Notation)
    n := |S|, the size of the source alphabet S.
    Each source probability p_i, i = 1, 2, ..., n, is assigned to a leaf of t_Huffman.
    Each internal node carries the sum of the probabilities of its two children.
    Number all the nodes except the root from 1 to 2n − 2, in order of decreasing
    probability and increasing depth.
    q_k denotes the probability of node k of t_Huffman, so that q_1 ≥ q_2 ≥ ... ≥ q_{2n−2}.
    Simple example of a Huffman tree t_Huffman with n = 4:
      S ~ (a, b, c, d) with probabilities (p1, p2, p3, p4) = (1/2, 1/4, 1/8, 1/8).
    [Figure: the same Huffman tree with node probabilities q_1 = q_2 = 1/2,
     q_3 = q_4 = 1/4, q_5 = q_6 = 1/8]
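To make the node-numbering convention concrete (a sketch of my own; it uses Python's heapq rather than anything from the paper), the code below builds a Huffman tree for the n = 4 example and lists the node probabilities q_1 ≥ ... ≥ q_{2n−2} of all nodes except the root.

```python
import heapq
import itertools

# The n = 4 example source from the slide
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Standard Huffman construction: repeatedly merge the two least probable nodes.
counter = itertools.count()      # tie-breaker so heapq never compares the node dicts
heap = [(p, next(counter), {"prob": p, "children": []}) for p in probs.values()]
heapq.heapify(heap)
nodes = [node for _, _, node in heap]        # start with the n leaves

while len(heap) > 1:
    pa, _, a = heapq.heappop(heap)
    pb, _, b = heapq.heappop(heap)
    parent = {"prob": pa + pb, "children": [a, b]}
    nodes.append(parent)
    heapq.heappush(heap, (parent["prob"], next(counter), parent))

root = heap[0][2]
nodes.remove(root)               # the root is excluded from the numbering

# q_k: node probabilities in decreasing order, k = 1, ..., 2n - 2
q = sorted((node["prob"] for node in nodes), reverse=True)
print(q)   # expected: [0.5, 0.5, 0.25, 0.25, 0.125, 0.125]
```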
