1. On the Polarization of Rényi Entropy
Mengfan Zheng
Based on joint work with Ling Liu and Cong Ling
Dept. of Electrical and Electronic Engineering, Imperial College London
m.zheng@imperial.ac.uk
8 May, 2019

2. Motivation
• Shannon entropy / mutual information: measures information in the average sense; works well in communication theory; insufficient in some other areas, such as cryptography.
• Rényi entropy: a more general family of information measures, widely adopted in cryptography and elsewhere.
• Polarization / polar codes: a powerful tool, well studied under Shannon's information measures.
• The polarization of Rényi entropy is not yet well understood.

3. Outline
1 Preliminaries
  • Shannon's Information Measures
  • From Shannon to Rényi
2 Introduction
  • Channel Coding
  • Polar Codes
3 Polarization of Conditional Rényi Entropy
  • Polarization Result
  • Proof and Discussion
4 Possible Applications in Cryptography
5 Open Problems

4. Outline
1 Preliminaries
2 Introduction
3 Polarization of Conditional Rényi Entropy
4 Possible Applications in Cryptography
5 Open Problems

5. Notations
• $(X, Y) \sim P_{X,Y}$
• $[N]$: the index set $\{1, 2, \ldots, N\}$
• Vectors: $\mathbf{X}$ or $X^{a:b} \triangleq \{X_a, X_{a+1}, \ldots, X_b\}$, where $a \le b$
• $X_{\mathcal{A}}$ ($\mathcal{A} \subset [N]$): the subvector $\{X_i : i \in \mathcal{A}\}$ of $X^{1:N}$
• $G_N = B_N F^{\otimes n}$: the generator matrix of polar codes, where $N = 2^n$, $B_N$ is the bit-reversal matrix, and $F = \begin{bmatrix} 1 & 0 \\ 1 & 1 \end{bmatrix}$
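The generator matrix $G_N$ can be built directly from its definition. A minimal sketch in pure Python (the helper names `kron` and `polar_generator` are my own, not from the slides):

```python
def kron(A, B):
    """Kronecker product of two matrices given as lists of lists."""
    return [[a * b for a in ra for b in rb] for ra in A for rb in B]

def polar_generator(n):
    """G_N = B_N F^{(x)n} for N = 2**n: the n-fold Kronecker power of
    F = [[1,0],[1,1]] with rows permuted by the bit-reversal matrix B_N."""
    F = [[1, 0], [1, 1]]
    G = [[1]]
    for _ in range(n):
        G = kron(G, F)
    N = 2 ** n
    # bit-reversal permutation: reverse the n-bit binary expansion of each row index
    perm = [int(format(i, '0{}b'.format(n))[::-1], 2) for i in range(N)]
    return [G[p] for p in perm]
```

For $n = 2$ this yields the familiar $4 \times 4$ polar transform; entries stay in $\{0, 1\}$, so no reduction mod 2 is needed.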

6. Outline
1 Preliminaries
  • Shannon's Information Measures
  • From Shannon to Rényi
2 Introduction
3 Polarization of Conditional Rényi Entropy
4 Possible Applications in Cryptography
5 Open Problems

7. Shannon Entropy
• (Shannon) entropy:
$H(X) = \mathbb{E}_P\left[\log\frac{1}{P(X)}\right] = -\sum_{x\in\mathcal{X}} P(x)\log P(x)$
• Joint entropy:
$H(X,Y) = -\sum_{x\in\mathcal{X}}\sum_{y\in\mathcal{Y}} P(x,y)\log P(x,y)$
• Conditional entropy:
$H(Y|X) = \sum_{x\in\mathcal{X}} P(x)H(Y|X=x) = -\sum_{x\in\mathcal{X}}\sum_{y\in\mathcal{Y}} P(x,y)\log P(y|x)$
• Chain rule:
$H(X,Y) = H(X) + H(Y|X)$
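These definitions are easy to check numerically. A small self-contained sketch (the joint pmf below is an arbitrary toy example of my own, not from the slides):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a pmf given as a list of probabilities."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# toy joint pmf P(x, y) on {0,1} x {0,1}
P = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
Px = {x: sum(pr for (a, _), pr in P.items() if a == x) for x in (0, 1)}

H_XY = entropy(list(P.values()))                  # joint entropy
H_X = entropy(list(Px.values()))                  # marginal entropy of X
H_Y_given_X = -sum(pr * math.log2(pr / Px[x])     # conditional entropy
                   for (x, _), pr in P.items())
assert abs(H_XY - (H_X + H_Y_given_X)) < 1e-12    # chain rule
```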

8. Relative Entropy and Mutual Information
• The relative entropy, or Kullback–Leibler distance, between two probability mass functions P(x) and Q(x):
$D(P\|Q) = \sum_{x\in\mathcal{X}} P(x)\log\frac{P(x)}{Q(x)} = \mathbb{E}_P\log\frac{P(x)}{Q(x)}$
• Mutual information: the average information that Y gives about X:
$I(X;Y) = \sum_{x\in\mathcal{X}}\sum_{y\in\mathcal{Y}} P(x,y)\log\frac{P(x,y)}{P(x)P(y)} = D(P(x,y)\,\|\,P(x)P(y))$
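Both quantities follow directly from their definitions. The sketch below (toy distributions of my own choosing) computes I(X;Y) as the KL divergence between the joint and the product of marginals:

```python
import math

def kl(p, q):
    """D(P||Q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# toy joint pmf and its marginals
P = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
Px = {x: sum(pr for (a, _), pr in P.items() if a == x) for x in (0, 1)}
Py = {y: sum(pr for (_, b), pr in P.items() if b == y) for y in (0, 1)}

joint = [P[x, y] for x in (0, 1) for y in (0, 1)]
prod = [Px[x] * Py[y] for x in (0, 1) for y in (0, 1)]
I_XY = kl(joint, prod)  # I(X;Y) = D(P(x,y) || P(x)P(y)) >= 0
```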

9. Outline
1 Preliminaries
  • Shannon's Information Measures
  • From Shannon to Rényi
2 Introduction
3 Polarization of Conditional Rényi Entropy
4 Possible Applications in Cryptography
5 Open Problems

10. From Shannon to Rényi
Definition (Rényi Entropy [Rényi'61]): The Rényi entropy of order α of a random variable X ∈ 𝒳 is defined as
$H_\alpha(X) = \frac{1}{1-\alpha}\log\sum_{x\in\mathcal{X}} P_X(x)^\alpha.$ (1)
As α → 1, the Rényi entropy reduces to the Shannon entropy.
Three other special cases of the Rényi entropy:
• Max-entropy: $H_0(X) = \log|\mathcal{X}|$
• Min-entropy: $H_\infty(X) = \min_i(-\log p_i) = -\log\max_i p_i$
• Collision entropy: $H_2(X) = -\log\sum_{i=1}^n p_i^2 = -\log P(X=Y)$, where Y is an i.i.d. copy of X
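Definition (1) and its limiting cases translate directly into code. A minimal sketch (function name mine; entropies in bits):

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha in bits of a pmf p (list of probabilities)."""
    p = [pi for pi in p if pi > 0]
    if alpha == 0:                     # max-entropy: log of the support size
        return math.log2(len(p))
    if alpha == 1:                     # Shannon limit
        return -sum(pi * math.log2(pi) for pi in p)
    if alpha == math.inf:              # min-entropy
        return -math.log2(max(p))
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
# H_alpha is non-increasing in alpha: H_0 >= H_1 >= H_2 >= H_inf
values = [renyi_entropy(p, a) for a in (0, 1, 2, math.inf)]
assert values == sorted(values, reverse=True)
```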

11. Rényi Entropy
Figure: Rényi entropies of a Bern(p) random variable.

12. Rényi Divergence
Definition (Rényi divergence [Rényi'61]): The Rényi divergence of order α of P from another distribution Q on 𝒳 is defined as
$D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log\sum_{x\in\mathcal{X}} P(x)^\alpha Q(x)^{1-\alpha}.$ (2)
Also, as α → 1, the Rényi divergence reduces to the Kullback–Leibler divergence.
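Definition (2) in code, with a numerical check that α close to 1 approaches the KL divergence (the distributions are toy examples of mine):

```python
import math

def renyi_divergence(p, q, alpha):
    """D_alpha(P||Q) in bits, for alpha not in {0, 1, inf}; assumes
    q[i] > 0 wherever p[i] > 0."""
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log2(s) / (alpha - 1)

p, q = [0.5, 0.5], [0.75, 0.25]
kl = sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q))  # KL divergence
# alpha -> 1 recovers KL; D_alpha(P||P) = 0 for any alpha
assert abs(renyi_divergence(p, q, 1.0001) - kl) < 1e-3
assert abs(renyi_divergence(p, p, 2)) < 1e-12
```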

13. Conditional Rényi Entropy
Unlike the conditional Shannon entropy, there is no generally accepted definition of the conditional Rényi entropy yet.
Definition (Conditional Rényi Entropy [Jizba-Arimitsu'04]): The conditional Rényi entropy of order α of X given Y is defined as
$H_\alpha(X|Y) = \frac{1}{1-\alpha}\log\frac{\sum_{(x,y)\in\mathcal{X}\times\mathcal{Y}} P_{X,Y}(x,y)^\alpha}{\sum_{y\in\mathcal{Y}} P_Y(y)^\alpha}.$ (3)
This type of conditional Rényi entropy satisfies the chain rule:
$H_\alpha(X|Y) + H_\alpha(Y) = H_\alpha(X,Y).$ (4)
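The chain rule (4) can be verified numerically. A sketch (the joint pmf is a toy example and the helper names are mine):

```python
import math

def renyi(p, alpha):
    """Unconditional Rényi entropy in bits (alpha != 1)."""
    return math.log2(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

def cond_renyi_ja(P, alpha):
    """Jizba-Arimitsu conditional Rényi entropy H_alpha(X|Y), with the
    joint pmf P given as a dict {(x, y): probability}."""
    Py = {}
    for (_, y), pr in P.items():
        Py[y] = Py.get(y, 0.0) + pr
    num = sum(pr ** alpha for pr in P.values())   # sum_{x,y} P(x,y)^alpha
    den = sum(pr ** alpha for pr in Py.values())  # sum_y P_Y(y)^alpha
    return math.log2(num / den) / (1 - alpha)

P = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
for alpha in (0.5, 2.0, 3.0):
    lhs = cond_renyi_ja(P, alpha) + renyi([0.625, 0.375], alpha)  # + H_a(Y)
    rhs = renyi(list(P.values()), alpha)                          # H_a(X,Y)
    assert abs(lhs - rhs) < 1e-12  # chain rule (4)
```

The chain rule holds exactly here by construction: (3) is precisely $H_\alpha(X,Y) - H_\alpha(Y)$.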

14. Conditional Rényi Entropy (Cont.)
Definition (Conditional Rényi Entropy [Cachin'97]): The conditional Rényi entropy of order α of X given Y is defined as
$H'_\alpha(X|Y) = \sum_{y\in\mathcal{Y}} P_Y(y) H_\alpha(X|Y=y).$ (5)

15. Conditional Rényi Entropy (Cont.)
Definition (Conditional Rényi Entropy [Arimoto'77]): The conditional Rényi entropy of order α of X given Y is defined as
$H^A_\alpha(X|Y) = \frac{\alpha}{1-\alpha}\log\sum_{y\in\mathcal{Y}} P_Y(y)\left[\sum_{x\in\mathcal{X}} P_{X|Y}(x|y)^\alpha\right]^{1/\alpha}$ (6)
Definition (Conditional Rényi Entropy [Hayashi'11]): The conditional Rényi entropy of order α of X given Y is defined as
$H^H_\alpha(X|Y) = \frac{1}{1-\alpha}\log\sum_{y\in\mathcal{Y}} P_Y(y)\sum_{x\in\mathcal{X}} P_{X|Y}(x|y)^\alpha$ (7)
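Since the four definitions genuinely differ, it is instructive to evaluate all of them on a single joint pmf. The sketch below (toy pmf of my own, α = 2) produces four distinct values:

```python
import math

# Evaluate the four conditional Rényi entropies on one toy joint pmf
# (all in bits, alpha != 1); they generally disagree.
P = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
alpha = 2.0

Py = {0: P[0, 0] + P[1, 0], 1: P[0, 1] + P[1, 1]}  # marginal of Y

# (3) [Jizba-Arimitsu'04]
H_JA = math.log2(sum(pr ** alpha for pr in P.values())
                 / sum(pr ** alpha for pr in Py.values())) / (1 - alpha)

# (5) [Cachin'97]: expectation over y of H_alpha(X | Y = y)
def h_given(y):
    cond = [P[x, y] / Py[y] for x in (0, 1)]
    return math.log2(sum(c ** alpha for c in cond)) / (1 - alpha)
H_C = sum(Py[y] * h_given(y) for y in (0, 1))

# (6) [Arimoto'77]
inner = sum(Py[y] * sum((P[x, y] / Py[y]) ** alpha
                        for x in (0, 1)) ** (1 / alpha) for y in (0, 1))
H_A = (alpha / (1 - alpha)) * math.log2(inner)

# (7) [Hayashi'11]
H_H = math.log2(sum(Py[y] * (P[x, y] / Py[y]) ** alpha
                    for (x, y) in P)) / (1 - alpha)

assert len({round(v, 4) for v in (H_JA, H_C, H_A, H_H)}) == 4  # all differ
```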

16. Outline
1 Preliminaries
2 Introduction
3 Polarization of Conditional Rényi Entropy
4 Possible Applications in Cryptography
5 Open Problems

17. Model of Digital Communication
• Source coding: compresses the data to remove redundancy.
• Channel coding: adds redundancy/structure to protect against channel errors.

18. Outline
1 Preliminaries
2 Introduction
  • Channel Coding
  • Polar Codes
3 Polarization of Conditional Rényi Entropy
4 Possible Applications in Cryptography
5 Open Problems

19. The Channel Coding Problem
• Message $m \in \mathcal{M} = \{1, 2, \ldots, M\}$
• Channel input $X \in \mathcal{X}$, output $Y \in \mathcal{Y}$
• Memoryless: $P(y_n \,|\, x^{1:n}, y^{1:n-1}) = P(y_n \,|\, x_n)$
• DMC = Discrete Memoryless Channel
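The binary symmetric channel is the simplest DMC and makes the memoryless property concrete: each output symbol is drawn using only the current input symbol. A minimal toy simulation of my own (not from the slides):

```python
import random

def bsc(bits, p, rng):
    """Binary symmetric channel with crossover probability p: flips each
    input bit independently, so P(y_n | x^{1:n}, y^{1:n-1}) = P(y_n | x_n)."""
    return [b ^ (1 if rng.random() < p else 0) for b in bits]

rng = random.Random(0)
x = [0, 1, 0, 1, 1, 0, 0, 1]
y = bsc(x, 0.1, rng)
assert len(y) == len(x)
```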
