SLIDE 1

On the Polarization of Rényi Entropy

Mengfan Zheng Based on joint work with Ling Liu and Cong Ling

  • Dept. of Electrical and Electronic Engineering

Imperial College London m.zheng@imperial.ac.uk

8 May, 2019

  • M. Zheng (ICL)

On the Polarization of Rényi Entropy 8 May, 2019 1 / 57

SLIDE 2

Motivation

  • Shannon entropy/Mutual information

– measure information in the average sense
– work well in communication theory
– insufficient in some other areas such as cryptography

  • Rényi entropy: more general, widely adopted in cryptography, etc.
  • Polarization/polar codes: a powerful tool, well studied under Shannon’s information measures
  • Polarization of Rényi entropy is not yet well understood
SLIDE 3

Outline

1. Preliminaries (Shannon’s Information Measures; From Shannon to Rényi)
2. Introduction (Channel Coding; Polar Codes)
3. Polarization of Conditional Rényi Entropy (Polarization Result; Proof and Discussion)
4. Possible Applications in Cryptography
5. Open Problems

SLIDE 4

1. Preliminaries
2. Introduction
3. Polarization of Conditional Rényi Entropy
4. Possible Applications in Cryptography
5. Open Problems

SLIDE 5

Notations

  • (X, Y) ∼ PX,Y
  • [N]: the index set {1, 2, ..., N}
  • Vectors: X or Xa:b ≜ (Xa, Xa+1, ..., Xb), where a ≤ b
  • XA (A ⊂ [N]): the subvector {Xi : i ∈ A} of X1:N
  • GN = BN F^⊗n: the generator matrix of polar codes, where N = 2^n, BN is the bit-reversal matrix, and F = [1 0; 1 1]
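The generator matrix GN can be built in a few lines. A minimal sketch (the helper names `bit_reversal_perm` and `polar_generator` are illustrative, not from the talk):

```python
import numpy as np

def bit_reversal_perm(n):
    """Permutation that reverses the n-bit binary representation of each index."""
    N = 1 << n
    return [int(format(i, f'0{n}b')[::-1], 2) for i in range(N)]

def polar_generator(n):
    """G_N = B_N F^{tensor n} over GF(2), with F = [[1, 0], [1, 1]]."""
    F = np.array([[1, 0], [1, 1]], dtype=np.uint8)
    G = np.array([[1]], dtype=np.uint8)
    for _ in range(n):
        G = np.kron(G, F) % 2                 # builds F^{tensor n}
    # B_N: identity rows reordered by the bit-reversal permutation
    B = np.eye(1 << n, dtype=np.uint8)[bit_reversal_perm(n)]
    return (B @ G) % 2

print(polar_generator(2))  # N = 4
```

Since the bit-reversal permutation is an involution, BN is its own inverse, and GN differs from F^⊗n only by this row reordering.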

SLIDE 6

1. Preliminaries (Shannon’s Information Measures; From Shannon to Rényi)
2. Introduction
3. Polarization of Conditional Rényi Entropy
4. Possible Applications in Cryptography
5. Open Problems

SLIDE 7

Shannon Entropy

(Shannon) Entropy:
H(X) = EP[log 1/P(X)] = − Σ_{x∈X} P(x) log P(x)

Joint entropy:
H(X, Y) = − Σ_{x∈X} Σ_{y∈Y} P(x, y) log P(x, y)

Conditional entropy:
H(Y|X) = Σ_{x∈X} P(x) H(Y|X = x) = − Σ_{x∈X} Σ_{y∈Y} P(x, y) log P(y|x)

Chain rule:
H(X, Y) = H(X) + H(Y|X)

SLIDE 8

Relative Entropy and Mutual Information

The relative entropy or Kullback–Leibler distance between two probability mass functions P(x) and Q(x):

D(P||Q) = Σ_{x∈X} P(x) log [P(x)/Q(x)] = EP[log P(x)/Q(x)]

Mutual information: the average information that Y gives about X:

I(X; Y) = Σ_{x∈X} Σ_{y∈Y} P(x, y) log [P(x, y) / (P(x)P(y))] = D(P(x, y)||P(x)P(y))

SLIDE 9

1. Preliminaries (Shannon’s Information Measures; From Shannon to Rényi)
2. Introduction
3. Polarization of Conditional Rényi Entropy
4. Possible Applications in Cryptography
5. Open Problems

SLIDE 10

From Shannon to Rényi

Definition (Rényi Entropy [Rényi’61])

The Rényi entropy of order α of a random variable X ∈ X is defined as

Hα(X) = (1/(1 − α)) log Σ_{x∈X} PX(x)^α.    (1)

As α → 1, the Rényi entropy reduces to the Shannon entropy. Three other special cases of the Rényi entropy:
  • Max-entropy: H0(X) = log |X|
  • Min-entropy: H∞(X) = min_i (− log p_i) = − log max_i p_i
  • Collision entropy: H2(X) = − log Σ_{i=1}^n p_i² = − log P(X = X′), where X′ is an independent copy of X
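The definition and its special cases can be computed directly. A minimal sketch for a Bern(0.1)-like distribution (`renyi_entropy` is an assumed helper name):

```python
import math

def renyi_entropy(probs, alpha):
    """H_alpha(X) in bits, per (1); alpha = 1 is handled as the Shannon limit."""
    probs = [p for p in probs if p > 0]
    if alpha == 1:
        return -sum(p * math.log2(p) for p in probs)
    if alpha == float('inf'):
        return -math.log2(max(probs))          # min-entropy
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

p = [0.9, 0.1]
print(renyi_entropy(p, 0))             # max-entropy: log2 |X| = 1.0
print(renyi_entropy(p, 2))             # collision entropy: -log2(0.81 + 0.01)
print(renyi_entropy(p, float('inf')))  # min-entropy: -log2 0.9
```

H_α is non-increasing in α, so these values decrease from max-entropy down to min-entropy.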

SLIDE 11

Rényi Entropy

Figure: Rényi entropies of a Bern(p) random variable.

SLIDE 12

Rényi Divergence

Definition (Rényi divergence [Rényi’61])

The Rényi divergence of order α of P from another distribution Q on X is defined as

Dα(P||Q) = (1/(α − 1)) log Σ_{x∈X} P(x)^α Q(x)^{1−α}.    (2)

Also, as α → 1, the Rényi divergence reduces to the Kullback–Leibler divergence.

SLIDE 13

Conditional Rényi Entropy

Unlike the conditional Shannon entropy, there is no generally accepted definition of the conditional Rényi entropy yet.

Definition (Conditional Rényi Entropy [Jizba-Arimitsu’04])

The conditional Rényi entropy of order α of X given Y is defined as

Hα(X|Y) = (1/(1 − α)) log [ Σ_{(x,y)∈X×Y} PX,Y(x, y)^α / Σ_{y∈Y} PY(y)^α ].    (3)

This type of conditional Rényi entropy satisfies the chain rule:

Hα(X|Y) + Hα(Y) = Hα(X, Y).    (4)

SLIDE 14

Conditional Rényi Entropy (Cont.)

Definition (Conditional Rényi Entropy [Cachin’97])

The conditional Rényi entropy of order α of X given Y is defined as

H′α(X|Y) = Σ_{y∈Y} PY(y) Hα(X|Y = y).    (5)

SLIDE 15

Conditional Rényi Entropy (Cont.)

Definition (Conditional Rényi Entropy [Arimoto’77])

The conditional Rényi entropy of order α of X given Y is defined as

H^A_α(X|Y) = (α/(1 − α)) log Σ_{y∈Y} PY(y) [ Σ_{x∈X} PX|Y(x|y)^α ]^(1/α).    (6)

Definition (Conditional Rényi Entropy [Hayashi’11])

The conditional Rényi entropy of order α of X given Y is defined as

H^H_α(X|Y) = (1/(1 − α)) log Σ_{y∈Y} PY(y) Σ_{x∈X} PX|Y(x|y)^α.    (7)
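The four definitions (3), (5), (6), (7) generally disagree, which a small numeric comparison shows (the joint distribution is an assumed illustration):

```python
import math

def renyi(probs, a):
    """Unconditional Rényi entropy in bits, a != 1."""
    return math.log2(sum(p ** a for p in probs if p > 0)) / (1 - a)

Pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
Py = {0: 0.6, 1: 0.4}
a = 2.0

# Jizba-Arimitsu (3)
H_ja = math.log2(sum(p ** a for p in Pxy.values())
                 / sum(p ** a for p in Py.values())) / (1 - a)
# Cachin (5): expectation over y of the per-y Rényi entropies
H_c = sum(Py[y] * renyi([Pxy[(x, y)] / Py[y] for x in (0, 1)], a) for y in (0, 1))
# Arimoto (6)
H_ar = (a / (1 - a)) * math.log2(sum(
    Py[y] * sum((Pxy[(x, y)] / Py[y]) ** a for x in (0, 1)) ** (1 / a)
    for y in (0, 1)))
# Hayashi (7)
H_h = math.log2(sum(
    Py[y] * sum((Pxy[(x, y)] / Py[y]) ** a for x in (0, 1))
    for y in (0, 1))) / (1 - a)

print(H_ja, H_c, H_ar, H_h)  # four generally different values
```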

SLIDE 16

1. Preliminaries
2. Introduction
3. Polarization of Conditional Rényi Entropy
4. Possible Applications in Cryptography
5. Open Problems

SLIDE 17

Model of Digital Communication

  • Source Coding

– Compresses the data to remove redundancy

  • Channel Coding

– Adds redundancy/structure to protect against channel errors

SLIDE 18

1. Preliminaries
2. Introduction (Channel Coding; Polar Codes)
3. Polarization of Conditional Rényi Entropy
4. Possible Applications in Cryptography
5. Open Problems

SLIDE 19

The Channel Coding Problem

  • Message m ∈ M = {1, 2, ..., M}
  • Input X ∈ X, output Y ∈ Y
  • Memoryless: P(yn|x1:n, y1:n−1) = P(yn|xn)
  • DMC = Discrete Memoryless Channel
SLIDE 20

Channel Capacity

  • Capacity of a DMC: C = max_{PX} I(X; Y)

– Mutual information (not entropy itself) is what can be transmitted through the channel
– The maximum is over all possible input distributions PX
– There is only one maximum, since I(X; Y) is concave in P(x) for fixed P(y|x)
– We want to find the PX that maximizes I(X; Y)
– Limits on C: 0 ≤ C ≤ min(H(X), H(Y)) ≤ min(log |X|, log |Y|)

  • Capacity for n uses of the channel: C^(n) = (1/n) max_{PX1:n} I(X1:n; Y1:n)
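The maximization over input distributions can be seen concretely for a BSC, where the maximum lands on the uniform input (a sketch; `bsc_mutual_info` is an assumed helper, not from the talk):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_info(px, eps):
    """I(X;Y) for a BSC(eps) with input P(X=1) = px: I = H(Y) - H(Y|X)."""
    py1 = px * (1 - eps) + (1 - px) * eps
    return h2(py1) - h2(eps)

eps = 0.11
# Scan input distributions: the maximum (the capacity) is at the uniform input.
best_px = max((i / 1000 for i in range(1001)), key=lambda px: bsc_mutual_info(px, eps))
print(best_px, bsc_mutual_info(best_px, eps), 1 - h2(eps))  # capacity = 1 - h2(eps)
```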

SLIDE 21

Channel Coding

  • Assume a Discrete Memoryless Channel (DMC) with known PY|X.
  • An (M, n)-code is

– a fixed set of M codewords x(w) ∈ X^n for w = 1, ..., M
– a deterministic decoder g(y) ∈ {1, ..., M}

  • The rate of an (M, n)-code: R = (log2 M)/n bits/transmission
  • Error probability: λw = P(g(Y) ≠ w | X = x(w)) = Σ_{y∈Y^n} P(y|x(w)) δ_{g(y)≠w}

– Maximum error probability: λ^(n) = max_{1≤w≤M} λw
– Average error probability: P_e^(n) = (1/M) Σ_{w=1}^{M} λw

SLIDE 22

Shannon’s ideas

  • Channel coding theorem: the basic theorem of information theory

– Proved in his original 1948 paper

  • How do you correct all errors?
  • Shannon’s ideas:

– Allow an arbitrarily small but nonzero error probability
– Use the channel many times in succession, so that the law of large numbers comes into effect
– Consider a randomly chosen code and show that the expected average error probability is small
– Use the idea of typical sequences
– Show this implies there exists at least one code with small maximum error probability

  • Sadly, it does not tell you how to construct the code
SLIDE 23

1. Preliminaries
2. Introduction (Channel Coding; Polar Codes)
3. Polarization of Conditional Rényi Entropy
4. Possible Applications in Cryptography
5. Open Problems

SLIDE 24

About Polar Codes

  • The first low-complexity scheme that provably achieves the capacity of binary-input memoryless symmetric channels
  • Encoding complexity O(N log N)
  • Successive cancellation decoding complexity O(N log N)
  • Probability of error ≈ 2^(−√N)
  • Not only good for channel coding; works equally well for source coding and more complicated scenarios
  • Main idea: channel/source polarization
SLIDE 25

What is Polarization

  • Among all channels, there are two classes over which it is easy to communicate optimally:

– The perfect channels: the output Y determines the input X
– The useless channels: Y is independent of X

  • Polarization is a technique to convert noisy channels into a mixture of extreme channels
  • The process is information-conserving
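For the BEC this mixture can be computed exactly: the polar transform turns a BEC(z) into a BEC(2z − z²) and a BEC(z²) (a standard fact from [Arıkan’09]). A sketch showing the erasure probabilities piling up near 0 and 1 while their mean is conserved:

```python
def polarize_bec(z0, n):
    """Erasure probabilities of the 2^n synthetic channels of a BEC(z0)."""
    zs = [z0]
    for _ in range(n):
        zs = [f(z) for z in zs for f in ((lambda z: 2 * z - z * z),  # worse channel
                                         (lambda z: z * z))]          # better channel
    return zs

zs = polarize_bec(0.3, 10)                 # N = 1024 synthetic channels
good = sum(z < 0.1 for z in zs) / len(zs)  # nearly perfect channels
bad = sum(z > 0.9 for z in zs) / len(zs)   # nearly useless channels
mean = sum(zs) / len(zs)
print(good, bad, mean)  # good -> 1 - z0 = 0.7 and bad -> z0 = 0.3 as n grows; mean stays 0.3
```

Conservation is exact at every level, since ((2z − z²) + z²)/2 = z.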
SLIDE 26

Channel Combining and Splitting

Basic operation (N = 2)

  • Channel combining:
  • Channel splitting:
SLIDE 27

Polar Transformation

Recursive polar transformations [Arıkan’09]:

SLIDE 28

Polarization of BEC(0.3), n=1:20

SLIDE 29

Polarization of BEC(0.3), n=1:20 (Cont.)

SLIDE 30

Polarization Theory

Theorem (Channel polarization [Arıkan’09])

For any B-DMC W, the channels {W_N^(i)} polarize in the sense that, for any fixed δ ∈ (0, 1), as N → ∞ through powers of 2, the fraction of indices i ∈ [N] for which I(W_N^(i)) ∈ (1 − δ, 1] goes to I(W), and the fraction with I(W_N^(i)) ∈ [0, δ) goes to 1 − I(W).

SLIDE 31

Polar Codes

  • X1:N: N consecutive channel inputs to a B-DMC W(Y|X).
  • Let U1:N = X1:N GN, where N = 2^n and GN = BN F^⊗n.
  • For δN = 2^(−N^β) with β ∈ (0, 1/2), define

I = {i ∈ [N] : H(Ui | Y1:N, U1:i−1) ≤ δN}.    (8)

  • Assign {ui}_{i∈I} information bits and {ui}_{i∈I^C} frozen bits. Then compute x1:N = u1:N GN.
  • Upon receiving y1:N, the receiver uses a successive cancellation (SC) decoder:

ūi = ui for i ∈ I^C;  ūi = arg max_{u∈{0,1}} P_{Ui|Y1:N, U1:i−1}(u | y1:N, ū1:i−1) for i ∈ I.    (9)
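The product x = u F^⊗n can be computed in O(N log N) with the butterfly recursion behind (8)-(9). A minimal sketch in natural order (Arıkan’s GN additionally applies the bit-reversal BN, which only permutes indices; `polar_encode` is an assumed name):

```python
import numpy as np

def polar_encode(u):
    """Compute x = u F^{tensor n} over GF(2), natural order, via the
    O(N log N) butterfly: each stage maps pairs (a, b) -> (a xor b, b)."""
    x = np.array(u, dtype=np.uint8)
    N = len(x)
    step = 1
    while step < N:
        for i in range(0, N, 2 * step):
            x[i:i + step] ^= x[i + step:i + 2 * step]
        step *= 2
    return x

print(polar_encode([1, 0, 1, 1]))  # -> [1 1 0 1]
```

Since F^⊗n is its own inverse over GF(2), applying the encoder twice recovers the input.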

SLIDE 32

1. Preliminaries
2. Introduction
3. Polarization of Conditional Rényi Entropy
4. Possible Applications in Cryptography
5. Open Problems

SLIDE 33

Existing Results

  • In [Alsan-Telatar’14] it is shown that the following chain-rule inequality holds for the polar transformation for α ≤ 1,

H*α(U1 U2 | Y1 Y2) ≥ H*α(U1 | Y1 Y2) + H*α(U2 | Y1 Y2 U1),

whenever U1, U2 are i.i.d. uniform on F2.
  • The inequality holds with equality if and only if the channel W is perfect, the channel W is completely noisy, or α = 1.
  • Note that H*α(X|Y) in [Alsan-Telatar’14] is defined as

H*α(X|Y) = Hα(X) + (α/(1 − α)) log Σ_{y∈Y} [ Σ_{x∈X} PX(x) PY|X(y|x)^α ]^(1/α).

SLIDE 34

1. Preliminaries
2. Introduction
3. Polarization of Conditional Rényi Entropy (Polarization Result; Proof and Discussion)
4. Possible Applications in Cryptography
5. Open Problems

SLIDE 35

Polarization of Conditional Rényi Entropy (Cont.)

Theorem (Polarization of Conditional Rényi Entropy)

For any B-DMC PY|X (or any discrete memoryless source (X, Y) ∼ PX,Y over X × Y with X = {0, 1} and Y an arbitrary countable set) and any α ≥ 0, for any fixed δ ∈ (0, 1), as N → ∞ through powers of 2, the fraction of indices i ∈ [N] ≜ {1, 2, ..., N} with H_N^(i) ∈ (1 − δ, 1] goes to Hα(X|Y), and the fraction with H_N^(i) ∈ [0, δ) goes to 1 − Hα(X|Y), where H_N^(i) denotes Hα(Ui | Y1:N, U1:i−1).

SLIDE 36

Polarization of Conditional Rényi Entropy (Cont.)

SLIDE 37

1. Preliminaries
2. Introduction
3. Polarization of Conditional Rényi Entropy (Polarization Result; Proof and Discussion)
4. Possible Applications in Cryptography
5. Open Problems

SLIDE 38

Proof

Lemma (Basic polar transformation)

For α ≥ 0, we have

Hα(U2 | Y1 Y2 U1) ≤ min{Hα(X1|Y1), Hα(X2|Y2)},    (10)
Hα(U1 | Y1 Y2) ≥ max{Hα(X1|Y1), Hα(X2|Y2)},    (11)
Hα(U1 U2 | Y1 Y2) = Hα(U1 | Y1 Y2) + Hα(U2 | Y1 Y2 U1).    (12)

SLIDE 39

Proof (Cont.)

Define a random walk {Bn; n ≥ 0} on the infinite binary tree. The random walk starts at the root node with B0 = (0, 1) and moves to one of the two child nodes in the next level with equal probability at each integer time: if Bn = (n, i), then Bn+1 equals (n + 1, 2i − 1) or (n + 1, 2i) with probability 1/2 each.

Figure: The tree process for the recursive channel construction [Arıkan’09].

SLIDE 40

Proof (Cont.)

  • Denote H(0, 1) = Hα(X|Y) and H(n, i) = Hα(Ui | Y1:2^n, U1:i−1) for n ≥ 1, i ∈ [2^n], and define a random process {Hn; n ≥ 0} with Hn = H(Bn).
  • The random process {Hn; n ≥ 0} is a martingale due to the chain-rule equality (12), i.e., E[Hn+1 | B0, B1, ..., Bn] = Hn.
  • Since {Hn; n ≥ 0} is a uniformly integrable martingale, it converges a.e. to an RV H∞ such that E[|Hn − H∞|] → 0. Then we have

E[|Hn − Hn+1|] → 0.    (13)
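For the BEC, Hn is exactly the erasure probability of the current synthetic channel, so the martingale and its convergence can be checked concretely (a sketch only; the general proof above uses the chain rule (12), not this special case):

```python
import random

def children(z):
    """BEC(z): the polar transform gives BEC(2z - z^2) and BEC(z^2);
    for the BEC, the conditional entropy equals the erasure probability."""
    return (2 * z - z * z, z * z)

# Martingale property: the two children average back to the parent, exactly.
print(sum(children(0.3)) / 2)  # 0.3

# Sample paths of {H_n}: uniform random descents of the tree.
random.seed(1)
finals = []
for _ in range(2000):
    h = 0.3
    for _ in range(30):
        h = children(h)[random.random() < 0.5]
    finals.append(h)

near01 = sum(min(h, 1 - h) < 1e-2 for h in finals) / len(finals)
onefrac = sum(h > 0.5 for h in finals) / len(finals)
print(near01, onefrac)  # almost all paths end near {0, 1}; about 0.3 end near 1
```

The fraction of paths ending near 1 approaches H0 = 0.3, exactly as the martingale limit E[H∞] = H0 with H∞ ∈ {0, 1} predicts.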

SLIDE 41

Proof (Cont.)

  • Note that

E[|Hn − Hn+1|] = (1/2) ( E[|Hα(Ui | Y1:2^n, U1:i−1) − Hα(U2i−1 | Y1:2^(n+1), U1:2i−2)|] + E[|Hα(Ui | Y1:2^n, U1:i−1) − Hα(U2i | Y1:2^(n+1), U1:2i−1)|] ).

Thus, (13) forces the inequalities in (10) and (11) to hold with equality.
  • We can then show that in this case, Hα(Ui | Y1:2^n, U1:i−1) tends to either 0 or 1 as n → ∞.
  • This convergence result, together with the chain-rule equality (12), implies that the fraction of {i : H(n, i) ∈ (1 − δ, 1]} goes to Hα(X|Y) as n → ∞.

SLIDE 42

Discussion

  • Under Rényi entropies of different orders, the same synthetic sub-channel may exhibit opposite extremal states.
  • Intuitively, if a synthetic sub-channel is totally deterministic or uniform, its Rényi entropies of different orders should be the same.
  • A paradox?
SLIDE 43

An Example

  • Let |Y| = 2N ≜ M.
  • Consider a PX,Y in which a fraction 1/L of the probability pairs (PX,Y(0, y), PX,Y(1, y)) are completely deterministic, with accumulated probability 1/N.
  • Without loss of generality, assume PX,Y(0, yi) = L/(NM) and PX,Y(1, yi) = 0 for i ∈ A, where A ⊂ [2N] is the set of deterministic pairs.
  • The remaining fraction (L − 1)/L of the probability pairs are completely uniform, i.e., PX,Y(0, yi) = PX,Y(1, yi) = (N − 1)L / (2NM(L − 1)) for i ∈ A^C.
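This construction can be instantiated exactly with rational arithmetic to confirm it is a valid joint distribution (a sketch; `example_joint` is an assumed name, and M/L is assumed integral):

```python
from fractions import Fraction

def example_joint(N, L):
    """The joint P_{X,Y} above: M = 2N outcomes y_i; the first M/L pairs
    are deterministic, the remaining (L-1)/L fraction are uniform."""
    M = 2 * N
    k = M // L                                          # number of deterministic pairs
    det = Fraction(L, N * M)                            # mass of each deterministic pair
    uni = Fraction((N - 1) * L, 2 * N * M * (L - 1))    # per-cell uniform mass
    P = {}
    for i in range(k):
        P[(0, i)] = det
        P[(1, i)] = Fraction(0)
    for i in range(k, M):
        P[(0, i)] = uni
        P[(1, i)] = uni
    return P

P = example_joint(N=8, L=4)
print(sum(P.values()))  # 1: a valid distribution
```

The deterministic pairs indeed carry total probability (M/L)(L/(NM)) = 1/N, and the uniform pairs carry the remaining (N − 1)/N.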

SLIDE 44

An Example (Cont.)

  • Then

Hα(X|Y) = (1/(1 − α)) log [ Σ_{i∈A} (L/(NM))^α + 2^(1−α) Σ_{i∈A^C} ((N−1)L/(NM(L−1)))^α ] / [ Σ_{i∈A} (L/(NM))^α + Σ_{i∈A^C} ((N−1)L/(NM(L−1)))^α ]

= (1/(1 − α)) log [ (M/L)(L/(NM))^α + 2^(1−α) (M(L−1)/L)((N−1)L/(NM(L−1)))^α ] / [ (M/L)(L/(NM))^α + (M(L−1)/L)((N−1)L/(NM(L−1)))^α ]

= (1/(1 − α)) log [ (L − 1)^(α−1) + 2^(1−α) (N − 1)^α ] / [ (L − 1)^(α−1) + (N − 1)^α ].    (14)

SLIDE 45

An Example (Cont.)

  • For a given α = α0 > 1, let

L − 1 = 2^(−1) (N − 1)^((α0 − 0.5/α0)/(α0 − 1)).

  • Then we have

Hα0(X|Y) = (1/(1 − α0)) log [ 2^(1−α0) ( (N − 1)^(α0 − 0.5/α0) + (N − 1)^α0 ) ] / [ 2^(1−α0) (N − 1)^(α0 − 0.5/α0) + (N − 1)^α0 ].

  • It is clear that as N → ∞,

(N − 1)^α0 / (N − 1)^(α0 − 0.5/α0) = (N − 1)^(0.5/α0) → ∞.

  • Thus Hα0(X|Y) → 1.
SLIDE 46

An Example (Cont.)

  • Now let α′ = α0 + 1. From (14) we have

Hα′(X|Y) = (1/(−α0)) log [ 2^(−α0) (N − 1)^((α0² − 0.5)/(α0 − 1)) + 2^(−α0) (N − 1)^(α0 + 1) ] / [ 2^(−α0) (N − 1)^((α0² − 0.5)/(α0 − 1)) + (N − 1)^(α0 + 1) ].

  • In this case, as N → ∞,

(N − 1)^(α0 + 1) / (N − 1)^((α0² − 0.5)/(α0 − 1)) = (N − 1)^(−0.5/(α0 − 1)) → 0.

  • Thus Hα′(X|Y) → 0.
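The two opposite limits can be reproduced numerically from the closed form (14) (a sketch; `H_alpha` is an assumed name):

```python
import math

def H_alpha(N, L, a):
    """Closed form (14) for the example distribution."""
    num = (L - 1) ** (a - 1) + 2 ** (1 - a) * (N - 1) ** a
    den = (L - 1) ** (a - 1) + (N - 1) ** a
    return math.log2(num / den) / (1 - a)

a0 = 2.0
for N in (2 ** 10, 2 ** 20, 2 ** 40):
    L = 1 + 0.5 * (N - 1) ** ((a0 - 0.5 / a0) / (a0 - 1))
    print(N, H_alpha(N, L, a0), H_alpha(N, L, a0 + 1))
# H_{a0} climbs toward 1 while H_{a0 + 1} falls toward 0
```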
SLIDE 47

An Example (Cont.)

Figure: Conditional Rényi entropies Hα(X|Y) of the example for different α (α = 2 and α = 3), plotted against N = 4, 8, ..., 512.

SLIDE 48

1. Preliminaries
2. Introduction
3. Polarization of Conditional Rényi Entropy
4. Possible Applications in Cryptography
5. Open Problems

SLIDE 49

Secure Coding Schemes

  • Weak security scheme (requires (1/N) I(Z1:N; M) → 0):

– Secret information bit positions chosen from L^(N)_{X|Y} \ L^(N)_{X|Z}

  • Strong security scheme (requires I(Z1:N; M) → 0):

– Secret information bit positions chosen from L^(N)_{X|Y} ∩ H^(N)_{X|Z}

where

L^(N)_{X|Y} = {i ∈ [N] : H(Ui | Y1:N, U1:i−1) ≤ δN},
H^(N)_{X|Z} = {i ∈ [N] : H(Ui | Z1:N, U1:i−1) ≥ 1 − δN},
L^(N)_{X|Z} = {i ∈ [N] : H(Ui | Z1:N, U1:i−1) ≤ δN}.

SLIDE 50

Secure Coding Schemes (Cont.)

  • A more general definition of secrecy: ϵ-secrecy with respect to Hα(·|·) [Iwamoto-Shikata’13]:

Hα(M) − Hα(M|C) ≤ ϵ

  • Can we design a polar code that satisfies this condition?
SLIDE 51

Randomness Extraction

  • Source polarization: let X1:N = [X1, ..., XN] be i.i.d. Bern(p), N = 2^n, and U1:N = X1:N GN. Then, for any ϵ ∈ (0, 1),

(1/N) |{j ∈ [N] : H(Uj | U1:j−1) ≥ 1 − ϵ}| → H(p).

  • Using polarization for randomness extraction was discussed in [Abbe’11].
  • Secret-key generation is an application [Chou et al.’15].
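For small N, source polarization can be verified by brute force: enumerate all input vectors, transform them, and compute H(Uj | U1:j−1) exactly (a sketch; natural order u = x F^⊗n is used, since the bit-reversal only permutes the index set; `source_polarization` is an assumed name):

```python
import math
from itertools import product

def polar_transform(u):
    """x = u F^{tensor n} over GF(2), natural order."""
    x = list(u)
    step = 1
    while step < len(x):
        for i in range(0, len(x), 2 * step):
            for j in range(i, i + step):
                x[j] ^= x[j + step]
        step *= 2
    return tuple(x)

def shannon(d):
    return -sum(q * math.log2(q) for q in d.values() if q > 0)

def source_polarization(p, n):
    """Exhaustive H(U_j | U^{1:j-1}) for U = X F^{tensor n}, X_i i.i.d. Bern(p)."""
    N = 2 ** n
    PU = {}
    for xs in product((0, 1), repeat=N):
        pr = 1.0
        for b in xs:
            pr *= p if b else 1 - p
        u = polar_transform(xs)
        PU[u] = PU.get(u, 0.0) + pr
    Hs = []
    for j in range(N):
        pre, cur = {}, {}
        for u, pr in PU.items():
            pre[u[:j]] = pre.get(u[:j], 0.0) + pr
            cur[u[:j + 1]] = cur.get(u[:j + 1], 0.0) + pr
        Hs.append(shannon(cur) - shannon(pre))  # H(U^{1:j}) - H(U^{1:j-1})
    return Hs

Hs = source_polarization(0.1, 3)
print([round(h, 3) for h in Hs])  # values spread toward 0 and 1; mean = H(0.1)
```

The mean of the conditional entropies equals H(p) exactly, since the transform is invertible (entropy conservation).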
SLIDE 52

1. Preliminaries
2. Introduction
3. Polarization of Conditional Rényi Entropy
4. Possible Applications in Cryptography
5. Open Problems

SLIDE 53

Open Problems

  • No chain-rule equality or inequality is known in general
  • What is the polarization rate of Rényi entropy?
  • How to determine the overall Rényi divergence?
  • What about other types of conditional Rényi entropy?
SLIDE 54

Figure: Polarization of different types of conditional Rényi entropies.

SLIDE 55

References

  • A. Rényi (1961). On measures of entropy and information. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics.
  • P. Jizba and T. Arimitsu (2004). The world according to Rényi: thermodynamics of multifractal systems. Annals of Physics, 312(1), 17-59.
  • S. Arimoto (1977). Information measures and capacity of order α for discrete memoryless channels. Topics in Information Theory.
  • M. Hayashi (2011). Exponential decreasing rate of leaked information in universal random privacy amplification. IEEE Transactions on Information Theory, 57(6), 3989-4001.
  • C. Cachin (1997). Entropy measures and unconditional security in cryptography. Doctoral dissertation, ETH Zurich.

SLIDE 56

References (Cont.)

  • E. Arıkan (2009). Channel polarization: A method for constructing capacity-achieving codes for symmetric binary-input memoryless channels. IEEE Transactions on Information Theory, 55(7), 3051-3073.
  • M. Alsan and E. Telatar (2014). Polarization improves E0. IEEE Transactions on Information Theory, 60(5), 2714-2719.
  • M. Iwamoto and J. Shikata (2013). Information theoretic security for encryption based on conditional Rényi entropies.
  • E. Abbe (2011). Randomness and dependencies extraction via polarization. 2011 Information Theory and Applications Workshop, pp. 1-7.
  • R. A. Chou, M. R. Bloch and E. Abbe (2015). Polar coding for secret-key generation. IEEE Transactions on Information Theory, 61(11), 6213-6237.

SLIDE 57

Thank you!
