On the Rényi Entropy of Log-Concave Sequences
James Melbourne, University of Minnesota, melbo013@umn.edu
Tomasz Tkocz, Carnegie Mellon University, ttkocz@andrew.cmu.edu
ISIT, June 8, 2020
Outline
- M. & Tkocz. “Reversals of Rényi Entropy Inequalities under Log-Concavity.” arXiv:2005.10930.
1. Definitions
2. Results
3. Methods
Definitions
Rényi Entropy

Definition: For f a density with respect to a measure γ and α ∈ (0, 1) ∪ (1, ∞),

  h_{γ,α}(f) = log(∫ f^α dγ) / (1 − α),
  h_{γ,∞}(f) = −log ‖f‖_{γ,∞},
  h_{γ,1}(f) = h_γ(f) = −∫ f log f dγ,
  h_{γ,0}(f) = log γ(supp(f)).

For X ∼ f, write h_{γ,α}(X) := h_{γ,α}(f).
γ = Lebesgue measure: h_α(f); γ = counting measure: H_α(f).
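For concreteness, the counting-measure case H_α can be sketched in a few lines of Python (an illustrative helper, not code from the talk), handling the limiting orders α ∈ {0, 1, ∞} explicitly:

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha of a probability vector p (counting measure, natural log)."""
    p = [x for x in p if x > 0]
    if alpha == 0:
        return math.log(len(p))                     # log of the support size
    if alpha == 1:
        return -sum(x * math.log(x) for x in p)     # Shannon entropy
    if alpha == math.inf:
        return -math.log(max(p))                    # min-entropy
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)
```

On the uniform distribution over n points every order gives log n; in general H_α is non-increasing in α.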
Log-Concavity
Log-concavity on the integers: an f : Z → [0, ∞) with interval support is log-concave when f(n)² ≥ f(n + 1)f(n − 1). The class is closed under convolution and weak limits. Examples: Bernoulli, Binomial, Poisson, Geometric, Hypergeometric.
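The defining inequality is easy to test numerically. A small sketch (helper names are ours), checking f(n)² ≥ f(n + 1)f(n − 1) together with the interval-support requirement, and confirming it on a Binomial pmf:

```python
from math import comb

def is_log_concave(f):
    """Check a pmf given as a list: interval support and f(n)^2 >= f(n+1)*f(n-1)."""
    pos = [i for i, x in enumerate(f) if x > 0]
    if pos != list(range(pos[0], pos[-1] + 1)):     # support must be an interval
        return False
    return all(f[n] ** 2 >= f[n + 1] * f[n - 1] for n in range(1, len(f) - 1))

# Binomial(10, 0.3), one of the examples listed above
binomial = [comb(10, k) * 0.3 ** k * 0.7 ** (10 - k) for k in range(11)]
```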
Log-concavity

Where it appears:
- Combinatorics: if Σ_{i=0}^n a_i x^i has real roots, then {a_i} is log-concave. [Stanley ’89; Brenti ’89; Brändén ’14]
- Convex geometry: the Alexandrov-Fenchel inequality ⇒ the “intrinsic volumes” associated to convex bodies are log-concave. [Stanley ’81; Amelunxen, Lotz, McCoy, & Tropp ’14; McCoy & Tropp ’14]
- Probability: the theory of negative dependence. [Joag-Dev & Proschan ’83; Pemantle ’00; Borcea, Brändén, & Liggett ’09]
- Information theory: maximum entropy properties of the Poisson. [Johnson ’06; Johnson, Kontoyiannis, & Madiman ’11]
Continuous log-concavity

f : R^d → [0, ∞) is log-concave when f((1 − t)x + ty) ≥ f^{1−t}(x) f^t(y) for all x, y and t ∈ (0, 1). A rich theory at the intersection of functional analysis, convex geometry, and probability, with connections to a multitude of fields — statistics, economics, physics — as well as information theory. For background on the connections to information theory, see [Madiman, M., & Xu ’17].

Inspiration came from the following:

Theorem (Bobkov & Madiman ’11). For X log-concave on R^d and α < β,

  h_β(X) ≤ h_α(X) ≤ h_β(X) + d log( α^{1/(α−1)} / β^{1/(β−1)} ).   (1)
Results
Theorem (M. & Tkocz). For X log-concave on Z and α ∈ [0, ∞],

  H_α(X) < H_∞(X) + log α^{1/(α−1)}.

In particular (α = 1), H(X) < H_∞(X) + log e. Entropy is preserved under rearrangement, but log-concavity is not; Jensen’s inequality gives H_∞(X) ≤ H_α(X).

Theorem (M. & Tkocz). For X a discrete distribution with a log-concave arrangement on Z and α ∈ [0, ∞],

  H_∞(X) ≤ H_α(X) < H_∞(X) + log α^{1/(α−1)}.
The upper bound is strict, and it is sharp: it is approached by Geometric(p) as p → 0.
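The bound is easy to probe numerically. A sketch (illustrative; a truncated Geometric(1/2) pmf stands in for a generic log-concave X) evaluates the gap H_α(X) − H_∞(X) against log α^{1/(α−1)} = log(α)/(α − 1):

```python
import math

def H(p, alpha):
    """Renyi entropy of a probability vector (natural log)."""
    if alpha == math.inf:
        return -math.log(max(p))
    return math.log(sum(x ** alpha for x in p if x > 0)) / (1 - alpha)

# Geometric(1/2), truncated far into the tail (remaining mass is negligible)
geom = [0.5 * 0.5 ** n for n in range(1000)]

gaps = {alpha: H(geom, alpha) - H(geom, math.inf) for alpha in (0.5, 2.0, 5.0)}
bounds = {alpha: math.log(alpha) / (alpha - 1) for alpha in (0.5, 2.0, 5.0)}
```

For each order the gap sits strictly inside [0, log α^{1/(α−1)}), as the theorem predicts.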
Results
Corollary (M. & Tkocz). For X, Y iid and log-concave,

  H_α(X − Y) < H_α(X) + log c(α),   c(α) = 2α^{1/(α−1)} if α ∈ (2, ∞],  c(α) = α^{1/(α−1)} if α ∈ (0, 2].

In particular, H(X − Y) < H(X) + log e. The upper bounds in the Theorem and Corollary are strict, and sharp when α ∈ {2, ∞}: take X to be Geometric(p), p → 0.
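A numerical spot-check of the corollary (illustrative; a truncated Geometric(1/2) plays X, and the pmf of X − Y is formed by direct convolution):

```python
import math

def H(p, alpha):
    """Renyi entropy of a probability vector (natural log)."""
    if alpha == math.inf:
        return -math.log(max(p))
    return math.log(sum(x ** alpha for x in p if x > 0)) / (1 - alpha)

N = 400                                   # truncation point; the tail is negligible
q = 0.5
f = [q * (1 - q) ** n for n in range(N)]  # Geometric(1/2) on {0, 1, ...}

# pmf of X - Y for X, Y iid ~ f: P(X - Y = k) = sum_n f(n) f(n - k)
diff = [sum(f[n] * f[n - k] for n in range(max(k, 0), min(N, N + k)))
        for k in range(-(N - 1), N)]

def c(alpha):
    """Constant from the corollary: 2*alpha^(1/(alpha-1)) above 2, else alpha^(1/(alpha-1))."""
    base = alpha ** (1 / (alpha - 1))
    return 2 * base if alpha > 2 else base
```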
Methods
Technical Definitions
Definition: A density φ on Z is a two-sided geometric distribution for p, q ∈ [0, 1) when

  φ(n) = [(1 − p)(1 − q) / (1 − pq)] · f(n),  with  f(n) = p^n for n ≥ 0 and f(n) = q^{−n} for n ≤ 0,

under the convention 0^0 = 1.
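A quick sketch constructing the (truncated) two-sided geometric pmf and checking that it is normalized, peaks at 0, and is log-concave (helper names are ours):

```python
def two_sided_geometric(p, q, N=500):
    """Truncated two-sided geometric pmf on {-N, ..., N}: C*p^n for n >= 0, C*q^(-n) for n <= 0."""
    C = (1 - p) * (1 - q) / (1 - p * q)
    return {n: C * (p ** n if n >= 0 else q ** (-n)) for n in range(-N, N + 1)}

phi = two_sided_geometric(0.3, 0.6)
```

The normalization works out because Σ_{n≥0} p^n + Σ_{n≤0} q^{−n} − 1 = (1 − pq)/[(1 − p)(1 − q)].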
Technical Definitions
Majorization: a density f majorizes g, written f ≻ g, when

  Σ_{i=1}^k f_i^↓ ≥ Σ_{i=1}^k g_i^↓  holds for all k,

where f^↓ denotes the decreasing rearrangement of f. Write X ≻ Y when X ∼ f, Y ∼ g, and f ≻ g.

Schur-concavity: Φ is Schur-concave when f ≻ g implies Φ(f) ≤ Φ(g). (2)

Rényi entropy is Schur-concave.
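Both notions are easy to check numerically; the sketch below (illustrative helpers) tests majorization via partial sums of decreasing rearrangements and illustrates the Schur-concavity of Rényi entropy on a small example:

```python
import math

def majorizes(f, g, tol=1e-12):
    """f majorizes g: partial sums of the decreasing rearrangements dominate."""
    fs, gs = sorted(f, reverse=True), sorted(g, reverse=True)
    fs += [0.0] * (len(gs) - len(fs))
    gs += [0.0] * (len(fs) - len(gs))
    cf = cg = 0.0
    for a, b in zip(fs, gs):
        cf, cg = cf + a, cg + b
        if cf < cg - tol:
            return False
    return True

def H(p, alpha):
    """Renyi entropy of a probability vector (natural log)."""
    if alpha == math.inf:
        return -math.log(max(p))
    return math.log(sum(x ** alpha for x in p if x > 0)) / (1 - alpha)
```

For f = (0.5, 0.3, 0.2) and g = (0.4, 0.3, 0.3) one has f ≻ g, and every Rényi entropy of f is at most that of g.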
Reduction
Lemma (M. & Tkocz). For Y log-concave, there exists X two-sided geometric with Y ≻ X and H_∞(X) = H_∞(Y).
[Figure: log f for Y ∼ f, together with the envelope of X ∼ a, whose log-density is piecewise linear with slopes log a+ and log a−.]
Reduction
Lemma (M. & Tkocz). For Y log-concave, there exists X two-sided geometric with Y ≻ X and H_∞(X) = H_∞(Y). By Schur-concavity of Rényi entropy, H_α(X) ≥ H_α(Y). The problem is thus reduced to the two-sided geometric case:

  H_α(Y) − H_∞(Y) ≤ H_α(X) − H_∞(X).   (3)
Reduced Problem
Suffices to prove: for p, q ∈ (0, 1),

  H_α(X) − H_∞(X) = (1 / (1 − α)) log [ (1/(1 − p^α) + 1/(1 − q^α) − 1) / (1/(1 − p) + 1/(1 − q) − 1) ] < log α^{1/(α−1)}.   (4)

After some algebra, it is enough to show that (for α ≠ 1)

  F(α) = α ( 1/(1 − p^α) + 1/(1 − q^α) − 1 )

is strictly increasing. Calculus and some substitutions show F′(α) > 0; the case α = 1 follows as a corollary of the argument.
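The monotonicity claim is easy to probe numerically (a sanity check on a grid, not a proof):

```python
def F(alpha, p, q):
    """F(alpha) = alpha * (1/(1 - p^alpha) + 1/(1 - q^alpha) - 1), for 0 < p, q < 1."""
    return alpha * (1 / (1 - p ** alpha) + 1 / (1 - q ** alpha) - 1)

alphas = [0.1 * k for k in range(1, 60)]   # grid over (0, 6), avoiding alpha = 0
```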
Proof
Proof: Given Y, the majorization argument produces a two-sided geometric X such that H_α(Y) − H_∞(Y) ≤ H_α(X) − H_∞(X). A direct argument gives H_α(X) − H_∞(X) < log α^{1/(α−1)}. Direct computation on the Geometric distribution f(n) = (1 − p)^n p shows the bound is approached as p → 0.
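Closed forms for the Geometric(p) case make the p → 0 sharpness easy to verify (an illustrative check; `gap` and `limit` are our names):

```python
import math

def gap(p, alpha):
    """H_alpha - H_infty for Geometric(p), f(n) = (1 - p)^n * p, in closed form."""
    s_alpha = p ** alpha / (1 - (1 - p) ** alpha)   # sum of f(n)^alpha over n >= 0
    return math.log(s_alpha) / (1 - alpha) + math.log(p)

def limit(alpha):
    """The sharp constant log alpha^(1/(alpha - 1))."""
    return math.log(alpha) / (alpha - 1)
```

For alpha = 2 the gap is exactly log(2 − p), which increases to log 2 = limit(2) as p → 0.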
Entropic Rogers-Shephard for Discrete Log-Concave Variables
Theorem (M. & Tkocz). For X and Y iid and log-concave on Z,

  H_α(X − Y) < H_α(X) + log c(α),   (5)

for a universal c(α).

Proof sketch: since X − Y is symmetric with its mode at 0, H_∞(X − Y) = H_2(X). By Rényi entropy comparison, H_α(X − Y) can be compared to H_∞(X − Y) = H_2(X), which in turn can be compared to H_α(X).
Entropic Rogers-Shephard for Discrete Log-Concave Variables
Theorem (M. & Tkocz). For X and Y iid and log-concave on Z,

  H_α(X) ≤ H_α(X − Y) ≤ H_α(X) + log c(α),   (6)

with c(α) = 2α^{1/(α−1)} for α > 2 and c(α) = α^{1/(α−1)} for α ≤ 2.

Consequences:
- H_2(X − Y) < H_2(X) + log 2 (sharp)
- H_∞(X − Y) < H_∞(X) + log 2 (sharp)
- H(X − Y) < H(X) + log e
Entropic Rogers-Shephard for Log-Concave Vectors
Conjecture (Madiman & Kontoyiannis ’15). For X and Y iid log-concave random vectors in R^d,

  h(X − Y) ≤ h(X) + d log 2.   (7)

Rogers-Shephard ’57:

  h_0(X − Y) ≤ h_0(X) + log (2d choose d),   (8)

with equality for X uniform on a simplex. Since (2d choose d) ∼ 4^d, this gives h_0(X − Y) ≤ h_0(X) + d log 4.
Entropic Rogers-Shephard for Log-Concave Vectors
Theorem (M. & Tkocz). For X and Y iid log-concave random vectors in R^d and α ∈ [2, ∞],

  h_α(X) ≤ h_α(X − Y) ≤ h_α(X) + d log 2,   (9)

and when α ∈ (0, 2),

  h_α(X) ≤ h_α(X − Y) ≤ h_α(X) + d log α^{1/(α−1)}.   (10)

For α ≥ 2, (9) is sharp for the exponential distribution (tensorize for the d-dimensional result). Compare: h(X − Y) ≤ h(X) + d log e [Bobkov & Madiman ’11], and h_0(X − Y) ≤ h_0(X) + d log 4 from (2d choose d) ∼ 4^d.
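The sharpness for α ≥ 2 can be seen in closed form with d = 1: if X ∼ Exp(1) then X − Y ∼ Laplace(1), and standard integrals give the Rényi entropies below (a sketch using these textbook formulas, which we verified by hand):

```python
import math

# Closed-form Renyi entropies (natural log):
#   X ~ Exp(1):          integral of f^alpha = 1/alpha            => h_alpha = log(alpha)/(alpha - 1)
#   X - Y ~ Laplace(1):  integral of f^alpha = 2^(1 - alpha)/alpha => h_alpha = log 2 + log(alpha)/(alpha - 1)
def h_exp(alpha):
    return math.log(alpha) / (alpha - 1)

def h_laplace(alpha):
    return math.log(2) + math.log(alpha) / (alpha - 1)
```

The gap h_α(X − Y) − h_α(X) equals log 2 for every order α, matching the d log 2 bound of (9) at d = 1.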
Summary
- For a broad class of discrete variables, Rényi entropies are equivalent up to an additive constant
- Deepens the parallels between discrete and continuous log-concavity theories
- Provides reversals of “Rényi entropy power inequalities”
- Furthers the connections between convex geometric and information theoretic inequalities
The end
Thank you!

An open question:

Conjecture (M. & Tkocz). Let (y_n)_{n=1}^N be a finite positive monotone and concave sequence, that is, y_n ≥ (y_{n−1} + y_{n+1}) / 2 for 1 < n < N. Then for every γ > 0, the function K(t) = (t + γ) Σ_{n=1}^N …