Complete Monotonicity Conjecture of Heat Equation
MDDS, SJTU, 2019 Fan Cheng Shanghai Jiao Tong University
chengfan@sjtu.edu.cn
How to Solve Gaussian Interference Channel: From 2008 to 2019
$$\frac{1}{2}\frac{\partial^2}{\partial y^2} f(y,u) = \frac{\partial}{\partial u} f(y,u), \qquad H = -\int f \log f \,\mathrm{d}y$$
$$Y = X + \sqrt{u}\,Z, \quad Z \sim \mathcal{N}(0,1), \qquad Z_u \sim \mathcal{N}(0,u)$$
Gaussian channel: X and Z are mutually independent; the p.d.f. of X is g(x).
$$Y_u \triangleq X + Z_u$$
The p.d.f. of $Y_u$ is the convolution of $g$ with the Gaussian density:
$$f(z;u) = \int g(y)\,\frac{1}{\sqrt{2\pi u}}\,e^{-\frac{(z-y)^2}{2u}}\,\mathrm{d}y$$
$$\frac{\partial}{\partial u} f(z;u) = \frac{1}{2}\frac{\partial^2}{\partial z^2} f(z;u)$$
The p.d.f. of $Y_u$ is the solution to the heat equation, and vice versa: the Gaussian channel and the heat equation are mathematically identical.
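This equivalence can be sanity-checked numerically. The sketch below is my own toy example (the two-point distribution for X and all names are assumptions, not from the slides): it convolves X with a Gaussian and compares $\partial f/\partial u$ against $\frac{1}{2}\,\partial^2 f/\partial z^2$ by finite differences.

```python
import numpy as np

def output_pdf(z, u):
    """p.d.f. of Y_u = X + Z_u for X uniform on {-1, +1} (toy input)."""
    gauss = lambda m: np.exp(-(z - m)**2 / (2*u)) / np.sqrt(2*np.pi*u)
    return 0.5*gauss(-1.0) + 0.5*gauss(1.0)

z = np.linspace(-6.0, 6.0, 2001)
dz = z[1] - z[0]
u, du = 1.0, 1e-4

f = output_pdf(z, u)
# Left side of the heat equation: df/du (central difference in u)
f_u = (output_pdf(z, u + du) - output_pdf(z, u - du)) / (2*du)
# Right side: d^2 f / dz^2 (central difference in z)
f_zz = (np.roll(f, -1) - 2*f + np.roll(f, 1)) / dz**2
# Compare away from the ends, where np.roll wraps around
err = np.max(np.abs(f_u - 0.5*f_zz)[5:-5])
print(err)  # only finite-difference error remains
```

The residual shrinks with the grid spacing, as expected for a density that exactly satisfies the PDE.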
C. E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal, 1948.
Ludwig Eduard Boltzmann (1844-1906), Vienna, Austrian Empire
$$S = k_B \ln W, \qquad S = -k_B H$$
$$H = -\int f \ln f \,\mathrm{d}v$$
$$\frac{\partial f}{\partial t} = \left(\frac{\partial f}{\partial t}\right)_{\text{force}} + \left(\frac{\partial f}{\partial t}\right)_{\text{diff}} + \left(\frac{\partial f}{\partial t}\right)_{\text{coll}}$$
$H(f(t))$ is non-decreasing.
- Conjecture: $H(f(u))$ is CM in $u$, when $f(u)$ satisfies the Boltzmann equation
- False: disproved by E. Lieb in the 1970s
- For the particular Bobylev-Krook-Wu explicit solutions, this "theorem" holds true for $n \le 101$ and breaks down afterwards
A function $f$ is completely monotone (CM) iff the signs of $f, f', f'', \dots$ alternate: +, -, +, -, ...; equivalently, $(-1)^n f^{(n)}(u) \ge 0$ for all $n \ge 0$ (e.g., $1/u$, $e^{-u}$).
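A quick symbolic illustration of the alternating-sign pattern (my own sympy sketch; checking signs at a single sample point illustrates, but does not prove, complete monotonicity):

```python
import sympy as sp

u = sp.symbols('u', positive=True)

def sign_pattern(f, n=6):
    """Signs of f, f', f'', ... evaluated at u = 1/2 (illustration only)."""
    signs = []
    for k in range(n):
        val = sp.diff(f, u, k).subs(u, sp.Rational(1, 2))
        signs.append('+' if val > 0 else '-')
    return ''.join(signs)

print(sign_pattern(sp.exp(-u)))     # '+-+-+-'  alternating: CM
print(sign_pattern(1/u))            # '+-+-+-'  alternating: CM
print(sign_pattern(sp.log(1 + u)))  # '++-+-+'  not alternating: not CM
```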
- Equivalently, is $H(X + \sqrt{u}\,Z)$ CM in $u$?
- The signs of the first two derivatives were obtained
- Failed to obtain the 3rd and 4th (computing the derivatives is easy; determining their signs is hard)
"This suggests that ..., etc., but I could not prove it"
- Central limit theorem
- Capacity region of the Gaussian broadcast channel
- Capacity region of the Gaussian Multiple-Input Multiple-Output broadcast channel
- Uncertainty principle
Generalization, new proof, new connection. E.g., the Gaussian interference channel.
de Bruijn identity:
$$\frac{\partial}{\partial u} h(X + Z_u) = \frac{1}{2} J(X + Z_u)$$
Fisher information inequality:
$$\frac{1}{J(X+Z)} \ge \frac{1}{J(X)} + \frac{1}{J(Z)}$$
Costa's EPI: $e^{2h(X + Z_u)}$ is concave in $u$.
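For a Gaussian input, Costa's EPI holds with equality: $h(X + Z_u) = \frac{1}{2}\ln 2\pi e(\sigma^2 + u)$, so the entropy power $e^{2h}$ is linear in $u$. A sympy check of this special case (the Gaussian entropy formula is standard; the symbol names are my own):

```python
import sympy as sp

u, s2 = sp.symbols('u sigma2', positive=True)
# Differential entropy of X + Z_u for Gaussian X ~ N(0, sigma2)
h = sp.Rational(1, 2) * sp.log(2*sp.pi*sp.E*(s2 + u))
N = sp.exp(2*h)                       # entropy power
print(sp.simplify(N))                 # 2*pi*E*(sigma2 + u): linear in u
print(sp.simplify(sp.diff(N, u, 2)))  # 0 -> concave, with equality
```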
Derived the first two derivatives by very involved calculus (1986)
The IT society did not know McKean's paper until 2014
Log-Sobolev inequality
The first two derivatives are not commonly used in network information theory
In geometry, mathematicians need the first derivative to estimate the speed
Relation with CLT
- Shannon entropy power inequality
- Fisher information inequality
- $h(X + \sqrt{u}\,Z)$
- $h(Y_u)$ is CM
- When $f(u)$ satisfies the Boltzmann equation: disproved
- When $f(u)$ satisfies the heat equation: unknown
- We don't even know what CM is!
Information theorists got lost in the past 70 years; mathematicians ignored it.
- Raymond introduced this paper to me in 2008
- I made some progress with Chandra Nair in 2011 (MGL)
- Complete monotonicity (CM) was discovered in 2012
- The third derivative in 2013 (key breakthrough)
- The fourth order in 2014
- Recently, CM → GIC
Conjecture (complete monotonicity): $J(X + Z_u)$ is CM in $u$.
$$\frac{\partial}{\partial u} h(X + Z_u) = \frac{1}{2} J(X + Z_u)$$
As $u \to \infty$: $h(X + Z_u) \approx \frac{1}{2}\ln 2\pi e u$ and $J(X + Z_u) \approx \frac{1}{u}$.
$J$ is CM: +, -, +, -, ...
My own opinion:
$$h(Y_u) = -\int f(z,u)\,\ln f(z,u)\,\mathrm{d}z \quad \text{(no closed-form expression)}$$
$$J(Y_u) = \int f \left(\frac{\partial}{\partial z} \ln f\right)^{2} \mathrm{d}z$$
$$\frac{\partial}{\partial u} J(Y_u) = -\int f \left(\frac{\partial^{2}}{\partial z^{2}} \ln f\right)^{2} \mathrm{d}z$$
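These quantities can be checked symbolically in the Gaussian special case $X \sim \mathcal{N}(0,1)$, where $Y_u \sim \mathcal{N}(0, 1+u)$ and $J(Y_u) = 1/(1+u)$. This worked example is my own sketch, not from the slides:

```python
import sympy as sp

z = sp.symbols('z', real=True)
u = sp.symbols('u', positive=True)

# log-density of Y_u = X + Z_u ~ N(0, 1+u) for X ~ N(0, 1)
logf = -z**2 / (2*(1 + u)) - sp.log(sp.sqrt(2*sp.pi*(1 + u)))
f = sp.exp(logf)

# Fisher information: J(Y_u) = integral of f * ((ln f)')^2
J = sp.integrate(f * sp.diff(logf, z)**2, (z, -sp.oo, sp.oo))
# Derivative identity: dJ/du = - integral of f * ((ln f)'')^2
rhs = -sp.integrate(f * sp.diff(logf, z, 2)**2, (z, -sp.oo, sp.oo))

print(sp.simplify(J - 1/(1 + u)))        # 0: J(Y_u) = 1/(1+u)
print(sp.simplify(sp.diff(J, u) - rhs))  # 0: the identity holds here
```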
- The evolution of the entropy and of its successive derivatives along the solution to the heat equation has important consequences.
- Indeed, McKean's argument about the signs of the first two derivatives is equivalent to a proof of the logarithmic Sobolev inequality.
Gaussian optimality for derivatives of differential entropy using linear matrix inequalities
$$J(Y_u) = \int_0^{\infty} e^{-uy}\,\mathrm{d}\mu(y)$$
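Bernstein's theorem says every CM function is the Laplace transform of a nonnegative measure. A concrete worked example (my own, not from the slides): for $X \sim \mathcal{N}(0,1)$, $J(X + Z_u) = 1/(1+u)$, and the representing measure is $\mathrm{d}\mu(y) = e^{-y}\,\mathrm{d}y$:

```python
import sympy as sp

u, y = sp.symbols('u y', positive=True)

# Candidate measure: d mu(y) = e^{-y} dy
laplace = sp.integrate(sp.exp(-u*y) * sp.exp(-y), (y, 0, sp.oo))
print(sp.simplify(laplace - 1/(1 + u)))  # 0: recovers J(X + Z_u) = 1/(1+u)
```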
A new expression for entropy involving special functions in mathematical physics
If $J(Y_u)$ is CM in $u$, then $\log J(Y_u)$ is convex in $u$, since every CM function is log-convex (Conjecture 1 implies ...)
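The CM-implies-log-convex step can be illustrated on a Gaussian worked example (my own sketch, not from the slides): for $X \sim \mathcal{N}(0,1)$, $J(Y_u) = 1/(1+u)$ is CM, and $\log J$ is indeed convex:

```python
import sympy as sp

u = sp.symbols('u', positive=True)
J = 1 / (1 + u)                  # CM: Fisher information for a standard Gaussian input
d2 = sp.diff(sp.log(J), u, 2)    # second derivative of log J
print(sp.simplify(d2))           # 1/(1+u)**2 > 0, so log J is convex in u
```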
A fundamental breakthrough in mathematical physics, information theory, and any discipline related to the Gaussian distribution
A new expression for Fisher information
Derivatives are an invariant
Though $h(X + \sqrt{u}\,Z)$ looks very messy, certain regularity exists
Application: Gaussian interference channel?
No failure, as the heat equation is a physical phenomenon
A Gauss constant (e.g., 2019) where the Gaussian distribution fails. Painful!
- Two fundamental channel coding problems: BC and GIC
- $h(aX_1 + bX_2 + Z_1)$ and $h(cX_1 + dX_2 + Z_2)$ exceed the capability of the EPI
- Han-Kobayashi inner bound
- Many researchers have contributed to this model
- Foundation of wireless communication
In $J(X + Z_u) = \int_0^{\infty} e^{-uy}\,\mathrm{d}\mu(y)$:
- $e^{-uy}$ stands for complete monotonicity
- $\mathrm{d}\mu(y)$ serves as the identity of $J(X + Z_u)$
Fisher Information = Complete Monotonicity + Borel Measure
Very useful in analysis and geometry
The thick shell is removed from Fisher information
$\mathrm{d}\mu(y)$ is relatively easier to study than Fisher information
We know very little about $\mathrm{d}\mu(y)$
The current constraints on $\mathrm{d}\mu(y)$ are too loose
Only the "special one" is useful; otherwise every CM function would have the same meaning in information theory