Heat Equation

Heat Equation, Fan Cheng, John Hopcroft Center for Computer Science and Engineering - PowerPoint PPT Presentation



  1. [Pop-up Salon, Maths, SJTU, 2019/01/10] On the Complete Monotonicity of Heat Equation. Fan Cheng, John Hopcroft Center for Computer Science and Engineering, Shanghai Jiao Tong University. chengfan@sjtu.edu.cn

  2. Overview: (1/2) βˆ‚Β²f(x,t)/βˆ‚xΒ² = βˆ‚f(x,t)/βˆ‚t (heat equation); h(X + √t Z), where Y = X + √t Z, h(X) = βˆ’βˆ« f log f dx, and Z ∼ N(0,1) (Gaussian channel). CMI = 2 (H. P. McKean, 1966); CMI β‰₯ 5. Conjecture: CMI = +∞ (Cheng, 2015)

  3. Outline β–‘ β€œSuper-H” Theorem β–‘ Boltzmann equation and heat equation β–‘ Shannon Entropy Power Inequality β–‘ Complete Monotonicity Conjecture

  4. Fire and Civilization: steam engine; drill; myth: west and east; James Watt; The Wealth of Nations; Independence of the US, 1776

  5. Study of Heat: βˆ‚f(x,t)/βˆ‚t = (1/2) βˆ‚Β²f(x,t)/βˆ‚xΒ² (heat transfer) β–‘ The history begins with the work of Joseph Fourier around 1807 β–‘ In a remarkable memoir, Fourier invented both the heat equation and the method of Fourier analysis for its solution
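
A minimal numerical sketch of this equation in Python (the grid, time step, and initial bump below are illustrative choices, not taken from the talk): an explicit finite-difference scheme that diffuses an initial profile under βˆ‚f/βˆ‚t = (1/2) βˆ‚Β²f/βˆ‚xΒ².

      import numpy as np

      # Explicit finite-difference scheme for f_t = 0.5 * f_xx (illustrative parameters).
      x = np.linspace(-10.0, 10.0, 401)
      dx = x[1] - x[0]
      dt = 0.4 * dx**2                     # satisfies the stability condition 0.5*dt/dx**2 <= 0.5
      f = np.exp(-x**2 / 0.5)              # initial temperature profile: a narrow bump

      for _ in range(2000):
          lap = (np.roll(f, -1) - 2.0 * f + np.roll(f, 1)) / dx**2   # discrete f_xx
          f = f + 0.5 * dt * lap           # one explicit Euler step: the bump spreads out

Running this, the bump flattens and widens while its total mass stays numerically constant, which is the qualitative behavior Fourier's equation describes.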

  6. Information Age: Gaussian channel: Z_t ∼ N(0, t); X and Z are mutually independent; the p.d.f. of X is g(x); Y is the convolution of X and Z_t: Y = X + Z_t. The p.d.f. of Y is f(y; t) = ∫ g(x) (1/√(2Ο€t)) e^{βˆ’(yβˆ’x)Β²/(2t)} dx, and it satisfies βˆ‚f(y; t)/βˆ‚t = (1/2) βˆ‚Β²f(y; t)/βˆ‚yΒ². β€œA mathematical theory of communication,” Bell System Technical Journal, 27 (3): 379–423. Fundamentally, the Gaussian channel and the heat equation are identical in mathematics (Gaussian mixture model)
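
A small numerical sanity check of the claim that the channel output density solves the heat equation; the two-point input X ∈ {βˆ’1, +1}, the grid, and the step sizes below are assumptions made for illustration only.

      import numpy as np

      def output_density(y, t):
          """p.d.f. of Y = X + sqrt(t)*Z for X uniform on {-1, +1} and Z ~ N(0, 1)."""
          kernel = lambda m: np.exp(-(y - m)**2 / (2.0 * t)) / np.sqrt(2.0 * np.pi * t)
          return 0.5 * kernel(-1.0) + 0.5 * kernel(1.0)

      y = np.linspace(-8.0, 8.0, 1601)
      dy, t, dt = y[1] - y[0], 1.0, 1e-4

      f_t  = (output_density(y, t + dt) - output_density(y, t - dt)) / (2.0 * dt)  # d f / d t
      f_yy = np.gradient(np.gradient(output_density(y, t), dy), dy)                # d^2 f / d y^2
      print(np.max(np.abs(f_t - 0.5 * f_yy)))   # small residual: f_t = 0.5 * f_yy up to grid error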

  7. Entropy Formula Second law of thermodynamics: one way only Entropy

  8. Ludwig Boltzmann: Boltzmann formula: S = k_B ln W. Gibbs formula: S = βˆ’k_B βˆ‘_i p_i ln p_i. Boltzmann equation: df/dt = (βˆ‚f/βˆ‚t)_force + (βˆ‚f/βˆ‚t)_diff + (βˆ‚f/βˆ‚t)_coll. Ludwig Eduard Boltzmann, 1844–1906, Vienna, Austrian Empire. H-theorem: H(f(t)) is non-decreasing

  9. β€œSuper H-theorem” for Boltzmann Equation β–‘ Notation β€’ A function is completely monotone (CM) iff the signs of its derivatives alternate: +, βˆ’, +, βˆ’, … (e.g., 1/t, e^{βˆ’t}) β–‘ McKean’s conjecture on the Boltzmann equation (1966): H(f(t)) is CM in t when f(t) satisfies the Boltzmann equation β–‘ False; disproved by E. Lieb in the 1970s β–‘ For the particular Bobylev-Krook-Wu explicit solutions, this β€œtheorem” holds true for n ≀ 101 and breaks down afterwards. H. P. McKean, NYU, National Academy of Sciences
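
A tiny symbolic check of this definition on the two examples named above, 1/t and e^{βˆ’t}; the evaluation point t = 3/2 is an arbitrary choice.

      import sympy as sp

      t = sp.symbols('t', positive=True)
      for f in (1 / t, sp.exp(-t)):                      # the CM examples from the slide
          signs = [sp.sign(sp.diff(f, t, n).subs(t, sp.Rational(3, 2))) for n in range(6)]
          print(f, signs)                                # alternating [1, -1, 1, -1, 1, -1]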

  10. β€œSuper H-theorem” for Heat Equation β–‘ Heat equation: Is H(f(t)) CM in t, if f(t) satisfies the heat equation? β–‘ Equivalently, is h(X + √t Z) CM in t? β–‘ The signs of the first two derivatives were obtained; attempts at the 3rd and 4th failed. (It is easy to compute the derivatives; it is hard to obtain their signs.) β€œThis suggests that……, etc., but I could not prove it” (C. Villani, 2010 Fields Medalist)

  11. Claude E. Shannon and EPI: Central limit theorem; capacity region of the Gaussian broadcast channel; capacity region of the Gaussian Multiple-Input Multiple-Output broadcast channel; uncertainty principle. All of them can be proved by the entropy power inequality (EPI). β–‘ Entropy power inequality (Shannon 1948): For any two independent continuous random variables X and Y, e^{2h(X+Y)} β‰₯ e^{2h(X)} + e^{2h(Y)}. Equality holds iff X and Y are Gaussian. β–‘ Motivation: Gaussian noise is the worst noise. β–‘ Impact: a new characterization of the Gaussian distribution in information theory. β–‘ Comments: most profound! (Kolmogorov)
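
A numerical illustration of the inequality (not from the slides), taking X uniform on [0, 1] so that h(X) = 0 and Z standard normal; the density of X + Z then has the closed form Ξ¦(y) βˆ’ Ξ¦(y βˆ’ 1), and the integration grid is an arbitrary choice.

      import numpy as np
      from scipy.stats import norm

      y = np.linspace(-12.0, 13.0, 200001)
      dy = y[1] - y[0]
      f = norm.cdf(y) - norm.cdf(y - 1.0)            # density of X + Z, X ~ U[0,1], Z ~ N(0,1)
      mask = f > 0
      h_sum = -np.sum(f[mask] * np.log(f[mask])) * dy    # differential entropy h(X + Z)

      lhs = np.exp(2.0 * h_sum)                      # e^{2h(X+Z)}
      rhs = np.exp(2.0 * 0.0) + 2.0 * np.pi * np.e   # e^{2h(X)} + e^{2h(Z)}, h(Z) = 0.5*ln(2*pi*e)
      print(lhs, rhs, lhs > rhs)                     # strict inequality: X is not Gaussian

For Gaussian X the two sides would coincide, which is the equality condition stated above.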

  12. Entropy Power Inequality β–‘ Shannon himself didn’t give a proof but an explanation, which turned out to be wrong β–‘ The first proofs were given by A. J. Stam (1959) and N. M. Blachman (1966) β–‘ Research on EPI: generalizations, new proofs, new connections. E.g., the Gaussian interference channel is open; some stronger β€œEPI” should exist. β–‘ Stanford information theory school: Thomas Cover and his students: A. El Gamal, M. H. Costa, A. Dembo, A. Barron (1980–1990) β–‘ Princeton information theory school: Sergio Verdu, etc. (2000s). Battlefield of Shannon theory.

  13. Ramification of EPI: Gaussian perturbation h(X + √t Z). Shannon EPI. Fisher information: J(X + √t Z) = 2 βˆ‚h(X + √t Z)/βˆ‚t; Fisher information is decreasing in t. Fisher information inequality (FII): 1/J(X+Y) β‰₯ 1/J(X) + 1/J(Y). e^{2h(X + √t Z)} is concave in t. Tight Young’s inequality: β€–X + Yβ€–_r β‰₯ c β€–Xβ€–_p β€–Yβ€–_q. Status quo: FII can imply EPI and all its generalizations. However, life is always hard: FII is far from enough.
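
A rough numerical check of the FII stated above, with X standard normal and Y an illustrative two-component Gaussian mixture (so X + Y is the same mixture with each component variance increased by 1); all parameter choices here are assumptions for the sketch.

      import numpy as np

      def mixture_pdf(w, means, var):
          comps = [np.exp(-(w - m)**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var) for m in means]
          return sum(comps) / len(comps)                  # equal-weight Gaussian mixture density

      def fisher_info(pdf_vals, dw):
          fp = np.gradient(pdf_vals, dw)                  # numerical f'
          return np.sum(fp**2 / pdf_vals) * dw            # J(f) = integral of (f')^2 / f

      w = np.linspace(-12.0, 12.0, 24001)
      dw = w[1] - w[0]
      J_X = fisher_info(mixture_pdf(w, [0.0], 1.0), dw)         # X ~ N(0,1): J close to 1
      J_Y = fisher_info(mixture_pdf(w, [-2.0, 2.0], 0.5), dw)   # Y: bimodal Gaussian mixture
      J_S = fisher_info(mixture_pdf(w, [-2.0, 2.0], 1.5), dw)   # X + Y: component variances add
      print(1.0 / J_S >= 1.0 / J_X + 1.0 / J_Y)                 # FII holds: prints True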

  14. On X + √t Z: X is arbitrary and h(X) may not exist. When t β†’ 0, X + √t Z β†’ X. When t β†’ ∞, X + √t Z β†’ Gaussian. When t > 0, X + √t Z and h(X + √t Z) are infinitely differentiable. X + √t Z has a mixed Gaussian distribution (Gaussian Mixture Model (GMM) in machine learning). X + √t Z is the Gaussian channel/source in information theory. Gaussian noise is the worst additive noise. The Gaussian distribution maximizes h(X). Entropy power inequality, central limit theorem, etc.

  15. Where we take off: β€’ Shannon entropy power inequality β€’ Fisher information inequality β€’ h(X + √t Z) β€’ Information theorists got lost in the past 70 years; mathematicians ignored it β€’ H(f(t)) is CM: when f(t) satisfies the Boltzmann equation, disproved; when f(t) satisfies the heat equation, unknown β€’ We even don’t know what CM is! Motivation: to study some inequalities involving h(X + √t Z) and J(X + √t Z), e.g., their convexity and concavity in t. β€œAny progress?” β€œNone…” It is widely believed that there should be no new EPI except the Shannon EPI and FII.

  16. Discovery: J(X + √t Z) = 2 βˆ‚h(X + √t Z)/βˆ‚t β‰₯ 0 (de Bruijn, 1958); J^{(1)} = βˆ‚J(X + √t Z)/βˆ‚t ≀ 0 (McKean 1966, Costa 1985). Observation: J(X + √t Z) is convex in t. β€’ In the Gaussian case, h(X + √t Z) = (1/2) ln 2Ο€et and J(X + √t Z) = 1/t; J is CM: +, βˆ’, +, βˆ’, … β€’ If the observation is true, the first three derivatives have signs +, βˆ’, +. β€’ Q: Is the 4th-order derivative βˆ’? Because Z is Gaussian! β€’ The signs of the derivatives of h(X + √t Z) are independent of X. Invariant! β€’ Exactly the same problem as in McKean 1966. To convince people, we must prove its convexity.
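
Spelling out the Gaussian baseline mentioned above as a worked special case: for Y_t = √t Z with Z ∼ N(0, 1),

      h(\sqrt{t}\,Z) = \tfrac{1}{2}\ln(2\pi e t), \qquad
      J(\sqrt{t}\,Z) = 2\,\frac{\partial}{\partial t}\,h(\sqrt{t}\,Z) = \frac{1}{t}, \qquad
      \frac{d^{n}}{dt^{n}}\,\frac{1}{t} = \frac{(-1)^{n}\,n!}{t^{\,n+1}},

so the derivative signs alternate +, βˆ’, +, βˆ’, … and J is completely monotone in this case; the conjecture is that the same sign pattern survives for every input X.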

  17. Challenge: Let X ∼ g(x). β€’ h(Y_t) = βˆ’βˆ« f(y, t) ln f(y, t) dy has no closed form except for some special g(x); f(y, t) satisfies the heat equation. β€’ J(Y_t) = ∫ f_1Β²/f dy β€’ J^{(1)}(Y_t) = βˆ’βˆ« f (f_2/f βˆ’ f_1Β²/fΒ²)Β² dy, where f_k = βˆ‚^k f/βˆ‚y^k β€’ So what is J^{(2)}? (Heat equation, integration by parts)
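
A symbolic spot check of these formulas in the one case with a closed form, the heat kernel itself (Y_t = √t Z); this only verifies consistency of the stated identities in that special case, not the general claim.

      import sympy as sp

      y = sp.symbols('y', real=True)
      t = sp.symbols('t', positive=True)
      f = sp.exp(-y**2 / (2 * t)) / sp.sqrt(2 * sp.pi * t)        # density of Y_t = sqrt(t)*Z

      # f solves the heat equation f_t = (1/2) f_yy
      assert sp.simplify(sp.diff(f, t) - sp.diff(f, y, 2) / 2) == 0

      f1, f2 = sp.diff(f, y), sp.diff(f, y, 2)
      J  = sp.integrate(sp.simplify(f1**2 / f), (y, -sp.oo, sp.oo))                        # -> 1/t
      J1 = -sp.integrate(sp.simplify(f * (f2/f - f1**2/f**2)**2), (y, -sp.oo, sp.oo))      # -> -1/t**2
      print(sp.simplify(J), sp.simplify(J1), sp.simplify(J1 - sp.diff(J, t)))              # 1/t, -1/t**2, 0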

  18. Challenge (cont’d) It is trivial to calculate derivatives. It is hard to prove their signs

  19. Breakthrough: Integration by parts: ∫ u dv = uv βˆ’ ∫ v du. First breakthrough since McKean 1966.

  20. GCMC (Gaussian complete monotonicity conjecture): J(X + √t Z) is CM in t. Conjecture: log J(X + √t Z) is convex in t. C. Villani and G. Toscani pointed out the connection with McKean’s paper. A general form: number partitions; hard to determine the coefficients; hard to find Ξ³_{k,j}!
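
Neither statement is proved; the sketch below only probes them numerically for one illustrative input (X uniform on {βˆ’1, +1}), estimating J(X + √t Z) on a grid of t values and checking the first few difference signs and discrete log-convexity.

      import numpy as np

      def J_of_t(t, means=(-1.0, 1.0)):
          """Numerical Fisher information of Y_t = X + sqrt(t)*Z for X uniform on `means`."""
          y = np.linspace(-12.0, 12.0, 24001)
          dy = y[1] - y[0]
          f = sum(np.exp(-(y - m)**2 / (2.0 * t)) for m in means) / (len(means) * np.sqrt(2.0 * np.pi * t))
          fp = np.gradient(f, dy)
          return np.sum(fp**2 / f) * dy

      ts = np.arange(0.5, 3.01, 0.05)
      J = np.array([J_of_t(t) for t in ts])
      d1, d2, d3 = np.diff(J), np.diff(J, 2), np.diff(J, 3)
      print((J > 0).all(), (d1 < 0).all(), (d2 > 0).all(), (d3 < 0).all())   # +, -, +, - pattern
      print((np.diff(np.log(J), 2) > 0).all())                               # log J convex on this grid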

  21. Complete monotone function: How to construct g(x)? A new expression for entropy, involving special functions from mathematical physics (Herbert R. Stahl, 2013)

  22. Complete monotone function β€’ If a function f(t) is CM, then log f(t) is convex in t β€’ If J(Y_t) is CM in t, then log J(Y_t) is convex in t β€’ From a CM function f(t), a Schur-convex function can be obtained β€’ Schur-convexity β†’ majorization theory. Remarks: the current tools in information theory don’t work; more sophisticated tools should be built to attack this problem: a new mathematical theory of information.
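
A standard two-line justification of the first bullet via Bernstein's theorem: a CM function is the Laplace transform of a nonnegative measure, f(t) = ∫₀^∞ e^{βˆ’ts} dΞΌ(s), and the Cauchy-Schwarz inequality gives

      f'(t)^2 \;=\; \Big(\int_0^\infty s\,e^{-ts}\,d\mu(s)\Big)^{2}
      \;\le\; \int_0^\infty e^{-ts}\,d\mu(s)\,\int_0^\infty s^{2}e^{-ts}\,d\mu(s)
      \;=\; f(t)\,f''(t),

hence (log f)'' = (f f'' βˆ’ f'Β²)/fΒ² β‰₯ 0, i.e., log f is convex; applying this to f(t) = J(Y_t) gives the second bullet.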

  23. Potential application: interference channel β–‘ A challenging question: what is the application of GCMC? β–‘ Mathematically speaking, a beautiful result on a fundamental problem will be very useful β–‘ Potential applications (where EPI works): central limit theorem; capacity region of the Gaussian broadcast channel; capacity region of the Gaussian Multiple-Input Multiple-Output broadcast channel; uncertainty principle β–‘ Where EPI fails: the Gaussian interference channel, open since the 1970s. CM is considered to be much more powerful than EPI.

  24. Remarks β–‘ If GCMC is true β–‘ A fundamental breakthrough in mathematical physics, information theory, and any discipline related to the Gaussian distribution β–‘ A new expression for Fisher information β–‘ The derivatives are an invariant β–‘ Though h(X + √t Z) looks very messy, certain regularity exists β–‘ Application: Gaussian interference channel? β–‘ If GCMC is false β–‘ No failure, as the heat equation is a physical phenomenon β–‘ A lucky number (e.g., 2019) where the Gaussian distribution fails. Painful!
