On the Shamai-Laroia Approximation for the Information Rate of the ISI Channel
Yair Carmon and Shlomo Shamai (Shitz)
Department of Electrical Engineering, Technion - Israel Institute of Technology
2014 Information Theory and Applications Workshop


1. On the Shamai-Laroia Approximation for the Information Rate of the ISI Channel
Yair Carmon and Shlomo Shamai (Shitz)
Department of Electrical Engineering, Technion - Israel Institute of Technology
2014 Information Theory and Applications Workshop, San Diego, USA, February 2014
Acknowledgment: Prof. Tsachy Weissman; FP7 Network of Excellence in Wireless COMmunications NEWCOM#; Israel Science Foundation (ISF).
Y. Carmon and S. Shamai, On the Shamai-Laroia Approximation, ITA 2014

2. Outline
1. Introduction
   - ISI Channel and I.I.D. Information Rate
   - Analytical Lower Bounds on the Information Rate
   - The Shamai-Laroia Conjecture (SLC)
2. Low SNR Analysis
3. Counterexamples
4. High SNR Analysis
5. Conclusion

3. Introduction / ISI Channel and I.I.D. Information Rate
Inter-Symbol Interference Channel Model
Input-output relationship: $y_k = \sum_{l=0}^{L-1} h_l x_{k-l} + n_k$
- Input sequence $x_{-\infty}^{\infty}$ is assumed i.i.d., with $P_x = \mathbb{E} x_0^2$
- Noise sequence $n_{-\infty}^{\infty}$ is Gaussian and i.i.d., with $N_0 = \mathbb{E} n_0^2$
- Inter-symbol interference (ISI) coefficients $h_0^{L-1}$
- Channel frequency response $H(\theta) = \sum_{k=0}^{L-1} h_k e^{-jk\theta}$
- The simplest (non-discrete) model for a channel with memory
- Ubiquitous in wireless and wireline communications
- The results presented here extend straightforwardly to the complex-valued setting
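As a concrete illustration (not part of the slides), the channel model above is straightforward to simulate. The taps below are the example channel used later in the talk; the power, noise level, and block length are my own illustrative choices:

```python
import numpy as np

# Sketch of the ISI channel y_k = sum_{l=0}^{L-1} h_l x_{k-l} + n_k.
# h matches the example channel used later in the talk; P_x, N_0, K are
# arbitrary illustrative values.
rng = np.random.default_rng(0)

h = np.array([0.408, 0.817, 0.408])  # ISI coefficients h_0^{L-1}
P_x, N_0 = 1.0, 0.1                  # input power and noise variance
K = 10_000                           # number of symbols

x = np.sqrt(P_x) * rng.choice([-1.0, 1.0], size=K)  # i.i.d. BPSK input
n = rng.normal(scale=np.sqrt(N_0), size=K)          # i.i.d. Gaussian noise
y = np.convolve(h, x)[:K] + n                       # channel output
```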


5. Introduction / ISI Channel and I.I.D. Information Rate
Mutual Information Rate
- Given by $I = \lim_{K \to \infty} \frac{1}{2K+1} I(y_{-K}^{K}; x_{-K}^{K}) = I(y_{-\infty}^{\infty}; x_0 \mid x_{-\infty}^{-1})$
- $I$ is the rate of reliable communication achievable by a random code with codewords distributed as $x_{-\infty}^{\infty}$
- For Gaussian input, $I$ has a simple expression: $I_{\mathrm{Gaussian}} = \frac{1}{2\pi} \int_{-\pi}^{\pi} \frac{1}{2} \log\left(1 + \frac{P_x |H(\theta)|^2}{N_0}\right) d\theta$
- When the input is distributed on a finite set (a constellation), no closed-form expression is known
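For intuition, $I_{\mathrm{Gaussian}}$ is easy to evaluate numerically: since the integrand is periodic, the integral can be approximated by an average over a uniform frequency grid. A minimal sketch (my own, with illustrative channel and SNR values):

```python
import numpy as np

# Evaluate I_Gaussian = (1/2pi) * integral of (1/2) log(1 + P_x|H(t)|^2/N_0) dt
# (in nats). Channel taps and SNR are illustrative values, not from the slides.
h = np.array([0.408, 0.817, 0.408])
P_x, N_0 = 1.0, 0.1

theta = np.linspace(-np.pi, np.pi, 8192, endpoint=False)  # uniform grid
H = np.polyval(h[::-1], np.exp(-1j * theta))              # H(t) = sum_k h_k e^{-jkt}
I_gaussian = np.mean(0.5 * np.log(1.0 + P_x * np.abs(H) ** 2 / N_0))
print(I_gaussian)  # nats per channel use
```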

6. Introduction / ISI Channel and I.I.D. Information Rate
Approximating the I.I.D. Information Rate
Two main ways to investigate $I$:
1. Approximations and bounds based on Monte-Carlo simulations
   - Provide the best accuracy, but little theoretical insight
   - High computational complexity that grows quickly with the number of dominant ISI taps, cf. [Arnold-Loeliger-Vontobel-Kavcic-Zeng'06]
2. Analytical lower bounds
   - Not as tight as their simulation-based counterparts, but much easier to compute
   - Useful in benchmarking communication schemes and other bounds
   - May provide theoretical insight


8. Introduction / Analytical Lower Bounds on the Information Rate
Data Processing Lower Bounds
- For any sequence of coefficients $a_{-\infty}^{\infty}$, $I = I(y_{-\infty}^{\infty}; x_0 \mid x_{-\infty}^{-1}) \ge I\left(\sum_k a_k y_{-k};\, x_0 \mid x_{-\infty}^{-1}\right)$
- This can be simplified to $I \ge I\left(x_0;\; x_0 + \underbrace{\textstyle\sum_{k \ge 1} \alpha_k x_k + m}_{\text{additive noise term}}\right)$
  with $\alpha_k = \frac{\sum_l a_l h_{-l-k}}{\sum_l a_l h_{-l}}$ and $m \sim \mathcal{N}\left(0,\; \frac{N_0 \sum_l a_l^2}{\left(\sum_l a_l h_{-l}\right)^2}\right)$ independent of $x_0$
- Different choices of $a_{-\infty}^{\infty}$ yield different bounds
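To make the bound's ingredients concrete, here is a small sketch computing the residual-ISI weights $\alpha_k$ and the variance of the Gaussian term $m$ for a finite anticausal filter. The filter $a$ and all parameter values are my own illustrative choices, not an optimized design:

```python
# alpha_k = sum_l a_l h_{-l-k} / sum_l a_l h_{-l}
# Var(m)  = N_0 * sum_l a_l^2 / (sum_l a_l h_{-l})^2
# The filter a below is an arbitrary finite example, not from the slides.
h = {0: 0.408, 1: 0.817, 2: 0.408}  # ISI taps h_0^{L-1}
N_0 = 0.1
a = {-2: 0.1, -1: 0.5, 0: 1.0}      # anticausal receive-filter taps a_l

def tap(coeffs, i):
    """Coefficient at index i; zero outside the finite support."""
    return coeffs.get(i, 0.0)

gain = sum(a_l * tap(h, -l) for l, a_l in a.items())  # sum_l a_l h_{-l}
alpha = {k: sum(a_l * tap(h, -l - k) for l, a_l in a.items()) / gain
         for k in range(1, 5)}                        # residual-ISI weights
var_m = N_0 * sum(a_l ** 2 for a_l in a.values()) / gain ** 2
print(alpha, var_m)
```

With these three taps the residual ISI dies out after $k = 2$, reflecting the finite channel memory.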

9. Introduction / Analytical Lower Bounds on the Information Rate
Data Processing with a Sample Whitened Matched Filter
- $a_{-\infty}^{\infty}$ is chosen so that $\alpha_k = 0$ for every $k > 0$ (non-causal ISI eliminated); in this case the noise term is purely Gaussian
- The resulting bound was first proposed in [Shamai-Ozarow-Wyner'91]: $I \ge I(x_0; x_0 + m) = I_x(\mathrm{SNR}_{\mathrm{ZF\text{-}DFE}})$
  with $I_x(\gamma)$ the mutual information of a scalar Gaussian channel at SNR $\gamma$ with input $x_0$, and $\mathrm{SNR}_{\mathrm{ZF\text{-}DFE}}$ the output SNR of the zero-forcing decision-feedback equalizer (DFE):
  $\mathrm{SNR}_{\mathrm{ZF\text{-}DFE}} = \frac{P_x}{N_0} \exp\left(\frac{1}{2\pi} \int_{-\pi}^{\pi} \log |H(\theta)|^2 \, d\theta\right)$
- Very simple, but quite loose at medium and low SNRs
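The ZF-DFE output SNR is again a single log-integral. A numerical sketch with the same illustrative channel and noise level as before (a fine grid is used because $\log|H(\theta)|^2$ dips sharply where $|H|$ is nearly zero):

```python
import numpy as np

# SNR_ZF-DFE = (P_x/N_0) * exp( (1/2pi) * integral of log|H(t)|^2 dt ).
# Channel and noise values are illustrative, not from the slides.
h = np.array([0.408, 0.817, 0.408])
P_x, N_0 = 1.0, 0.1

theta = np.linspace(-np.pi, np.pi, 1 << 16, endpoint=False)
H = np.polyval(h[::-1], np.exp(-1j * theta))
snr_zf_dfe = (P_x / N_0) * np.exp(np.mean(np.log(np.abs(H) ** 2)))
print(snr_zf_dfe)
```

For this channel the exponential factor is well below 1, illustrating how severe ISI pulls the ZF-DFE SNR far beneath $P_x/N_0$.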

10. Introduction / Analytical Lower Bounds on the Information Rate
Data Processing with a Mean-Square WMF
- Choose $a_{-\infty}^{\infty}$ so that the noise term has minimum variance: $I \ge I\left(x_0;\; x_0 + \underbrace{\textstyle\sum_{k \ge 1} \hat{\alpha}_k x_k + \hat{m}}_{\text{min variance}}\right) \triangleq I_{\mathrm{MMSE}}$
- A tight bound in many cases, but still difficult to compute and analyze
- Some techniques for further bounding have been proposed:
  - Probability-of-error bounds and Fano's inequality [Shamai-Laroia'96]
  - A mismatched mutual information approach [Jeong-Moon'12]
- However, none of the resulting bounds is both simple and tight


12. Introduction / The Shamai-Laroia Conjecture (SLC)
The Shamai-Laroia Conjecture
- [Shamai-Laroia'96] conjectured that $I_{\mathrm{MMSE}}$ is lower bounded by replacing $x_1^{\infty}$ with $g_1^{\infty}$, i.i.d. Gaussian of equal variance:
  $I_{\mathrm{MMSE}} = I\left(x_0;\, x_0 + \sum_{k \ge 1} \hat{\alpha}_k x_k + \hat{m}\right) \ge I\left(x_0;\, x_0 + \sum_{k \ge 1} \hat{\alpha}_k g_k + \hat{m}\right) = I_x(\mathrm{SNR}_{\mathrm{MMSE\text{-}DFE\text{-}U}}) \triangleq I_{\mathrm{SL}}$
- $\mathrm{SNR}_{\mathrm{MMSE\text{-}DFE\text{-}U}}$ is the output SNR of the unbiased MMSE DFE:
  $\mathrm{SNR}_{\mathrm{MMSE\text{-}DFE\text{-}U}} = \exp\left(\frac{1}{2\pi} \int_{-\pi}^{\pi} \log\left(1 + \frac{P_x |H(\theta)|^2}{N_0}\right) d\theta\right) - 1$
- $I_{\mathrm{SL}}$ is a simple, tight, and useful approximation for $I_{\mathrm{MMSE}}$ and $I$
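The conjectured bound is just as easy to evaluate numerically. A sketch of the unbiased MMSE-DFE SNR for the same illustrative channel and noise values:

```python
import numpy as np

# SNR_MMSE-DFE-U = exp( (1/2pi) * integral of log(1 + P_x|H(t)|^2/N_0) dt ) - 1.
# Channel and noise values are illustrative, not from the slides.
h = np.array([0.408, 0.817, 0.408])
P_x, N_0 = 1.0, 0.1

theta = np.linspace(-np.pi, np.pi, 8192, endpoint=False)
H = np.polyval(h[::-1], np.exp(-1j * theta))
snr_mmse_dfe_u = np.exp(np.mean(np.log(1.0 + P_x * np.abs(H) ** 2 / N_0))) - 1.0
print(snr_mmse_dfe_u)
```

Note that the exponent is exactly twice $I_{\mathrm{Gaussian}}$ in nats, so $\mathrm{SNR}_{\mathrm{MMSE\text{-}DFE\text{-}U}} = e^{2 I_{\mathrm{Gaussian}}} - 1$.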

13. Introduction / The Shamai-Laroia Conjecture (SLC)
The Shamai-Laroia Conjecture: Example
BPSK input, $h = [0.408, 0.817, 0.408]$ (moderate ISI severity)
[Figure: Information (bits) vs. $P_x/N_0$ (dB) for $I_{\mathrm{MMSE}}$, $I_{\mathrm{SLC}}$, the Gaussian upper bound, and the Shamai-Ozarow-Wyner lower bound]
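A curve like $I_{\mathrm{SLC}}$ above needs no BCJR-style simulation: it only requires the scalar BPSK mutual information evaluated at $\mathrm{SNR}_{\mathrm{MMSE\text{-}DFE\text{-}U}}$. A Monte-Carlo sketch (the MI estimator is my own assumed helper, not the authors' code), evaluated at one illustrative $P_x/N_0$:

```python
import numpy as np

# I_SL = I_x(SNR_MMSE-DFE-U) for BPSK input on h = [0.408, 0.817, 0.408].
# The Monte-Carlo MI estimator below is an assumed helper, not from the talk.
rng = np.random.default_rng(1)

def bpsk_mi_bits(snr, n_samples=200_000):
    """MI (bits) of y = sqrt(snr)*x + z with x uniform on {-1,+1}, z ~ N(0,1)."""
    A = np.sqrt(snr)
    z = rng.normal(size=n_samples)
    # By symmetry, I = log 2 - E[ log(1 + exp(-2*A*z - 2*A^2)) ] in nats
    return np.mean(np.log(2.0) - np.logaddexp(0.0, -2.0 * A * z - 2.0 * A ** 2)) / np.log(2.0)

h = np.array([0.408, 0.817, 0.408])
P_x, N_0 = 1.0, 0.1  # i.e. P_x/N_0 = 10 dB
theta = np.linspace(-np.pi, np.pi, 8192, endpoint=False)
H = np.polyval(h[::-1], np.exp(-1j * theta))
snr_sl = np.exp(np.mean(np.log(1.0 + P_x * np.abs(H) ** 2 / N_0))) - 1.0
I_SL = bpsk_mi_bits(snr_sl)
print(I_SL)  # bits per channel use; approaches 1 as SNR grows
```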

14. Introduction / The Shamai-Laroia Conjecture (SLC)
The Shamai-Laroia Conjecture: Example, cont.
BPSK input, $h = [0.408, 0.817, 0.408]$ (moderate ISI severity)
[Figure: zoomed-in views of the same curves ($I_{\mathrm{MMSE}}$, $I_{\mathrm{SLC}}$, Gaussian upper bound, Shamai-Ozarow-Wyner lower bound) over $P_x/N_0 \approx$ 4.6-6.2 dB and 10-16 dB]

15. Introduction / The Shamai-Laroia Conjecture (SLC)
The "Strong SLC" and its Refutation
- A stronger version of the SLC reads $I\left(x_0;\, x_0 + \sum_{k \ge 1} \alpha_k x_k + m\right) \ge I\left(x_0;\, x_0 + \sum_{k \ge 1} \alpha_k g_k + m\right)$ for every $\alpha_{-\infty}^{\infty}$ and $m$ (not just $\hat{\alpha}_{-\infty}^{\infty}$ and $\hat{m}$)
- [Abbe-Zheng'12] gave a counterexample, based on a geometric tool using the Hermite polynomials
- It cannot straightforwardly refute the original SLC, since it uses continuous-valued input distributions (the finite-alphabet case is the more interesting one)

