Géométrie Stochastique et Théorie de l'Information (Stochastic Geometry and Information Theory)
  1. GÉOMÉTRIE STOCHASTIQUE ET THÉORIE DE L'INFORMATION
     F. Baccelli, INRIA & ENS
     In collaboration with V. Anantharam, UC Berkeley
     SMAI 2011, May 2011

  2. Structure of the Lecture
     – Shannon Capacity and Error Exponents for Point Processes
       – Additive White Gaussian Noise (AWGN)
       – Additive Stationary Ergodic Noise (ASEN)
     – Shannon Capacity and Error Exponents for Additive Noise Channels with Power Constraints

  3. AWGN DISPLACEMENT OF A POINT PROCESS
     – $\mu_n$: (simple) stationary ergodic point process on $\mathbb{R}^n$.
     – $\lambda_n = e^{nR}$: intensity of $\mu_n$.
     – $\{T^n_k\}$: points of $\mu_n$ (codewords).
     – $\mathbb{P}^0_n$: Palm probability of $\mu_n$.
     – $\{D^n_k\}$: i.i.d. sequence of displacements, independent of $\mu_n$; $D^n_k = (D^n_k(1), \ldots, D^n_k(n))$, i.i.d. over the coordinates and $\mathcal{N}(0, \sigma^2)$ (noise).
     – $Z^n_k = T^n_k + D^n_k$: displacement of the p.p. (received messages).
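This construction is easy to sample. Below is a minimal sketch, assuming NumPy; all names and parameter values (n, R, sigma, the window width) are mine, chosen for illustration, not from the talk. It draws one realization of the model: a Poisson codebook of intensity $e^{nR}$ restricted to a cubic window, displaced by i.i.d. Gaussian coordinates.

```python
import numpy as np

# One realization of the displaced point process (illustrative parameters).
rng = np.random.default_rng(0)
n, R, sigma, half_width = 8, 0.5, 0.05, 1.0
lam = np.exp(n * R)                               # intensity lambda_n = e^{nR}
m = rng.poisson(lam * (2 * half_width) ** n)      # Poisson number of codewords in the window
T = rng.uniform(-half_width, half_width, (m, n))  # codewords T_k^n of mu_n
D = rng.normal(0.0, sigma, (m, n))                # displacements D_k^n, N(0, sigma^2) coordinates
Z = T + D                                         # received messages Z_k^n
print(m, T.shape, Z.shape)
```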

  4. AWGN UNDER MLE DECODING
     – $V^n_k$: Voronoi cell of $T^n_k$ in $\mu_n$.
     – Error probability under MLE decoding:
       $$p_e(n) = \lim_{A\to\infty} \frac{\sum_k 1_{T^n_k \in B_n(0,A)}\, 1_{Z^n_k \notin V^n_k}}{\sum_k 1_{T^n_k \in B_n(0,A)}} = \mathbb{P}^0_n(Z^n_0 \notin V^n_0) = \mathbb{P}^0_n(D^n_0 \notin V^n_0),$$
       the last equality since $T^n_0 = 0$ under $\mathbb{P}^0_n$.
     Theorem 1-wgn (Poltyrev [94]):
     1. If $R < -\frac{1}{2}\log(2\pi e\sigma^2)$, there exists a sequence of point processes $\mu_n$ (e.g. Poisson) with intensity $e^{nR}$ such that $p_e(n) \to 0$ as $n \to \infty$.
     2. If $R > -\frac{1}{2}\log(2\pi e\sigma^2)$, for all sequences of point processes $\mu_n$ with intensity $e^{nR}$, $p_e(n) \to 1$ as $n \to \infty$.
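Since Gaussian MLE decoding is exactly nearest-codeword decoding, the error event $Z^n_k \notin V^n_k$ can be tested with a nearest-neighbour query. The sketch below (assuming SciPy; function name, window sizing, and the inner-box trick for edge effects are mine) gives a crude Monte Carlo estimate of $p_e(n)$ on either side of the Poltyrev threshold. At small $n$ the theorem's dichotomy is only a trend, not a sharp 0/1 split.

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_pe(n, R, sigma, target_points=2e5, seed=0):
    """Monte Carlo estimate of p_e(n): fraction of codewords whose displaced
    version is decoded (nearest codeword = MLE for Gaussian noise) to some
    other codeword.  The window is sized for ~target_points codewords, and
    only codewords in an inner box are scored, to limit edge effects."""
    rng = np.random.default_rng(seed)
    lam = np.exp(n * R)                                   # intensity e^{nR}
    half_width = 0.5 * (target_points / lam) ** (1.0 / n)
    m = rng.poisson(lam * (2 * half_width) ** n)
    T = rng.uniform(-half_width, half_width, (m, n))
    Z = T + rng.normal(0.0, sigma, (m, n))                # received points
    _, nearest = cKDTree(T).query(Z)                      # MLE decision for each Z_k
    inner = np.all(np.abs(T) < 0.8 * half_width, axis=1)
    return np.mean(nearest[inner] != np.flatnonzero(inner))

sigma = 0.05
print(-0.5 * np.log(2 * np.pi * np.e * sigma**2))  # Poltyrev threshold, ~1.58 nats/dim
print(estimate_pe(n=8, R=0.8, sigma=sigma))        # R below threshold: small p_e
print(estimate_pe(n=8, R=2.0, sigma=sigma))        # R above threshold: large p_e
```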

  5. Proof of 2 [AB 08]
     – $V_n(r)$: volume of the $n$-ball of radius $r$.
     – By monotonicity arguments, if $L_n$ is defined by $|V^n_0| = V_n(\sqrt{n}\,L_n)$, then
       $$\mathbb{P}^0_n(D^n_0 \notin V^n_0) \ge \mathbb{P}^0_n\big(D^n_0 \notin B_n(0, \sqrt{n}\,L_n)\big) = \mathbb{P}^0\Big(\frac{1}{n}\sum_{i=1}^n D^n_0(i)^2 \ge L_n^2\Big).$$
     – By the SLLN,
       $$\mathbb{P}^0\Big(\Big|\frac{1}{n}\sum_{i=1}^n D^n_0(i)^2 - \sigma^2\Big| \ge \epsilon\Big) = \eta_\epsilon(n) \to_{n\to\infty} 0.$$
     – Hence
       $$\mathbb{P}^0_n(D^n_0 \notin V^n_0) \ge \mathbb{P}^0_n\big(\sigma^2 - \epsilon \ge L_n^2\big) - \eta_\epsilon(n) = 1 - \mathbb{P}^0_n\Big(|V^n_0| > V_n\big(\sqrt{n(\sigma^2-\epsilon)}\big)\Big) - \eta_\epsilon(n).$$

  6. Proof of 2 [AB 08] (continued)
     – By the Markov inequality,
       $$\mathbb{P}^0_n\Big(|V^n_0| > V_n\big(\sqrt{n(\sigma^2-\epsilon)}\big)\Big) \le \frac{\mathbb{E}^0_n(|V^n_0|)}{V_n\big(\sqrt{n(\sigma^2-\epsilon)}\big)}.$$
     – By classical results on the Voronoi tessellation,
       $$\mathbb{E}^0_n(|V^n_0|) = \frac{1}{\lambda_n} = e^{-nR}.$$
     – By classical results,
       $$V_n(r) = \frac{\pi^{n/2}\, r^n}{\Gamma(\tfrac n2 + 1)} \sim \frac{r^n}{\sqrt{\pi n}}\Big(\frac{2\pi e}{n}\Big)^{n/2}.$$
     – Hence
       $$\frac{\mathbb{E}^0_n(|V^n_0|)}{V_n\big(\sqrt{n(\sigma^2-\epsilon)}\big)} \sim e^{-nR}\, e^{-\frac n2 \log(2\pi e(\sigma^2-\epsilon))} \to_{n\to\infty} 0,$$
       since $R > -\frac12\log(2\pi e\sigma^2)$ (take $\epsilon$ small enough that $R > -\frac12\log(2\pi e(\sigma^2-\epsilon))$ as well).
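The ball-volume asymptotics driving this bound are easy to verify numerically. A minimal sketch, assuming SciPy (the function names are mine): it compares the exact $\log V_n(r)$ with the Stirling approximation used on the slide.

```python
import numpy as np
from scipy.special import gammaln

def log_Vn(n, r):
    """Exact log-volume of the n-ball: V_n(r) = pi^{n/2} r^n / Gamma(n/2 + 1)."""
    return 0.5 * n * np.log(np.pi) + n * np.log(r) - gammaln(0.5 * n + 1.0)

def log_Vn_stirling(n, r):
    """Stirling approximation: V_n(r) ~ (2*pi*e/n)^{n/2} r^n / sqrt(pi*n)."""
    return 0.5 * n * np.log(2 * np.pi * np.e / n) + n * np.log(r) - 0.5 * np.log(np.pi * n)

for n in (10, 100, 1000):
    print(n, log_Vn(n, 1.0), log_Vn_stirling(n, 1.0))  # agreement improves with n

# With r = sqrt(n*(sigma^2 - eps)) this gives
# log V_n(r) = (n/2) * log(2*pi*e*(sigma^2 - eps)) + o(n),
# which is exactly the exponent in the Markov bound above.
```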

  7. AWN DISPLACEMENT OF A POINT PROCESS
     Same framework as above concerning the p.p. $\mu_n$.
     – $\{D^n_k\}$: i.i.d. sequence of centered displacements, independent of $\mu_n$.
     – $D^n_k = (D^n_k(1), \ldots, D^n_k(n))$: i.i.d. coordinates with a density $f$ with well-defined differential entropy
       $$h(D) = -\int_{\mathbb{R}} f(x)\log(f(x))\,dx.$$
     If $D$ is $\mathcal{N}(0,\sigma^2)$, then $h(D) = \frac12\log(2\pi e\sigma^2)$.
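The Gaussian closed form can be cross-checked against the defining integral by quadrature. A minimal sketch, assuming SciPy; the helper name and the value of sigma are mine:

```python
import numpy as np
from scipy.integrate import quad

def h_numeric(f, lo, hi):
    """Differential entropy h(D) = -integral of f*log(f), by quadrature."""
    val, _ = quad(lambda x: -f(x) * np.log(f(x)), lo, hi)
    return val

sigma = 1.5
gauss = lambda x: np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
# f > 0 everywhere here, so -f*log(f) is well defined on the whole range.
print(h_numeric(gauss, -30, 30))                  # numerical value
print(0.5 * np.log(2 * np.pi * np.e * sigma**2))  # closed form; should match
```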

  8. AWN UNDER TYPICALITY DECODING
     Aim: for all $n$, find a partition $\{C^n_k\}$ of $\mathbb{R}^n$, jointly stationary with $\mu_n$, such that
     $$p_e(n) = \mathbb{P}^0_n(D^n_0 \notin C^n_0) \to_{n\to\infty} 0.$$
     Theorem 1-wn:
     1. If $R < -h(D)$, there exists a sequence of point processes $\mu_n$ (e.g. Poisson) with intensity $e^{nR}$ and a partition such that $p_e(n) \to 0$ as $n\to\infty$.
     2. If $R > -h(D)$, for all sequences of point processes $\mu_n$ with intensity $e^{nR}$ and all jointly stationary partitions, $p_e(n) \to 1$ as $n\to\infty$.

  9. Proof of 1
     Let $\mu_n$ be a Poisson p.p. with intensity $e^{nR}$, with $R + h(D) < 0$. For all $n$ and $\delta$, let
     $$A^n_\delta = \Big\{(y(1),\ldots,y(n)) \in \mathbb{R}^n : \Big| -\frac1n \sum_{i=1}^n \log f(y(i)) - h(D) \Big| < \delta \Big\}.$$
     By the SLLN, $\mathbb{P}^0_n\big((D^n_0(1),\ldots,D^n_0(n)) \in A^n_\delta\big) \to_{n\to\infty} 1$.
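This SLLN statement is concrete enough to simulate. A minimal sketch for Gaussian noise (function name, sample sizes, and the choice of delta are mine): it estimates the probability that a noise vector is typical, which should climb to 1 as $n$ grows.

```python
import numpy as np

def typical_fraction(n, delta, sigma, trials=2000, seed=0):
    """Fraction of i.i.d. N(0, sigma^2) vectors D^n lying in A^n_delta,
    i.e. with -1/n * sum_i log f(D(i)) within delta of h(D)."""
    rng = np.random.default_rng(seed)
    D = rng.normal(0.0, sigma, size=(trials, n))
    # -log f(x) = 0.5*log(2*pi*sigma^2) + x^2/(2*sigma^2) for the Gaussian density
    neg_log_f = 0.5 * np.log(2 * np.pi * sigma**2) + D**2 / (2 * sigma**2)
    emp = neg_log_f.mean(axis=1)                    # empirical -1/n sum log f
    h = 0.5 * np.log(2 * np.pi * np.e * sigma**2)   # h(D) for N(0, sigma^2)
    return np.mean(np.abs(emp - h) < delta)

for n in (10, 100, 1000):
    print(n, typical_fraction(n, delta=0.1, sigma=1.0))  # -> 1 as n grows
```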

  10. Proof of 1 (continued)
     The cell $C^n_k$ contains:
     – all locations $x$ which belong to the set $T^n_k + A^n_\delta$ and to no other set of the form $T^n_l + A^n_\delta$;
     – all ambiguous locations (covered by more than one such set) which are closer to $T^n_k$ than to any other point;
     – all uncovered locations which are closer to $T^n_k$ than to any other point.

  11. Proof of 1 (continued)
     Let $\tilde\mu_n = \mu_n - \varepsilon_0$ under $\mathbb{P}^0_n$ (the process with its point at the origin removed).
     Basic bound:
     $$\mathbb{P}^0_n(D^n_0 \notin C^n_0) \le \mathbb{P}^0_n(D^n_0 \notin A^n_\delta) + \mathbb{P}^0_n\big(D^n_0 \in A^n_\delta,\ \tilde\mu_n(D^n_0 - A^n_\delta) > 0\big).$$
     The first term tends to 0 because of the SLLN. For the second, use Slivnyak's theorem to bound it from above by
     $$\mathbb{P}\big(\mu_n(D^n_0 - A^n_\delta) > 0\big) \le \mathbb{E}\big(\mu_n(D^n_0 - A^n_\delta)\big) = \mathbb{E}\big(\mu_n(-A^n_\delta)\big) = e^{nR}\,|A^n_\delta|.$$
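For readers less used to Palm calculus, the Slivnyak step can be spelled out (a standard fact about Poisson processes, written here in the slide's notation, not on the original slide): under the Palm probability $\mathbb{P}^0_n$ of a Poisson p.p., the reduced process $\tilde\mu_n = \mu_n - \varepsilon_0$ has the same law as $\mu_n$ under $\mathbb{P}$. Since $D^n_0$ is independent of the point process,

$$\mathbb{P}^0_n\big(\tilde\mu_n(D^n_0 - A^n_\delta) > 0\big) = \mathbb{P}\big(\mu_n(D^n_0 - A^n_\delta) > 0\big) \le \mathbb{E}\big[\mu_n(D^n_0 - A^n_\delta)\big] = e^{nR}\,\big|D^n_0 - A^n_\delta\big| = e^{nR}\,|A^n_\delta|,$$

where the inequality is Markov's inequality applied to the counting variable, the next equality uses the Poisson mean measure $e^{nR}\cdot\mathrm{Leb}$, and the last one the translation invariance of Lebesgue measure.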

  12. Proof of 1 (continued)
     But
     $$1 \ge \mathbb{P}(D^n_0 \in A^n_\delta) = \int_{A^n_\delta} \prod_{i=1}^n f(y(i))\,dy = \int_{A^n_\delta} e^{\,n\,\frac1n\sum_{i=1}^n \log f(y(i))}\,dy \ge \int_{A^n_\delta} e^{n(-h(D)-\delta)}\,dy = e^{-n(h(D)+\delta)}\,|A^n_\delta|,$$
     so that $|A^n_\delta| \le e^{n(h(D)+\delta)}$.
     Hence the second term is bounded above by
     $$e^{nR}\, e^{n(h(D)+\delta)} \to_{n\to\infty} 0,$$
     since $R + h(D) < 0$ (choosing $\delta < -(R+h(D))$).

  13. EXAMPLES
     Examples of $A^n_\delta$ sets for white noise with variance $\sigma^2$:
     – Gaussian case: difference of two concentric $L^2_n$-balls of radius approximately $\sqrt{n}\,\sigma$.
     – Symmetric exponential case: difference of two concentric $L^1_n$-balls of radius approximately $n\sigma/\sqrt{2}$.
     – Uniform case: the $n$-cube of side $2\sqrt{3}\,\sigma$.
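These shapes can be read off the definition of $A^n_\delta$; the following check is not on the original slide but follows directly, since membership in $A^n_\delta$ pins the empirical average of $-\log f$ near $h(D)$:

– Gaussian: $-\log f(x) = \frac12\log(2\pi\sigma^2) + \frac{x^2}{2\sigma^2}$, so the constraint reads $\big|\frac1n\sum_i x_i^2 - \sigma^2\big| < 2\sigma^2\delta$, an $L^2_n$ shell of radius $\approx \sqrt n\,\sigma$.
– Symmetric exponential with variance $\sigma^2$: $f(x) = \frac{1}{2b}e^{-|x|/b}$ with $b = \sigma/\sqrt2$, so $-\log f(x) = \log(2b) + |x|/b$ and the constraint pins $\sum_i |x_i| \approx nb = n\sigma/\sqrt2$, an $L^1_n$ shell.
– Uniform on $[-\sqrt3\sigma, \sqrt3\sigma]$ (variance $\sigma^2$): $-\log f$ equals $\log(2\sqrt3\sigma) = h(D)$ everywhere on the support, so $A^n_\delta$ is the whole support, the $n$-cube of side $2\sqrt3\,\sigma$.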

  14. ADDITIVE STATIONARY AND ERGODIC DISPLACEMENT OF A POINT PROCESS
     Setting:
     – Same framework as above concerning the p.p. $\mu_n$.
     – $\{D_k\}$: i.i.d. sequence of centered, stationary and ergodic displacement processes, independent of the p.p.s.
     – For all $n$, $D^n_k = (D_k(1), \ldots, D_k(n))$ with density $f_n$ on $\mathbb{R}^n$.

  15. ADDITIVE STATIONARY AND ERGODIC DISPLACEMENT OF A POINT PROCESS (continued)
     $D$: with well-defined differential entropy rate $h(D)$:
     – $H(D^n)$: differential entropy of $D^n = (D(1),\ldots,D(n))$;
     – $h(D)$ defined by
       $$h(D) = \lim_{n\to\infty} \frac1n H(D^n) = \lim_{n\to\infty} -\frac1n \int_{\mathbb{R}^n} \ln(f_n(x^n))\, f_n(x^n)\, dx^n.$$
     Typicality sets:
     $$A^n_\delta = \Big\{ x^n = (x(1),\ldots,x(n)) \in \mathbb{R}^n : \Big| -\frac1n \log(f_n(x^n)) - h(D) \Big| < \delta \Big\}.$$

  16. ASEN UNDER TYPICALITY DECODING
     Theorem 1-sen:
     1. If $R < -h(D)$, there exists a sequence of point processes $\mu_n$ (e.g. Poisson) with intensity $e^{nR}$ and a partition such that $p_e(n) \to 0$ as $n\to\infty$.
     2. If $R > -h(D)$, for all sequences of point processes $\mu_n$ with intensity $e^{nR}$ and all jointly stationary partitions, $p_e(n) \to 1$ as $n\to\infty$.
     Proof: similar to that of the i.i.d. case.

  17. COLORED GAUSSIAN NOISE EXAMPLE
     $\{D\}$: regular stationary and ergodic Gaussian process with spectral density $g(\beta)$ and covariance matrix $\Gamma_n$:
     $$\mathbb{E}[D(i)D(j)] = \Gamma_n(i,j) = r(|i-j|), \qquad \mathbb{E}[D(0)D(k)] = \frac{1}{2\pi}\int_0^{2\pi} e^{ik\beta}\, g(\beta)\, d\beta.$$

  18. COLORED GAUSSIAN NOISE EXAMPLE (continued)
     – Differential entropy rate:
       $$h(D) = \frac12 \ln\Big( 2e\pi\, \exp\Big( \frac{1}{2\pi}\int_{-\pi}^{\pi} \ln(g(\beta))\, d\beta \Big) \Big).$$
     – Typicality sets:
       $$A^n_\delta = \Big\{ x^n : \Big| \frac1n (x^n)^t\, \Gamma_n^{-1}\, x^n - 1 + d(n) \Big| < 2\delta \Big\},$$
       with
       $$d(n) = \frac1n \ln(\mathrm{Det}(\Gamma_n)) - \frac{1}{2\pi}\int_{-\pi}^{\pi} \ln(g(\beta))\, d\beta \to_{n\to\infty} 0.$$
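The convergence $d(n) \to 0$ is Szegő's theorem, and it can be observed numerically. A minimal sketch, assuming SciPy; the AR(1)-type covariance $r(k) = \sigma^2\rho^{|k|}$ and all parameter values are my choice of example, not from the talk:

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.integrate import quad

# AR(1)-type covariance r(k) = sigma^2 * rho^|k|, whose spectral density is
# g(beta) = sigma^2 * (1 - rho^2) / (1 - 2*rho*cos(beta) + rho^2).
sigma2, rho = 1.0, 0.6
g = lambda b: sigma2 * (1 - rho**2) / (1 - 2 * rho * np.cos(b) + rho**2)
log_g_mean = quad(lambda b: np.log(g(b)), -np.pi, np.pi)[0] / (2 * np.pi)

def d(n):
    """d(n) = (1/n) ln Det(Gamma_n) - (1/2pi) * integral of ln g."""
    Gamma_n = toeplitz(sigma2 * rho ** np.arange(n))   # Toeplitz covariance matrix
    sign, logdet = np.linalg.slogdet(Gamma_n)
    return logdet / n - log_g_mean

for n in (5, 50, 500):
    print(n, d(n))   # -> 0 as n grows (here d(n) = -ln(1 - rho^2)/n exactly)

# Differential entropy rate from the slide's formula:
# h(D) = 0.5*ln(2*pi*e) + 0.5 * (1/2pi) * integral of ln g.
print(0.5 * np.log(2 * np.pi * np.e) + 0.5 * log_g_mean)
```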
