
Application of Information Theory, Lecture 5
Channel Capacity and Isoperimetric Inequality
Iftach Haitner, Tel Aviv University
November 25, 2014


Shannon's result

◮ Shannon showed that one can drive the error rate toward 0 without driving the transmission rate toward 0
◮ For any c < C_p, there exists a code with transmission rate c that is correct w.p. close to 1
◮ Example: for p = .1, C_p > 1/2. Hence, for sending x = (x_1, ..., x_m), one can send 2m bits such that x is recovered w.p. close to 1
◮ More generally, ∀ p ∈ [0, 1] ∃ C_p such that for sending x ∈ {0,1}^m, one can send ≈ m/C_p bits, and x is recovered w.p. close to 1
◮ C_p might be 0 (i.e., for p = 1/2)
◮ A revolution in EE and the whole world

Error correction code

◮ Message to send: x = (x_1, ..., x_m) ∈ {0,1}^m
◮ Encoding scheme: f : {0,1}^m → {0,1}^n (n > m)
◮ Decoding scheme: g : {0,1}^n → {0,1}^m
◮ m/n — the transmission rate
◮ Sender sends f(x) rather than x
◮ Receiver decodes the message by applying g
◮ Pipeline:
     x  --encode-->  f(x)  --channel-->  f(x) ⊕ Z  --decode-->  g(f(x) ⊕ Z)
  (m bits)         (n bits)
  where ⊕ is bitwise XOR and Z = (Z_1, ..., Z_n) with Z_1, ..., Z_n iid ∼ (1 − p, p) (i.e., each Z_i ∈ {0,1} with Pr[Z_i = 1] = p)
◮ We hope g(f(x) ⊕ Z) = x
◮ ECCs are everywhere
◮ ECC vs. compression
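The pipeline above can be made concrete with a minimal Python sketch. The 3-repetition code used here for f and g is only an illustrative toy (rate 1/3), not the code constructed later in these slides:

```python
import random

rng = random.Random(0)

def bsc(bits, p):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def f(x):
    """Toy encoding scheme: repeat each message bit 3 times (m bits -> n = 3m bits)."""
    return [b for b in x for _ in range(3)]

def g(y):
    """Toy decoding scheme: majority vote over each block of 3 received bits."""
    return [int(sum(y[3 * i : 3 * i + 3]) >= 2) for i in range(len(y) // 3)]

x = [1, 0, 1, 1]          # message, m = 4 bits
y = bsc(f(x), p=0.1)      # channel output, n = 12 bits
print(g(y))               # we hope this equals x
```

With p = 0.1, a block of 3 copies is decoded wrongly only when at least 2 of its copies flip, so the per-bit error drops from 0.1 to about 3p²(1−p) + p³ ≈ 0.028 at the cost of rate 1/3; Shannon's theorem below shows one can do much better.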

Shannon's theorem

Theorem 1
∀ p ∃ C_p s.t. ∀ ε > 0 ∃ m_ε s.t. ∀ m > m_ε and n > m(1/C_p + ε), ∃ f : {0,1}^m → {0,1}^n and g : {0,1}^n → {0,1}^m s.t. ∀ x ∈ {0,1}^m:

  Pr_{z ← Z = (Z_1, ..., Z_n)}[g(f(x) ⊕ z) ≠ x] ≤ ε

for Z_1, ..., Z_n iid ∼ (1 − p, p).

◮ C_p = 1 − h(p) — the channel capacity
◮ p = .1 ⇒ C_p = 0.5310 > 1/2
◮ p = .25 ⇒ C_p ≈ 1/5
◮ The theorem is tight
◮ We prove a weaker variant that holds w.h.p. over x ← {0,1}^m
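The two numerical examples follow directly from C_p = 1 − h(p); a small Python check:

```python
from math import log2

def h(p):
    """Binary entropy function, in bits."""
    if p == 0 or p == 1:
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def capacity(p):
    """Capacity of the binary symmetric channel: C_p = 1 - h(p)."""
    return 1 - h(p)

print(round(capacity(0.10), 4))   # -> 0.531, indeed > 1/2
print(round(capacity(0.25), 4))   # -> 0.1887, i.e. about 1/5
print(capacity(0.50))             # -> 0.0: a fully noisy channel carries nothing
```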

Hamming distance

◮ For y = (y_1, ..., y_n) ∈ {0,1}^n, let |y| = Σ_i y_i — the Hamming weight of y
◮ |y − y′| := |y ⊕ y′| — the Hamming distance between y and y′; the number of positions in which they differ
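Both notions are one-liners over bit lists; a quick sketch:

```python
def weight(y):
    """Hamming weight |y|: the number of 1s in y."""
    return sum(y)

def dist(y, yp):
    """Hamming distance |y - y'| = |y xor y'|: positions where y and y' differ."""
    return sum(a ^ b for a, b in zip(y, yp))

y, yp = [1, 0, 1, 1], [1, 1, 1, 0]
print(weight(y))     # -> 3
print(dist(y, yp))   # -> 2 (they differ in the 2nd and 4th positions)
```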

Proving the theorem

◮ Fix p ∈ [0, 1/2) and ε > 0, and let m > m_ε and n ≥ m(1/C_p + ε), for m_ε to be determined by the analysis. (Recall C_p = 1 − h(p).)
◮ We show ∃ f : {0,1}^m → {0,1}^n and g : {0,1}^n → {0,1}^m s.t.
  Pr_{x ← {0,1}^m}[g(f(x) ⊕ Z) ≠ x] ≤ ε
◮ g(y) returns argmin_{x′ ∈ {0,1}^m} |y − f(x′)|
◮ So it all boils down to finding f s.t.
  Pr_{x ← {0,1}^m; y = f(x) ⊕ Z}[ |f(x) − y| < min_{x′ ∈ {0,1}^m \ {x}} |f(x′) − y| ] ≥ 1 − ε
◮ Idea: for p′ > p to be determined later, find f s.t. w.h.p. over x and Z:
  (1) |(f(x) ⊕ Z) − f(x)| ≤ p′n
  (2) |(f(x) ⊕ Z) − f(x′)| > p′n for all x′ ≠ x
◮ We choose f uniformly at random (what does that mean?)
◮ Non-constructive proof
◮ Probabilistic method
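The random-coding idea can be simulated directly at toy scale. The parameters below (m = 3, n = 15) are far smaller than the theorem requires, so this only illustrates the mechanism of a uniformly random f with minimum-distance decoding:

```python
import random

rng = random.Random(1)
m, n, p = 3, 15, 0.1   # toy parameters; the theorem needs m large

# A uniformly random code f: each of the 2^m messages gets an independent,
# uniform n-bit codeword.
f = {x: tuple(rng.randrange(2) for _ in range(n)) for x in range(2 ** m)}

def dist(a, b):
    """Hamming distance between two bit tuples."""
    return sum(u != v for u, v in zip(a, b))

def g(y):
    """Minimum-distance decoding: the message whose codeword is closest to y."""
    return min(f, key=lambda x: dist(f[x], y))

trials, errors = 2000, 0
for _ in range(trials):
    x = rng.randrange(2 ** m)
    y = tuple(b ^ (rng.random() < p) for b in f[x])   # pass f(x) through the BSC
    errors += (g(y) != x)
print(errors / trials)   # empirical decoding error; typically small at this rate
```

The transmission rate here (m/n = 1/5) is well below C_p ≈ 0.53, which is why even a random code of this tiny size usually decodes correctly.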

Proving there exists a good f

◮ Fix p′ > p such that 1/C_{p′} − 1/C_p ≤ ε/2
◮ For y ∈ {0,1}^n, let B_{p′}(y) = {y′ ∈ {0,1}^n : |y′ − y| ≤ p′n}
◮ (1) By the weak law of large numbers, ∃ n′ ∈ N s.t. ∀ n ≥ n′ and ∀ x ∈ {0,1}^m:
  α_n := Pr_{z ← Z}[(f(x) ⊕ z) ∉ B_{p′}(f(x))] ≤ ε/2  (for any fixed f)
◮ Fact (proved later): b(p′) := |B_{p′}(y)| ≤ 2^{n·h(p′)}
  ⇒ ∀ x ≠ x′ ∈ {0,1}^m: Pr_{f,Z}[f(x) ⊕ Z ∈ B_{p′}(f(x′))] = b(p′)/2^n ≤ 2^{n·h(p′)}/2^n = 2^{−n·C_{p′}}
  ⇒ ∀ x ∈ {0,1}^m: Pr_{f,Z}[∃ x′ ≠ x ∈ {0,1}^m : f(x) ⊕ Z ∈ B_{p′}(f(x′))] ≤ 2^{m − n·C_{p′}}
  ⇒ ∃ f s.t. β_{m,n} := Pr_{x ← {0,1}^m, Z}[∃ x′ ≠ x ∈ {0,1}^m : f(x) ⊕ Z ∈ B_{p′}(f(x′))] ≤ 2^{m − n·C_{p′}}
  ⇒ β_{m,n} ≤ ε/2, for n ≥ m(1/C_{p′} + (1 − log ε)/m)
◮ (2) β_{m,n} ≤ ε/2, for m ≥ m′ := 2(1 − log ε)/ε and n ≥ m(1/C_p + ε/2 + (1 − log ε)/m); by the choice of p′ and m′, n ≥ m(1/C_p + ε) suffices
◮ Hence, for m > m_ε := max{m′, n′} and n > m(1/C_p + ε), it holds that
  Pr_{x ← {0,1}^m}[g(f(x) ⊕ Z) ≠ x] ≤ α_n + β_{m,n} ≤ ε.
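A quick numeric sanity check of the union-bound step: for fixed p and ε, pick a p′ as above and watch the bound 2^{m − n·C_{p′}} on β_{m,n} collapse as m grows. The grid search for p′ and the sample parameters here are ad hoc:

```python
from math import log2, ceil

def h(q):
    """Binary entropy, in bits."""
    return 0.0 if q == 0 or q == 1 else -q * log2(q) - (1 - q) * log2(1 - q)

def C(q):
    """Channel capacity C_q = 1 - h(q)."""
    return 1 - h(q)

p, eps = 0.1, 0.1
# Pick p' > p with 1/C(p') - 1/C(p) <= eps/2, by a crude grid search.
pp = p
while 1 / C(pp + 0.001) - 1 / C(p) <= eps / 2:
    pp += 0.001

for m in (100, 500, 1000):
    n = ceil(m * (1 / C(p) + eps))
    beta_bound = 2.0 ** (m - n * C(pp))   # union bound on beta_{m,n}
    print(m, n, beta_bound)               # shrinks geometrically in m
```

Note that m = 100 is not yet large enough for this bound to drop below ε/2, while by m = 500 it is already below 10⁻³, matching the "for m > m_ε" caveat in the proof.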

Why C_p = 1 − h(p)?

◮ Let X ← {0,1}, Z ∼ (1 − p, p) and Y = X ⊕ Z
◮ [Channel diagram: X = 0 goes to Y = 0 w.p. 1 − p and to Y = 1 w.p. p; symmetrically, X = 1 goes to Y = 1 w.p. 1 − p and to Y = 0 w.p. p]
◮ I(X; Y) = H(Y) − H(Y|X) = H(Y) − H(Z) = 1 − h(p) = C_p
◮ The received bit "gives" C_p bits of information about the transmitted bit
◮ Hence, to recover m bits, we need to send at least m · (1/C_p) bits
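The identity can be checked by computing I(X; Y) directly from the joint distribution of the binary symmetric channel; a small Python check:

```python
from math import log2

def h(p):
    """Binary entropy, in bits."""
    return 0.0 if p == 0 or p == 1 else -p * log2(p) - (1 - p) * log2(1 - p)

def mutual_information(p):
    """I(X;Y) for X uniform on {0,1}, Z ~ (1-p, p), Y = X xor Z, from the joint."""
    joint = {(x, x ^ z): 0.5 * (p if z else 1 - p) for x in (0, 1) for z in (0, 1)}
    px = {0: 0.5, 1: 0.5}
    py = {y: joint[(0, y)] + joint[(1, y)] for y in (0, 1)}
    return sum(q * log2(q / (px[x] * py[y]))
               for (x, y), q in joint.items() if q > 0)

for p in (0.1, 0.25, 0.5):
    print(round(mutual_information(p), 4), round(1 - h(p), 4))   # the two agree
```

Since X is uniform, Y is uniform as well, so H(Y) = 1 exactly; this is why the joint-distribution computation matches 1 − h(p).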

Size of bounding ball

Claim 2
For p ∈ [0, 1/2] and n ∈ N, it holds that Σ_{k=0}^{⌊pn⌋} (n choose k) ≤ 2^{n·h(p)}

Proof in a few slides (we already saw that (n choose pn) ≈ 2^{n·h(p)})

Corollary 3
For y ∈ {0,1}^n and p ∈ [0, 1/2], let B_p(y) = {y′ ∈ {0,1}^n : |y′ − y| ≤ pn}. Then |B_p(y)| = Σ_{k=0}^{⌊pn⌋} (n choose k) ≤ 2^{n·h(p)}

A very useful estimate. Weaker variants follow from the AEP or Stirling's formula.
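Claim 2 is easy to test numerically, since math.comb gives exact binomial coefficients:

```python
from math import comb, floor, log2

def h(p):
    """Binary entropy, in bits."""
    return 0.0 if p == 0 or p == 1 else -p * log2(p) - (1 - p) * log2(1 - p)

def ball_size(n, p):
    """|B_p(y)|: the number of n-bit strings within Hamming distance p*n of y."""
    return sum(comb(n, k) for k in range(floor(p * n) + 1))

p = 0.3
for n in (10, 50, 200):
    bound = 2 ** (n * h(p))
    print(n, ball_size(n, p), ball_size(n, p) <= bound)   # the claim holds
```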

Tightness

◮ X ← {0,1}^m, Z = (Z_1, ..., Z_n) where Z_1, ..., Z_n iid ∼ (1 − p, p)
◮ X (m bits) → f(X) (n bits) → Y := f(X) ⊕ Z → g(Y)
◮ Assuming Pr[g(Y) = X] ≥ 1 − ε, we show nC_p ≥ m(1 − ε) − 1
◮ Compare to nC_p > m(1 + εC_p) in Thm 1
◮ Hence, lim_{ε → 0} m/n = C_p
◮ By Fano, H(X|Y) ≤ h(ε) + ε log(2^m − 1) ≤ 1 + εm
◮ I(X; Y) = H(X) − H(X|Y) ≥ m − εm − 1 = m(1 − ε) − 1
◮ H(Y|X) = H(X, Y) − H(X) = H(X, Z) − H(X) = H(Z) = n·h(p)
◮ I(X; Y) = H(Y) − H(Y|X) ≤ n − n·h(p)
◮ Hence, m(1 − ε) ≤ I(X; Y) + 1 ≤ n(1 − h(p)) + 1 = nC_p + 1
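A numeric illustration of the converse chain; the parameters m = 100, ε = 0.01, p = 0.1 are arbitrary choices for the sketch:

```python
from math import log2

def h(p):
    """Binary entropy, in bits."""
    return 0.0 if p == 0 or p == 1 else -p * log2(p) - (1 - p) * log2(1 - p)

m, eps, p = 100, 0.01, 0.1
fano = h(eps) + eps * log2(2 ** m - 1)   # Fano's bound on H(X|Y)
print(fano <= 1 + eps * m)               # True, since h(eps) <= 1 and log2(2^m - 1) <= m

# The converse: any code with error <= eps needs nC_p >= m(1 - eps) - 1, i.e.
min_n = (m * (1 - eps) - 1) / (1 - h(p))
print(round(min_n, 1))                   # minimal number of channel uses for p = 0.1
```

For these numbers min_n is about 185 channel uses for 100 message bits, consistent with the rate m/n approaching C_p ≈ 0.53 as ε → 0.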

General communication channel

◮ The channel is a probabilistic function Q : [k] → [k]; on input X it outputs Y = Q(X), where p_{i,j} := Pr[Q(i) = j]
◮ [Channel diagram: nodes 1, ..., k on each side, with edge (i, j) labeled p_{i,j}]
◮ x = (x_1, ..., x_m) ∈ {0,1}^m
