

  1. INFORMATION-THEORETIC SECURITY
     Lecture 2 - Elements of Information Theory
     Matthieu Bloch, December 3, 2019

  2. TOOLS: CONCENTRATION INEQUALITIES

     Lemma (Markov's inequality). Let \(X\) be a non-negative real-valued random variable. Then for all \(t > 0\),
     \[ \mathbb{P}(X \geq t) \leq \frac{\mathbb{E}[X]}{t}. \]

     Lemma (Chebyshev's inequality). Let \(X\) be a real-valued random variable. Then for all \(t > 0\),
     \[ \mathbb{P}(|X - \mathbb{E}[X]| \geq t) \leq \frac{\operatorname{Var}(X)}{t^2}. \]

     Proposition (Weak law of large numbers). Let \(\{X_i\}_{i=1}^n\) be i.i.d. real-valued random variables with finite mean \(\mu\) and finite variance \(\sigma^2\). Then
     \[ \mathbb{P}\left( \left| \frac{1}{n} \sum_{i=1}^n X_i - \mu \right| \geq \epsilon \right) \leq \frac{\sigma^2}{n\epsilon^2}, \quad \text{so} \quad \lim_{n \to \infty} \mathbb{P}\left( \left| \frac{1}{n} \sum_{i=1}^n X_i - \mu \right| \geq \epsilon \right) = 0. \]
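A quick numerical sanity check of the Chebyshev/weak-law bound (not from the slides; the Bernoulli(1/2) distribution and the values of n, eps, and trials below are illustrative choices):

```python
import random

# Sketch: compare the empirical deviation probability of a sample mean
# against the Chebyshev bound sigma^2 / (n * eps^2). All parameters here
# are arbitrary illustrative choices, not values from the lecture.
random.seed(0)

n = 200          # number of i.i.d. samples per sample mean
trials = 2000    # number of sample means drawn
eps = 0.1
mu, sigma2 = 0.5, 0.25   # Bernoulli(1/2): mean 1/2, variance 1/4

deviations = 0
for _ in range(trials):
    mean = sum(random.random() < 0.5 for _ in range(n)) / n
    if abs(mean - mu) >= eps:
        deviations += 1

empirical = deviations / trials
bound = sigma2 / (n * eps ** 2)
print(f"empirical P(|mean - mu| >= eps) = {empirical:.4f}, Chebyshev bound = {bound:.4f}")
```

The empirical frequency typically falls far below the bound, which is the point: Chebyshev is loose but holds for any finite-variance distribution.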


  6. TOOLS: MORE CONCENTRATION INEQUALITIES

     Proposition (Hoeffding's inequality). Let \(\{X_i\}_{i=1}^n\) be i.i.d. real-valued zero-mean random variables such that \(X_i \in [a_i; b_i]\). Then for all \(\epsilon > 0\),
     \[ \mathbb{P}\left( \left| \frac{1}{n} \sum_{i=1}^n X_i \right| \geq \epsilon \right) \leq 2 \exp\left( - \frac{2 n^2 \epsilon^2}{\sum_{i=1}^n (b_i - a_i)^2} \right). \]

     More on concentration inequalities later in the course, when we need more precise results.

     Lemma. Let \(f : \mathcal{X} \to \mathbb{R}_+\) and \(\epsilon > 0\). If \(\mathbb{E}_X[f(X)] \leq \epsilon\), then there exists \(x^* \in \mathcal{X}\) such that \(f(x^*) \leq 2\epsilon\).
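To see why Hoeffding is the sharper tool, one can compare the two bounds numerically (a sketch, not from the slides; uniform-on-[-1,1] variables and the values of eps and n are illustrative choices):

```python
import math

# Sketch: Chebyshev decays only as 1/n, while Hoeffding decays
# exponentially in n. We use i.i.d. X_i uniform on [-1, 1], so
# a_i = -1, b_i = 1, zero mean, variance 1/3.

def chebyshev_bound(n, eps, sigma2):
    return sigma2 / (n * eps ** 2)

def hoeffding_bound(n, eps, a, b):
    # P(|(1/n) sum X_i| >= eps) <= 2 exp(-2 n^2 eps^2 / sum (b_i - a_i)^2);
    # with identical intervals, sum (b_i - a_i)^2 = n (b - a)^2.
    return 2 * math.exp(-2 * n ** 2 * eps ** 2 / (n * (b - a) ** 2))

eps, sigma2 = 0.2, 1.0 / 3.0
for n in (10, 100, 1000):
    print(n, chebyshev_bound(n, eps, sigma2), hoeffding_bound(n, eps, -1.0, 1.0))
```

For small n the Hoeffding bound can exceed 1 (and is then vacuous), but for large n it is exponentially smaller than Chebyshev's.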


  9. CHANNEL CODING

     One-shot channel coding. A code consists of an encoder \(\mathrm{Enc} : [1; M] \to \mathcal{X}\) and a decoder \(\mathrm{Dec} : \mathcal{Y} \to [1; M]\).

     Objective: transmit messages \(W \in [1; M]\) with small average probability of error
     \[ P_e(\mathcal{C}) \triangleq \mathbb{P}(\hat{W} \neq W \mid \mathcal{C}) = \frac{1}{M} \sum_{m=1}^M \mathbb{P}(\mathrm{Dec}(Y) \neq m \mid W = m). \]

     Lemma (Random coding for channel reliability). For \(p_X \in \Delta(\mathcal{X})\) and \(\gamma > 0\), define \(p_Y \triangleq p_X \circ W_{Y|X}\) and
     \[ \mathcal{A}_\gamma \triangleq \left\{ (x, y) \in \mathcal{X} \times \mathcal{Y} : \log \frac{W_{Y|X}(y|x)}{p_Y(y)} \geq \gamma \right\}. \]
     For a codebook \(\mathcal{C}\) of independently generated codewords \(X_i \sim p_X\), we have
     \[ \mathbb{E}_{\mathcal{C}}[P_e(\mathcal{C})] \leq \mathbb{P}_{p_X W_{Y|X}}\left( (X, Y) \notin \mathcal{A}_\gamma \right) + M 2^{-\gamma}. \]

     The one-shot result is not tight, but it is good enough for many problems.
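As an illustration (not from the slides), the random-coding bound can be evaluated in closed form when the "one-shot" channel is n uses of a BSC(p) with uniform \(p_X\) on \(\{0,1\}^n\): then \(p_Y\) is also uniform and the information density depends only on the Hamming distance between x and y. The parameters n, p, M, and gamma below are illustrative choices:

```python
from math import comb, log2

# Sketch: evaluate E_C[P_e(C)] <= P((X,Y) not in A_gamma) + M * 2^(-gamma)
# for n uses of a BSC(p), uniform input on {0,1}^n. With uniform p_X and
# p_Y, the information density is
#   i(x; y) = log2 W(y|x) - log2 p_Y(y) = n + (n - d) log2(1-p) + d log2 p,
# where d is the Hamming distance between x and y, and d ~ Binomial(n, p).

def random_coding_bound(n, p, M, gamma):
    miss = 0.0  # P((X, Y) not in A_gamma), i.e. P(information density < gamma)
    for d in range(n + 1):
        density = n + (n - d) * log2(1 - p) + d * log2(p)
        if density < gamma:
            miss += comb(n, d) * p ** d * (1 - p) ** (n - d)
    return miss + M * 2.0 ** (-gamma)

n, p = 100, 0.05
M = 2 ** 40        # i.e. rate 0.4 bits per channel use
gamma = 50.0       # threshold chosen between log2(M) and n(1 - h(p))
print(random_coding_bound(n, p, M, gamma))
```

Choosing gamma between log2(M) and the expected information density makes both terms small, which is exactly how the bound is used in the achievability proof.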


  13. CHANNEL CODING

     Reliable communication over a noisy channel. Introduce coding over blocklength \(n \in \mathbb{N}^*\), so that \(\mathrm{Enc} : [1; M] \to \mathcal{X}^n\) and \(\mathrm{Dec} : \mathcal{Y}^n \to [1; M]\).

     Definition (Achievable rate). A rate \(R\) is achievable if there exists a sequence of codes \(\mathcal{C}_n\) of length \(n\) such that
     \[ \liminf_{n \to \infty} \frac{1}{n} \log M \geq R \quad \text{and} \quad \limsup_{n \to \infty} P_e(\mathcal{C}_n) = 0. \]

     Proposition (Achievability). The rate \(\max_{p_X} I(X; Y)\) is achievable.

     Proposition (Converse). All achievable rates \(R\) must satisfy \(R \leq \max_{p_X} I(X; Y)\).

     This provides an operational meaning to the channel capacity \(C \triangleq \max_{p_X} I(X; Y)\).
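The capacity \(\max_{p_X} I(X;Y)\) can be computed numerically for a concrete channel. A sketch (not from the slides) for a BSC(p), where the maximum is also known in closed form as \(1 - h_2(p)\), using a simple grid search over input distributions:

```python
import math

# Sketch: C = max_{p_X} I(X; Y) for a binary symmetric channel BSC(p),
# computed by grid search over Bernoulli input distributions. For the
# BSC the maximizer is the uniform input and C = 1 - h2(p).

def h2(q):
    # binary entropy in bits
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def mutual_information(px1, p):
    # I(X; Y) = H(Y) - H(Y|X) for X ~ Bernoulli(px1) through BSC(p)
    py1 = px1 * (1 - p) + (1 - px1) * p
    return h2(py1) - h2(p)

p = 0.1
capacity = max(mutual_information(x / 1000, p) for x in range(1001))
print(capacity, 1 - h2(p))
```

The grid search recovers the closed-form value because the uniform input (px1 = 0.5) lies on the grid; for general discrete memoryless channels one would use the Blahut-Arimoto algorithm instead.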

