Secrecy Capacities and Multiterminal Source Coding


  1. Secrecy Capacities and Multiterminal Source Coding
Prakash Narayan
Joint work with Imre Csiszár and Chunxuan Ye

  2. Multiterminal Source Coding

  3. The Model
[Figure: m terminals, with terminal i observing the block X_i^n]
• m ≥ 2 terminals.
• X_1, ..., X_m are rvs with finite alphabets 𝒳_1, ..., 𝒳_m.
• Consider a discrete memoryless multiple source with components X_1^n = (X_11, ..., X_1n), ..., X_m^n = (X_m1, ..., X_mn).
• Terminal i observes the component X_i^n = (X_i1, ..., X_in).

  4. The Model
[Figure: terminals broadcasting transmissions F_1, F_2, ..., F_rm over a public noiseless channel]
• The terminals are allowed to communicate over a noiseless channel, possibly interactively in several rounds.
• All transmissions are observed by all the terminals.
• No rate constraints on the communication.
• Assume w.l.o.g. that transmissions occur in consecutive time slots in r rounds.
• The communication is depicted by the rvs F := (F_1, ..., F_rm), where
  – F_ν = transmission in time slot ν by terminal i ≡ ν mod m;
  – F_ν is a function of X_i^n and (F_1, ..., F_{ν−1}).
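To make this communication structure concrete, here is a minimal schematic in Python (an illustrative sketch, not from the talk): run_protocol, blocks, and encoders are assumed names, with blocks[i] playing the role of X_i^n and encoders[i] an arbitrary map from (own block, past transmissions) to the next transmission.

    # Schematic only: r rounds of round-robin public communication.
    def run_protocol(blocks, encoders, r):
        m = len(blocks)
        F = []
        for nu in range(r * m):
            i = nu % m                                   # terminal active in slot nu
            F.append(encoders[i](blocks[i], tuple(F)))   # F_nu = f(X_i^n, F_1, ..., F_{nu-1})
        return F                                         # the whole of F is seen by every terminal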

  5. Communication for Omniscience
[Figure: terminals exchanging public transmissions F_1, ..., F_rm]
• Each terminal wishes to become "omniscient," i.e., recover (X_1^n, ..., X_m^n) with probability ≥ 1 − ε.
• What is the smallest achievable rate of communication for omniscience (CO-rate), lim_n (1/n) H(F_1, ..., F_rm)?

  6. Minimum Communication for Omniscience
Proposition [I. Csiszár - P. N., '02]: The smallest achievable CO-rate lim_n (1/n) H(F_1^(n), ..., F_rm^(n)), which enables (X_1^n, ..., X_m^n) to be ε_n-recoverable at all the terminals with communication (F_1^(n), ..., F_rm^(n)) (with the number of rounds possibly depending on n), with ε_n → 0, is
    R_min = min_{(R_1, ..., R_m) ∈ ℛ_SW} Σ_{i=1}^m R_i,
where
    ℛ_SW = { (R_1, ..., R_m) : Σ_{i∈B} R_i ≥ H(X_B | X_{B^c}), B ⊂ {1, ..., m} }.
Remark: The region ℛ_SW, if stated for all B ⊆ {1, ..., m}, gives the achievable rate region for the multiterminal version of the Slepian-Wolf source coding theorem.
Case m = 2: R_min = H(X_1 | X_2) + H(X_2 | X_1).
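As a numerical illustration (not part of the slides), R_min can be computed for any given joint pmf by minimizing Σ_i R_i over the Slepian-Wolf region with a linear program. The sketch below assumes Python with numpy and scipy, entropies in bits, and a joint pmf supplied as an m-dimensional array; all function names are illustrative.

    import itertools
    import numpy as np
    from scipy.optimize import linprog

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def conditional_entropy(joint, B):
        """H(X_B | X_{B^c}) = H(X_1, ..., X_m) - H(X_{B^c})."""
        H_all = entropy(joint.ravel())
        H_Bc = entropy(joint.sum(axis=tuple(B)).ravel())
        return H_all - H_Bc

    def R_min(joint):
        m = joint.ndim
        c = np.ones(m)                        # minimize sum_i R_i
        A_ub, b_ub = [], []
        for size in range(1, m):              # all proper nonempty B of {1, ..., m}
            for B in itertools.combinations(range(m), size):
                row = np.zeros(m)
                row[list(B)] = -1.0           # -sum_{i in B} R_i <= -H(X_B | X_{B^c})
                A_ub.append(row)
                b_ub.append(-conditional_entropy(joint, B))
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=[(0, None)] * m)
        return res.fun

    # Example: two correlated bits; the result should equal H(X1|X2) + H(X2|X1).
    joint = np.array([[0.4, 0.1],
                      [0.1, 0.4]])
    print(R_min(joint))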

  7. Communication for Omniscience
Proof of Proposition: The proposition is a source coding theorem of the "Slepian-Wolf" type, with the additional element that interactive communication is not a priori excluded.
Achievability: Straightforward extension of the multiterminal Slepian-Wolf source coding theorem; the CO-rates can be achieved with noninteractive communication.
Converse: Nontrivial; a consequence of the following "Main Lemma."

  8. Common Randomness
[Figure: terminal i computes K_i = K_i(X_i^n, F)]
Common Randomness (CR): A function K of (X_1^n, ..., X_m^n) is ε-CR, achievable with communication F, if
    Pr{ K = K_1 = · · · = K_m } ≥ 1 − ε.
Thus, CR consists of random variables generated by different terminals, based on
– local measurements or observations
– transmissions or exchanges of information
such that the random variables agree with probability ≅ 1.

  9. Main Lemma
[Figure: a subset B of the terminals 1, ..., m]
Lemma [I. Csiszár - P. N., '02]: If K is ε-CR for the terminals 1, ..., m, achievable with communication F = (F_1, ..., F_rm), then
    (1/n) H(K | F) ≤ H(X_1, ..., X_m) − Σ_{i=1}^m R_i + (m/n)(ε log|𝒦| + 1)
for some numbers (R_1, ..., R_m) ∈ ℛ_SW, where
    ℛ_SW = { (R_1, ..., R_m) : Σ_{i∈B} R_i ≥ H(X_B | X_{B^c}), B ⊂ {1, ..., m} }.
Remark: Decomposition of the total joint entropy H(X_1, ..., X_m) into the normalized conditional entropy of any achievable ε-CR conditioned on the communication with which it is achieved, and a sum of rates which satisfy the SW conditions.

  10. Secrecy Capacities

  11. The General Model
[Figure: users 1, ..., m observing blocks (X_i1, ..., X_in); a wiretapper observing (Z_1, ..., Z_n)]
The user terminals wish to generate CR which is effectively concealed from an eavesdropper with access to the public interterminal communication, or from a wiretapper.

  12. Secret Key
[Figure: terminal i computes K_i = K_i(X_i^n, F)]
Secret Key (SK): A function K of (X_1^n, ..., X_m^n) is an ε-SK, achievable with communication F, if
• Pr{ K = K_1 = · · · = K_m } ≥ 1 − ε  ("ε-common randomness")
• (1/n) I(K ∧ F) ≤ ε  ("secrecy")
• (1/n) H(K) ≥ (1/n) log|𝒦| − ε  ("uniformity")
where 𝒦 = the set of all possible values of K.
Thus, a secret key is effectively concealed from an eavesdropper with access to F, and is nearly uniformly distributed.
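The secrecy and uniformity conditions are simple functionals of the joint distribution of (K, F) and can be checked numerically. The helper below is an illustrative sketch (assumed names, Python with numpy); the agreement condition would additionally need the joint distribution of (K_1, ..., K_m), which is omitted here.

    import numpy as np

    def sk_conditions(pKF, n, eps):
        """Check (1/n) I(K ∧ F) <= eps and (1/n) H(K) >= (1/n) log|K| - eps
        for an assumed joint pmf pKF[k, f] and block length n."""
        def H(q):
            q = q[q > 0]
            return -np.sum(q * np.log2(q))
        pK, pF = pKF.sum(axis=1), pKF.sum(axis=0)
        I_KF = H(pK) + H(pF) - H(pKF.ravel())                 # I(K ∧ F)
        secrecy = I_KF / n <= eps
        uniformity = H(pK) / n >= np.log2(pK.size) / n - eps
        return secrecy, uniformity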

  13. Secret Key Capacity
[Figure: terminal i computes K_i = K_i(X_i^n, F)]
• Achievable SK-rate: the (entropy) rate of such an SK, achievable with suitable communication (with the number of rounds possibly depending on n).
• SK-capacity: C_SK = largest achievable SK-rate.

  14. Some Recent Related Work
• Maurer 1990, 1991, 1993, 1994, ...
• Ahlswede - Csiszár 1993, 1994, 1998, ...
• Bennett, Brassard, Crépeau, Maurer 1995.
• Csiszár 1996.
• Maurer - Wolf 1997, 2003, ...
• Venkatesan - Anantharam 1995, 1997, 1998, 2000, ...
• Csiszár - Narayan 2000.
• Renner - Wolf 2003.
• ...

  15. The Connection

  16. Special Case: Two Users
[Figure: terminals 1 and 2 exchanging roughly H(X_2 | X_1) and H(X_1 | X_2) bits per symbol]
Observation [Maurer 1993, Ahlswede - Csiszár 1993]:
    C_SK = I(X_1 ∧ X_2)
         = H(X_1, X_2) − [H(X_1 | X_2) + H(X_2 | X_1)]
         = total rate of shared CR − smallest achievable CO-rate (R_min).
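Reusing the entropy and R_min helpers from the sketch after slide 6 (an illustrative check, not from the talk), the identity can be verified numerically for the two-terminal joint pmf used there:

    # I(X1 ∧ X2) = H(X1, X2) - [H(X1|X2) + H(X2|X1)] = H(X1, X2) - R_min
    H12 = entropy(joint.ravel())
    H1 = entropy(joint.sum(axis=1))
    H2 = entropy(joint.sum(axis=0))
    print(H1 + H2 - H12)        # mutual information I(X1 ∧ X2)
    print(H12 - R_min(joint))   # total shared CR rate minus smallest CO-rate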

  17. The Main Result
• SK-capacity [I. Csiszár - P. N., '02]: C_SK = H(X_1, ..., X_m) − smallest achievable CO-rate R_min, i.e., the smallest rate of communication which enables each terminal to reconstruct all m components of the multiple source.
• A single-letter characterization of R_min thus leads to the same for C_SK.
Remark: The source coding problem of determining the smallest achievable CO-rate R_min does not involve any secrecy constraints.

  18. Secret Key Capacity
Theorem [I. Csiszár - P. N., '02]: The SK-capacity C_SK for a set of terminals {1, ..., m} equals
    C_SK = H(X_1, ..., X_m) − R_min,
and can be achieved with noninteractive communication.
Proof:
Converse: follows from the Main Lemma.
Idea of achievability proof: If L represents ε-CR for the set of terminals, achievable with communication F for some block length n, then (1/n) H(L | F) is an achievable SK-rate if ε is small. With L ≅ (X_1^n, ..., X_m^n), we have
    (1/n) H(L | F) ≅ H(X_1, ..., X_m) − (1/n) H(F).
Remark: The SK-capacity is not increased by randomization at the terminals.
Case m = 2: C_SK = I(X_1 ∧ X_2).

  19. Example
[Figure: m terminals]
[I. Csiszár - P. N., '03]:
• X_1, ..., X_{m−1} are {0, 1}-valued, mutually independent, (1/2, 1/2) rvs, and
    X_mt = X_1t + · · · + X_{(m−1)t} mod 2,  t ≥ 1.
• Total rate of shared CR = H(X_1, ..., X_m) = H(X_1, ..., X_{m−1}) = m − 1 bits.
• R_min = ... = m(m − 2)/(m − 1) bits.
• C_SK = (m − 1) − m(m − 2)/(m − 1) = 1/(m − 1) bit.
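These closed forms can be checked numerically (an illustrative sketch assuming the entropy and R_min helpers from the earlier code): build the joint pmf of the m − 1 fair bits together with their mod-2 sum and compare.

    import itertools
    import numpy as np

    def parity_source(m):
        """Joint pmf of m-1 i.i.d. fair bits and their mod-2 sum, as an m-dimensional array."""
        joint = np.zeros((2,) * m)
        for bits in itertools.product((0, 1), repeat=m - 1):
            joint[bits + (sum(bits) % 2,)] = 2.0 ** -(m - 1)
        return joint

    m = 4
    joint_m = parity_source(m)
    print(R_min(joint_m), m * (m - 2) / (m - 1))                   # both ~ 8/3 bits for m = 4
    print(entropy(joint_m.ravel()) - R_min(joint_m), 1 / (m - 1))  # C_SK ~ 1/3 bit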

  20. Example – Scheme for Achievability
• Claim: 1 bit of perfect SK (i.e., with ε = 0) is achievable with observation length n = m − 1.
• Scheme with noninteractive communication:
  - Let n = m − 1.
  - For i = 1, ..., m − 1, terminal i transmits F_i = f_i(X_i^n) = the block X_i^n excluding X_ii.
  - Terminal m transmits F_m = f_m(X_m^n) = (X_m1 + X_m2 mod 2, X_m1 + X_m3 mod 2, ..., X_m1 + X_mn mod 2).
• Terminals 1, ..., m all recover (X_1^n, ..., X_m^n). (Omniscience)
• In particular, X_11 is independent of F = (F_1, ..., F_m).
• X_11 is an achievable perfect SK, so C_SK ≥ (1/(m − 1)) H(X_11) = 1/(m − 1) bit.
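The scheme is easy to simulate. The sketch below (an assumed numpy implementation, not the authors' code, with 0-indexed terminals and time slots) draws one block of the parity source, forms the transmissions F_1, ..., F_m, and checks that every terminal reconstructs the whole block, leaving X_11 as the 1-bit perfect key.

    import numpy as np

    rng = np.random.default_rng(0)
    m = 5
    n = m - 1

    # Source block: rows 0..m-2 are i.i.d. fair bits; row m-1 is their columnwise parity.
    X = np.zeros((m, n), dtype=int)
    X[: m - 1] = rng.integers(0, 2, size=(m - 1, n))
    X[m - 1] = X[: m - 1].sum(axis=0) % 2

    # Noninteractive public transmissions.
    F = {i: {t: X[i, t] for t in range(n) if t != i} for i in range(m - 1)}  # X_i^n minus X_ii
    F[m - 1] = [(X[m - 1, 0] + X[m - 1, t]) % 2 for t in range(1, n)]        # F_m

    def decode(j):
        """Reconstruct the full source block at terminal j from X[j] and F."""
        Y = np.full((m, n), -1, dtype=int)
        Y[j] = X[j]                                   # own observation
        for i in range(m - 1):
            for t, v in F[i].items():
                Y[i, t] = v                           # symbols revealed publicly
        if j < m - 1:
            # Column j of the user rows is fully known, giving X_{m,j};
            # F_m then determines the whole parity row.
            Y[m - 1, j] = Y[: m - 1, j].sum() % 2
            Y[m - 1, 0] = (Y[m - 1, j] + F[m - 1][j - 1]) % 2 if j > 0 else Y[m - 1, j]
            for t in range(1, n):
                Y[m - 1, t] = (Y[m - 1, 0] + F[m - 1][t - 1]) % 2
        # Each withheld diagonal entry X_{t,t} closes the parity of its column.
        for t in range(n):
            if Y[t, t] < 0:
                others = sum(Y[i, t] for i in range(m - 1) if i != t)
                Y[t, t] = (Y[m - 1, t] - others) % 2
        return Y

    assert all(np.array_equal(decode(j), X) for j in range(m))   # omniscience at every terminal
    print("secret key bit X_11:", X[0, 0])                       # independent of F by construction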

  21. Eavesdropper with Wiretapped Side Information
[Figure: users 1, ..., m; a wiretapper observing (Z_1, ..., Z_n)]
• The secrecy requirement now becomes (1/n) I(K ∧ F, Z^n) ≤ ε.
• The general problem of determining the "wiretap secret key" capacity C_WSK remains unsolved.

  22. Wiretapping of Noisy User Sources
The eavesdropper can wiretap noisy versions of some or all of the components of the underlying multiple source. Formally,
    Pr{ Z_1 = z_1, ..., Z_m = z_m | X_1 = x_1, ..., X_m = x_m } = Π_{i=1}^m Pr{ Z_i = z_i | X_i = x_i }.
Theorem [I. Csiszár - P. N., '03]: The WSK-capacity for a set of terminals {1, ..., m} equals
    C_WSK = H(X_1, ..., X_m, Z_1, ..., Z_m) − "revealed" entropy H(Z_1, ..., Z_m)
              − smallest achievable CO-rate for the user terminals when they additionally know (Z_1, ..., Z_m)
          = H(X_1, ..., X_m | Z_1, ..., Z_m) − R_min(Z_1, ..., Z_m),
provided that randomization is permitted at the user terminals.
Case m = 2: C_WSK = I(X_1 ∧ X_2 | Z_1, Z_2).
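For the two-user case, the closed form I(X_1 ∧ X_2 | Z_1, Z_2) can be evaluated directly from a joint pmf. The helper below is an illustrative sketch (Python with numpy; the pmf is assumed given as a 4-dimensional array indexed by (x1, x2, z1, z2)), not code from the talk.

    import numpy as np

    def cond_mutual_info(p):
        """I(X1 ∧ X2 | Z1, Z2) in bits for a pmf p[x1, x2, z1, z2]."""
        def H(q):
            q = q[q > 0]
            return -np.sum(q * np.log2(q))
        H_x1z = H(p.sum(axis=1).ravel())       # H(X1, Z1, Z2)
        H_x2z = H(p.sum(axis=0).ravel())       # H(X2, Z1, Z2)
        H_z   = H(p.sum(axis=(0, 1)).ravel())  # H(Z1, Z2)
        H_all = H(p.ravel())                   # H(X1, X2, Z1, Z2)
        return H_x1z + H_x2z - H_z - H_all     # I(X1;X2|Z1,Z2) via the chain rule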

  23. A Few Variants
