

1. Hypergraph-based Coding Schemes for Two Source Coding Problems under Maximal Distortion
Sourya Basu: CSL, ECE Dept., University of Illinois at Urbana-Champaign
Daewon Seo: ECE Dept., University of Wisconsin-Madison
Lav Varshney: CSL, ECE Dept., University of Illinois at Urbana-Champaign; Salesforce Research
This work was funded in part by the IBM-Illinois Center for Cognitive Computing Systems Research (C3SR), a research collaboration as part of the IBM AI Horizons Network.

2. Introduction
• Distributed coding for computing under maximal distortion: encoders $E_1$ and $E_2$ observe $X_1$ and $X_2$ and send messages $M_1$ and $M_2$ to a decoder $D$, which outputs $\hat{f}(X_1, X_2)$ subject to the maximal-distortion constraint $\|\hat{f}(X_1, X_2) - f(X_1, X_2)\| \le \epsilon$.

3. Introduction
• Distributed coding for computing: Berger-Tung achievable region. Let $(R_1, R_2) \in \mathcal{R}_{i,\epsilon}$ if and only if
$$R_1 > I(X_1; U_1 \mid U_2, Q),$$
$$R_2 > I(X_2; U_2 \mid U_1, Q),$$
$$R_1 + R_2 > I(X_1, X_2; U_1, U_2 \mid Q)$$
for some joint pmf $p(q)\,p(u_1 \mid x_1, q)\,p(u_2 \mid x_2, q)$ with $|\mathcal{Q}| \le 4$ and $|\mathcal{U}_j| \le |\mathcal{X}_j| + 4$ for $j = 1, 2$, and some function $\hat{f}(u_1, u_2)$ such that $\mathbb{E}[d_\epsilon(X_1, X_2, \hat{f}(U_1, U_2))] = 0$, where $d_\epsilon(x_1, x_2, z) = \mathbf{1}\{\|z - f(x_1, x_2)\| > \epsilon\}$.
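The rate bounds above are conditional mutual informations over finite alphabets, so once a joint pmf is fixed they can be evaluated numerically. A minimal sketch (illustrative only; the helper and its example pmf are not from the slides) computing $I(X; U \mid Q)$, where a conditioning pair such as $(U_2, Q)$ can be folded into a single composite variable:

```python
import math
from collections import defaultdict

def cond_mutual_info(pmf):
    """I(X; U | Q) in bits for a joint pmf given as {(x, u, q): prob}.
    To evaluate I(X1; U1 | U2, Q), let q stand for the pair (u2, q)."""
    p_q = defaultdict(float)
    p_xq = defaultdict(float)
    p_uq = defaultdict(float)
    for (x, u, q), p in pmf.items():
        p_q[q] += p
        p_xq[(x, q)] += p
        p_uq[(u, q)] += p
    total = 0.0
    for (x, u, q), p in pmf.items():
        if p > 0:
            total += p * math.log2(p * p_q[q] / (p_xq[(x, q)] * p_uq[(u, q)]))
    return total

# U = X with fair X and trivial Q: I(X; U | Q) = H(X) = 1 bit.
print(cond_mutual_info({(0, 0, 0): 0.5, (1, 1, 0): 0.5}))  # 1.0
```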

4. Introduction
• Successive refinement coding under maximal distortion: the source $X$ is encoded by $E_1$ and $E_2$ into messages $M_1$ and $M_2$; decoder $D_1$ outputs $\hat{X}_1$ with $\|\hat{X}_1 - X\| \le \epsilon_1$, and decoder $D_2$ outputs $\hat{X}_2$ with $\|\hat{X}_2 - X\| \le \epsilon_2$.

5. Introduction
• Successive refinement coding: known rate region. Let $(R_1, R_2) \in \mathcal{R}(\epsilon_1, \epsilon_2)$ if and only if
$$R_1 > I(X; \hat{X}_1),$$
$$R_1 + R_2 > I(X; \hat{X}_1, \hat{X}_2)$$
for some conditional pmf $p(\hat{x}_1, \hat{x}_2 \mid x)$ such that $\mathbb{E}[\mathbf{1}\{\|\hat{X}_i - X\| > \epsilon_i\}] = 0$ for $i \in \{1, 2\}$.

6. Motivation
• Not much is known about the auxiliary random variables involved, which makes it difficult to actually attain these regions in practice.
• Hence, this work is concerned with hypergraph-based auxiliary variables under maximal distortion, which make it easier to attain the rate regions.
W. Gu, "On achievable rate regions for source coding over networks," Ph.D. dissertation, California Institute of Technology, 2009.

7. Our contributions
• Distributed coding for computing: we provide a hypergraph-based coding scheme that outperforms existing hypergraph-based coding schemes.
• Successive refinement coding: we attain the entire rate region using a hypergraph-based coding scheme.

8. Previous work
Setting: a single encoder $E$ observes $X$, the decoder $D$ has side information $Y$, and $D$ outputs $\hat{f}(X, Y)$ with $\|\hat{f}(X, Y) - f(X, Y)\| \le \epsilon$.
• Optimal: $R_{\min} = \min I(W; X \mid Y)$ over $W - X - Y$ with $p \in \mathcal{Q}(0)$, where $\mathcal{Q}(0)$ is the set of all $p(w \mid x)$ such that there exists a $g : \mathcal{W} \times \mathcal{Y} \mapsto \mathcal{Z}$ satisfying $\mathbb{E}[\mathbf{1}\{\|f(X, Y) - g(W, Y)\| > \epsilon\}] = 0$.
• Practical codes: $R_{\min} = \min I(W; X \mid Y)$ over $W - X - Y$ with $X \in W \in \Gamma(G_\epsilon)$, where $\Gamma(G_\epsilon)$ is the set of all hyperedges of an $\epsilon$-characteristic hypergraph, which can be constructed from $f$, $X$, $Y$ as a part of the coding scheme.
S. Basu, D. Seo, and L. Varshney, "Functional Epsilon Entropy," in Proceedings of the IEEE Data Compression Conference, Snowbird, Utah, 24-27 March 2020.

9. Overview
• Problem settings
• Definitions
• Our rate regions
• Some special cases

10. Problem setting: distributed coding for computing
Encoders $E_1$ and $E_2$ observe $X_1^N$ and $X_2^N$ and send messages $M_1$ and $M_2$ to the decoder $D$, which outputs $\hat{Z}^N$ under the per-letter constraint $\|\hat{Z}_i - f(X_{1,i}, X_{2,i})\| \le \epsilon$.
• $\{X_{1,i}, X_{2,i}\}_{i=1}^N$ are $N$ i.i.d. random variable pairs, where $X_{j,i} \in \mathcal{X}_j$ for $j \in \{1, 2\}$ and $i \in \{1, \dots, N\}$, and the $\mathcal{X}_j$ are finite sets.
• Reconstruct $\{f(X_{1,i}, X_{2,i})\}_{i=1}^N$ as $\{\hat{Z}_i\}_{i=1}^N$ such that $P^{\mathrm{avg}}_\epsilon(\hat{Z}^N, X_1^N, X_2^N) \to 0$ as $N \to \infty$, where
$$P^{\mathrm{avg}}_\epsilon(\hat{Z}^N, X_1^N, X_2^N) = \frac{1}{N} \sum_{i=1}^N \Pr\left[\|\hat{Z}_i - f(X_{1,i}, X_{2,i})\| > \epsilon\right].$$
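The quantity $P^{\mathrm{avg}}_\epsilon$ has a simple empirical analogue for a single realization: the fraction of positions whose reconstruction misses by more than $\epsilon$. A sketch, assuming a scalar-valued $f$ and absolute-value distance (the helper and its toy data are illustrative, not from the slides):

```python
def p_avg_eps(z_hat, x1, x2, f, eps):
    """Empirical analogue of P_eps^avg for one realization: the fraction of
    positions i where |z_hat_i - f(x1_i, x2_i)| > eps (scalar-valued f)."""
    n = len(z_hat)
    return sum(abs(z - f(a, b)) > eps for z, a, b in zip(z_hat, x1, x2)) / n

# f = sum; only the last of the three positions misses by more than eps = 0.5.
print(p_avg_eps([0, 2, 3], [0, 1, 1], [0, 1, 1], lambda a, b: a + b, 0.5))
```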

11. Problem setting: successive refinement coding
• $\{X_i\}_{i=1}^N$ are $N$ i.i.d. random variables, where $X_i \in \mathcal{X}$ for $i \in \{1, \dots, N\}$ and $\mathcal{X}$ is a finite set.
• Decoder $D_j$ reconstructs $\{X_i\}_{i=1}^N$ as $\{\hat{X}_{j,i}\}_{i=1}^N$ for $j \in \{1, 2\}$ such that $P^{\mathrm{avg}}_{\epsilon_j}(\hat{X}_j^N, X^N) \to 0$ as $N \to \infty$, where
$$P^{\mathrm{avg}}_{\epsilon_j}(\hat{X}_j^N, X^N) = \frac{1}{N} \sum_{i=1}^N \Pr\left[\|\hat{X}_{j,i} - X_i\| > \epsilon_j\right].$$

12. Smallest enclosing circles
• For a set of points $S$, the circle with the smallest radius covering all the points in $S$ is called the smallest enclosing circle of $S$.
Images constructed using https://www.nayuki.io/page/smallest-enclosing-circle
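For the small planar point sets that arise in these coding schemes, the smallest enclosing circle can be found by brute force over candidate circles: every pair of points as a diameter, plus the circumcircle of every non-degenerate triple (Welzl's algorithm is the efficient alternative). A sketch, $O(n^4)$ but fine at this scale:

```python
import itertools
import math

def _circumcircle(a, b, c):
    # Circumcircle of three points; returns (center, radius), or None if collinear.
    (ax, ay), (bx, by), (cx, cy) = a, b, c
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)

def smallest_enclosing_circle(points):
    """Brute-force smallest enclosing circle; returns ((cx, cy), radius)."""
    pts = [tuple(map(float, p)) for p in points]
    if len(pts) == 1:
        return pts[0], 0.0
    # Candidate circles: diameter of every pair, circumcircle of every triple.
    candidates = []
    for a, b in itertools.combinations(pts, 2):
        center = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
        candidates.append((center, math.dist(a, b) / 2))
    for a, b, c in itertools.combinations(pts, 3):
        cc = _circumcircle(a, b, c)
        if cc:
            candidates.append(cc)
    # The true smallest enclosing circle is determined by 2 or 3 of the points,
    # so it is the smallest candidate that covers everything.
    best = None
    for center, r in candidates:
        if all(math.dist(center, p) <= r + 1e-9 for p in pts):
            if best is None or r < best[1]:
                best = (center, r)
    return best
```

Requires Python 3.8+ for `math.dist`.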

13. Distributed source coding: some definitions
Let $G_{\epsilon_i} = (\mathcal{X}_i, E_i)$ for $i \in \{1, 2\}$. A pair of hypergraphs $(G_{\epsilon_1}, G_{\epsilon_2})$ is called an $\epsilon$-achievable hypergraph pair with respect to a function $f : \mathcal{X}_1 \times \mathcal{X}_2 \mapsto \mathcal{Z}$ if, for any $w_1 \in E_1$ and $w_2 \in E_2$, the radius of the smallest enclosing circle of the set of points $\{f(x_1, x_2) : x_1 \in w_1,\ x_2 \in w_2,\ \text{and}\ p(x_1, x_2) > 0\}$ is less than or equal to $\epsilon$.
The $\epsilon$-characteristic hypergraph pair set $\mathcal{H}_\epsilon$ consists of all $\epsilon$-achievable hypergraph pairs with respect to a function $f$, i.e. $\mathcal{H}_\epsilon = \{(G_1, G_2) : (G_1, G_2)\ \text{is an}\ \epsilon\text{-achievable hypergraph pair}\}$.
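When $f$ is real-valued, the smallest enclosing "circle" of its value set is an interval of radius $(\max - \min)/2$, so checking $\epsilon$-achievability of a candidate hyperedge pair reduces to a range check. A sketch (the helper name and argument layout are my own, not from the paper):

```python
import itertools

def is_eps_achievable(edges1, edges2, f, support, eps):
    """Check whether two hypergraphs, given as lists of hyperedges (sets) over
    the two source alphabets, form an eps-achievable pair for a real-valued f.
    In one dimension the smallest enclosing circle of a point set is an
    interval of radius (max - min) / 2.  `support` holds the pairs (x1, x2)
    with p(x1, x2) > 0."""
    for w1, w2 in itertools.product(edges1, edges2):
        vals = [f(x1, x2) for x1 in w1 for x2 in w2 if (x1, x2) in support]
        if vals and (max(vals) - min(vals)) / 2 > eps:
            return False
    return True

# Binary sources, f(x1, x2) = x1 + x2: pairing the hyperedge {0, 1} with
# singletons keeps every value set within radius 0.5.
support = {(a, b) for a in (0, 1) for b in (0, 1)}
print(is_eps_achievable([{0, 1}], [{0}, {1}], lambda a, b: a + b, support, 0.5))  # True
```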

14. Distributed coding for computing: hypergraph-based coding rate region
Let $(R_1, R_2) \in \mathcal{R}_{\mathcal{H},\epsilon}$ if and only if
$$R_1 > I(X_1; W_1 \mid W_2, Q),$$
$$R_2 > I(X_2; W_2 \mid W_1, Q),$$
$$R_1 + R_2 > I(X_1, X_2; W_1, W_2 \mid Q)$$
for some joint pmf $p(q)\,p(w_1, w_2 \mid x_1, x_2, q)$ such that $W_1 - X_1 - X_2 - W_2$ form a Markov chain, and $X_1 \in W_1 \in \Gamma(G_{\epsilon_1})$ and $X_2 \in W_2 \in \Gamma(G_{\epsilon_2})$ for some $(G_{\epsilon_1}, G_{\epsilon_2}) \in \mathcal{H}_\epsilon$.
For $i \in \{1, 2\}$, $X_i$ induces a pmf over the vertices of the hypergraph $G_{\epsilon_i}$, and $W_i$ is obtained by defining the pmfs $p(w_i \mid x_i)$ over all hyperedges $w_i \in \Gamma(G_{\epsilon_i})$ that contain $x_i$, i.e. $p(w_i \mid x_i) \ge 0$ for all $x_i \in w_i \in \Gamma(G_{\epsilon_i})$ and $\sum_{w_i \ni x_i} p(w_i \mid x_i) = 1$.

15. Distributed source coding: main result
Theorem 1: The region $\mathcal{R}_{\mathcal{H},\epsilon}$ is achievable.
Theorem 2: The regions $\mathcal{R}_{\mathcal{H},\epsilon}$ and $\mathcal{R}_{i,\epsilon}$ match on the sum-rate bound, i.e. $\mathcal{R}^{\mathrm{sb}}_{\mathcal{H},\epsilon} = \mathcal{R}^{\mathrm{sb}}_{i,\epsilon}$.

16. Distributed source coding: special cases
Theorem 3: When $X_1$ is independent of $X_2$, $\mathcal{R}^{\mathrm{ind}}_{i,\epsilon} = \mathcal{R}^{\mathrm{ind}}_{\mathcal{H},\epsilon}$.
$\mathcal{R}^{\mathrm{ind}}_{i,\epsilon}$ is the set of $(R_1, R_2)$ such that
$$R_1 \ge I(X_1; U_1 \mid Q),$$
$$R_2 \ge I(X_2; U_2 \mid Q)$$
for some distributions $p(u_1 \mid x_1)$, $p(u_2 \mid x_2)$ and time-sharing random variable $Q$, such that $\mathbb{E}[d_\epsilon(\hat{f}(U_1, U_2), f(X_1, X_2))] = 0$ for some function $\hat{f}$.
$\mathcal{R}^{\mathrm{ind}}_{\mathcal{H},\epsilon}$ is the set of $(R_1, R_2)$ such that
$$R_1 \ge I(X_1; W_1 \mid Q),$$
$$R_2 \ge I(X_2; W_2 \mid Q)$$
for some distributions $p(w_1 \mid x_1)$, $p(w_2 \mid x_2)$ and time-sharing random variable $Q$, such that $X_i \in W_i \in \Gamma(G_{\epsilon_i})$ for $i \in \{1, 2\}$, for some $(G_{\epsilon_1}, G_{\epsilon_2}) \in \mathcal{H}_\epsilon$.

17. Distributed source coding: improvement in rate region
Example: Let $X_1, X_2$ be i.i.d. Bern(1/2) random variables. Encoder $i$ observes $X_i$ for $i \in \{1, 2\}$, and the decoder wants to compute the identity function $f(x_1, x_2) = (x_1, x_2)$ with $\epsilon = 0.5$.
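One way to see where slack can come from in this example is to enumerate which hyperedge pairs satisfy the $\epsilon = 0.5$ radius condition. The function values are corners of the unit square, and for such point sets the smallest enclosing circle's radius equals half the largest pairwise distance (a shortcut specific to these sets, not a general fact). A sketch of the enumeration (my own illustration, not the paper's computation):

```python
import itertools
import math

# Alphabets for the two binary sources; identity f(x1, x2) = (x1, x2); eps = 0.5.
X1, X2, eps = (0, 1), (0, 1), 0.5

def radius(points):
    # Function values here are corners of the unit square; for such point sets
    # the smallest enclosing circle has radius = half the largest pairwise
    # distance (this shortcut does NOT hold for arbitrary point sets).
    if len(points) == 1:
        return 0.0
    return max(math.dist(a, b) for a, b in itertools.combinations(points, 2)) / 2

def hyperedges(alph):
    # All nonempty subsets of an alphabet.
    return [set(s) for r in range(1, len(alph) + 1)
            for s in itertools.combinations(alph, r)]

# A hyperedge pair (w1, w2) qualifies if the enclosing radius of
# {f(x1, x2) : x1 in w1, x2 in w2} is at most eps.
admissible = [(w1, w2) for w1 in hyperedges(X1) for w2 in hyperedges(X2)
              if radius({(x1, x2) for x1 in w1 for x2 in w2}) <= eps]

for w1, w2 in admissible:
    print(sorted(w1), sorted(w2))
```

Every pair in which at least one hyperedge is a singleton qualifies, while $(\{0,1\}, \{0,1\})$ does not (radius $\sqrt{2}/2 > 0.5$): one encoder can use a coarse hyperedge whenever the other is precise.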

18. Distributed source coding: improvement in rate region
• The convex region COD corresponds to the hypergraph-based region in previous work.
• The convex region YABX corresponds to the hypergraph-based region in our work.
S. Feizi and M. Médard, "On network functional compression," IEEE Trans. Inf. Theory, vol. 60, no. 9, pp. 5387-5401, Sep. 2014.

19. Distributed source coding: improvement in rate region
Example: Consider the point-to-point source coding problem with no side information, with $X$ uniformly distributed over $\{0, 1, 2\}$, and take the function $f$ to be the identity function.
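To make this example concrete, one can compute the best hypergraph-based rate for an assumed tolerance; the slide does not fix $\epsilon$, so take $\epsilon = 0.5$ with distance $|z - x|$ purely for illustration. The admissible non-trivial hyperedges are then $\{0, 1\}$ and $\{1, 2\}$ (each has enclosing radius $1/2$). Covering $X = 0$ by $\{0, 1\}$, $X = 2$ by $\{1, 2\}$, and splitting $X = 1$ between them with probability $t$, the rate is $I(X; W)$, minimized here by grid search:

```python
import math

def h2(p):
    # Binary entropy in bits.
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mi(t):
    # I(X; W) = H(W) - H(W | X) with X uniform on {0, 1, 2}:
    # W = {0,1} w.p. (1 + t)/3 and W = {1,2} w.p. (2 - t)/3, while
    # H(W | X) = h2(t)/3 since only X = 1 randomizes its hyperedge.
    return h2((1 + t) / 3) - h2(t) / 3

best_t = min((i / 1000 for i in range(1001)), key=mi)
print(best_t, mi(best_t))  # minimum of 2/3 bit at t = 0.5
```

Under these assumptions the best rate is $2/3$ bit, versus $\log_2 3 \approx 1.585$ bits for lossless coding of $X$: the maximal-distortion relaxation buys nearly a bit.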

20. Distributed source coding: outline of proofs
Theorem 1: The region $\mathcal{R}_{\mathcal{H},\epsilon}$ is achievable.
• Proof: We use the Berger-Tung inner bound with $W_1 = U_1$ and $W_2 = U_2$, taking $\hat{f}(W_1, W_2)$ to be the center of the smallest enclosing circle containing all the points $\{f(x_1, x_2) : x_1 \in W_1, x_2 \in W_2\}$.
• Further, by definition of the $\epsilon$-achievable hypergraph pair set, we have $\mathbb{E}[d_\epsilon(X_1, X_2, \hat{f}(W_1, W_2))] = 0$.
• This completes the proof of achievability.

21. Distributed source coding: outline of proofs
Theorem 2: The regions $\mathcal{R}_{\mathcal{H},\epsilon}$ and $\mathcal{R}_{i,\epsilon}$ match on the sum-rate bound, i.e. $\mathcal{R}^{\mathrm{sb}}_{\mathcal{H},\epsilon} = \mathcal{R}^{\mathrm{sb}}_{i,\epsilon}$.
• Proof: $\mathcal{R}^{\mathrm{sb}}_{\mathcal{H},\epsilon} \ge \mathcal{R}^{\mathrm{sb}}_{i,\epsilon}$ is trivial. We need to show that $\mathcal{R}^{\mathrm{sb}}_{\mathcal{H},\epsilon} \le \mathcal{R}^{\mathrm{sb}}_{i,\epsilon}$.

22. Distributed source coding: outline of proofs
Theorem 2, continued.
• Idea: For every $U_1, U_2, \hat{f}$ satisfying the conditions in $\mathcal{R}^{\mathrm{sb}}_{i,\epsilon}$, we need to find corresponding $W_1, W_2$ that satisfy the conditions in $\mathcal{R}^{\mathrm{sb}}_{\mathcal{H},\epsilon}$.
