
Slepian-Wolf and Related Problems



  1. Slepian-Wolf and Related Problems
     BASiCS Group (Smartdust, TinyOS, Blackouts)
     http://basics.eecs.berkeley.edu/sensorwebs
     Julius Kusuma, Laboratory for Information and Decision Systems, Massachusetts Institute of Technology
     kusuma@mit.edu

  2. Outline of presentation
     - Information-theoretic motivation: achievable performance
     - Algorithmic components for distributed compression
     - Code constructions
     - Rate-distortion performance
     - Optimization of parameters
     - Deployment in sensor networks

  3. Distributed compression: basic ideas
     - Suppose X and Y are correlated as X = Y + N.
     - Y is available at the decoder but not at the X encoder.
     - How can we compress X at a rate close to H(X|Y)?
     - Key idea: discount I(X;Y), since H(X|Y) = H(X) - I(X;Y).
     - For now, assume X and Y are i.i.d. sequences.
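
The identity H(X|Y) = H(X) - I(X;Y) is easy to check numerically. Below is a minimal sketch (not from the slides) on an illustrative 2x2 joint pmf of my own choosing:

```python
import numpy as np

# Toy joint pmf of (X, Y): a binary source seen through a noisy channel.
# The numbers are illustrative, not from the presentation.
p_xy = np.array([[0.45, 0.05],   # rows: x in {0, 1}
                 [0.05, 0.45]])  # cols: y in {0, 1}

def H(p):
    """Entropy in bits of a pmf given as an array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_X  = H(p_xy.sum(axis=1))   # marginal entropy H(X)
H_Y  = H(p_xy.sum(axis=0))   # marginal entropy H(Y)
H_XY = H(p_xy)               # joint entropy H(X,Y)
I_XY = H_X + H_Y - H_XY      # mutual information I(X;Y)

# H(X|Y) computed two ways: chain rule, and H(X) - I(X;Y).
print(H_XY - H_Y)            # H(X|Y) = H(X,Y) - H(Y)
print(H_X - I_XY)            # H(X|Y) = H(X) - I(X;Y): same value
```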

  4. Binning argument
     - Make a main codebook of all typical X sequences: about 2^{nH(X)} elements (and similarly 2^{nH(Y)} for Y).
     - Partition the codebook into 2^{nH(X|Y)} bins.
     - On observing X^n, transmit only the index of the bin it belongs to.
     - The decoder finds the member of that bin that is jointly typical with Y^n.
     - This extends to "symmetric cases" where both sources are compressed; see the sketch below.
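
At a small block length the binning argument can be simulated exhaustively. The sketch below is a toy of my own construction: all length-12 binary sequences are binned at random, and the decoder uses minimum Hamming distance to Y^n as a stand-in for joint-typicality decoding. With the rate above H(X|Y), decoding errors are rare:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 12, 0.1                 # block length; X = Y xor N with N ~ Bern(p)
# H(X|Y) = h(p) ~= 0.47 bits/symbol; pick a rate comfortably above it.
n_bins = 2 ** 9                # R = 9/12 = 0.75 bits/symbol

# Random binning: every length-n sequence gets a random bin index.
all_x = np.arange(2 ** n)
bin_of = rng.integers(0, n_bins, size=2 ** n)

def hamming(a, b):
    return bin(a ^ b).count("1")

errors, trials = 0, 2000
for _ in range(trials):
    y = int(rng.integers(0, 2 ** n))
    noise = "".join("1" if rng.random() < p else "0" for _ in range(n))
    x = y ^ int(noise, 2)
    # Encoder sends only the bin index of x.
    members = all_x[bin_of == bin_of[x]]
    # Decoder: pick the bin member closest to y (proxy for joint typicality).
    x_hat = min(members, key=lambda m: hamming(int(m), y))
    errors += (x_hat != x)
print(f"error rate ~ {errors / trials:.3f}")
```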

  5. Symmetric case: joint binning
     - Bin both X and Y. The rates are limited by:
       R_x >= H(X|Y)
       R_y >= H(Y|X)
       R_x + R_y >= H(X,Y)
     - (Figure: the achievable rate region in the (R_x, R_y) plane, with the boundary marked at H(X|Y) and H(Y|X) and the sum-rate line R_x + R_y = H(X,Y).)

  6. Simple binary example
     - X and Y are length-3 binary strings, each equally likely.
     - Correlation: the Hamming distance between X and Y is at most 1.
       Example: when X = [0 1 0], Y is one of [0 1 0], [0 1 1], [0 0 0], [1 1 0].
     - System 1: Y available at both the encoder and the decoder; rate R >= H(X|Y).
       The difference X + Y (mod 2) is one of 000, 001, 010, 100, so 2 bits suffice to index it.
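
As a sanity check on the 2-bit claim, the relevant entropies for this example can be computed directly; a small sketch of my own (not from the slides):

```python
import itertools
import numpy as np

strings = list(itertools.product([0, 1], repeat=3))

# Joint pmf: X uniform over the 8 strings; given X, Y is uniform over
# the 4 strings within Hamming distance 1 of X (itself plus 3 bit-flips).
p = np.zeros((8, 8))
for i, x in enumerate(strings):
    for j, y in enumerate(strings):
        if sum(a != b for a, b in zip(x, y)) <= 1:
            p[i, j] = (1 / 8) * (1 / 4)

def H(q):
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

print(H(p.sum(axis=1)))         # H(X)   = 3 bits
print(H(p) - H(p.sum(axis=0)))  # H(X|Y) = 2 bits: the promised rate
```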

  7. System 2
     - Y is available at the decoder only; X and Y are correlated as before; rate R >= H(X|Y).
     - What is the best one can do? The answer is still 2 bits!
     - How? Group the 8 possible values of X into cosets such as Coset-1 = {000, 111}.

  8. Cosets of the repetition code
     Coset-1 = {000, 111}   Coset-2 = {001, 110}
     Coset-3 = {100, 011}   Coset-4 = {010, 101}
     - The encoder sends the index of the coset containing X.
     - The decoder reconstructs X within the given coset using Y.
     Notes:
     - Coset-1 is the (3,1) repetition code itself.
     - Each coset corresponds to a unique "syndrome".
     - This is DISCUS: DIstributed Source Coding Using Syndromes.
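
A minimal syndrome encoder/decoder for this example. The parity-check matrix H below is my choice (the slides do not spell one out); any full-rank parity-check matrix of the (3,1) repetition code yields the same coset partition:

```python
import itertools
import numpy as np

H = np.array([[1, 1, 0],
              [1, 0, 1]])  # parity-check matrix of the (3,1) repetition code

def encode(x):
    """2-bit syndrome = index of the coset containing x."""
    return tuple(H @ x % 2)

def decode(syndrome, y):
    """Return the unique coset member within Hamming distance 1 of y."""
    for cand in itertools.product([0, 1], repeat=3):
        cand = np.array(cand)
        if encode(cand) == syndrome and np.sum(cand != y) <= 1:
            return cand

# Exhaustive check over all (x, y) pairs with d_H(x, y) <= 1.
for x in itertools.product([0, 1], repeat=3):
    x = np.array(x)
    for y in itertools.product([0, 1], repeat=3):
        y = np.array(y)
        if np.sum(x != y) <= 1:
            assert np.array_equal(decode(encode(x), y), x)
print("2-bit syndromes recover X exactly for all valid (X, Y) pairs")
```

Decoding is unambiguous because the two members of each coset are complements: if d_H(x, y) <= 1, the other member is at distance >= 2 from y.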

  9. Group interpretation of "binning"
     Rules of thumb:
     1. Want a high density of elements in the codebook.
     2. Want the members of each bin as far apart as possible.
     Consider error-correcting codes!
     - A code selects a (normal) subgroup of the set of all possible elements; the members of a subgroup are as far apart as possible.
     - An error occurs when the distance between the side information and the main information exceeds d_min.
     - Example: the (3,1) repetition code can compress whenever d_H(X,Y) < 2.
       Cosets: coset-00 = {000, 111}, coset-10 = {001, 110}, coset-01 = {010, 101}, coset-11 = {100, 011}.

  10. Intuition behind source coding with side information
      Why does it not matter if the encoder doesn't have Y?
      Case I: Y present at both ends, X = Y + N.
      - X = Y + N, where N is Gaussian (note: X and Y themselves need not be Gaussian).
      - Subtract Y, quantize only N to get N-hat (quantization error Q, variance sigma_q^2), and add Y back at the decoder: X-hat = Y + N-hat.
      - Transmission rate:
        I(N; N-hat) = h(N-hat) - h(N-hat | N) = (1/2) log2( (sigma_N^2 + sigma_q^2) / sigma_q^2 )

  11. Intuition (contd.)
      Case II: Y present at the decoder only, X = Y + N.
      - Quantize X at the same rate to get W = X + Q; the decoder subtracts and adds Y back to form X-hat.
      - Transmission rate:
        I(W; X) - I(W; Y) = h(W|Y) - h(W|X) = h(N + Q) - h(Q) = (1/2) log2( (sigma_N^2 + sigma_q^2) / sigma_q^2 )
      - This is the same rate as in Case I: nothing is lost by not having Y at the encoder.
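
Modeling the quantization error as an independent Gaussian Q of variance sigma_q^2 (a common high-rate assumption, not stated on the slides), the two rates can be written out with Gaussian differential entropies and compared; a small sketch:

```python
import numpy as np

def h_gauss(var):
    """Differential entropy (bits) of a Gaussian with the given variance."""
    return 0.5 * np.log2(2 * np.pi * np.e * var)

def rate_case1(var_n, var_q):
    # Case I: quantize N directly. R = I(N; N+Q) = h(N+Q) - h(Q).
    return h_gauss(var_n + var_q) - h_gauss(var_q)

def rate_case2(var_n, var_q):
    # Case II: quantize X to W = X + Q, then bin.
    # R = I(W;X) - I(W;Y) = h(W|Y) - h(W|X) = h(N+Q) - h(Q).
    return h_gauss(var_n + var_q) - h_gauss(var_q)

for var_q in [0.01, 0.1, 1.0]:
    r1, r2 = rate_case1(1.0, var_q), rate_case2(1.0, var_q)
    print(f"var_q={var_q}: Case I {r1:.3f} bits, Case II {r2:.3f} bits")
# The rates coincide: having Y at the encoder buys nothing here.
```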

  12. Geometric interpretation
      (Figure: two sphere-packing sketches. Case I: Y at both sides. Case II: Y at decoder only. The pictures compare the cell radii, on the order of sqrt(n) times sigma_q.)

  13. Sending the difference telepathically
      - Jump ahead to a real-world example (via CNN):
        X = temperature in Boston, Y = temperature in Providence.
      - Suppose we can bound the difference: most of the time it is less than 8 degrees.
      - If Boston knows Providence's reading, it can just send the difference.
      - But this means the information Y must be available at both Boston and Providence,
        and establishing such a communication link is expensive in a sensor network!

  14. Motivations for sensor networks
      - A dense sensor network has high spatial redundancy.
      - We need to remove this redundancy without inter-node communication.
      - Assume the statistical correlation between neighboring nodes is known or learned.

  15. Consider the following idea
      - Quantization cells 0-7 are labeled A B C D A B C D; X and Y differ by at most 1 cell.
      - Send only the index of the "coset" (A, B, C, or D): we have compressed from 3 bits to 2 bits.
      - The decoder decides which member of the coset is the correct answer, using Y.
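
This scheme is a few lines of code. A minimal sketch (function names are mine): the encoder sends the cell index mod 4, and the decoder picks the coset member nearest Y's cell:

```python
def encode(x_cell):
    """Map cell 0..7 to its coset label 0..3 (A, B, C, D): 2 bits."""
    return x_cell % 4

def decode(label, y_cell):
    """Pick the coset member (label or label + 4) closest to Y's cell."""
    return min((label, label + 4), key=lambda c: abs(c - y_cell))

# Exhaustive check: correct whenever |X - Y| <= 1 cell.
for x in range(8):
    for y in range(max(0, x - 1), min(7, x + 1) + 1):
        assert decode(encode(x), y) == x
```

The check passes because the two cells sharing a label are 4 apart, while Y is within 1 cell of X, so the wrong candidate is always at least 3 cells from Y.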

  16. Coding operation of "binning"
      - Performance is determined by the choice of the main group and the subgroup.
      - Tradeoff:
        Quantization error is determined by the main group: want the main group to be dense.
        Coset error is determined by the subgroup: want the intra-coset distance to be as large as possible.

  17. Gentle intro to groups and codes
      - Key idea: an algebraic code is a subgroup of a (discrete) signal set.
      - For example, the (7,4) Hamming code is a subgroup of {0,1}^7.
      - Therefore a code induces a (geometrically uniform) partition into cosets!
      - We develop several examples in the following; see the sketch below.
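
The subgroup claim is easy to verify computationally. A sketch using a standard systematic generator matrix for the (7,4) Hamming code (one common choice; the slides do not specify a matrix):

```python
import itertools
import numpy as np

# Systematic generator matrix G = [I_4 | P] of a (7,4) Hamming code.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

msgs = itertools.product([0, 1], repeat=4)
code = {tuple((np.array(m) @ G) % 2) for m in msgs}   # all 16 codewords

# Subgroup check: closed under componentwise XOR, the group operation on {0,1}^7.
assert all(tuple(np.bitwise_xor(a, b)) in code
           for a in code for b in code)

# The 16-element subgroup partitions {0,1}^7 into 2^7 / 2^4 = 8 cosets.
words = itertools.product([0, 1], repeat=7)
cosets = {frozenset(tuple(np.bitwise_xor(w, c)) for c in code) for w in words}
print(len(code), len(cosets))   # 16 codewords, 8 cosets
```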

  18. Partitioning a scalar quantizer
      - Start with a scalar quantizer.
      - Partition its cells into PAM signal sets:
        cells 0 1 2 3 4 5 6 7 labeled A B C D A B C D.
      - Call this SQ-PAM.

  19. Better idea: using TCM codes
      - Objective of algebraic codes: sphere packing, the densest packing for a given distance and rate.
      - Use a TCM (trellis-coded modulation) code to partition Z^L.
      - (Figure: a convolutional coset selector with generator matrix G whose entries are polynomials in the delay operator D.)

  20. And yet better!
      - We can also induce a partition on the codes themselves by choosing subcodes.
      - (Figure: two cascaded generators G_1 and G_2 feeding a coset selector.)
      - This is called TCQ-TCM.

  21. Alternative representation
      - A subspace of a code is itself a subgroup of the code.
      - (Figure: two copies of the generator G.) Note: send the LSBs of the codewords!

  22. SQ-PAM: scalar quantization, pulse-amplitude modulation
      - Back to the previous example: cells 0-7 labeled A B C D A B C D; the letters index the different cosets of a PAM code.
      - Start with scalar quantization. The encoder calculates the index of the bin and transmits it.
      - The decoder receives the bin index and uses the correlated reading Y to determine which member of the bin is correct.
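
Putting the whole SQ-PAM chain together on real-valued readings. This is a minimal sketch under my own assumptions (step size, 4 cosets, midpoint reconstruction), not the authors' implementation:

```python
import numpy as np

DELTA = 1.0      # quantizer step (assumed)
N_COSETS = 4     # labels A, B, C, D -> coset period d* = 4 * DELTA

def encode(x):
    """Quantize x, then send only the 2-bit coset label of its cell."""
    return int(np.floor(x / DELTA)) % N_COSETS

def decode(label, y):
    """Among cells carrying this label near y, return the midpoint closest to y."""
    k0 = int(np.floor(y / DELTA))
    candidates = [k for k in range(k0 - 2 * N_COSETS, k0 + 2 * N_COSETS + 1)
                  if k % N_COSETS == label]
    k = min(candidates, key=lambda k: abs((k + 0.5) * DELTA - y))
    return (k + 0.5) * DELTA

rng = np.random.default_rng(2)
y = rng.normal(0.0, 10.0)        # side information at the decoder
x = y + rng.normal(0.0, 0.5)     # |X - Y| small relative to d* = 4
x_hat = decode(encode(x), y)
print(x, x_hat, abs(x - x_hat) <= DELTA / 2)   # within half a cell of x
```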

  23. Observation: quantization indifference
      - Important note: the quantizer cannot differentiate x from x + i*d* for integer i, i.e., it cannot tell apart the members of a coset.
      - Therefore we must combine the statistics of the members of each bin (..., -2d*, -d*, 0, d*, 2d*, ...).
      - Use PDF periodization: fold the source density with period d*,
        f~_X(x) = sum_i f_X(x + i*d*),  for x in [-d*/2, d*/2).
      - Design the quantizer using f~_X(x).
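
The periodized density is straightforward to compute numerically. A sketch assuming a Gaussian source density (the slide does not fix f_X):

```python
import numpy as np

def f_gauss(x, sigma=1.0):
    """Gaussian source density; an assumed choice of f_X."""
    return np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

def periodized_pdf(x, d_star, terms=100):
    """f~(x) = sum_i f(x + i*d*), evaluated for x in [-d*/2, d*/2)."""
    return sum(f_gauss(x + i * d_star) for i in range(-terms, terms + 1))

d_star = 2.0
x = np.linspace(-d_star / 2, d_star / 2, 1001)
f_tilde = periodized_pdf(x, d_star)

# f~ is a valid pdf on one period: it integrates to 1.
print(np.trapz(f_tilde, x))   # ~1.0
```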

  24. Caveats: the choice of d*
      - If d* is too small: high coset error.
      - If d* is too large: high quantization error.
      - (Figure: two source PDFs f(x) overlaid with coset spacings 0, d*, 2d*, 3d*, 4d*, illustrating the two regimes.)
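
The tradeoff can be made concrete with two standard approximations (my assumptions, not the slides'): high-rate uniform-quantizer MSE of Delta^2/12 with Delta = d*/4 for four cosets, and a coset error whenever the Gaussian difference N exceeds d*/2 in magnitude:

```python
import math

sigma_n = 1.0                       # std of the difference N = X - Y
for d_star in [1.0, 2.0, 4.0, 8.0]:
    delta = d_star / 4              # 4 cosets (A..D) per period d*
    mse_q = delta**2 / 12           # high-rate quantization MSE
    # Coset error: decoder picks the wrong coset member when |N| > d*/2.
    p_coset = math.erfc(d_star / (2 * sigma_n * math.sqrt(2)))
    print(f"d*={d_star}: quantization MSE {mse_q:.4f}, "
          f"coset-error prob {p_coset:.2e}")
```

Small d* gives low quantization error but frequent coset errors; large d* gives the reverse, which is exactly the tension the slide describes.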
