
DISCUS: Distributed Compression for Sensor Networks - PowerPoint PPT Presentation



  1. DISCUS: Distributed Compression for Sensor Networks. K. Ramchandran, K. Pister, M. Jordan, J. Malik, V. Anantharam, L. Doherty, J. Hill, J. Kusuma, W. Leven, A. Ng, S. Pradhan, R. Szewczyk, J. Yeh. http://basics.eecs.berkeley.edu/sensorwebs. BASiCS Group, University of California at Berkeley.

  2. Motivations
  • SmartDust and MEMS: a quantum leap in device and computing technology
  • An excellent platform for efficient, smart devices and for collaborative, distributed theory and algorithms
  • Bring a unified theoretical approach to specific scenarios
  • Sensor networks -> correlated observations: we want to compress them in an efficient, distributed manner

  3. Mission Statement
  • An experimental testbed to leverage advances in SmartDust MEMS technology and demonstrate the developed theories and algorithms: the BCSN (Berkeley Campus Sensor Network)
  [Figure: correlated information]

  4. Scenario: Correlated Observations
  • A varying temperature field
  • Information theory: correlation can yield a performance boost
  • DISCUS: a constructive way of harnessing correlation for blind joint compression close to the optimum
  [Figure: motes observing the field and reporting to a central decoder]

  5. At the End of the Day
  • Want to compress correlated observations
  • Don't want communication between the motes
  [Figure: Source 1 -> Encoder 1 (rate R_x) and Source 2 -> Encoder 2 (rate R_y), decoded jointly; achievable rate region with corner points (H(x), H(y|x)) and (H(y), H(x|y))]

  6. Distributed Compression
  • Slepian-Wolf and Wyner-Ziv (ca. 1970s): information-theoretically, correlated observations can be compressed jointly even with no communication between the motes
  • DISCUS: a constructive framework, using well-studied error-correcting codes from coding theory
  [Figure: the same two-encoder / joint-decoder diagram and rate region as on the previous slide]
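For reference, the Slepian-Wolf achievable region sketched on these slides is the standard one (a textbook statement, not recovered from the figure):

\[
R_x \ge H(X \mid Y), \qquad R_y \ge H(Y \mid X), \qquad R_x + R_y \ge H(X, Y).
\]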

  7. Source Coding with Side Information (Discrete Alphabets)
  • X and Y are length-3 binary words, all values equally likely
  • Correlation: the Hamming distance between X and Y is at most 1. Example: when X = [0 1 0], Y ∈ { [0 1 0], [0 1 1], [0 0 0], [1 1 0] }
  • System 1: X and Y correlated, Y available at both the encoder and the decoder; a rate R ≥ H(X|Y) suffices. Here X+Y ∈ { 000, 001, 010, 100 }, so 2 bits are needed to index it.
  [Figure: X -> Encoder -> Decoder -> X̂ = X, with side information Y at both ends]
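Making the "2 bits" count explicit (our addition, assuming, as the slide's "equally likely" suggests, that the four difference patterns are equiprobable given Y):

\[
H(X \mid Y) = H(X \oplus Y \mid Y) = \log_2 4 = 2 \ \text{bits},
\qquad X \oplus Y \in \{000, 001, 010, 100\}.
\]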

  8. System 2
  • X and Y correlated, but Y is available only at the decoder
  • What is the best that one can do?
  • The answer is still R ≥ H(X|Y) = 2 bits! How? Group the eight values of X into cosets, starting with Coset-1 = { 000, 111 }
  [Figure: X -> Encoder -> Decoder -> X̂ = X, with Y only at the decoder; the eight 3-bit words, with 000 and 111 circled as Coset-1]

  9. The four cosets:
  • Coset-1 = { 000, 111 }, Coset-2 = { 001, 110 }, Coset-3 = { 010, 101 }, Coset-4 = { 100, 011 }
  • Encoder -> sends the index of the coset containing X
  • Decoder -> recovers X as the member of that coset closest to Y (a sketch follows below)
  • Note: Coset-1 is the repetition code, and each coset has a unique "syndrome": DIstributed Source Coding Using Syndromes
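A minimal Python sketch of the slides-7-to-9 scheme (the code and function names are ours, not from the talk; standard library only): the encoder transmits the 2-bit syndrome of the length-3 repetition code, and the decoder picks the coset member closest to the side information Y.

    from itertools import product

    H = [(1, 1, 0), (0, 1, 1)]  # parity-check matrix of the {000, 111} repetition code

    def syndrome(x):
        # 2-bit coset index: s = H x^T (mod 2)
        return tuple(sum(a * b for a, b in zip(row, x)) % 2 for row in H)

    def encode(x):
        return syndrome(x)  # transmit 2 bits instead of 3

    def decode(s, y):
        # the member of coset s closest in Hamming distance to the side info y
        coset = [x for x in product((0, 1), repeat=3) if syndrome(x) == s]
        return min(coset, key=lambda x: sum(a != b for a, b in zip(x, y)))

    # exact recovery whenever X and Y are within Hamming distance 1
    for x in product((0, 1), repeat=3):
        for y in product((0, 1), repeat=3):
            if sum(a != b for a, b in zip(x, y)) <= 1:
                assert decode(encode(x), y) == x

Recovery is exact because the two members of any coset differ by 111, i.e. in all three positions, so at most one of them can lie within Hamming distance 1 of Y.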

  10. Discrete Case: a Hand-Waving Argument
  • Consider an abstraction using lattices
  • The correlation structure translates to a distance between X and Y
  • Suppose X and Y are at distance at most 1
  • How to compress X if we know that Y is close to X? Use cosets!
  • Partition the lattice into cosets and send only the coset index

  11. A Little Accounting
  • How much have we compressed?
  • Without compression: log2 36 ≈ 5.17 bits
  • With compression: log2 9 ≈ 3.17 bits
  [Figure: 36-point lattice with the points X and Y marked]
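Making the slide's count explicit (the 36 lattice points and 9 cosets are read off the slide's figure):

\[
\log_2 36 \approx 5.17 \ \text{bits} \;\to\; \log_2 9 \approx 3.17 \ \text{bits},
\qquad \text{a saving of } \log_2 \tfrac{36}{9} = 2 \ \text{bits per sample}.
\]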

  12. And Now the Math Cleanup!
  • Well-studied channel codes can be used to build codes on Euclidean space
  • Basic idea: select points as far apart as possible (i.e. cosets); this is done using channel codes
  • Send ambiguous information; the side information disambiguates it
  • Hamming, TCM, RS, and other codes can all be used (see the sketch below)
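As one concrete instance of "use Hamming codes", a hedged sketch with the standard (7,4) Hamming parity-check matrix (the code below is illustrative, not from the talk): a 7-bit reading X compresses to its 3-bit syndrome, and side information Y within Hamming distance 1 recovers X exactly.

    import numpy as np

    # columns of H are the binary expansions of 1..7 (MSB in the first row)
    H = np.array([[0, 0, 0, 1, 1, 1, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [1, 0, 1, 0, 1, 0, 1]])

    def encode(x):
        return H @ x % 2  # 3-bit syndrome = the coset index of x

    def decode(s, y):
        # H(x + y) = s + H y (mod 2) is the syndrome of the difference x ^ y;
        # with at most one differing bit, it points at the flipped position
        e = (s + H @ y) % 2
        x = y.copy()
        if e.any():
            x[int("".join(map(str, e)), 2) - 1] ^= 1
        return x

    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, 7)
    y = x.copy()
    y[rng.integers(7)] ^= 1  # side information within Hamming distance 1
    assert (decode(encode(x), y) == x).all()

The same syndrome trick channel decoders use for error correction here removes the redundancy that the correlation with Y already supplies.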

  13. Generalized Coset Codes: Symmetric and Asymmetric Applications
  • How to compress to arbitrary (but information-theoretically sound) rate pairs?
  • Use the lattice idea again: both encoders send ambiguous information, but only one true answer satisfies the correlation constraint
  [Figure: hexagonal lattice with separate coset partitions for Encoder-1 and Encoder-2]

  14. Continuous Case: Quantization and Estimation
  • Modular boxes: Quantizer -> Discrete Encoder -> Discrete Decoder -> Estimator

  15. Gaussian Case Example
  • Estimation with side information:
    - First quantize the sensor reading
    - Use the discrete DISCUS encoder / decoder
    - Estimate X from the decoder output and the side information: X̂ = E[X | Y, X ∈ Γ_i] (a sketch follows below)
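A minimal scalar sketch of the modular pipeline on slides 14 and 15, assuming Y = X + N with small noise; the step size, the modulus M, and the bin-center estimate are our illustrative choices, and a full DISCUS estimator would output the conditional mean E[X | Y, X ∈ Γ_i] rather than the bin center.

    import numpy as np

    step = 0.5  # quantizer step; a finer step needs a stronger code (slide 16)
    M = 4       # coset modulus: transmit log2(M) = 2 bits per sample

    def encode(x):
        q = int(np.round(x / step))  # quantize to an integer bin index
        return q % M                 # send only the bin index mod M

    def decode(c, y):
        # the bin congruent to c (mod M) whose center is closest to y
        q_y = int(np.round(y / step))
        up = q_y + (c - q_y) % M     # smallest coset bin at or above q_y
        q_hat = min((up - M, up), key=lambda q: abs(q * step - y))
        return q_hat * step          # crude estimate: the bin center

    rng = np.random.default_rng(1)
    x = rng.normal(0.0, 1.0)
    y = x + rng.normal(0.0, 0.1)     # strongly correlated side information
    print(x, decode(encode(x), y))

Bins in the same coset sit M * step apart, so as long as the correlation noise stays well below half that spacing, the decoder lands in the correct bin.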

  16. No Such Thing as a Free Lunch
  • Big issue: finer quantization makes the correlation noise span more quantization bins, i.e. larger error distances in code symbols, so a stronger code is needed to enable finer quantization

  17. Deployment Scenario
  • Observe the readings; learn the correlation
  • Use a clustering algorithm to group the motes (a sketch follows below)
  • Assign codebooks to the different motes
  • Can rotate the "centroid" mote within each group
  • The centroid mote reports to the central decoder
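The slide does not name the clustering method; as one plausible instantiation (all names and parameters below are ours), a hedged sketch that groups motes by 1-D k-means on their mean readings and picks the mote nearest each cluster mean as the reporting "centroid":

    import numpy as np

    def cluster_motes(readings, k, iters=20, seed=0):
        # readings: (n_motes, n_samples); cluster motes by their mean reading
        feats = readings.mean(axis=1)
        rng = np.random.default_rng(seed)
        centers = feats[rng.choice(len(feats), k, replace=False)]
        for _ in range(iters):
            labels = np.argmin(np.abs(feats[:, None] - centers[None, :]), axis=1)
            for j in range(k):  # keep the old center if a cluster empties out
                if (labels == j).any():
                    centers[j] = feats[labels == j].mean()
        # the "centroid" mote of each cluster reports to the central decoder
        centroids = [int(np.argmin(np.abs(feats - c))) for c in centers]
        return labels, centroids

    readings = np.random.default_rng(2).normal(0.0, 1.0, (10, 100))
    print(cluster_motes(readings, k=3))

Each cluster would then share a codebook, and the centroid role can be rotated among the cluster's motes as the slide suggests.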

  18. Dynamic Tracking Scenario
  • Wish to update the clustering dynamically
  • Good news: the child nodes need not be aware, so codebook assignment is not an issue!
  • Only the central decoder needs to know the clustering
  • Can also rotate the centroid node within each cluster, which helps detect correlation changes

  19. Conclusions
  • Efficient error-correcting codes enable distributed compression
  • Power/bandwidth/quality tradeoff through quantization and codebook selection
  • Very little work for the encoders!
  • Tremendous gains in a sensor network

