Data-Driven Ensembles for Deep and Hard-Decision Hybrid Decoding


  1. Data-Driven Ensembles for Deep and Hard-Decision Hybrid Decoding. Tomer Raviv, Nir Raviv, Yair Be’ery, School of Electrical Engineering, Tel Aviv University, Israel. International Symposium on Information Theory, June 2020

  2. Outline I. Background II. On the Importance of Data III. Data-Driven Ensembles IV. Summary

  3. Outline I. Background II. On the Importance of Data III. Data-Driven Ensembles IV. Summary

  4. Error Correction Codes increase the reliability of transmission by adding redundancy. Assumptions: linear codes, AWGN channel, Galois field GF(2). Encoding: a k-bit message u is mapped to an n-bit codeword c = uG with a generator matrix G.
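
Since the slide's encoding details were lost in extraction, here is a minimal sketch of linear encoding over GF(2), using a toy (7,4) generator matrix that is purely illustrative and not a code from the talk:

```python
import numpy as np

# Toy (7,4) systematic generator matrix G = [I | P]; illustrative only, not from the talk.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

def encode(u, G):
    """Encode a k-bit message u into the n-bit codeword c = u G over GF(2)."""
    return (u @ G) % 2

u = np.array([1, 0, 1, 1], dtype=np.uint8)
c = encode(u, G)
print(c)          # -> [1 0 1 1 0 1 0]
```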

  5. Simplified Communication System

  6. Parity Check Matrix: a defining property of linear codes, known as the parity check matrix H, of size (n-k) x n. Each row corresponds to a check constraint; each column corresponds to a variable.
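
Continuing the toy example above, a brief sketch of how the parity check matrix is used: H has one row per check and one column per variable, and H c^T = 0 over GF(2) for every codeword.

```python
import numpy as np

# Parity check matrix of size (n-k) x n for the toy code above: H = [P^T | I].
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]], dtype=np.uint8)

def syndrome(word, H):
    """Each syndrome entry is one check constraint; an all-zero syndrome means a valid codeword."""
    return (H @ word) % 2

c = np.array([1, 0, 1, 1, 0, 1, 0], dtype=np.uint8)   # codeword from the encoding sketch
print(syndrome(c, H))                                 # -> [0 0 0]
c_err = c.copy()
c_err[2] ^= 1                                         # flip one bit
print(syndrome(c_err, H))                             # nonzero syndrome flags the error
```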

  7. Tanner Graph [1] Urbanke and Richardson, "Modern Coding Theory", 2008

  8.-15. Belief Propagation: iterative message passing between variable and check nodes on the Tanner graph, shown step by step across these slides. [2] Judea Pearl, "Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference", 1988

  16. Belief Propagation. Hard decision rule: decide c_v = 0 if the final LLR L_v >= 0, and c_v = 1 otherwise. [2] Judea Pearl, "Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference", 1988
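
The BP slides are graphical; purely as an illustration (not the talk's own code), here is a compact log-domain sum-product BP sketch over the toy parity check matrix from before, ending with the hard decision rule of slide 16:

```python
import numpy as np

def bp_decode(llr, H, n_iter=5):
    """Minimal log-domain sum-product BP (flooding schedule) followed by the hard decision."""
    mask = H.astype(bool)
    m, n = H.shape
    v2c = np.where(mask, llr, 0.0)                 # variable-to-check messages, init with channel LLRs
    c2v = np.zeros((m, n))
    for _ in range(n_iter):
        # check-node update: tanh rule, excluding the target edge via division
        t = np.tanh(np.clip(v2c, -20, 20) / 2.0)
        t = np.where(mask, np.where(t == 0.0, 1e-12, t), 1.0)
        ext = np.prod(t, axis=1, keepdims=True) / t
        c2v = 2.0 * np.arctanh(np.clip(ext, -1 + 1e-12, 1 - 1e-12)) * mask
        # variable-node update: channel LLR plus incoming check messages, excluding the target check
        v2c = np.where(mask, llr + c2v.sum(axis=0) - c2v, 0.0)
    marginal = llr + c2v.sum(axis=0)
    return (marginal < 0).astype(np.uint8)         # hard decision: bit = 1 iff final LLR < 0

# Usage with the toy c and H from the sketches above: BPSK over AWGN, LLR = 2y / sigma^2.
sigma = 0.5
y = (1.0 - 2.0 * c) + sigma * np.random.randn(len(c))
print(bp_decode(2.0 * y / sigma**2, H))            # ideally recovers c
```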

  17.-18. Suboptimal Performance. Cycles in the Tanner graph degrade the performance of BP… How to compensate for cycles? Deep learning!

  19. Weighted Belief Propagation. Deep learning: • assign weights over the edges • choose an appropriate loss • apply SGD. Intuition: adjusting the weights compensates for small cycles. [3] Nachmani et al., "Learning to Decode Linear Codes Using Deep Learning", 2016
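
A rough PyTorch-style sketch of the idea in [3], with names and structural details of my own choosing: BP iterations are unrolled into layers and every Tanner-graph edge gets a trainable weight in the variable-to-check update, which SGD can then adjust to compensate for short cycles.

```python
import torch
import torch.nn as nn

class WeightedBPDecoder(nn.Module):
    """Unrolled BP with one trainable weight per edge and per iteration (sketch of [3])."""

    def __init__(self, H, n_iter=5):
        super().__init__()
        self.n_iter = n_iter
        self.register_buffer("mask", torch.tensor(H, dtype=torch.float32))
        self.w = nn.Parameter(torch.ones(n_iter, *self.mask.shape))   # edge weights per iteration

    def forward(self, llr):                                  # llr: (batch, n) channel LLRs
        batch = llr.size(0)
        m, n = self.mask.shape
        c2v = torch.zeros(batch, m, n, device=llr.device)
        soft_outputs = []
        for t in range(self.n_iter):
            # weighted variable-to-check update (extrinsic: exclude the target check)
            v2c = self.w[t] * self.mask * (llr.unsqueeze(1) + c2v.sum(dim=1, keepdim=True) - c2v)
            # check-to-variable update: tanh rule, excluding the target variable
            th = torch.tanh(torch.clamp(v2c, -10, 10) / 2.0)
            th = torch.where(self.mask.bool(), th, torch.ones_like(th))
            th = torch.where(th == 0, torch.full_like(th, 1e-12), th)
            ext = th.prod(dim=2, keepdim=True) / th
            c2v = self.mask * 2.0 * torch.atanh(torch.clamp(ext, -1 + 1e-6, 1 - 1e-6))
            soft_outputs.append(llr + c2v.sum(dim=1))        # per-iteration soft output (LLRs)
        return soft_outputs
```

The weights are initialized to one, so the untrained network reproduces plain BP and training only has to deviate where cycles hurt.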

  20. WBP [3] Nachmani et al., "Learning to Decode Linear Codes Using Deep Learning", 2016

  21. WBP. Binary cross-entropy multiloss: the BCE is computed on the soft output of every BP iteration and the terms are summed. The BP decoder and the channel exhibit symmetry, so performance is independent of the transmitted codeword and training is done with the all-zero codeword. [3] Nachmani et al., "Learning to Decode Linear Codes Using Deep Learning", 2016
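
A sketch of the corresponding training loop (all names are mine), reusing the WeightedBPDecoder sketched above: by the symmetry argument, the all-zero codeword serves as the label for every sample, and the multiloss sums the binary cross-entropy over the soft output of each unrolled iteration.

```python
import torch
import torch.nn.functional as F

def multiloss(soft_outputs, targets):
    # sum of per-iteration BCEs; sigmoid(-LLR) is P(bit = 1), so the logits are -LLR
    return sum(F.binary_cross_entropy_with_logits(-s, targets) for s in soft_outputs)

decoder = WeightedBPDecoder(H, n_iter=5)                 # toy H from the earlier sketch
optimizer = torch.optim.Adam(decoder.parameters(), lr=1e-3)
n = H.shape[1]

for step in range(1000):
    sigma = 0.7                                          # illustrative training noise level
    y = 1.0 + sigma * torch.randn(64, n)                 # all-zero codeword maps to all +1 under BPSK
    soft_outputs = decoder(2.0 * y / sigma**2)           # channel LLRs in, per-iteration LLRs out
    loss = multiloss(soft_outputs, torch.zeros(64, n))   # labels: the all-zero codeword
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```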

  22. Outline I. Background II. On the Importance of Data III. Data-Driven Ensembles IV. Summary

  23. ML Model Selection in Decoding: • incorporating domain knowledge into the model • a constraint-free model

  24. ML Model Selection in Decoding. Incorporating domain knowledge into the model means choosing the hypothesis class and assigning the learnable parameters.

  25. ML Model Selection in Decoding. The constraint-free model allows any state-of-the-art architecture and imposes fewer limitations on the solution space. [4] Tobias Gruber et al., "On Deep Learning-Based Channel Decoding", 51st Annual Conference on Information Sciences and Systems (CISS), 2017

  26. ML Model Selection in Decoding. Currently the model-based approach dominates, as the constraint-free approach suffers from the curse of dimensionality. Yet the model isn't everything…

  27. Data Importance in Deep Learning. Training data is at the core of DL, yet its role is not fully understood.

  28. Data Distinction. Classical ML: limited amount of data; distribution is unknown. ML for decoding: unlimited amount of data; distribution is specified by the channel.
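
The practical consequence, sketched below under the deck's AWGN/BPSK assumptions: training data for decoding can be simulated on the fly in unlimited quantity, and its distribution is fully specified by the code and the channel (generator matrix G reused from the earlier toy sketch).

```python
import numpy as np

def generate_batch(G, snr_db, batch_size, rng=None):
    """Simulate (LLR, codeword) pairs: random messages, GF(2) encoding, BPSK over AWGN."""
    rng = rng or np.random.default_rng()
    k, n = G.shape
    sigma = np.sqrt(1.0 / (2.0 * (k / n) * 10 ** (snr_db / 10.0)))   # Eb/N0 convention
    u = rng.integers(0, 2, size=(batch_size, k), dtype=np.uint8)
    c = (u @ G) % 2
    y = (1.0 - 2.0 * c) + sigma * rng.standard_normal((batch_size, n))
    return 2.0 * y / sigma**2, c                                     # channel LLRs and labels

llrs, codewords = generate_batch(G, snr_db=4.0, batch_size=1000)
```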

  29. Outline I. Background II. On the Importance of Data III. Data-Driven Ensembles IV. Summary

  30. Ensembles. A single decoder is limited; a combination of decoders is superior (the divide-and-conquer principle). [5] Lior Rokach, "Pattern Classification Using Ensemble Methods", 2010

  31. Ensembles for Decoding. List decoding is suboptimal: "there exists no clear evidence on which graph permutation performs best for a given input". [6] Elkelesh et al., "Belief Propagation List Decoding of Polar Codes", 2018

  32. Ensembles for Decoding. Key points: decoding performance and complexity. [6] Elkelesh et al., "Belief Propagation List Decoding of Polar Codes", 2018

  33. Data-Driven Ensemble

  34. Data-Specialized Decoders. Based on Hamming distance: compute the error pattern of each training word, assign it to a region by Hamming distance, match each decoder to its region's dataset, and train on it. A syndrome-based variant also exists (won't have time to cover)…
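
A rough sketch of the partitioning step as I read this slide (region boundaries and names are illustrative, not the paper's): compute the error pattern of each simulated word against its transmitted codeword, bucket it by Hamming distance, and give each specialized decoder its own bucket to train on.

```python
import numpy as np

def assign_region(llr, codeword, boundaries=(2, 4)):
    """Region index from the Hamming distance between the hard-decided word and the codeword."""
    error_pattern = (llr < 0).astype(np.uint8) ^ codeword
    distance = int(error_pattern.sum())
    for region, bound in enumerate(boundaries):
        if distance <= bound:
            return region
    return len(boundaries)

# Split simulated data (from the generation sketch above) into per-region training sets.
regions = {}
for llr, c in zip(llrs, codewords):
    regions.setdefault(assign_region(llr, c), []).append(llr)
# Each regions[r] would then train its own specialized (e.g., weighted BP) decoder.
```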

  35. Gating Function. A classical hard-decision decoder (HDD) is applied to the hard-decided word, and from its outputs the gating criterion is calculated. Gating types: single-choice gating (select one specialized decoder), all-decoders gating (realize all decoders), random-choice gating (choose a decoder randomly).
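
A sketch of how single-choice gating could be realized under my reading of this slide (the exact rule in the paper may differ): a classical HDD is run on the hard-decided word, the weight of the error pattern it implies selects one specialized decoder, and random-choice gating serves as a baseline. The `hdd` argument is a hypothetical callable, e.g. an algebraic BCH decoder.

```python
import numpy as np

def single_choice_gating(llr, hdd, boundaries=(2, 4)):
    """Pick one specialized decoder index from the HDD's estimated error-pattern weight."""
    hard = (llr < 0).astype(np.uint8)
    estimate = hdd(hard)                          # hypothetical classical hard-decision decoder
    weight = int(np.sum(hard ^ estimate))         # weight of the estimated error pattern
    for region, bound in enumerate(boundaries):
        if weight <= bound:
            return region
    return len(boundaries)

def random_choice_gating(n_decoders, rng=None):
    """Baseline gating: choose one of the specialized decoders uniformly at random."""
    rng = rng or np.random.default_rng()
    return int(rng.integers(n_decoders))
```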

  36. Combiner. Likelihood-maximizing function [7]: among the candidate codewords produced by the ensemble, the one with the highest likelihood given the channel output is selected. [7] Ralf Koetter et al., "Characterizations of Pseudo-Codewords of (Low-Density) Parity Check Codes", 2007
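
A sketch of a likelihood-maximizing combiner for the AWGN channel (a standard list-ML rule; treating it as the exact combiner referenced via [7] is my assumption): among the candidate codewords, pick the one whose BPSK image correlates best with the received vector, which is maximum likelihood over the candidate list.

```python
import numpy as np

def combine(candidates, y):
    """Return the candidate codeword maximizing the AWGN likelihood, i.e. the correlation
    between its BPSK image (0 -> +1, 1 -> -1) and the received vector y."""
    candidates = np.asarray(candidates, dtype=np.uint8)
    scores = (1.0 - 2.0 * candidates) @ y          # one correlation score per candidate
    return candidates[int(np.argmax(scores))]
```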

  37. Experimental Results for the CR-BCH(63,36) code. Waterfall: 0.3 dB FER gain. Error floor: 1.25 dB FER gain. Compared to the best previous results [8], using an ensemble of 3 decoders. [8] Ishay Be’ery et al., "Active Deep Decoding of Linear Codes", IEEE Transactions on Communications, Feb. 2020

  38. Experimental Results for the CR-BCH(63,45) code. Waterfall: 0.3 dB FER gain. Error floor: 1 dB FER gain. Compared to the best previous results [8], using an ensemble of 3 decoders. [8] Ishay Be’ery et al., "Active Deep Decoding of Linear Codes", IEEE Transactions on Communications, Feb. 2020

  39. Outline I. Background II. On the Importance of Data III. Data-Driven Ensembles IV. Summary

  40. Summary. Domain knowledge is vital for ML in communication. The data is as important as the algorithm (!). How to effectively and efficiently utilize the data is still insufficiently researched.

  41. Bibliography
[1] Urbanke and Richardson, "Modern Coding Theory", 2008.
[2] Judea Pearl, "Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference", 1988.
[3] Nachmani et al., "Learning to Decode Linear Codes Using Deep Learning", 2016.
[4] Tobias Gruber et al., "On Deep Learning-Based Channel Decoding", 51st Annual Conference on Information Sciences and Systems (CISS), 2017.
[5] Lior Rokach, "Pattern Classification Using Ensemble Methods", 2010.
[6] Elkelesh et al., "Belief Propagation List Decoding of Polar Codes", 2018.
[7] Ralf Koetter et al., "Characterizations of Pseudo-Codewords of (Low-Density) Parity Check Codes", 2007.
[8] Ishay Be’ery et al., "Active Deep Decoding of Linear Codes", IEEE Transactions on Communications, Feb. 2020.
