SLIDE 6
Coding redundancy (state-of-the-art)
Unlimited number of errors (t = ∞): rate loss equivalent to that of an appropriate RLL system,
namely a (0, k − 1)_q-RLL system, for alphabet size q and duplication window length k.
Jain et al., T-IT, 2017.
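As a brief sketch of the noise model named above (the function name and this exact formulation are my own illustration, not taken from the slide): a tandem duplication with window length k copies a length-k substring and inserts the copy immediately after the original occurrence.

```python
def tandem_duplicate(x: str, i: int, k: int) -> str:
    """Insert a copy of x[i:i+k] immediately after it (tandem duplication,
    window length k). Illustrative sketch only."""
    assert 0 <= i and i + k <= len(x)
    return x[:i + k] + x[i:i + k] + x[i + k:]

# Example over the DNA alphabet {A, C, G, T} (q = 4), window k = 2:
# duplicating the substring "CG" of "ACGT" at position 1.
print(tandem_duplicate("ACGT", 1, 2))  # -> ACGCGT
```

Repeated applications of such duplications (t of them, or unboundedly many when t = ∞) produce the noisy strings the codes above must correct.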
Finite number of errors (t < ∞): optimal ECC redundancy (lower and upper bounds): t log_q(n) + O(1).
Lenz et al., arXiv, 2018. Kovačević and Tan, IEEE Comm. Letters, 2018.
Efficient encoding/decoding with redundancy t log_q(n) + o(log(n)) (asymptotically optimal).
Mahdavifar and Vardy, ISIT, 2017.
Multiple distinct reads of noisy data: (t − 1) log_q(n) + O(1).
(Reconstruction with sublinear uncertainty.)
Yehezkeally and Schwartz, T-IT, 2020.
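To make the savings concrete, a small numeric illustration (the parameter values q = 4, n = 10^6, t = 3 are chosen by me for illustration only): reconstruction from multiple distinct reads replaces the t log_q(n) ECC redundancy with (t − 1) log_q(n), saving roughly log_q(n) symbols.

```python
import math

# Compare the two leading-order redundancy terms from the slide,
# ignoring the O(1) additive constants.
q, n, t = 4, 10**6, 3          # illustrative parameters (assumptions)
log_q_n = math.log(n, q)       # log base q of n
ecc = t * log_q_n              # ~ t * log_q(n): optimal ECC redundancy
recon = (t - 1) * log_q_n      # ~ (t-1) * log_q(n): multiple-reads setting
print(f"ECC: {ecc:.2f}, reconstruction: {recon:.2f}, "
      f"saving: {ecc - recon:.2f} symbols")
```

The saving, log_q(n), grows with n, though the redundancy in both settings remains logarithmic.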
Question: At what cost may redundancy be further reduced?
Yehezkeally and Schwartz, ISIT 2020, Reconstructing Multiple Messages