A Systematic Approach to Incremental Redundancy over Erasure Channels (PowerPoint PPT Presentation)


  1. A Systematic Approach to Incremental Redundancy over Erasure Channels
  Anoosheh Heidarzadeh (Texas A&M University)
  Joint work with Jean-Francois Chamberland (Texas A&M University), Parimal Parag (Indian Institute of Science, Bengaluru), and Richard D. Wesel (University of California, Los Angeles)
  ISIT, June 19, 2018

  2. Random Coding + Hybrid ARQ
  Consider the problem of communicating a k-bit message over a memoryless binary erasure channel (BEC) with erasure probability 0 ≤ ε < 1, using random coding + hybrid ARQ (ARQ: Automatic Repeat Request).

  3. Random Coding + Hybrid ARQ
  Consider the problem of communicating a k-bit message over a memoryless binary erasure channel (BEC) with erasure probability 0 ≤ ε < 1, using random coding + hybrid ARQ:
  • Consider a random binary parity-check matrix H of size (n − k) × n
  • Consider an arbitrary mapping from k-bit messages to n-bit codewords in the null space of H

  4. Random Coding + Hybrid ARQ
  Consider the problem of communicating a k-bit message over a memoryless binary erasure channel (BEC) with erasure probability 0 ≤ ε < 1, using random coding + hybrid ARQ:
  • Consider a random binary parity-check matrix H of size (n − k) × n
  • Consider an arbitrary mapping from k-bit messages to n-bit codewords in the null space of H
  • The source maps the message x = (x_1, ..., x_k) to a codeword c = (c_1, ..., c_n)
  • The source divides the codeword c into m sub-blocks c_1, ..., c_m for a given 2 ≤ m ≤ n, where c_i = (c_{n_{i−1}+1}, ..., c_{n_i}) for i ∈ [m] = {1, ..., m}, and n_1, ..., n_m are given integers such that k ≤ n_1 < n_2 < ··· < n_m = n, with n_0 = 0
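To make the sub-block indexing concrete, here is a small hypothetical Python sketch (not part of the talk) that splits a codeword into the m sub-blocks defined by the cumulative boundaries n_1 < ... < n_m:

```python
# Hypothetical illustration of the sub-block partition above: with n_0 = 0, sub-block i
# collects the codeword bits with (1-based) indices n_{i-1}+1, ..., n_i.
def split_into_subblocks(codeword, boundaries):
    """codeword: list of n bits; boundaries: (n_1, ..., n_m) with n_m = len(codeword)."""
    blocks, prev = [], 0
    for n_i in boundaries:
        blocks.append(codeword[prev:n_i])
        prev = n_i
    return blocks

# Example with n = 10, m = 3, and boundaries (4, 7, 10):
print(split_into_subblocks(list(range(1, 11)), (4, 7, 10)))
# -> [[1, 2, 3, 4], [5, 6, 7], [8, 9, 10]]
```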

  5. Random Coding + Hybrid ARQ (Cont.)
  • The source sends the first sub-block, c_1
  • The destination receives c_1, or a proper subset thereof
  • The destination performs ML decoding to recover the message x, and depending on the outcome of decoding, sends an ACK or NACK to the source over a perfect feedback channel

  6. Random Coding + Hybrid ARQ (Cont.)
  • The source sends the first sub-block, c_1
  • The destination receives c_1, or a proper subset thereof
  • The destination performs ML decoding to recover the message x, and depending on the outcome of decoding, sends an ACK or NACK to the source over a perfect feedback channel
  • If the source receives a NACK, it sends the next sub-block, c_2, and again waits for an ACK or NACK
  • This process repeats until (i) the source receives an ACK, or (ii) it exhausts all the sub-blocks without receiving an ACK

  7. Random Coding + Hybrid ARQ (Cont.)
  • The source sends the first sub-block, c_1
  • The destination receives c_1, or a proper subset thereof
  • The destination performs ML decoding to recover the message x, and depending on the outcome of decoding, sends an ACK or NACK to the source over a perfect feedback channel
  • If the source receives a NACK, it sends the next sub-block, c_2, and again waits for an ACK or NACK
  • This process repeats until (i) the source receives an ACK, or (ii) it exhausts all the sub-blocks without receiving an ACK
  In case (i), the communication round succeeds, and the source starts a new communication round for the next message.
  In case (ii), the communication round fails, and the source starts a new communication round for the same message x.
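The following Python sketch (hypothetical, not the authors' code) simulates this protocol over a BEC. Decoding success at each stage is tested with the full-rank criterion for a random parity-check code that the next slides quantify; the values of n, k, eps, and the boundaries are placeholder parameters.

```python
# Minimal simulation sketch of random coding + hybrid ARQ over a BEC (assumed parameters).
import numpy as np

def gf2_rank(A):
    """Rank of a binary matrix over GF(2) via Gaussian elimination."""
    A = A.copy()
    rank, (rows, cols) = 0, A.shape
    for c in range(cols):
        pivot = next((r for r in range(rank, rows) if A[r, c]), None)
        if pivot is None:
            continue
        A[[rank, pivot]] = A[[pivot, rank]]
        for r in range(rows):
            if r != rank and A[r, c]:
                A[r] ^= A[rank]
        rank += 1
    return rank

def simulate_round(n, k, boundaries, eps, rng):
    """Run one communication round; return (success flag, effective blocklength n_S)."""
    H = rng.integers(0, 2, size=(n - k, n), dtype=np.uint8)   # random parity-check matrix
    erased = rng.random(n) < eps                               # one erasure draw per code bit
    for n_i in boundaries:                                     # after sending n_i bits in total
        sent = np.arange(n) < n_i
        unknown = ~sent | erased                               # unsent or erased positions
        # ML decoding over the BEC succeeds iff the unknown positions are uniquely
        # determined, i.e., the corresponding columns of H are linearly independent.
        if gf2_rank(H[:, unknown]) == int(unknown.sum()):
            return True, n_i                                   # destination would send an ACK
    return False, boundaries[-1]                               # all sub-blocks exhausted

rng = np.random.default_rng(0)
n, k, eps = 128, 64, 0.1
boundaries = (72, 80, 96, 128)                                 # placeholder n_1 < ... < n_m = n
lengths = [simulate_round(n, k, boundaries, eps, rng)[1] for _ in range(2000)]
print("empirical expected effective blocklength:", np.mean(lengths))
```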

  8. Problem
  Expected effective blocklength: the expected number of bits sent by the source within a communication round (the randomness comes from both the channel and the code).
  Problem: identify the aggregate sub-block sizes n_1, ..., n_{m−1} such that the expected effective blocklength is minimized when a maximum of m sub-blocks (i.e., at most m bits of feedback) are available in a communication round.

  9. Previous Works vs. This Work
  Previous works (for channels other than the BEC):
  [1] Vakilinia, Williamson, Ranganathan, Divsalar, and Wesel, "Feedback systems using non-binary LDPC codes with a limited number of transmissions," ITW 2014
  [2] Williamson, Chen, and Wesel, "Variable-length convolutional coding for short blocklengths with decision feedback," IEEE Trans. Commun., 2015
  [3] Vakilinia, Ranganathan, Divsalar, and Wesel, "Optimizing transmission lengths for limited feedback with non-binary LDPC examples," IEEE Trans. Commun., 2016
  In this work, we propose a solution by extending the sequential differential optimization (SDO) framework of [3] to the BEC.

  10. Expected Effective Blocklength
  • R_t: the number of bits observed by the destination at time t, i.e., R_t ~ Binomial(t, 1 − ε)
  • P_{R_t}: the discrete probability measure associated with the random variable (r.v.) R_t, i.e.,
      P_{R_t}(r) = \binom{t}{r} \epsilon^{t-r} (1-\epsilon)^r

  11. Expected Effective Blocklength
  • R_t: the number of bits observed by the destination at time t, i.e., R_t ~ Binomial(t, 1 − ε)
  • P_{R_t}: the discrete probability measure associated with the random variable (r.v.) R_t, i.e.,
      P_{R_t}(r) = \binom{t}{r} \epsilon^{t-r} (1-\epsilon)^r
  • P_s(r): the probability of decoding success given that the number of bits observed by the destination is r, i.e.,
      P_s(r) = \begin{cases} 0 & 0 \le r < k \\ \prod_{l=0}^{n-r-1} \bigl(1 - 2^{l-(n-k)}\bigr) & k \le r < n \\ 1 & r \ge n \end{cases}
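A direct numerical transcription of P_s(r) as a Python sketch (the example parameters are arbitrary, not from the talk):

```python
# Success probability P_s(r) for a random (n-k) x n binary parity-check code, as
# reconstructed above: the product is the probability that the n-r unknown positions
# correspond to linearly independent columns of H.
def p_success(r, n, k):
    if r < k:
        return 0.0
    if r >= n:
        return 1.0
    prob = 1.0
    for l in range(n - r):            # l = 0, ..., n-r-1
        prob *= 1.0 - 2.0 ** (l - (n - k))
    return prob

# Example (arbitrary parameters): the success probability rises quickly once r exceeds k.
print([round(p_success(r, 128, 64), 4) for r in (63, 64, 70, 80, 127)])
```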

  12. Expected Effective Blocklength (Cont.)
  • P_ACK(t): the probability that the destination sends an ACK to the source at time t or earlier, i.e.,
      P_{\mathrm{ACK}}(t) = \begin{cases} 1 - \sum_{e=0}^{t} \bigl(1 - P_s(t-e)\bigr) P_{R_t}(t-e) & k \le t \le n \\ 0 & 0 \le t < k \end{cases}

  13. Expected Effective Blocklength (Cont.)
  • P_ACK(t): the probability that the destination sends an ACK to the source at time t or earlier, i.e.,
      P_{\mathrm{ACK}}(t) = \begin{cases} 1 - \sum_{e=0}^{t} \bigl(1 - P_s(t-e)\bigr) P_{R_t}(t-e) & k \le t \le n \\ 0 & 0 \le t < k \end{cases}
  • S: the index of the last sub-block sent by the source within a communication round
  • E[n_S]: the expected effective blocklength, i.e.,
      \mathbb{E}[n_S] = n_m + \sum_{i=1}^{m-1} (n_i - n_{i+1}) P_{\mathrm{ACK}}(n_i)
  Problem: identify n_1, ..., n_{m−1} such that E[n_S] is minimized.
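These two quantities can be evaluated directly from the formulas above; a hedged sketch, reusing p_success from the previous sketch and the same placeholder parameters:

```python
# Evaluate P_ACK(t) and the expected effective blocklength E[n_S] from the formulas above.
# Assumes p_success(r, n, k) from the previous sketch is in scope; e is the number of
# erasures among the first t transmitted bits, so r = t - e bits are observed.
from math import comb

def p_ack(t, n, k, eps):
    if t < k:
        return 0.0
    total = 0.0
    for e in range(t + 1):
        r = t - e
        p_rt = comb(t, r) * eps ** (t - r) * (1 - eps) ** r   # P_{R_t}(r)
        total += (1.0 - p_success(r, n, k)) * p_rt
    return 1.0 - total

def expected_effective_blocklength(boundaries, n, k, eps):
    """boundaries = (n_1, ..., n_m) with n_m = n."""
    value = boundaries[-1]                                     # n_m
    for i in range(len(boundaries) - 1):                       # terms i = 1, ..., m-1
        value += (boundaries[i] - boundaries[i + 1]) * p_ack(boundaries[i], n, k, eps)
    return value

# Example with the same placeholder parameters as before:
print(expected_effective_blocklength((72, 80, 96, 128), 128, 64, 0.1))
```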

  14. Multi-Dimensional vs. One-Dimensional Optimization
  Challenge: the problem of minimizing E[n_S] is a multi-dimensional optimization problem with integer variables n_1, ..., n_{m−1}.

  15. Multi-Dimensional vs. One-Dimensional Optimization
  Challenge: the problem of minimizing E[n_S] is a multi-dimensional optimization problem with integer variables n_1, ..., n_{m−1}.
  Idea: sequential differential optimization (SDO) reduces the problem to a one-dimensional optimization with the single integer variable n_1.
  Recall that
      \mathbb{E}[n_S] = n_m + \sum_{i=1}^{m-1} (n_i - n_{i+1}) P_{\mathrm{ACK}}(n_i)
  Suppose that a smooth approximation F(t) of P_ACK(t) is given. Define
      \tilde{\mathbb{E}}[n_S] = n_m + \sum_{i=1}^{m-1} (n_i - n_{i+1}) F(n_i)

  16. Sequential Differential Optimization (SDO)
  Recall that
      \tilde{\mathbb{E}}[n_S] = n_m + \sum_{i=1}^{m-1} (n_i - n_{i+1}) F(n_i)
  SDO: given ñ_1, ..., ñ_{i−1}, an approximation ñ_i of the optimal value of n_i for 2 ≤ i ≤ m − 1 can be computed by setting the partial derivative of Ẽ[n_S] with respect to n_{i−1} to zero and solving for n_i.

  17. Sequential Differential Optimization (SDO)
  Recall that
      \tilde{\mathbb{E}}[n_S] = n_m + \sum_{i=1}^{m-1} (n_i - n_{i+1}) F(n_i)
  SDO: given ñ_1, ..., ñ_{i−1}, an approximation ñ_i of the optimal value of n_i for 2 ≤ i ≤ m − 1 can be computed by setting the partial derivative of Ẽ[n_S] with respect to n_{i−1} to zero and solving for n_i.
  ⇒ Given ñ_1 (and ñ_0 = −∞), an approximation ñ_i of the optimal value of n_i for all 2 ≤ i ≤ m − 1 can be obtained sequentially by
      \tilde{n}_i = \tilde{n}_{i-1} + \bigl(F(\tilde{n}_{i-1}) - F(\tilde{n}_{i-2})\bigr) \left[ \frac{dF(t)}{dt} \Big|_{t=\tilde{n}_{i-1}} \right]^{-1}
  ⇒ a one-dimensional optimization problem with the single variable n_1
  Challenge: find a smooth approximation F(t) to P_ACK(t).
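The recursion above translates directly into code. A minimal sketch (hypothetical function and variable names) that produces real-valued approximations of the boundaries given a smooth approximation F, its derivative dF, and a candidate n_1:

```python
# Sequential differential optimization (SDO) recursion as reconstructed above.
# With the convention that \tilde{n}_0 = -infinity, we take F(\tilde{n}_0) = 0.
def sdo_boundaries(n1, m, F, dF):
    """Return real-valued approximations (ñ_1, ..., ñ_{m-1}) of the optimal boundaries."""
    tilde = [float(n1)]
    for i in range(2, m):                                    # i = 2, ..., m-1
        prev = tilde[-1]                                     # ñ_{i-1}
        F_prev2 = F(tilde[-2]) if len(tilde) >= 2 else 0.0   # F(ñ_{i-2}), with F(-inf) = 0
        tilde.append(prev + (F(prev) - F_prev2) / dF(prev))
    return tilde
```

In practice the outputs would be rounded to integers, and the remaining free variable n_1 would be swept over its feasible range to minimize the (approximate) expected effective blocklength.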

  18. Main Idea and Contributions
  Fact: P_ACK(t) for t < n matches the CDF of the r.v. N_n that represents the length of a communication round.
  Idea:
  • Study the asymptotic behavior of the mean and variance of the r.v. N_n as n grows large, and
  • Approximate P_ACK(t) by the CDF of a continuous r.v. whose mean and variance match those of N_n as n grows large

  19. Main Idea and Contributions
  Fact: P_ACK(t) for t < n matches the CDF of the r.v. N_n that represents the length of a communication round.
  Idea:
  • Study the asymptotic behavior of the mean and variance of the r.v. N_n as n grows large, and
  • Approximate P_ACK(t) by the CDF of a continuous r.v. whose mean and variance match those of N_n as n grows large
  In this work, we show that
      \lim_{n\to\infty} \mathbb{E}[N_n] = \frac{k + c_0}{1 - \epsilon}
  and
      \lim_{n\to\infty} \mathrm{Var}(N_n) = \frac{(k + c_0)\epsilon + c_0 + c_1}{(1 - \epsilon)^2},
  where c_0 = 1.60669... is the Erdős-Borwein constant and c_1 = 1.13733... is the digital search tree constant.
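The slide specifies only moment matching. Assuming the continuous approximation is taken to be a Gaussian CDF with the limiting mean and variance above (this distributional choice is an assumption on my part, not stated on the slide), F could be built as follows and then fed to the SDO recursion from the previous sketch:

```python
# Hedged sketch: a Gaussian CDF whose mean and variance match the large-n limits quoted
# above. The normal-distribution choice is an assumption; the slide states only that the
# mean and variance of the continuous r.v. should match those of N_n.
from math import erf, sqrt

C0 = 1.60669  # Erdős-Borwein constant (as quoted on the slide)
C1 = 1.13733  # digital search tree constant (as quoted on the slide)

def matched_gaussian_cdf(k, eps):
    mean = (k + C0) / (1 - eps)
    std = sqrt(((k + C0) * eps + C0 + C1) / (1 - eps) ** 2)
    return lambda t: 0.5 * (1.0 + erf((t - mean) / (std * sqrt(2.0))))

# Example (placeholder parameters k = 64, eps = 0.1):
F = matched_gaussian_cdf(64, 0.1)
print([round(F(t), 3) for t in (64, 70, 73, 76, 80)])
```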
