SLIDE 1

Gaussian Multiple and Random Access in the Finite Blocklength Regime

Recep Can Yavas

California Institute of Technology

June 21-26, 2020

Joint work with Victoria Kostina and Michelle Effros ISIT 2020

This work was supported in part by the National Science Foundation (NSF) under grant CCF-1817241. 1 / 30

SLIDE 2

Talk Plan

We present two achievability results:

1. Gaussian Multiple Access Channel (MAC)
2. Gaussian Random Access Channel (RAC)

SLIDE 3

Gaussian Multiple Access Channel (MAC)

Maximal power constraint on the codewords: ‖x_k^n‖² ≤ nP_k for k = 1, …, K

Notation: [M] = {1, …, M}, x_A = (x_a : a ∈ A)

SLIDE 4

MAC Code Definition

Definition (K-transmitter MAC)

An (n, M_1, …, M_K, ε, P_1, …, P_K) code for the K-transmitter MAC consists of

- K encoding functions f_k : [M_k] → R^n, k ∈ [K]
- a decoding function g : R^n → [M_1] × · · · × [M_K]

with maximal power constraint ‖f_k(m_k)‖² ≤ nP_k for m_k ∈ [M_k], k ∈ [K], and average probability of error

(1 / ∏_{k=1}^K M_k) ∑_{m_[K] ∈ [M_1]×···×[M_K]} P[ g(Y_K^n) ≠ m_[K] | X_k^n = f_k(m_k) ∀k ∈ [K] ] ≤ ε.

SLIDE 5

Prior art: Point-to-point (P2P) Gaussian Channel (K = 1)

M*(n, ε, P) ≜ max{M : an (n, M, ε, P) code exists}

log M*(n, ε, P) = nC(P) − √(nV(P)) Q⁻¹(ε) + (1/2) log n + O(1)

C(P) = (1/2) log(1 + P)  (capacity)
V(P) = P(P + 2) / (2(1 + P)²)  (dispersion)
(1/2) log n: third-order term

Achievability (≥): [Tan-Tomamichel 15']. Converse (≤): [Polyanskiy et al. 10'].
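The normal approximation above is straightforward to evaluate numerically. A minimal sketch using only the Python standard library, with the O(1) term dropped (an illustrative assumption); Q⁻¹ is obtained from `NormalDist.inv_cdf`:

```python
import math
from statistics import NormalDist

def capacity(P):
    # C(P) = (1/2) log(1 + P), in nats (use log2 for bits)
    return 0.5 * math.log1p(P)

def dispersion(P):
    # V(P) = P (P + 2) / (2 (1 + P)^2)
    return P * (P + 2) / (2 * (1 + P) ** 2)

def log_M_star_approx(n, eps, P):
    # n C(P) - sqrt(n V(P)) Q^{-1}(eps) + (1/2) log n   (O(1) term dropped)
    q_inv = NormalDist().inv_cdf(1 - eps)  # Q^{-1}(eps)
    return (n * capacity(P)
            - math.sqrt(n * dispersion(P)) * q_inv
            + 0.5 * math.log(n))

# Example: at n = 1000, eps = 1e-3, P = 1, the achievable rate per channel
# use is noticeably below capacity.
rate = log_M_star_approx(1000, 1e-3, 1.0) / 1000
```

The dispersion term √(nV(P)) Q⁻¹(ε) is what separates the finite-blocklength rate from nC(P); it dominates the (1/2) log n correction at these blocklengths.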

SLIDE 6

The Lesson from P2P Channel

We can achieve log M*(n, ε, P) = nC(P) − √(nV(P)) Q⁻¹(ε) + (1/2) log n + O(1) by using a random spherical codebook, a maximum-likelihood decoder, and the random-coding union (RCU) bound.

SLIDE 7

Motivation (MAC)

We are interested in refining the achievable third-order term for the Gaussian MAC in the finite blocklength regime. For the point-to-point case, the third-order term +(1/2) log n is known to be optimal. We want to show that +(1/2) log n is also achievable for the Gaussian MAC.


SLIDE 8

Gaussian MAC - Main Result

Theorem

For any ε ∈ (0, 1) and any P1, P2 > 0, an (n, M1, M2, ε, P1, P2) code for the two-transmitter Gaussian MAC exists provided that

(log M1, log M2, log M1M2)ᵀ ∈ nC(P1, P2) − √n Qinv(V(P1, P2), ε) + (1/2)(log n) 1 + O(1) 1.

C(P1, P2) = (C(P1), C(P2), C(P1 + P2))ᵀ : capacity vector
V(P1, P2) : 3 × 3 positive-definite dispersion matrix
Qinv(V, ε) : multidimensional counterpart of the inverse Q-function,

Qinv(V, ε) ≜ { z ∈ R^d : P[Z ≤ z] ≥ 1 − ε },  where Z ∼ N(0, V) and ≤ is component-wise.

SLIDE 9

What does Qinv(V, ǫ) look like?

In one dimension, Qinv(1, ε) = {x : x ≥ Q⁻¹(ε)}, recovering the usual inverse Q-function; in general, Qinv(V, ε) ≜ {z ∈ R^d : P[Z ≤ z] ≥ 1 − ε}.

[Figure: left, the PDF of N(0, 1) with shaded area 0.95; right, the boundary of the two-dimensional region where P[N(0, V) ≤ (z1, z2)] = 0.95.]
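Membership in Qinv(V, ε) can be checked by Monte Carlo. A small sketch with NumPy; the covariance V and the test points are illustrative assumptions, not values from the talk:

```python
import numpy as np

def in_Qinv(z, V, eps, n_samples=200_000, seed=0):
    # Monte Carlo test of z ∈ Qinv(V, eps) = {z : P[Z <= z] >= 1 - eps},
    # with Z ~ N(0, V) and "<=" taken component-wise.
    rng = np.random.default_rng(seed)
    Z = rng.multivariate_normal(np.zeros(len(V)), V, size=n_samples)
    return float(np.mean(np.all(Z <= z, axis=1))) >= 1 - eps

V = np.array([[1.0, 0.3],
              [0.3, 1.0]])  # an illustrative 2x2 covariance

deep = in_Qinv(np.array([4.0, 4.0]), V, eps=0.05)    # far in the positive orthant: in the set
origin = in_Qinv(np.array([0.0, 0.0]), V, eps=0.05)  # P[Z <= 0] is far below 0.95: not in the set
```

Larger z makes P[Z ≤ z] larger, so Qinv(V, ε) is an "upper-right" region whose boundary is the level curve pictured on this slide.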

SLIDE 10

Example

Achievable region for P1 = 2, P2 = 1, and ε = 10⁻³:

[Figure: the achievable (R1, R2) region; R1 on the horizontal axis up to ≈ 0.8, R2 on the vertical axis up to ≈ 0.6.]
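For reference, the first-order (capacity-region) bounds that this finite-blocklength region approaches as n grows, for the same example; a sketch in bits per channel use (the base-2 convention is an assumption for readability):

```python
import math

def C_bits(P):
    # Capacity (1/2) log2(1 + P), in bits per channel use
    return 0.5 * math.log2(1 + P)

P1, P2 = 2.0, 1.0
bounds = {
    "R1": C_bits(P1),          # single-user bound for transmitter 1
    "R2": C_bits(P2),          # single-user bound for transmitter 2
    "R1+R2": C_bits(P1 + P2),  # sum-rate bound
}
```

The finite-blocklength region sits strictly inside this pentagon, with the gap governed by the dispersion matrix V(P1, P2).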

SLIDE 11

Comparison with the literature

Our third-order term improves on the literature:

nC(P1, P2) − √n Qinv(V(P1, P2), ε) + (1/2)(log n) 1 + O(1) 1,

compared with third-order terms of order n^{1/4} 1 [MolavianJazi-Laneman 15'] and n^{1/4} (log n) 1 [Scarlett et al. 15'].

Proof techniques:

- Our bound: spherical codebook + maximum-likelihood decoder
- [MolavianJazi-Laneman 15']: spherical codebook + threshold decoder
- [Scarlett et al. 15']: constant-composition codes + quantization

SLIDE 12

Encoding and decoding

Encoding: independently generate M_k codewords uniformly on the power sphere ‖x‖² = nP_k, for k = 1, 2. [Shannon 49'] used a spherical codebook to bound the error exponent of the P2P Gaussian channel.

Decoding: mutual information density

ı_{1,2}(x_1^n, x_2^n; y^n) ≜ log [ P_{Y_2^n|X_1^n, X_2^n}(y^n | x_1^n, x_2^n) / P_{Y_2^n}(y^n) ]

Maximum-likelihood (ML) decoder:

g(y^n) = arg max_{m_1, m_2} ı_{1,2}(f_1(m_1), f_2(m_2); y^n)
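A toy end-to-end sketch of this scheme: spherical codebooks and ML decoding for a two-transmitter Gaussian MAC. The sizes (n, M1, M2) are illustrative assumptions; for a Gaussian channel with equiprobable messages, maximizing the mutual information density is equivalent to minimizing the Euclidean distance to y:

```python
import numpy as np

rng = np.random.default_rng(1)
n, M1, M2 = 200, 8, 8   # toy sizes, for illustration only
P1, P2 = 2.0, 1.0

def spherical_codebook(M, n, P):
    # M codewords uniform on the sphere ||x||^2 = n P:
    # normalize i.i.d. Gaussian vectors to the power sphere.
    G = rng.standard_normal((M, n))
    return G * np.sqrt(n * P) / np.linalg.norm(G, axis=1, keepdims=True)

f1 = spherical_codebook(M1, n, P1)
f2 = spherical_codebook(M2, n, P2)

# Transmit the message pair (3, 5) over the Gaussian MAC with unit noise.
m1, m2 = 3, 5
y = f1[m1] + f2[m2] + rng.standard_normal(n)

# ML decoding: maximizing i_{1,2}(f1(m1), f2(m2); y) is equivalent to
# minimizing ||y - f1(m1) - f2(m2)||^2, since P_{Y^n} does not depend
# on the candidate pair.
dist = np.array([[np.sum((y - f1[i] - f2[j]) ** 2) for j in range(M2)]
                 for i in range(M1)])
m1_hat, m2_hat = np.unravel_index(np.argmin(dist), dist.shape)
```

At this low rate the decoder recovers (m1, m2) with overwhelming probability, and every codeword meets its power constraint with equality.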

SLIDE 13

Main Tool: Random-Coding Union (RCU) Bound

P2P case: proved in [Polyanskiy et al. 10']. Using the ML decoder, for a general MAC:

Theorem (New RCU bound for the MAC)

For arbitrary input distributions P_{X1} and P_{X2}, there exists an (M1, M2, ε) MAC code such that

ε ≤ E[ min{ 1,
  (M1 − 1) P[ ı_1(X̄1; Y_2|X2) ≥ ı_1(X1; Y_2|X2) | X1, X2, Y_2 ]
  + (M2 − 1) P[ ı_2(X̄2; Y_2|X1) ≥ ı_2(X2; Y_2|X1) | X1, X2, Y_2 ]
  + (M1 − 1)(M2 − 1) P[ ı_{1,2}(X̄1, X̄2; Y_2) ≥ ı_{1,2}(X1, X2; Y_2) | X1, X2, Y_2 ] } ],

where P_{X1, X̄1, X2, X̄2, Y_2}(x1, x̄1, x2, x̄2, y) = P_{X1}(x1) P_{X1}(x̄1) P_{X2}(x2) P_{X2}(x̄2) P_{Y_2|X1, X2}(y | x1, x2).

This bound is crucial in refining the third-order term to (1/2) log n.
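The three terms of the RCU bound can be estimated by nested Monte Carlo. A heavily simplified sketch for spherical inputs: because the Gaussian likelihood is monotone in ‖y − x1 − x2‖², each pairwise comparison ı(X̄; Y|·) ≥ ı(X; Y|·) reduces to a squared-distance comparison. All sizes and sample counts below are toy assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, M1, M2, P1, P2 = 50, 4, 4, 1.0, 1.0
outer, inner = 200, 200  # Monte Carlo sample sizes (toy choices)

def sphere(rows, n, P):
    # rows codewords uniform on the sphere ||x||^2 = n P
    G = rng.standard_normal((rows, n))
    return G * np.sqrt(n * P) / np.linalg.norm(G, axis=1, keepdims=True)

vals = []
for _ in range(outer):
    x1, x2 = sphere(1, n, P1)[0], sphere(1, n, P2)[0]
    y = x1 + x2 + rng.standard_normal(n)
    d0 = np.sum((y - x1 - x2) ** 2)
    xb1, xb2 = sphere(inner, n, P1), sphere(inner, n, P2)
    # i_1(xb1; y|x2) >= i_1(x1; y|x2)  iff  ||y - xb1 - x2||^2 <= d0, etc.
    p1 = np.mean(np.sum((y - xb1 - x2) ** 2, axis=1) <= d0)
    p2 = np.mean(np.sum((y - x1 - xb2) ** 2, axis=1) <= d0)
    p12 = np.mean(np.sum((y - xb1 - xb2) ** 2, axis=1) <= d0)
    vals.append(min(1.0, (M1 - 1) * p1 + (M2 - 1) * p2
                    + (M1 - 1) * (M2 - 1) * p12))
rcu_bound = float(np.mean(vals))
```

The min{1, ·} clipping is exactly what makes the RCU bound tighter than a plain union bound, and it is what the analysis exploits to reach the (1/2) log n third-order term.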

SLIDE 14

Key Challenge

Modified mutual information density random vector:

ı̃ ≜ ( ı̃_1(X_1^n; Y_2^n|X_2^n), ı̃_2(X_2^n; Y_2^n|X_1^n), ı̃_{1,2}(X_1^n, X_2^n; Y_2^n) )ᵀ − nC(P1, P2),

where ı̃_{1,2}(x_1^n, x_2^n; y^n) ≜ log [ P_{Y_2^n|X_1^n, X_2^n}(y^n | x_1^n, x_2^n) / Q_{Y_2^n}(y^n) ]  with Q_{Y_2^n} ∼ N(0, (1 + P1 + P2) I_n).

Lemma (New Berry-Esséen-type bound)

Let D ⊂ R³ be a convex, Borel-measurable set and Z ∼ N(0, V(P1, P2)). Then

| P[ (1/√n) ı̃ ∈ D ] − P[Z ∈ D] | ≤ C_0 / √n.

[MolavianJazi-Laneman 15', Prop. 1] showed a weaker upper bound of order O(1/n^{1/4}) using a CLT for functions, which affects the third-order term. We use a different technique to prove this lemma.

SLIDE 15

Proof of Lemma

Problem: we cannot apply the Berry-Esséen theorem directly, since the entries of X_1^n and X_2^n are not i.i.d. (the codewords are uniform on power spheres).

Solution:

- Conditioned on the inner product ⟨X_1^n, X_2^n⟩ = q, ı̃ is a sum of independent random vectors.
- Apply the multidimensional Berry-Esséen theorem to that sum.
- Then integrate the resulting probabilities over q.

SLIDE 16

Extension to K transmitters (P_k = P, M_k = M ∀ k ∈ [K])

Theorem

For any ε ∈ (0, 1) and any P > 0, an (n, M, …, M, ε, P, …, P) code for the K-transmitter Gaussian MAC exists provided that

K log M ≤ nC(KP) − √(n(V(KP) + V_cr(K, P))) Q⁻¹(ε) + (1/2) log n + O(1),

where V_cr(K, P) is the cross-dispersion term

V_cr(K, P) = K(K − 1)P² / (2(1 + KP)²).
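The symmetric K-transmitter bound is easy to evaluate. A standard-library sketch in nats, again dropping the O(1) term (an illustrative assumption):

```python
import math
from statistics import NormalDist

def C(P):
    return 0.5 * math.log1p(P)                 # capacity, nats

def V(P):
    return P * (P + 2) / (2 * (1 + P) ** 2)    # dispersion

def Vcr(K, P):
    # Cross-dispersion term K(K-1)P^2 / (2(1+KP)^2); zero for K = 1.
    return K * (K - 1) * P ** 2 / (2 * (1 + K * P) ** 2)

def sum_log_M(n, eps, K, P):
    # K log M <= n C(KP) - sqrt(n (V(KP) + Vcr(K, P))) Q^{-1}(eps) + (1/2) log n
    q_inv = NormalDist().inv_cdf(1 - eps)
    return (n * C(K * P)
            - math.sqrt(n * (V(K * P) + Vcr(K, P))) * q_inv
            + 0.5 * math.log(n))

# Per-user rate (nats per channel use) for K = 2, P = 1, n = 500, eps = 1e-3.
per_user = sum_log_M(500, 1e-3, K=2, P=1.0) / 2 / 500
```

For K = 1 the cross term vanishes and the bound reduces to the P2P normal approximation.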

SLIDE 17

Talk Plan

We present two achievability results:

1. Gaussian Multiple Access Channel (MAC)
2. Gaussian Random Access Channel (RAC)

SLIDE 18

Random access

Random-access solutions such as ALOHA, treating interference as noise, or orthogonalization (TDMA/FDMA) perform poorly. We want to design a random-access communication strategy that

- does not require knowledge of transmitter activity, and
- still causes no performance loss compared to the k-transmitter MAC.

SLIDE 19

Rateless Gaussian RAC Communication

There are K transmitters in total; a subset of size k is active. Nobody knows which transmitters are active, and no activity probabilities are assigned to the transmitters.

SLIDE 20

Rateless Gaussian RAC Communication

Identical encoding and list decoding, as in [Polyanskiy 17']. Average probability of error ≤ ε_k for k = 0, …, K. New: Gaussian RAC, with maximal power constraints ‖f(m)^{n_k}‖² ≤ n_k P for all k and m.

SLIDE 21

Rateless Gaussian RAC Communication

Rateless coding scheme that we defined in the context of DMCs [Effros, Kostina, Yavas, “Random access channel coding in the finite blocklength regime", 18’] Predetermined decoding times: n0, . . . , nK 21 / 30

SLIDE 22

Communication Process


SLIDE 23

RAC Code Definition

Definition

An ({(n_k, ε_k)}_{k=0}^K, M, P) code for the Gaussian RAC consists of

- an encoding function f
- decoding functions {g_k}_{k=0}^K

such that the maximal power constraints are satisfied, ‖f(m)^{n_k}‖² ≤ n_k P for m ∈ [M], k ∈ [K], and

(1/M^k) ∑_{m_[k] ∈ [M]^k} P[ ⋃_{t<k} {g_t(Y_k^{n_t}) ≠ e} ∪ {g_k(Y_k^{n_k}) ≠ m_[k]} | X_[k]^{n_k} = f(m_[k])^{n_k} ] ≤ ε_k,

the average probability of error in decoding k messages at time n_k.

SLIDE 24

Gaussian RAC - Main Result

Theorem

For any K < ∞, ε_k ∈ (0, 1), and any P > 0, an ({(n_k, ε_k)}_{k=0}^K, M, P) code for the Gaussian RAC exists provided that

k log M ≤ n_k C(kP) − √(n_k (V(kP) + V_cr(k, P))) Q⁻¹(ε_k) + (1/2) log n_k + O(1)  for all k ∈ [K].

The same first-, second-, and third-order terms as for the Gaussian MAC with a known number of transmitters!
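One way to read this theorem: given the message-set size M, it prescribes how late each decoding time n_k must be. A sketch that finds the smallest n_k satisfying the bound by direct search; the O(1) term is ignored and all parameter values are illustrative assumptions:

```python
import math
from statistics import NormalDist

def C(P):
    return 0.5 * math.log1p(P)

def V(P):
    return P * (P + 2) / (2 * (1 + P) ** 2)

def Vcr(k, P):
    return k * (k - 1) * P ** 2 / (2 * (1 + k * P) ** 2)

def min_decoding_time(M, k, P, eps):
    # Smallest n with
    #   k log M <= n C(kP) - sqrt(n (V(kP)+Vcr(k,P))) Q^{-1}(eps) + (1/2) log n
    # (O(1) term ignored -- an illustrative assumption).
    q_inv = NormalDist().inv_cdf(1 - eps)
    target = k * math.log(M)
    n = 1
    while (n * C(k * P) - math.sqrt(n * (V(k * P) + Vcr(k, P))) * q_inv
           + 0.5 * math.log(n)) < target:
        n += 1
    return n

# Decoding times for M = 2^32 messages, P = 1, eps_k = 1e-3: more active
# transmitters push the decoding time later.
n_times = [min_decoding_time(2 ** 32, k, P=1.0, eps=1e-3) for k in (1, 2, 3)]
```

This matches the rateless picture on the earlier slides: the decoder waits longer when more transmitters turn out to be active.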

SLIDE 25

Gaussian RAC - Encoding

To satisfy the maximal power constraints at all decoding times simultaneously, we choose the input distribution so that every prefix lies on its power sphere: each subcodeword of length n_k − n_{k−1} is drawn uniformly on the sphere of radius √((n_k − n_{k−1})P), so that ‖f(m)^{n_k}‖² = n_k P for every k.

SLIDE 26

Feasible codeword set for Gaussian RAC

n_1 = 2, n_2 = 3, P = 1

[Figure: the feasible codeword set in R³ determined by the power constraints at times n_1 and n_2.]

¹If we use this input distribution for the Gaussian MAC, we achieve the same first three order terms.
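The concatenated-subblock construction is a few lines of code. A sketch that builds one such codeword and verifies that every prefix meets its power constraint with equality:

```python
import numpy as np

rng = np.random.default_rng(7)

def rac_codeword(decoding_times, P):
    # Concatenate independent subblocks, each uniform on its own power
    # sphere, so every prefix x^{n_k} satisfies ||x^{n_k}||^2 = n_k P exactly.
    blocks, prev = [], 0
    for nk in decoding_times:
        g = rng.standard_normal(nk - prev)
        blocks.append(g * np.sqrt((nk - prev) * P) / np.linalg.norm(g))
        prev = nk
    return np.concatenate(blocks)

x = rac_codeword([2, 3], P=1.0)  # the toy example above: n1 = 2, n2 = 3, P = 1
prefix_powers = [float(np.sum(x[:nk] ** 2)) for nk in (2, 3)]
```

Because each prefix sits exactly on its sphere, the maximal power constraint holds no matter which decoding time the rateless scheme ends up using.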

SLIDE 27

Gaussian RAC - Decoding

Mutual information density for t transmitters:

ı_[t](x_[t]^{n_t}; y^{n_t}) ≜ log [ P_{Y_t^{n_t} | X_[t]^{n_t}}(y^{n_t} | x_[t]^{n_t}) / P_{Y_t^{n_t}}(y^{n_t}) ]

Decoder output at time n_t:

g_t(y^{n_t}) = arg max_{m_[t]} ı_[t](f(m_[t])^{n_t}; y^{n_t})   if |(1/n_t) ‖y^{n_t}‖² − (1 + tP)| ≤ λ_t,
g_t(y^{n_t}) = e   otherwise.

If the output is e, send ACK = 0 to request the next subcodeword of length n_{t+1} − n_t.

SLIDE 28

Summary of the main theorems

Gaussian MAC:

We refine the achievable third-order term to (1/2) log n by using a spherical codebook and an ML decoder. We derive a Berry-Esséen-type bound for the spherical codebook.

Gaussian RAC:

Our proposed rateless code performs as well in the first-, second-, and third-order terms as the best known communication scheme when the set of active transmitters is known.


SLIDE 29

References

1. E. MolavianJazi and J. N. Laneman, "A second-order achievable rate region for Gaussian multi-access channels via a central limit theorem for functions," IEEE Transactions on Information Theory, vol. 61, no. 12, pp. 6719–6733, Dec. 2015.

2. J. Scarlett, A. Martinez, and A. Guillén i Fàbregas, "Second-order rate region of constant-composition codes for the multiple-access channel," IEEE Transactions on Information Theory, vol. 61, no. 1, pp. 157–172, Jan. 2015.

3. Y. Polyanskiy, H. V. Poor, and S. Verdú, "Channel coding rate in the finite blocklength regime," IEEE Transactions on Information Theory, vol. 56, no. 5, pp. 2307–2359, May 2010.

4. V. Y. F. Tan and M. Tomamichel, "The third-order term in the normal approximation for the AWGN channel," IEEE Transactions on Information Theory, vol. 61, no. 5, pp. 2430–2438, May 2015.

5. M. Effros, V. Kostina, and R. C. Yavas, "Random access channel coding in the finite blocklength regime," in 2018 IEEE International Symposium on Information Theory (ISIT), June 2018, pp. 1261–1265.

6. Y. Polyanskiy, "A perspective on massive random-access," in Proceedings of the 2017 IEEE International Symposium on Information Theory (ISIT), Aachen, Germany, June 2017, pp. 2523–2527.

SLIDE 30

Thanks

R. C. Yavas, V. Kostina, and M. Effros, "Gaussian multiple and random access in the finite blocklength regime," arXiv:2001.03867, 2020. Available at: https://arxiv.org/abs/2001.03867