SLIDE 1

Unsourced Multiuser Sparse Regression Codes achieve the Symmetric MAC Capacity

Alexander Fengler | Joint work with Peter Jung and Giuseppe Caire | Technische Universität Berlin

SLIDE 2

Multiple Access - Uplink

Typical mMTC specifications:
– (very) large number of potential users, Ktot ∼ ∞
– random access with sparse activity: Ka ≪ Ktot
– short messages of B bits
– a central BS
– no cooperation between users

Unsourced Multiuser Sparse Regression Codes achieve the Symmetric MAC Capacity | A.Fengler Page 2

CommI

Communications and Information Theory Chair

SLIDE 3

Setting

Current solutions:
– Coordination by the BS: identification by a unique pilot sequence and subsequent resource allocation (can be very wasteful for short messages and a large number of inactive users)
– Packet-based communication with contention resolution (ALOHA and co.) (simple but suboptimal; ignores the nature of the channel)

SLIDE 4

Setting → "Unsourced" (Polyanskiy 2017):

– Each user employs the same codebook
– The decoder recovers a list of codewords, up to permutation
– Closer to the mMTC requirements, but still information-theoretic

SLIDE 6

Previous work

Real AWGN channel without fading:
– Random coding achievability (Polyanskiy 2017) → existing schemes like TIN or ALOHA perform poorly compared to this bound
– Several practical approaches:
  – Reed–Muller code based: (Calderbank and Thompson 2018)
  – LDPC based: (Vem et al. 2017; Ustinova et al. 2019)
  – Polar code based: (Marshakov et al. 2019; Pradhan et al. 2019)
  – Concatenated scheme: NNLS + outer tree code (Amalladinne et al. 2018)

SLIDE 7

This work

Real AWGN channel without fading:
– Last year: sparse regression based unsourced random access (Fengler, Jung, and Caire 2019a), with the outer tree code of (Amalladinne et al. 2018)
– This work: refined analysis and closed-form limits

SLIDE 9

Sparse Regression Coding (Barron and Joseph 2011)

Each user encodes his LJ-bit message m into n real symbols in the following way:

1. Choose a codebook A = (A_1 | ... | A_L) with A_l = (a_{l,1} | ... | a_{l,2^J}) ∈ R^{n×2^J}
2. Split m into L parts (sections): m = (m_1 | ... | m_L)
3. Integer representation: (m_1, ..., m_L) → (i_{m_1}, ..., i_{m_L}) with i_{m_l} ∈ [1 : 2^J] (PPM)
4. Transmit: t = Σ_{l=1}^{L} a_{l, i_{m_l}}

In matrix form: t = A x with x = (x_1 | ... | x_L)^⊤, where x_l ∈ {0, 1}^{2^J} indicates the chosen column in section l. The columns of A are normalized such that E‖t‖₂² = nP.
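The four encoding steps above fit in a few lines. The sketch below uses a Gaussian codebook and small illustrative dimensions (L, J, n, P are not values from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
L, J, n, P = 4, 6, 128, 1.0           # L sections of J bits each, n channel uses, power P
Phat = n * P / L                       # per-section power, so that E||t||_2^2 = nP

# codebook A = (A_1|...|A_L): each section has 2^J columns of norm sqrt(Phat)
A = rng.standard_normal((n, L * 2**J))
A *= np.sqrt(Phat) / np.linalg.norm(A, axis=0)

m = rng.integers(0, 2, size=L * J)     # the LJ-bit message
# PPM: read each J-bit section as an integer column index
idx = [int("".join(map(str, m[l*J:(l+1)*J])), 2) for l in range(L)]

# x concatenates L one-hot vectors of length 2^J
x = np.zeros(L * 2**J)
for l, i in enumerate(idx):
    x[l * 2**J + i] = 1.0

t = A @ x                              # transmitted codeword: one column per section
```

Since exactly one column of norm √P̂ is picked per section, ‖t‖² concentrates around L·P̂ = nP, matching the power constraint.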

SLIDE 12

Channel Model

Let Ka active users transmit their messages in this way over an AWGN adder MAC:

y = Σ_{k=1}^{Ka} A x^{(k)} + z = A ( Σ_{k=1}^{Ka} x^{(k)} ) + z    (1)

Inner channel: s → A s + z (sparse recovery problem)
Outer channel: (x^{(1)}, ..., x^{(Ka)}) → Σ_{k=1}^{Ka} x^{(k)} (binary-input MAC)

→ a MAC in the sparse domain, e.g. (Cohen, Heller, and Viterbi 1971)
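Equation (1) is easy to simulate: since all users share the same codebook A, the receiver effectively observes a noisy linear measurement of the column-wise sum s = Σ_k x^{(k)}. A small sketch (dimensions, Ka, and the unit-norm codebook are illustrative choices, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(1)
L, J, n, Ka = 2, 8, 300, 20            # small illustrative setting
N = L * 2**J
A = rng.standard_normal((n, N)) / np.sqrt(n)

# each active user picks one column per section from the SAME codebook
X = np.zeros((Ka, N))
for k in range(Ka):
    for l in range(L):
        X[k, l * 2**J + rng.integers(2**J)] = 1.0

s = X.sum(axis=0)                       # outer channel output: sum of one-hot patterns
z = rng.standard_normal(n)
y = A @ s + z                           # inner channel: y = A(sum_k x^(k)) + z, eq. (1)
```

Recovering the sparse, integer-valued s from y is the inner sparse-recovery problem; mapping s back to the Ka message lists is the outer binary-input MAC.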

SLIDE 14

Outer Channel

Assume that the inner decoder recovers the support with symbol-wise error probabilities p_fa = P("0 → 1") and p_md = P("1 → 0"). This leads to an OR-MAC model for the support.

Assuming uniform iid messages, the output entropy is well approximated (for the typical case Ka ≪ 2^J) by

H(ŝ) = 2^J H₂( (1 − p_0)(1 − p_md − p_fa) + p_fa )    (2)
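Equation (2) can be evaluated directly; the parameter values below are arbitrary examples, not operating points from the talk:

```python
import math

def H2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

Ka, J = 100, 14                          # example values
p0 = (1 - 2**-J) ** Ka                   # probability that a given slot is unused
pmd, pfa = 1e-3, 0.01 * Ka / 2**J        # example symbol-wise error probabilities
q = (1 - p0) * (1 - pmd - pfa) + pfa     # P(slot reported active), as in eq. (2)
H_shat = 2**J * H2(q)                    # output entropy per section, eq. (2)
```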

SLIDE 15

Outer Channel

– Assume Ka, J → ∞ with J = α log₂ Ka for some α > 1,
– and that p_fa ≤ c Ka/2^J for some constant c, i.e. the false-alarm rate does not dominate the sparsity asymptotically (otherwise the achievable rates go to zero). Then:

lim_{Ka,J→∞} I(x_1, ..., x_{Ka}; ŝ) / (J Ka) = (1 − p_md)(1 − 1/α)    (3)

– For p_md = 0 this is achievable by the tree code of (Amalladinne et al. 2018) at exponential complexity, or up to a constant with polynomial complexity.

SLIDE 16

Inner Channel

1. Approximate Message Passing (AMP) (Donoho, Maleki, and Montanari 2009):

s^{t+1} = η_t(s^t + A^⊤ z^t)    (4)

z^{t+1} = y − A s^{t+1} + (2^J L / n) z^t ⟨η′_t(s^t + A^⊤ z^t)⟩    (5)

where ⟨·⟩ denotes the empirical average over the components, and η_t(r) is applied componentwise and given by (Fengler, Jung, and Caire 2019b)

η_t(r_i) = [ 1 + (p_0 / (1 − p_0)) exp( (P̂ − 2 √P̂ r_i) / (2 τ_t²) ) ]^{−1}    (7)

with τ_t² = ‖z^t‖² / n and p_0 = (1 − 2^{−J})^{Ka}.

2. Symbol-by-symbol MAP (SBS-MAP):

ŝ_i^{MAP} = argmax_{s_i ∈ {0,...,Ka}} P(s_i | y),  i = 1, ..., L 2^J    (8)

Both can be analysed asymptotically by the RS potential.
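The AMP iteration (4)-(7) can be sketched numerically. The sketch below is not the authors' implementation: it folds the √P̂ column scaling into the signal (so the denoiser returns E[x_i | r_i] = √P̂ · η_t(r_i)), and all sizes, the SNR, and the iteration count are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
n, N, Phat, p1 = 500, 1000, 40.0, 0.05       # illustrative sizes; p1 = 1 - p0
A = rng.standard_normal((n, N)) / np.sqrt(n)
x = (rng.random(N) < p1) * np.sqrt(Phat)      # active entries carry amplitude sqrt(Phat)
y = A @ x + rng.standard_normal(n)

def eta(r, tau2):
    # Bayes denoiser of eq. (7), scaled by sqrt(Phat); clip avoids exp overflow
    expo = np.clip((Phat - 2.0 * np.sqrt(Phat) * r) / (2.0 * tau2), -60.0, 60.0)
    odds = (1.0 - p1) / p1 * np.exp(expo)
    return np.sqrt(Phat) / (1.0 + odds)

s, z = np.zeros(N), y.copy()
for _ in range(25):
    tau2 = np.dot(z, z) / n                   # tau_t^2 = ||z^t||^2 / n
    r = s + A.T @ z                           # effective observation
    s = eta(r, tau2)                          # eq. (4)
    pi = s / np.sqrt(Phat)                    # posterior activity probabilities
    deta = Phat * pi * (1.0 - pi) / tau2      # componentwise derivative eta'(r_i)
    z = y - A @ s + z * deta.mean() * (N / n)  # residual with Onsager term, eq. (5)

mse = float(np.mean((s - x) ** 2))
```

With N/n = β, the Onsager factor matches the (2^J L / n)⟨η′⟩ correction of eq. (5).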

SLIDE 19

Decoupled channel model

Now assume A is Gaussian iid. Consider the limit n, L → ∞ with fixed ratio β = L 2^J / n. Asymptotically the estimator statistics are equivalent to the estimation of a single section s ∈ R^{2^J} in a Gaussian channel with effective power η P̂:

r = (η^{AMP/MAP} P̂)^{1/2} s + z    (9)

where P̂ = nP/L, s ∼ p_s, and z ∼ N(0, I) independent.

– η^{MAP} is the minimizer of the RS potential

i^{RS}(η) = I_{2^J}(η P̂) + (2^J / 2β) [ (η − 1) log e − log η ]    (10)

where I_{2^J}(η P̂) is the input-output mutual information of the decoupled channel.
– η^{AMP} is the smallest local minimizer of i^{RS}(η).

SLIDE 20

RS-potential examples

SLIDE 21

Decoupled channel model

– Tanaka 2002; Guo and Verdu 2005: replica analysis of the MMSE estimate of s if the components of s are iid
– Rangan, Fletcher, and Goyal 2012: the SBS-MAP estimator can be found as a limit of MMSE estimators
– Bayati and Montanari 2011: AMP is described by the RS potential for iid signals
– Berthier, Montanari, and Nguyen 2017: AMP is described by the RS potential for block-iid signals
– Reeves and Pfister 2016; Barbier and Macris 2019: rigorous justification and extension to the block-iid case

SLIDE 22

RS-potential

The vector potential can be approximated by the following scalar potential with an error of order Ka/2^J (Fengler, Jung, and Caire 2019a):

i^{RS}(η) = 2^J I(η P̂) + (2^J / 2β) [ (η − 1) log e − log η ] + O(Ka/2^J)    (11)

where I(η P̂) is the mutual information of the scalar Gaussian channel

Y = (η P̂)^{1/2} X + Z    (12)

with P(X = 0) = (1 − 2^{−J})^{Ka} and P(X = 1) = 1 − P(X = 0). Up to the O(Ka/2^J) term this is the well-known RS potential for random linear estimation of an iid 0-1 signal, e.g. (Guo, Baron, and Shamai 2009) or (Reeves and Gastpar 2012).
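The scalar potential (11) is cheap to evaluate numerically: I(η P̂) only requires the entropy of a two-component Gaussian mixture, here computed by brute-force quadrature. All parameter values are illustrative:

```python
import numpy as np

J, Ka, beta, Phat = 12, 50, 2.0, 30.0        # illustrative parameters
q = 1.0 - (1.0 - 2.0**-J) ** Ka              # P(X = 1) in the scalar channel (12)

def mi_bits(m):
    # I(X; Y) in bits for Y = m*X + Z, X in {0,1}, P(X=1) = q, Z ~ N(0,1),
    # from the differential entropy of the Gaussian-mixture output density
    y = np.linspace(-12.0, m + 12.0, 40001)
    phi = lambda u: np.exp(-u**2 / 2.0) / np.sqrt(2.0 * np.pi)
    py = (1.0 - q) * phi(y) + q * phi(y - m)
    hY = np.sum(np.where(py > 1e-300, -py * np.log2(py), 0.0)) * (y[1] - y[0])
    return hY - 0.5 * np.log2(2.0 * np.pi * np.e)   # subtract h(Y|X) = h(Z)

def i_rs(eta):
    # scalar RS potential, eq. (11), with the O(Ka/2^J) term dropped
    reg = (2.0**J / (2.0 * beta)) * ((eta - 1.0) * np.log2(np.e) - np.log2(eta))
    return 2.0**J * mi_bits(np.sqrt(eta * Phat)) + reg
```

Scanning i_rs over η ∈ (0, 1] and taking the smallest local minimizer gives η^{AMP}; the global minimizer gives η^{MAP}.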

SLIDE 23

Result

Let S_in = Ka R_in and E_in = SNR/(2 R_in). In the limit with
– J, Ka → ∞ with J = α log₂ Ka for some α > 1 (as above),
– R_in, SNR → 0 with fixed S_in and E_in (energy-efficient regime),

η = 1 is a global minimizer of i^{RS}_∞ if

S_in (1 − 1/α) < (1/2) log₂(1 + 2 S_in E_in)    (13)

and no local minimum besides the global one exists if

2 S_in < log₂(e) (1 − 1/α)^{−1} − 1/E_in    (14)
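Conditions (13) and (14) are easy to check for concrete operating points; the values of α, E_in, and S_in below are arbitrary examples:

```python
import math

alpha, Ein, Sin = 2.0, 2.0, 1.0    # example operating point

# (13): eta = 1 is a global minimizer of the RS potential
global_min = Sin * (1 - 1 / alpha) < 0.5 * math.log2(1 + 2 * Sin * Ein)

# (14): no local minimum besides the global one exists
unique_min = 2 * Sin < math.log2(math.e) / (1 - 1 / alpha) - 1 / Ein
```

At this point both conditions hold; raising S_in eventually violates (14) first, which matters for AMP decoding (slide 25).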

SLIDE 25

Concatenated Code

How large should the internal power η P̂ be?

If optimal detection is performed at the output of a Gaussian channel:

η P̂ = ( Q^{−1}(p_md) + Q^{−1}(p_fa) )²    (15)

– P̂ = 2^J E_in, therefore the internal power goes to infinity with J for every fixed η, and the error probabilities vanish.
– But: the condition p_fa < c Ka/2^J actually limits the permitted values to some set {η : η > η̄}.
– It turns out that η^{AMP} < η̄ < 1 → asymptotically, AMP can only achieve error-free recovery if the global minimum is the only one.
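Equation (15) can be evaluated with the standard library's inverse normal CDF; the target error probabilities below are illustrative:

```python
from statistics import NormalDist

def Qinv(p):
    """Inverse of the Gaussian tail function Q(x) = P(Z > x)."""
    return NormalDist().inv_cdf(1 - p)

pmd, pfa = 1e-3, 1e-4                          # example target error probabilities
power_needed = (Qinv(pmd) + Qinv(pfa)) ** 2    # required internal power eta*Phat, eq. (15)
```

Since P̂ = 2^J E_in grows with J, any fixed η eventually exceeds this finite requirement, which is why the error probabilities vanish asymptotically.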

SLIDE 26

RS-potential

[Figure: RS potential for α = 2, p_md < 0.05/8, p_fa = 0.01 · Ka/2^J.]

SLIDE 27

RS-potential

[Figure: RS potential for α = 2, S_in = 2, p_md < 0.05/8, p_fa = 0.01 · Ka/2^J.]

SLIDE 28

Concatenated Code

Let S = S_in R_out and E_b/N_0 = E_in/R_out. Using that R_out = 1 − α^{−1} is both achievable and optimal, we get in the same limit as above: there exists an outer code such that, asymptotically, the described concatenated coding scheme with a random sparse regression code as inner code can be decoded with inner MAP decoding and a vanishing per-user error probability if and only if

S < (1/2) log₂(1 + 2 S E_b/N_0)    (16)

If AMP is used as the inner decoder, reliable decoding is possible if and only if

2S < log₂(e) − (E_b/N_0)^{−1}    (17)
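Solving (16) with equality gives the minimal E_b/N_0 for a target spectral efficiency S, and (17) gives the corresponding AMP threshold; the value of S below is an arbitrary example:

```python
import math

S = 0.5                                    # example target spectral efficiency

# (16) with equality: S = 0.5*log2(1 + 2*S*EbN0)  =>  EbN0 = (2^(2S) - 1)/(2S)
ebn0_map = (2 ** (2 * S) - 1) / (2 * S)    # MAP-decoding threshold

# (17): 2S < log2(e) - 1/EbN0, valid only for 2S < log2(e)
ebn0_amp = 1 / (math.log2(math.e) - 2 * S)
```

The gap between the two thresholds quantifies the price of replacing inner MAP decoding by AMP at this operating point.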

SLIDE 29

Summary

– The symmetric AWGN MAC capacity is achievable in an unsourced way, with infinitely many users and without cooperation between users.
– This is the only coding scheme for U-RA that comes with an extensive analysis and asymptotic optimality guarantees. The analytic insight can be used to identify bottlenecks and improve the performance. Furthermore, it can help to analyse extensions of this scheme, like the turbo-like approach of Amalladinne et al. 2020.

SLIDE 30

References I

Amalladinne, Vamsi K. et al. (Sept. 12, 2018). “A Coupled Compressive Sensing Scheme for Uncoordinated Multiple Access”. In: arXiv: 1809.04745.

Amalladinne, Vamsi K. et al. (Jan. 10, 2020). “On Approximate Message Passing for Unsourced Access with Coded Compressed Sensing”. In: arXiv: 2001.03705.

Barbier, Jean and Nicolas Macris (Aug. 1, 2019). “The Adaptive Interpolation Method: A Simple Scheme to Prove Replica Formulas in Bayesian Inference”. In: Probab. Theory Relat. Fields 174.3, pp. 1133–1185. DOI: 10.1007/s00440-018-0879-0.

Barron, Andrew and Antony Joseph (2011). “Sparse Superposition Codes Are Fast and Reliable at Rates Approaching Capacity with Gaussian Noise”. Pp. 1–70.

Bayati, Mohsen and Andrea Montanari (2011). “The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing”. In: IEEE Trans. Inf. Theory 57.2, pp. 764–785. DOI: 10.1109/TIT.2010.2094817.

Berthier, Raphael, Andrea Montanari, and Phan-Minh Nguyen (Aug. 13, 2017). “State Evolution for Approximate Message Passing with Non-Separable Functions”. In: arXiv: 1708.03950.

Calderbank, Robert and Andrew Thompson (Nov. 2, 2018). “CHIRRUP: A Practical Algorithm for Unsourced Multiple Access”. In: arXiv: 1811.00879.

Cohen, A., J. Heller, and A. Viterbi (Oct. 1971). “A New Coding Technique for Asynchronous Multiple Access Communication”. In: IEEE Trans. Commun. Technol. 19.5, pp. 849–855. DOI: 10.1109/TCOM.1971.1090714.

SLIDE 31

References II

Donoho, David L., Arian Maleki, and Andrea Montanari (Nov. 10, 2009). “Message Passing Algorithms for Compressed Sensing”. In: PNAS 106.45, pp. 18914–18919. DOI: 10.1073/pnas.0909892106. arXiv: 0907.3574.

Fengler, Alexander, Peter Jung, and Giuseppe Caire (July 2019a). “SPARCs and AMP for Unsourced Random Access”. In: IEEE Int. Symp. Inf. Theory Proc., pp. 2843–2847.

— (Jan. 18, 2019b). “SPARCs for Unsourced Random Access”. In: arXiv: 1901.06234.

Guo, Dongning, Dror Baron, and Shlomo Shamai (Sept. 2009). “A Single-Letter Characterization of Optimal Noisy Compressed Sensing”. In: 2009 47th Annual Allerton Conference on Communication, Control, and Computing (Allerton), pp. 52–59. DOI: 10.1109/ALLERTON.2009.5394838.

Guo, Dongning and Sergio Verdu (Mar. 23, 2005). “Randomly Spread CDMA: Asymptotics via Statistical Physics”. In: arXiv: cs/0503063.

Marshakov, Evgeny et al. (Sept. 2019). “A Polar Code Based Unsourced Random Access for the Gaussian MAC”. In: 2019 IEEE 90th Vehicular Technology Conference (VTC2019-Fall), pp. 1–5. DOI: 10.1109/VTCFall.2019.8891583.

Polyanskiy, Y. (June 2017). “A Perspective on Massive Random-Access”. In: 2017 IEEE International Symposium on Information Theory (ISIT), pp. 2523–2527. DOI: 10.1109/ISIT.2017.8006984.

Pradhan, Asit Kumar et al. (Nov. 3, 2019). “Polar Coding and Random Spreading for Unsourced Multiple Access”. In: arXiv: 1911.01009.

SLIDE 32

References III

Rangan, Sundeep, Alyson K. Fletcher, and Vivek K. Goyal (Mar. 2012). “Asymptotic Analysis of MAP Estimation via the Replica Method and Applications to Compressed Sensing”. In: IEEE Trans. Inf. Theory 58.3, pp. 1902–1923. DOI: 10.1109/TIT.2011.2177575. arXiv: 0906.3234.

Reeves, Galen and Michael Gastpar (May 2012). “The Sampling Rate-Distortion Tradeoff for Sparsity Pattern Recovery in Compressed Sensing”. In: IEEE Trans. Inf. Theory 58.5, pp. 3065–3092. DOI: 10.1109/TIT.2012.2184848.

Reeves, Galen and Henry D. Pfister (2016). “The Replica-Symmetric Prediction for Compressed Sensing with Gaussian Matrices Is Exact”. In: 2016 IEEE International Symposium on Information Theory (ISIT), pp. 665–669. DOI: 10.1109/ISIT.2016.7541382.

Tanaka, Toshiyuki (2002). “A Statistical-Mechanics Approach to Large-System Analysis of CDMA Multiuser Detectors”. In: IEEE Trans. Inf. Theory 48.11, pp. 2888–2910. DOI: 10.1109/TIT.2002.804053.

Ustinova, Daria et al. (Sept. 2019). “Efficient Concatenated Same Codebook Construction for the Random Access Gaussian MAC”. In: 2019 IEEE 90th Vehicular Technology Conference (VTC2019-Fall), pp. 1–5. DOI: 10.1109/VTCFall.2019.8891568.

Vem, Avinash et al. (Nov. 2017). “A User-Independent Serial Interference Cancellation Based Coding Scheme for the Unsourced Random Access Gaussian Channel”. In: 2017 IEEE Information Theory Workshop (ITW). Kaohsiung, Taiwan: IEEE, pp. 121–125. DOI: 10.1109/ITW.2017.8278023.
