
Coded Modulation: An Information-Theoretic Perspective

Young-Han Kim
http://young-han.kim
Department of ECE, UC San Diego
Annual ACC Workshop, Tel Aviv University, January

Information theory of point-to-point communication

[Figure: message M (k bits) → Encoder → X^n → channel p(y|x) → Y^n → Decoder → M̂, with error probability Pe]

∙ "Baseband" picture of communication
∙ Tradeoff between R = k/n, Pe = P(M ≠ M̂), and n
∙ Capacity C: maximum R such that Pe → 0 as n → ∞

[Figure: Pe versus R, collapsing to 0 for R < C and tending to 1 for R > C as n grows]

Channel coding theorem (Shannon 1948): C = max_{p(x)} I(X; Y)
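As a companion to the theorem (not part of the talk), here is a minimal sketch of computing max_{p(x)} I(X; Y) for a discrete memoryless channel with the Blahut–Arimoto algorithm; the BSC test channel and its crossover probability are my own choices.

```python
import numpy as np

def blahut_arimoto(W, iters=300):
    """Capacity max_{p(x)} I(X;Y) of a DMC with row-stochastic W[x, y]."""
    nx, _ = W.shape
    p = np.full(nx, 1.0 / nx)                 # start from the uniform input
    logW = np.log(np.maximum(W, 1e-300))      # floor avoids log(0); W = 0 terms vanish below
    for _ in range(iters):
        q = p @ W                             # induced output distribution q(y)
        D = (W * (logW - np.log(np.maximum(q, 1e-300)))).sum(axis=1)
        p = p * np.exp(D)                     # multiplicative Blahut-Arimoto update
        p /= p.sum()
    q = p @ W
    I = (p[:, None] * W * (logW - np.log(np.maximum(q, 1e-300)))).sum()
    return I / np.log(2)                      # nats -> bits

# Sanity check on a binary symmetric channel: C = 1 - h(eps)
eps = 0.11
W = np.array([[1 - eps, eps], [eps, 1 - eps]])
print(blahut_arimoto(W))                      # ~0.500 bits per channel use
```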

Gaussian channel

[Figure: M (k bits) → Encoder → X^n → gain g, additive noise Z → Y^n → Decoder → M̂]

∙ Simple model for wireless, wired, and optical communication
∙ Average power constraint: sum_{i=1}^n x_i^2(m) ≤ nP
∙ Channel quality measured by SNR = gP

Channel coding theorem (Shannon 1948): C = (1/2) log(1 + SNR)

Capacity of the Gaussian channel (Forney–Ungerboeck '98)

[Figure: capacity of the Gaussian channel and the performance of modulation/coding schemes]

How to achieve the capacity?

∙ Random coding and joint typicality decoding (Shannon, Forney, Cover)

[Figure: randomly generated codewords x^n(m) scattered over X^n; received sequence y^n in Y^n]

∙ Find a unique m such that (x^n(m), y^n) is jointly typical w.r.t. p(x, y)
∙ Successful w.h.p. if R < I(X; Y)
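The following toy experiment (mine; all parameters arbitrary) mimics the random-coding argument at a small blocklength: an i.i.d. Gaussian codebook with nearest-neighbor decoding. At n = 16 the capacity threshold is blurred, but Pe already separates sharply for rates below versus above C.

```python
import numpy as np

rng = np.random.default_rng(0)

def pe_random_code(R, n=16, snr=1.0, trials=400):
    """Pe of a random i.i.d. Gaussian codebook under nearest-neighbor decoding."""
    M = max(2, int(round(2 ** (n * R))))          # codebook size 2^{nR}
    errors = 0
    for _ in range(trials):
        X = rng.normal(0.0, np.sqrt(snr), (M, n)) # power constraint met on average
        y = X[0] + rng.normal(0.0, 1.0, n)        # send message m = 0 through unit noise
        m_hat = np.argmin(((X - y) ** 2).sum(axis=1))
        errors += (m_hat != 0)
    return errors / trials

C = 0.5 * np.log2(1 + 1.0)                        # = 0.5 bit at SNR = 1
print(pe_random_code(R=0.25), pe_random_code(R=0.75))  # small vs. large Pe
```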

The other half of the story

∙ Average performance of randomly generated codes is good
  – Probabilistic method: there exists a good code
  – Most codes are good
  – Works for any alphabet
  – "All codes are good, except those that we know of" (Wozencraft–Reiffen, Forney)
∙ Coding theory
  – Algebraic: Hamming, Reed–Solomon, BCH, Reed–Muller, polar codes
  – Probabilistic: LDPC, turbo, raptor, spatially coupled codes
∙ Coding practice
  – 4G LTE: turbo and convolutional codes
  – 5G NR: LDPC and polar codes
  – Everything is binary

Coded modulation

[Figure: M (k bits) → binary ECC → C^N → coded modulation → X^n]

∙ Communication rate: R = (code rate) × (modulation rate) = (k/N) · (N/n) = k/n
∙ BPSK: X = {−√P, +√P} and N = n, with 0 ↦ +√P and 1 ↦ −√P
∙ Sometimes multiple, independent (binary) codewords are modulated together
∙ We decompose coded modulation into two operations (a code sketch follows below)
  – Symbol-level mapping: X = ϕ(U1, U2, . . . , UL), Ul ∈ {±1}
  – Block-level mapping: Ul^n = ψl(C^N), l = 1, . . . , L

Multiple layers and symbol-level mapping

[Figure: bit layers U1, U2 mapped by ϕ to the four 4-PAM constellation points]

∙ Natural mapping: X = α(2U1 + U2)
∙ Gray mapping: X = α(U1U2 + 2U1)
∙ Similar mappings ϕ exist for higher-order PAM, QPSK, QAM, PSK, MIMO, ...
  X_QPSK = √P exp(iπ(U1U2 + 2U1)/4)
∙ Can be many-to-one (still information-lossless)
∙ Can induce nonuniform X (Gallager)

Horizontal superposition coding

[Figure: M1 → U1^n and M2 → U2^n, combined symbol by symbol through ϕ into X^n]

∙ Broadcast channels (Cover 1972), fading channels (Shamai–Steiner)
∙ Successive cancellation decoding:
  – Find a unique m1 such that (u1^n(m1), y^n) is jointly typical: R1 < I(U1; Y)
  – Find a unique m2 such that (u2^n(m2), u1^n(m̂1), y^n) is jointly typical: R2 < I(U2; U1, Y)
  – Combined rate (verified numerically below): R1 + R2 < I(U2; Y, U1) + I(U1; Y) = I(U1, U2; Y) = I(X; Y)
  – Regardless of ϕ or the decoding order
∙ Multi-level coding (MLC): Wachsmann–Fischer–Huber (1999)
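The chain-rule identity behind the combined rate can be checked numerically. The sketch below (not from the talk) assumes the Gray-mapped 4-PAM of the previous slide on a real AWGN channel at 10 dB SNR, and evaluates the mutual informations by direct integration on a grid:

```python
import numpy as np

snr = 10.0                                    # assumed SNR (linear), unit-variance noise
pam = {(u1, u2): np.sqrt(snr / 5) * (u1 * u2 + 2 * u1)
       for u1 in (-1, 1) for u2 in (-1, 1)}   # Gray 4-PAM scaled to average power snr
y = np.linspace(-20, 20, 40001)
dy = y[1] - y[0]

def lik(x):                                   # p(y | X = x) on the grid
    return np.exp(-(y - x) ** 2 / 2) / np.sqrt(2 * np.pi)

def mi(ds):
    """I(V; Y) for V uniform over the conditional densities in ds."""
    mix = sum(ds) / len(ds)
    return sum(dy * (d * np.log2(np.maximum(d / np.maximum(mix, 1e-300), 1e-300))).sum()
               for d in ds) / len(ds)

I_XY = mi([lik(x) for x in pam.values()])
I_U1 = mi([0.5 * (lik(pam[(u1, -1)]) + lik(pam[(u1, 1)])) for u1 in (-1, 1)])
I_U2_given_U1 = 0.5 * sum(mi([lik(pam[(u1, -1)]), lik(pam[(u1, 1)])]) for u1 in (-1, 1))
print(I_U1 + I_U2_given_U1, I_XY)             # the two agree up to grid error
```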

Vertical superposition coding

[Figure: a single message M spread over layers U1^n and U2^n, combined by ϕ into X^n]

∙ Single codeword of length 2n: C^{2n} = (C1^n, C_{n+1}^{2n}), with C1^n ↦ U1^n and C_{n+1}^{2n} ↦ U2^n
∙ Treating the other layer as noise:
  – Find a unique m such that (u1^n(m), y^n) is jointly typical and (u2^n(m), y^n) is jointly typical
  – Successful w.h.p. if R < I(U1; Y) + I(U2; Y) < I(U1, U2; Y) = I(X; Y)
∙ Bit-interleaved coded modulation (BICM): Caire–Taricco–Biglieri (1998)
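A structural sketch of the codeword split across layers; the codeword here is placeholder random bits standing in for an actual ECC output:

```python
import numpy as np

# Vertical (BICM-style) mapping: one length-2n codeword feeds both bit layers,
# which are then mapped symbol by symbol to the constellation.
n = 8
rng = np.random.default_rng(3)
c = rng.integers(0, 2, size=2 * n)     # stand-in codeword C^{2n}
u1 = 1 - 2 * c[:n]                     # C_1^n         -> U_1^n in {+1, -1}
u2 = 1 - 2 * c[n:]                     # C_{n+1}^{2n}  -> U_2^n
x = u1 * u2 + 2 * u1                   # Gray 4-PAM symbol map (alpha = 1)
print(x)
```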

Diagonal superposition coding

[Figure: messages M(j−1), M(j) staggered across layers U1^n, U2^n and combined by ϕ into X^n]

∙ Think outside the block: a sequence of messages M(j), each mapped to a codeword C^{2n}(j) whose two halves are sent in consecutive blocks on different layers (see the schedule sketch below)

[Figure: block schedule with C1^n(j) on one layer and C_{n+1}^{2n}(j−1) on the other in block j]

∙ Sliding-window decoding: R < I(U1; U2, Y) + I(U2; Y) = I(X; Y)
∙ Block Markov coding: used extensively in relay and feedback communication
∙ Sliding-window coded modulation (SWCM): Kim et al., Wang et al.
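My rendering of the schedule as a printout; the exact layer assignment could be mirrored relative to the slide's figure, but the stagger is the point:

```python
# Diagonal (block Markov) schedule: the two halves of codeword C^{2n}(j)
# occupy different layers in consecutive blocks, so a sliding window of two
# received blocks sees all of message M(j).
blocks = 6
for j in range(1, blocks + 1):
    first_half = f"C_1^n({j})"
    second_half = f"C_n+1^2n({j - 1})" if j > 1 else "(known filler)"
    print(f"block {j}:  U1 <- {second_half:>16}   U2 <- {first_half}")
```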

Multiple-antenna transmission

∙ Consider the signal layers U1 and U2 as antenna ports: X = (U1, U2)
∙ Bell Laboratories Layered Space-Time (BLAST) architectures:
  – Horizontal: H-BLAST (Foschini et al.), also known as V-BLAST
  – Diagonal: D-BLAST (Foschini 1996)
  – Vertical: single outer code (Foschini et al.), but shouldn't this be "V-BLAST"?
∙ Signal layers can be far more general than antenna ports
∙ Coded modulation can encompass MIMO transmission

[Figure: layers U1, U2, U3, U4 feeding multiple transmit antennas]

Comparison

∙ Horizontal: multi-level coding (MLC)
  – Rates: R1 < I(U1; Y), R2 < I(U2; U1, Y)
  – Drawbacks: short, nonuniversal
∙ Vertical: bit-interleaved coded modulation (BICM)
  – Rate: R < I(U1; Y) + I(U2; Y)
  – Drawback: other layers treated as noise
∙ Diagonal: sliding-window coded modulation (SWCM)
  – Rate: R < I(U1; U2, Y) + I(U2; Y) = I(X; Y)
  – Drawbacks: error propagation, rate loss

(A numerical illustration of the three rows follows below.)

BICM vs. SWCM

[Figure: symmetric rate vs. SNR (dB) for 4-PAM, 8-PAM, and 16-PAM, comparing SWCM and BICM; LTE turbo code with LOG-MAP decoding]

Application: Interference channels

[Figure: cellular layout with desired-signal and interference links]

Optimal rate region (Bandemer–El Gamal–Kim 2015)

[Figure: two-user interference channel with inputs X1, X2 and outputs Y1 ~ p(y1|x1, x2), Y2 ~ p(y2|x1, x2)]

[Figure: rate region in the (R1, R2) plane cut out by R1 < I(X1; Y1 | X2), R1 + R2 < I(X1, X2; Y1), and R1 < I(X1; Y1)]

Low-complexity (implementable) alternatives

[Figure: two-user interference channel p(y1|x1, x2), p(y2|x1, x2) with rates R1, R2]

∙ P2P (point-to-point) decoding, compared numerically below
  – Treating interference as (Gaussian) noise: R1 < I(X1; Y1)
  – Successive cancellation decoding: R2 < I(X2; Y1), R1 < I(X1; Y1 | X2)
∙ + rate splitting (Zhao et al., Wang et al.)
∙ Novel codes
  – Spatially coupled codes (Yedla, Nguyen, Pfister, and Narayanan)
  – Polar codes (Wang and Şaşoğlu)

Sliding-window superposition coding (Wang et al.)

[Figure: sender 1 maps M1(j) through layers U1^n, U2^n into X1^n; sender 2 maps M2(j) directly into X2^n; the receivers observe Y1^n ~ p(y1|x1, x2) and Y2^n ~ p(y2|x1, x2) and apply successive cancellation in different orders; messages follow a block Markov schedule]

∙ Sliding-window coded modulation for sender 1 (without alphabet constraints)
∙ Sliding-window decoding
∙ Successive cancellation decoding:
  R2 < I(X2; Yj | U1)
  R1 < I(U1; Yj) + I(U2; Yj | U1, X2)
∙ Every corner point: different decoding orders
∙ Every point: time sharing or more superposition layers
∙ Extension to Han–Kobayashi (Wang et al.)

Gaussian channel performance (Park–Kim–Wang)

[Figure: symmetric rate vs. INR (dB) at fixed SNR, comparing MLD, SWCM, SWCM (turbo), IAN, and IAN (turbo); SWCM shows gains of 25%, 96%, and 154% over IAN at the marked points. LTE turbo code.]

System-level performance (Kim et al.)

[Figure: system-level simulation results]

Cooper's Law

[Figure: growth of wireless capacity over time; source: ArrayComm, Zander–Mähönen]

∙ Gain over the years ∝ η · Wsys · NBS
  – Spectral efficiency η
  – System bandwidth Wsys
  – Number of base stations NBS (spatial reuse of frequency)

Concluding remarks

∙ Coded modulation as superposition coding
  – Simple and unifying picture
  – Framework for new coded modulation schemes
∙ Open problems
  – Finer analysis: single-shot method (Verdú)
  – Shaping and dependence (à la Marton): CCDM (Böcherer et al.)
∙ To learn more
  – Kramer and Kim, "Network information theory for cellular wireless," in Information Theoretic Perspectives on 5G Systems and Beyond, eds. Shamai, Simeone, and Maric
  – Wang et al., "Sliding-window superposition coding: Two-user interference channels," arXiv
  – Kim et al., "Interference management via sliding-window coded modulation for 5G cellular networks," IEEE Commun. Mag.