Sum Rate of Gaussian Multiterminal Source Coding - Pramod Viswanath - PowerPoint PPT Presentation

SLIDE 1

Sum Rate of Gaussian Multiterminal Source Coding

Pramod Viswanath University of Illinois, Urbana-Champaign March 19, 2003

SLIDE 2

Gaussian Multiterminal Source Coding

[Diagram: source n = (n1, ..., nN); encoder k (k = 1, ..., K) observes yk = hk^t n and sends message mk to a joint decoder, which outputs the reconstruction n̂]

Sum of rates of encoders R and distortion metric d(n, n̂)

SLIDE 3

Quadratic Gaussian CEO Problem

[Diagram: encoder k (k = 1, ..., K) observes yk = n + zk and sends message mk to a joint decoder, which outputs the reconstruction n̂]

Sum of rates of encoders R and distortion metric d(n, n̂)

SLIDE 4

Result: Quadratic Gaussian CEO

  • Quadratic distortion metric d(n, n̂) = (n − n̂)²
  • For a large number of encoders K,

    R(D) = (1/2) log⁺(σn²/D) + (σz²/(2σn²)) (σn²/D − 1)⁺

    – Second term is the loss w.r.t. cooperating encoders
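The R(D) expression above is straightforward to evaluate numerically. A minimal sketch, assuming natural logarithms (nats) and treating the variances as free parameters; the function name and the example numbers are illustrative, not from the talk:

```python
import math

def ceo_rate(D, var_n, var_z):
    """Sum rate R(D) of the quadratic Gaussian CEO problem in the
    many-encoder limit: (1/2)log+(var_n/D) + (var_z/(2 var_n))(var_n/D - 1)+."""
    log_term = max(0.0, 0.5 * math.log(var_n / D))
    loss_term = (var_z / (2.0 * var_n)) * max(0.0, var_n / D - 1.0)
    return log_term + loss_term

# With noiseless observations (var_z = 0) the loss term vanishes and R(D)
# reduces to the point-to-point Gaussian rate-distortion function.
r_pt = ceo_rate(0.25, var_n=1.0, var_z=0.0)
r_ceo = ceo_rate(0.25, var_n=1.0, var_z=1.0)
```

The gap `r_ceo - r_pt` is exactly the second term of the formula: the price of non-cooperating encoders.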

SLIDE 5

Outline

  • Problem Formulation:
    Tradeoff between sum rate R and distortion (metric d(n, n̂))
  • Main Result:
    Characterize a class of distortion metrics for which there is no loss in sum rate compared with encoder cooperation
    – A multiple-antenna test channel

SLIDE 6

Random Binning of Slepian-Wolf

[Figure: the (y1, y2) plane partitioned into random bins; the symbols ∆, O, × mark bin membership]

  • Rate is the number of quantizers
SLIDE 7

Encoding in Slepian-Wolf

[Figure: the same randomly binned (y1, y2) plane as on the previous slide]

  • Quantizer closest to realization
SLIDE 8

Decoding in Slepian-Wolf

  • Decoder knows the joint distribution of y1, y2
  • It is given the two quantizer numbers from the encoders
  • Picks the pair of points in the quantizers which best matches the joint distribution
    – For jointly Gaussian y1, y2: a nearest-neighbor type test
  • R1 + R2 = H(y1, y2) is sufficient for zero distortion
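The sufficiency claim R1 + R2 = H(y1, y2) can be sanity-checked on a toy joint distribution; a small sketch (the pmf below is hypothetical, not from the talk):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical binary pair: y2 agrees with y1 ninety percent of the time.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
p_y1 = [sum(p for (a, _), p in joint.items() if a == v) for v in (0, 1)]
p_y2 = [sum(p for (_, b), p in joint.items() if b == v) for v in (0, 1)]

sw_sum_rate = entropy(joint.values())           # H(y1, y2): enough for both
naive_sum_rate = entropy(p_y1) + entropy(p_y2)  # ignoring the correlation
```

Slepian-Wolf binning lets the two encoders reach `sw_sum_rate` bits per symbol without communicating with each other, strictly less than `naive_sum_rate` whenever y1, y2 are correlated.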
SLIDE 9

Deterministic Broadcast Channel

[Diagram: input x; outputs y1 = f(x) and y2 = g(x)]

  • Pick distribution on x such that y1, y2 have the desired joint distribution (Cover 98)
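A concrete instance of this construction, with hypothetical deterministic maps f, g chosen so that x ↦ (f(x), g(x)) is a bijection onto the output pairs:

```python
# Hypothetical deterministic broadcast channel on x in {0, 1, 2, 3}:
# y1 is the high bit, y2 the low bit, so x -> (y1, y2) is a bijection.
def f(x):
    return x >> 1

def g(x):
    return x & 1

# Desired joint distribution on (y1, y2); placing exactly this mass on
# the corresponding x induces it at the channel outputs.
target = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
p_x = {x: target[(f(x), g(x))] for x in range(4)}
```

When (f, g) is not jointly invertible, only joint distributions induced by some p(x) are reachable; the bijective toy case above sidesteps that subtlety.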

SLIDE 10

Slepian-Wolf code for Broadcast Channel

  • Encoding: implement Slepian-Wolf decoder
    – given two messages, find the appropriate pair y1, y2 in the two quantizers
    – transmit x that generates this pair y1, y2

SLIDE 11

Slepian-Wolf code for Broadcast Channel

  • Encoding: implement Slepian-Wolf decoder
    – given two messages, find the appropriate pair y1, y2 in the two quantizers
    – transmit x that generates this pair y1, y2
  • Decoding: implement Slepian-Wolf encoder
    – quantize y1, y2 to the nearest point
    – messages are the quantizer numbers

SLIDE 12

Lossy Slepian-Wolf Source Coding

[Figure: the (u1, u2) plane partitioned into random bins, as in the Slepian-Wolf picture]

  • Approximate y1, y2 by u1, u2
SLIDE 13

Lossy Slepian-Wolf Source Coding

  • Encoding: find ui that matches source yi, separately for each i
    – For jointly Gaussian r.v.'s, a nearest-neighbor calculation
    – Each encoder sends the number of the quantizer containing the u picked

SLIDE 14

Lossy Slepian-Wolf Source Coding

  • Encoding: find ui that matches source yi, separately for each i
    – For jointly Gaussian r.v.'s, a nearest-neighbor calculation
    – Each encoder sends the number of the quantizer containing the u picked
  • Decoding: reconstruct the u's picked by the encoders
    – reconstruction based on the joint distribution of the u's
    – Previously ui = yi were correlated
    – Here the u's are independently picked

SLIDE 15

Lossy Slepian-Wolf

  • We require

    p[u1, ..., uK | y1, ..., yK] = ∏_{i=1}^K p[ui | yi]

SLIDE 16

Lossy Slepian-Wolf

  • We require

    p[u1, ..., uK | y1, ..., yK] = ∏_{i=1}^K p[ui | yi]

  • Generate n̂1, ..., n̂K deterministically from the reconstructed u's
  • Need sum rate

    Rsum = I(u1, ..., uK; y1, ..., yK)

  • Distortion equal to E[d(n, n̂)]
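For K = 1 this recipe can be carried out in closed form with a Gaussian test channel. A sketch with hypothetical variances, taking y itself as the source to keep the algebra short:

```python
import math

# Test channel u = y + q with q ~ N(0, var_q) independent of y ~ N(0, var_y);
# the reconstruction is the MMSE estimate E[y | u], a deterministic map of u.
var_y, var_q = 1.0, 1.0 / 3.0

sum_rate = 0.5 * math.log(1.0 + var_y / var_q)   # I(u; y) in nats
distortion = var_y * var_q / (var_y + var_q)     # E[(y - E[y|u])^2]
```

As var_q sweeps over (0, ∞), the pair (sum_rate, distortion) traces out R(D) = (1/2) log(var_y / D), the scalar Gaussian rate-distortion function, consistent with the sum rate I(u; y) on this slide.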

SLIDE 17

Marton Coding for Broadcast Channel

[Diagram: encoder maps u1, u2 to input x; the channel applies Ht and adds noise w ~ N(0, I) to give y; Dec1 and Dec2 output û1, û2]

  • Reversed encoding and decoding operations
  • Sum rate I(u1, ..., uK; y1, ..., yK)
  • No use for the Markov property

    p[u1, ..., uK | y1, ..., yK] = ∏_{i=1}^K p[ui | yi]

SLIDE 18

Achievable Rates: Costa Precoding

[Diagram: messages m1, m2 are carried on u1, u2 and combined into x; the channel applies Ht and adds w; the decoders output m̂1, m̂2]

  • Users' data modulated onto spatial signatures u1, u2
SLIDE 19

Stage 1: Costa Precoding

[Diagram: as above; Dec1 recovers m̂1]

  • Encoding for user 1 treating the signal from user 2 as known interference at the transmitter

SLIDE 20

Stage 2

[Diagram: as above; Dec2 recovers m̂2]

  • Encode user 2 treating the signal for user 1 as noise
SLIDE 21

Adaptation to Lossy Slepian-Wolf

[Diagram: encoder maps u1, u2 to x; the channel applies Ht and adds z ~ N(0, Kz) to give y; the decoder outputs û1, û2]

  • Joint distribution of the u's and y's depends on the noise z
  • Performance independent of correlation in z
SLIDE 22

Noise Coloring

  • Fix a particular Costa coding scheme: this fixes the u's and x
  • Idea: choose z such that

    p[u1, ..., uK | y1, ..., yK] = ∏_{i=1}^K p[ui | yi]   and   (Kz)ii = 1

  • Then can adapt to Lossy Multiterminal Source Coding
SLIDE 23

Markov Condition and Broadcast Channel

  • The Markov condition

    p[u1, ..., uK | y1, ..., yK] = ∏_{i=1}^K p[ui | yi]

  • Of independent interest in the broadcast channel
SLIDE 24

Markov Condition and Broadcast Channel

  • The Markov condition

    p[u1, ..., uK | y1, ..., yK] = ∏_{i=1}^K p[ui | yi]

  • Of independent interest in the broadcast channel
  • By the chain rule,

    p[u1, ..., uK | y1, ..., yK] = ∏_{i=1}^K p[ui | y1, ..., yK, u1, ..., ui−1]

SLIDE 25

Markov Condition and Broadcast Channel

  • The Markov condition

    p[u1, ..., uK | y1, ..., yK] = ∏_{i=1}^K p[ui | yi]

  • Of independent interest in the broadcast channel
  • By the chain rule,

    p[u1, ..., uK | y1, ..., yK] = ∏_{i=1}^K p[ui | y1, ..., yK, u1, ..., ui−1]

  • Equivalent to: given u1, ..., ui−1,

    ui → yi → (y1, ..., yi−1, yi+1, ..., yK)

SLIDE 26

Implication

[Diagram: encoder maps u1, u2 to x; the channel applies Ht and adds z ~ N(0, Kz) to give y; the decoder outputs û1, û2]

  • Need only y1 to decode u1
  • Given u1, need only y2 to decode u2

Performance of the Costa scheme equals that when receivers cooperate

SLIDE 27

Markov Condition and Noise Covariance

  • The sum capacity is also achieved by such a scheme (CS 01, YC 01, VT 02, VJG 02)
  • For every Costa scheme, there is a choice of Kz such that the Markov condition holds (Yu and Cioffi, 01)

SLIDE 28

Sum Capacity

[Diagram: four channel models linked in a chain: Broadcast channel (matrix Ht, noise z ~ N(0, Kz)) and Receivers Cooperating, related by Sato's bound with the constraint E[x^t Kz x] ≤ P; Multiple Access channel (matrix H, x1, x2 independent) and Cooperating Transmitters, related to them by reciprocity]

SLIDE 29

Sum Capacity

[Diagram: four channel models linked in a chain: Broadcast channel (matrix Ht, noise z ~ N(0, Kz)) and Receivers Cooperating, related by Sato's bound with the constraint E[x^t Kz x] ≤ P; Multiple Access channel (matrix H, x1, x2 independent) and Cooperating Transmitters, related to them by reciprocity; Convex Duality connects the broadcast and multiple access sides]

SLIDE 30

Gaussian Multiterminal Source Coding

[Diagram: source n = (n1, ..., nN); encoder k (k = 1, ..., K) observes yk = hk^t n and sends message mk to a joint decoder, which outputs the reconstruction n̂]

H = [h1, ..., hK]. Sum of rates of encoders R and distortion metric d(n, n̂)

SLIDE 31

Main Result

  • Distortion metric

    d(n, n̂) = (1/N) (n − n̂)^t (I + H diag{p1, ..., pK} H^t) (n − n̂)

    – Here p1, ..., pK are the powers of the users in the reciprocal MAC
  • Rate distortion function

    R(D) = Sum rate of MAC − N log D
         = Sum rate of Broadcast Channel − N log D
         = log det(I + H diag{p1, ..., pK} H^t) − N log D
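The final expression is easy to check numerically. A sketch with pure-Python linear algebra for N = K = 2; the matrix, powers, and target distortion below are hypothetical, and log is natural:

```python
import math

def logdet2(M):
    """Log-determinant of a 2x2 matrix with positive determinant."""
    return math.log(M[0][0] * M[1][1] - M[0][1] * M[1][0])

def rd_sum_rate(Hm, p, D):
    """R(D) = log det(I + H diag(p) H^t) - N log D for N = K = 2.
    Hm has columns h1, h2; p lists the reciprocal-MAC powers."""
    N = 2
    # G = I + H diag(p) H^t, built entry by entry.
    G = [[(1.0 if i == j else 0.0)
          + sum(Hm[i][k] * p[k] * Hm[j][k] for k in range(2))
          for j in range(2)] for i in range(2)]
    return logdet2(G) - N * math.log(D)

# Identity H decouples the encoders: G = diag(1 + p1, 1 + p2).
r = rd_sum_rate([[1.0, 0.0], [0.0, 1.0]], p=[1.0, 1.0], D=0.5)
```

With H = I and unit powers the log det term is log 4, and halving D adds N log 2 to the rate, matching the −N log D slope of the formula.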

SLIDE 32

Bells and Whistles

  • For the quadratic distortion metric

    d(n, n̂) = (1/N) (n − n̂)^t (n − n̂)

    the set of H can be characterized
  • Analogy with the CEO problem: for a large number of encoders and random H, characterization of R(D) almost surely

SLIDE 33

Discussion

  • A "connection" made between coding schemes for multiterminal source and channel coding

SLIDE 34

Discussion

  • A "connection" made between coding schemes for multiterminal source and channel coding
  • Connection somewhat superficial
    – relation between source coding and broadcast channel through a common random coding argument (PR 02, CC 02)
    – relation between source coding and multiple access channel through a change of variable (VT 02, JVG 01)

SLIDE 35

Discussion

  • A "connection" made between coding schemes for multiterminal source and channel coding
  • Connection somewhat superficial
    – relation between source coding and broadcast channel through a common random coding argument
    – relation between source coding and multiple access channel through a change of variable
  • Connection is suggestive
    – a codebook-level duality