FROM ALMOST GAUSSIAN TO GAUSSIAN
Max H. M. Costa and Olivier Rioul
Unicamp and Télécom ParisTech
22/09/2014, MaxEnt 2014, Amboise, France
Summary

- Gaussian Interference Channel: standard form
- Brief history
- Z-Interference channel
- Degraded Interference channel
- Corner points of the capacity region
- Upper bound
- Lower bound
- Discussion
Standard Gaussian Interference Channel
[Diagram: two transmitters with powers P1 and P2, cross gains a and b; messages W1, W2 are encoded and estimates Ŵ1, Ŵ2 decoded at the two receivers.]
Z-Gaussian Interference Channel
The possibilities — things we can do with interference:

1. Ignore it: treat interference as noise (IAN)
2. Avoid it: divide the signal space (TDM/FDM)
3. Partially decode both interfering signals
4. Partially decode one, fully decode the other
5. Fully decode both (only good for strong interference, a ≥ 1)
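For a symmetric instance (equal powers P, cross gain a, unit noise variance), options 1, 2 and 5 have closed-form achievable rates. The following sketch is illustrative only — the function names and parameter values are not from the talk:

```python
from math import log2

def c(snr):
    """AWGN capacity 0.5*log2(1 + SNR) in bits per channel use."""
    return 0.5 * log2(1 + snr)

def rate_ian(P, a):
    """Option 1: treat the interfering signal (power a^2*P) as noise."""
    return c(P / (1 + a**2 * P))

def rate_tdm(P):
    """Option 2: each user transmits half the time, bursting at power 2P."""
    return 0.5 * c(2 * P)

def rate_full_decode(P, a):
    """Option 5: both receivers decode both messages (MAC-like);
    symmetric per-user rate, sensible for strong interference a >= 1."""
    return min(c(P), 0.5 * c(P + a**2 * P))

P = 10.0
print(rate_ian(P, 0.1), rate_tdm(P), rate_full_decode(P, 2.0))
```

Note that at Carleial's very-strong-interference threshold a² = 1 + P, the MAC sum-rate constraint 0.5·c(P + a²P) equals the single-user capacity c(P), consistent with "very strong interference does not reduce capacity."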
Brief history
Carleial (1975): Very strong interference does not reduce capacity (a² ≥ 1 + P)

Sato (1981), Han and Kobayashi (1981): Strong interference (a² ≥ 1): the IFC behaves like two MACs

Motahari and Khandani (2009), Shang, Kramer and Chen (2009), Annapureddy and Veeravalli (2009): Very weak interference (2a(1 + a²P) ≤ 1): treat interference as noise (IAN)
History (continued)
Sason (2004): Symmetric superposition to beat TDM; found part of the optimal choice for α

Etkin, Tse and Wang (2008): Capacity to within 1 bit; good heuristic choice αP = 1/a²
Degraded Gaussian Interference Channel
Differential capacity
Discrete-time channel as a band-limited channel
Gaussian Broadcast Channel
Superposition coding
[Diagram: transmit power split into αP and (1−α)P; noise N2 at the weaker receiver.]
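The superposition-coding rates for the degraded Gaussian broadcast channel under a power split α can be sketched as follows. This is the standard rate pair, but the helper name and the numeric noise variances N1 ≤ N2 are illustrative:

```python
from math import log2

def bc_rates(P, alpha, N1, N2):
    """Superposition coding on the degraded Gaussian BC (N1 <= N2).
    The weak user (noise N2) decodes the cloud center of power
    (1-alpha)*P, treating the strong user's alpha*P as noise;
    the strong user (noise N1) peels the cloud off first."""
    R1 = 0.5 * log2(1 + alpha * P / N1)
    R2 = 0.5 * log2(1 + (1 - alpha) * P / (alpha * P + N2))
    return R1, R2

# Sweep the power split: alpha = 0 gives everything to the weak user,
# alpha = 1 gives everything to the strong user.
R1, R2 = bc_rates(P=10.0, alpha=0.3, N1=1.0, N2=4.0)
print(R1, R2)
```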
Multiple Access Channel
Degraded Interference Channel – One Extreme Point
Degraded Interference Channel – Another Extreme Point
Degraded Gaussian Interference Channel
Key variables
Let Z1 + Z2 + X2 be distributed as f (note: X2 is a codebook)
Let Z1 + Z2 + Z3 be distributed as g
Z1, Z2, Z3 are Gaussian variables
Have: h(g) − h(f) ≤ nε1 (the almost-Gaussian hypothesis)
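As a toy numeric illustration of the almost-Gaussian hypothesis (all variances below are made up, and f is modeled as if the codebook term X2 looked Gaussian, which is exactly the approximation the hypothesis quantifies):

```python
from math import log, pi, e

def h_gauss(var):
    """Differential entropy (in nats) of a Gaussian with variance var."""
    return 0.5 * log(2 * pi * e * var)

# Z1, Z2, Z3 independent Gaussians: g is the density of Z1 + Z2 + Z3,
# so its variance is the sum of the three (illustrative values).
v1, v2, v3 = 1.0, 0.5, 0.25
h_g = h_gauss(v1 + v2 + v3)

# Stand-in for f (density of Z1 + Z2 + X2): if X2 happened to look
# Gaussian with variance near v3, the gap h(g) - h(f) is small --
# the "almost Gaussian" regime of the hypothesis.
vX2 = 0.26
h_f = h_gauss(v1 + v2 + vX2)
gap = h_g - h_f
print(gap)
```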
Key variables (cont.)
Y1 = X1 + Z1
Y2 = X1 + Z1 + Z2 + X2
Y3 = X1 + Z1 + Z2 + Z3
X1 ~ p, so Y2 ~ f•p and Y3 ~ g•p (• denotes convolution)
The missing inequality
Need a Fano-type inequality based on a non-disturbance criterion: −nε ≤ h(Y3) − h(Y2) ≤ nε (with diminishing ε)
Upper bound on h(Y3) – h(Y2)
I(X1;Y2) = I(X1;Y2|X2) − I(X1;X2|Y2)
         ≥ I(X1;Y2|X2) − nε2
         ≥ H(X1) − H(X1 | X1+Z1+Z2) − nε2
         = I(X1; X1+Z1+Z2) − nε2
         ≥ I(X1;Y3) − nε2    (by the data processing inequality, DPI)

Therefore
h(Y3) − h(Y2) ≤ h(Y3|X1) − h(Y2|X1) + nε2 = h(g) − h(f) + nε2 ≤ nε1 + nε2
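The DPI step I(X1; X1+Z1+Z2) ≥ I(X1; Y3) can be sanity-checked in closed form when everything is Gaussian: adding the independent noise Z3 can only lower the mutual information with X1. A small sketch (the noise variances are illustrative, not from the talk):

```python
from math import log2

def mi_awgn(P, N):
    """I(X; X + Z) for X ~ N(0, P) and independent Z ~ N(0, N), in bits."""
    return 0.5 * log2(1 + P / N)

P, N12, N3 = 10.0, 1.5, 0.25   # power, var(Z1+Z2), var(Z3)
mi_before = mi_awgn(P, N12)       # I(X1; X1+Z1+Z2)
mi_after = mi_awgn(P, N12 + N3)   # I(X1; Y3) = I(X1; X1+Z1+Z2+Z3)
print(mi_before, mi_after)
```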
Lower Bound on h(Y3) – h(Y2)
h(f) = −∫ f log f,  h(g) = −∫ g log g
Cross entropies: −∫ f log g and −∫ g log f, giving
D(f||g) = ∫ f log (f/g),  D(g||f) = ∫ g log (g/f)

Smoothing by p:
h(Y2) = −∫ f•p log f•p,  h(Y3) = −∫ g•p log g•p
By the DPI:
0 ≤ D(f•p||g•p) ≤ D(f||g) ≤ nε1
0 ≤ D(g•p||f•p) ≤ D(g||f) ≤ nε1
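The smoothing step can be checked numerically in a discrete analogue: convolving two pmfs with a common kernel p amounts to adding independent noise, a data-processing operation, so the KL divergence can only shrink. A toy sketch with made-up pmfs:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p||q) for discrete pmfs, in nats."""
    return float(np.sum(p * np.log(p / q)))

# Toy pmfs standing in for f and g, and a smoothing kernel p.
f = np.array([0.5, 0.3, 0.2])
g = np.array([0.4, 0.4, 0.2])
p = np.array([0.25, 0.5, 0.25])

# Discrete analogue of f•p and g•p: the pmf of an independent sum
# is the convolution of the pmfs.
f_p = np.convolve(f, p)
g_p = np.convolve(g, p)

# DPI: smoothing both distributions by the same p shrinks the divergence.
print(kl(f, g), kl(f_p, g_p))
```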
Lower Bound (cont.)
Conjecture: We argue by continuity that (f•p − g•p) log f•p does not change sign. This implies: h(Y3) − h(Y2) ≥ −2nε1
Rationale
0 ≤ D(g•p||f•p) = ∫ g•p log g•p − ∫ g•p log f•p + ∫ f•p log f•p − ∫ f•p log f•p
  = h(f•p) − h(g•p) + ∫ (f•p − g•p) log f•p
  ≤ D(g||f) ≤ ∫ (f − g) log f ≤ 2nε1

Equivalently:
h(Y3) − h(Y2) ≥ ∫ (f•p − g•p) log f•p + ∫ (g − f) log f ≥ −2nε1
Special case
Let f = g + Δf. Then expand:

0 ≤ D(f•p||g•p) = ∫ f•p log f•p − ∫ f•p log g•p + ∫ g•p log g•p − ∫ g•p log g•p
  = h(g•p) − h(f•p) + ∫ (g•p − f•p) log g•p
  = h(Y3) − h(Y2) − ∫ Δf•p log g•p

If g̃ = g − Δf = 2g − f is also a valid density, then the lower bound can be proved by symmetry with the upper bound.
Remarks
Somewhat surprisingly, h(Y2) can be greater than h(Y3).

This comes close to establishing the corner points of the capacity region of the standard interference channel.

To whisper or to shout: to cause no disturbance, X1 needs to be decodable at Y2. Better to shout!
Many thanks!