Compound channel with feedback: Opportunistic capacity and error exponents (PowerPoint PPT Presentation)

SLIDE 1

Compound channel with feedback: Opportunistic capacity and error exponents

Aditya Mahajan and Sekhar Tatikonda Yale University

March 17, 2010 CISS

SLIDE 2-4

Compound channel

Channel model

  • P(Y_t | X^t, Y^{t-1}) = Q_θ(Y_t | X_t)
  • θ ∈ Θ; the family {Q_θ : θ ∈ Θ} is known to the encoder and decoder (the realized θ is not).

Capacity (without feedback)

  C = max_{P ∈ P(X)} inf_{θ ∈ Θ} I(P, Q_θ)

Capacity with feedback

  C_f = inf_{θ ∈ Θ} max_{P ∈ P(X)} I(P, Q_θ)
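The gap between the two order-of-operations can be checked numerically. Below is a small sketch of ours (not from the slides), using a hypothetical two-channel family {BSC(0.1), Z-channel with erasure-to-1 probability 0.5} and a grid search over binary input distributions; max-min (compound capacity) is never larger than min-max (feedback capacity).

```python
import numpy as np

def mutual_info(p, W):
    """I(P, W) in bits for binary input distribution (1-p, p) and a 2x2 channel W."""
    P = np.array([1 - p, p])
    joint = P[:, None] * W                     # joint distribution of (X, Y)
    py = joint.sum(axis=0)                     # output marginal
    ratio = np.where(joint > 0, W / py, 1.0)   # W(y|x)/P(y); 1.0 where joint = 0
    return float((joint * np.log2(ratio)).sum())

bsc = np.array([[0.9, 0.1], [0.1, 0.9]])       # BSC(0.1)
zch = np.array([[1.0, 0.0], [0.5, 0.5]])       # Z-channel
grid = np.linspace(0.001, 0.999, 999)

# Compound capacity: one input distribution must work for every channel.
maxmin = max(min(mutual_info(p, bsc), mutual_info(p, zch)) for p in grid)
# Feedback capacity: worst channel, but input tuned to it.
minmax = min(max(mutual_info(p, bsc) for p in grid),
             max(mutual_info(p, zch) for p in grid))
```

For this family minmax equals the Z-channel capacity log2(1.25) ≈ 0.322 bits, while maxmin is constrained by having to serve both channels with one input law.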

SLIDE 5

Feedback capacity, as defined above, is pessimistic: it guards against the worst channel in the family, even though feedback lets the terminals learn the realized channel. This motivates an opportunistic notion of capacity.

SLIDE 6

Outline

  • Variable length coding scheme
      - Achievable rate and opportunistic capacity
      - Probability of error and error exponents
  • Literature overview
      - Variable length communication over DMC
      - Variable length communication over compound channel
  • Main result
      - Lower bound on error exponent region
      - Achievable coding scheme
  • Example

SLIDE 7

Variable length coding

Assume Θ = {1, …, M}. A variable length coding scheme is a tuple (𝐍, c, g, τ).

  • Compound message sizes: 𝐍 = (N_1, …, N_M). Let ℕ = ∏_{ℓ=1}^{M} {1, …, N_ℓ}.
  • Encoding strategy: c = (c_1, c_2, …), with c_t : ℕ × 𝕐^{t-1} → 𝕏
  • Decoding strategy: g = (g_1, g_2, …), with g_t : 𝕐^t → ⋃_{ℓ=1}^{M} {(ℓ, 1), (ℓ, 2), …, (ℓ, N_ℓ)}
  • Stopping time τ with respect to the channel output process Y^t

SLIDE 8

Operation of the scheme

  • Compound message 𝐗 = (X_1, …, X_M), where X_ℓ is uniformly distributed in {1, …, N_ℓ}
  • Encoder: X_1 = c_1(𝐗), X_2 = c_2(𝐗, Y_1), ⋯
  • Decoder: (θ̂, X̂) = g_τ(Y_1, …, Y_τ)

SLIDE 9

Performance metrics

Probability of error: 𝐐 = (P_1, …, P_M), where

  • P_ℓ = ℙ_ℓ( (θ̂, X̂) ≠ (ℓ, X_ℓ) )

Rate: 𝐒 = (R_1, …, R_M), where

  • R_ℓ = 𝔼_ℓ[ log N_θ̂ ] / 𝔼_ℓ[ τ ]

SLIDE 10-14

Operational interpretation

Encoder → Channel → Decoder

Variable length communication using (𝐍, c, g, τ). A higher level application generates an infinite bit-stream. Let N_max = max{N_1, …, N_M} and N_min = min{N_1, …, N_M}.

Encoding

  • The transmitter picks log N_max bits from the bit-stream.
  • X_ℓ is the decimal expansion of the first log N_ℓ of these bits.

Decoding

  • At stopping time τ, the receiver passes (θ̂, X̂) to a higher layer application.
  • The transmitter removes log N_θ̂ bits from the log N_max initially chosen bits and returns the remaining bits to the bit-stream.

Advantage of being opportunistic: log N_θ̂ − log N_min
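The bit-stream bookkeeping above can be sketched in a few lines. This is our own toy illustration with hypothetical sizes 𝐍 = (16, 256); the function name and stream contents are made up for the example.

```python
import math
from collections import deque

def transmit_one_block(bitstream, N, theta_hat):
    """Pick log2(max N) bits, use the first log2(N[theta_hat]) of them as the
    delivered message index, and return the unused bits to the stream front."""
    picked = [bitstream.popleft() for _ in range(int(math.log2(max(N))))]
    used = int(math.log2(N[theta_hat]))
    index = int("".join(map(str, picked[:used])), 2)  # bits actually delivered
    for b in reversed(picked[used:]):                 # unused bits go back, in order
        bitstream.appendleft(b)
    return index

stream = deque([1, 0, 1, 1, 0, 0, 1, 0, 1, 1])
N = (16, 256)                                  # N_min = 16, N_max = 256
idx = transmit_one_block(stream, N, theta_hat=0)
# 8 bits picked, 4 used (channel 0 supports only 16 messages), 4 returned
```

The opportunistic advantage is visible here: had the decoder identified channel 1, all 8 picked bits would have been consumed instead of 4.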

SLIDE 15-16

Opportunistic capacity

Achievable Rate

Rate 𝐒 = (R_1, …, R_M) is achievable if there exists a sequence of coding schemes such that for every ζ > 0 and sufficiently large n, and for all ℓ = 1, …, M,

  1. lim_{n→∞} 𝔼_ℓ[τ⁽ⁿ⁾] = ∞
  2. P_ℓ⁽ⁿ⁾ < ζ and R_ℓ⁽ⁿ⁾ > R_ℓ − ζ

The union of all achievable rates is called the opportunistic capacity region ℂ.

SLIDE 17-18

Error Exponents

Error exponent

Given a sequence of coding schemes that achieves a rate vector 𝐒, the error exponent 𝐅 = (E_1, …, E_M) is given by

  E_ℓ = lim_{n→∞} ( −log P_ℓ⁽ⁿ⁾ ) / 𝔼_ℓ[τ⁽ⁿ⁾]

For a particular rate 𝐒, the union of all achievable error exponents is called the error exponent region 𝔼(𝐒).

SLIDE 19

Outline

  • Variable length coding scheme
      - Achievable rate and opportunistic capacity
      - Probability of error and error exponents
  • Literature overview
      - Variable length communication over DMC
      - Variable length communication over compound channel
  • Main result
      - Lower bound on error exponent region
      - Achievable coding scheme
  • Example

SLIDE 20

Variable length communication over DMC

A special case of a compound channel with |Θ| = 1.

Burnashev-76, “Data transmission over a discrete channel with feedback: Random transmission time”.

Burnashev exponent: E_B(R) = C_1 (1 − β), where β = R/C and C_1 = max_{x, x′} D( Q(⋅|x) ∥ Q(⋅|x′) ).
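As a quick numeric check (our code, not the slides'), here is the Burnashev coefficient and exponent for a BSC; with crossover probability 0.1, C ≈ 0.531 bits and C_1 = 0.8·log2(9) ≈ 2.536, the numbers used later in the deck.

```python
import math

def kl_bernoulli(a, b):
    """D(Bern(a) || Bern(b)) in bits."""
    return a * math.log2(a / b) + (1 - a) * math.log2((1 - a) / (1 - b))

def burnashev_exponent(rate, p):
    """E_B(R) = C_1 (1 - R/C) for a BSC(p)."""
    capacity = 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)  # 1 - h(p)
    c1 = kl_bernoulli(1 - p, p)   # max over input pairs reduces to D(1-p || p)
    return c1 * (1 - rate / capacity)
```

At zero rate the exponent equals C_1 itself, and it decreases linearly to zero at capacity.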

SLIDE 21

Variable length communication over DMC

Achievability scheme

Yamamoto-Itoh-79, “Asymptotic performance of a modified Schalkwijk-Barron scheme with noiseless feedback”.

  • Message mode: fixed length code at rate C − ζ over the first part of the block.
  • Control mode: send ACCEPT or REJECT over the remaining channel uses.
  • Repeat until ACCEPT is received.

SLIDE 22

Advantage of variable length communication

[Table: rate vs. error exponent for a BSC with C = 0.53 bits; entries include 0.728, 0.322, and Burnashev's exponent C_1 = 2.536]

SLIDE 23

Variable length comm. over compound channel

Tchamkerten-Telatar-06, “Variable length coding over unknown channel”.

Can we achieve the Burnashev exponent even if we do not know the channel?

SLIDE 24

Variable length comm. over compound channel

Negative result (Tchamkerten-Telatar-06)

Attention restricted to schemes with R_ℓ/C_ℓ constant across ℓ. Under some restricted conditions, yes. In general, no.

SLIDE 25

Counterexample: {BSC_q, BSC_{1−q}}

[Plot: rate vs. error exponent, showing the Burnashev exponent, the zero-rate error exponent, a combined upper bound, and the exponent of the proposed scheme]

SLIDE 26

Questions

  • What are the error exponents when the conditions of Tchamkerten-Telatar-06 are not met?
  • Which coding schemes achieve the best exponent?
  • What about rates when R_ℓ/C_ℓ is not a constant?

SLIDE 27

Outline

  • Variable length coding scheme
      - Achievable rate and opportunistic capacity
      - Probability of error and error exponents
  • Literature overview
      - Variable length communication over DMC
      - Variable length communication over compound channel
  • Main result
      - Lower bound on error exponent region
      - Achievable coding scheme
  • Example

SLIDE 28

SLIDE 28-30

Main Result

Opportunistic Capacity

  ℂ = { (R_1, …, R_M) : R_ℓ < C_ℓ, ℓ = 1, …, M }, where C_ℓ is the capacity of DMC ℓ.

Error Exponent Region

Let T_ℓ be the exponent of the channel estimation error when the channel is ℓ. For any channel estimation scheme, (T_1, …, T_M) ∈ 𝕋*.

At rate 𝐒 = (R_1, …, R_M), the achievable error exponent is

  E_ℓ = ( T_ℓ D_ℓ ) / ( T_ℓ + D_ℓ ) ⋅ (1 − R_ℓ/C_ℓ),  where D_ℓ = max_{x, x′ ∈ 𝕏} D( Q_ℓ(⋅|x) ∥ Q_ℓ(⋅|x′) )
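A small numeric sketch of ours illustrates the tradeoff in this bound: the factor T·D/(T + D) behaves like a harmonic mean of the estimation exponent and the Burnashev coefficient, so as the estimation exponent T grows the bound approaches the Burnashev exponent D(1 − R/C). The numbers below (D = 2.536, C = 0.531, R = 0.4) are hypothetical, borrowed from the BSC(0.1) example.

```python
def achievable_exponent(T, D, R, C):
    """Lower bound E = (T*D/(T+D)) * (1 - R/C) from the main result."""
    return (T * D) / (T + D) * (1 - R / C)

# Increasing estimation exponent T pushes the bound toward Burnashev's exponent.
vals = [achievable_exponent(T, 2.536, 0.4, 0.531) for T in (0.5, 5, 50, 5000)]
burnashev = 2.536 * (1 - 0.4 / 0.531)
```

Conversely, a poor channel estimate (small T) dominates the bound no matter how good the control phase is.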

SLIDE 31

The achievable scheme

Communicate in a variable number of epochs. Each epoch is variable length and consists of four phases:

  • Training phase of length n_t. Generate channel estimate θ̂.
  • Message phase of length n_{θ̂,m}. Assume that the channel is θ̂.
  • Re-training phase of length n_t. Generate channel estimate θ̂_d.
  • Control phase of length n_{θ̂_d,c}. Transmit ACCEPT or REJECT assuming that the channel is θ̂_d.

SLIDE 32

Proof Outline: Number of epochs

  ℙ_ℓ(# epochs = l) = p_ℓ (1 − p_ℓ)^{l−1}, where p_ℓ is the probability that an epoch ends in ACCEPT, and lim_{n→∞} p_ℓ = 1

  Number of epochs ≈ 1

SLIDE 33

SLIDE 33-37

Proof Outline: Rate of transmission

  lim_{n→∞} 𝔼_ℓ[# message bits] / ( 𝔼_ℓ[# epochs] ⋅ 𝔼_ℓ[epoch length] ) ≈ R_ℓ

  • 𝔼_ℓ[# message bits] ≈ n_ℓ R_ℓ (the contribution of epochs that end in REJECT vanishes)
  • 𝔼_ℓ[# epochs] ≈ 1
  • 𝔼_ℓ[epoch length] = 𝔼_ℓ[ n_t + n_{ℓ,m} + n_t + n_{ℓ,c} ] ≈ n_ℓ

SLIDE 38

SLIDE 38-43

Proof Outline: Error Exponents

Error exponent

  E_ℓ = lim_{n→∞} ( −log P_ℓ ) / ( 𝔼_ℓ[# epochs] ⋅ 𝔼_ℓ[epoch length] )

  • 𝔼_ℓ[# epochs] ≈ 1 and 𝔼_ℓ[epoch length] ≈ n_ℓ
  • P_ℓ ≤ ( 2^{−n_t T_ℓ} + 2^{−n_{ℓ,c} D_ℓ} )² ⋅ 𝔼_ℓ[# epochs]
  • −log( 2^{−n_t T_ℓ} + 2^{−n_{ℓ,c} D_ℓ} ) ≈ min( n_t T_ℓ, n_{ℓ,c} D_ℓ )
  • Optimizing the split of the epoch between training and control phases gives

  E_ℓ = ( T_ℓ D_ℓ ) / ( T_ℓ + D_ℓ ) ⋅ (1 − R_ℓ/C_ℓ)

SLIDE 44

Outline

  • Variable length coding scheme
      - Achievable rate and opportunistic capacity
      - Probability of error and error exponents
  • Literature overview
      - Variable length communication over DMC
      - Variable length communication over compound channel
  • Main result
      - Lower bound on error exponent region
      - Achievable coding scheme
  • Example

SLIDE 45

SLIDE 45-49

An Example

  • Family {BSC_q, BSC_{1−q}}, known at encoder and decoder
  • Capacity: C_q = C_{1−q} = 1 − h(q)
  • Slope of Burnashev exponent: D_q = D_{1−q} = D(q ∥ 1−q)
  • Channel estimation rule: transmit the all zero sequence as the training sequence. Estimate BSC_q if the frequency of ones is less than x; else estimate BSC_{1−q}.
  • Exponent of training error: T_q = D(x ∥ q), T_{1−q} = D(x ∥ 1−q) = D(1−x ∥ q)
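A quick sanity check of this estimation rule (our code, with hypothetical values q = 0.1, threshold x = 0.3, and training length 200): the training-error exponent is D(x ∥ q), and even a modest training length drives the empirical misclassification rate to essentially zero.

```python
import math
import random

def kl_bernoulli(a, b):
    """D(Bern(a) || Bern(b)) in bits."""
    return a * math.log2(a / b) + (1 - a) * math.log2((1 - a) / (1 - b))

def estimate_channel(n_t, q, x, rng):
    """Send n_t zeros over BSC(q); declare BSC(q) iff the fraction of ones < x."""
    ones = sum(rng.random() < q for _ in range(n_t))
    return "BSC(q)" if ones / n_t < x else "BSC(1-q)"

rng = random.Random(1)
q, x, n_t = 0.1, 0.3, 200
trials = 20_000
err = sum(estimate_channel(n_t, q, x, rng) != "BSC(q)" for _ in range(trials)) / trials
exponent = kl_bernoulli(x, q)   # error probability decays like 2^(-n_t * D(x||q))
```

With these numbers the Chernoff bound gives an error probability around 2^(-200 · 0.22) ≈ 2^(-44), far below what the Monte Carlo run can resolve.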

SLIDE 50

SLIDE 50-51

Performance evaluation

Communication at rate 𝐒 = (β_q C_q, β_{1−q} C_{1−q}), where β = R/C.

Error exponents

  E_q = [ D(x ∥ q) ⋅ D(q ∥ 1−q) ] / [ D(x ∥ q) + D(q ∥ 1−q) ] ⋅ (1 − β_q)

  E_{1−q} = [ D(x ∥ 1−q) ⋅ D(q ∥ 1−q) ] / [ D(x ∥ 1−q) + D(q ∥ 1−q) ] ⋅ (1 − β_{1−q})

Optimal threshold

Choose x such that E_q = E_{1−q}: solve for x in φ(q, x) = (1 − β_q)/(1 − β_{1−q}), where φ(q, x) is appropriately defined.
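Solving E_q(x) = E_{1−q}(x) is a one-dimensional root-finding problem, since E_q is increasing and E_{1−q} is decreasing in x on (q, 1−q). Below is a bisection sketch of ours (not the authors' code); in the symmetric case β_q = β_{1−q} the optimal threshold must be x = 1/2 by symmetry, which the code reproduces. The value q = 0.1 is a hypothetical choice.

```python
import math

def kl(a, b):
    """D(Bern(a) || Bern(b)) in bits."""
    return a * math.log2(a / b) + (1 - a) * math.log2((1 - a) / (1 - b))

def exponent(T, D, beta):
    """E = (T*D/(T+D)) * (1 - beta), as in the main result."""
    return T * D / (T + D) * (1 - beta)

def optimal_threshold(q, beta_q, beta_1q, tol=1e-9):
    """Bisect on x in (q, 1-q) for E_q(x) = E_{1-q}(x)."""
    D_B = kl(q, 1 - q)              # Burnashev slope, same for both channels
    def gap(x):                     # increasing in x: E_q grows, E_{1-q} shrinks
        return (exponent(kl(x, q), D_B, beta_q)
                - exponent(kl(x, 1 - q), D_B, beta_1q))
    lo, hi = q + 1e-6, 1 - q - 1e-6
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if gap(mid) < 0:
            lo = mid                # E_q still too small: raise the threshold
        else:
            hi = mid
    return (lo + hi) / 2

x_star = optimal_threshold(0.1, 0.5, 0.5)   # symmetric case -> 0.5
```

Asymmetric rate pairs push the threshold toward the channel that is asked to carry the larger fraction of its capacity.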

SLIDE 52

Error Exponents

[Plots: error exponent E as a function of the phase-length fractions γ_p and γ_{1−p}]

SLIDE 53

Threshold for channel estimation

[Plot: φ(q, 0.1) versus q, on a logarithmic scale from 10⁻³ to 10³]

SLIDE 54

Threshold for channel estimation

  β_q    β_{1−q}   x        E_q = E_{1−q}
  0.5    0.1       0.5861   0.3666
  0.5    0.2       0.5695   0.3511
  0.5    0.3       0.5501   0.3329
  0.5    0.4       0.5273   0.3114
  0.5    0.5       0.5000   0.2855
  0.5    0.6       0.4666   0.2537
  0.5    0.7       0.4247   0.2139
  0.5    0.8       0.3698   0.1628
  0.5    0.9       0.2918   0.0952

SLIDE 55

SLIDE 55-56

Conclusion

Contributions

  • Defined opportunistic capacity and the corresponding error exponent regions for compound channels with feedback.
  • A simple, easy to implement coding scheme whose error exponents are within a multiplicative factor of the best possible error exponents.
  • In the presence of feedback, training based schemes can lead to reasonable performance.

Future directions

  • Channels defined over continuous families and continuous alphabets
  • Upper bound on error exponents

SLIDE 57

Thank You