Digital Communication Systems

Digital Communication Systems - PowerPoint PPT Presentation



  1. Digital Communication Systems ECS 452. Asst. Prof. Dr. Prapun Suksompong (prapun@siit.tu.ac.th). Channel Capacity. Office Hours: Rangsit Library: Tuesday 16:20-17:20; BKD3601-7: Thursday 16:00-17:00

  2. Operational Meaning of Capacity

  3. Reliable Communication
 [Diagram: input X enters the channel Q, which outputs Y.]
 Reliable communication means an arbitrarily small error probability can be achieved.
 This seems to be an impossible goal: if the channel introduces errors, how can one correct them all? Any correction process is also subject to error, ad infinitum.
 Operational channel capacity C = the maximum rate at which reliable communication over a channel is possible.

  4. Coding (or Encoding, or Channel Encoding)
 Introduce redundancy so that even if some of the information is lost or corrupted, it will still be possible to recover the message at the receiver.

  5. Repetition Code (k = 1)
 The most obvious coding scheme is to repeat information. For example, to send a 1, we send 11111, and to send a 0, we send 00000.
 This scheme uses five symbols to send 1 bit, and therefore has a rate of 1/5 bit per symbol.
 If this code is used on a binary symmetric channel, the ML decoding rule (which is optimal when the 0s and 1s are equiprobable) is equivalent to taking the majority vote of each block of five received bits: if three or more bits are 1, we decode the block as a 1; otherwise, we decode it as 0.
 By using longer repetition codes, we can achieve an arbitrarily low probability of error. But the rate of the code also goes to zero with (larger) block length, so even though the code is "simple," it is really not a very useful code.
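The repetition scheme on this slide can be sketched in a few lines: repeat each bit five times, then decode by majority vote (the ML rule on a BSC with equiprobable 0s and 1s). The function names are illustrative, not from the slides.

```python
# Sketch of the 5-repetition code: repeat each information bit n times,
# decode each received block of n symbols by majority vote.

def rep_encode(bits, n=5):
    """Repeat each information bit n times."""
    return [b for bit in bits for b in [bit] * n]

def rep_decode(symbols, n=5):
    """Majority vote over each block of n received symbols (n odd)."""
    blocks = [symbols[i:i + n] for i in range(0, len(symbols), n)]
    return [1 if sum(block) > n // 2 else 0 for block in blocks]

codeword = rep_encode([1, 0])   # [1,1,1,1,1, 0,0,0,0,0]
received = codeword[:]
received[0] ^= 1                # flip two symbols in the first block
received[3] ^= 1
print(rep_decode(received))     # still decodes to [1, 0]
```

With n = 5, up to two flips per block are corrected; three or more flips in one block cause a decoding error.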

  6. Repetition Code over BSC
 [Plot: block error probability P versus crossover probability p (0 to 0.5) for repetition codes of length n = 1, 5, 15, 25.]
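The curves summarized above can be reproduced numerically: for an n-repetition code on a BSC with crossover probability p (n odd), majority decoding fails exactly when more than n/2 of the n symbols are flipped, which is a binomial tail probability.

```python
# Block error probability of majority decoding for an odd-length
# repetition code over a BSC(p): P(error) = sum over k > n/2 of
# C(n, k) p^k (1-p)^(n-k).

from math import comb

def rep_block_error(n, p):
    """P(majority vote is wrong) for odd n on a BSC with crossover p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

for n in (1, 5, 15, 25):
    print(n, rep_block_error(n, 0.1))
```

For p < 0.5 the error probability decreases as n grows, matching the plot; at p = 0.5 every curve passes through 0.5.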

  7. Parity Bit or Check Bit
 In mathematics, parity refers to the evenness or oddness of an integer.
 Here, parity refers to the evenness or oddness of the number of 1s within a given set of bits. It can be calculated via an XOR sum of the bits, yielding 0 for even parity and 1 for odd parity.
 Ex. Even parity: 0110, 011011. Odd parity: 0111, 011010.
 A parity bit, or check bit, is a bit added to the end of the k information bits, so that n = k + 1.
 There are two variants of parity bits: the even parity bit and the odd parity bit.
 Even parity bit: choose the n-th bit so that the number of 1s in the block is even.
 X_i = B_i for i = 1, 2, ..., k; X_n = B_1 ⊕ B_2 ⊕ ... ⊕ B_k.
 Ex. k = 5: 10000 → 1; 10100 → 0; 11111 → 1; 01011 → 1.

  8. Parity Bit or Check Bit
 Used as the simplest form of error-detecting code.
 Does not detect an even number of errors.
 Does not give any information about how to correct the errors that occur.
 Generalization: Parity Check Codes. We can extend the idea of the parity check bit to allow for multiple parity check bits and to allow the parity checks to depend on various subsets of the information bits.
 The Hamming code is an example of a parity check code.
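The two limitations noted above are easy to demonstrate: a single parity bit catches any odd number of bit flips but misses an even number. A small self-contained check (function names are illustrative):

```python
# A received block passes the parity check iff its XOR sum is 0.
# One flip changes the parity (detected); two flips restore it (missed).

def parity_check_ok(codeword):
    """True if the received block has even parity (no error detected)."""
    p = 0
    for b in codeword:
        p ^= b
    return p == 0

word = [1, 0, 1, 1, 0, 1]        # info 10110 plus its even parity bit 1
one_error = word[:]
one_error[0] ^= 1
two_errors = word[:]
two_errors[0] ^= 1
two_errors[1] ^= 1

print(parity_check_ok(word))        # True  (no error)
print(parity_check_ok(one_error))   # False (single error detected)
print(parity_check_ok(two_errors))  # True  (double error undetected)
```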

  9. Noisy Channel Coding Theorem [Shannon, 1948]
 1. Reliable communication over a (discrete memoryless) channel is possible if the communication rate R satisfies R < C, where C is the channel capacity.
 In particular, for any R < C, there exist codes (encoders and decoders) with sufficiently large n such that
 P(Ŵ ≠ W) ≤ 2^(−n E(R)),
 where E(R) is a positive function of R for R < C and is completely determined by the channel characteristics.
 2. At rates higher than capacity, reliable communication is impossible.
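As a concrete illustration of the rate condition R < C, a short sketch using the standard BSC capacity formula C = 1 − H₂(p) (a well-known result, not derived on this slide):

```python
# Capacity of a binary symmetric channel: C = 1 - H2(p), where H2 is
# the binary entropy function. A code rate R is achievable iff R < C.

from math import log2

def H2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    return 1.0 - H2(p)

C = bsc_capacity(0.1)
print(C)                 # about 0.531 bits per channel use
print(1 / 5 < C)         # the rate-1/5 repetition code is below capacity
```

At p = 0.1 the theorem promises codes of rate up to about 0.531, far above the rate 1/5 of the repetition code, which shows how much the repetition code gives away.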

  10. Noisy Channel Coding Theorem
 Expresses the limit to reliable communication.
 Provides a yardstick to measure the performance of communication systems: a system performing near capacity is a near-optimal system and does not have much room for improvement. On the other hand, a system operating far from this fundamental bound can be improved (mainly through coding techniques).

  11. Shannon's nonconstructive proof
 Shannon introduced a method of proof called random coding.
 Instead of looking for the best possible coding scheme and analyzing its performance, which is a difficult task, all possible coding schemes are considered by generating the code randomly with an appropriate distribution, and the performance of the system is averaged over them.
 It is then proved that if R < C, the average error probability tends to zero.
 This proves that, as long as R < C, for any arbitrarily small (but still positive) probability of error, there exists at least one code (with sufficiently long block length n) that performs better than the specified probability of error.

  12. Shannon's nonconstructive proof
 If we use the scheme suggested and generate a code at random, the code constructed is likely to be good for long block lengths. But it has no structure and is very difficult to decode.
 In addition to achieving low probabilities of error, useful codes should be "simple," so that they can be encoded and decoded efficiently. Hence the theorem does not provide a practical coding scheme.
 Since Shannon's paper, a variety of techniques have been used to construct good error-correcting codes; the entire field of coding theory has been developed during this search.
 Turbo codes have come close to achieving capacity for Gaussian channels.

  13. Deriving the Q Matrix

  14. Probability Calculation: 1-D Noise
 For Gaussian noise N ~ N(0, σ_N²): P(a ≤ N ≤ b) = Q(a/σ_N) − Q(b/σ_N).
 Let [a_i, b_i] denote the decision region for s_i and [a_j, b_j] the decision region for s_j.
 P(correctly decide s_i | S = s_i) = P(a_i ≤ R ≤ b_i | S = s_i) = P(a_i − s_i ≤ N ≤ b_i − s_i) = Q((a_i − s_i)/σ_N) − Q((b_i − s_i)/σ_N).
 P(Ŵ = j | W = i) = P(a_j ≤ R ≤ b_j | S = s_i) = P(a_j − s_i ≤ N ≤ b_j − s_i) = Q((a_j − s_i)/σ_N) − Q((b_j − s_i)/σ_N).
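The interval identity above can be checked numerically using the standard relation Q(x) = ½ erfc(x/√2):

```python
# P(a <= N <= b) for N ~ N(0, sigma^2), via the Q-function.
# Q(x) = 0.5 * erfc(x / sqrt(2)) is the Gaussian tail probability.

from math import erfc, sqrt

def Q(x):
    """Tail probability P(Z > x) for a standard Gaussian Z."""
    return 0.5 * erfc(x / sqrt(2))

sigma = 2.0
a, b = -1.0, 3.0
print(Q(a / sigma) - Q(b / sigma))   # P(-1 <= N <= 3) when sigma = 2
```

The value agrees with Φ(b/σ) − Φ(a/σ) computed from the Gaussian CDF, since Q(x) = 1 − Φ(x).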

  15. P(Ŵ = j | W = i) = Q((a_j − s_i)/σ) − Q((b_j − s_i)/σ). Ex. Standard 3-PAM.
 Constellation: s_1 = −d, s_2 = 0, s_3 = d; decision boundaries at ±d/2.
 i = 1 (s_1 = −d): P(Ŵ = 1|W = 1) = 1 − Q(d/(2σ)); P(Ŵ = 2|W = 1) = Q(d/(2σ)) − Q(3d/(2σ)); P(Ŵ = 3|W = 1) = Q(3d/(2σ)).
 i = 2 (s_2 = 0): P(Ŵ = 1|W = 2) = Q(d/(2σ)); P(Ŵ = 2|W = 2) = 1 − 2Q(d/(2σ)); P(Ŵ = 3|W = 2) = Q(d/(2σ)).
 i = 3 (s_3 = d): P(Ŵ = 1|W = 3) = Q(3d/(2σ)); P(Ŵ = 2|W = 3) = Q(d/(2σ)) − Q(3d/(2σ)); P(Ŵ = 3|W = 3) = 1 − Q(d/(2σ)).
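A numerical sketch of the 3-PAM example, filling in the Q matrix entry by entry from P(Ŵ = j | W = i) = Q((a_j − s_i)/σ) − Q((b_j − s_i)/σ). The function name and the specific d, σ values are illustrative:

```python
# Transition (Q) matrix for standard 3-PAM: constellation (-d, 0, d),
# ML decision boundaries at -d/2 and d/2, Gaussian noise N(0, sigma^2).
# Entry [i][j] = P(decide s_j | s_i sent). math.erfc handles +/-inf.

from math import erfc, sqrt, inf

def Q(x):
    """Tail probability P(Z > x) for a standard Gaussian Z."""
    return 0.5 * erfc(x / sqrt(2))

def pam3_matrix(d, sigma):
    s = (-d, 0.0, d)
    bounds = [(-inf, -d / 2), (-d / 2, d / 2), (d / 2, inf)]
    return [[Q((a - si) / sigma) - Q((b - si) / sigma)
             for (a, b) in bounds] for si in s]

for row in pam3_matrix(d=2.0, sigma=1.0):
    print([round(p, 4) for p in row])  # each row sums to 1
```

Each row sums to 1, and by symmetry the matrix matches the closed-form entries above, e.g. entry [0][2] equals Q(3d/(2σ)).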

