

  1. Information Transmission – Chapter 5, Channel coding
OVE EDFORS, ELECTRICAL AND INFORMATION TECHNOLOGY

  2. Learning outcomes
● After this lecture the student should
– understand the principles of channel coding,
– understand how typical sequences can be used to find out how ”fast” we can send information over a channel,
– have a basic knowledge of how channel capacity is related to mutual information and its maximization over the channel input distribution,
– know how to calculate the channel capacity for the binary symmetric channel and the additive white Gaussian noise (AWGN) channel.

  3. Where are we in the BIG PICTURE?
Channel coding and channel capacity. This lecture relates to pages 166–179 in the textbook.

  4. What did Shannon promise?
• As long as the SNR per information bit (Eb/N0) is above -1.6 dB in an AWGN channel, we can provide reliable communication.
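The -1.6 dB figure is not derived on the slide; it follows from the AWGN capacity formula in the wideband limit. A short sketch of the standard argument:

```latex
% Wideband limit of the AWGN channel capacity:
C = W\log_2\!\left(1+\frac{P}{N_0 W}\right)
  \;\xrightarrow{\;W\to\infty\;}\; \frac{P}{N_0\ln 2}
% Reliable communication needs R \le C. Substituting P = E_b R:
R \le \frac{E_b R}{N_0 \ln 2}
  \;\Longrightarrow\;
  \frac{E_b}{N_0} \ge \ln 2 \approx 0.693 \;\approx\; -1.59\ \mathrm{dB}
```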

  5. A schematic communication system
[Figure: block diagram of a communication system, with the channel coding blocks marked OUR FOCUS.]

  6. REP. Typical sequences
All typical long sequences have approximately the same probability, and from the law of large numbers it follows that the set of these typical sequences is overwhelmingly probable. The probability that a long source output sequence is typical is close to one, and there are approximately 2^{n H(X)} typical long sequences.

  7. REP. Example from textbook (draw from urn)
● – probability 1/3; ○ – probability 2/3. For length-five sequences, the number of typical sequences should be about 2^{5 h(1/3)} ≈ 2^{4.6} ≈ 24.
Sequences with ”observed uncertainty” within 15% of h(1/3) (probability between 0.027 and 0.068): there are 15 of them (the ones marked with stars in the textbook's table).
Why the large discrepancy? The estimate is only valid for ”long” sequences. … but the 15 sequences are less than 1/2 of all 32 sequences and contain about 2/3 of all probability.
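The slide's numbers can be checked by brute force. A minimal sketch (not from the lecture; the probabilities 1/3 and 2/3 and the 15% tolerance are taken from the slide):

```python
from itertools import product
from math import log2

p = 1/3                                   # probability of drawing ●
h = -(p*log2(p) + (1-p)*log2(1-p))        # binary entropy h(1/3) ~ 0.918 bits
n, tol = 5, 0.15

typical = []
for seq in product([0, 1], repeat=n):     # 1 = ● (prob 1/3), 0 = ○ (prob 2/3)
    k = sum(seq)
    prob = p**k * (1-p)**(n-k)
    observed = -log2(prob) / n            # "observed uncertainty" per symbol
    if abs(observed - h) <= tol * h:
        typical.append(prob)

print(len(typical))                       # 15 typical sequences (of 2^5 = 32)
print(sum(typical))                       # ~0.66, about 2/3 of all probability
print(2 ** (n * h))                       # ~24, the 2^{n h(1/3)} estimate
```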

  8. REP. Properties of typical sequences
A typical sequence x of length n has probability P(x) ≈ 2^{-n H(X)}; more precisely, 2^{-n(H(X)+ε)} ≤ P(x) ≤ 2^{-n(H(X)-ε)} for the chosen tolerance ε. There are at most 2^{n(H(X)+ε)} typical sequences, and the probability that a source output sequence of length n is typical tends to one as n grows.

  9. REP. Longer typical sequences
Let us now choose a smaller tolerance, namely 5% of h(1/3), and increase the length of the sequences. Then we obtain the following table: [table from the textbook omitted]
Note: In the first example with length-five sequences we had a wider tolerance of 15% of h(1/3) and captured 2/3 of the probability in our typical sequences. With this tighter tolerance we need sequences of length 100 to capture 2/3 of the total probability in the typical sequences.
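A sketch that recreates the omitted table, under the assumption that the source is the same urn (● with probability 1/3) and the tolerance is 5% of h(1/3):

```python
from math import comb, log2

p = 1/3
h = -(p*log2(p) + (1-p)*log2(1-p))
tol = 0.05                                # 5% of h(1/3)

def typical_mass(n):
    """Probability mass and count of typical length-n sequences."""
    mass, count = 0.0, 0
    for k in range(n + 1):                # k = number of ● draws
        prob = p**k * (1-p)**(n-k)
        if abs(-log2(prob)/n - h) <= tol * h:
            mass += comb(n, k) * prob
            count += comb(n, k)
    return mass, count

for n in (5, 25, 50, 100, 200):
    mass, count = typical_mass(n)
    print(n, count, round(mass, 3))
```

The n = 100 row captures about 2/3 of the probability, matching the slide's note.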

  10. REP. Typical sequences in text
If we have L letters in our alphabet, then we can compose L^n different sequences that are n letters long. Only approximately 2^{n H(X)} of these, where H(X) is the uncertainty of the language, are ”meaningful”. What is meant by ”meaningful” is determined by the structure of the language; that is, by its grammar, spelling rules, etc.

  11. REP. Typical sequences in text
Only a fraction 2^{n H(X)} / L^n = 2^{-n(log2 L - H(X))}, which vanishes when n grows provided that H(X) < log2 L, is ”meaningful” text of length n letters. For the English language, H(X) is typically 1.5 bits/letter while log2 L = log2 27 ≈ 4.75 bits/letter; already for n = 10 the meaningful fraction is about 2^{-32.5} ≈ 1.6·10^{-10}.

  12. REP. Structure in text
Shannon illustrated how increasing structure between letters will give better approximations of the English language. Assuming an alphabet with 27 symbols – 26 letters and one space – he started with an approximation of the first order. The symbols are chosen independently of each other but with the actual probability distribution (12% E, 2% W, etc.):
OCRO HLI RGWR NMIELWIS EU LL NBNESEBYA TH EEI ALHENHTTPA OOBTTVA NAH BRL

  13. REP. Structure in text
Then Shannon continued with the approximation of the second order. The symbols are chosen with the actual bigram statistics – when a symbol has been chosen, the next symbol is chosen according to the actual conditional probability distribution:
ON IE ANTSOUTINYS ARE T INCTORE ST BE S DEAMY ACHIN D ILONASIVE TUCOOWE AT TEASONARE FUSO TIZIN ANDY TOBE SEACE CTISBE

  14. REP. Structure in text
The approximation of the third order is based on the trigram statistics – when two successive symbols have been chosen, the next symbol is chosen according to the actual conditional probability distribution:
IN NO IST LAT WHEY CRATICT FROURE BIRS GROCID PONDENOME OF DEMONSTRURES OF THE REPTAGIN IS REGOACTIONA OF CRE
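A minimal sketch (not from the slides) of how such approximations can be generated: estimate conditional letter statistics of the desired order from a corpus and sample from them; `corpus.txt` is a placeholder for whatever text you train on.

```python
import random
from collections import Counter, defaultdict

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "   # 27 symbols, as on the slide

def sample_approximation(corpus: str, order: int, length: int) -> str:
    """Generate text from (order-1)-symbol conditional statistics,
    i.e. order=2 uses bigram statistics, order=3 trigram statistics."""
    corpus = "".join(c for c in corpus.upper() if c in ALPHABET)
    stats = defaultdict(Counter)
    for i in range(len(corpus) - order + 1):
        context, nxt = corpus[i:i+order-1], corpus[i+order-1]
        stats[context][nxt] += 1

    out = corpus[:order-1]                  # seed with the first context
    while len(out) < length:
        context = out[-(order-1):] if order > 1 else ""
        counts = stats.get(context)
        if not counts:                      # unseen context: restart the chain
            out += random.choice(corpus)
            continue
        symbols, weights = zip(*counts.items())
        out += random.choices(symbols, weights=weights)[0]
    return out

# print(sample_approximation(open("corpus.txt").read(), order=3, length=80))
```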

  15. REP. The principle of source coding
Consider the set of typical long output sequences of n symbols from a source with uncertainty H(X) bits per source symbol. Since there are fewer than 2^{n(H(X)+ε)} typical long sequences in this set, they can be represented by n(H(X)+ε) binary digits; that is, by H(X)+ε binary digits per source symbol.
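A toy illustration of this principle, assuming the urn source from slide 7, a 15% tolerance, and n = 20 (my choices, not the slide's): enumerate the typical set, give each member a fixed-length binary index, and compare the resulting rate with H(X).

```python
from itertools import product
from math import ceil, log2

p, n, tol = 1/3, 20, 0.15
h = -(p*log2(p) + (1-p)*log2(1-p))

# Enumerate the typical set and assign each sequence a fixed-length index.
typical = [s for s in product([0, 1], repeat=n)
           if abs(-log2(p**sum(s) * (1-p)**(n-sum(s)))/n - h) <= tol*h]
codebook = {seq: i for i, seq in enumerate(typical)}

bits = ceil(log2(len(typical)))           # index length in binary digits
print(f"{len(typical)} typical sequences -> {bits} bits, "
      f"{bits/n:.3f} bits/symbol vs H(X) = {h:.3f}")
```

With these choices the sketch reports roughly 0.95 binary digits per symbol, already close to H(X) ≈ 0.918 bits.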

  16. Channel coding

  17. Block coding basics
Divide the information sequence to be transmitted into blocks u = [u1 u2 … uK] of K bits:
… 1001|1110|1010|0011|1010|1111|1110 … (divided into blocks of K = 4 bits here)
There are 2^K different blocks u of K information bits (here 16). For each unique block of information bits, assign a unique code word x = [x1 x2 … xN] of length N > K bits; let's use N = 7. Note that the set of code words is a subset of all possible sequences of length N. This is called an (N, K) block code, with code rate R = K/N; in this case it is a (7,4) code with rate 4/7.
Encode your information sequence by replacing each information block u with the corresponding code word x:
… 0011001|0010110|0100101|1000011|1011010|1111111|0010110 … (7-bit code words here)
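A sketch of one concrete (7,4) encoder, assuming the standard systematic Hamming(7,4) generator matrix; the slide does not say which (7,4) code produced its code words, so this matrix is an illustrative choice:

```python
import numpy as np

# Generator matrix of a systematic Hamming(7,4) code (an illustrative
# choice; the slide's code words may come from a different (7,4) code).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def encode(bits: str) -> str:
    """Split an info sequence into K=4 blocks and emit N=7 code words."""
    words = []
    for i in range(0, len(bits), 4):
        u = np.array([int(b) for b in bits[i:i+4]])
        x = u @ G % 2                     # code word: x = uG over GF(2)
        words.append("".join(map(str, x)))
    return "|".join(words)

print(encode("1001111010100011"))         # rate R = K/N = 4/7
```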

  18. Digital channel – symbols in and out
Our code words enter the channel as the input symbols X; the output symbols Y are our code words, corrupted by the noisy channel.

  19. Fans (of a typical input sequence and its typical output sequences)
Consider a channel with input X and output Y. Then we have approximately 2^{N H(X)} and 2^{N H(Y)} typical input and output sequences of length N, respectively. Furthermore, for each typical long input sequence we have approximately 2^{N H(Y|X)} typical long output sequences that are jointly typical with the given input sequence. We call such an input sequence together with its jointly typical output sequences a fan.

  20. Input sequences of length N map to output sequences of length N. Since there are about 2^{N H(Y)} typical output sequences and each fan contains about 2^{N H(Y|X)} of them, we can have at most 2^{N H(Y)} / 2^{N H(Y|X)} = 2^{N(H(Y) - H(Y|X))} = 2^{N I(X;Y)} non-overlapping fans.

  21. Maximum rate
Each fan can represent a message. Hence, the number of distinguishable messages M can be at most 2^{N I(X;Y)}, that is, M ≤ 2^{N I(X;Y)}. Equivalently, the largest value of the rate R = (log2 M)/N for non-overlapping fans is R = I(X;Y). Maximizing I(X;Y) over the channel input distribution gives the channel capacity, C = max over P(X) of I(X;Y).
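Connecting back to the learning outcomes on slide 2, a small sketch of the two capacity formulas the course asks for: the binary symmetric channel with crossover probability p has C = 1 - h(p), and the real-valued AWGN channel has C = ½ log2(1 + SNR) bits per channel use:

```python
from math import log2

def h(p: float) -> float:
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -(p*log2(p) + (1-p)*log2(1-p))

def bsc_capacity(p: float) -> float:
    """C = max I(X;Y) = 1 - h(p); the maximum is at uniform input."""
    return 1 - h(p)

def awgn_capacity(snr_db: float) -> float:
    """C = 1/2 log2(1 + SNR) bits per channel use (real-valued AWGN)."""
    return 0.5 * log2(1 + 10**(snr_db/10))

print(bsc_capacity(0.1))    # ~0.531 bits per channel use
print(awgn_capacity(10.0))  # ~1.73 bits per channel use
```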
