  1. Chapter 1 Overview Peng-Hua Wang Graduate Inst. of Comm. Engineering National Taipei University

  2. What is information theory?
  ■ Fundamental questions in communication theory:
    ◆ How much can we compress data? Entropy H.
    ◆ How fast can we transmit data? Channel capacity C.
  ■ Information theory has made fundamental contributions to
    ◆ electrical engineering
    ◆ statistical physics
    ◆ computer science
    ◆ statistical inference
    ◆ probability and statistics.
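A minimal numeric sketch of the two quantities named on this slide: the entropy H of a discrete distribution and the capacity C of a binary symmetric channel. This is my own illustration, not part of the slides; the function names are mine.

# Sketch (assumed example, not from the course): entropy of a discrete
# distribution and capacity of a binary symmetric channel (BSC).
import math

def entropy(probs):
    """H(X) = -sum p log2 p, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """C = 1 - H(p) bits per channel use for a BSC with crossover probability p."""
    return 1 - entropy([p, 1 - p])

print(entropy([0.5, 0.5]))     # 1.0 bit: a fair coin
print(entropy([0.9, 0.1]))     # about 0.47 bits: a biased coin
print(bsc_capacity(0.11))      # about 0.5 bits per channel use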

  3. EE: Communication theory
  ■ Is it possible to send information without error?
    ◆ Shannon proved that the probability of error can be made nearly zero for all communication rates below the channel capacity (and created a new field of applied mathematics: information theory).
    ◆ Compression of a random process has a limit: the entropy.
    ◆ If the entropy of the source is less than the capacity of the channel, asymptotically error-free communication can be achieved.
  ■ Recent work on the communication aspects of information theory focuses on network information theory.
    ◆ The theory of simultaneous communication from many senders to many receivers.
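The two limits this slide refers to can be restated in standard notation (the symbols below are the usual textbook ones, not necessarily the course's): the source coding theorem bounds compression rates by the entropy H, the channel coding theorem bounds reliable transmission rates by the capacity C, and combining them gives the condition quoted on the slide.

% Hedged restatement in standard notation (assumed, not copied from the slides).
% Source coding: a source X can be described using about H(X) bits per symbol,
% so any rate R > H(X) is achievable for compression.
% Channel coding: any rate R < C is achievable with error probability tending to 0.
\[
  R > H(X) \ \text{(achievable compression rates)}, \qquad
  R < C \ \text{(achievable transmission rates)},
\]
% hence reliable communication of the source over the channel is possible whenever
\[
  H(X) < C .
\]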

  4. CS: Kolmogorov complexity
  ■ The complexity of a string of data is the length of the shortest binary computer program that computes the string.
    ◆ The Kolmogorov complexity K ≈ the Shannon entropy H if the sequence is drawn at random from a distribution with entropy H.
  ■ Computational complexity (time complexity) ⇒ program running time; Kolmogorov complexity ⇒ program length.
    ◆ Can we simultaneously minimize these two?
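Kolmogorov complexity itself is uncomputable, so no program can evaluate K exactly. As a loose illustration of the K ≈ H statement above (my choice of tool, not the course's), a general-purpose compressor such as Python's zlib gives a crude upper bound on description length that separates highly structured strings from random ones.

# Rough illustration (assumed example): compressed length as a crude upper
# bound on description length. A structured string has a short description;
# uniformly random bytes (entropy ~ 8 bits/byte) are essentially incompressible.
import os
import zlib

structured = b"01" * 5000          # 10,000 bytes generated by an obvious short program
random_bytes = os.urandom(10000)   # 10,000 bytes of (pseudo)random data

print(len(zlib.compress(structured, 9)))    # a few dozen bytes
print(len(zlib.compress(random_bytes, 9)))  # close to 10,000 bytes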

  5. What we will learn in this course
  ■ Basic definitions: entropy, mutual information, channel capacity, ...
  ■ Data compression: What is the shortest description of a random variable?
  ■ Rate distortion theory: If distortion D is allowable, what channel capacities are sufficient for transmission and reconstruction with distortion no greater than D?
  ■ Data transmission: How do we transmit data so that the receiver can decode the message with a small probability of error?
  ■ Network information theory: How do we compress many sources and then jointly reconstruct the compressed messages?
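A concrete, textbook-standard instance of the rate distortion question above (given here only as orientation, not necessarily the example used in the course): a Gaussian source with variance σ² under squared-error distortion.

% Standard example (assumed as illustration, not taken from the slides):
% Gaussian source N(0, sigma^2), squared-error distortion measure.
\[
  R(D) =
  \begin{cases}
    \dfrac{1}{2}\log_2 \dfrac{\sigma^2}{D}, & 0 < D \le \sigma^2,\\[4pt]
    0, & D > \sigma^2,
  \end{cases}
\]
% so any channel whose capacity exceeds R(D) bits per source symbol suffices for
% reconstruction with expected squared error at most D.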
