Lecture 1 Introduction




  1. Lecture 1: Introduction. I-Hsiang Wang, Department of Electrical Engineering, National Taiwan University, ihwang@ntu.edu.tw. September 22, 2015.

  2. Information Theory. Information theory is a mathematical theory of information. Information is usually obtained by receiving “messages” (speech, text, images, etc.) from others. When obtaining information from a message, you may care about: What is the meaning of the message? How important is the message? How much information can I get from the message?

  3. Information Theory. Information theory is a mathematical theory of information. Information is usually obtained by receiving “messages” (speech, text, images, etc.) from others. When obtaining information from a message, you may care about: What is the meaning of the message? How important is the message? How much information can I get from the message? Information theory is about the quantification of information.

  4. Information Theory. Information theory is a mathematical theory of information (primarily for communication systems) that establishes the fundamental limits of communication systems, i.e., quantifies the amount of information that can be delivered from one party to another, and is built upon probability theory and statistics. Main concern: the ultimate performance limit (usually the rate of information processing) as certain resources (usually the total amount of time) scale to the asymptotic regime, given that the desired information is delivered “reliably”.

  5. In this course, we will (1) establish solid foundations and intuitions of information theory, (2) introduce explicit methods to achieve information-theoretic limits, and (3) demonstrate further applications of information theory beyond communications. We begin with a brief overview of information theory and the materials to be covered in this course.

  6. Outline: 1. Course Information; 2. Overview.

  7. Logistics.
  1. Instructor: I-Hsiang Wang 王奕翔. Office: MD-524 明達館 524 室. Email: ihwang@ntu.edu.tw. Office Hours: 17:00 – 18:00, Monday and Tuesday.
  2. Lecture Time: 13:20 – 14:10 (period 6) Tuesday, and 10:20 – 12:10 (periods 3–4) Wednesday.
  3. Lecture Location: EE2-225 電機二館 225 室.
  4. Course Website: http://homepage.ntu.edu.tw/~ihwang/Teaching/Fa15/IT.html
  5. Prerequisites: Probability, Linear Algebra.

  8. Logistics (cont.).
  6. Grading: Homework (35%), Midterm (30%), Final (35%).
  7. References:
  - T. Cover and J. Thomas, Elements of Information Theory, 2nd Edition, Wiley-Interscience, 2006.
  - R. Gallager, Information Theory and Reliable Communication, Wiley, 1968.
  - I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, 2nd Edition, Cambridge University Press, 2011.
  - S. M. Moser, Information Theory (Lecture Notes), 4th edition, ISI Lab, ETH Zürich, Switzerland, 2014.
  - R. Yeung, Information Theory and Network Coding, Springer, 2008.
  - A. El Gamal and Y.-H. Kim, Network Information Theory, Cambridge University Press, 2011.

  9. Homework.
  1. Roughly 5–6 problems every two weeks, 7 homework sets in total.
  2. Homework (HW) is usually released on Monday. The deadline of submission is usually on the next Wednesday, in class.
  3. Late homework = 0 points. (Let me know in advance if you have difficulties.)
  4. Everyone has to develop a detailed solution for one HW problem, documented in LaTeX and submitted 1 week after the HW due date. We will provide LaTeX templates, and you should discuss the homework problem that you are in charge of with the TA or instructor, making sure the solution is correct.
  5. This additional effort accounts for part of your homework grades.

  10. Reading and Lecture Notes.
  1. Slides: Slides are usually released/updated every Sunday evening.
  2. Readings: Each lecture has assigned readings. Reading is required: it is not enough to learn from the slides!
  3. Go through the slides and the assigned readings before our lectures. It helps you learn better.
  4. I recommend you get a copy of the textbook by Cover and Thomas. It is a good reference, and we will often assign readings in the book.
  5. Other assigned readings could be Moser’s lecture notes (can be obtained online) and relevant papers.

  11. Interaction.
  1. In-class:
  Language: This class is taught in English. However, to encourage interaction, feel free to ask questions in Mandarin. I will repeat your question in English (if necessary) and answer it in English.
  Exercises: We put some exercises on the slides to help you learn and understand. Occasionally, I will call for volunteers to solve the exercises in class. Volunteers get bonus points.
  2. Out-of-class:
  Office Hours: Both the TA and I have 2-hour office hours per week. You are more than welcome to come visit us and ask questions, discuss research, chat, complain, etc. If you cannot make it to the regular office hours, send us emails to schedule a time slot. My schedule can be found on my website. Send us emails with a subject starting with “[NTU Fall15 IT]”.
  Feedback: There will be online polls during the semester to collect your feedback anonymously.

  12. Course Outline.
  - Measures of Information: entropy, conditional entropy, relative entropy (KL divergence), mutual information (standard definitions recalled below).
  - Lossless Source Coding: lossless source coding theorem, discrete memoryless sources, asymptotic equipartition property, typical sequences, Fano’s inequality, converse proof, ergodic sources, entropy rate.
  - Noisy Channel Coding: noisy channel coding theorem, discrete memoryless channels, random coding, typicality decoder, threshold decoder, error probability analysis, converse proof, channel with feedback.
  - Channel Coding over Continuous-Valued Channels: channel coding with cost constraints, discretization technique, differential entropy, Gaussian channel capacity.
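  For reference, the first outline item relies on the following standard definitions (not shown on the slide; these are the usual textbook definitions, e.g., in Cover and Thomas) for discrete random variables X, Y with joint pmf p(x, y) and marginals p(x), p(y):

  H(X) = -\sum_{x} p(x) \log p(x), \qquad H(X \mid Y) = -\sum_{x,y} p(x,y) \log p(x \mid y)
  D(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}
  I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X \mid Y)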

  13. Course Outline (cont.).
  - Lossy Source Coding (Rate-Distortion Theory): distortion, rate-distortion tradeoff, typicality encoder, converse proof.
  - Source-Channel Separation and Joint Source-Channel Coding.
  - Information Theory and Statistics: method of types, Sanov’s theorem, large deviation, hypothesis testing, estimation, Cramér-Rao lower bound, non-parametric estimation.
  - Data Compression: prefix-free code, Kraft’s inequality (recalled below), Huffman code, Lempel-Ziv compression.
  - Capacity-Achieving Channel Codes: polar codes, LDPC codes.
  - Selected Advanced Topics: network coding, compressed sensing, community detection, non-asymptotic information theory, etc.
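  As a brief preview of the Data Compression item (again not on the slide, but a standard result covered later in the course), Kraft’s inequality states that the codeword lengths \ell_1, \dots, \ell_m of any prefix-free code over a D-ary code alphabet must satisfy

  \sum_{i=1}^{m} D^{-\ell_i} \le 1,

  and conversely, any set of lengths satisfying this inequality is achievable by some prefix-free code.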

  14. Tentative Schedule (Week / Date / Content / Remark).
  Week 1 (09/15, 16): Introduction; Measures of Information
  Week 2 (09/22, 23): Measures of Information
  Week 3 (09/29, 30): Lossless Source Coding (HW1 out)
  Week 4 (10/06, 07): Lossless Source Coding (HW1 due)
  Week 5 (10/13, 14): Noisy Channel Coding (HW2 out)
  Week 6 (10/20, 21): Noisy Channel Coding (HW2 due)
  Week 7 (10/27, 28): Continuous-Valued Channel Coding (HW3 out)
  Week 8 (11/03, 04): Continuous-Valued Channel Coding (HW3 due)
  Week 9 (11/10, 11): Midterm Exam

  15. Tentative Schedule (cont.).
  Week 10 (11/17, 18): Lossy Source Coding (HW4 out)
  Week 11 (11/24, 25): Joint Source-Channel Coding (HW4 due)
  Week 12 (12/01, 02): Information Theory and Statistics (HW5 out)
  Week 13 (12/08, 09): Information Theory and Statistics (HW5 due)
  Week 14 (12/15, 16): Data Compression (HW6 out)
  Week 15 (12/22, 23): Data Compression; Polar Code (HW6 due)
  Week 16 (12/29, 30): Polar Code (HW7 out)
  Week 17 (01/05, 06): Advanced Topics (HW7 due)
  Week 18 (01/12, 13): Final Exam

  16. Outline: 1. Course Information; 2. Overview.

  17. (Slide with no text content.)

  18. Claude E. Shannon (1916 – 2001).

  19. Information theory is a mathematical theory of communication.

  20. (Slide with no text content.)

  21. Information theory is the mathematical theory of communication.

  22. Origin of Information Theory.

  23. Origin of Information Theory.

  24. Origin of Information Theory. Shannon’s landmark paper in 1948 is generally considered the “birth” of information theory. In the paper, Shannon made it clear that information theory is about the quantification of information in a communication system. In particular, it focuses on characterizing the necessary and sufficient condition for a destination terminal to be able to reproduce a message generated by a source terminal.
