Lecture 0 Introduction
I-Hsiang Wang
Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw
September 13, 2016
What is Information Theory about?
Goals of this course:
1 Establish solid foundations and intuitions of information theory.
2 Introduce explicit methods to achieve information-theoretic limits.
3 Demonstrate further applications of information theory beyond communications.
Course Information
1 Instructor: I-Hsiang Wang 王奕翔
  Email: ihwang@ntu.edu.tw
  Website: http://cc.ee.ntu.edu.tw/~ihsiangw/
  Office: MD-524 (明達館 Room 524)
  Office Hours: 17:00 – 18:00, Monday and Tuesday
2 Lecture Time: 09:10 – 12:10 (234), Wednesday
3 Lecture Location: BL-114 (博理館 Room 114)
4 Course Website: http://homepage.ntu.edu.tw/~ihwang/Teaching/Fa16/IT.html
5 Prerequisites: Probability, Linear Algebra.
6 Grading: Homework (30%), Midterm (30%), Final (40%)
7 References:
  …, 2nd Edition, Cambridge University Press, 2011.
  …, 2014.
  …, 2012-2016.
1 Roughly 4 – 5 problems every 2 – 3 weeks, 6 problem sets in total.
2 Homework (HW) is usually released on Monday. The deadline for submission is usually on the next …
3 Late homework = 0 points. (Let me know in advance if you have difficulties.)
4 Everyone has to develop a detailed solution for one HW problem, documented in LaTeX.
  You should discuss the homework problem that you are in charge of with the instructor and TA.
5 This additional effort accounts for part of your homework grade.
1 Slides are usually released/updated every Sunday evening.
2 Each lecture has assigned readings.
3 Go through the slides and the assigned readings before our lectures. It helps you learn better.
4 I recommend you get a copy of the textbook by Cover and Thomas. It is a good reference.
5 Other assigned readings include Moser's lecture notes and the Polyanskiy-Wu lecture notes, among others.
1 In-class:
  Language: This class is taught in English. However, to encourage interaction, feel free to ask questions in Mandarin. I will repeat your question in English (if necessary) and answer it in English.
  Exercises: We put some exercises on the slides to help you learn and understand. Occasionally, I will call for volunteers to solve the exercises in class. Volunteers get bonus points.
2 Out-of-class:
  Office Hours: Both the TA and I hold 2 hours of office hours per week. You are more than welcome to come visit us to ask questions, discuss research, chat, complain, etc. If you cannot make it to the regular office hours, send us an email to schedule a time slot. My schedule can be found on my website. Send us emails with a subject starting with "[NTU Fall16 IT]".
  Feedback: There will be online polls during the semester to collect your feedback anonymously.
Week  Date   Content                                 Remark
 1    09/14  Introduction; Measures of Information
 2    09/21  Measures of Information                 HW1 out
 3    09/28  No Lecture (I-Hsiang out of town)
 4    10/05  Lossless Source Coding                  HW1 due
 5    10/12  Lossless Source Coding                  HW2 out
 6    10/19  Noisy Channel Coding
 7    10/26  Noisy Channel Coding                    HW2 due; HW3 out
 8    11/02  Lossy Source Coding
 9    11/09  Lossy Source Coding                     HW3 due; HW4 out
10    11/16  Polar Coding
11    11/23  Midterm Exam                            HW4 due
12    11/30  Statistical Decision Theory
13    12/07  Statistical Decision Theory             HW5 out
14    12/14  Information Theory and Statistics
15    12/21  Information Theory and Statistics       HW5 due; HW6 out
16    12/28  Information Theory and Statistics
17    01/04  Advanced Topics                         HW6 due
18    01/11  Final Exam
Course Overview
Course Overview General Overview of Information Theory
1 Stochastic modeling
2 Theorems, not just definitions
3 Sharp phase transition
1 Throughout this course, we prove theorems like:
  If the parameters of a family of algorithms satisfy condition (a), then as the problem size tends to infinity, the performance metric reaches a certain "satisfactory" level with high probability.
  If the parameters of a family of algorithms do NOT satisfy condition (b), then as the problem size tends to infinity, the performance metric does NOT reach that "satisfactory" level, with high probability.
2 For different algorithms, the asymptotic behavior of the performance metric can often be …
3 The asymptotic performance limits and thresholds are simple to compute (in most cases).
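A schematic instance from channel coding (the notation here is mine, for illustration): with R the coding rate, C the channel capacity, and P_e^(n) the error probability at blocklength n,

\[
\text{achievability:}\quad R < C \;\Longrightarrow\; \lim_{n\to\infty} P_e^{(n)} = 0,
\qquad
\text{converse:}\quad R > C \;\Longrightarrow\; \liminf_{n\to\infty} P_e^{(n)} > 0 .
\]

The single threshold C separates the rates that are achievable from those that are not; this is the sharp phase transition mentioned above.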
Course Overview First Part of this Course: Shannon Theory
[Block diagram: Source → Encoder → Channel (with Noise) → Decoder → Destination]
1 The Source aims to deliver some message to the Destination.
2 The Channel is the physical medium that connects the Source and the Destination, and is usually subject to noise.
3 The Encoder can carry out any processing of the source output, including compression, modulation, etc.
4 The Decoder can carry out any processing of the channel output to reproduce the source message.
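To make these roles concrete, here is a minimal simulation sketch (my own illustration, not part of the course material): the Encoder is a 3-fold repetition code, the Channel is a binary symmetric channel that flips each bit with probability 0.1, and the Decoder takes a majority vote.

import random

def encode(bits, r=3):
    # Encoder: protect each source bit by repeating it r times.
    return [b for b in bits for _ in range(r)]

def channel(codeword, p=0.1):
    # Binary symmetric channel: flip each transmitted bit independently with probability p.
    return [b ^ (random.random() < p) for b in codeword]

def decode(received, r=3):
    # Decoder: majority vote over each block of r received bits.
    return [int(sum(received[i:i + r]) > r // 2) for i in range(0, len(received), r)]

# Source emits random bits; Destination checks how many were reproduced correctly.
source_bits = [random.randint(0, 1) for _ in range(10000)]
reconstructed = decode(channel(encode(source_bits)))
bit_errors = sum(s != d for s, d in zip(source_bits, reconstructed))
print(f"bit error rate after repetition coding: {bit_errors / len(source_bits):.4f}")

The repetition code trades rate (1/3 bit per channel use) for reliability; the theory developed in this course tells us how much better one can do.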
[Block diagram: Source → Source Encoder → bits → Channel Encoder → Noisy Channel → Channel Decoder → bits → Source Decoder → Destination. The stream of bits in between is the binary interface separating source coding from channel coding.]
Lossless source coding: focus on the Source Encoder/Decoder pair in the chain above.
Source Encoder: s[1], …, s[N] ↦ b[1], …, b[K].   Source Decoder: b[1], …, b[K] ↦ ŝ[1], …, ŝ[N].
Question: how small can the compression rate K/N (number of bits per source symbol) be?
Answer: K/N > the entropy rate of the source, H.
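A worked instance (the numbers are my own illustration, not from the slides): for an i.i.d. Bernoulli(p) source, the entropy rate is the binary entropy function

\[
H = H_b(p) = -p \log_2 p - (1-p) \log_2 (1-p),
\qquad
H_b(0.1) \approx 0.469 \ \text{bits per symbol},
\]

so for p = 0.1 any compression rate K/N above roughly 0.469 suffices, i.e., the source sequence can be represented with fewer than half as many bits.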
Noisy channel coding: now the Source produces i.i.d. Bernoulli(1/2) symbols, i.e., random bits, and we focus on the Channel Encoder/Decoder pair.
Channel Encoder: b[1], …, b[K] ↦ x[1], …, x[N].   Channel Decoder: y[1], …, y[N] ↦ b̂[1], …, b̂[K].
The Noisy Channel is described by the conditional distribution p(y|x).
Question: how large can the coding rate K/N (number of bits per channel use) be?
Answer: K/N < the capacity of the channel, C.
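A worked instance (again my own illustrative numbers): for the binary symmetric channel with crossover probability p, the capacity is

\[
C = 1 - H_b(p),
\qquad
C\big|_{p = 0.1} = 1 - H_b(0.1) \approx 0.531 \ \text{bits per channel use},
\]

so reliable communication is possible at any rate K/N below roughly 0.531 and impossible above it.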
1 Lossless source coding: Huffman code [Huffman, 1951]
2 Noisy channel coding: Polar code [Arıkan, 2009]
3 Lossy source coding: Polar code [Korada-Urbanke, 2010]
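For a feel of the first item, here is a minimal Huffman coding sketch (my own illustration; the sample string and the 8-bits-per-character baseline are arbitrary choices):

import heapq
from collections import Counter

def huffman_code(text):
    # Build a binary prefix code from the empirical symbol frequencies of `text`.
    freq = Counter(text)
    # Heap entries: (subtree frequency, tie-breaker, {symbol: codeword-so-far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol source
        return {sym: "0" for sym in heap[0][2]}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)   # the two least likely subtrees ...
        f2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))  # ... are merged into one
        tie += 1
    return heap[0][2]

text = "this is an example of a huffman tree"
code = huffman_code(text)
encoded = "".join(code[ch] for ch in text)
print(f"{8 * len(text)} bits uncoded vs. {len(encoded)} bits after Huffman coding")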
Course Overview Second Part of this Course: Information Theory and Statistics
1 Statistical decision theory forms the foundation of decoding in information theory.
  In source coding, it tells us how to choose the best estimate of the source sequence.
  In channel coding, given a transmission scheme and the channel output, it tells us how to choose the most likely codeword.
2 Another critical aspect of information theory is mechanism design, e.g., codebook design in channel coding.
3 The techniques developed in information theory also enrich the development of statistics, for example:
  Information measures
  Proof techniques for impossibility results
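As an illustration of item 1 (a standard fact, stated in my own notation): to minimize the probability of error, the channel decoder picks the message with the largest posterior probability given the channel output y^N, which for equally likely messages reduces to maximum-likelihood decoding:

\[
\hat{m} = \arg\max_{m} \Pr\{ m \mid y^N \} = \arg\max_{m} p\big( y^N \,\big|\, x^N(m) \big)
\quad \text{(messages equally likely)}.
\]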