

SLIDE 1

Lecture 0 Introduction

I-Hsiang Wang

Department of Electrical Engineering, National Taiwan University
ihwang@ntu.edu.tw

September 13, 2016

SLIDE 2

What is Information Theory about?

It is a mathematical theory of information. Information is usually obtained by getting some "messages" (speech, text, images, etc.) from others. When obtaining information from a message, you may care about:

• What is the meaning of the message?
• How important is the message?
• How much information can I get from the message?

SLIDE 3

What is Information Theory about?


Information theory is about the quantification of information.

SLIDE 4

What is Information Theory about?

• It is a mathematical theory of information (primarily for communication systems).
• It establishes the fundamental limits of various types of information processing: storage, compression, transmission, exchange, etc.
• It is built upon probability theory and statistical decision theory.
• Main focus: the ultimate performance limit (usually the efficiency of information processing) as certain resources (usually the total amount of time) scale to an asymptotic regime, given that the desired task is accomplished at a certain "satisfactory" level.

SLIDE 5

In this course, we will

1 Establish solid foundations and intuitions of information theory,
2 Introduce explicit methods to achieve information-theoretic limits,
3 Demonstrate further applications of information theory beyond communications.

We begin with a brief overview of information theory and the materials to be covered in this course.

SLIDE 6

Course Information

1 Course Information

2 Course Overview
  • General Overview of Information Theory
  • First Part of this Course: Shannon Theory
  • Second Part of this Course: Information Theory and Statistics

SLIDE 7

Course Information

Logistics

1 Instructor: I-Hsiang Wang 王奕翔
  • Email: ihwang@ntu.edu.tw
  • Website: http://cc.ee.ntu.edu.tw/~ihsiangw/
  • Office: MD-524 明達館 524 室
  • Office Hours: 17:00 – 18:00, Monday and Tuesday
2 Lecture Time: 09:10 – 12:10 (periods 2–4), Wednesday
3 Lecture Location: BL-114 博理館 114 室
4 Course Website: http://homepage.ntu.edu.tw/~ihwang/Teaching/Fa16/IT.html
5 Prerequisites: Probability, Linear Algebra.

SLIDE 8

Course Information

Logistics

6 Grading: Homework (30%), Midterm (30%), Final (40%)
7 References:
  • T. Cover and J. Thomas, Elements of Information Theory, 2nd Edition, Wiley-Interscience, 2006.
  • R. Gallager, Information Theory and Reliable Communication, Wiley, 1968.
  • I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, 2nd Edition, Cambridge University Press, 2011.
  • S. M. Moser, Information Theory (Lecture Notes), 4th Edition, ISI Lab, ETH Zürich, Switzerland, 2014.
  • R. Yeung, Information Theory and Network Coding, Springer, 2008.
  • A. El Gamal and Y.-H. Kim, Network Information Theory, Cambridge University Press, 2011.
  • Y. Polyanskiy and Y. Wu, Lecture Notes on Information Theory, MIT (6.441), UIUC (ECE 563), 2012–2016.

SLIDE 9

Course Information

Homework

1 Roughly 4 – 5 problems every 2 – 3 weeks, 6 times in total.
2 Homework (HW) is usually released on Monday. The deadline for submission is usually the next Wednesday in class.
3 Late homework = 0 points. (Let me know in advance if you have difficulties.)
4 Everyone has to develop a detailed solution for one HW problem, documented in LaTeX and submitted one week after the HW is due. (A LaTeX template will be provided.)
   You should discuss the homework problem you are in charge of with the instructor and TA, making sure the solution is correct.
5 This additional effort accounts for part of your homework grade.

SLIDE 10

Course Information

Reading and Lecture Notes

1 Slides are usually released/updated every Sunday evening.
2 Each lecture has assigned readings. Reading is required: it is not enough to learn from the slides!
3 Go through the slides and the assigned readings before our lectures. It helps you learn better.
4 I recommend you get a copy of the textbook by Cover and Thomas. It is a good reference, and we will often assign readings and exercises in the book.
5 Other assigned readings could be Moser's lecture notes, the Polyanskiy-Wu lecture notes, and relevant papers (all can be obtained online).

SLIDE 11

Course Information

Interaction

1 In-class:
  • Language: This class is taught in English. However, to encourage interaction, feel free to ask questions in Mandarin. I will repeat your question in English (if necessary) and answer it in English.
  • Exercises: We put some exercises on the slides to help you learn and understand. Occasionally, I will call for volunteers to solve the exercises in class. Volunteers get bonus points.

2 Out-of-class:
  • Office Hours: Both the TA and I hold 2-hour office hours per week. You are more than welcome to come visit us to ask questions, discuss research, chat, complain, etc. If you cannot make it to the regular office hours, send us an email to schedule a time slot. My schedule can be found on my website. Send us emails with a subject starting with "[NTU Fall16 IT]".
  • Feedback: There will be online polls during the semester to collect your feedback anonymously.

SLIDE 12

Course Information

Course Outline

• Measures of Information: entropy, conditional entropy, mutual information, differential entropy, information divergence.
• Lossless Source Coding: lossless source coding theorem, discrete memoryless sources, asymptotic equipartition property, typical sequences, Fano's inequality, converse proof, ergodic sources, entropy rate.
• Noisy Channel Coding: noisy channel coding theorem, discrete memoryless channels, random coding, typicality decoder, threshold decoder, error probability analysis, converse proof, channel with feedback, channel coding with cost constraints, Gaussian channel capacity.
• Lossy Source Coding (Rate Distortion Theory): distortion, rate-distortion tradeoff, typicality encoder, converse proof.
• Capacity-Achieving Codes: polar coding.

SLIDE 13

Course Information

Course Outline

• Statistical Decision Theory: hypothesis testing, estimation, minimax risk, Bayes risk.
• Information Theory and Statistics: method of types, Sanov's theorem, large deviations, large sample asymptotics, Cramér-Rao lower bound, high-dimensional estimation problems.
• Advanced Topics (if time allows): compressed sensing, community detection, non-asymptotic information theory, etc.

SLIDE 14

Course Information

Tentative Schedule

Week  Date   Content                                 Remark
1     09/14  Introduction; Measures of Information
2     09/21  Measures of Information                 HW1 out
3     09/28  No Lecture (I-Hsiang out of town)
4     10/05  Lossless Source Coding                  HW1 due
5     10/12  Lossless Source Coding                  HW2 out
6     10/19  Noisy Channel Coding
7     10/26  Noisy Channel Coding                    HW2 due; HW3 out
8     11/02  Lossy Source Coding
9     11/09  Lossy Source Coding                     HW3 due; HW4 out

SLIDE 15

Course Information

Tentative Schedule

Week  Date   Content                                 Remark
10    11/16  Polar Coding
11    11/23  Midterm Exam                            HW4 due
12    11/30  Statistical Decision Theory
13    12/07  Statistical Decision Theory             HW5 out
14    12/14  Information Theory and Statistics
15    12/21  Information Theory and Statistics       HW5 due; HW6 out
16    12/28  Information Theory and Statistics
17    01/04  Advanced Topics                         HW6 due
18    01/11  Final Exam

SLIDE 16

Course Overview

1 Course Information

2 Course Overview
  • General Overview of Information Theory
  • First Part of this Course: Shannon Theory
  • Second Part of this Course: Information Theory and Statistics

SLIDE 17

Course Overview

Claude E. Shannon (1916 – 2001)

SLIDE 18

Course Overview

Information theory

– a mathematical theory of communication

SLIDE 19

Course Overview

Information theory

– the mathematical theory of communication

SLIDE 20

Course Overview

Origin of Information Theory

SLIDE 21

Course Overview

Origin of Information Theory

Shannon's 1948 paper is generally considered the "birth" of information theory and of (modern) digital communication. It made clear that information theory is about the quantification of information. In particular, it focuses on characterizing the necessary and sufficient conditions for a destination terminal to be able to reproduce a message generated by a source terminal.

SLIDE 22

Course Overview General Overview of Information Theory

1 Course Information

2 Course Overview
  • General Overview of Information Theory
  • First Part of this Course: Shannon Theory
  • Second Part of this Course: Information Theory and Statistics

SLIDE 23

Course Overview General Overview of Information Theory

What is Information Theory about?

SLIDE 24

Course Overview General Overview of Information Theory

It is about the analysis of fundamental limits

1 Stochastic modeling
   It is a unified theory based on stochastic modeling (information source, noisy channel, etc.).

2 Theorems, not just definitions
   It provides rigorous theorems on the optimal performance of algorithms (coding schemes), rather than merely definitions.

3 Sharp phase transition
   It draws the boundary between what is possible to achieve and what is impossible, leading to math-driven system design.

SLIDE 25

Course Overview General Overview of Information Theory

It is about the design driven by theory

• Data compression
• Error-correcting codes
• Cryptography
• DSL modems
• Cellular systems
• Wireless access
• Compressive sensing
• Clustering
• … and much more!

SLIDE 26

Course Overview General Overview of Information Theory

It is (usually) about the asymptotic system behavior

1 Throughout this course, we prove theorems like:

   If the parameters of a family of algorithms satisfy condition a, then as the problem size tends to infinity, the performance metric reaches a certain "satisfactory" level with high probability.

   and

   If the parameters of a family of algorithms do NOT satisfy condition b, then as the problem size tends to infinity, the performance metric fails to reach a certain "satisfactory" level with high probability.

2 For different algorithms, the asymptotic behavior of the performance metric can often be characterized, and the fundamental limit can be used to benchmark different algorithms.

3 The asymptotic performance limits and thresholds are simple to compute (in most cases).
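To make the "with high probability" flavor concrete, here is a toy numerical sketch of my own (the repetition-code setup is an illustrative assumption, not something from the lecture): the exact error probability of an n-fold repetition code over a binary symmetric channel with crossover probability 0.1, decoded by majority vote, vanishes as n grows.

```python
import math

def majority_error(n, p):
    """Exact error probability of an n-fold repetition code over a BSC(p)
    with majority-vote decoding (n odd): the chance of more than n/2 flips."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

for n in (1, 11, 101):
    # the error probability drops toward 0 as the blocklength n grows
    print(n, majority_error(n, 0.1))
```

Note the trade-off in this toy family: reliability is bought by letting the rate 1/n tend to zero. The channel coding theorem (later in the course) shows that a fixed positive rate below capacity suffices.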

SLIDE 27

Course Overview First Part of this Course: Shannon Theory

1 Course Information

2 Course Overview
  • General Overview of Information Theory
  • First Part of this Course: Shannon Theory
  • Second Part of this Course: Information Theory and Statistics

SLIDE 28

Course Overview First Part of this Course: Shannon Theory

Communication System

[Block diagram: Source → Encoder → Channel (with Noise) → Decoder → Destination]

Above is an abstract model of a communication system:

1 Source aims to deliver some message to Destination. (Messages include video, audio, text, etc.)
2 Channel is the physical medium that connects Source and Destination, usually subject to certain noise disturbances. (Media include cable, optical fiber, EM radiation, etc.)
3 Encoder can carry out any processing of the source output, including compression, modulation, insertion of redundancy, etc.
4 Decoder can carry out any processing of the channel output to reproduce the source message.

SLIDE 29

Course Overview First Part of this Course: Shannon Theory

A primary concern of information theory is the encoder and the decoder:
• How do the encoder and the decoder function? (algorithm design)
• The existence or nonexistence of encoders and decoders that achieve a given level of performance. (fundamental limit)

SLIDE 30

Course Overview First Part of this Course: Shannon Theory

Some History

Prior to the 1948 paper, the design of communication systems followed the analog paradigm: if the source produces an electromagnetic waveform, the destination should try its best to reconstruct this waveform in order to extract the useful information (usually, voice). This line of research was based on Fourier analysis and gave birth to sampling theory.

Shannon asked: If the receiver knows that a sine wave of unknown frequency is to be communicated, why not simply send the frequency rather than the entire waveform?

Prior to Shannon, theorists and engineers were able to analyze the performance of particular choices of encoders/decoders, but had little knowledge about the ultimate limit.

Shannon asked: Over all possible encoders/decoders, what is the necessary and sufficient condition for the destination to be able to reconstruct the message sent from the source?

SLIDE 31

Course Overview First Part of this Course: Shannon Theory

Shannon's View

[Block diagram: Source → Encoder → Channel (with Noise) → Decoder → Destination]

Key new insights due to Shannon's work:
• Shannon: "Information is the resolution of uncertainty." Indeed, the set of possible source outputs, rather than any particular output, is of primary interest.
• Introduction of an abstract mathematical model of communication systems based on random processes (hence, a stochastic model).
• Creation of the digital paradigm of communication system design, with the bit as the universal currency of information, by proposing and proving the source-channel separation theorem.

SLIDE 32

Course Overview First Part of this Course: Shannon Theory

Stochastic Modeling

[Block diagram: Source → Encoder → Channel (with Noise) → Decoder → Destination]

• Source: model the information source by a random process, where the data to be conveyed are drawn randomly from a given distribution.
• Channel: model the noisy channel by a random process, where the impact of noise is drawn randomly from a given distribution.

Why use random processes to model a communication system? Shannon: "The system must be designed to operate for each possible selection, not just the one which will actually be chosen since this is unknown at the time of design."
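As a minimal sketch of such a stochastic channel model (my own illustrative example; the binary symmetric channel is an assumption, not singled out on this slide), the following Python snippet draws the noise at random, independently of the input:

```python
import random

def bsc(bits, p):
    """Simulate a binary symmetric channel: flip each input bit
    independently with probability p (the random noise model)."""
    return [b ^ (random.random() < p) for b in bits]

x = [random.randint(0, 1) for _ in range(10)]  # one possible source selection
y = bsc(x, 0.1)
print(x)
print(y)  # on average, about one bit in ten differs from the input
```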

SLIDE 33

Course Overview First Part of this Course: Shannon Theory

A Profound Result: Source-Channel Separation

[Block diagram: Source → Source Encoder → Bits → Channel Encoder → Noisy Channel → Channel Decoder → Bits → Source Decoder → Destination; the bits form the binary interface]

• Shannon: no loss in optimality by splitting the coders into source coders and channel coders.
• In other words, introducing a digital (binary) interface does not incur any loss of optimality.
• The separation of source coding and channel coding simplifies engineering design: source coder design (data compression) and channel coder design (data transmission).

SLIDE 34

Course Overview First Part of this Course: Shannon Theory

I have always wondered how Shannon came up with the brilliant idea of separating source coding and channel coding. A very likely answer:

Shannon is simply a genius.

A more down-to-earth answer:

Shannon saw the essence of research: seek simplification first.

SLIDE 35

Course Overview First Part of this Course: Shannon Theory

Original Block Diagram

[Block diagram: Source → Encoder → Channel (with Noise) → Decoder → Destination]

Simplification: Remove the Channel Noise!

[Block diagram: Source → Encoder → Channel (noise removed) → Decoder → Destination]

This step makes life much easier. Yet, it is still a non-trivial problem.

SLIDE 36

Course Overview First Part of this Course: Shannon Theory

Source Coding (Data Compression)

[Block diagram: Source → Source Encoder → Channel Encoder → Noisy Channel → Channel Decoder → Source Decoder → Destination]

Features of source messages:
• Uncertainty: the destination has no idea a priori which message is chosen by the source.
• Redundancy: though randomly chosen, some choices are more likely, others less likely.

Goal: remove the redundancy of the source message and represent it by a bit sequence, so that it can be reproduced at the destination.

SLIDE 37

Course Overview First Part of this Course: Shannon Theory

Source Coding (Data Compression)

[Block diagram: Source → Source Encoder → Channel Encoder → Noisy Channel → Channel Decoder → Source Decoder → Destination]

s[1], …, s[N] → b[1], …, b[K] → ŝ[1], …, ŝ[N]

Notation:
• {s[1], …, s[N]} represents the source message; each s[t] is called a "source symbol".
• {b[1], …, b[K]} represents the codeword generated by the source encoder; each bit b[t] is called a "source codeword symbol (bit)".
• {ŝ[1], …, ŝ[N]} represents the reproduced source message at the destination.

SLIDE 38

Course Overview First Part of this Course: Shannon Theory

Source Coding (Data Compression)

[Block diagram: Source → Source Encoder → Channel Encoder → Noisy Channel → Channel Decoder → Source Decoder → Destination]

s[1], …, s[N] → b[1], …, b[K] → ŝ[1], …, ŝ[N]

Question: Given N (# of source symbols), what is the minimum K (# of bits) needed to recover s[1], …, s[N] at the decoder?

It is not hard to show that the smallest K = Θ(N). (Check!)

The right (non-trivial) question to ask is: what is the minimum value of K/N?

SLIDE 39

Course Overview First Part of this Course: Shannon Theory

Source Coding (Data Compression)

[Block diagram: Source → Source Encoder → Channel Encoder → Noisy Channel → Channel Decoder → Source Decoder → Destination]

s[1], …, s[N] → b[1], …, b[K] → ŝ[1], …, ŝ[N]

A Source Coding Theorem (roughly)
The destination can reconstruct the source message losslessly ⟺ code rate R ≜ K/N > H, the entropy rate of the source.

The entropy rate (defined later) is a quantity that can be computed from the distribution of {S[t] | t ∈ ℕ}.
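As a quick numerical companion (a sketch of my own, under the simplifying assumption of an i.i.d. Bernoulli(p) source, whose entropy rate H equals the binary entropy of p), the snippet below computes the lossless compression limit in bits per source symbol:

```python
import math

def binary_entropy(p):
    """H(p) in bits: the entropy rate of an i.i.d. Bernoulli(p) source,
    hence the minimal achievable code rate K/N for lossless recovery."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0: fair coin flips are incompressible
print(binary_entropy(0.1))  # ~0.469: compressible to ~0.47 bits per symbol
```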

SLIDE 40

Course Overview First Part of this Course: Shannon Theory

Original Block Diagram

[Block diagram: Source → Encoder → Channel (with Noise) → Decoder → Destination]

Simplification: Remove the Source Redundancy!

[Block diagram: Source emitting i.i.d. Bernoulli(1/2) bits, i.e., random bits → Encoder → Channel (with Noise) → Decoder → Destination]

It remains a highly non-trivial problem.

SLIDE 41

Course Overview First Part of this Course: Shannon Theory

Channel Coding (Data Transmission)

[Block diagram: Source → Source Encoder → Channel Encoder → Noisy Channel → Channel Decoder → Source Decoder → Destination]

Features of the noisy channel:
• Noise: the symbols sent by the channel encoder are corrupted by random noise.
• Uniform messages: the input of the channel encoder is assumed WLOG to be a bit sequence with no redundancy, since source coding removes all redundancy and converts the data to bit sequences.

Goal: add the minimum redundancy such that messages (bit sequences) can be decoded reliably.

SLIDE 42

Course Overview First Part of this Course: Shannon Theory

Channel Coding (Data Transmission)

[Block diagram: Source → Source Encoder → Channel Encoder → Noisy Channel p(y|x) → Channel Decoder → Source Decoder → Destination]

b[1], …, b[K] → x[1], …, x[N] → y[1], …, y[N] → b̂[1], …, b̂[K]

Notation:
• {b[1], …, b[K]} represents the message; each bit b[t] is called a "data symbol (bit)".
• {x[1], …, x[N]} represents the codeword; each x[t] is called a "coded symbol".
• {y[1], …, y[N]} represents the channel output.

SLIDE 43

Course Overview First Part of this Course: Shannon Theory

Channel Coding (Data Transmission)

[Block diagram: Source → Source Encoder → Channel Encoder → Noisy Channel p(y|x) → Channel Decoder → Source Decoder → Destination]

b[1], …, b[K] → x[1], …, x[N] → y[1], …, y[N] → b̂[1], …, b̂[K]

Question: Given K (# of input bits), what is the minimum N (# of coded symbols) needed to recover b[1], …, b[K] at the decoder?

It turns out that N = Θ(K). However, proving this is already non-trivial.

Shannon further asked: what is the maximum value of K/N?

SLIDE 44

Course Overview First Part of this Course: Shannon Theory

Channel Coding (Data Transmission)

[Block diagram: Source → Source Encoder → Channel Encoder → Noisy Channel p(y|x) → Channel Decoder → Source Decoder → Destination]

b[1], …, b[K] → x[1], …, x[N] → y[1], …, y[N] → b̂[1], …, b̂[K]

A Channel Coding Theorem (roughly)
The (channel) decoder can decode the message reliably ⟺ code rate R ≜ K/N < C, the channel capacity of the channel.

The channel capacity (defined later) is a quantity that can be computed by maximizing the "mutual information" between X and Y; it is a function of the distribution of the channel noise.
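For one concrete instance (my own sketch; the binary symmetric channel is an assumption for illustration, not the general setting of the theorem), the maximization has the classical closed form C = 1 − H(p) for a BSC with crossover probability p, attained by a uniform input distribution:

```python
import math

def binary_entropy(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel: the largest
    reliably achievable code rate K/N (max of I(X; Y), uniform input)."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0: a noiseless bit pipe
print(bsc_capacity(0.11))  # ~0.5: rates below ~0.5 bits/use are achievable
print(bsc_capacity(0.5))   # 0.0: the output is independent of the input
```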

SLIDE 45

Course Overview First Part of this Course: Shannon Theory

Explicit Construction of Computationally-Efficient Optimal Codes

In Shannon's 1948 paper:
• He proved the lossless source coding theorem.
• He also "sketched" a proof of the noisy channel coding theorem. (The first rigorous proof of the noisy channel coding theorem was due to Amiel Feinstein (1954).)

Caveat: these information-theoretic results only prove the fundamental limits and the existence of coding schemes; they do not give explicit constructions of computationally efficient optimal codes.

The (long) quest for computationally efficient optimal codes:
1 Lossless source coding: Huffman code [Huffman, 1951] (see the sketch below)
2 Noisy channel coding: polar code [Arikan, 2009]
3 Lossy source coding: polar code [Korada-Urbanke, 2010]
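As a taste of the explicit-construction side, here is a compact Huffman coder (the standard textbook construction, sketched by me rather than taken from the lecture); for a long i.i.d. source, its average length per symbol approaches the entropy:

```python
import heapq
from collections import Counter

def huffman_code(freq):
    """Build a Huffman code for a {symbol: frequency} map.
    Returns a {symbol: bitstring} prefix-free code."""
    # heap entries: (total frequency, tiebreaker, {symbol: codeword-so-far})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f0, _, c0 = heapq.heappop(heap)  # the two least likely subtrees
        f1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (f0 + f1, count, merged))
        count += 1
    return heap[0][2]

text = "abracadabra"
freq = Counter(text)
code = huffman_code(freq)
avg = sum(freq[s] * len(w) for s, w in code.items()) / len(text)
print(code)
print(avg)  # average code length in bits per symbol (~2.09 here)
```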

SLIDE 46

Course Overview Second Part of this Course: Information Theory and Statistics

1 Course Information

2 Course Overview
  • General Overview of Information Theory
  • First Part of this Course: Shannon Theory
  • Second Part of this Course: Information Theory and Statistics

SLIDE 47

Course Overview Second Part of this Course: Information Theory and Statistics

Statistical Decision Theory

• Model: data are generated from a random model whose distribution p_{X;θ} is controlled by a hidden parameter θ.
• Goal: infer the hidden parameter θ from the collected data samples. The quality of the estimate θ̂ is measured by some "risk" function, such as |θ − θ̂|.

[Diagram: a (stochastic) generative model p_{X;θ} with hidden parameter θ produces samples X1, X2, …, Xn; a decision-making procedure outputs the estimate θ̂]

Statistical decision theory tells us how to design decision-making algorithms that minimize the "risk". We are also interested in how the minimal risk scales as the sample size n → ∞.
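To make "risk" concrete, here is a small Monte Carlo sketch of my own (assuming, purely for illustration, i.i.d. Gaussian(θ, 1) samples, the sample-mean estimator, and squared-error risk); the estimated risk decays like 1/n:

```python
import random

def mse_of_sample_mean(theta, n, trials=5000):
    """Monte Carlo estimate of the risk E[(theta_hat - theta)^2] when
    X1, ..., Xn are i.i.d. Gaussian(theta, 1) and theta_hat is their mean."""
    total = 0.0
    for _ in range(trials):
        xs = [random.gauss(theta, 1.0) for _ in range(n)]
        theta_hat = sum(xs) / n
        total += (theta_hat - theta) ** 2
    return total / trials

for n in (10, 100, 1000):
    print(n, mse_of_sample_mean(2.0, n))  # risk shrinks roughly like 1/n
```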

SLIDE 48

Course Overview Second Part of this Course: Information Theory and Statistics

Information Theory and Statistics

1 Statistical decision theory forms the foundation of decoding in information theory:
  • In source coding, it tells us how to choose the best estimate of the source sequence.
  • In channel coding, given a transmission scheme and the channel output, it tells us how to choose the most likely codeword.

2 Another critical aspect of information theory is mechanism design: codebook design in channel coding, efficient coding algorithms, interaction, etc.

3 The techniques developed in information theory also enrich the development of statistics in the asymptotic regime:
  • information measures
  • proof techniques for impossibility results

In this part of the course we will see a lot of interplay between information theory and statistics.

SLIDE 49

Summary

SLIDE 50

• Information theory focuses on the quantitative aspects of information, not the qualitative aspects.
• Information theory is mainly about what is possible and what is impossible in communication systems.
• In information theory, one investigates problems in communication systems through the lens of probability theory and statistics.
• In this course, we mainly focus on discrete-time signals, not on continuous-time signals.
• Source-channel separation forms the basis of digital communication, where binary digits (bits) become the universal currency of information.
• Statistical decision theory forms the foundation of decision making in information theory.
• Information-theoretic tools are quite useful in studying asymptotic behaviors as well as deriving performance bounds for various statistical problems.
