Pre-Quantum Information Theory

Goutam Paul

Cryptology and Security Research Unit, Indian Statistical Institute, Kolkata
February 9, 2016
Lecture at the International School and Conference on Quantum Information, Institute of Physics (IOP), Bhubaneswar (Feb 9-18, 2016).

Outline

1. Measures of Information: Uncertainty, Compressibility, Randomness, Encryption
2. Measures of Information Flow: Channel Capacity, Code, Noisy Coding
3. Quantum Information


Information and Probability

For an event with probability p, let I(p) denote the information contained in it.

p ↓ ⇒ I(p) ↑ and p ↑ ⇒ I(p) ↓.

For two independent events with probabilities p1 and p2, I(p1 p2) = I(p1) + I(p2).

Thus, a natural definition is I(p) ≜ log(1/p) = − log p.

Goutam Paul Pre-Quantum Information Theory Slide 5 of 34
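The definition above is easy to check numerically; a minimal sketch (base-2 logarithm, so information is in bits; the probabilities are made up for illustration):

```python
import math

def info(p):
    """Information (in bits) of an event with probability p: I(p) = -log2 p."""
    return -math.log2(p)

# Rarer events carry more information.
assert info(0.1) > info(0.9)

# Independent events: information adds, I(p1 * p2) = I(p1) + I(p2).
p1, p2 = 0.5, 0.25
assert math.isclose(info(p1 * p2), info(p1) + info(p2))

print(info(0.5))  # a fair coin flip carries 1.0 bit
```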

Relation to Uncertainty / Surprise / Knowledge Gain

Amount of information contained in an event
= Amount of uncertainty before the event happens
= Amount of surprise when the event happens
= Amount of knowledge gained after the event happens

Goutam Paul Pre-Quantum Information Theory Slide 6 of 34

Average Information

Let X denote a random variable taking values from a discrete set (which may denote a set of events or a source of symbols) with probabilities p(x) = Prob(X = x).

Average information in X (or of the corresponding set / source):

H(X) ≜ E[I(p(X))] = E[− log p(X)] = − Σ_{x∈X} p(x) log p(x).

This is called the entropy of the variable X (or of the set / source).

Goutam Paul Pre-Quantum Information Theory Slide 7 of 34
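A direct implementation of the entropy formula (a minimal sketch, in base 2):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits (0 log 0 := 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

assert entropy([1.0]) == 0                        # a certain event carries no information
assert math.isclose(entropy([0.5, 0.5]), 1.0)     # fair coin: 1 bit
assert math.isclose(entropy([0.25] * 4), 2.0)     # uniform over 4 symbols: 2 bits
print(round(entropy([0.5, 0.3, 0.2]), 3))  # 1.485
```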

Joint and Conditional Entropy

H(X, Y) ≜ − Σ_x Σ_y p(x, y) log p(x, y).

H(Y | X) ≜ Σ_x p(x) H(Y | X = x)
= − Σ_x p(x) Σ_y p(y|x) log p(y|x)
= − Σ_x Σ_y p(x, y) log p(y|x).

Goutam Paul Pre-Quantum Information Theory Slide 8 of 34
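Both definitions, and the chain rule H(X, Y) = H(X) + H(Y | X), can be checked on a small joint distribution (the table below is assumed for illustration):

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over X = {0, 1}, Y = {0, 1}.
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

H_XY = H(joint.values())                 # joint entropy H(X, Y)
p_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
H_X = H(p_x.values())                    # marginal entropy H(X)

# H(Y|X) = -sum_{x,y} p(x, y) log2 p(y|x), with p(y|x) = p(x, y) / p(x).
H_Y_given_X = -sum(p * math.log2(p / p_x[x]) for (x, _), p in joint.items())

# Chain rule: H(X, Y) = H(X) + H(Y|X).
assert math.isclose(H_XY, H_X + H_Y_given_X)
```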

Important Results Related to Entropy

Chain Rule: H(X, Y) = H(X) + H(Y|X).
Subadditivity: H(X, Y) ≤ H(X) + H(Y).
Conditioning does not increase entropy: H(Y | X) ≤ H(Y).

Goutam Paul Pre-Quantum Information Theory Slide 9 of 34

Mutual Information

I(X; Y) ≜ Σ_x Σ_y p(x, y) log [ p(x, y) / (p(x) p(y)) ]
= H(X) − H(X|Y)
= H(Y) − H(Y|X)
= H(X) + H(Y) − H(X, Y).

Goutam Paul Pre-Quantum Information Theory Slide 10 of 34
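The equivalent forms can be verified on a small joint distribution (assumed for illustration):

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution and its marginals.
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
p_x = {0: 0.75, 1: 0.25}
p_y = {0: 0.625, 1: 0.375}

# Direct definition: I(X;Y) = sum_{x,y} p(x,y) log2 [p(x,y) / (p(x) p(y))].
I = sum(p * math.log2(p / (p_x[x] * p_y[y])) for (x, y), p in joint.items())

# Equivalent form: I(X;Y) = H(X) + H(Y) - H(X, Y).
I2 = H(p_x.values()) + H(p_y.values()) - H(joint.values())
assert math.isclose(I, I2)
assert I >= 0  # mutual information is never negative
```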


Information and Codeword Length

Kraft Inequality: An instantaneous (prefix-free) code over an r-ary alphabet with codeword lengths ℓ1, ℓ2, . . . , ℓn exists if and only if

Σ_{i=1}^{n} r^{−ℓi} ≤ 1.

An Engineering Optimization: Minimize L = Σ_{i=1}^{n} p_i ℓ_i subject to Σ_{i=1}^{n} r^{−ℓi} ≤ 1. This gives ℓ*_i = − log_r p_i and L* = Σ_{i=1}^{n} p_i ℓ*_i = H(X) (with logarithms taken to base r).

Goutam Paul Pre-Quantum Information Theory Slide 12 of 34
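A quick check of the Kraft inequality on a concrete binary prefix code (the codewords are an assumed example):

```python
# Binary (r = 2) prefix-free code: no codeword is a prefix of another.
code = ["0", "10", "110", "111"]
lengths = [len(c) for c in code]

# Kraft sum: sum_i r^(-l_i) must be <= 1 for a prefix code to exist.
kraft = sum(2 ** -l for l in lengths)
print(kraft)  # 1.0 -- the inequality is tight, so the code is "complete"

# Lengths violating Kraft (three codewords of length 1 over a binary
# alphabet) cannot come from any prefix code.
assert sum(2 ** -1 for _ in range(3)) > 1
```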

Entropy and Data Compression

For integer choices of codeword lengths, H(X) ≤ L* < H(X) + 1.

For supersymbols of n symbols at a time, H(X) ≤ L*_n < H(X) + 1/n, and L*_n = H(X) is achievable for a stationary distribution.

This is Shannon's Source/Noiseless Coding Theorem.

Goutam Paul Pre-Quantum Information Theory Slide 13 of 34
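One concrete construction achieving H(X) ≤ L < H(X) + 1 is the Shannon code with lengths ℓi = ⌈− log2 pi⌉; a sketch with an assumed source distribution:

```python
import math

# Hypothetical source distribution.
p = [0.5, 0.3, 0.2]

# Shannon code lengths l_i = ceil(-log2 p_i) always satisfy Kraft...
lengths = [math.ceil(-math.log2(pi)) for pi in p]
assert sum(2 ** -l for l in lengths) <= 1

H = -sum(pi * math.log2(pi) for pi in p)      # source entropy, bits
L = sum(pi * li for pi, li in zip(p, lengths))  # expected codeword length

# ...and achieve H(X) <= L < H(X) + 1.
assert H <= L < H + 1
print(lengths, round(L, 3))  # [1, 2, 3] 1.7
```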


Entropy as a Measure of Randomness

Suppose p_i ≥ 0, for 1 ≤ i ≤ n. Maximizing − Σ_i p_i log p_i subject to Σ_i p_i = 1 gives p_1 = p_2 = · · · = p_n = 1/n (the uniform distribution).

Goutam Paul Pre-Quantum Information Theory Slide 15 of 34
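A numeric spot-check that the uniform distribution maximizes entropy (the skewed distributions are arbitrary examples):

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 4
uniform = [1 / n] * n

# The uniform distribution attains the maximum entropy log2 n ...
assert math.isclose(H(uniform), math.log2(n))

# ... and any skewed distribution on n outcomes has strictly less.
for skewed in ([0.7, 0.1, 0.1, 0.1], [0.4, 0.3, 0.2, 0.1]):
    assert H(skewed) < math.log2(n)
```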

slide-43
SLIDE 43

Roadmap

1

Measures of Information Uncertainty Compressibility Randomness Encryption

2

Measures of Information Flow Channel Capacity Code Noisy Coding

3

Quantum Information

Encryption increases Entropy

The goal of encryption is to make the transmitted message look random.

Typically, H(C) > H(P).

But H(P | C) may be less than H(P).

Goutam Paul Pre-Quantum Information Theory Slide 17 of 34

Example: Plaintext Entropy

Given:
Three possible plaintexts: a, b, c, with probabilities 0.5, 0.3, 0.2.
Three possible ciphertexts: U, V, W.
Two possible keys: k1, k2, equally likely.
Encryption of (a, b, c) under k1: (U, V, W). Encryption under k2: (U, W, V).

Goutam Paul Pre-Quantum Information Theory Slide 18 of 34

Example: Plaintext Entropy (... contd)

One can calculate:
p(U) = 0.5, p(V) = p(W) = 0.25.
p(a | V) = 0, p(b | V) = 0.6, p(c | V) = 0.4.
Similarly, one can calculate the probabilities of a, b, c given W.

Goutam Paul Pre-Quantum Information Theory Slide 19 of 34

Example: Plaintext Entropy (... contd)

Thus,
H(P) = − (0.5 log2 0.5 + 0.3 log2 0.3 + 0.2 log2 0.2) ≈ 1.485.
H(P | C) = − Σ_{x∈{U,V,W}} Σ_{y∈{a,b,c}} p(x) p(y|x) log2 p(y|x) ≈ 0.485.

Goutam Paul Pre-Quantum Information Theory Slide 20 of 34
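The numbers in this example can be reproduced by enumerating the (key, plaintext) pairs; a minimal sketch of that computation:

```python
import math

# The toy cipher from the example: plaintext probabilities, two equally
# likely keys, and each key's substitution table.
p_plain = {"a": 0.5, "b": 0.3, "c": 0.2}
enc = {"k1": {"a": "U", "b": "V", "c": "W"},
       "k2": {"a": "U", "b": "W", "c": "V"}}

# Joint distribution p(plaintext, ciphertext).
joint = {}
for key, table in enc.items():
    for pt, pr in p_plain.items():
        ct = table[pt]
        joint[(pt, ct)] = joint.get((pt, ct), 0) + 0.5 * pr

# Ciphertext marginals.
p_cipher = {}
for (pt, ct), pr in joint.items():
    p_cipher[ct] = p_cipher.get(ct, 0) + pr
assert math.isclose(p_cipher["U"], 0.5) and math.isclose(p_cipher["V"], 0.25)

H_P = -sum(p * math.log2(p) for p in p_plain.values())
H_P_given_C = -sum(p * math.log2(p / p_cipher[ct]) for (pt, ct), p in joint.items())
print(round(H_P, 3), round(H_P_given_C, 3))  # 1.485 0.485
```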

Perfect Secrecy

Information Theoretic Security: H(P | C) = H(P).

Or, equivalently, Prob(P | C) = Prob(P).

A necessary condition for this is H(K) ≥ H(P).

Goutam Paul Pre-Quantum Information Theory Slide 21 of 34
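A cipher that meets this condition exactly is the one-time pad. The sketch below uses a shift cipher over Z3 with a fresh uniform key (the plaintext distribution is assumed for illustration) and checks Prob(P | C) = Prob(P) exactly:

```python
from fractions import Fraction

# One-time pad over Z3: C = (P + K) mod 3, with K uniform and fresh.
p_plain = {0: Fraction(1, 2), 1: Fraction(3, 10), 2: Fraction(1, 5)}
p_key = {k: Fraction(1, 3) for k in range(3)}

joint = {}  # p(plaintext, ciphertext)
for pt, pp in p_plain.items():
    for k, pk in p_key.items():
        ct = (pt + k) % 3
        joint[(pt, ct)] = joint.get((pt, ct), 0) + pp * pk

p_cipher = {}
for (pt, ct), pr in joint.items():
    p_cipher[ct] = p_cipher.get(ct, 0) + pr

# Perfect secrecy: Prob(P = p | C = c) = Prob(P = p) for every pair,
# so observing the ciphertext teaches the attacker nothing.
for (pt, ct), pr in joint.items():
    assert pr / p_cipher[ct] == p_plain[pt]
```

Exact rational arithmetic (`Fraction`) makes the equality check exact rather than approximate.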


Discrete Channel

Input alphabet X. Output alphabet Y. Probability transition matrix p(y|x).

Informational channel capacity: C = max_{p(x)} I(X; Y).

Goutam Paul Pre-Quantum Information Theory Slide 24 of 34
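For a concrete channel the maximization over p(x) can be approximated numerically; a sketch for the binary symmetric channel with an assumed flip probability, compared against the known closed form C = 1 − h2(ε):

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_info(q, eps):
    """I(X;Y) for a BSC: input distribution P(X=1)=q, flip probability eps."""
    py1 = q * (1 - eps) + (1 - q) * eps   # P(Y = 1)
    return h2(py1) - h2(eps)              # I(X;Y) = H(Y) - H(Y|X)

eps = 0.1  # assumed crossover probability

# Maximize over the input distribution by a grid search.
C = max(bsc_mutual_info(q / 1000, eps) for q in range(1001))

# Closed form C = 1 - h2(eps), attained at the uniform input q = 1/2.
assert math.isclose(C, 1 - h2(eps), abs_tol=1e-6)
```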


An (M, n) Code

An index set {1, 2, . . . , M}.

An encoding function C : {1, 2, . . . , M} → X^n.

A decoding function D : Y^n → {1, 2, . . . , M}.

Goutam Paul Pre-Quantum Information Theory Slide 26 of 34
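As a toy instance of these three ingredients, a binary 3-fold repetition code (an assumed example, not from the slides): M = 2 messages, block length n = 3.

```python
# A (2, 3) code over the binary alphabet.
def encode(i):
    """C : {1, 2} -> X^3; message i is sent as three copies of bit i - 1."""
    return ((i - 1),) * 3

def decode(y):
    """D : Y^3 -> {1, 2}; majority vote over the received bits."""
    return 1 if sum(y) <= 1 else 2

assert encode(2) == (1, 1, 1)
assert decode((1, 0, 1)) == 2   # a single flipped bit is corrected
```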

Error Probability

Conditional error probability given that index i was sent:
ε_i = Pr(D(Y^n) ≠ i | X^n = C(i)) = Σ_{y^n : D(y^n) ≠ i} p(y^n | C(i)).

Maximum error probability: ε_max = max_{i∈{1,2,...,M}} ε_i.

Average error probability: ε_avg = (1/M) Σ_{i=1}^{M} ε_i.

Goutam Paul Pre-Quantum Information Theory Slide 27 of 34
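These quantities can be computed by brute-force enumeration for a small code; a sketch using a binary 3-fold repetition code over a binary symmetric channel (code and flip probability both assumed for illustration):

```python
from itertools import product

eps = 0.1   # BSC flip probability (an assumed value)
M, n = 2, 3

def encode(i):
    return ((i - 1),) * n                  # binary repetition code

def decode(y):
    return 1 if sum(y) <= n // 2 else 2    # majority vote

def p_received(y, x):
    """p(y^n | x^n) for a memoryless BSC: each bit flips with prob eps."""
    pr = 1.0
    for yi, xi in zip(y, x):
        pr *= eps if yi != xi else 1 - eps
    return pr

# epsilon_i = sum over all y^n with D(y^n) != i of p(y^n | C(i)).
errs = [sum(p_received(y, encode(i)) for y in product((0, 1), repeat=n)
            if decode(y) != i) for i in (1, 2)]

eps_max = max(errs)
eps_avg = sum(errs) / M
# For this code, both equal 3*eps^2*(1-eps) + eps^3 = P(2 or 3 flips).
print(round(eps_max, 6))  # 0.028
```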

Rate

R = (log2 M) / n bits per transmission.

A rate R is said to be achievable if there exists a sequence of (⌈2^nR⌉, n) codes such that ε_max → 0 as n → ∞.

Operational channel capacity is the supremum of all achievable rates.

Goutam Paul Pre-Quantum Information Theory Slide 28 of 34


Shannon's Noisy Channel Coding Theorem

All rates below capacity are achievable: ∀ R < C, ∃ a sequence of codes such that ε_max → 0 as n → ∞.

Informational capacity = operational capacity.

Goutam Paul Pre-Quantum Information Theory Slide 30 of 34

Band-Limited Gaussian Channel

C = W log2(1 + P / (N0 W)) bits per second, where W is the bandwidth in Hz, N0/2 watts/Hz is the noise spectral density, and P is the signal power.

Goutam Paul Pre-Quantum Information Theory Slide 31 of 34
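Plugging assumed numbers into the formula (a sketch; the bandwidth and SNR are illustrative, not from the slides):

```python
import math

# Shannon-Hartley capacity C = W log2(1 + P / (N0 * W)); the ratio
# P / (N0 * W) is the dimensionless signal-to-noise ratio (SNR).
W = 3000.0      # bandwidth in Hz (assumed, roughly telephone-line-like)
snr_db = 30.0   # SNR in dB (assumed)
snr = 10 ** (snr_db / 10)   # dB -> linear: 30 dB = 1000

C = W * math.log2(1 + snr)
print(round(C))  # 29902 bits per second
```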


From Pre-Quantum to Quantum

von Neumann entropy.

Schumacher's quantum noiseless coding theorem.

Holevo bound: upper bound on accessible information.

Classical capacity and quantum capacity of quantum channels.

Goutam Paul Pre-Quantum Information Theory Slide 33 of 34
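As a bridge to the quantum case: the von Neumann entropy S(ρ) = −Tr(ρ log2 ρ) equals the Shannon entropy of the eigenvalues of the density matrix ρ. A minimal single-qubit sketch (the states are chosen for illustration):

```python
import math

def shannon(probs):
    """Shannon entropy in bits, skipping (near-)zero eigenvalues."""
    return -sum(p * math.log2(p) for p in probs if p > 1e-12)

def eigvals_2x2(m):
    """Eigenvalues of a real symmetric 2x2 matrix [[a, b], [b, d]]."""
    a, b, d = m[0][0], m[0][1], m[1][1]
    t = (a + d) / 2
    r = math.sqrt(((a - d) / 2) ** 2 + b * b)
    return t - r, t + r

# S(rho) = -Tr(rho log2 rho) = Shannon entropy of rho's eigenvalues.
pure = [[1.0, 0.0], [0.0, 0.0]]    # pure state |0><0|
mixed = [[0.5, 0.0], [0.0, 0.5]]   # maximally mixed qubit, I/2
plus = [[0.5, 0.5], [0.5, 0.5]]    # pure state |+><+|

assert math.isclose(shannon(eigvals_2x2(pure)), 0.0)    # pure: S = 0
assert math.isclose(shannon(eigvals_2x2(mixed)), 1.0)   # I/2: S = 1 bit
assert abs(shannon(eigvals_2x2(plus))) < 1e-9           # also pure: S = 0
```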

THANK YOU. Questions / Comments?

Homepage: http://www.goutampaul.com
Email: goutam.k.paul@gmail.com