Information Theory and Synthetic Steganography


slide-1
SLIDE 1

Information Theory and Synthetic Steganography

CSM25 Secure Information Hiding

Dr Hans Georg Schaathun

University of Surrey

Spring 2008

Dr Hans Georg Schaathun Information Theory and Synthetic Steganography Spring 2008 1 / 53

slide-2
SLIDE 2

Learning Outcomes

understand the relationship between steganography and established disciplines like communications, information theory, data compression, and coding theory

be familiar with at least one way of doing steganography by cover synthesis


slide-3
SLIDE 3

Reading

Core Reading: Peter Wayner, Disappearing Cryptography, Ch. 6-7
Core Reading: Cox et al., Appendix A
Suggested Reading: Lin & Costello, Error Control Coding


slide-4
SLIDE 4

Communications essentials: Communications and Redundancy

Outline

1. Communications essentials
   Communications and Redundancy
   Anderson and Petitcolas 1999
   Digital Communications
   Shannon Entropy
   Security
   Prediction

2. Compression
   Huffman Coding
   Huffman Steganography

3. Miscellanea
   Synthesis by Grammar
   Redundancy in Images


slide-5
SLIDE 5

Communications essentials: Communications and Redundancy

The communications problem

[Diagram: Alice's message m → Encode → c → Noisy channel → r → Decode → m̂ → Bob]

Bob's problem: estimate m, given the (partly) random output m̂ from the channel.

How much (un)certainty does Bob have about m? This is the subject of information theory and Shannon entropy.

slide-11
SLIDE 11

Communications essentials: Communications and Redundancy

Redundancy of English

Fact: The English language is more than 50% redundant.

Message destroyed on the channel:

t** p*oce*s o**hid**g *ata**nsid* o*her**ata. For ex*****, a **xt f*le c**ld*** hid*** "in**de"****im*ge or***s**nd *ile* By look****at t*e im*g***or list***** to th**s**nd,*yo* w*u*d n*t *no**that***ere is *x*ra info******* *r*sent.

Restored, letter by letter, thanks to redundancy:

the process of hiding data inside other data. For example, a text file could be hidden "inside" an image or a sound file. By looking at the image, or listening to the sound, you would not know that there is extra information present.

from http://www.cdt.org/crypto/glossary.shtml

Redundancy allows Bob to determine the original m.
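The slide's claim that English is more than 50% redundant can be probed, very roughly, with a general-purpose compressor. This is a sketch that is not part of the slides, and zlib is an arbitrary stand-in for a proper entropy estimate; it only shows that ordinary English prose shrinks noticeably when its redundancy is squeezed out.

```python
import zlib

# The restored glossary text from the slide above, as a byte string.
text = (
    "the process of hiding data inside other data. For example, a text "
    "file could be hidden inside an image or a sound file. By looking at "
    "the image, or listening to the sound, you would not know that there "
    "is extra information present."
).encode("ascii")

compressed = zlib.compress(text, 9)      # level 9 = best compression
ratio = len(compressed) / len(text)

print(f"original: {len(text)} bytes, compressed: {len(compressed)} bytes")
assert len(compressed) < len(text)       # redundancy removed => smaller file
```

A general-purpose compressor will not reach the true entropy of English, so the ratio it reports is an upper bound on the information content, not the 50% figure itself.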

slide-16
SLIDE 16

Communications essentials: Communications and Redundancy

Benefits of redundancy

Cross-word puzzles
Understand foreigners with imperfect pronunciation
Hear in a noisy environment
Read bad handwriting

How much would you understand of a lecture without redundancy? How could I mark exam scripts without redundancy?

Cryptanalysis? Steganalysis?

slide-23
SLIDE 23

Communications essentials: Communications and Redundancy

What if there were no redundancy?

No use for steganography! Any text would be meaningful; in particular, ciphertext would be meaningful. Simple encryption would give a stegogramme indistinguishable from cover-text.


slide-27
SLIDE 27

Communications essentials: Anderson and Petitcolas 1999

Perfect compression

Compression removes redundancy: it minimises the average string length (file size) while retaining the information content. Decompression replaces the redundancy, recovering the original (lossless compression).

Perfect means no redundancy in the compressed string. Consequently all strings are used: a(ny) random string can be decompressed ... and yields sensible output.

slide-29
SLIDE 29

Communications essentials: Anderson and Petitcolas 1999

Steganography by Perfect Compression

Anderson and Petitcolas 1998. Assume a perfect compression scheme and a secure cipher.

[Diagram: Message → Encrypt (with Key) → C → Decompress → S; receiver: S → Compress → C → Decrypt (with Key) → Message]

Steganography without data hiding.
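A toy sketch of the Anderson and Petitcolas idea, not from the slides: the ciphertext bits are fed to a "decompressor" whose output looks like innocent cover text, and the receiver "compresses" the cover text to get the bits back. The four-word code table and the function names `embed`/`extract` are hypothetical; a real scheme would need a (near-)perfect compressor for the actual cover source.

```python
# A tiny hand-made prefix-free code for a four-word cover vocabulary.
CODE = {"the": "0", "cat": "10", "sat": "110", "here": "111"}
DECODE = {bits: word for word, bits in CODE.items()}

def embed(cipher_bits: str) -> str:
    """'Decompress' ciphertext bits into a sequence of cover words."""
    words, buf = [], ""
    for b in cipher_bits:
        buf += b
        if buf in DECODE:          # greedy match works: the code is prefix-free
            words.append(DECODE[buf])
            buf = ""
    assert buf == "", "demo assumes the bit string aligns with the code"
    return " ".join(words)

def extract(stego_text: str) -> str:
    """'Compress' the cover words back into the ciphertext bits."""
    return "".join(CODE[w] for w in stego_text.split())

stego = embed("0101100")
print(stego)                        # the cat sat the
assert extract(stego) == "0101100"
```

Because the cipher output is (ideally) indistinguishable from random bits, and every bit string decompresses to valid-looking cover text, nothing is "hidden" in an existing cover: the stegotext is synthesised outright.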


slide-32
SLIDE 32

Communications essentials: Digital Communications

Problems in natural language

How efficient is the redundancy? Natural languages are arbitrary: some words and sentences have a lot of redundancy, others have very little. Unstructured redundancy makes correction hard to automate; structured redundancy is necessary for digital communications.

This is the domain of coding theory.

slide-39
SLIDE 39

Communications essentials: Digital Communications

Coding

Channel and source coding.

Source coding (a.k.a. compression): remove redundancy; make a compact representation.

Channel coding (a.k.a. error-control coding): add mathematically structured redundancy; computationally efficient error correction; optimised (low error rate, small space).

Two aspects of information theory.
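As a minimal illustration of channel coding (my own example, not a scheme from the slides), a 3-repetition code adds structured redundancy so that the decoder can correct any single bit flip per 3-bit block by majority vote:

```python
from collections import Counter

def encode(bits):
    """Channel coding: repeat each message bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each 3-bit block corrects one flip per block."""
    out = []
    for i in range(0, len(received), 3):
        block = received[i:i + 3]
        out.append(Counter(block).most_common(1)[0][0])
    return out

msg = [1, 0, 1]
codeword = encode(msg)              # [1, 1, 1, 0, 0, 0, 1, 1, 1]
codeword[1] ^= 1                    # the channel flips one bit
assert decode(codeword) == msg      # majority vote recovers the message
```

The repetition code is the crudest example; the suggested reading (Lin & Costello) covers codes with far better rate for the same error-correcting power.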

slide-42
SLIDE 42

Communications essentials: Digital Communications

Channel and Source Coding

[Diagram: Message → Compress (remove redundancy) → Encrypt (scramble) → Encode (add redundancy) → Channel → r → Decode → Decrypt → Decompress → Message]


slide-47
SLIDE 47

Communications essentials: Shannon Entropy

Uncertainty

m and r are stochastic variables (drawn at random from a distribution). How much uncertainty is there about the message m?

Uncertainty is measured by entropy: H(m) before any message is received; H(m|r), the conditional entropy, after receipt of r.

Mutual information is derived from entropy:

I(m; r) = H(m) − H(m|r)

I(m; r) is the amount of information contained in r about m, and I(m; r) = I(r; m).

slide-50
SLIDE 50

Communications essentials: Shannon Entropy

Shannon entropy

Definition: for a random variable X taking values x in a set X,

H_q(X) = − Σ_{x ∈ X} Pr(X = x) log_q Pr(X = x)

Usually q = 2, giving entropy in bits; q = e (natural logarithm) gives entropy in nats.

If Pr(X = x_i) = p_i for x_1, x_2, . . . ∈ X, we write H(X) = h(p_1, p_2, . . .).

Example: one question Q where Yes/No has 50-50 probability:

H(Q) = −2 · (1/2) log_2(1/2) = 1
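The definition above transcribes directly into code; a minimal sketch (not part of the slides), with the slide's 50-50 question as the first check:

```python
from math import log2

def entropy(probs):
    """H(X) = - sum_x Pr(X = x) * log2 Pr(X = x), in bits (q = 2)."""
    return -sum(p * log2(p) for p in probs if p > 0)

# The slide's example: one yes/no question Q with 50-50 probability.
assert abs(entropy([0.5, 0.5]) - 1.0) < 1e-12   # H(Q) = 1 bit
assert entropy([1.0]) == 0.0                    # certain outcome: no uncertainty
assert abs(entropy([0.25] * 4) - 2.0) < 1e-12   # uniform over 4 outcomes: 2 bits
```

The `if p > 0` guard implements the usual convention 0 log 0 = 0, which is what makes h continuous in each p_i.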

slide-54
SLIDE 54

Communications essentials: Shannon Entropy

Example

Alice has a 1-bit message m, with 50-50 distribution. The entropy (Bob's uncertainty) is H(m) = 1.

The binary symmetric channel has an error rate of 25%, i.e. a 25% risk that Alice's bit is flipped.

Alice's uncertainty about the received message is

H(r|m = 1) = H(r|m = 0) = −0.25 log 0.25 − 0.75 log 0.75 ≈ 0.811
H(r|m) = 0.5 H(r|m = 0) + 0.5 H(r|m = 1) ≈ 0.811

The information received by Bob is

I(m; r) = H(m) − H(m|r) = H(r) − H(r|m) = 1 − 0.811 = 0.189

What if the error rate is 50%? Or 10%?
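The slide's numbers, and its two closing questions, can be checked with a few lines (a sketch, not part of the slides; since m is uniform and the channel is symmetric, r is uniform too, so H(r) = 1):

```python
from math import log2

def h(probs):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

p = 0.25                            # the slide's channel error rate
H_r_given_m = h([p, 1 - p])         # H(r|m), approx. 0.811
I = h([0.5, 0.5]) - H_r_given_m     # I(m; r) = H(r) - H(r|m)

assert abs(H_r_given_m - 0.811) < 1e-3
assert abs(I - 0.189) < 1e-3

# The closing questions: at 50% error the channel carries nothing;
# at 10% error it carries more than at 25%.
assert 1 - h([0.5, 0.5]) == 0               # I = 0 when p = 0.5
assert abs((1 - h([0.1, 0.9])) - 0.531) < 1e-3
```

So a noisier channel does not just add errors: it strictly reduces the mutual information, down to zero at a 50% flip rate, where r is independent of m.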

slide-55
SLIDE 55

Communications essentials: Shannon Entropy

Shannon entropy: properties

1. Additivity: if X and Y are independent, then H(X, Y) = H(X) + H(Y). If you are uncertain about two completely different questions, the entropy is the sum of the uncertainties for each question.

2. If X is uniformly distributed, then H(X) increases when the size of X increases. The more possibilities, the more uncertainty.

3. Continuity: h(p_1, p_2, . . .) is continuous in each p_i.

Shannon entropy is a measure in the mathematical sense.
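Property 1 is easy to verify numerically (my own check, not from the slides): for independent X and Y the joint distribution is the product distribution, and its entropy is the sum of the marginal entropies.

```python
from itertools import product
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

X = [0.5, 0.5]          # a fair coin
Y = [0.25, 0.75]        # a biased coin, independent of X
joint = [px * py for px, py in product(X, Y)]   # independence: Pr = Pr_X * Pr_Y

assert abs(entropy(joint) - (entropy(X) + entropy(Y))) < 1e-12
```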

slide-56
SLIDE 56

Communications essentials: Shannon Entropy

What it tells us

Consider a message X of entropy k = H(X) (in bits). The average size of a file F describing X is at least k bits. If the size of F is exactly k bits on average, then we have found a perfect compression of X: each message bit contains one bit of information on average.

slide-58
SLIDE 58

Communications essentials: Shannon Entropy

A trivial example

A single bit may contain more than one bit of information, e.g. in image compression:

0: Mona Lisa
10: Lenna
110: Baboon
11100: Peppers
11101: Che Guevara
11110: F-16
11111. . . : other images

However, on average, the maximum information in one bit is one bit (most of the time it is less).

The example is based on Huffman coding.
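Huffman coding itself is compact enough to sketch here. The slides name the technique but give no code; this is a standard implementation of my own, which assigns short codewords to frequent symbols, just as "Mona Lisa" gets the single bit 0 in the example above.

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a prefix-free code from a {symbol: frequency} mapping."""
    # Heap entries: (frequency, unique tie-breaker, partial code table).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)      # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}   # left branch: 0
        merged.update({s: "1" + b for s, b in c2.items()})  # right: 1
        heapq.heappush(heap, (f1 + f2, tick, merged))
        tick += 1
    return heap[0][2]

code = huffman_code(Counter("aaaabbc"))      # 'a' is most frequent
assert len(code["a"]) < len(code["c"])       # frequent symbol, shorter codeword
assert len(set(code.values())) == len(code)  # all codewords distinct
```

The average codeword length of such a code sits between H(X) and H(X) + 1 bits, which is why the one-bit codeword for the most probable image in the slide's list is consistent with the "at least k bits on average" bound.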


slide-67
SLIDE 67

Communications essentials: Security

Cryptography

Alice sends a ciphertext to Bob: m → c → m̂. Eve seeks information about m by observing c.

If I(m; c) > 0 then Eve succeeds in theory; likewise for the key k if I(k; c) > 0.

If H(m|c) = H(m) then the system is absolutely secure.

These are strong statements: even if Eve has information, I(m; c) > 0, she may be unable to make sense of it.
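The condition H(m|c) = H(m) is achieved by the one-time pad; here is a minimal check for a 1-bit message (my own illustration, not from the slides): with c = m XOR k and a uniform key, the ciphertext distribution is the same whatever the message, so observing c tells Eve nothing.

```python
def cipher_dist(m):
    """Distribution of c = m XOR k over a uniform 1-bit key k."""
    counts = {0: 0, 1: 0}
    for k in (0, 1):                 # each key value has probability 1/2
        counts[m ^ k] += 1
    return {c: n / 2 for c, n in counts.items()}

# Same ciphertext distribution for both messages:
assert cipher_dist(0) == cipher_dist(1) == {0: 0.5, 1: 0.5}
# Hence Pr(m|c) = Pr(m), so H(m|c) = H(m) and I(m; c) = 0: absolute security.
```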

slide-68
SLIDE 68

Communications essentials: Security

Steganalysis

Question: does Alice send secret information to Bob? The answer is X ∈ {yes, no}; what is the uncertainty H(X)?

Eve intercepts a message S. Is there any information I(X; S)?

If H(X|S) = H(X), then the system is absolutely secure.

slide-73
SLIDE 73

Communications essentials Prediction

Outline

1. Communications essentials: Communications and Redundancy; Anderson and Petitcolas 1999; Digital Communications; Shannon Entropy; Security; Prediction

2. Compression: Huffman Coding; Huffman Steganography

3. Miscellanea: Synthesis by Grammar; Redundancy in Images

SLIDE 74

Communications essentials Prediction

Random sequences

Text is a sequence of random samples (letters)

(l1, l2, l3, . . .); li ∈ A = {A, B, . . . , Z}

Each letter has a probability distribution P(l), l ∈ A. Statistical dependence (which implies redundancy):

P(li|li−1) ≠ P(li), so H(li|li−1) < H(li): letter i − 1 contains information about li. Use this information to guess li.

The more letters li−j, . . . , li−1 we have seen,

the more reliably we can predict li.

Wayner (Ch 6.1) gives examples of first-, second-, . . . , fifth-order prediction,

using j = 0, 1, 2, 3, 4.
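The tabulate-and-guess idea can be sketched for first-order prediction (one preceding letter); the sample string and function names below are invented for illustration:

```python
# First-order prediction: tabulate P(l_i | l_{i-1}) from a sample text
# and guess the most likely successor of a given letter.
from collections import Counter, defaultdict

def successor_table(text):
    """Count, for each letter, how often each successor follows it."""
    table = defaultdict(Counter)
    for prev, cur in zip(text, text[1:]):
        table[prev][cur] += 1
    return table

def predict(table, prev):
    """Guess the most frequent successor of `prev` seen so far."""
    return table[prev].most_common(1)[0][0]

sample = "the theory of the thing"
table = successor_table(sample)
print(predict(table, "t"))  # 'h': every 't' in the sample is followed by 'h'
```

Higher-order prediction replaces the single `prev` letter with the last j letters, exactly as on the following slides.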

SLIDE 75

Communications essentials Prediction

First-order prediction

Example from Wayner

SLIDE 76

Communications essentials Prediction

Second-order prediction

Example from Wayner

SLIDE 77

Communications essentials Prediction

Third-order prediction

Example from Wayner

SLIDE 78

Communications essentials Prediction

Fourth-order prediction

Example from Wayner

SLIDE 79

Communications essentials Prediction

Markov models

A Markov source is a sequence M1, M2, . . .

of stochastic (random) variables.

An n-th order Markov source is

completely described by the probability distributions

P[M1, M2, . . . , Mn] and P[Mi|Mi−n, . . . , Mi−1] (identical for all i)

This is a finite-state machine (automaton). The state of the source,

the last n symbols Mi−n, . . . , Mi−1, determines the probability distribution of the next symbol.

The random texts from Wayner are generated using

1st, 2nd, 3rd, and 4th order Markov models.
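An order-n Markov generator along these lines, as a minimal sketch; the sample text, seed, and order are illustrative choices:

```python
# n-th order Markov text generator: the state is the last n characters,
# and the next character is drawn from the conditional distribution
# observed in a sample text.
import random
from collections import Counter, defaultdict

def build_model(text, n):
    """Tabulate next-character counts given the previous n characters."""
    model = defaultdict(Counter)
    for i in range(len(text) - n):
        model[text[i:i + n]][text[i + n]] += 1
    return model

def generate(model, seed, length, rng):
    """Grow text from `seed`, sampling each next character from the
    distribution attached to the current n-character state."""
    out, n = seed, len(seed)
    while len(out) < length:
        counts = model.get(out[-n:])
        if not counts:              # unseen state: stop early
            break
        symbols, weights = zip(*counts.items())
        out += rng.choices(symbols, weights)[0]
    return out

sample = "abracadabra abracadabra"
model = build_model(sample, 2)
text = generate(model, "ab", 10, random.Random(0))
```

Every (n+1)-gram of the output occurs in the sample text, which is why higher-order models look progressively more natural.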

SLIDE 80

Communications essentials Prediction

A related example

A group of MIT students wrote

software generating random ‘science’ papers; a random paper was accepted for WMSCI 2005.

You can generate your own paper on-line:

http://pdos.csail.mit.edu/scigen/

Source code available (SCIgen). If you are brave – as a poster topic –

modify SCIgen for steganography.

Or maybe for your dissertation,

if you have a related topic you can tweak.

SLIDE 81

Compression Huffman Coding

Outline

1. Communications essentials: Communications and Redundancy; Anderson and Petitcolas 1999; Digital Communications; Shannon Entropy; Security; Prediction

2. Compression: Huffman Coding; Huffman Steganography

3. Miscellanea: Synthesis by Grammar; Redundancy in Images

SLIDE 82

Compression Huffman Coding

Compression

F∗ is the set of binary strings of arbitrary length (F = {0, 1}).

Definition: A compression system is a function c : F∗ → F∗ such that E(length(m)) > E(length(c(m))) when m is drawn from F∗. The compressed string is expected to be shorter than the original.

Definition: A compression c is perfect if all target strings are used, i.e. if for any m ∈ F∗, c−1(m) is a sensible file (cover-text). Decompress a random string, and it makes sense!

SLIDE 83

Compression Huffman Coding

Huffman Coding

Short codewords for frequent quantities; long codewords for unusual quantities. Each code symbol (bit) should be equally probable.

[Figure: Huffman tree for three symbols with probabilities 50%, 25%, 25%]
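The construction can be sketched by repeatedly merging the two least probable nodes; the three-symbol distribution matches the 50%/25%/25% example, while the symbol names and helper function are invented:

```python
# Huffman construction: merge the two least probable nodes until one
# tree remains; prepend '0'/'1' to the codewords of each merged branch.
import heapq

def huffman_code(probs):
    """Return {symbol: codeword} for a {symbol: probability} dict."""
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)             # tie-breaker so dicts are never compared
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, counter, merged))
        counter += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.25})
print(sorted(len(w) for w in code.values()))  # [1, 2, 2]
```

The frequent symbol gets the one-bit codeword, exactly as the slide prescribes.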

SLIDE 84

Compression Huffman Coding

Example

[Figure: worked Huffman tree example, combining symbol probabilities (25%, 12 1/2%, 7 1/4%, . . . ) into a code]

SLIDE 85

Compression Huffman Coding

Decoding

Huffman codes are prefix-free:

no codeword is the prefix of another. This simplifies the decoding.

This is expressed in the Huffman tree:

follow edges for each coded bit; (only) a leaf node resolves to a message symbol.

When a message symbol is recovered, start over for the next symbol.

SLIDE 86

Compression Huffman Coding

Ideal Huffman code

Each branch equally likely: P(bi|bi−1, bi−2, . . .) = 1/2. Maximum entropy H(Bi|Bi−1, Bi−2, . . .) = 1.

A uniform distribution of compressed files implies perfect compression.

In practice, the probabilities are rarely powers of 1/2,

hence the Huffman code is imperfect.
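For a dyadic distribution (all probabilities exact powers of 1/2) the code is ideal: the expected codeword length equals the entropy. A quick check on the 50/25/25 example, with an assumed codeword assignment:

```python
# Entropy vs expected codeword length for a dyadic distribution.
from math import log2

probs = {"a": 0.5, "b": 0.25, "c": 0.25}
code = {"a": "0", "b": "10", "c": "11"}

h = -sum(p * log2(p) for p in probs.values())
avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(h, avg_len)  # both 1.5 bits per symbol
```

With non-dyadic probabilities, avg_len strictly exceeds h, which is the imperfection the slide refers to.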

SLIDE 87

Compression Huffman Steganography

Outline

1. Communications essentials: Communications and Redundancy; Anderson and Petitcolas 1999; Digital Communications; Shannon Entropy; Security; Prediction

2. Compression: Huffman Coding; Huffman Steganography

3. Miscellanea: Synthesis by Grammar; Redundancy in Images

SLIDE 88

Compression Huffman Steganography

Reverse Huffman

Core Reading Peter Wayner: Disappearing Cryptography Ch. 6-7

Use a Huffman code for each state in the Markov model.

Stegano-encoder: Huffman decompression. Stegano-decoder: Huffman compression.

Is this similar to Anderson & Petitcolas’

Steganography by Perfect Compression?
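A minimal sketch of the encoder/decoder pair, assuming a single fixed code table in place of the per-state tables of a Markov model; the letters and codewords are invented:

```python
# Reverse Huffman: the stego-encoder *decompresses* message bits into
# cover letters; the stego-decoder compresses the letters back to bits.
code = {"e": "0", "t": "10", "a": "11"}        # toy per-state table
inverse = {w: s for s, w in code.items()}

def stego_encode(bits):
    """Huffman decompression: message bits -> cover letters."""
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

def stego_decode(letters):
    """Huffman compression: cover letters -> message bits."""
    return "".join(code[l] for l in letters)

secret = "110100"
cover = stego_encode(secret)
print(cover)  # 'aete'
assert stego_decode(cover) == secret
```

In the full scheme the table would be swapped after every output letter, according to the current state of the Markov model, so the letters follow the statistics of natural text.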

SLIDE 89

Compression Huffman Steganography

The Steganogram

The steganogram looks like random text:

use a probability distribution based on sample text; higher-order statistics make it look natural.

Fifth-order statistics are reasonable; higher order will look more natural.

SLIDE 91

Compression Huffman Steganography

Example

Fifth order

For each 5-tuple of letters A0, A1, A2, A3, A4:

let li−4, . . . , li be consecutive letters in natural text; tabulate P(li = A0 | li−j = Aj, j = 1, 2, 3, 4).

For each 4-tuple A1, A2, A3, A4,

make an (approximate) Huffman code for A0.

We may omit some values of A0,

or have non-unique codewords.

We encode a message by Huffman decompression,

using the Huffman code depending on the last four stegogramme symbols,

obtaining a fifth-order random text.

SLIDE 92

Compression Huffman Steganography

Example

Fifth order

Consider the four preceding letters ‘comp’. The next letter may be:

letter        r     e     l     a     o
probability  40%   12%   22%   18%    8%
combined       r/e: 52%    l: 22%   a/o: 26%
rounded            50%       25%        25%

Rounding to powers of 1/2.

Combining several letters reduces the rounding error.

The example is arbitrary and fictitious.

SLIDE 93

Compression Huffman Steganography

Example

The Huffman code

Huffman code based on fifth-order conditional probabilities, e.g.

r/e ↦ 0   l ↦ 10   a/o ↦ 11

When two letters are possible,

choose at random (according to their probabilities in natural text): decoding (compression) is still unique; encoding (decompression) is not unique.

This evens out the statistics in the stegogramme.
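The random choice among combined letters can be sketched as follows, using the letter groups and the (fictitious) weights from the example; function names are illustrative:

```python
# Combined-letter Huffman stego: r/e share codeword 0 and a/o share 11.
# The encoder picks one of the tied letters at random; decoding
# (the compression direction) is still unique.
import random

groups = {"0": ["r", "e"], "10": ["l"], "11": ["a", "o"]}
weights = {"r": 40, "e": 12, "l": 22, "a": 18, "o": 8}
letter_to_bits = {l: w for w, ls in groups.items() for l in ls}

def encode(bits, rng):
    """Huffman 'decompression' with a weighted random pick among ties."""
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in groups:
            letters = groups[buf]
            out.append(rng.choices(letters, [weights[l] for l in letters])[0])
            buf = ""
    return "".join(out)

def decode(letters):
    """Compression direction: each letter maps to exactly one codeword."""
    return "".join(letter_to_bits[l] for l in letters)

msg = "01011"
cover = encode(msg, random.Random(1))
assert decode(cover) == msg
```

Sampling the tied letters in proportion to their natural frequencies is what evens out the letter statistics of the stegogramme.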

SLIDE 94

Miscellanea Synthesis by Grammar

Outline

1. Communications essentials: Communications and Redundancy; Anderson and Petitcolas 1999; Digital Communications; Shannon Entropy; Security; Prediction

2. Compression: Huffman Coding; Huffman Steganography

3. Miscellanea: Synthesis by Grammar; Redundancy in Images

SLIDE 95

Miscellanea Synthesis by Grammar

Grammar

A grammar describes the structure of a language. A simple grammar:

sentence → noun verb
noun → Mr. Brown | Miss Scarlet
verb → eats | drinks

Each choice can map to a message symbol:

0 : Mr. Brown, eats
1 : Miss Scarlet, drinks

With this mapping, two possible messages can be stego-encoded. No cover-text is input.
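The bit-per-choice idea can be sketched as follows; the grammar is the simple one above, while the function names and the one-sentence-per-line framing are illustrative assumptions:

```python
# Grammar-based synthesis: each production choice encodes one message
# bit, so every generated sentence carries two bits (noun + verb).
NOUNS = ["Mr. Brown", "Miss Scarlet"]
VERBS = ["eats", "drinks"]

def encode(bits):
    """Consume two bits per line: one noun choice, one verb choice.
    Assumes an even number of message bits."""
    lines = []
    for i in range(0, len(bits), 2):
        lines.append(f"{NOUNS[int(bits[i])]} {VERBS[int(bits[i + 1])]}")
    return "\n".join(lines)

def decode(text):
    """Recover the bits by looking up which alternative was chosen."""
    bits = ""
    for line in text.splitlines():
        noun, verb = line.rsplit(" ", 1)
        bits += str(NOUNS.index(noun)) + str(VERBS.index(verb))
    return bits

stego = encode("1001")
assert decode(stego) == "1001"
```

Capacity grows with the number of alternatives per production: k equally likely choices carry log2(k) bits.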

SLIDE 96

Miscellanea Synthesis by Grammar

More complex grammar

sentence → noun verb addition
noun → Mr. Brown | Miss Scarlet | . . . | Mrs. White
verb → eats | drinks | celebrates | . . . | cooks
addition → addition term | ∅
term → on Monday | in March | with Mr. Green | . . . | in Alaska | at home
general → sentence | question
question → Does noun verb addition ?
xgeneral → general | sentence, because sentence

SLIDE 99

Miscellanea Synthesis by Grammar

Is this practical?

Exercise

Choose either the reverse Huffman or the grammar-based steganography technique, and write a short critique (approx. 1 page) where you answer some of the following questions:

How can you do steganalysis?

Under what conditions will it be secure?

Is the system practical? Useful?

Which implementation issues do you foresee? How could it be implemented?

Could the technique extend to images?

SLIDE 100

Miscellanea Redundancy in Images

Outline

1. Communications essentials: Communications and Redundancy; Anderson and Petitcolas 1999; Digital Communications; Shannon Entropy; Security; Prediction

2. Compression: Huffman Coding; Huffman Steganography

3. Miscellanea: Synthesis by Grammar; Redundancy in Images

SLIDE 101

Miscellanea Redundancy in Images

Returning to Images

Communications deals with abstract and discrete data:

arbitrary bit strings.

How does redundancy apply to images? Some say that the LSB is redundant,

e.g. changing it does not change the semantics; however, the LSB cannot be reconstructed from the other bits.

Characters removed from English text can be reconstructed. I.e. the LSBs contain little, but still some, information:

any value in the LSB would be meaningful/valid.
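For concreteness, LSB embedding and extraction can be sketched on invented 8-bit sample values:

```python
# LSB steganography: overwrite each pixel's least significant bit with
# a message bit, then read the LSBs back out.
def embed(pixels, bits):
    """Clear the LSB and set it to the corresponding message bit."""
    return [(p & ~1) | int(b) for p, b in zip(pixels, bits)]

def extract(pixels, n):
    """Read the first n LSBs as a bit string."""
    return "".join(str(p & 1) for p in pixels[:n])

cover = [12, 200, 57, 33]           # original LSBs: 0, 0, 1, 1
stego = embed(cover, "1011")
assert extract(stego, 4) == "1011"  # message recovered
# Each pixel changes by at most 1, yet the original LSBs are gone:
# they cannot be reconstructed from the stego image.
```

This is exactly the point of the slide: any LSB value is plausible, so the change is invisible, but the overwritten bits did carry (a little) information.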

SLIDE 107

Miscellanea Redundancy in Images

Lossy and loss-less compression

The Huffman code is loss-less:

decompression restores the original exactly.

How does image processing work? Lossy, i.e. information is irrevocably lost:

down-sampling (reducing resolution), reducing colour depth (e.g. discarding the LSB), and similar approaches in the transform domain.

Loss-less image compression is still possible,

but the loss/compression trade-off favours lossy compression.
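The irreversibility of a lossy step can be illustrated by discarding the LSB; the sample values and helper names are invented:

```python
# Lossy step: halving the colour depth maps two distinct pixel values
# to the same reduced value, so no restoration can recover the input.
def drop_lsb(pixels):
    return [p >> 1 for p in pixels]      # 1 bit per pixel is gone

def best_restore(pixels):
    return [p << 1 for p in pixels]      # have to guess LSB = 0

cover = [100, 101, 37]
reduced = drop_lsb(cover)
assert reduced[0] == reduced[1]          # 100 and 101 collide
assert best_restore(reduced) != cover    # information irrevocably lost
```

Contrast this with the Huffman examples earlier, where every encode/decode round trip was exact.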

SLIDE 110

Miscellanea Redundancy in Images

LSB in English text

Thought experiment

Hide in redundant characters in English prose,

analogously to the LSB.

Would it work? Why (not)? The correct character can be predicted:

spelling mistakes would be suspicious.

Would it work using

spelling mistakes in an MSc dissertation by an overseas student?
