Application of Information Theory, Lecture 1
Basic Definitions and Facts
Iftach Haitner
Tel Aviv University.
October 28, 2014
Iftach Haitner (TAU) Application of Information Theory, Lecture 1 October 28, 2014 1 / 12
The entropy function

Definition (entropy): for a discrete random variable X with probability mass function p,

H(X) = −∑_x p(x) log p(x) = E_X[log(1/p(X))] = E_{Y=p(X)}[log(1/Y)]

(all logarithms are base 2).
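The definition translates directly into code; a minimal sketch (the helper name `entropy` is mine, not from the lecture):

```python
from math import log2

def entropy(p):
    """Shannon entropy of a probability vector p, in bits (log base 2)."""
    # Terms with p_i = 0 contribute 0, by the convention 0 * log(1/0) = 0.
    return sum(pi * log2(1 / pi) for pi in p if pi > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```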
Examples:

◮ P_X = (1/2, 1/4, 1/4), i.e., P_X(x_1) = 1/2, P_X(x_2) = 1/4, P_X(x_3) = 1/4:

H(X) = −(1/2) log(1/2) − (1/4) log(1/4) − (1/4) log(1/4) = 1/2 + (1/4)·2 + (1/4)·2 = 1 1/2.

Hence 1 1/2 bits suffice, on average, to describe a sample from (1/2, 1/4, 1/4).

◮ X uniform over {0,1}^n:

H(X) = −∑_{i=1}^{2^n} (1/2^n) log(1/2^n) = −log(1/2^n) = n.

◮ n bits are needed to describe X
◮ n bits are needed to create X

Binary case, Pr[X = 0] = p, Pr[X = 1] = q = 1 − p:
◮ H(X) = H(p, q) = −p log p − q log q
◮ H(1, 0) = H(0, 1) = 0
◮ H(1/2, 1/2) = 1
◮ h(p) := H(p, 1 − p) is continuous
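Both computations above can be checked numerically; a small sketch (the helper name `entropy` is mine):

```python
from math import log2

def entropy(p):
    # Shannon entropy in bits; terms with p_i = 0 contribute 0.
    return sum(pi * log2(1 / pi) for pi in p if pi > 0)

# H(1/2, 1/4, 1/4) = 1 1/2 bits.
print(entropy([1/2, 1/4, 1/4]))    # → 1.5

# Uniform over {0,1}^n has entropy n (here n = 8).
n = 8
print(entropy([1 / 2**n] * 2**n))  # → 8.0
```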
Grouping:

H(p_1, p_2, p_3, ..., p_m) = H(p_1 + p_2, p_3, ..., p_m) + (p_1 + p_2) · H(p_1/(p_1+p_2), p_2/(p_1+p_2))

Example: grouping the two 1/4-symbols, H(1/2, 1/4, 1/4) = H(1/2, 1/2) + (1/2) · H(1/2, 1/2) = 1 + 1/2 = 1 1/2.
Let S_k = ∑_{i=1}^{k} p_i. The grouping rule above reads

H(p_1, ..., p_m) = H(S_2, p_3, ..., p_m) + S_2 · H(p_1/S_2, p_2/S_2).

More generally, for every k:

H(p_1, ..., p_m) = H(S_k, p_{k+1}, ..., p_m) + S_k · H(p_1/S_k, ..., p_k/S_k)

Proof sketch: induction on k, applying the two-symbol grouping rule k − 1 times.
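The identity can be verified numerically; a sketch with an arbitrary distribution of my choosing (helper name `entropy` is mine):

```python
from math import log2

def entropy(p):
    return sum(pi * log2(1 / pi) for pi in p if pi > 0)

p = [0.5, 0.2, 0.2, 0.1]
k = 3
S = sum(p[:k])  # S_k = p_1 + ... + p_k
# H(p_1..p_m) = H(S_k, p_{k+1}..p_m) + S_k * H(p_1/S_k, ..., p_k/S_k)
lhs = entropy(p)
rhs = entropy([S] + p[k:]) + S * entropy([pi / S for pi in p[:k]])
print(abs(lhs - rhs) < 1e-12)  # → True
```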
More generally, partition {1, ..., m} into q consecutive blocks: let 1 = k_1 < k_2 < ... < k_q < k_{q+1} = m + 1, and let C_t = ∑_{i=k_t}^{k_{t+1}−1} p_i. Then

H(p_1, ..., p_m) = H(C_1, ..., C_q) + C_1 · H(p_{k_1}/C_1, ..., p_{k_2−1}/C_1) + ... + C_q · H(p_{k_q}/C_q, ..., p_m/C_q)

Example: for (1/3, 1/3, 1/3), grouping the first two symbols gives H(1/3, 1/3, 1/3) = H(2/3, 1/3) + (2/3) · H(1/2, 1/2) = h(1/3) + 2/3 (= log 3).
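A quick numeric check of the (1/3, 1/3, 1/3) grouping (helper name `entropy` is mine):

```python
from math import log2

def entropy(p):
    return sum(pi * log2(1 / pi) for pi in p if pi > 0)

# Group the first two symbols of (1/3, 1/3, 1/3).
lhs = entropy([1/3, 1/3, 1/3])  # = log 3
rhs = entropy([2/3, 1/3]) + (2/3) * entropy([1/2, 1/2])
print(abs(lhs - log2(3)) < 1e-12)  # → True
print(abs(lhs - rhs) < 1e-12)      # → True
```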
For X = (X_1, ..., X_n) with the X_i i.i.d. uniform over a 3-element set, H(X) = n log 3, and a block of n samples can be described with ⌈n log 3⌉ = ⌊n log 3⌋ + 1 bits, so the per-sample cost ⌊n log 3⌋/n tends to log 3.
For rational p, write p = k/m and q = (m−k)/m, where m is the smallest such denominator. Grouping the uniform distribution over m elements into a block of k elements and a block of m − k elements gives

log m = H(p, q) + p · log k + q · log(m − k),

and therefore H(p, q) = −p log p − q log q.
Similarly, for a rational distribution on three elements write p_1 = k_1/m, p_2 = k_2/m and p_3 = k_3/m, where m = k_1 + k_2 + k_3; grouping the uniform distribution over m elements into blocks of sizes k_1, k_2, k_3 gives

log m = H(p_1, p_2, p_3) + p_1 log k_1 + p_2 log k_2 + p_3 log k_3,

hence H(p_1, p_2, p_3) = −∑_i p_i log p_i.
Basic properties:

◮ H(p_1, ..., p_m) ≥ 0, with H(p_1, ..., p_m) = 0 for (p_1, ..., p_m) = (1, 0, ..., 0).
◮ H(p_1, ..., p_m) ≤ log m, with H(p_1, ..., p_m) = log m for (p_1, ..., p_m) = (1/m, ..., 1/m).

Jensen's inequality: if f is concave and λ_1, ..., λ_m ≥ 0 with ∑_i λ_i = 1, then

∑_i λ_i f(t_i) ≤ f(∑_i λ_i t_i).

Since (log x)'' = −1/x² is negative, log(x) is concave, and thus

H(p_1, ..., p_m) = ∑_i p_i log(1/p_i) ≤ log(∑_i p_i · (1/p_i)) = log m.

Equivalently, H(X) = E_X log(1/P_X(X)) ≤ log E_X[1/P_X(X)] = log m.
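The bound H ≤ log m can be sanity-checked on a random distribution; a sketch (seed, size, and helper name are my choices):

```python
from math import log2
import random

def entropy(p):
    return sum(pi * log2(1 / pi) for pi in p if pi > 0)

random.seed(0)
m = 5
w = [random.random() for _ in range(m)]
p = [wi / sum(w) for wi in w]  # normalize to a distribution

print(entropy(p) <= log2(m))                       # → True
print(abs(entropy([1/m] * m) - log2(m)) < 1e-12)   # → True (equality at uniform)
```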
Entropy of a function of X: for any function g, P_{g(X)}(y) = ∑_{x: g(x)=y} P_X(x). Since P_{g(X)}(y) ≥ P_X(x) for every x with g(x) = y,

H(g(X)) = −∑_y P_{g(X)}(y) log P_{g(X)}(y) ≤ −∑_y ∑_{x: g(x)=y} P_X(x) log P_X(x) = H(X),

with equality iff g is injective on the support of X.
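A numeric instance of H(g(X)) ≤ H(X), with a non-injective g of my choosing (helper names mine):

```python
from math import log2
from collections import defaultdict

def entropy(p):
    return sum(pi * log2(1 / pi) for pi in p if pi > 0)

# X over {0,1,2,3}; g maps x to x mod 2, so it merges outcomes.
px = {0: 0.5, 1: 0.25, 2: 0.125, 3: 0.125}
g = lambda x: x % 2
pgx = defaultdict(float)
for x, p in px.items():
    pgx[g(x)] += p  # P_{g(X)}(y) = sum of P_X(x) over x with g(x) = y

print(entropy(list(pgx.values())) <= entropy(list(px.values())))  # → True
```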