SLIDE 8 CS480/680 Winter 2020 Zahra Sheikhbahaee
Mutual Information
$$
\begin{aligned}
H(Y \mid X) - H(Y)
&= \sum_{x,y} P(X=x)\, P(Y=y \mid X=x) \log_2 \frac{1}{P(Y=y \mid X=x)} \;-\; \sum_{y} P(Y=y) \log_2 \frac{1}{P(Y=y)} \\
&= \sum_{x,y} P(X=x \cap Y=y) \log_2 \frac{P(Y=y)}{P(Y=y \mid X=x)} \\
&= \sum_{x,y} P(X=x \cap Y=y) \log_2 \frac{P(X=x)\, P(Y=y)}{P(X=x \cap Y=y)} \\
&\le \log_2 \Bigg[ \sum_{x,y} P(X=x \cap Y=y) \cdot \frac{P(X=x)\, P(Y=y)}{P(X=x \cap Y=y)} \Bigg] = \log_2 1 = 0
\end{aligned}
$$

where the inequality is Jensen's inequality applied to the concave function $\log_2$. Hence $H(Y \mid X) \le H(Y)$: conditioning on $X$ never increases the entropy of $Y$.
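The bounded quantity $\sum_{x,y} P(x \cap y) \log_2 \big[P(x)P(y)/P(x \cap y)\big]$ (which equals $-I(X;Y)$) can be checked numerically. A minimal sketch, using a small made-up joint distribution that is not from the slide:

```python
import math

# Hypothetical 2x2 joint distribution, joint[i][j] = P(X=i, Y=j).
joint = [[0.3, 0.2],
         [0.1, 0.4]]

# Marginals P(X=x) and P(Y=y).
px = [sum(row) for row in joint]
py = [sum(joint[i][j] for i in range(2)) for j in range(2)]

# sum_{x,y} P(x,y) * log2( P(x)P(y) / P(x,y) ), i.e. H(Y|X) - H(Y) = -I(X;Y).
neg_mi = sum(joint[i][j] * math.log2(px[i] * py[j] / joint[i][j])
             for i in range(2) for j in range(2)
             if joint[i][j] > 0)

print(neg_mi)  # non-positive, consistent with the Jensen bound above
```

Swapping in any valid joint table keeps `neg_mi` at or below zero, with equality exactly when the table factorizes as `px[i] * py[j]`.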
Definition: The mutual information of two random variables $X$ and $Y$, written $I(X; Y)$, is
$$I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = I(Y; X).$$
In the case that $X$ and $Y$ are independent, conditioning on $Y$ does not change the entropy of $X$, i.e. $H(X \mid Y) = H(X)$, so $I(X; Y) = H(X) - H(X \mid Y) = 0$.
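The definition above can be sketched directly from a joint probability table. A minimal illustration (the helper names and the example tables below are my own, not from the slide), computing $I(X;Y) = H(Y) - H(Y \mid X)$:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def mutual_information(joint):
    """I(X;Y) = H(Y) - H(Y|X) for a table joint[i][j] = P(X=i, Y=j)."""
    px = [sum(row) for row in joint]
    n_y = len(joint[0])
    py = [sum(row[j] for row in joint) for j in range(n_y)]
    # H(Y|X) = sum_x P(X=x) * H(Y | X=x), using the conditional row P(Y=. | X=x).
    h_y_given_x = sum(
        px[i] * entropy([joint[i][j] / px[i] for j in range(n_y)])
        for i in range(len(joint)) if px[i] > 0)
    return entropy(py) - h_y_given_x

# Dependent variables (hypothetical numbers): strictly positive I(X;Y).
print(mutual_information([[0.3, 0.2], [0.1, 0.4]]))

# Independent variables: the joint is the outer product of its marginals,
# so I(X;Y) = 0 (up to floating-point rounding).
print(mutual_information([[0.2, 0.3], [0.2, 0.3]]))
```

Because $I(X;Y) = I(Y;X)$, transposing the joint table should leave the result unchanged, which makes a convenient sanity check for the symmetry stated in the definition.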
University of Waterloo