

SLIDE 1

being unpredictable

Examples where diversity is beneficial as a secondary goal

  • Games of strategy, e.g., “balancing your range” in poker
  • Construction of investment portfolios (hedging)
  • Evolution (genetic diversity)

The utility of imagining an adversary…

SLIDE 2

being simplistic

Examples where simplicity is beneficial as a secondary goal

Occam’s razor: “Among competing hypotheses, the one with the fewest assumptions should be selected.”

Jaynes’ principle of maximum entropy: “Given some data, among all hypothetical probability distributions that agree with the data, the one of maximum entropy best represents the current state of knowledge.”

SLIDE 3

a measure of unpredictability

The Shannon entropy

If $Y$ is a random variable taking values in a finite state space $\Omega$, we define the Shannon entropy of $Y$ by

$$I(Y) := \sum_{y \in \Omega} \mathbb{P}[Y = y] \log \frac{1}{\mathbb{P}[Y = y]}$$

(with the convention that $0 \log 0 = 0$). Also, we will use “log” for the base-2 logarithm, except when we use it for the natural logarithm…

  • If $Y$ denotes a random message from some distribution, then the average number of bits needed to communicate (or compress) $Y$ is $\approx I(Y)$.
  • English text has between 0.6 and 1.3 bits of entropy per character.
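As a quick sanity check of the definition, here is a minimal Python sketch (the helper name `shannon_entropy` and the example distributions are our illustrative choices, not from the slides):

```python
import math

def shannon_entropy(probs):
    """I(q) = sum over y of q(y) * log2(1 / q(y)), with the convention 0 * log(1/0) = 0."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([1.0, 0.0]))   # deterministic outcome: 0.0 bits
print(shannon_entropy([0.25] * 4))   # uniform over 4 outcomes: 2.0 bits
```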

SLIDE 4

a measure of unpredictability

The Shannon entropy

If $Y$ is a random variable taking values in a finite state space $\Omega$, we define the Shannon entropy of $Y$ by

$$I(Y) := \sum_{y \in \Omega} \mathbb{P}[Y = y] \log \frac{1}{\mathbb{P}[Y = y]}$$

(with the convention that $0 \log 0 = 0$). Also, we will use “log” for the base-2 logarithm, except when we use it for the natural logarithm…

The probability mass function of $Y$ is given by $q(y) = \mathbb{P}[Y = y]$. We will also write $I(q)$. Important fact: $I$ is a concave function of $q$.

SLIDE 5

a measure of unpredictability

$$I(q) := \sum_{y \in \Omega} q(y) \log \frac{1}{q(y)}$$

$I$ is a strictly concave function of $q$.
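A small numerical illustration of strict concavity, reusing the `shannon_entropy` helper sketched above; the distributions $p$, $q$ and the mixing weight are arbitrary choices for the check:

```python
import math

def shannon_entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

p, q, lam = [0.9, 0.1], [0.2, 0.8], 0.5
mixture = [lam * a + (1 - lam) * b for a, b in zip(p, q)]  # [0.55, 0.45]

# Strict concavity: I(lam*p + (1-lam)*q) > lam*I(p) + (1-lam)*I(q)
# whenever p != q and 0 < lam < 1.
lhs = shannon_entropy(mixture)                                   # ~0.99 bits
rhs = lam * shannon_entropy(p) + (1 - lam) * shannon_entropy(q)  # ~0.60 bits
assert lhs > rhs
```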

SLIDE 6

examples

The Shannon entropy

If $Y$ is a random variable taking values in a finite state space $\Omega$, we define the Shannon entropy of $Y$ by

$$I(Y) := \sum_{y \in \Omega} \mathbb{P}[Y = y] \log \frac{1}{\mathbb{P}[Y = y]}$$

Outcome of a presidential poll vs. outcome of a fair coin flip.
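To make the comparison concrete (the 90/10 split for the poll is an illustrative assumption, not a figure from the slides):

```python
import math

def shannon_entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin flip: exactly 1 bit
print(shannon_entropy([0.9, 0.1]))   # lopsided 90/10 poll: ~0.47 bits, far more predictable
```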

SLIDE 7

examples

If 𝑌 is a random variable taking values in a finite state space Ω, we define the Shannon entropy of 𝒀 by 𝐼 𝑌 ≔ ෍

𝑦∈Ω

ℙ 𝑌 = 𝑦 log 1 ℙ 𝑌 = 𝑦 Suppose there are 𝑜 possible outcomes Ω = 1, 2, … , 𝑜 . What’s the maximum entropy of 𝑌?
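A sketch of the answer via the standard Jensen’s-inequality argument, using the concavity of $\log$ (this derivation is our addition, not spelled out on the slide):

$$I(Y) = \sum_{y=1}^{n} \mathbb{P}[Y = y] \log \frac{1}{\mathbb{P}[Y = y]} \;\le\; \log\left(\sum_{y=1}^{n} \mathbb{P}[Y = y] \cdot \frac{1}{\mathbb{P}[Y = y]}\right) = \log n,$$

with equality exactly when $Y$ is uniform on $\Omega$. So the maximum entropy is $\log n = \log_2 n$ bits.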

SLIDE 8

second law of thermodynamics

The universe is maximizing entropy

SLIDE 9

two applications today

  • Part I: Entropy to encourage simplicity: Matrix scaling
  • Part II: Entropy to encourage diversification: Caching and paging