Examples where diversity is beneficial as a secondary goal


1. Being unpredictable
Examples where diversity is beneficial as a secondary goal:
o Games of strategy, e.g., “balancing your range” in poker
o Construction of investment portfolios (hedging)
o Evolution (genetic diversity)
The utility of imagining an adversary…

2. Being simplistic
Examples where simplicity is beneficial as a secondary goal:
o Occam’s razor: “Among competing hypotheses, the one with the fewest assumptions should be selected.”
o Jaynes’ principle of maximum entropy: “Given some data, among all hypothetical probability distributions that agree with the data, the one of maximum entropy best represents the current state of knowledge.”
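As a concrete illustration of Jaynes’ principle (my own sketch, not from the slides; assumes NumPy and SciPy): given only that a die’s average roll is 4.5, pick the distribution on {1, …, 6} of maximum entropy consistent with that observation.

```python
import numpy as np
from scipy.optimize import minimize

def neg_entropy(p):
    # Negative Shannon entropy in bits, with the 0*log(0) = 0 convention.
    p = np.asarray(p)
    nz = p[p > 0]
    return float(np.sum(nz * np.log2(nz)))

faces = np.arange(1, 7)
constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},          # p is a pmf
    {"type": "eq", "fun": lambda p: np.dot(faces, p) - 4.5},   # observed mean
]
res = minimize(neg_entropy, x0=np.full(6, 1 / 6), bounds=[(0, 1)] * 6,
               constraints=constraints, method="SLSQP")
print(np.round(res.x, 4))  # weights tilt exponentially toward higher faces
```

The maximizer is the least-committal distribution matching the data; it comes out as a Gibbs (exponential-family) distribution rather than, say, a point mass on 4 and 5.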

3. A measure of unpredictability
The Shannon entropy
If $Y$ is a random variable taking values in a finite state space $\Omega$, we define the Shannon entropy of $Y$ by
$$I(Y) := \sum_{y \in \Omega} \mathbb{P}[Y = y] \log \frac{1}{\mathbb{P}[Y = y]}$$
(with the convention that $0 \log 0 = 0$). Also, we will use “log” for the base-2 logarithm, except when we use it for the natural logarithm…
- If $Y$ denotes a random message from some distribution, then the average number of bits needed to communicate (or compress) $Y$ is $\approx I(Y)$.
- English text has between 0.6 and 1.3 bits of entropy per character.
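A minimal sketch of this definition in code (my own illustration; assumes NumPy, and the helper name entropy_bits is invented here):

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy I(p) in bits of a pmf p, with 0*log(0) = 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]                     # drop zero-probability outcomes
    return float(np.sum(nz * np.log2(1.0 / nz)))

print(entropy_bits([0.5, 0.5]))      # fair coin: 1.0 bit
print(entropy_bits([1.0, 0.0]))      # deterministic outcome: 0.0 bits
```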

4. A measure of unpredictability
The Shannon entropy
If $Y$ is a random variable taking values in a finite state space $\Omega$, we define the Shannon entropy of $Y$ by
$$I(Y) := \sum_{y \in \Omega} \mathbb{P}[Y = y] \log \frac{1}{\mathbb{P}[Y = y]}$$
(with the convention that $0 \log 0 = 0$).
The probability mass function of $Y$ is given by $q(y) = \mathbb{P}[Y = y]$. We will also write $I(q)$.
Important fact: $I$ is a concave function of $q$.

5. A measure of unpredictability
$$I(q) := \sum_{y \in \Omega} q(y) \log \frac{1}{q(y)}$$
$I$ is a strictly concave function of $q$.
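A quick numerical spot-check of strict concavity (my own sketch): for pmfs $p \neq q$ and $0 < \lambda < 1$, the entropy of the mixture $\lambda p + (1-\lambda)q$ strictly exceeds the mixed entropies $\lambda I(p) + (1-\lambda) I(q)$.

```python
import numpy as np

def entropy_bits(p):
    nz = p[p > 0]
    return float(np.sum(nz * np.log2(1.0 / nz)))

rng = np.random.default_rng(0)
for _ in range(5):
    # two random pmfs on a 4-point space and a random mixing weight
    p = rng.dirichlet(np.ones(4))
    q = rng.dirichlet(np.ones(4))
    lam = rng.uniform(0.01, 0.99)
    mix = lam * p + (1 - lam) * q
    # strict inequality, since p != q almost surely
    assert entropy_bits(mix) > lam * entropy_bits(p) + (1 - lam) * entropy_bits(q)
print("strict concavity held on all sampled mixtures")
```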

6. Examples
If $Y$ is a random variable taking values in a finite state space $\Omega$, we define the Shannon entropy of $Y$ by
$$I(Y) := \sum_{y \in \Omega} \mathbb{P}[Y = y] \log \frac{1}{\mathbb{P}[Y = y]}$$
Outcome of a presidential poll vs. outcome of a fair coin flip.
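To put numbers on the comparison (the 90/10 split for the poll is my assumption, not from the slides): a lopsided poll is far more predictable than a fair coin, so its entropy is lower.

```python
import numpy as np

def entropy_bits(p):
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(np.sum(nz * np.log2(1.0 / nz)))

print(entropy_bits([0.9, 0.1]))   # lopsided poll: ~0.469 bits
print(entropy_bits([0.5, 0.5]))   # fair coin flip: exactly 1 bit
```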

7. Examples
If $Y$ is a random variable taking values in a finite state space $\Omega$, we define the Shannon entropy of $Y$ by
$$I(Y) := \sum_{y \in \Omega} \mathbb{P}[Y = y] \log \frac{1}{\mathbb{P}[Y = y]}$$
Suppose there are $n$ possible outcomes, $\Omega = \{1, 2, \ldots, n\}$. What’s the maximum entropy of $Y$?
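The slide leaves the question open; the standard answer (my addition, not from the slides) is $\log n$, attained exactly by the uniform distribution. Writing $q(y) = \mathbb{P}[Y = y]$ and summing over the support of $q$, Jensen’s inequality for the concave function $\log$ gives
$$I(Y) = \sum_{y:\, q(y) > 0} q(y) \log \frac{1}{q(y)} \;\le\; \log\!\left(\sum_{y:\, q(y) > 0} q(y) \cdot \frac{1}{q(y)}\right) = \log \bigl|\{y : q(y) > 0\}\bigr| \;\le\; \log n,$$
with equality throughout iff $q(y) = 1/n$ for all $y \in \Omega$.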

8. Second law of thermodynamics
The universe is maximizing entropy.

9. Two applications today
o Part I: Entropy to encourage simplicity: matrix scaling
o Part II: Entropy to encourage diversification: caching and paging
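The slides do not specify which matrix-scaling method Part I uses; the classic entropy-flavored one is Sinkhorn’s alternating normalization, which scales a positive matrix to prescribed row and column sums (and can be viewed as a relative-entropy projection). A minimal sketch, assuming that is the intended algorithm:

```python
import numpy as np

def sinkhorn_scale(A, iters=200):
    """Alternately normalize rows and columns of a positive matrix A
    until every row and column sums to 1 (doubly stochastic)."""
    A = np.array(A, dtype=float)
    for _ in range(iters):
        A /= A.sum(axis=1, keepdims=True)   # make row sums 1
        A /= A.sum(axis=0, keepdims=True)   # make column sums 1
    return A

B = sinkhorn_scale(np.random.rand(4, 4) + 0.1)  # strictly positive input
print(np.round(B.sum(axis=0), 6))               # all ~1
print(np.round(B.sum(axis=1), 6))               # all ~1
```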
