Mathematics for Complex Systems: Overview and Motivation
The Objective Relativity of Complexity and Entropy
SHE Workshop, 19 October 2006


David Feldman
College of the Atlantic and the Santa Fe Institute
http://hornacek.coa.edu/dave/
Collaborator: Jim Crutchfield (UC Davis and SFI)
Thanks to: Carl McTague, Cosma Shalizi, Karl Young

• Complex systems pose a challenge for mathematics and the mathematical sciences.
• Can mathematics be used at all for such systems? Or are such systems simply too complex to be simplified via mathematics?
• Central premise: the abstractions of mathematics and mathematical models can be used to gain qualitative insight into complex systems.
• In my remarks I will focus on two questions:
  1. What is complexity?
  2. What does it mean to model?
• I hope to convince you that the first question cannot be answered without answering the second question.

Measurement and Modeling

[Figure: "nature" is probed by an instrument, the measurements pass through an encoder, and a symbol stream like ...adbck7d... reaches the observer. Source: Crutchfield, 1992.]

• On the left is "nature."
• The act of measurement projects this system down to a lower dimension.
• These measurements are discretized.
• The measurements may then be encoded or corrupted by noise.
• They then reach the observer on the right, who wishes to make inferences about "nature."

Why Complexity?

• Complexity is generally understood to be a measure of the difficulty of describing a thing or a process.
• There are many different contexts in which the term complexity is used:
  – Complexity as a measure of the difficulty of learning a pattern (Bialek et al., 2001)
  – Biological and ecological systems exhibit different levels of complexity and organization, which we can study
  – Complexity(?) in evolution (McShea, 1991)
  – Complexity as a measure of structure, pattern, or correlation
• I will focus on this last sort of complexity, but I think my general results extend to other types of complexity.

Modeling and Inference

[Figure: a three-state process (states A, B, C; transitions labeled with output symbols and probabilities Pr(s)) generates the binary stream ...001011101000... that flows from system to process to observer.]

• In very idealized form, the observer is faced with a long string of binary measurement data:
  ...01101011101010101110101101010111011101...
• What can the observer infer from this?
• The observer can determine the frequency (or probability) of occurrence of different sequences of 0s and 1s.
• Information theory gives us a way to measure properties of the sequence.

Shannon Entropy

• Any time we use a probability distribution, this indicates some uncertainty. However, some distributions indicate more uncertainty than others.
• The Shannon entropy H is the measure of the uncertainty associated with a probability distribution:

  $H[X] \equiv -\sum_x \Pr(x) \log_2 \Pr(x)$ .   (1)

• A fair coin (probability of heads = 1/2) has an unpredictability of 1.
• A biased coin (probability of heads = 0.9) has an unpredictability of 0.47.
• A perfectly biased coin (probability of heads = 1.0) has an unpredictability of 0.00.
• The conditional entropy is defined via:

  $H[X \mid Y] \equiv -\sum_{x,y} \Pr(x,y) \log_2 \Pr(x \mid y)$ .   (2)

Entropy Rate

• The entropy rate h_µ is defined via

  $h_\mu = \lim_{L \to \infty} H[S_L \mid S_{L-1} S_{L-2} \ldots S_0]$ .

• In words: the entropy rate is the average uncertainty of the next symbol, given that an arbitrarily large number of symbols have already been seen.
• h_µ is the irreducible randomness: the randomness that persists even after statistics over arbitrarily long sequences are taken into account.
• h_µ is a measure of unpredictability.

Excess Entropy

• The excess entropy E is defined as the total amount of randomness that is "explained away" by considering larger blocks of variables.
• One can also show that E is equal to the mutual information between the "past" and the "future":

  $E = I(\overleftarrow{S}; \overrightarrow{S}) \equiv H[\overrightarrow{S}] - H[\overrightarrow{S} \mid \overleftarrow{S}]$ .

• E is thus the amount one half "remembers" about the other: the reduction in uncertainty about the future given knowledge of the past.
• Equivalently, E is the "cost of amnesia": how much more random the future appears if all historical information is suddenly lost.
• (Two short sketches after these slides check the coin values in equation (1) and show how h_µ and E can be estimated from measurement data.)
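As a quick check on the coin numbers above, here is a minimal Python sketch of equation (1); the helper name shannon_entropy is mine, not from the talk.

```python
import math

def shannon_entropy(dist):
    """H[X] = -sum_x Pr(x) log2 Pr(x), in bits.
    Zero-probability outcomes contribute nothing (0 log 0 = 0 by convention)."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# The three coins from the slide:
print(shannon_entropy([0.5, 0.5]))  # fair coin        -> 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # biased coin      -> ~0.469 bits (the slide's 0.47)
print(shannon_entropy([1.0, 0.0]))  # perfectly biased -> 0.0 bits
```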

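To make h_µ and E concrete as data-driven quantities, here is a hedged sketch (my own illustration, not code from the talk) using the standard finite-L block-entropy estimates h_µ(L) = H(L) − H(L−1) and E(L) = H(L) − L·h_µ(L), in the spirit of Crutchfield and Feldman, 2003.

```python
import math
from collections import Counter

def block_entropy(s, L):
    """H(L): Shannon entropy (bits) of the length-L blocks observed in s."""
    if L == 0:
        return 0.0
    counts = Counter(s[i:i + L] for i in range(len(s) - L + 1))
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def h_and_E(s, L):
    """Finite-L estimates: h(L) = H(L) - H(L-1), E(L) = H(L) - L * h(L)."""
    hL = block_entropy(s, L) - block_entropy(s, L - 1)
    return hL, block_entropy(s, L) - L * hL

# A perfectly periodic process ...010101...; true values: h_mu = 0, E = 1 bit.
s = "01" * 5000
for L in range(1, 5):
    hL, EL = h_and_E(s, L)
    print(L, round(hL, 3), round(EL, 3))
# Once blocks are long enough to capture the period, h(L) falls to 0
# and E(L) settles at log2(2) = 1 bit.
```

For a genuinely random source the estimates behave oppositely: h(L) stays near 1 bit per symbol and E(L) stays near 0.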
Summary

• Excess entropy E is a measure of complexity (order, pattern, regularity, correlation, ...).
• Entropy rate h_µ is a measure of unpredictability.
• Both E and h_µ are well understood and have clear interpretations.
• For more, see, e.g., Grassberger, 1986; Crutchfield and Feldman, 2003.
• I'll now consider three examples that illustrate some of the subtleties that are associated with measuring h_µ and E.

Example I: Disorder as the Price of Ignorance

• Let us suppose that an observer seeks to estimate the entropy rate.
• To do so, it considers statistics over sequences of length L and then estimates h_µ using an estimator that assumes E = 0.
• Call this estimated entropy h′_µ(L). Then, the difference between the estimate and the true h_µ is (Proposition 13, Crutchfield and Feldman, 2003):

  $h'_\mu(L) - h_\mu = \frac{E}{L}$ .   (3)

• In words: the system appears more random than it really is by an amount that is directly proportional to the complexity E.
• In other words: regularities (E) that are missed are converted into apparent randomness (h′_µ(L) − h_µ). (A numerical check of equation (3) appears in the sketch after these slides.)

Example II: A Randomness Puzzle

• Suppose we consider the binary expansion of π. Calculate its entropy rate h_µ and we'll find that it's 1.
• How can π be random? Isn't there a simple, deterministic algorithm to calculate digits of π?
• Yes. However, it is random if one uses histograms and builds up probabilities over sequences.
• This points out the model-sensitivity of both randomness and complexity.
• Histograms are a type of model. See, e.g., Knuth, 2006. (A sketch reproducing the histogram estimate for π follows below.)

[Figure: the system/process/observer diagram from the Modeling and Inference slide, repeated.]

Example III: Unpredictability due to Asynchrony

• Imagine a strange island where the weather repeats itself every 5 days. It's rainy for two days, then sunny for three days.

[Figure: a five-state cycle A → B → C → D → E → A; states A and B emit "Rain," states C, D, and E emit "Sun."]

• You arrive on this deserted island, ready to begin your vacation. But, you don't know what day it is: { A, B, C, D, E }.
• Eventually, however, you will figure it out.
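Equation (3) can be checked numerically on the island's weather (my sketch, not from the talk, with 1 = rain and 0 = sun; the naive estimator h′_µ(L) = H(L)/L is one standard choice that implicitly assumes E = 0):

```python
import math
from collections import Counter

def block_entropy(s, L):  # same helper as in the earlier sketch
    counts = Counter(s[i:i + L] for i in range(len(s) - L + 1))
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Period-5 weather: rain, rain, sun, sun, sun -> "11000" repeated.
# True entropy rate h_mu = 0; excess entropy E = log2(5) ~ 2.32 bits.
s = "11000" * 4000
E = math.log2(5)
for L in (5, 10, 20, 40):
    naive = block_entropy(s, L) / L   # h'(L): estimator that assumes E = 0
    print(L, round(naive, 4), round(E / L, 4))
# Both columns agree: the naive estimate exceeds the true h_mu = 0 by
# exactly E/L, so the missed regularity reappears as apparent randomness.
```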

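Finally, the π puzzle can be reproduced with a hedged sketch: I assume the mpmath library for a high-precision π (this dependency and the bit-extraction loop are my additions, not from the talk), then apply the same histogram estimate as above.

```python
import math
from collections import Counter
from mpmath import mp  # assumed third-party dependency for high-precision pi

mp.prec = 12000                      # working binary precision
frac, bits = mp.pi - 3, []
for _ in range(10000):               # peel off fractional bits of pi one by one
    frac *= 2
    b = int(frac)
    bits.append(str(b))
    frac -= b
s = "".join(bits)

def block_entropy(s, L):  # same helper as in the earlier sketches
    counts = Counter(s[i:i + L] for i in range(len(s) - L + 1))
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

for L in range(1, 6):
    h = block_entropy(s, L) - (block_entropy(s, L - 1) if L > 1 else 0.0)
    print(L, round(h, 3))
# Each conditional-entropy estimate is ~1 bit/symbol: under a histogram
# model, pi's bits are indistinguishable from a fair coin, even though a
# short deterministic program generates them.
```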