An Introduction to Statistical Complexity
MIR@W Statistical Complexity Day, University of Warwick
18 February 2008

David P. Feldman
College of the Atlantic
and
Santa Fe Institute
dave@hornacek.coa.edu
http://hornacek.coa.edu/dave/
Introduction
In this talk I will:
- Discuss measures of complexity and (un)predictability.
- Present some fun properties of statistical complexity measures.
- Point to future work.
This material draws on lectures I've developed for the Santa Fe Institute's Complex Systems Summer School in China, 2004-2007, and the ISC-PIF Complex Systems Summer School in Paris, 2007; consult them for much more detail and many more references.
Outline
- Information-theoretic measures of randomness and structure: (a) Entropy Rate; (b) Excess Entropy.
- The next slide shows a highly schematic view of the universe of complex systems.
Themes/General Principles: Exploitation vs. Exploration; Complexity Increases?; Stability through Diversity; Stability through Hierarchy; Increasing Returns -> "Power laws"; and many more?
Tools/Methods: Nonlinear Dynamics; Machine Learning; Cellular Automata; Symbolic Dynamics; Evolutionary Game Theory; Agent-Based Models; Information Theory; Stochastic Processes; Statistical Mechanics/RG; and many more...
Topics/Models: Neural Networks (real & fake); Spin Glasses; Evolution (real & fake); Immune System; Gene Regulation; Pattern Formation; Population Dynamics; Soft Condensed Matter; Origins of Life; Origins of Civilization; Origin and Evolution of Language; Networks; and many, many more...
Foundations: Measures of Complexity; Representation and Detection of Organization; Computability; No Free Lunch Theorems; and many more...
Based on Fig. 1.1 from Shalizi, "Methods and Techniques of Complex Systems Science: An Overview," pp. 33-114 in Deisboeck and Kresh (eds.), Complex Systems Science in Biomedicine (New York: Springer-Verlag, 2006); http://arxiv.org/abs/nlin.AO/0307015
Comments on the Complex Systems Quadrangle
- What connects the four corners: unifying principles? Loose similarities? No relationships at all?
- The foundations bear on the study of both models and real systems.
- The measures of complexity discussed here raise questions about relationships between structure and randomness, and between the observer and the observed.
Complexity: Initial Thoughts
- One intuition: the complexity of an object is how difficult it is to describe it.
- Different complexity measures correspond to different notions of description.
- Many such measures have been proposed; I discuss a bunch of these in my lectures.
Predictability, Unpredictability, and Complexity
Information Theoretic View of Randomness and Structure
The Shannon entropy of a discrete random variable X, taking values x with probability Pr(x), is given by:

    H[X] = - Σ_x Pr(x) log₂ Pr(x) .   (1)
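As a concrete check of Eq. (1), the entropy of a few simple distributions can be computed directly. This is an illustrative sketch in Python; the function name is my own:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution given as probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

H_fair = shannon_entropy([0.5, 0.5])     # fair coin: 1 bit of uncertainty
H_biased = shannon_entropy([0.7, 0.3])   # biased coin: less than 1 bit
H_certain = shannon_entropy([1.0])       # certain outcome: 0 bits
```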
Interpretations of Entropy
- Axiomatic view: requiring that H be continuous, be maximized by the uniform distribution, and be independent of the manner in which subsets of events are grouped, uniquely determines H (up to a multiplicative constant).
- Coding view: H[X] is the limit, as N grows, of 1/N × the average length of the optimal binary code for N copies of X.
Applying Information Theory to Stochastic Processes
We now apply these ideas to sequences of measurements.
Context: Consider a long sequence of discrete random variables. These could be, for example, successive measurements of a physical system.
The Measurement Channel
We picture a generalized measurement process:
[Figure: a system probed by an instrument with alphabet A and an encoder, producing a symbol stream ...adbck7d... that is read by an observer.]
The instrument samples the system's behavior and discretizes it.
Source: Complex Systems. L. Lam and H. C. Morris, eds. Springer-Verlag, 1992: 66-10.
Stochastic Process Notation
- The bi-infinite chain of random variables: ↔S = ...S₋₂S₋₁S₀S₁S₂...
- The "past" (everything already seen): ←S = ...S₋₂S₋₁
- The "future" (everything yet to come): →S = S₀S₁S₂...
Entropy Growth
Let H(L) be the Shannon entropy of blocks of L consecutive variables.
[Figure: H(L) vs. L for 1 ≤ L ≤ 8; H(L) grows monotonically, bending over toward a linear asymptote.]
Entropy Rate
For large L, the block entropy grows linearly:

    H(L) ≈ E + hµL .

The slope hµ is the entropy rate. Defining the finite-L estimate hµ(L) ≡ H(L) − H(L−1), we have hµ = lim_{L→∞} hµ(L).
Entropy Rate, continued
hµ = lim_{L→∞} hµ(L), where hµ(L) ≡ H(L) − H(L−1) is the average uncertainty of the next symbol, given that the previous L − 1 symbols have been observed.
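These finite-L quantities are easy to estimate from data. The sketch below (illustrative code with my own function names, using naive block counting) computes hµ(L) = H(L) − H(L−1) for a period-2 sequence:

```python
import math
from collections import Counter

def block_entropy(seq, L):
    """H(L): Shannon entropy (bits) of the empirical distribution of length-L blocks."""
    blocks = [tuple(seq[i:i + L]) for i in range(len(seq) - L + 1)]
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n) for c in Counter(blocks).values())

def h_mu_L(seq, L):
    """hmu(L) = H(L) - H(L-1): uncertainty of the Lth symbol given the previous L-1."""
    return block_entropy(seq, L) - block_entropy(seq, L - 1)

period2 = [0, 1] * 1000
h1 = h_mu_L(period2, 1)   # 1 bit: the first symbol is completely unknown
h2 = h_mu_L(period2, 2)   # ~0: after one symbol, the next is determined
```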
Interpretations of Entropy Rate
- hµ is the irreducible randomness per symbol: the randomness that remains after accounting for correlations over arbitrarily large blocks of variables.
- Equivalently, hµ = lim_{L→∞} H(L)/L.
How does hµ(L) approach hµ?
[Figure: hµ(L) vs. L, decreasing from H(1) at L = 1 and converging to the horizontal asymptote hµ; the shaded area between curve and asymptote is E.]
The manner of convergence matters: it shows how quickly the apparent entropy density converges to hµ.
The Excess Entropy
[Figure: hµ(L) vs. L; the shaded area between hµ(L) and its asymptote hµ is E.]
The excess entropy E is defined as the shaded area above:

    E ≡ Σ_{L=1}^{∞} [hµ(L) − hµ] .

E is the total amount of apparent randomness that is "explained away" by considering larger blocks of variables.
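The sum defining E can be approximated by truncating at a modest block length. A minimal sketch (my own function names), assuming the true hµ is known:

```python
import math
from collections import Counter

def block_entropy(seq, L):
    """H(L): Shannon entropy (bits) of the empirical length-L block distribution."""
    blocks = [tuple(seq[i:i + L]) for i in range(len(seq) - L + 1)]
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n) for c in Counter(blocks).values())

def excess_entropy(seq, h_mu, L_max=8):
    """Truncated sum E ~ sum_{L=1}^{L_max} [hmu(L) - hmu]."""
    total = 0.0
    for L in range(1, L_max + 1):
        h_L = block_entropy(seq, L) - block_entropy(seq, L - 1)
        total += h_L - h_mu
    return total

period2 = [0, 1] * 1000
E2 = excess_entropy(period2, h_mu=0.0)   # a period-2 process stores E = 1 bit
```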
Excess Entropy: Other Expressions and Interpretations (Mutual Information)
The excess entropy is the mutual information between the "past" and the "future":

    E = I[←S; →S] .

Recall that the mutual information measures the reduction in uncertainty about one variable given the outcome of the other:
- E is the reduction in uncertainty about the future given knowledge of the past.
- Equivalently, E is the amount of apparent randomness that appears if all historical information is suddenly lost.
Excess Entropy: Other Expressions and Interpretations (Geometric View)
[Figure: H(L) vs. L, approaching the linear asymptote E + hµL; the asymptote's intercept with the vertical axis is E.]
Geometrically, E is the y-intercept of the linear asymptote to H(L):

    E = lim_{L→∞} [H(L) − hµL] .
Excess Entropy Summary
- E measures a system's structure and memory, in contrast to entropy.
- E is a lower bound on the amount of memory needed for a minimal stochastic model of the system.
Example I: Fair Coin
[Figure: H(L) vs. L for a fair coin and for a biased coin with p = 0.7; both curves are straight lines through the origin, the fair coin with slope 1.]
For both processes E = 0: H(L) is purely linear, with slope hµ equal to the single-symbol entropy.
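Numerically, the linear growth of H(L) for memoryless processes is easy to see. An illustrative sketch (naive counting, my own function names):

```python
import math
import random
from collections import Counter

def block_entropy(seq, L):
    """H(L): Shannon entropy (bits) of the empirical length-L block distribution."""
    blocks = [tuple(seq[i:i + L]) for i in range(len(seq) - L + 1)]
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n) for c in Counter(blocks).values())

random.seed(0)
coin = [random.randint(0, 1) for _ in range(200000)]

H1 = block_entropy(coin, 1)   # close to 1 bit
H2 = block_entropy(coin, 2)   # close to 2 bits: H(L) grows linearly, so E = 0
```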
Example II: Periodic Sequence
[Figure, left: H(L) vs. L for a periodic sequence, rising to meet its asymptote E + hµL. Right: hµ(L) vs. L, decaying to zero.]
Example II, continued
For many more examples, see Crutchfield and Feldman, Chaos, 13:25-54, 2003.
For more than you probably ever wanted to know about periodic sequences, see Feldman and Crutchfield, Synchronizing to Periodicity: The Transient Information and Synchronization Time of Periodic Sequences. Advances in Complex Systems. 7(3-4): 329-355, 2004.
Excess Entropy: Notes on Terminology All of the following terms refer to essentially the same quantity.
(Rényi) Information: Szépfalusy, Györgyi, Csordás
Excess Entropy: Selected References and Applications
- …, Physica D, 7:201-223, 1983. [Dynamical systems]
- … [Dynamical systems]
- Szépfalusy and Györgyi, … [Dynamical systems]
- Csordás and Szépfalusy, Phys. Rev. A, 39:4767-4777, 1989. [Dynamical systems]
- …, Phys. Rev. E 55:R1239-42, 1997. [One-dimensional Ising models]
Excess Entropy: Selected References and Applications, continued
Ising models]
processing]
machine learning]
Estimating Probabilities
[Figure: a process with hidden states A, B, C emits the symbol stream ...001011101000...; an observer tallies word frequencies to estimate probabilities such as Pr(s³).]
The observer counts the frequencies of words of length L and uses these to estimate Pr(s^L), from which E and hµ may be readily calculated. However, this will lead to a biased under-estimate for hµ. For more sophisticated and accurate ways of inferring hµ, see, e.g., Schürmann and Grassberger, Chaos 6:414-427, 1996.
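The word-counting step can be sketched as follows; this is exactly the naive maximum-likelihood estimator whose bias is noted above (function name is my own):

```python
from collections import Counter

def block_probs(seq, L):
    """Naive (maximum-likelihood) estimate of Pr(s^L) from sliding-window counts."""
    blocks = [tuple(seq[i:i + L]) for i in range(len(seq) - L + 1)]
    n = len(blocks)
    return {b: c / n for b, c in Counter(blocks).items()}

seq = [0, 1, 1] * 4            # ...011011011...
probs = block_probs(seq, 2)    # only three of the four length-2 words ever occur
```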
A look ahead
[Figure: the same measurement setup as before; the observer now aims to infer the process's hidden states A, B, and C.]
Looking ahead: rather than just estimating word probabilities, the observer will build a model of the process in terms of states such as A, B, and C.
An Introduction to Computational Mechanics
- Information theory quantifies randomness, but does not directly capture a system's complexity or regularities.
- Computational mechanics uses ideas from information theory together with models of formal computation to provide a direct, structural accounting of a system's intrinsic information processing.
- It asks: how does a system store and manipulate information?
Context: as before, a long sequence of discrete measurements.
An Initial Example: The Prediction Game
- You are shown a long binary sequence, one symbol at a time, and charged with predicting, as best you can, subsequent values of the sequence.
- At first the task appears to be impossible.
Discovery!
- Eventually you discover a regularity: some symbols can be predicted with certainty, while the other symbols are 0 or 1 with equal probability.
- You can now predict much better than before.
- Question: how might we program a computer to do pattern discovery?
Initial example, continued
The current state, together with the symbol just emitted, determines the next state (A or B).
Initial Example: Why Two States?
- The two states store different information about the future sequences.
- In one state there is certainty that a 1 is next; in the other, the next symbol is maximally uncertain.
- Two states are needed because there are two classes of past sequences that give rise to different predictive information.
- This idea is formalized below.
What do you Need to Remember in Order to Predict?
[Figure: a cloud of past sequences (01, 111, 010, 1010, 11011, 010111, ...) labeled "Space of all possible pasts," with the thought bubble "Do I really have to remember all this?? My memory isn't good enough."]
One Only Needs to Remember the Causal States.
Causal states partition the space of all past sequences
[Figure: the same space of pasts, now partitioned into cells corresponding to the causal states, with the thought bubble "This is better! I only need to remember the causal state, A or B."]
How Might We Find Causal States?
Define an equivalence relation ∼ on pasts: two pasts ←s and ←s′ are equivalent if and only if they lead to the same conditional distribution over futures,

    ←s ∼ ←s′  ⟺  Pr(→S | ←s) = Pr(→S | ←s′) .

Equivalent pasts carry exactly the same predictive knowledge about →S. The causal states are the equivalence classes of pasts under ∼; they are the aggregate variables necessary for optimal prediction of →S.
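The equivalence-class idea can be illustrated numerically: group observed histories by their empirical next-symbol distribution. The sketch below uses the golden-mean process (binary sequences with no two 0s in a row) as an example of my own choosing, with crude rounding standing in for a proper statistical test:

```python
import random
from collections import Counter, defaultdict

random.seed(1)
seq, last = [], 1
for _ in range(50000):
    s = random.randint(0, 1) if last == 1 else 1   # after a 0, a 1 is forced
    seq.append(s)
    last = s

# Empirical next-symbol distribution conditioned on each length-2 history.
futures = defaultdict(Counter)
for i in range(2, len(seq)):
    futures[tuple(seq[i - 2:i])][seq[i]] += 1

# Merge histories whose conditional distributions agree (coarse rounding
# stands in for a real hypothesis test here).
states = defaultdict(set)
for hist, counts in futures.items():
    p_one = counts[1] / sum(counts.values())
    states[round(p_one, 1)].add(hist)

# Two causal states emerge: histories ending in 0 (the next symbol is surely 1)
# and histories ending in 1 (the next symbol is a fair coin).
```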
The causal states, together with the transitions between them, are an ε-machine: a minimal model capable of statistically reproducing the original configuration.
(The symbols from which the machine is formed may be distorted via noise or the discretization process.)
[Figure: the two-state ε-machine for the example, with edges labeled 1|1, 1|1/2, and 0|1/2 (symbol | probability).]
Distribution over Causal States
The asymptotic probability distribution p over the causal states is the left eigenvector of the transition matrix T:

    pT = p ,   (2)

normalized so that Σ_α pα = 1. For the example machine,

    p = (1/2, 1/2) .   (3)
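Equation (2) can be solved numerically by iterating p ← pT. An illustrative sketch; the two-state machine below is the golden-mean machine, my own choice of example (its stationary distribution is (2/3, 1/3)):

```python
def stationary_distribution(T, iterations=500):
    """Left eigenvector of a stochastic matrix T (eigenvalue 1), by iterating p <- p T."""
    n = len(T)
    p = [1.0 / n] * n
    for _ in range(iterations):
        p = [sum(p[a] * T[a][b] for a in range(n)) for b in range(n)]
    return p

# State-to-state transition probabilities for the golden-mean machine.
T = [[0.5, 0.5],
     [1.0, 0.0]]
p = stationary_distribution(T)   # approximately (2/3, 1/3)
```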
Statistical Complexity
The statistical complexity Cµ is the Shannon entropy of the distribution of the causal states:

    Cµ ≡ − Σ_α pα log₂ pα .   (4)

- Cµ is the number of bits of memory stored in the causal states.
- Equivalently, it is the amount of information about the past needed to perform optimal prediction.
- Cµ thus measures the memory and structure present in the system.
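Equation (4) is a one-liner given the state distribution. A minimal sketch (function name is my own):

```python
import math

def statistical_complexity(p):
    """C_mu: Shannon entropy (bits) of the causal-state distribution p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

C_example = statistical_complexity([0.5, 0.5])   # two equiprobable states: 1 bit
C_single = statistical_complexity([1.0])         # a single causal state: C_mu = 0
```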
Some Important Properties of ǫ-machines
The causal states are optimally predictive:

    Pr(→S | S) = Pr(→S | ←S) .   (5)

I.e., all the information about the future is contained in the causal states.
Statistical Complexity vs. Excess Entropy
Both E and Cµ are measures of a process's complexity or structure or pattern or organization. However, they are not the same. For any stationary process, the excess entropy is bounded above by the statistical complexity:

    E ≤ Cµ .   (6)
Example I Fair Coin:
Entropy rate hµ = 1, Statistical Complexity Cµ = 0.
Example II Period 2 Pattern:
Entropy rate hµ = 0, Statistical complexity Cµ = 1.
A non-minimal example. Consider this machine for a period-2 sequence:
[Figure: a machine with more than two states that generates ...010101...]
It generates the sequence correctly, but its extra states store information that carries no additional predictive power about the future. The equivalence-class construction of the causal states is what ensures that ε-machines are minimal.
Algorithms for Inferring ǫ-machines There are two basic approaches
1. State merging: begin with a maximally detailed model and merge histories that give rise to the same future distribution, i.e., merge states that are equivalent under ∼.
2. State splitting: begin with a simple model and split states as needed to improve predictability.
CSSR
known as CSSR (Causal State Splitting Reconstruction).
(eds.), Uncertainty in Artificial Intelligence: Proceedings of the Twentieth Conference, http://arxiv.org/abs/cs.LG/0406011.
It has been applied to formal languages, anomaly detection, natural languages, and more.
Computational Mechanics References and Applications Almost all of the papers below can be found online either on arXiv.org or with a little bit of searching.
causal states. Careful proofs of optimality and minimality.]
Applications and Extensions of Causal States
e, Phys. Rev. E, 55:2338-2344, 1997. [Coupled Map Lattices]
review, calculations of excess entropy, and comparisons to statistical mechanical quantities.]
inferred from empirical data.]
[Dynamical systems on random networks]
Applications and Extensions of Causal States, Continued
dimension]
Natural Language Processing. 2005.
Applications to Alzheimer’s disease.]
finding!]
Computational Mechanics Conclusions
Summary:
- One way to discover patterns is to recognize them: the causal states are found by recognizing which pasts are predictively equivalent.
- Computational mechanics carries this out in a fully probabilistic setting.
Questions:
- How much memory and structure are present in a system?
- Can this framework discover patterns that we haven't seen before?
Thoughts on the Subjectivity of Complexity
Any measurement of complexity depends on choices the observer must make.
Example I: Disorder as the Price of Ignorance
Suppose an observer estimates hµ using an estimator that assumes E = 0, namely h′µ(L) ≡ H(L)/L. Then, the difference between the estimate and the true hµ is (Prop. 13, Crutchfield and Feldman, 2003):

    h′µ(L) − hµ = E / L .

That is, ignorance of the structure E shows up as extra apparent randomness (h′µ(L) − hµ) that is directly proportional to the complexity E.
See Crutchfield and Feldman, Chaos, 13:25-54, 2003.
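This proposition is easy to verify numerically for a period-2 process, for which hµ = 0 and E = 1 bit, so h′µ(L) should come out close to 1/L. An illustrative sketch (my own function names):

```python
import math
from collections import Counter

def block_entropy(seq, L):
    """H(L): Shannon entropy (bits) of the empirical length-L block distribution."""
    blocks = [tuple(seq[i:i + L]) for i in range(len(seq) - L + 1)]
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n) for c in Counter(blocks).values())

period2 = [0, 1] * 2000
# The naive estimator that assumes E = 0:
h_prime = {L: block_entropy(period2, L) / L for L in (2, 4, 8)}
# With h_mu = 0 and E = 1 bit, h'(L) - h_mu should be close to E/L.
```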
Example II: Effects of Bad Discretization
If the observation is discretized badly, the entropy of the resulting symbolic sequence need not be equal to the entropy of the original sequence of numbers.
Example II: Effects of Bad Discretization (continued)
[Figure: E and hµ as a function of the partition point; both change sharply at the critical value x_c for the partition.]
Example III: A Randomness Puzzle
- Is a sequence of symbols random if it was produced by a short, deterministic algorithm that can calculate digits of π?
- An observer who sees only the symbol statistics has no access to the algorithm to produce the digits of π.
- Judged by entropy, the digits of π are as random as typical coin-flip sequences.
Example IV: Unpredictability due to Asynchrony
Consider weather that repeats with period 5: rainy for two days, then sunny for three days.
[Figure: a five-state cycle A → B → C → D → E → A emitting Rain, Rain, Sun, Sun, Sun.]
If you know the pattern but don't know what day it is, your state of knowledge is the full set {A, B, C, D, E}.
Example IV: Unpredictability due to Asynchrony
- Once synchronized, the weather is perfectly predictable; hµ = 0.
- Before synchronization, however, prediction errors are made.
- The total uncertainty experienced while synchronizing is measured by the Transient Information T:

    T ≡ Σ_{L=1}^{∞} [E + hµL − H(L)] .   (8)
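The truncated sum for T can be computed directly for the period-5 weather pattern. A sketch (my own function names), summing from L = 1; conventions differ on whether an L = 0 term, equal to E, is included:

```python
import math
from collections import Counter

def block_entropy(seq, L):
    """H(L): Shannon entropy (bits) of the empirical length-L block distribution."""
    blocks = [tuple(seq[i:i + L]) for i in range(len(seq) - L + 1)]
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n) for c in Counter(blocks).values())

# Period-5 weather: Rain, Rain, Sun, Sun, Sun (1 = rain, 0 = sun).
weather = [1, 1, 0, 0, 0] * 2000
E = math.log2(5)     # excess entropy of a period-5 process
h_mu = 0.0           # periodic, hence perfectly predictable once synchronized

# Truncated sum; terms vanish once H(L) reaches its asymptote at L = 3.
T = sum(E + h_mu * L - block_entropy(weather, L) for L in range(1, 20))
```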
Example IV: Unpredictability due to Asynchrony
Two processes with the same excess entropy can have different T's. Any period-P process has

    E = log₂ P ,   (9)

yet different period-P sequences can have different transient informations T.
See Feldman and Crutchfield, "Synchronizing to Periodicity." Advances in Complex Systems. 7:329-355. 2004.
- With finite data, an observer will typically obtain an under-estimate of E.
- Any statement about complexity is thus a statement about the observer, the observed, and the relationship between the two.
- This observer-dependence is not fatal; it can be understood and accounted for in a principled way.
Modeling Modeling
- Computational mechanics can be viewed as an abstraction of the modeling process itself.
- It allows precise statements about the relationships between, say, the complexity of a model and the observed unpredictability of the object under study.
- The observer influences what is measured. This influence can be understood.
- In this sense, computational mechanics yields insight into modeling.
Model Dependence
- Any complexity measurement rests on modeling assumptions.
- Hence a statement about complexity will always be, to some extent, a statement about both the observer and the observed.
else?
- A natural question: how does a system's complexity behave as a function of its entropy? We can plot complexity as a function of entropy.
- Such plots allow comparison across a broad class of apparently different systems.
- This style of analysis has antecedents in critical phenomena and condensed matter physics.
- As the measure of complexity we use the excess entropy E, which accounts in a direct way for the correlations and organization in a system.
- E measures structure, in contrast to entropy.
[Figure: bifurcation diagram for the logistic map: final states as a function of r, 3 ≤ r ≤ 4.]
– r = 3.2: Period 2. – r = 3.5: Period 4. – r = 3.7: Chaotic.
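The periods quoted above can be checked by iterating the map. An illustrative sketch (my own function names):

```python
def logistic_attractor(r, transient=2000, sample=64, x0=0.3):
    """Iterate the logistic map x -> r x (1 - x), discard a transient, and
    return the sorted set of (rounded) values visited on the attractor."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    points = set()
    for _ in range(sample):
        x = r * x * (1 - x)
        points.add(round(x, 6))
    return sorted(points)

p32 = logistic_attractor(3.2)   # period-2 orbit: two points
p35 = logistic_attractor(3.5)   # period-4 orbit: four points
```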
Complexity vs. Entropy: Logistic Equation Plot the excess entropy E and the entropy rate hµ for the logistic equation as a function of the parameter r.
[Figure: E and hµ as functions of r, 3.2 ≤ r ≤ 4.]
Both quantities were estimated from long binary symbolic sequences; r was varied by increments of 0.0001.
Complexity-Entropy Diagrams
- We would like to see directly how a system's structure relates to its entropy.
- Idea: make a parametric plot of the two variables against each other instead of as a function of time. This shows how the two variables are related.
Complexity-Entropy Diagram for Logistic Equation
[Figure: complexity-entropy diagram for the logistic map: E vs. hµ.]
Portions of the diagram are self-similar.
Consider a one- or two-dimensional Ising system with nearest and next nearest neighbor interactions:
Configurations C occur according to the Boltzmann distribution:

    Pr(C) = (1/Z) e^{−H(C)/T} .

Depending on the sign of the couplings, neighboring spins tend to do the same or the opposite thing.
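For a tiny system the Boltzmann distribution can be computed exactly by enumerating all configurations. An illustrative sketch for a 1D nearest-neighbor Ising ring (my own function names; J > 0, i.e., ferromagnetic):

```python
import math
from itertools import product

def ising_energy(spins, J=1.0):
    """Energy of a 1D Ising ring with NN coupling J: -J * sum_i s_i s_{i+1}."""
    n = len(spins)
    return -J * sum(spins[i] * spins[(i + 1) % n] for i in range(n))

def boltzmann(n, T, J=1.0):
    """Exact Boltzmann distribution Pr(C) = exp(-H(C)/T) / Z over all 2^n configs."""
    configs = list(product([-1, +1], repeat=n))
    weights = [math.exp(-ising_energy(c, J) / T) for c in configs]
    Z = sum(weights)
    return {c: w / Z for c, w in zip(configs, weights)}

dist = boltzmann(4, T=2.0)     # ferromagnetic couplings
aligned = dist[(1, 1, 1, 1)]   # fully aligned configurations are the most likely
```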
[Figure: complexity-entropy diagram, E vs. hµ, for the one-dimensional Ising model with anti-ferromagnetic couplings.]
[Figure: complexity-entropy diagram, E vs. hµ, for the two-dimensional Ising model with AFM couplings.]
[Figure: complexity-entropy diagram, E vs. hµ, for the two-dimensional Ising model with NN couplings and no external field.]
[Figure, left: hµ vs. temperature T; right: E vs. T, for the two-dimensional Ising model with NN couplings and no external field.]
Plotted against T, the quantities look rather different than when plotted against hµ as on the previous slide.
Note that the complexity-entropy diagram itself makes no reference to a control parameter such as the temperature.
A cellular automaton (CA) updates each site according to a given rule applied to the site and its neighbors.
Example:
[Figure: a rule table and a space-time diagram, with the initial condition at the top and time running downward.]
The number of neighbors on each side consulted by the rule is known as the radius of the CA.
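A one-step update for an elementary (radius-1, binary) CA can be sketched as follows; rule 90, which sets each cell to the XOR of its two neighbors, is used as an example:

```python
def eca_step(cells, rule):
    """One update of an elementary CA (radius 1, binary states, periodic boundary).
    Bit p of `rule` gives the new state for neighborhood code p = 4*left + 2*center + right."""
    n = len(cells)
    return [(rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
            for i in range(n)]

row = [0] * 7 + [1] + [0] * 7   # a single 1 in the middle
next_row = eca_step(row, 90)    # rule 90: the 1 spreads to its two neighbors
```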
[Figure: complexity-entropy diagram, E vs. hµ, for a sample of cellular automata.]
What does the diagram look like for radius-2 CAs?
[Figure: complexity-entropy diagram, E vs. hµ, for radius-2 CAs.]
Note that the points do not fill the entire space.
[Figure: complexity-entropy diagram, E vs. hµ, for processes in which each symbol depends on the previous two symbols, as in the 1D NNN Ising model.]
[Figure: a family of finite-state machines with n = 1 through n = 6 states.]
number of states.
[Figure: complexity-entropy diagram, Cµ vs. hµ, for machines with n = 1 through n = 6 states.]
A Gallery of Complexity-Entropy Diagrams
The next slide shows, left to right, top to bottom, complexity-entropy diagrams for the systems considered above, including Ising models with different interactions.
[Figure: six complexity-entropy diagrams: the logistic map, one-dimensional Ising models, the two-dimensional Ising model, radius-2 CAs, two-symbol-memory processes, and machines with n = 1 through n = 6 states.]
Complexity-entropy diagrams provide a useful way to compare the information processing abilities of different systems.
Complexity-Entropy Diagrams: Conclusions
- They summarize the information processing abilities of different systems in a parameter-free way.
- They place the information processing abilities of very different model classes on similar terms.
Some Thoughts on the Past, Present, and Future of Complexity Measures
- Information theory and computational mechanics have sharpened the way we think about and measure complexity, memory, structure, and pattern.
- They give a principled, well-understood approach to structural complexity. Useful for:
  – Analyzing real data
  – Deepening understanding of model systems and fundamental sources of complexity or regularity
  – Shedding light on foundational issues in pattern discovery
- There are also complexity measures that have turned out to be not as useful as one may have hoped.
A Few Cautionary Notes
- "Complexity" is used in many different senses; care is needed to help distinguish between different uses of the word.
- When proposing a measure, one should state which sense is intended and take care to do so.
- One should also be clear about exactly what is captured by the complexity measure.
- Measures defined via universal Turing machines (UTMs) have drawbacks:
- For example, one loses the ability to distinguish between systems that can be described by computational models less powerful than a UTM.
Complexity = Order × Disorder?
Some measures define complexity as a product of order and disorder, where equilibrium and equiprobability are sometimes considered to be synonymous. In my view these sorts of complexity measures have some serious shortcomings.
Open Questions and Future Directions
(a) Ay’s and L¨
(b) Situations in which the excess entropy and/or the statistical complexity diverge
(a) Non-stationary data (b) Two-dimensional systems
(c) Complexity of networks
Open Questions and Future Directions
(a) Understand more fully the relation between various complexity measures and critical phenomena. (b) Disordered or inhomogeneous systems, e.g. spin glasses. (c) Agent-based models. (d) Empirical data, a.k.a., the real world. (Watkins’ talk. ) (e) Other model systems. (Nerukh’s talk.)
(a) Better estimators for causal states, statistical complexity, etc. (b) Connection between measures of complexity and the difficulty of learning a pattern. (c) On-line complexity estimation.
Open Questions and Future Directions
questions of complexity, organization, and emergence.
Thanks and Acknowledgments
McTague, Cris Moore, Richard Scalettar, Cosma Shalizi, Dan Upper, Dowman Varn, Jon Wilkins, Karl Young,
Complex Systems Summer Schools in Beijing, China, and Santa Fe, USA.