Anomalous statistics of dynamical systems on networks



  1. Anomalous statistics of dynamical systems on networks Stefan Thurner www.complex-systems.meduniwien.ac.at www.santafe.edu trento jul 23 2012

  2. with R. Hanel and M. Gell-Mann: PNAS 108 (2011) 6390-6394; Europhys Lett 93 (2011) 20006; Europhys Lett 96 (2011) 50003

  3. Why are networks cool?
  • They tell you who interacts with whom
  • The same statistical system on different networks can behave totally differently

  4. How?
  • Simple example: Ising spins on constant-connectancy networks
  • Show: this is not of Boltzmann-Gibbs type – give exact statistics

  5. Why statistics?
  • Central concept: understanding macroscopic system behavior on the basis of microscopic elements and interactions → entropy
  • Functional form of entropy: it must encode information on the interactions too!
  • Entropy relates the number of states to an extensive quantity and plays a fundamental role in the thermodynamical description
  • Hope: 'thermodynamical' relations → phase diagrams, etc.

  6. 3 ingredients
  • Entropy has scaling properties → what are the entropies of non-ergodic systems?
  • How does entropy grow with system size? → which non-ergodic system is realized?
  • Symmetry in thermodynamic systems → if broken: entropy has no thermodynamic meaning → forget the dream of handling the system with TD

  7. What is the entropy of strongly interacting systems?

  8. Appendix 2, Theorem 2: C.E. Shannon, The Bell System Technical Journal 27, 379-423 and 623-656, 1948.

  9. Entropy
  $S[p] = \sum_{i=1}^{W} g(p_i)$
  $p_i$ ... probability for a particular (micro) state of the system, $\sum_i p_i = 1$
  $W$ ... number of states
  $g$ ... some function. What does it look like?

  10. The Shannon-Khinchin axioms
  • SK1: $S$ depends continuously on $p$ → $g$ is continuous
  • SK2: entropy is maximal for the equi-distribution $p_i = 1/W$ → $g$ is concave
  • SK3: $S(p_1, p_2, \cdots, p_W) = S(p_1, p_2, \cdots, p_W, 0)$ → $g(0) = 0$
  • SK4: $S(A+B) = S(A) + S(B|A)$
  Theorem: If SK1-SK4 hold, the only possibility is the Boltzmann-Gibbs-Shannon entropy $S[p] = \sum_{i=1}^{W} g(p_i)$ with $g(x) = -x \ln x$
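SK4's additivity for independent subsystems can be checked numerically with $g(x) = -x \ln x$ (a minimal sketch; the distributions pA, pB are arbitrary illustrative choices):

```python
import math

def S(p):
    """Boltzmann-Gibbs-Shannon entropy: S[p] = sum_i g(p_i), g(x) = -x ln x."""
    return -sum(x * math.log(x) for x in p if x > 0)

pA = [0.5, 0.3, 0.2]
pB = [0.6, 0.4]
# joint distribution of two independent subsystems A and B
pAB = [a * b for a in pA for b in pB]

# SK4 for independent systems: S(A+B) = S(A) + S(B), since S(B|A) = S(B)
assert abs(S(pAB) - (S(pA) + S(pB))) < 1e-12
```

For correlated (strongly interacting) subsystems the joint distribution does not factorize and this equality breaks down, which is exactly why SK4 is dropped below.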

  11. Shannon-Khinchin axiom 4 is nonsense for networks → SK4 is violated for strongly interacting systems → nuke SK4. SK4 corresponds to weak interactions or Markovian processes.

  12. The Complex Systems axioms
  • SK1 holds
  • SK2 holds
  • SK3 holds
  • $S_g = \sum_{i}^{W} g(p_i)$, $W \gg 1$
  Theorem: All systems for which these axioms hold (1) can be uniquely classified by 2 numbers, $c$ and $d$, and (2) have the unique entropy
  $S_{c,d} = \frac{e}{1-c+cd} \left( \sum_{i=1}^{W} \Gamma(1+d,\, 1 - c \ln p_i) - \frac{c}{e} \right)$, $\quad e$ ... Euler constant
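A minimal numerical sketch of this entropy, written with scipy's regularized upper incomplete gamma (the helper name S_cd is ours; the check below uses the Boltzmann-Gibbs case from the examples later in the deck):

```python
import numpy as np
from scipy.special import gammaincc, gamma

def S_cd(p, c, d):
    """(c,d)-entropy: (e * sum_i Gamma(1+d, 1 - c ln p_i) - c) / (1 - c + c d)."""
    p = np.asarray(p, dtype=float)
    # upper incomplete gamma: Gamma(a, x) = gammaincc(a, x) * gamma(a)
    G = gammaincc(1 + d, 1 - c * np.log(p)) * gamma(1 + d)
    return (np.e * G.sum() - c) / (1 - c + c * d)

p = [0.5, 0.3, 0.2]
# (c,d) = (1,1) recovers Boltzmann-Gibbs entropy up to the additive constant 1
S_BG = -sum(x * np.log(x) for x in p)
assert abs(S_cd(p, c=1.0, d=1.0) - (S_BG + 1)) < 1e-10
```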

  13. The argument: generic mathematical properties of $g$
  • Scaling transformation $W \to \lambda W$: how does the entropy change?

  14. Mathematical property I: an unexpected scaling law!
  $\lim_{W \to \infty} \frac{S_g(\lambda W)}{S_g(W)} = \ldots = \lambda^{1-c}$
  Theorem 1: Define $f(z) \equiv \lim_{x \to 0} \frac{g(zx)}{g(x)}$ with $0 < z < 1$. Then for systems satisfying SK1, SK2, SK3: $f(z) = z^c$, $\quad 0 < c \leq 1$
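The exponent $c$ can be read off numerically from the ratio $g(zx)/g(x)$ at small $x$ (a sketch; the value $x = 10^{-100}$ and the tolerances are ad hoc):

```python
import math

def f_ratio(g, z, x=1e-100):
    """Approximate f(z) = lim_{x->0} g(z x) / g(x) at small fixed x."""
    return g(z * x) / g(x)

z = 0.5
# Boltzmann-Gibbs g: expect f(z) = z, i.e. c = 1
g_BG = lambda x: -x * math.log(x)
c_BG = math.log(f_ratio(g_BG, z)) / math.log(z)
assert abs(c_BG - 1.0) < 0.01

# Tsallis g with q = 0.7: expect f(z) = z^q, i.e. c = q
q = 0.7
g_T = lambda x: (x - x**q) / (q - 1)
c_T = math.log(f_ratio(g_T, z)) / math.log(z)
assert abs(c_T - q) < 0.01
```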

  15. Theorem 1: Let $g$ be a continuous, concave function on $[0,1]$ with $g(0) = 0$ and let $f(z) = \lim_{x \to 0^+} g(zx)/g(x)$ be continuous. Then $f$ is of the form $f(z) = z^c$ with $c \in (0, 1]$.
  Proof. Note that $f(ab) = \lim_{x \to 0} g(abx)/g(x) = \lim_{x \to 0} \left( g(abx)/g(bx) \right) \left( g(bx)/g(x) \right) = f(a) f(b)$. All pathological solutions are excluded by the requirement that $f$ is continuous, so $f(ab) = f(a) f(b)$ implies that $f(z) = z^c$ is the only possible solution of this equation. Further, since $g(0) = 0$, also $\lim_{x \to 0} g(0 \cdot x)/g(x) = 0$, and it follows that $f(0) = 0$; this necessarily implies that $c > 0$. $f(z) = z^c$ also has to be concave, since $g(zx)/g(x)$ is concave in $z$ for arbitrarily small, fixed $x > 0$; therefore $c \leq 1$.

  16. Mathematical property II: yet another one!!
  $\lim_{W \to \infty} \frac{S(W^{1+a})}{S(W)\, W^{a(1-c)}} = \ldots = (1+a)^d$
  Theorem 2: Define $h_c(a) \equiv \lim_{x \to 0} \frac{g(x^{1+a})}{x^{ac}\, g(x)}$ ...

  17. Theorem 2: Let $g$ be as in Theorem 1 and let $f(z) = z^c$. Then $h_c$ given in Eq. (8) is a constant of the form $h_c(a) = (1+a)^d$ for some constant $d$.
  Proof. We determine $h_c(a)$ by a trick similar to the one used for $f$:
  $h_c(a) = \lim_{x \to 0} \frac{g(x^{a+1})}{x^{ac}\, g(x)} = \lim_{x \to 0} \frac{g\!\left( (x^b)^{\left(\frac{a+1}{b}-1\right)+1} \right)}{(x^b)^{\left(\frac{a+1}{b}-1\right)c}\, g(x^b)} \cdot \frac{g(x^b)}{x^{(b-1)c}\, g(x)} = h_c\!\left( \frac{a+1}{b} - 1 \right) h_c(b-1)$,
  for some constant $b$. By a simple transformation of variables, $a = bb' - 1$, one gets $h_c(bb' - 1) = h_c(b-1)\, h_c(b'-1)$. Setting $H(x) = h_c(x-1)$ one again gets $H(bb') = H(b) H(b')$, so $H(x) = x^d$ for some constant $d$, and consequently $h_c(a)$ is of the form $(1+a)^d$.
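The second limit can be probed numerically in the same spirit as the first (a sketch; a small fixed $x$ stands in for the limit $x \to 0$):

```python
import math

def h(g, a, c, x=1e-30):
    """Approximate h_c(a) = lim_{x->0} g(x^(1+a)) / (x^(a c) g(x))."""
    return g(x ** (1 + a)) / (x ** (a * c) * g(x))

a = 1.5
# Boltzmann-Gibbs g, (c,d) = (1,1): expect h(a) = (1+a)^1
g_BG = lambda x: -x * math.log(x)
assert abs(h(g_BG, a, c=1.0) - (1 + a)) < 1e-9

# Tsallis g with q = 0.7, (c,d) = (q,0): expect h(a) = (1+a)^0 = 1
q = 0.7
g_T = lambda x: (x - x**q) / (q - 1)
assert abs(h(g_T, a, c=q) - 1.0) < 1e-6
```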

  18. Summary: Strongly interacting systems → SK1-SK3 hold
  → $\lim_{W \to \infty} \frac{S_g(\lambda W)}{S_g(W)} = \lambda^{1-c}$, $\quad 0 < c \leq 1$
  → $\lim_{W \to \infty} \frac{S(W^{1+a})}{S(W)\, W^{a(1-c)}} = (1+a)^d$, $\quad d$ real
  Remarkable:
  • all systems are characterized by 2 exponents $(c, d)$ – universality class
  • Which $S$ fulfills the above? → $S_{c,d} = \sum_{i=1}^{W} r e\, \Gamma(1+d,\, 1 - c \ln p_i) - rc$, $\quad r = \frac{1}{1-c+cd}$
  • Which distribution maximizes $S_{c,d}$? → $p_{c,d}(x) = \exp\!\left\{ -\frac{d}{1-c} \left[ W_k\!\left( B \left(1 + \frac{x}{r}\right)^{1/d} \right) - W_k(B) \right] \right\}$, $\quad B = \frac{1-c}{cd} \exp\!\left( \frac{1-c}{cd} \right)$
  $\Gamma(a, b) = \int_b^{\infty} dt\; t^{a-1} \exp(-t)$; Lambert-$W$: solution of $x = W(x)\, e^{W(x)}$
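A sketch of this maximizer, assuming the reconstructed form above with the choice $r = 1/(1-c+cd)$ and the principal branch $W_0$ (appropriate for $d > 0$); normalization is omitted and the helper name p_cd is ours:

```python
import numpy as np
from scipy.special import lambertw

def p_cd(x, c, d):
    """Unnormalized Lambert-W exponential p_{c,d}(x), branch k = 0 (d > 0)."""
    r = 1.0 / (1 - c + c * d)
    A = (1 - c) / (c * d)
    B = A * np.exp(A)
    w = lambertw(B * (1 + x / r) ** (1.0 / d)).real
    return np.exp(-d / (1 - c) * (w - lambertw(B).real))

x = np.linspace(0, 50, 200)
p = p_cd(x, c=0.7, d=2.0)
assert abs(p[0] - 1.0) < 1e-12   # p(0) = 1 before normalization
assert np.all(np.diff(p) < 0)    # monotonically decaying tail
```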

  19. Holds very generically
  • for all non-ergodic systems
  • for all non-Markovian systems (complex systems)

  20. Examples
  • $S_{1,1} = \sum_i g_{1,1}(p_i) = -\sum_i p_i \ln p_i + 1$ (BG entropy)
  • $S_{q,0} = \sum_i g_{q,0}(p_i) = \frac{1 - \sum_i p_i^q}{q-1} + 1$ (Tsallis entropy)
  • $S_{1,d>0} = \sum_i g_{1,d}(p_i) = \frac{e \sum_i \Gamma(1+d,\, 1 - \ln p_i) - 1}{d}$ (AP entropy)
  • ...
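The Tsallis special case $(c, d) = (q, 0)$ can be checked against the general formula directly, since $e\,\Gamma(1,\, 1 - q \ln p_i) = p_i^q$ (a sketch; S_cd re-implements the $(c,d)$-entropy of slide 12):

```python
import numpy as np
from scipy.special import gammaincc, gamma

def S_cd(p, c, d):
    """General (c,d)-entropy from the classification theorem."""
    p = np.asarray(p, dtype=float)
    G = gammaincc(1 + d, 1 - c * np.log(p)) * gamma(1 + d)
    return (np.e * G.sum() - c) / (1 - c + c * d)

p = np.array([0.4, 0.35, 0.15, 0.1])
q = 0.6
# Tsallis entropy with the additive constant used on this slide
S_T = (1 - (p**q).sum()) / (q - 1) + 1
assert abs(S_cd(p, c=q, d=0.0) - S_T) < 1e-10
```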

  21. Classification of entropies: order in the zoo

  | entropy | c | d |
  |---|---|---|
  | $S_{BG} = \sum_i p_i \ln(1/p_i)$ | 1 | 1 |
  | $S_{q<1} = \frac{1 - \sum_i p_i^q}{q-1}$ $(q < 1)$ | $c = q < 1$ | 0 |
  | $S_\kappa = \sum_i p_i (p_i^\kappa - p_i^{-\kappa}) / (-2\kappa)$ $(0 < \kappa \leq 1)$ | $c = 1 - \kappa$ | 0 |
  | $S_{q>1} = \frac{1 - \sum_i p_i^q}{q-1}$ $(q > 1)$ | 1 | 0 |
  | $S_b = \sum_i (1 - e^{-b p_i}) + e^{-b} - 1$ $(b > 0)$ | 1 | 0 |
  | $S_E = \sum_i p_i \left(1 - e^{(p_i - 1)/p_i}\right)$ | 1 | 0 |
  | $S_\eta = \sum_i \Gamma\!\left(\frac{\eta+1}{\eta}, -\ln p_i\right) - p_i\, \Gamma\!\left(\frac{\eta+1}{\eta}\right)$ $(\eta > 0)$ | 1 | $d = 1/\eta$ |
  | $S_\gamma = \sum_i p_i \ln^{1/\gamma}(1/p_i)$ | 1 | $d = 1/\gamma$ |
  | $S_\beta = \sum_i p_i^\beta \ln(1/p_i)$ | $c = \beta$ | 1 |
  | $S_{c,d} = \sum_i e r\, \Gamma(d+1,\, 1 - c \ln p_i) - cr$ | $c$ | $d$ |

  22. Distribution functions of CS
  • $p_{(1,1)}$ → exponentials (Boltzmann distribution)
  • $p_{(q,0)}$ → power laws ($q$-exponentials)
  • $p_{(1,d>0)}$ → stretched exponentials
  • $p_{(c,d)}$ all others → Lambert-$W$ exponentials
  NO OTHER POSSIBILITIES

  23. [Figure: log-log plots of $p(x)$ vs. $x$. (b) $q$-exponentials, $d = 0.025$, $r = 0.9/(1-c)$, curves for $c = 0.2, 0.4, 0.6, 0.8$; (c) Lambert-exponentials, $r = \exp(-d/2)/(1-c)$, curves for $(c,d) = (0.3, \pm 2), (0.3, \pm 4), (0.7, \pm 2), (0.7, \pm 4)$.]

  24. The world beyond Shannon. [Figure: phase diagram of entropies in the $(c,d)$ plane. BG entropy at $(1,1)$; $q$-entropy $(0 < q < 1)$ on the line $(c,0)$; $(c,d)$-entropy with $d > 0$ → Lambert-$W_0$ exponentials (stretched exponentials, asymptotically stable); $d < 0$ → Lambert-$W_{-1}$ exponentials (compact support of the distribution function); the region $c > 1$ violates K2; the region around $(0,0)$ violates K3.]

  25. The scaling property opens the door to ...
  • ...bringing order into the zoo of entropies through universality classes
  • ...understanding the ubiquity of power laws (and extremely similar functions)
  • ...understanding where Tsallis entropy comes from
  • ...understanding statistical systems on networks

  26. The requirement of extensivity

  27. Needed for the TD program to work: extensive entropies
  System has $N$ elements → $W(N)$ ... phase-space volume (a system property)
  Extensive: $S(W_{A+B}) = S(W_A) + S(W_B) = \cdots$ [use scaling property I] →
  Can prove: extensivity is equivalent to $W(N) = \exp\!\left[ \frac{d}{1-c}\, W_k\!\left( \mu (1-c)\, N^{1/d} \right) \right]$
  $\frac{1}{1-c} = \lim_{N \to \infty} N \frac{W'(N)}{W(N)}$, $\qquad d = \lim_{N \to \infty} \log W \left( \frac{1}{N} \frac{W}{W'} + c - 1 \right)$
  Message: the growth of the phase-space volume determines the entropy, and vice versa

  28. Examples
  • $W(N) = 2^N$ → $(c, d) = (1, 1)$ and the system is BG
  • $W(N) = N^b$ → $(c, d) = (1 - \frac{1}{b}, 0)$ and the system is Tsallis
  • $W(N) = \exp(\lambda N^\gamma)$ → $(c, d) = (1, \frac{1}{\gamma})$
  • ...
  One can explicitly verify these statements in the theory of binary processes and spin systems on networks
