rhythm and the enactive sense of extent and duration | SYNTHESIS • ASU

rhythm and the enactive sense of extent and duration


slide-1
SLIDE 1 SYNTHESIS • ASU SHA XIN WEI

rhythm and the enactive sense of extent and duration

Sha Xin Wei Synthesis Center + AME @ ASU Topological Media Lab @ Concordia

slide-2
SLIDE 2 Sha • SLSA 161105 synthesiscenter.net

Koyaanisqatsi, Godfrey Reggio, Philip Glass, Ron Fricke.

slide-3
SLIDE 3 Sha • SLSA 161105 synthesiscenter.net

financial dynamics

Koyaanisqatsi, Godfrey Reggio, Philip Glass, Ron Fricke.

slide-4
SLIDE 4 Sha • SLSA 161105 synthesiscenter.net

non-anthropocentric temporality

Koyaanisqatsi, Godfrey Reggio, Philip Glass, Ron Fricke.

slide-5
SLIDE 5 SYNTHESIS • ASU SHA XIN WEI

multiscale rhythmanalytics

Media
Gesturing (bodies)
Organizations
Cities


slide-6
SLIDE 6 SYNTHESIS • ASU SHA XIN WEI

parameter | predicate > effect
time > temporality (lived)
duration
felt experience

slide-7
SLIDE 7 SYNTHESIS • ASU SHA XIN WEI

subjectivity

In my many years of research on how and when a mixture of sounds will blend or be heard as separate sounds, my own personal experience and those of my students have played a central role in deciding what to study and how to study it. When I encouraged students to spend a lot of time listening to the stimuli and trying out different patterns of sound to see which ones would show the effect we were interested in, far into the academic year, and nearing the time that they should have been carrying out their experiments, they would get nervous and ask when they would start doing the “real research”. I told them that what they were doing now was the real research, and the formal experiment with subjects and statistics was just to convince other people.

Al Bregman, CIRMMT, McGill, 2008


THE INHABITANT OR VISITOR

slide-8
SLIDE 8 SYNTHESIS • ASU SHA XIN WEI

temporality ≠ time

slide-9
SLIDE 9 SYNTHESIS • ASU SHA XIN WEI

methodological

[R]eject as abstract any analysis of bodily space that considers only figures and points [sensor data], since figures and points can neither be conceived nor exist at all without [convention | context]. When I say that an object is on a table, I always place myself (in thought) in the table or the object, and I apply a category to them that in principle fits the relation between my body and external objects. Stripped of this anthropological contribution, the word “on” is no longer distinguished from the word “under” or the term “next to . . .”


Merleau-Ponty, Phenomenology of Perception

slide-10
SLIDE 10 SYNTHESIS • ASU SHA XIN WEI

grasping | touching ≠ pointing

One patient asked to point to a part of his body, such as his nose, only succeeds if he is allowed to grasp it. If the patient is directed to interrupt the movement before it reaches its goal, or if he is only allowed to touch his nose with a wooden ruler, then the movement becomes impossible. It must thus be admitted that “grasping” and “touching” are different from “pointing,” even for the body.


enaction

Merleau-Ponty, Phenomenology of Perception
slide-11
SLIDE 11 SYNTHESIS • ASU SHA XIN WEI

the body is not in space, it inhabits space

If my hand executes a complicated movement in the air, I do not have to add together all the movements in one direction and subtract the movements in the other in order to know its final position.



Merleau-Ponty, Phenomenology of Perception

slide-12
SLIDE 12 SYNTHESIS • ASU SHA XIN WEI

If I possess the habit of driving a car, then I enter into a lane and see that “I can pass” without comparing the width of the lane to that of the fender, just as I go through a door without comparing the width of the door to that of my body. The … automobile [have] ceased to be objects whose size and volume would be determined through a comparison with other objects. They have become voluminous powers and the necessity of a certain free space.


Merleau-Ponty, Phenomenology of Perception

slide-13
SLIDE 13 SYNTHESIS • ASU SHA XIN WEI

man - cane - world

The blind man’s cane has ceased to be an object for him, it is no longer perceived for itself; rather, the cane’s furthest point is transformed into a sensitive zone, it increases the scope and the radius of the act of touching and has become analogous to a gaze. In the exploration of objects, the length of the cane does not explicitly intervene nor act as a middle term: the blind man knows its length by the position of the objects, rather than the position of the objects through the cane’s length.


Merleau-Ponty, Phenomenology of Perception

slide-14
SLIDE 14 SYNTHESIS • ASU SHA XIN WEI

habit ~ musician’s “muscle memory”

[Trained organist adapting to a strange organ]: Such a brief apprenticeship prohibits the assumption that new conditioned reflexes are simply substituted for the already established collection, unless, that is, they together form a system and if the change is global, but this would be to go beyond the mechanistic theory since in that case the reactions would be mediated by a total hold on the instrument.


Merleau-Ponty, Phenomenology of Perception

slide-15
SLIDE 15 SYNTHESIS • ASU SHA XIN WEI

spatiality ≠ space

sense of extension | spatial relation

slide-16
SLIDE 16 SYNTHESIS • ASU SHA XIN WEI

We report on how people acquire a sense of spatial relation to each other and to our media rich environment. We call this sense spatiality, after Merleau-Ponty’s treatment which resorts neither to physical abstractions external to subjective experience, nor to purely mental phenomena.


slide-17
SLIDE 17 SYNTHESIS • ASU SHA XIN WEI

spatiality

The sense of space requires taking subjective experience as primary data rather than subject-independent measures (such as clock time or “objective” sensor data), which requires phenomenologically informed methodology.

  • Ex. auditory illusions, Al Bregman, CIRMMT, McGill 2008


slide-18
SLIDE 18 SYNTHESIS • ASU SHA XIN WEI

spatiality of own body

If my arm is resting on the table, I will never think to say that it is next to the ashtray in the same way that the ashtray is next to the telephone. …the body’s parts relate to each other in a peculiar way: they are not laid out side by side… My hand…is not a collection of points. [allochiria example] …the space of my hand is not a mosaic of spatial values. I hold my body as an indivisible possession…


Merleau-Ponty, Phenomenology of Perception

slide-19
SLIDE 19 SYNTHESIS • ASU SHA XIN WEI

positional situational body spatiality

I am not unaware of the location of my shoulders or my waist; rather, this awareness is enveloped in my awareness of my hands and my entire stance is read, so to speak, in how my hands lean upon the desk


Merleau-Ponty, Phenomenology of Perception

slide-20
SLIDE 20 SYNTHESIS • ASU SHA XIN WEI

positional situational body spatiality

if I hold my pipe in a closed hand, the position of my hand is not determined discursively by the angle that it makes with my forearm, my forearm with my arm, my arm with my torso and, finally, my torso with the ground. I have an absolute knowledge of where my pipe is, and from this I know where my hand is and where my body is…


Merleau-Ponty, Phenomenology of Perception

slide-21
SLIDE 21 SYNTHESIS • ASU SHA XIN WEI

intertwined temporality spatiality => study rhythm

slide-22
SLIDE 22 Sha • SLSA 161105 synthesiscenter.net

rhythm = inhomogeneous matter x body x relative movement

slide-23
SLIDE 23 SYNTHESIS • ASU SHA XIN WEI

experiments

Embodiment: Delay audio
Embodiment: Delay video
Embodiment: Novel prosthesis
Lanterns
Correlation orientation
slide-24
SLIDE 24 SYNTHESIS • ASU SHA XIN WEI

interval ?

What constitutes a tick?

slide-25
SLIDE 25 Sha • SLSA 161105 synthesiscenter.net

“The difference 'between' two things is only empirical, and the corresponding determinations are only extrinsic.…

Deleuze, Difference and Repetition, p. 28

slide-26
SLIDE 26 Sha • SLSA 161105 synthesiscenter.net

Lightning, for example, distinguishes itself from the black sky but must also trail it behind, as though it were distinguishing itself from that which does not distinguish itself from it [the lightning].”

Deleuze, Difference and Repetition, p. 28
slide-27
SLIDE 27 SYNTHESIS • ASU SHA XIN WEI

delay

slide-28
SLIDE 28
slide-29
SLIDE 29
slide-30
SLIDE 30 Sha • SLSA 161105 synthesiscenter.net

Ouija: Delay

Ouija Experiments • Topological Media Lab Sha, Montanaro, et al. Hexagram Concordia Montreal 2006

slide-31
SLIDE 31 SYNTHESIS • ASU SHA XIN WEI

demo: time filter

( live: timespace : tml )

slide-32
SLIDE 32 Sha • SLSA 161105 synthesiscenter.net

What is the measurable and the non- measurable? Isn't time (which seems to escape measure on account of its own fluidity) that which measures itself?

Henri Lefebvre, Rhythmanalysis: Space, Time and Everyday Life (2004)

The present tense is important.

slide-33
SLIDE 33 Sha • SLSA 161105 synthesiscenter.net

Maturana & Varela: time as a dimension = description
Bergson: chain of instants is analytic artifact

slide-34
SLIDE 34 SYNTHESIS • ASU SHA XIN WEI

“Time” as description

[A] mode of behavioral distinction between otherwise equivalent interactions in a domain that has to do with the states of the organism and not with the ambient features which define the interaction, gives rise to a referential dimension.… [S]equence as a dimension is defined in the domain of interactions of the organism, not in the operation of the nervous system as a closed neuronal network.…

Humberto Maturana and Francisco Varela, Autopoiesis and Cognition (1980), p. 133.

slide-35
SLIDE 35 SYNTHESIS • ASU SHA XIN WEI

Similarly, the behavioral distinction by the observer of sequential states in his recurrent states of nervous activity, as he recursively interacts with them, constitutes the generation of time as a dimension of the descriptive domain. Accordingly, time is a dimension in the domain of descriptions, not a feature of the ambience.

Humberto Maturana and Francisco Varela, Autopoiesis and Cognition (1980), p. 133.

“Time” as description

slide-36
SLIDE 36 SYNTHESIS • ASU SHA XIN WEI

The apparent discontinuity of the psychical life is then due to our attention being fixed on it by a series of separate acts: actually there is only a gentle slope; but in following the broken line of our acts of attention, we think we perceive separate steps.

Henri Bergson, Creative Evolution (1907), tr. A. Mitchell (1911), p. 5

“Time” as uni-dimension

slide-37
SLIDE 37 SYNTHESIS • ASU SHA XIN WEI

[ body ] [ matter ] [ form ] [ number ]

effect | lens | instrument ?

put in play

slide-38
SLIDE 38 SYNTHESIS • ASU SHA XIN WEI

experiments apparatuses instruments

slide-39
SLIDE 39 Sha • SLSA 161105 synthesiscenter.net

ensemble rhythmic coordination

Lanterns apparatus, rhythmic entrainment of ensembles, Garrett L Johnson, Synthesis Center ASU, Britta J


slide-40
SLIDE 40 SYNTHESIS • ASU SHA XIN WEI

semblance typology of entrainments

Some English words are built from Greek or Latin prefixes for “same”, “together”, or “with”:

synonym - together in meaning
synorogeny - form boundaries
conclude - close together
symptosis - fall together
symptomatic - happening together
symphysis - growing together
sympatria - in the same region
symmetrophilic - lover of collecting in pairs
synaleiphein - melt together


Adrian Freed CNMAT Berkeley Topological Media Lab

slide-41
SLIDE 41 SYNTHESIS • ASU SHA XIN WEI

context is indeterminate, unbounded

meaning of a sign is its response

[J Dewey, M Peckham]

slide-42
SLIDE 42 Sha • SLSA 161105 synthesiscenter.net

ligeti poème symphonique

slide-43
SLIDE 43 SYNTHESIS • ASU SHA XIN WEI

rhythm as a textural sense

texture >> dimension texture >> relation


slide-44
SLIDE 44 Sha • SLSA 161105 synthesiscenter.net

continuous variation

wind

slide-45
SLIDE 45 Sha • SLSA 161105 synthesiscenter.net

textural entrainment

http://lightingrhythm.weebly.com/

slide-46
SLIDE 46 Sha • SLSA 161105 synthesiscenter.net

textural entrainment

http://lightingrhythm.weebly.com/

delay structure

slide-47
SLIDE 47 SYNTHESIS • ASU SHA XIN WEI

rhythm

Event(s)
Metrical regularity
Unidimensional
Abstraction (form)
Sense data

Not perceived but apperceived
Not solely human

Texture >> dimension
Special case of temporality


slide-48
SLIDE 48 SYNTHESIS • ASU SHA XIN WEI

supplements

SynthesisCenter.net | ArtsMediaEngineering.net Arizona State University | Phoenix

slide-49
SLIDE 49 Sha • SLSA 161105 synthesiscenter.net

painted film

Stan Brakhage

slide-50
SLIDE 50 SYNTHESIS • ASU SHA XIN WEI

moving mesh

Rudolf Laban


slide-51
SLIDE 51 Sha • SLSA 161105 synthesiscenter.net

moving mesh

Maria Blaisse

slide-52
SLIDE 52 Sha • SLSA 161105 synthesiscenter.net

multiscale ritual + resonance

Die Audio Gruppe: The Line, ICC Japan

slide-53
SLIDE 53 Sha • SLSA 161105 synthesiscenter.net

Malcolm Sutherland, Forming Game | Jeu de formes

slide-54
SLIDE 54 Sha • SLSA 161105 synthesiscenter.net

image > choreographed points in space

forest3 Chris Ziegler, AME, ASU

slide-55
SLIDE 55 Sha • SLSA 161105 synthesiscenter.net

Synthesis ASU experimental iStage : Responsive Environments

Responsivity continuously adapting over two days, SLSA Nov 2017

slide-56
SLIDE 56 Sha • SLSA 161105 synthesiscenter.net

SynthesisCenter.net

ArtsMediaEngineering.net | ASU | Phoenix

slide-57
SLIDE 57

responsive environments

Topological Media Lab + Alkemie, Hexagram Blackbox, April 2013


slide-58
SLIDE 58 Sha • SLSA 161105 synthesiscenter.net

sense >> form

Heartbeat Workshop: Training to frequency by ambient color. Teoma Naccarato & John MacCallum @

slide-59
SLIDE 59 Sha • SLSA 161105 synthesiscenter.net

gestural media: light > sound

Die Audio Gruppe

slide-60
SLIDE 60 Sha • SLSA 161105 synthesiscenter.net

correlating orientations

Mike Krzyzaniak, Julie Akerly, Jessica Rajko, Varsha Iyengar, Rushil Anirudh, Vinay Venkataraman, Pavan Turaga, Sha Xin Wei Synthesis Center, ASU, 2015

slide-61
SLIDE 61 SYNTHESIS • ASU SHA XIN WEI

relationality token + context is inadequate to rhuthmos

how to interpret signal

slide-62
SLIDE 62 Sha • SLSA 161105 synthesiscenter.net


making temporality: sense of time

Timespace Y Serita, M Fortin (TML); T Ingalls, C Rawls, B Lahey (Synthesis) GPU versions

slide-63
SLIDE 63 SYNTHESIS • ASU SHA XIN WEI

hypercomplex signal correlation

a(t) and b(t) are quaternion-valued time series; ○ denotes correlation, and τ is the time lag between signals a and b. The correlation of quaternion signals is defined thus [3]:

(a ○ b)(τ) = Σ_{t=0}^{N−1} a(τ + t) b(t)
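A minimal sketch of this correlation in code, assuming quaternions stored as (w, x, y, z) rows combined with the Hamilton product. Following the slide's formula literally, b(t) is not conjugated, and the lagged index is taken cyclically so every τ is defined; both choices, and the function names, are assumptions rather than details given on the slide.

```python
import numpy as np

def qmul(a, b):
    """Hamilton product of quaternions stored as (..., 4) arrays (w, x, y, z)."""
    w1, x1, y1, z1 = a[..., 0], a[..., 1], a[..., 2], a[..., 3]
    w2, x2, y2, z2 = b[..., 0], b[..., 1], b[..., 2], b[..., 3]
    return np.stack([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ], axis=-1)

def qcorr(a, b, tau):
    """(a ∘ b)(tau) = sum over t of a(tau + t) b(t), with a indexed cyclically."""
    N = b.shape[0]
    idx = (tau + np.arange(N)) % N  # cyclic shift so every lag is defined
    return qmul(a[idx], b).sum(axis=0)
```

Sweeping tau over a range of lags then gives a quaternion-valued correlation function whose magnitude peaks where the two movement signals best align.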

slide-64
SLIDE 64 SYNTHESIS • ASU SHA XIN WEI

connectedness

Krzyzaniak, Anirudh, Venkataraman, Turaga, Sha


Sequence label: description

Identical: Both dancers perform the basic movement sequence identically.
Dissimilar: One dancer performs the basic movement sequence while the other does something wholly unrelated (different in each instance).
Similar: One dancer performs the basic movement sequence while the other performs the basic movement sequence plus noise in the instrumented hands.
Lag: Identical movement, but one dancer leads the other by a few seconds.
Amplitude: One dancer performs the basic movement sequence while the other performs a ‘small amplitude’ version of the same.

slide-65
SLIDE 65 SYNTHESIS • ASU SHA XIN WEI

Eckmann-Ruelle cross approximate entropy

S.M. Pincus, PNAS 88 (1991), 2297-2301.


The most commonly used entropy algorithms are given by the K-S entropy (8), K2 entropy [defined by Grassberger and Procaccia (10)], and a marginal redundancy algorithm given by Fraser (11). Wolf et al. (6) have provided the most commonly used algorithm for computing the Lyapunov spectra.

Other developments further confound a single intuition for each of these concepts. Hausdorff dimension, defined for a geometric object in an n-dimensional Euclidean space, can give fractional values. Mandelbrot (12) has named these nonintegral dimension objects “fractals” and has extensively modeled them. Intuitively, entropy addresses system randomness and regularity, but precise settings and definitions vary greatly. Classically, it has been part of the modern quantitative development of thermodynamics, statistical mechanics, and information theory (13, 14). In ergodic theory, an entropy definition for a measure-preserving transformation was invented by Kolmogorov, originally to resolve the problem of whether two Bernoulli shifts are isomorphic (3). It is distinct from the concept of metric entropy, also invented by Kolmogorov (15), in which a purely metric definition is given. Ellis (16) discusses level 1, 2 (Kullback-Leibler), and 3 entropies, which assess the asymptotic behavior of large deviation probabilities.

Invariant measures have been studied apart from chaos throughout the last 40 years. Grenander (17) developed a theory of probabilities on algebraic structures, including laws of large numbers and a central limit theorem for stochastic Lie groups involving these measures. Furstenberg (18) proved a strong law of large numbers for the norm of products of random matrices, in terms of the invariant measures. Subsequently Oseledets (5) proved the related result that a normalized limit of a product of random matrices, times its adjoint, converges to a nonnegative definite symmetric matrix. This latter result, often associated with dynamical systems, is proved for random matrices in general, and it allows one to deduce the Lyapunov exponents as the eigenvalues of the limiting matrix. Pincus (19) analytically derived an explicit geometric condition for the invariant measures associated with certain classes of random matrices to be singular and “fractal-like,” and a first term in an asymptotic expansion for the largest Lyapunov exponent in a Bernoulli random matrix setting (20). Thus noninteger dimensionality and the classification of system evolution by the Lyapunov spectra make sense in a stochastic environment.

The above discussion suggests that great care must be taken in concluding that properties true for one dimension or entropy formula are true for another, intuitively related, formula. Second, since invariant measures can arise from stochastic or deterministic settings, in general it is not valid to infer the presence of an underlying deterministic system from the convergence of algorithms designed to encapsulate properties of invariant measures.

Correlation Dimension, and a Counterexample

A widely used dimension algorithm in data analysis is the correlation dimension (21). Fix m, a positive integer, and r, a positive real number. Given a time series of data u(1), u(2), . . . , u(N), from measurements equally spaced in time, form a sequence of vectors x(1), x(2), . . . , x(N − m + 1) in R^m, defined by x(i) = [u(i), u(i + 1), . . . , u(i + m − 1)]. Next, define for each i, 1 ≤ i ≤ N − m + 1,

C_i^m(r) = (number of j such that d[x(i), x(j)] ≤ r)/(N − m + 1).   [1]

We must define d[x(i), x(j)] for vectors x(i) and x(j). We follow Takens (22) by defining

d[x(i), x(j)] = max_{k=1,2,...,m} (|u(i + k − 1) − u(j + k − 1)|).   [2]

From the C_i^m(r), define

C^m(r) = (N − m + 1)^{-1} Σ_{i=1}^{N−m+1} C_i^m(r),   [3]

and define

β_m = lim_{r→0} lim_{N→∞} log C^m(r)/log r.   [4]

The assertion is that for m sufficiently large, β_m is the correlation dimension. Such a limiting slope has been shown to exist for the commonly studied chaotic attractors.
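The construction in Eqs. 1-4 translates directly into code. A sketch of the finite-N estimate follows; the function names are illustrative, and since the double limit in Eq. 4 cannot be taken on finite data, the slope log C^m(r)/log r at fixed m and r stands in for β_m.

```python
import numpy as np

def corr_integral(u, m, r):
    """C^m(r) of Eq. 3: average over i of C_i^m(r) from Eq. 1,
    using the Takens max metric of Eq. 2."""
    u = np.asarray(u, dtype=float)
    n = len(u) - m + 1
    x = np.stack([u[i:i + m] for i in range(n)])           # x(i) of Eq. 1
    d = np.abs(x[:, None, :] - x[None, :, :]).max(axis=2)  # d[x(i), x(j)], Eq. 2
    Ci = (d <= r).sum(axis=1) / n                          # Eq. 1 (j = i included)
    return Ci.mean()                                       # Eq. 3

def dim_estimate(u, m, r):
    """Finite-sample slope log C^m(r) / log r, a stand-in for beta_m of Eq. 4."""
    return np.log(corr_integral(u, m, r)) / np.log(r)
```

In practice one evaluates the slope over a range of r values and looks for a scaling region, exactly as the next paragraph describes.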

This procedure has frequently been applied to experimental data; investigators seek a “scaling range” of r values for which log C^m(r)/log r is nearly constant for large m, and they infer that this ratio is the correlation dimension (21). In some instances, investigators have concluded that this procedure establishes deterministic chaos.

The latter conclusion is not necessarily correct: a converged, finite correlation dimension value does not guarantee that the defining process is deterministic. Consider the following stochastic process. Fix 0 ≤ p ≤ 1. Define X_j = α^{-1/2} sin(2πj/12) for all j, where α is specified below. Define Y_j as a family of independent identically distributed (i.i.d.) real random variables, with uniform density on the interval [−√3, √3]. Define Z_j as a family of i.i.d. random variables, Z_j = 1 with probability p, Z_j = 0 with probability 1 − p. Set

α = (Σ_{j=1}^{12} sin²(2πj/12))/12,   [5]

and define MIX_j = (1 − Z_j) X_j + Z_j Y_j. Intuitively, MIX(p) is generated by first ascertaining, for each j, whether the jth sample will be from the deterministic sine wave or from the random uniform deviate, with likelihood (1 − p) of the former choice, then calculating either X_j or Y_j. Increasing p marks a tendency towards greater system randomness.
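The MIX(p) process is straightforward to generate; a sketch assuming NumPy's default generator (the seed and function name are illustrative):

```python
import numpy as np

def mix(p, n, seed=0):
    """MIX(p) of Eq. 5: each sample is the unit-variance sine X_j with
    probability 1 - p, or an i.i.d. Uniform[-sqrt(3), sqrt(3)] deviate Y_j."""
    rng = np.random.default_rng(seed)
    j = np.arange(1, n + 1)
    alpha = np.sum(np.sin(2 * np.pi * np.arange(1, 13) / 12) ** 2) / 12  # Eq. 5
    X = alpha ** -0.5 * np.sin(2 * np.pi * j / 12)
    Y = rng.uniform(-np.sqrt(3), np.sqrt(3), n)
    Z = rng.random(n) < p           # Z_j = 1 with probability p
    return np.where(Z, Y, X)        # MIX_j = (1 - Z_j) X_j + Z_j Y_j
```

Here α works out to 1/2, so X_j = √2 sin(2πj/12); both the sine and the uniform deviates have unit variance, which is what makes the mixture a fair test case.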

We now show that almost surely (a.s.) β_m in Eq. 4 equals 0 for all m for the MIX(p) process, p ≠ 1. Fix m, define k(j) = (12m)j − 12m, and define N_j = 1 if (MIX_{k(j)+1}, . . . , MIX_{k(j)+m}) = (X_1, . . . , X_m), N_j = 0 otherwise. The N_j are i.i.d. random variables, with the expected value of N_j, E(N_j) ≥ (1 − p)^m. By the Strong Law of Large Numbers, a.s.

lim_{N→∞} Σ_{j=1}^{N} N_j/N = E(N_j) ≥ (1 − p)^m.

Observe that (Σ_{j=1}^{N} N_j/12mN)² is a lower bound to C^m(r), since x_{k(i)+1} = x_{k(j)+1} if N_i = N_j = 1. Thus, a.s. for r < 1,

lim sup_{N→∞} log C^m(r)/log r ≤ (1/log r) lim_{N→∞} log (Σ_{j=1}^{N} N_j/12mN)² ≤ log((1 − p)^{2m}/(12m)²)/log r.

Since (1 − p)^{2m}/(12m)² is independent of r, a.s. β_m = lim_{r→0} lim_{N→∞} log C^m(r)/log r = 0. Since β_m ≠ 0 with probability 0 for each m, by countable additivity, a.s. for all m, β_m = 0.

The MIX(p) process can be motivated by considering an autonomous unit that produces sinusoidal output, surrounded by a world of interacting processes that in ensemble produces output that resembles noise relative to the timing of the unit. The extent to which the surrounding world interacts with the unit could be controlled by a gateway between the two, with a larger gateway admitting greater apparent noise to compete with the sinusoidal signal.

It is easy to show that, given a sequence X_j, a sequence of i.i.d. Y_j, defined by a density function and independent of the X_j, and Z_j = X_j + Y_j, then Z_j has an infinite correlation dimension. It appears that correlation dimension distinguishes between correlated and uncorrelated successive iterates, with larger estimates of dimension corresponding to more uncorrelated data. For a more complete interpretation of correlation dimension results, stochastic processes with correlated increments should be analyzed.

bigger is more correlated

Error estimates in dimension calculations are commonly seen. In statistics, one presumes a specified underlying stochastic distribution to estimate misclassification probabilities. Without knowing the form of a distribution, or if the system is deterministic or stochastic, one must be suspicious of error estimates. There often appears to be a desire to establish a noninteger dimension value, to give a fractal and chaotic interpretation to the result, but again, prior to a thorough study of the relationship between the geometric Hausdorff dimension and the time-series formula labeled correlation dimension, it is speculation to draw conclusions from a noninteger correlation dimension value.

K-S Entropy and ApEn

Shaw (23) recognized that a measure of the rate of information generation of a chaotic system is a useful parameter. In 1983, Grassberger and Procaccia (10) developed a formula, motivated by the K-S entropy, to calculate such a rate from time-series data. Takens (22) varied this formula by introducing the distance metric given in Eq. 2; and Eckmann and Ruelle (8) modify the Takens formula to “directly” calculate the K-S entropy for the physical invariant measure presumed to underlie the data distribution. These formulas have become the “standard” entropy measures for use with time-series data. We next indicate the Eckmann-Ruelle (E-R) entropy formula, with the terminology as above.

Define

Φ^m(r) = (N − m + 1)^{-1} Σ_{i=1}^{N−m+1} log C_i^m(r).   [6]

E-R entropy = lim_{r→0} lim_{m→∞} lim_{N→∞} [Φ^m(r) − Φ^{m+1}(r)].   [7]

Note that

Φ^m(r) − Φ^{m+1}(r) = average over i of log[conditional probability that |u(j + m) − u(i + m)| ≤ r, given that |u(j + k) − u(i + k)| ≤ r for k = 0, 1, . . . , m − 1].   [8]

The E-R entropy and variations have been useful in classifying low-dimensional chaotic systems. In other contexts, its utility appears more limited, as it exhibits the statistical deficiencies noted in the Introduction. Since E-R entropy is infinity for a process with superimposed noise of any magnitude (7), for use with experimental data an approximation of Eq. 7 must be employed with a meaningful range of “r” (vector comparison distance) established. As we see below, a converged “entropy” calculation for a fixed value of r no longer ensures a deterministic system. Also, E-R entropy does not distinguish some processes that appear to differ in complexity; e.g., the E-R entropy for the MIX process is infinity, for all p ≠ 0.

Fix m and r in Eq. 6; define

ApEn(m, r) = lim_{N→∞} [Φ^m(r) − Φ^{m+1}(r)].   [9]

Given N data points, we implement this formula by defining the statistic (introduced in ref. 7)

ApEn(m, r, N) = Φ^m(r) − Φ^{m+1}(r).   [10]

Heuristically, E-R entropy and ApEn measure the (logarithmic) likelihood that runs of patterns that are close remain close on next incremental comparisons. ApEn can be computed for any time series, chaotic or otherwise. The intuition motivating ApEn is that if joint probability measures (for these “constructed” m-vectors) that describe each of two systems are different, then their marginal distributions on a fixed partition are likely different. We typically need orders of magnitude fewer points to accurately estimate these marginals than to perform accurate density estimation on the fully reconstructed measure that defines the process.

A nonzero value for the E-R entropy ensures that a known deterministic system is chaotic, whereas ApEn cannot certify chaos. This observation appears to be the primary insight provided by E-R entropy and not by ApEn. Also, despite the algorithm similarities, ApEn(m, r) is not intended as an approximate value of E-R entropy. In instances with a very large number of points, a low-dimensional attractor, and a large enough m, the two parameters may be nearly equal. It is essential to consider ApEn(m, r) as a family of formulas, and ApEn(m, r, N) as a family of statistics; system comparisons are intended with fixed m and r.

ApEn for m = 2

I demonstrate the utility of ApEn(2, r, 1000) by applying this statistic to two distinct settings, low-dimensional nonlinear deterministic systems and the MIX stochastic model.

(i) Three frequently studied systems: a Rossler model with superimposed noise, the Henon map, and the logistic map. Numerical evidence (24) suggests that the following system of equations, Ross(R), is chaotic for R = 1:

dx/dt = −z − y
dy/dt = x + 0.15y
dz/dt = 0.20 + z(Rx − 5.0).   [11]

Time series were obtained for R = 0.7, 0.8, and 0.9 by integration via an explicit time-step method with increment 0.005. The y values were recorded at intervals of Δt = 0.5. Noise was superimposed on each y value by the addition of i.i.d. Gaussian random variables, mean 0, standard deviation 0.1. The respective system dynamics are given by noise superimposed on a twice-periodic, four-times-periodic, and chaotic limit cycle. The logistic map is given by

x_{i+1} = R x_i (1 − x_i).   [12]

Time series were obtained for R = 3.5, 3.6, and 3.8. R = 3.5 produces periodic (period four) dynamics, and R = 3.6 and R = 3.8 produce chaotic dynamics. A parametrized version of the Henon map is given by

x_{i+1} = R y_i + 1 − 1.4 x_i²
y_{i+1} = 0.3 R x_i.   [13]

Time series for x_i were obtained for R = 0.8 and 1.0, both of which correspond to chaotic dynamics. All series were generated after a transient period of 500 points. For each value of R and each system, ApEn(2, r, N) was calculated for time series of lengths 300, 1000, and 3000, for two values of r. The sample means and standard deviations were also calculated for each system. Table 1 shows the results.

Notice that for each system, the two choices of r were constant, though the different systems had different r values. One can readily distinguish any Rossler output from Henon output, or from logistic output, on the basis of the quite
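Eqs. 6, 9, and 10 combine into the ApEn(m, r, N) statistic directly. A minimal sketch follows: self-matches are counted in C_i^m(r), as in the definition above, which keeps the logarithm finite; this is not claimed to reproduce Pincus's exact numerics.

```python
import numpy as np

def _phi(u, m, r):
    """Phi^m(r) of Eq. 6: mean over i of log C_i^m(r), Takens max metric of Eq. 2."""
    n = len(u) - m + 1
    x = np.stack([u[i:i + m] for i in range(n)])
    d = np.abs(x[:, None, :] - x[None, :, :]).max(axis=2)
    Ci = (d <= r).sum(axis=1) / n   # self-match included, so C_i > 0
    return np.log(Ci).mean()

def apen(u, m, r):
    """ApEn(m, r, N) = Phi^m(r) - Phi^{m+1}(r), Eq. 10."""
    u = np.asarray(u, dtype=float)
    return _phi(u, m, r) - _phi(u, m + 1, r)
```

Regular series score near 0 and irregular series score higher, which is exactly the ordering the MIX(p) experiment above exploits as p grows.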

  • Proc. Natl. Acad. Sci. USA 88 (1991)

2299

Xi, and Zj = Xj + Yj, then Zj has an infinite correlation

dimension.

It appears that correlation dimension distin-

guishes between correlated and uncorrelated successive it-

erates, with larger estimates of dimension corresponding to

more uncorrelated data. For a more complete interpretation

  • f correlation dimension results, stochastic processes with

correlated increments should be analyzed.

Error estimates in dimension calculations are commonly

  • seen. In statistics, one presumes a specified underlying sto-

chastic distribution to estimate misclassification probabilities.

Without knowing the form of a distribution, or if the system is

deterministic or stochastic, one must be suspicious of error

  • estimates. There often appears to be a desire to establish a

noninteger dimension value, to give a fractal and chaotic

interpretation to the result, but again, prior to a thorough study

  • f the relationship between the geometric Hausdorff dimen-

sion and the time series formula labeled correlation dimension,

it is speculation to draw conclusions from a noninteger cor-

relation dimension value.

K-S Entropy and ApEn

Shaw (23) recognized that a measure of the rate of information generation of a chaotic system is a useful parameter. In 1983, Grassberger and Procaccia (10) developed a formula, motivated by the K-S entropy, to calculate such a rate from time-series data. Takens (22) varied this formula by introducing the distance metric given in Eq. 2; and Eckmann and Ruelle (8) modify the Takens formula to "directly" calculate the K-S entropy for the physical invariant measure presumed to underlie the data distribution. These formulas have become the "standard" entropy measures for use with time-series data. We next indicate the Eckmann-Ruelle (E-R) entropy formula, with the terminology as above. Define

Φ^m(r) = (N − m + 1)^(−1) Σ_{i=1}^{N−m+1} log C_i^m(r).   [6]

E-R entropy = lim_{r→0} lim_{m→∞} lim_{N→∞} [Φ^m(r) − Φ^{m+1}(r)].   [7]

Note that

Φ^m(r) − Φ^{m+1}(r) = average over i of log[conditional probability that |u(j + m) − u(i + m)| ≤ r, given that |u(j + k) − u(i + k)| ≤ r for k = 0, 1, ..., m − 1].   [8]

The E-R entropy and variations have been useful in classifying low-dimensional chaotic systems. In other contexts, its utility appears more limited, as it exhibits the statistical deficiencies noted in the Introduction. Since E-R entropy is infinity for a process with superimposed noise of any magnitude (7), for use with experimental data an approximation of Eq. 7 must be employed with a meaningful range of "r" (vector comparison distance) established. As we see below, a converged "entropy" calculation for a fixed value of r no longer ensures a deterministic system. Also, E-R entropy does not distinguish some processes that appear to differ in complexity; e.g., the E-R entropy for the MIX process is infinity for all p ≠ 0.

Fix m and r in Eq. 6; define

ApEn(m, r) = lim_{N→∞} [Φ^m(r) − Φ^{m+1}(r)].   [9]

Given N data points, we implement this formula by defining the statistic (introduced in ref. 7)

ApEn(m, r, N) = Φ^m(r) − Φ^{m+1}(r).   [10]

Heuristically, E-R entropy and ApEn measure the (logarithmic) likelihood that runs of patterns that are close remain close on next incremental comparisons. ApEn can be computed for any time series, chaotic or otherwise. The intuition motivating ApEn is that if joint probability measures (for these "constructed" m-vectors) that describe each of two systems are different, then their marginal distributions on a fixed partition are likely different. We typically need orders of magnitude fewer points to accurately estimate these marginals than to perform accurate density estimation on the fully reconstructed measure that defines the process.

A nonzero value for the E-R entropy ensures that a known deterministic system is chaotic, whereas ApEn cannot certify chaos. This observation appears to be the primary insight provided by E-R entropy and not by ApEn. Also, despite the algorithm similarities, ApEn(m, r) is not intended as an approximate value of E-R entropy. In instances with a very large number of points, a low-dimensional attractor, and a large enough m, the two parameters may be nearly equal. It is essential to consider ApEn(m, r) as a family of formulas, and ApEn(m, r, N) as a family of statistics; system comparisons are intended with fixed m and r.

ApEn for m = 2

I demonstrate the utility of ApEn(2, r, 1000) by applying this statistic to two distinct settings, low-dimensional nonlinear deterministic systems and the MIX stochastic model.

(i) Three frequently studied systems: a Rossler model with superimposed noise, the Henon map, and the logistic map. Numerical evidence (24) suggests that the following system of equations, Ross(R), is chaotic for R = 1:

dx/dt = −z − y
dy/dt = x + 0.15y
dz/dt = 0.20 + z(Rx − 5.0).   [11]
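A minimal NumPy sketch of the time-series generation described next (explicit Euler steps of size 0.005, y recorded every Δt = 0.5, gaussian noise of standard deviation 0.1, a 500-point transient discarded). The function name, initial condition, and seed are illustrative assumptions, and the dz/dt term is read as z(Rx − 5.0), the standard Rossler form that reduces to z(x − 5.0) at R = 1:

```python
import numpy as np

def rossler_y_series(R, n, dt=0.005, sample_dt=0.5, noise_sd=0.1,
                     transient=500, seed=0):
    """Explicit-Euler integration of Ross(R) (Eq. 11): return n noisy
    y-samples taken every sample_dt after discarding a transient."""
    rng = np.random.default_rng(seed)
    x, y, z = 0.0, 1.0, 0.0            # assumed initial condition
    steps = round(sample_dt / dt)      # Euler steps between samples
    out = np.empty(n)
    for k in range(transient + n):
        for _ in range(steps):
            dx = -z - y
            dy = x + 0.15 * y
            dz = 0.20 + z * (R * x - 5.0)
            x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        if k >= transient:
            # record y with i.i.d. gaussian noise superimposed
            out[k - transient] = y + rng.normal(0.0, noise_sd)
    return out
```

With R set to 0.7, 0.8, or 0.9 this produces the noisy twice-periodic, four-times-periodic, or chaotic series compared in Table 1.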

Time series were obtained for R = 0.7, 0.8, and 0.9 by

integration via an explicit time-step method with increment

0.005. The y values were recorded at intervals of Δt = 0.5.

Noise was superimposed on each y value by the addition of

i.i.d. gaussian random variables, mean 0, standard deviation

0.1. The respective system dynamics are given by noise

superimposed on a twice-periodic, four-times-periodic, and

chaotic limit cycle. The logistic map is given by

x_{i+1} = Rx_i(1 − x_i).   [12]

Time series were obtained for R = 3.5, 3.6, and 3.8. R = 3.5

produces periodic (period four) dynamics, and R = 3.6 and R

= 3.8 produce chaotic dynamics. A parametrized version of

the Henon map is given by

x_{i+1} = Ry_i + 1 − 1.4x_i^2
y_{i+1} = 0.3Rx_i.   [13]

Time series for xi were obtained for R = 0.8 and 1.0, both of

which correspond to chaotic dynamics. All series were

generated after a transient period of 500 points. For each value of R and each system, ApEn(2, r, N) was calculated for time series of lengths 300, 1000, and 3000, for two values of r. The sample means and standard deviations were also calculated for each system. Table 1 shows the results.
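The Table 1 computation can be sketched end to end for the logistic map: implement ApEn(m, r, N) directly from Eqs. 6 and 10 (max-norm vector distance, natural logarithm, self-matches included), iterate the map of Eq. 12 past a 500-point transient, and compare a periodic against a chaotic parameter. The function names, the initial condition, and the choice r = 0.025 are assumptions for illustration, not necessarily the Table 1 settings:

```python
import numpy as np

def apen(u, m, r):
    """ApEn(m, r, N) = Φ^m(r) − Φ^{m+1}(r), per Eqs. 6 and 10."""
    u = np.asarray(u, dtype=float)
    N = len(u)
    def phi(mm):
        # constructed vectors x(i) = [u(i), ..., u(i+mm-1)]
        x = np.array([u[i:i + mm] for i in range(N - mm + 1)])
        # C_i^mm(r): fraction of vectors within max-norm distance r
        dist = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        C = np.mean(dist <= r, axis=1)
        return np.mean(np.log(C))
    return phi(m) - phi(m + 1)

def logistic_series(R, n, transient=500, x0=0.5):
    """Iterate x_{i+1} = R x_i (1 - x_i) (Eq. 12), discarding a transient."""
    x = x0
    for _ in range(transient):
        x = R * x * (1.0 - x)
    out = np.empty(n)
    for i in range(n):
        x = R * x * (1.0 - x)
        out[i] = x
    return out

r = 0.025  # assumed comparison distance
periodic = apen(logistic_series(3.5, 1000), 2, r)  # period-4 dynamics
chaotic = apen(logistic_series(3.8, 1000), 2, r)   # chaotic dynamics
```

The period-4 series yields ApEn near 0 (runs of close patterns always remain close), while the chaotic series yields a markedly larger value, so ApEn(2, r, 1000) separates the two regimes as in Table 1.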

Notice that for each system, the two choices of r were

constant, though the different systems had different r values.

One can readily distinguish any Rossler output from Henon output, or from logistic output, on the basis of the quite

Mathematics: Pincus


slide-66
SLIDE 66 Sha • SLSA 161105 synthesiscenter.net

experiential enactive atmosphere

WARM CLOUDS, Embodied, ensemble, enactive steerable simulations, dense sensing dense media, 2016-2018 Synthesis: Brandon Mechtley, C. Roberts, Sha, with J Stein, C. Rawls, B Nandin, E Vasquez,

slide-67
SLIDE 67

atmosphere workshop

Papers (Foerster, Navab, Vasquez, Dumont, Sha), Experiments, Workshops (Montreal, Berlin)

67

slide-68
SLIDE 68

Serra Vegetal Life

Oana Suteu Khintirian, Todd Ingalls, Sha Xin Wei, Ginette Laurin, O Vertigo + Synthesis https://vimeo.com/synthesiscenter/serramay2018

68

slide-69
SLIDE 69

time lenses: heterotemporality

Time Lenses 2016, J Stein, OS Khintirian, T. Ingalls, Sha X.W. at Les corps dessinant • EsPAS/ACTE/CNRS, Musée des arts et métiers

69

slide-70
SLIDE 70 Sha • SLSA 161105 synthesiscenter.net

non-anthropocentric temporality

Koyaanisqatsi, Godfrey Reggio, Philip Glass, Ron Fricke.

slide-71
SLIDE 71 SHA • SYNTHESIS • ASU ARTS, MEDIA + ENGINEERING

thanks to

71

Key Synthesis Affiliates Todd Ingalls Adam Nocek Emiddio Vasquez Kostalena Michelaki Pavan Turaga Julian Stein Adrian Freed Evan Montpellier Kristi Garboushian Prashanth Seshasayee Brandon Mechtley Althea Pergakis Gabriella Isaacs Lauren Hayes Robert LiKamWa Chris Roberts Andreas Spanias Garrett L Johnson Loren Olson Qiao Wang Connor Rawls Assegid Kidane Garth Paine Luke Kautz Ronald Broglio Katie Jung Brenda McCaffrey Helga Wild Maja Kuzmanovic Ruokun Chen Pete Weisman Byron Lahey Ian Shelanskey Matthew Briggs Rushil Anirudh Ben Nandin Caroline Fernandez Jessica Rajko Megan Patzem Sander van der Leeuw Chelsea Courtney John MacCallum Mike Krzyzaniak Shahab Sagheb Cheryl Marston Josh Gigantino Nayely Velez-Cruz Sylvia Chavez Chris Ziegler Josh Stark Navid Navab Tain Barzso Chris Zlaket Josée-Anne Drolet Nik Gaffney Tamara Underiner Christian Montoro Keyaanna Pausch Niklas Damiris Tom Hartmann Daisy Nolz Kevin Klawinski Nina Bouchard Varsha Iyengar Dehlia Hannah Kimberlee Swisher Oana Suteu Khintirian Vinay Venkataraman Ed Finn