rhythm and the enactive sense of extent and duration
Sha Xin Wei Synthesis Center + AME @ ASU Topological Media Lab @ Concordia
SLSA 161105 | synthesiscenter.net
Koyaanisqatsi, Godfrey Reggio, Philip Glass, Ron Fricke.
financial dynamics
non-anthropocentric temporality
multiscale rhythmanalytics
Media | Gesturing (bodies) | Organizations | Cities
parameter | predicate > effect
time > temporality (lived)
duration
felt experience
In my many years of research on how and when a mixture of sounds will blend or be heard as separate sounds, my own personal experience and those of my students has played a central role in deciding what to study and how to study it. When I encouraged students to spend a lot of time listening to the stimuli and trying out different patterns of sound to see which ones would show the effect we were interested in, far into the academic year, and nearing the time that they should have been carrying out their experiments, they would get nervous and ask when they would start doing the “real research”. I told them that what they were doing now was the real research, and the formal experiment with subjects and statistics was just to convince other people.
THE INHABITANT OR VISITOR
[R]eject as abstract any analysis of bodily space that considers …
… neither be conceived nor exist at all without [convention | context]. When I say that an object is on a table, I always place myself (in thought) in the table or the object, and I apply a category to them that in principle fits the relation between my body and external objects. …
… "on" is no longer distinguished from the word "under" or the term "next to …"
Merleau-Ponty, Phenomenology of Perception
grasping | touching ≠ pointing
One patient, asked to point to a part of his body, such as his nose, only succeeds if he is allowed to grasp it. If the patient is directed to interrupt the movement before it reaches its goal, or if he is [only permitted to touch his nose with a wooden] ruler, then the movement becomes impossible. It must thus be admitted that "grasping" and "touching" are different from "pointing," even for the body.
the body is not in space, it inhabits space
If my hand executes a complicated movement in the air, I do not have to add together all the movements in one direction and subtract the movements in the other in order to know its final position …
Merleau-Ponty, Phenomenology of Perception
If I possess the habit of driving a car, then I enter into a lane and see that “I can pass” without comparing the width of the lane to that of the fender, just as I go through a door without comparing the width of the door to that of my body. The … automobile [has] ceased to be objects whose size and volume would be determined through a comparison with other objects. They have become voluminous powers and the necessity of a certain free space
Merleau-Ponty, Phenomenology of Perception
man - cane - world
The blind man's cane has ceased to be an object for him, it is no longer perceived for itself; rather, the cane's furthest point is transformed into a sensitive zone, it increases the scope and the radius of the act [of touching and has become] analogous to a gaze. In the exploration of objects, the length of the cane does not explicitly intervene nor act as a middle term: the blind man knows its length by the position of the objects, rather than the position of the objects through the cane's length.
Merleau-Ponty, Phenomenology of Perception
habit ~ musician’s “muscle memory”
[Trained organist adapting to a strange organ]: Such a brief apprenticeship prohibits the assumption that new conditioned reflexes are simply substituted for the already established collection, unless, that is, they together form a system and if the change is global, but this would be to go beyond the mechanistic theory since in that case the reactions would be mediated by a total hold on the instrument.
Merleau-Ponty, Phenomenology of Perception
sense of extension | spatial relation
We report on how people acquire a sense of spatial relation to each other and to our media-rich environment. We call this sense spatiality, after Merleau-Ponty's treatment, which resorts neither to physical abstractions external to subjective experience, nor to purely mental phenomena.
The sense of space requires taking subjective experience as primary data rather than subject-independent measures (such as clock time or “objective” sensor data), which requires phenomenologically informed methodology.
If my arm is resting on the table, I will never think to say that it is next to the ashtray in the same way that the ashtray is next to the telephone. … [T]he body's parts relate to each other in a peculiar way: they are not laid out side by side … My hand … is not a collection of points. [allochiria example] … [T]he space of my hand is not a mosaic of spatial values. I hold my body as an indivisible possession …
Merleau-Ponty, Phenomenology of Perception
positional | situational body spatiality
I am not unaware of the location of my shoulders or my waist; rather, this awareness is enveloped in my awareness of my hands and my entire stance is read, so to speak, in how my hands lean upon the desk
Merleau-Ponty, Phenomenology of Perception
positional | situational body spatiality
If I hold my pipe in a closed hand, the position of my hand is not determined discursively by the angle that it makes with my forearm, my forearm with my arm, my arm with my torso and, finally, my torso with the ground. I have an absolute knowledge [and through] this I know where my hand is and where my body is …
Merleau-Ponty, Phenomenology of Perception
rhythm = inhomogeneous matter x body x relative movement
Embodiment: Delay audio
Embodiment: Delay video
Embodiment: Novel prosthesis
Lanterns
Correlation orientation
What constitutes a tick?
"The difference 'between' two things is only empirical, and the corresponding determinations are only extrinsic. … Lightning, for example, distinguishes itself from the black sky but must also trail it behind, as though it were distinguishing itself from that which does not distinguish itself from it [the lightning]."
Deleuze, Difference and Repetition, p. 28
Ouija: Delay
Ouija Experiments • Topological Media Lab Sha, Montanaro, et al. Hexagram Concordia Montreal 2006
( live: timespace : tml )
What is the measurable and the non-measurable? Isn't time (which seems to escape measure on account of its own fluidity) that which measures itself?
Henri Lefebvre, Rhythmanalysis: Space, Time and Everyday Life (2004)
The present tense is important.
Maturana & Varela: time as a dimension = description
Bergson: chain of instants is analytic artifact
"Time" as description

[A] mode of behavioral distinction between … has to do with the states of the organism and not with the ambient features which define the interaction, gives rise to a referential dimension. … [S]equence as a dimension is defined in the domain … network. …
Humberto Maturana and Francisco Varela, Autopoiesis and Cognition (1980), p. 133.
Similarly, the behavioral distinction by the … states of nervous activity, as he recursively interacts with them, constitutes the generation of time as a dimension of the descriptive domain. Accordingly, time is a dimension in the domain …
Humberto Maturana and Francisco Varela, Autopoiesis and Cognition (1980), p. 133.
“Time” as description
The apparent discontinuity of the psychical life is then due to our attention being fixed on it by a series of separate acts: actually there is only a gentle slope; but in following the broken line of our acts of attention, we think we perceive separate steps.
Henri Bergson, Creative Evolution (1907), tr. A. Mitchell (1911), p. 5
“Time” as uni-dimension
effect | lens | instrument ?
experiments apparatuses instruments
ensemble rhythmic coordination
Lanterns apparatus, rhythmic entrainment of ensembles, Garrett L Johnson, Synthesis Center ASU, Britta J
semblance typology of entrainments
Some English words are built from Greek or Latin prefixes for "same" or "together" or "with":
- synonym: together in meaning
- synorogeny: form boundaries
- conclude: close together
- symptosis: fall together
- symptomatic: happening together
- symphysis: growing together
- sympatria: in the same region
- symmetrophilic: lover of collecting in pairs
- synaleiphein: melt together
Adrian Freed CNMAT Berkeley Topological Media Lab
context is indeterminate, unbounded
[J Dewey, M Peckham]
Ligeti, Poème symphonique
texture >> dimension
texture >> relation
continuous variation
wind
http://lightingrhythm.weebly.com/
Event(s)
Metrical regularity
Unidimensional
Abstraction (form)
Sense data

Not perceived but apperceived
Not solely human
Texture >> dimension
Special case of temporality
SynthesisCenter.net | ArtsMediaEngineering.net Arizona State University | Phoenix
painted film
Stan Brakhage
moving mesh
Rudolf Laban
moving mesh
Maria Blaisse
multiscale ritual + resonance
Die Audio Gruppe: The Line, ICC Japan
Malcolm Sutherland, Forming Game | Jeu de formes
image > choreographed points in space
forest3 Chris Ziegler, AME, ASU
Synthesis ASU experimental iStage : Responsive Environments
Responsivity continuously adapting over two days, SLSA Nov 2017
SynthesisCenter.net
ArtsMediaEngineering.net | ASU | Phoenix
responsive environments
Topological Media Lab + Alkemie, Hexagram Blackbox, April 2013
sense >> form
Heartbeat Workshop: Training to frequency by ambient color. Teoma Naccarato & John MacCallum @
gestural media: light > sound
Die Audio Gruppe
correlating orientations
Mike Krzyzaniak, Julie Akerly, Jessica Rajko, Varsha Iyengar, Rushil Anirudh, Vinay Venkataraman, Pavan Turaga, Sha Xin Wei Synthesis Center, ASU, 2015
relationality token + context is inadequate to rhuthmos
how to interpret signal
making temporality: sense of time
Timespace Y Serita, M Fortin (TML); T Ingalls, C Rawls, B Lahey (Synthesis) GPU versions
hypercomplex signal correlation
a(t) and b(t) are quaternion-valued time series, where ∘ indicates correlation and τ is the time-lag between signals a and b
The cross-correlation of quaternion signals is thus [3]:

(a ∘ b)(τ) = Σ_{t=0}^{N−1} a(τ + t) b(t)
Krzyzaniak, Anirudh, Venkataraman, Turaga, Sha
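The lagged sum above can be sketched numerically. This is a minimal sketch, not the authors' implementation: the [w, x, y, z] storage convention, the function names, and the circular handling of the lag index are my own assumptions.

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of quaternions stored as (..., 4) arrays [w, x, y, z]."""
    w1, x1, y1, z1 = p[..., 0], p[..., 1], p[..., 2], p[..., 3]
    w2, x2, y2, z2 = q[..., 0], q[..., 1], q[..., 2], q[..., 3]
    return np.stack([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ], axis=-1)

def qcorr(a, b, tau):
    """(a ∘ b)(τ) = Σ_{t=0}^{N−1} a(τ + t) b(t); circular lag is an assumption."""
    N = b.shape[0]
    idx = (tau + np.arange(N)) % a.shape[0]  # wrap τ + t into a's index range
    return qmul(a[idx], b).sum(axis=0)
```

Because quaternion multiplication is non-commutative, the order a(τ + t) b(t) inside the sum matters; swapping the factors gives a different correlation.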
Sequence label and description:
Identical: Both dancers perform the basic movement sequence identically.
Dissimilar: One dancer performs the basic movement sequence while the other does something wholly unrelated (different in each instance).
Similar: One dancer performs the basic movement sequence while the other performs the basic movement sequence plus noise in the instrumented hands.
Lag: Identical movement, but one dancer leads the other by a few seconds.
Amplitude: One dancer performs the basic movement sequence, while the other performs a low-amplitude version of the same.
Eckmann-Ruelle cross approximate entropy
S.M. Pincus, PNAS 88 (1991), 2297-2301.
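For ensemble data like the dancer sequences above, cross approximate entropy compares m-point windows drawn from one signal against windows from the other. The sketch below follows the ApEn template developed in the Pincus excerpt that follows; the function names and the choice of the Chebyshev (max) metric are mine, and it assumes every count C_i is positive.

```python
import numpy as np

def _phi(u, v, m, r):
    """Φ^m(r): mean log fraction of m-windows of v within r of each m-window of u."""
    xu = np.lib.stride_tricks.sliding_window_view(u, m)
    xv = np.lib.stride_tricks.sliding_window_view(v, m)
    # Chebyshev distance between every window of u and every window of v
    d = np.max(np.abs(xu[:, None, :] - xv[None, :, :]), axis=-1)
    C = (d <= r).sum(axis=1) / len(xv)  # assumes every C_i > 0
    return np.log(C).mean()

def cross_apen(u, v, m=2, r=0.2):
    """Cross-ApEn(m, r, N) = Φ^m(r) − Φ^{m+1}(r), following the ApEn template."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    return _phi(u, v, m, r) - _phi(u, v, m + 1, r)
```

Low values indicate that patterns shared between the two signals tend to stay matched one step later (entrained ensembles); higher values indicate asynchrony.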
… [C]ommonly used entropy algorithms are given by the K-S entropy (8), K2 entropy [defined by Grassberger and Procaccia (10)], and a marginal redundancy algorithm given by Fraser (11). Wolf et al. (6) have provided the most commonly used algorithm for computing the Lyapunov spectra.

Other developments further confound a single intuition for each of these concepts. Hausdorff dimension, defined for a geometric object in an n-dimensional Euclidean space, can give fractional values. Mandelbrot (12) has named these nonintegral-dimension objects "fractals" and has extensively modeled … regularity, but precise settings and definitions vary greatly. Classically, it has been part of the modern quantitative development of information theory (13, 14). In ergodic theory, an entropy definition for a measure-preserving transformation was invented by Kolmogorov, originally to resolve the problem of whether two Bernoulli shifts are isomorphic (3). It is distinct from the concept of metric entropy, also invented by Kolmogorov (15), in which a purely metric definition is given. Ellis (16) discusses level 1, 2 (Kullback-Leibler), and 3 entropies, which assess the asymptotic behavior of large deviation probabilities.
Invariant measures have been studied apart from chaos throughout the last 40 years. Grenander (17) developed a theory of probabilities on algebraic structures, including laws … Lie groups involving these measures. Furstenberg (18) proved a strong law of large numbers for the norm of products … Subsequently Oseledets (5) proved the related result that a normalized limit of a product of random matrices, times its adjoint, converges to a nonnegative definite symmetric matrix; this result … is proved for random matrices in general, and it allows one to deduce the Lyapunov exponents as the eigenvalues of the limiting matrix. Pincus (19) analytically derived an explicit geometric condition for the invariant measures associated with certain classes of random matrices to be singular and "fractal-like," and a first term in an asymptotic expansion for the largest Lyapunov exponent in a Bernoulli random matrix setting (20). Thus noninteger dimensionality and the classification of system evolution by the Lyapunov spectra make sense in a stochastic environment.
The above discussion suggests that great care must be taken in concluding that properties true for one dimension or entropy formula are true for another, intuitively related, formula. Whether in stochastic or deterministic settings, in general it is not valid to infer the presence of an underlying deterministic system from the convergence of algorithms designed to encapsulate properties of invariant measures.
Correlation Dimension, and a Counterexample
A widely used dimension algorithm in data analysis is the correlation dimension (21). Fix m, a positive integer, and r, a positive real number. Given a time series of data u(1), u(2), …, u(N), from measurements equally spaced in time, form a sequence of vectors x(1), x(2), …, x(N − m + 1) in R^m, defined by x(i) = [u(i), u(i + 1), …, u(i + m − 1)]. Next, define for each i, 1 ≤ i ≤ N − m + 1,

C_i^m(r) = (number of j such that d[x(i), x(j)] ≤ r)/(N − m + 1). [1]

We must define d[x(i), x(j)] for vectors x(i) and x(j). We follow Takens (22) by defining

d[x(i), x(j)] = max_{k=1,2,…,m} |u(i + k − 1) − u(j + k − 1)|. [2]

From the C_i^m(r), define

C^m(r) = (N − m + 1)^(−1) Σ_{i=1}^{N−m+1} C_i^m(r) [3]

and define

β_m = lim_{r→0} lim_{N→∞} log C^m(r)/log r. [4]

The assertion is that for m sufficiently large, β_m is the correlation dimension. Such a limiting slope has been shown to exist for the commonly studied chaotic attractors.

This procedure has frequently been applied to experimental data; investigators seek a "scaling range" of r values for which log C^m(r)/log r is nearly constant for large m, and they infer that this ratio is the correlation dimension (21). In some instances, investigators have concluded that this procedure establishes deterministic chaos.
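Eqs. 1 to 3 translate directly into code. This is a sketch under stated assumptions, not the original implementation: the function names are mine, the metric is Takens' max metric of Eq. 2, and a least-squares slope over a chosen scaling range stands in for the double limit of Eq. 4.

```python
import numpy as np

def corr_integral(u, m, r):
    """C^m(r) of Eqs. 1-3: average fraction of m-vector pairs within r (max metric)."""
    x = np.lib.stride_tricks.sliding_window_view(np.asarray(u, float), m)
    n = len(x)  # n = N - m + 1
    d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=-1)
    return ((d <= r).sum(axis=1) / n).mean()

def corr_dim_slope(u, m, rs):
    """Estimate β_m as the slope of log C^m(r) vs log r over the scaling range rs."""
    logC = np.log([corr_integral(u, m, r) for r in rs])
    return np.polyfit(np.log(rs), logC, 1)[0]
```

The choice of scaling range rs is exactly the delicate step the text warns about: a "converged" slope over some range does not by itself establish determinism.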
The latter conclusion is not necessarily correct: a converged, finite correlation dimension value does not guarantee that the defining process is deterministic. Consider the following stochastic process. Fix 0 ≤ p ≤ 1. Define X_j = α^(−1/2) sin(2πj/12) for all j, where α is specified below. Define Y_j as a family of independent identically distributed (i.i.d.) real random variables, with uniform density on the interval [−√3, √3]. Define Z_j as a family of i.i.d. random variables, Z_j = 1 with probability p and Z_j = 0 with probability 1 − p. Define

α = (Σ_{j=1}^{12} sin²(2πj/12))/12, [5]

and define MIX_j = (1 − Z_j) X_j + Z_j Y_j. Intuitively, MIX(p) is generated by first ascertaining, for each j, whether the jth sample will be from the deterministic sine wave or from the random uniform deviate, with likelihood (1 − p) of the former choice, then calculating either X_j or Y_j. Increasing p marks a tendency towards greater system randomness.
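The MIX(p) construction can be sampled directly. The generator below is a sketch: the function name, the seeding interface, and the vectorization are mine; the normalization by α makes the sine component have unit variance over full periods, matching the unit-variance uniform noise.

```python
import numpy as np

def mix(p, N, rng=None):
    """Sample MIX_j = (1 − Z_j) X_j + Z_j Y_j, with α as in Eq. 5."""
    rng = np.random.default_rng(rng)
    j = np.arange(1, N + 1)
    alpha = np.mean(np.sin(2 * np.pi * np.arange(1, 13) / 12) ** 2)  # Eq. 5
    X = alpha ** -0.5 * np.sin(2 * np.pi * j / 12)   # normalized sine wave
    Y = rng.uniform(-np.sqrt(3), np.sqrt(3), N)      # i.i.d., mean 0, variance 1
    Z = rng.random(N) < p                            # Bernoulli(p) switch
    return np.where(Z, Y, X)
```

With p = 0 the output is the pure sine; with p = 1 it is pure uniform noise; intermediate p interpolates sample by sample.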
We now show that almost surely (a.s.) β_m in Eq. 4 equals 0 for all m for the MIX(p) process, p ≠ 1. Fix m, define k(j) = (12m)j − 12m, and define N_j = 1 if (MIX_{k(j)+1}, …, MIX_{k(j)+m}) = (X_1, …, X_m), N_j = 0 otherwise. The N_j are i.i.d. random variables, with the expected value of N_j, E(N_j), ≥ (1 − p)^m. By the Strong Law of Large Numbers, a.s.

lim_{N→∞} Σ_{j=1}^{N} N_j / N = E(N_j) ≥ (1 − p)^m.

Observe that (Σ_{j=1}^{N} N_j / (12mN))² is a lower bound to C^m(r), since x(k(i)+1) = x(k(j)+1) if N_i = N_j = 1. Thus, a.s. for r < 1

lim sup_{N→∞} log C^m(r)/log r ≤ log((1 − p)^{2m}/(12m)²)/log r.

Since (1 − p)^{2m}/(12m)² is independent of r, a.s. β_m = lim_{r→0} lim_{N→∞} log C^m(r)/log r = 0. Since β_m ≠ 0 with probability 0 for each m, by countable additivity, a.s. for all m, β_m = 0.
The MIX(p) process can be motivated by considering an autonomous unit that produces sinusoidal output, surrounded by a world of interacting processes that in ensemble produces … The extent to which the surrounding world interacts with the unit could be controlled by a gateway between the two, with a larger gateway admitting greater apparent noise to compete with the sinusoidal signal.
It is easy to show that, given a sequence X_j, a sequence of i.i.d. Y_j, defined by a density function and independent of the X_j, and Z_j = X_j + Y_j, then Z_j has an infinite correlation dimension.

2298 Mathematics: Pincus

bigger is more correlated
It appears that correlation dimension distinguishes between correlated and uncorrelated successive iterates, with larger estimates of dimension corresponding to more uncorrelated data. For a more complete interpretation, … correlated increments should be analyzed.
Error estimates in dimension calculations are commonly made by fitting a stochastic distribution to estimate misclassification probabilities. Without knowing the form of a distribution, or if the system is deterministic or stochastic, one must be suspicious of error estimates. It is tempting, upon seeing a noninteger dimension value, to give a fractal and chaotic interpretation to the result, but again, prior to a thorough study of the relationship between the Hausdorff dimension and the time-series formula labeled correlation dimension, it is speculation to draw conclusions from a noninteger correlation dimension value.
K-S Entropy and ApEn

Shaw (23) recognized that a measure of the rate of information generation of a chaotic system is a useful parameter. In 1983, Grassberger and Procaccia (10) developed a formula, motivated by the K-S entropy, to calculate such a rate from time-series data. Takens (22) varied this formula by introducing the distance metric given in Eq. 2, and Eckmann and Ruelle (8) modified the Takens formula to "directly" calculate the K-S entropy for the physical invariant measure presumed to underlie the data distribution. These formulas have become the "standard" entropy measures for use with time-series data. We next indicate the Eckmann-Ruelle (E-R) entropy formula, with the terminology as above.
Define

Φ^m(r) = (N − m + 1)^(−1) Σ_{i=1}^{N−m+1} log C_i^m(r). [6]

The E-R entropy is

E-R entropy = lim_{r→0} lim_{m→∞} lim_{N→∞} [Φ^m(r) − Φ^{m+1}(r)]. [7]

Note that

Φ^m(r) − Φ^{m+1}(r) = average over i of log[conditional probability that |u(j + m) − u(i + m)| ≤ r, given that |u(j + k) − u(i + k)| ≤ r for k = 0, 1, …, m − 1]. [8]

The E-R entropy and variations have been useful in classifying low-dimensional chaotic systems. In other contexts, its utility appears more limited, as it exhibits the statistical deficiencies noted in the Introduction. Since E-R entropy is infinity for a process with superimposed noise of any magnitude (7), for use with experimental data an approximation of Eq. 7 must be employed with a meaningful range of r; as seen below, a converged "entropy" calculation for a fixed value of r no longer ensures a deterministic system. Also, E-R entropy does not distinguish some processes that appear to differ in complexity; e.g., the E-R entropy for the MIX process is infinity for all p ≠ 0.

Fix m and r in Eq. 6; define

ApEn(m, r) = lim_{N→∞} [Φ^m(r) − Φ^{m+1}(r)]. [9]

Given N data points, we implement this formula by defining the statistic (introduced in ref. 7)

ApEn(m, r, N) = Φ^m(r) − Φ^{m+1}(r). [10]

Heuristically, E-R entropy and ApEn measure the (logarithmic) likelihood that runs of patterns that are close remain close on next incremental comparisons. ApEn can be computed for any time series, chaotic or otherwise. The intuition motivating ApEn is that if joint probability measures (for these "constructed" m-vectors) that describe each of two systems are different, then their marginal distributions on a fixed partition are likely different. We typically need orders of magnitude fewer points to estimate these marginals than to perform accurate density estimation on the fully reconstructed measure that defines the process.

A nonzero value for the E-R entropy ensures that a known deterministic system is chaotic, whereas ApEn cannot certify chaos; that guarantee is provided by E-R entropy and not by ApEn. Also, despite the algorithmic similarities, ApEn(m, r) is not intended as an approximate value of E-R entropy. In instances with a very large number of points, a low-dimensional attractor, and a large enough m, the two parameters may be nearly equal. It is essential to consider ApEn(m, r) as a family of formulas, and ApEn(m, r, N) as a family of statistics; system comparisons are intended with fixed m and r.

ApEn for m = 2

I demonstrate the utility of ApEn(2, r, 1000) by applying this statistic to two distinct settings: low-dimensional nonlinear deterministic systems and the MIX stochastic model.

(i) Three frequently studied systems: a Rossler model with superimposed noise, the Henon map, and the logistic map. Numerical evidence (24) suggests that the following system of equations, Ross(R), is chaotic for R = 1:

dx/dt = −z − y
dy/dt = x + 0.15y
dz/dt = 0.20 + R(zx − 5.0). [11]
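Eqs. 6, 9, and 10 give a direct implementation of the statistic ApEn(m, r, N). This sketch is mine, not the paper's code: it uses Takens' max metric (Eq. 2) and includes self-matches, which keeps every C_i positive as under Eq. 1's counting convention.

```python
import numpy as np

def apen(u, m=2, r=0.2):
    """ApEn(m, r, N) = Φ^m(r) − Φ^{m+1}(r), per Eqs. 6 and 10."""
    u = np.asarray(u, float)

    def phi(mm):
        x = np.lib.stride_tricks.sliding_window_view(u, mm)
        # Takens' max metric (Eq. 2) between all pairs of mm-vectors
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=-1)
        C = (d <= r).sum(axis=1) / len(x)  # C_i^m(r); self-match keeps C_i > 0
        return np.log(C).mean()            # Φ^m(r), Eq. 6

    return phi(m) - phi(m + 1)
```

Regular signals, whose close m-patterns stay close at the next point, score near 0; irregular signals score higher. As the text stresses, comparisons are meaningful only at fixed m and r.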
Time series were obtained for R = 0.7, 0.8, and 0.9 by integration via an explicit time-step method with increment 0.005. The y values were recorded at intervals of Δt = 0.5. Noise was superimposed on each y value by the addition of i.i.d. Gaussian random variables, mean 0, standard deviation 0.1. The respective system dynamics are given by noise superimposed on a twice-periodic, four-times-periodic, and chaotic limit cycle. The logistic map is given by

x_{i+1} = R x_i (1 − x_i). [12]

Time series were obtained for R = 3.5, 3.6, and 3.8. R = 3.5 produces periodic (period four) dynamics, and R = 3.6 and R = 3.8 produce chaotic dynamics. A parametrized version of the Henon map is given by

x_{i+1} = R y_i + 1 − 1.4 x_i²
y_{i+1} = 0.3 R x_i. [13]

Time series for x_i were obtained for R = 0.8 and 1.0, both of which correspond to chaotic dynamics. All series were generated after a transient period of 500 points. For each value of R and each system, ApEn(2, r, N) was calculated for time series of lengths 300, 1000, and 3000, for two values of r …; Table 1 shows the results. Notice that for each system the two choices of r were held constant, though the different systems had different r values. One can readily distinguish any Rossler output from Henon …
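The logistic-map settings quoted above are easy to reproduce. This sketch (the function name and starting point x0 = 0.5 are mine) discards the same 500-point transient the text uses, and exhibits the period-4 regime at R = 3.5.

```python
import numpy as np

def logistic_series(R, n, x0=0.5, transient=500):
    """Iterate x_{i+1} = R x_i (1 − x_i) (Eq. 12), discarding a transient."""
    x = x0
    for _ in range(transient):
        x = R * x * (1 - x)
    out = np.empty(n)
    for i in range(n):
        x = R * x * (1 - x)
        out[i] = x
    return out
```

At R = 3.5 the orbit settles onto a period-4 cycle, so an ApEn(2, r, N) computed on it is near 0; at R = 3.8 the series is chaotic and scores higher, consistent with the comparison the text describes.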
m window size
experiential enactive atmosphere
WARM CLOUDS, Embodied, ensemble, enactive steerable simulations, dense sensing dense media, 2016-2018 Synthesis: Brandon Mechtley, C. Roberts, Sha, with J Stein, C. Rawls, B Nandin, E Vasquez,
atmosphere workshop
Papers (Foerster, Navab, Vasquez, Dumont, Sha), Experiments, Workshops (Montreal, Berlin)
Serra Vegetal Life
Oana Suteu Khintirian, Todd Ingalls, Sha Xin Wei, Ginette Laurin, O Vertigo + Synthesis https://vimeo.com/synthesiscenter/serramay2018
time lenses: heterotemporality
Time Lenses 2016, J Stein, OS Khintirian, T. Ingalls, Sha X.W. at Les corps dessinant • EsPAS/ACTE/CNRS, Musée des arts et métiers
non-anthropocentric temporality
Koyaanisqatsi, Godfrey Reggio, Philip Glass, Ron Fricke.
Key Synthesis Affiliates Todd Ingalls Adam Nocek Emiddio Vasquez Kostalena Michelaki Pavan Turaga Julian Stein Adrian Freed Evan Montpellier Kristi Garboushian Prashanth Seshasayee Brandon Mechtley Althea Pergakis Gabriella Isaacs Lauren Hayes Robert LiKamWa Chris Roberts Andreas Spanias Garrett L Johnson Loren Olson Qiao Wang Connor Rawls Assegid Kidane Garth Paine Luke Kautz Ronald Broglio Katie Jung Brenda McCaffrey Helga Wild Maja Kuzmanovic Ruokun Chen Pete Weisman Byron Lahey Ian Shelanskey Matthew Briggs Rushil Anirudh Ben Nandin Caroline Fernandez Jessica Rajko Megan Patzem Sander van der Leeuw Chelsea Courtney John MacCallum Mike Krzyzaniak Shahab Sagheb Cheryl Marston Josh Gigantino Nayely Velez-Cruz Sylvia Chavez Chris Ziegler Josh Stark Navid Navab Tain Barzso Chris Zlaket Josée-Anne Drolet Nik Gaffney Tamara Underiner Christian Montoro Keyaanna Pausch Niklas Damiris Tom Hartmann Daisy Nolz Kevin Klawinski Nina Bouchard Varsha Iyengar Dehlia Hannah Kimberlee Swisher Oana Suteu Khintirian Vinay Venkataraman Ed Finn