SLIDE 1

Chemical Carnot cycles, Landauer’s Principle, and the Thermodynamics of Natural Selection

SLIDE 2

Abstract

It has seemed inescapable to many investigators, from Brillouin and Schroedinger onward, that life should be understood as a chemical system in which the flow and storage of energy are related to the flow and storage of information. However, the appropriate definition of information, and the manner in which its storage or flow may be limited by energetics, are far from understood even today. The separation of timescales from metabolic to evolutionary processes, and the flow of constraint and control between them, make an analysis from the details exceedingly difficult, and leave unanswered the question of what options may be open to evolutionary innovation. The second law of thermodynamics gives some universal, albeit limited, constraints on information flow, but these are difficult to interpret directly in chemical terms. In this talk, I show how a decomposition similar to the Carnot decomposition for heat engines may be performed in chemistry, to give a general calculus and interpretation of chemical information limits. I then show that the decomposition corresponds, step for step, to the familiar Landauer derivation of the limits on computation, giving us an operational interpretation of life as a computational process, and using chemistry to clarify certain assumptions (appropriately) made by Landauer. The elementary application of such results is to metabolism and growth, but for variety I show that it also provides an illuminating analysis of a clever application of Shannon’s theorem to the problem of reliable sequence recognition, proposed by Tom Schneider.

SLIDE 3

Outline

  • The subtle task of asking sensible questions about information in the biosphere

  • Sample questions, difficulties, paradoxes
  • What role for equilibrium reasoning?
  • The chemical Carnot construction
  • The relation to computation
SLIDE 4

Big and little questions

  • (Big) How does energy flow limit the informational state of the biosphere? (Requires a theory of biological decay)
  • (Little) How does energy flow limit the change in information in the biosphere? (Can get this from equilibrium thermodynamics)
  • (Similar questions can be asked about individuals, species, etc., as about the whole biosphere)

SLIDE 5

The obvious (little) answer

  • Follows from dimensional analysis and the definition of temperature

  • Information gain should be entropy loss
  • Heat is entropy carried by energy
  • Work is an entropy-less energy source

In what senses is such an answer useful? wrong? irrelevant?

dW = dQ = −T dS ≡ T k_B dI

SLIDE 6

Intuition about energy and information

Configuration-space volume V and description length L: V = e^L.

For system plus reservoir, if independent:

V_sys+res = V_sys × V_res and L_sys+res = L_sys + L_res

If exchanged energy is the constraint, the overall maximum of L_sys+res is at

∂L_sys/∂(Energy) = ∂L_res/∂(Energy) ≡ 1/(k_B T)

SLIDE 7

Entropy and information (about units)

Simplify our notation with more natural and sensible energy units:

k_B T ≡ τ
S_X/k_B ≡ σ_X
T S_X = τ σ_X

Traditionally, chemists recognized description length as entropy:

Desc. Length: L = S/k_B, so ∂S/∂U ≡ 1/T becomes ∂(Desc. Length)/∂(Energy) = 1/(k_B T)

Gain information by reducing description length: dI = −dσ

The relation between energy and information then takes a simpler form:

dW = dQ = −τ dσ = τ dI
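As a minimal numeric sketch of this relation (assuming room temperature, T = 300 K), the work to gain one bit of information:

    # Minimal work to gain one bit, dW = tau*dI with tau = k_B*T.
    # Assumes T = 300 K; description length is counted in nats.
    import math

    k_B = 1.380649e-23            # Boltzmann constant, J/K
    T = 300.0                     # assumed room temperature, K
    tau = k_B * T                 # natural energy unit

    dI = math.log(2)              # one bit, in nats
    dW = tau * dI                 # minimal work, joules

    print(f"tau = k_B*T = {tau:.3e} J")
    print(f"work per bit = {dW:.3e} J")   # ~2.87e-21 J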

SLIDE 8

Moving information around

  • Suppose you want to go from one configuration to a more ordered one:

Heat = (∂Energy/∂Entropy) × d(Description length)
dQ = k_B T × dS

But if these variables didn’t have thermal energy to give:

d(Work) = Heat
dW = dQ

Work that must be brought in from outside.

SLIDE 9
  • I. The complex problem of thinking about information in the biosphere
  • Many levels, separation of timescales, and flow of constraint and control make assembling from the molecules very hard
  • Which information? Genes? Heats?
  • Which building process? Metabolism? Natural selection?
  • What level? Individuals? Ecosystems? Biosphere?

SLIDE 10

How I think about these talks

  • I am not mainly concerned with any one application
  • In many ways, this work will fall short of answering any of them adequately
  • I want a framework that is at least compatible with answering these questions
  • I will try to use examples to identify useful ways of thinking

SLIDE 11

Control flows and error correction

  • Long-lived states “control” faster processes
  • “Errors” removed by both control and selection
  • References are contained in both system and environment

Characteristic timescales:

transcription/translation ∼ 10^1 − 10^2 s
catalysis ∼ 10^−6 s
assembly, interactions ∼ 10^−3 − 10^2 s
reproduction, death ∼ 10^3 − 10^8 s
regulation, plasticity ∼ 10^1 − 10^6 s
allosteric regulation ∼ 10^−3 − 10^0 s

SLIDE 12

A paradox: What price for evolution?

∆W = ∆F_pure − ∆F_mixed ∼ k_B T [∆S^(comb)_pure − ∆S^(comb)_mixed]

  • genome entropy ∼ 10^6 − 10^8, so the selection cost is ∼ k_B T per genome bit (intensive)

whereas:

  • metabolic energies are ∼ k_B T per monomer of biomass, at M ∼ 10 g/mol (extensive)
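A rough numeric sketch of the mismatch, with assumed mid-range values (10^7 bits of genome entropy, T = 300 K, 1 g of biomass at the ~10 g/mol monomer scale above):

    # Intensive selection cost vs. extensive metabolic scale (toy numbers).
    import math

    k_B, N_A, T = 1.380649e-23, 6.02214076e23, 300.0

    # Work to select one genome out of the mixed ensemble (intensive):
    genome_bits = 1e7                          # slide's 1e6-1e8 range
    W_selection = k_B * T * genome_bits * math.log(2)
    print(f"selection cost ~ {W_selection:.2e} J")          # ~3e-14 J

    # k_B*T per monomer across 1 g of biomass (extensive):
    n_monomers = 1.0 / 10.0 * N_A              # 1 g at ~10 g/mol
    print(f"k_B*T over biomass ~ {k_B * T * n_monomers:.2e} J")  # ~2.5e2 J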

SLIDE 13

Extensive and intensive entropies?

  • Scaling relations suggest that physiology limits memory systems
  • Heat of formation is like a heat of phase transition
  • Adaptive (species-level) information behaves like a global order parameter

Cavalier-Smith, Annals of Botany 95: 147-175 (2005)

SLIDE 14

The motivation to think about bounds rather than models

  • Bounds from reversible processes also constrain irreversible ones
  • Reversible-process bounds can be aggregated through state variables; irreversible models usually cannot be
  • Bounds supersede models, unknown innovations, and ignorance of details

SLIDE 15

The challenge of using equilibrium information for the biosphere

  • Life involves kinetics as well as energetics
  • Our biosphere could (?) be a “frozen accident”
  • Only if barriers are small enough that energy flow is limiting is information a relevant constraint
  • But such limits can be suggested in surprising places...

SLIDE 16

Allometric scaling of growth

West G.B., Brown J.H. & Enquist B.J. (2001) A general model for ontogenetic growth. Nature 413, 628-631.

Energy balance in ontogenetic growth:

B_0 m^(3/4) = (B_c/m_c) m + (E_c/m_c) dm/dt

Consequence: scale-invariant growth trajectories:

d/dτ (m/M)^(1/4) = 1 − (m/M)^(1/4), with solution (m/M)^(1/4) = 1 − e^(−τ)
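A minimal sketch checking the scale-invariant trajectory numerically: writing r = (m/M)^(1/4), the growth law reads dr/dτ = 1 − r, with solution r = 1 − e^(−τ) (step size is an assumed toy value):

    # Euler integration of dr/dtau = 1 - r, checked against 1 - exp(-tau).
    import math

    def integrate(tau_max=5.0, dt=1e-4):
        r, tau = 0.0, 0.0                 # r = (m/M)**0.25, starting from 0
        while tau < tau_max:
            r += (1.0 - r) * dt           # Euler step of dr/dtau = 1 - r
            tau += dt
        return r

    print(f"numeric r(5) = {integrate():.6f}")
    print(f"exact   r(5) = {1.0 - math.exp(-5.0):.6f}")   # 1 - e^-5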

SLIDE 17

Informational consequences of allometric scaling

Energy used over a lifetime, per unit of final mass M:

E_lifetime/M = (E_c/m_c) ∫_0^τ_D dτ (1 − e^(−τ))^3

Minimal “information” energy of biomass, at k_B T per ∼10 g/mol monomer:

E_M ∼ k_B T (N_A/10 g) M

E_lifetime/E_M = [(E_c/m_c)/(k_B T N_A/10 g)] ∫_0^τ_D dτ (1 − e^(−τ))^3, with (E_c/m_c)/(k_B T N_A/10 g) ≈ 30

  • Energy/mass used by any stage of life is an invariant
  • What minimal energy would we expect is needed to put “information” into biomass?
  • Energy relative to the ideal bound used by any life stage is an invariant
  • Formation of biomass is clocked by information, not directly by energy

Q: Does life history depend on energy or information?
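A small sketch unpacking the ≈ 30 ratio above (assuming T = 300 K); the implied synthesis cost per gram is inferred from the slide’s numbers, not independently measured:

    # Unpack (E_c/m_c) / (k_B*T*N_A / 10 g) ~ 30.
    k_B, N_A, T = 1.380649e-23, 6.02214076e23, 300.0   # assumed T

    info_energy_per_gram = k_B * T * N_A / 10.0   # k_B*T per ~10 g/mol monomer
    print(f"k_B*T*N_A / 10 g = {info_energy_per_gram:.0f} J/g")      # ~250 J/g
    print(f"implied E_c/m_c ~ {30 * info_energy_per_gram:.0f} J/g")  # ~7.5 kJ/g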

SLIDE 18

Curious consequences

  • No direct evidence from growth that there is a cost to maintaining the living state
  • Even decay seems to be created in proportion to growth and repair processes
  • Living systems scale as if they were on the energy/information bound, even though they deviate from it by an “inefficiency” factor

SLIDE 19
  • II. Instantiating chemical measures of information
  • Would like a model that is equivocally metabolic and evolutionary
  • A literal subsystem is more intuitive than an abstract vision of “life”
  • Consider cycles to leverage the Carnot construction from engines

SLIDE 20

Thermodynamics of chemistry

Extensive systems and the chemical potential:

G_X = N_X µ_X
µ_X = µ̄_X + τ log([X]/[X̄])

(Often choose [X̄] to refer to an equilibrium, but not always.) Often convenient to work with concentrations:

N_X = V N_A [X]

How improbable is a chemical state?

G_X = H_X − τ σ_X
σ_X = (1/τ)(H_X − N_X µ_X) = (1/τ)(H_X − N_X µ̄_X) − N_X log([X]/[X̄])
e^(−G_X/τ) = e^(−H_X/τ) × e^(σ_X)

Probability to form a state comes from internal and external context; chemical entropy satisfies an informational chain rule.
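A minimal sketch of the dilute-solution chemical potential above, in units of τ (the standard potential and reference concentration are assumed toy values):

    # mu_X = mu_bar_X + tau*log([X]/[X_bar]), with G_X = N_X*mu_X.
    import math

    def mu(mu_bar, conc, conc_ref, tau=1.0):
        """Chemical potential of species X at concentration conc."""
        return mu_bar + tau * math.log(conc / conc_ref)

    mu_bar, ref = 2.0, 1.0        # assumed standard potential, reference conc.
    for c in (0.1, 1.0, 10.0):
        print(f"[X] = {c:>4}: mu = {mu(mu_bar, c, ref):+.3f} tau")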

SLIDE 21

Toy model for metabolism & evolution

http://www.cem.msu.edu/~reusch/VirtualText/nucacids.htm
http://www.rpi.edu/dept/bcbp/molbiochem/MBWeb/mb1/part2/f1fo.htm

Phosphate-driven polymerization:

N ATP + Σ_{i=1}^{N+1} M_αi ⇋ N AMP + 2N Pi + Π_α

ATP regeneration:

AMP + 2 Pi ⇋ ATP

(Possibly sequence-dependent) equilibrium relations:

([ATP]/([AMP][Pi]^2))^N = [Π_α] / (K^α(T) Π_{z=1}^Z [M_z]^(ν^α_z))

SLIDE 22

Can one model be representative?

  • Polymer degradation (digestion) and re-synthesis (anabolism) account for much of the energy of physiology (and we can generalize to other reactions once we see how the answer looks)
  • Saw in the evolution example that genomic information behaves like a global information difference between species
  • Sidenote: the RNA-world idea for the origin of life identifies these two, by equating self-replicating RNA with individuals

SLIDE 23

Reactions and chemical work

The “van’t Hoff reaction box”: express chemical work in analogy with mechanical work.

dW ≡ Σ_X dG_X = Σ_X µ_X dN_X

  • Extensivity: G_X = N_X µ_X
  • Typical concentration dependence: µ_X = µ̄_X + τ log([X]/[X̄])
  • Partial equilibrium: µ_Π_α − Σ_{z=1}^Z ν^α_z µ_M_z = N (µ_ATP − µ_AMP − 2 µ_Pi)

SLIDE 24

The “chemical Carnot cycle”

dW ≡ Σ_X dG_X = Σ_X µ_X dN_X

µ_Π_α − Σ_{z=1}^Z ν^α_z µ_M_z = N (µ_ATP − µ_AMP − 2 µ_Pi)

∆N_Π_α = −∆N_Π_β

∮ dW = µ_Π_α ∆N_Π_α + µ_Π_β ∆N_Π_β = ∆G^CD_Π_α + ∆G^AB_Π_β

Net work is the change in free energies of the polymer reservoirs.

SLIDE 25

Chemical “Carnot efficiency”

∮ dW = µ_Π_α ∆N_Π_α + µ_Π_β ∆N_Π_β = ∆G^CD_Π_α + ∆G^AB_Π_β

With ∆N_Π_α = −∆N_Π_β:

∮ dW = (1 − µ_Π_β/µ_Π_α) ∆G^CD_Π_α

  • Chemical work = area inside the “Carnot” box
  • The efficiency factor (1 − µ_Π_β/µ_Π_α) relates total work to “capacity” along arc CD
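A minimal sketch of the efficiency relation, with assumed toy chemical potentials and capacity:

    # Cycle work W = (1 - mu_beta/mu_alpha) * dG_CD_alpha (slide's relation).
    def chemical_carnot(mu_alpha, mu_beta, dG_CD_alpha):
        efficiency = 1.0 - mu_beta / mu_alpha
        return efficiency, efficiency * dG_CD_alpha

    eff, W = chemical_carnot(mu_alpha=5.0, mu_beta=3.0, dG_CD_alpha=10.0)
    print(f"efficiency = {eff:.2f}, cycle work = {W:.2f}")   # toy units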

SLIDE 26

Unpacking the work/information relation in chemical terms

Chemical work is change in free energy:

dW ≡ Σ_X dG_X = Σ_X µ_X dN_X

All terms in the work expression depend on the concentration:

µ_X = µ̄_X + τ log([X]/[X̄])
dN_X = V N_A d[X]

dW = Σ_X µ̄_X dN_X + τ V N_A Σ_X d[X] log([X]/[X̄])

Here chemical work is referring concentrations to their equilibrium values. If equilibrium is the reference, [X̄] is its concentration; more important, all the µ̄_X are equal, and we could just write µ̄.
SLIDE 27

Chemical work and information

dW ≡ Σ_X dG_X = Σ_X µ_X dN_X

  • Consider fractions of polymers:

N_Π ≡ Σ_α N_Π_α,  p_α ≡ N_Π_α/N_Π = [Π_α]/Σ_α [Π_α]

  • Dilute-solution chemical potentials:

µ_X = µ̄_X + τ log([X]/[X̄])

  • Express cycle work as a function of distributions relative to equilibrium:

∮ dW = N_Π τ Σ_α ∮ dp_α log(p_α/π_α) = N_Π τ ∮ dD(p‖π)

  • Kullback-Leibler divergence, or “relative entropy”:

D(p‖π) ≡ Σ_α p_α log(p_α/π_α)
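A minimal sketch of this work measure, treating N_Π τ ∆D(p‖π) as the minimum work to redistribute polymer fractions (polymer count, temperature, and distributions are assumed toy values):

    # Minimum chemical work to drive polymer fractions p away from
    # their equilibrium distribution pi: W = N_Pi * tau * dD(p||pi).
    import math

    def kl(p, q):
        """Kullback-Leibler divergence D(p||q) in nats."""
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    pi_eq = [0.5, 0.5]               # equilibrium distribution, two polymers
    p_final = [0.9, 0.1]             # driven composition

    N_Pi = 1e20                      # number of polymers (assumed)
    tau = 1.380649e-23 * 300.0       # k_B*T at 300 K, joules
    W = N_Pi * tau * (kl(p_final, pi_eq) - kl(pi_eq, pi_eq))
    print(f"minimum work = {W:.3e} J")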
SLIDE 28

So we have one answer

  • Said we expected a second-law-like relation: dW = dQ = −T dS ≡ τ dI
  • Over cyclic transformations, the chemical measure of information is the K-L divergence from an equilibrium:

∮ dW = N_Π τ ∮ dD(p‖π)

SLIDE 29

Reference uniformity, not equilibrium?

Recall that more particles make higher potential:

µ_X = µ̄_X + τ log([X]/[X̄])

Can apply to the polymers:

[Π̄_α]/[Π̄_β] = e^(−(µ̄_α − µ̄_β)/τ)

If we use a uniform reference for them all, we recover the Gibbs distribution at equilibrium:

π_α ∝ e^(−µ̄_α/τ)

Shannon entropy refers to uniform distributions:

S(p) = −Σ_α p_α log p_α

Split the K-L divergence into a chemical part and a Shannon entropy:

D(p‖π) = Σ_α p_α log(1/π_α) − S(p) = (1/τ) Σ_α p_α µ̄_α − S(p)
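A minimal numeric check of this split, with assumed standard potentials; as checked it carries an additive log-partition-sum constant that the slide absorbs into the reference:

    # Check D(p||pi) = (1/tau)*sum_a p_a*mu_bar_a - S(p) + log Z,
    # where pi_a = exp(-mu_bar_a/tau)/Z is the Gibbs distribution.
    import math

    def shannon(p):
        return -sum(x * math.log(x) for x in p if x > 0)

    tau = 1.0
    mu_bar = [0.0, 1.0, 2.0]                        # assumed potentials
    Z = sum(math.exp(-m / tau) for m in mu_bar)
    pi = [math.exp(-m / tau) / Z for m in mu_bar]   # Gibbs equilibrium

    p = [0.7, 0.2, 0.1]
    lhs = sum(pa * math.log(pa / qa) for pa, qa in zip(p, pi))
    rhs = sum(pa * m for pa, m in zip(p, mu_bar)) / tau - shannon(p) + math.log(Z)
    print(f"D(p||pi) = {lhs:.6f}, split form = {rhs:.6f}")   # equal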

SLIDE 30

The energy/entropy representation

∮ dW = N_Π τ ∮ dD(p‖π)

∮ dW − ∮ dH = −τ ∮ dσ

Chain rule:

τ dD(p‖π) = Σ_α dp_α h^0_Π_α − τ [dS(p) + Σ_α dp_α σ^0_Π_α]

using

G_Π_α = H_Π_α − τ σ_Π_α = N_Π_α µ_Π_α
µ̄_Π_α = h^0_Π_α − τ σ^0_Π_α
D(p‖π) = (1/τ) Σ_α p_α µ̄_α − S(p)

SLIDE 31

Our second energy-information relation

  • General second law: ∮ dW − ∮ dH = −τ ∮ dσ
  • The non-internal-energy part of the work pays to move Shannon entropy

Compare the naive form: dW = dQ = −τ dσ = τ dI

SLIDE 32
  • III. The parallel thermodynamics of computation
  • Can we attach a minimum energy cost to algorithms, and not merely machines?
  • Does the cost aggregate in the same manner as the logic of computation?
  • What is the relation of computation to chemistry?
SLIDE 33

Attaching energetic costs to algorithms

  • All computable functions can be generated from a finite list of primitive Boolean operations
  • Decompose every such operation into input, logic, output, and erasure
  • Recognize that input, logic, and output can be done reversibly
  • Erasure alone converts data entropy to heat entropy
  • The cost of a computation is the cost of the erasures it requires

[Truth tables: inputs A, B and output O]

SLIDE 34

Example: the Szilard single-particle gas

  • Consider ideal calculation of XOR
  • Input: two IID binary streams
  • Output: one IID binary stream
  • “Parity” entropy of output is a component of input entropy
  • Sign(x_1) entropy of input stream is rejected to heat bath

[Truth table: inputs x_1, x_2 and output y = x_1 XOR x_2]

S(X) = S(Y) + Q/T

“Landauer’s principle”
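A minimal bookkeeping sketch for this XOR example, assuming fair IID bits and T = 300 K:

    # XOR of two fair IID bits: 2 bits in, 1 bit out, so at least one bit
    # of data entropy must be rejected as heat: S(X) = S(Y) + Q/T.
    import math

    k_B, T = 1.380649e-23, 300.0      # assumed room temperature

    S_in, S_out = 2.0, 1.0            # entropies in bits
    erased = S_in - S_out             # bits converted to heat entropy

    Q_min = erased * k_B * T * math.log(2)
    print(f"minimum heat per XOR = {Q_min:.3e} J")   # ~2.87e-21 J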

SLIDE 35

The “Landauer cycle”

  • Intake of data bits from high-entropy input stream is arc AB
  • Erasure/rejection of heat is arc BC
  • Rejection of data bits to low-entropy output stream is arc CD
  • Data take the place of µ, N in chemistry
  • The Landauer cycle is the chemical Carnot cycle

SLIDE 36

Links of computation to chemistry

  • Temperature and entropy are universals for heat engines, chemistry, and computation
  • Chemical-number variables are the novelty; they correspond to data streams in computation
  • Ensemble treatment of data is equivalent to ensemble treatment of molecular arrangement (a new insight for computation from chemistry)

SLIDE 37

A chemical application of computational theory (Tom Schneider)

  • Classic information theory problem: reliable signal communication over noisy channels
  • The concept of error-correcting encoding can be formulated as a computation problem
  • Optimal error correction can be assigned an energetic cost
  • Through the Landauer-chemistry map, the same ideas can be applied to optimal molecular recognition

http://www-lmmb.ncifcrf.gov/~toms/

SLIDE 38

Computation in relation to error-correcting encoding

[Figure: encoder and decoder drawn as reversible computers around a noisy channel]

Traditionally we erase the channel noise, passing the input signal entropy through to the output.

SLIDE 39
Shannon’s theorem for channel capacity (Gaussian channel)

Q: Can we encode messages so that they can be recovered with probability approaching unity, even at finite channel noise?

Fill the D-dimensional code space with maximally distant spheres: noise spheres of radius √(DN) inside a signal ball of radius √(D(P + N)). The number of distinguishable codewords is

[√(D(P + N))]^D / [√(DN)]^D = ((P + N)/N)^(D/2) = e^((D/2) log((P + N)/N))

Channel capacity per symbol transmitted:

C = (1/2) log((P + N)/N)
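A minimal sketch of the capacity and codeword count, with assumed toy signal power, noise power, and block length:

    # Gaussian-channel capacity C = (1/2)*log((P+N)/N) and the number of
    # distinguishable D-symbol codewords, exp(D*C).
    import math

    def capacity(P, N):
        return 0.5 * math.log((P + N) / N)    # nats per symbol

    P, N, D = 10.0, 1.0, 100                  # assumed toy values
    C = capacity(P, N)
    print(f"C = {C:.3f} nats = {C / math.log(2):.3f} bits per symbol")
    print(f"codewords ~ exp(D*C) = {math.exp(D * C):.3e}")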

SLIDE 40

Optimal molecular recognition

  • “Prime” a protein in solution (introduce internal energy to stress its conformation)
  • Allow binding to a random site on DNA or RNA
  • Allow priming energy to relax as the protein migrates along the chain, as a function of sequence
  • Reliably stop migrating only when the target sequence is found

Q: What is the minimal energy cost to enable a protein to reliably select a single sequence from a suite of random possibilities?

SLIDE 41

Schneider’s new idea

  • Usually think of binding affinity in terms of a sum of free energies from each bond
  • Sums of free energies are products of probabilities: e^(−G_X/τ) = e^(−H_X/τ) × e^(σ_X)
  • Equivalent to a message in which each letter contributes independently to the meaning
  • What if evolution could find a way to use coordinated variations in position and momentum variables across multiple bonds?

SLIDE 42
Schneider’s Shannon theorem for reliable discrimination

http://www-lmmb.ncifcrf.gov/~toms/

Recall: ∮ dW − ∮ dH = −τ ∮ dσ

Priming (enthalpy) provides energy for D non-covalent associations (“D × E”), while k_B T (the entropy of the protein/sequence ensemble) plays the role of noise. Coordinating the 2D binding affinities gives

[√(D (E + k_B T))]^D / [√(D k_B T)]^D = ((E + k_B T)/(k_B T))^(D/2) = e^((D/2) log((E + k_B T)/(k_B T)))

“Machine capacity” per degree of freedom:

C = (1/2) log((E + k_B T)/(k_B T))
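A minimal sketch of this machine capacity, in bits per degree of freedom for a few assumed priming energies:

    # Schneider's "machine capacity": C = (1/2)*log((E + k_B*T)/(k_B*T)).
    import math

    def machine_capacity_bits(E_over_kT):
        return 0.5 * math.log(1.0 + E_over_kT) / math.log(2)

    for x in (1.0, 10.0, 100.0):              # assumed E/(k_B*T) ratios
        print(f"E = {x:>5.1f} k_B*T: C = {machine_capacity_bits(x):.2f} bits")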

SLIDE 43

Channel versus molecule problems

  • “Priming” energy corresponds to signal power; k_B T corresponds to channel noise in the Shannon bound:

C = (1/2) log((P + N)/N)  ↔  C = (1/2) log((E + k_B T)/(k_B T))

  • Shannon erases the noise power; Schneider erases the “signal”
  • This use of enthalpy to reject entropy is the math of a 1st-order phase transition
SLIDE 44

Concluding thoughts

  • Kinetics of the ensembles of life lend themselves to a machine-like description
  • Equilibrium bounds on energy and information work better than they “should”
  • Carnot-like decompositions give clarity to both metabolism and evolution
  • We have a principled map between chemistry and computation

SLIDE 45

Further reading

  • T. M. Cover and J. A. Thomas, Elements of Information Theory (Wiley, New York, 1991)
  • E. Fermi, Thermodynamics (Dover, New York, 1956)
  • C. Kittel and H. Kroemer, Thermal Physics (Freeman, New York, 1980)
  • Cavalier-Smith, Annals of Botany 95: 147-175 (2005)
  • E. Smith, Thermodynamics of natural selection I-III, J. Theor. Biol., http://dx.doi.org/10.1016/j.jtbi.2008.02.010, .008, .013, or SFI preprint #06-03-011
  • Tom Schneider, Theory of Molecular Machines, available at http://www-lmmb.ncifcrf.gov/~toms/