Naturalism

COMPLEXITY EXPLAINED: 3. Thermodynamic Explanation for the Increasing Complexity of our Ecosphere

(Note: This is the third part in the series on Complexity. Please read Part 1 and Part 2 first.)

At and soon after the moment of the Big Bang, there was little or no complexity. Then why is it that we see so much complexity around us? And why is the complexity increasing all the time? Answers to these questions require some groundwork, particularly if unexplained jargon must be avoided. I lay the groundwork in this and the next few articles in this series. In the context of our ecosphere, the reason its complexity has been increasing all the time is that the Sun has been bombarding it with low-entropy or high-grade energy.

We begin by describing how complexity may possibly be quantified. In the literature, most definitions of complexity are actually definitions of degree of complexity. The degree of complexity may be defined either in terms of information theory, or in terms of thermodynamics. The two are really one and the same thing, but they provide different insights into the evolution of complexity in the cosmos. In the language of information theory, the degree of complexity of a system may be defined roughly as the amount of ‘information’ needed for describing the structure and function of the system. In thermodynamic parlance, degree of complexity has been quantified by Eric Chaisson (2001) in terms of ‘rate of flow of free energy per unit mass.’ For explaining terms like ‘information’ and ‘free energy’, we have to introduce the basics of thermodynamics and information theory. We take up the thermodynamic aspect in this article.

In thermodynamics we try to describe the bulk behaviour of macroscopic systems in terms of only a few measurable parameters like pressure P, volume V, and temperature T. Classical thermodynamics was formulated to understand what it takes to efficiently convert heat into mechanical work, via a heat engine. This subject has evolved and expanded to cover the study of interconversion of all forms of energy. Some central concepts in thermodynamics are those of energy, entropy, and free energy.

ENERGY AND ENTROPY

In physics we define energy as the capacity for doing mechanical work. The first law of thermodynamics states that energy is conserved. This law also takes note of two types of energy, namely heat energy Q and work energy W. The law says that if we supply an incremental amount of heat energy dQ to a system, then dQ = dE + dW. That is, a part of dQ may be used up for doing mechanical work dW, and the rest is stored as internal energy dE; there is no loss or gain of energy.

Thus energy can be converted from one form to another, but it cannot be created or destroyed. Of course, under some special circumstances, energy does get created when there is a corresponding destruction of mass, and vice versa. This interconversion of mass and energy is quantified by the famous Einstein equation, E = mc², which says that energy E is equal to the mass m multiplied by the square of the speed of light in vacuum. This equation forms the basis of how energy is produced in a nuclear reactor by the corresponding destruction of a small amount of mass. By the same token, the energy we receive from the Sun is also produced by nuclear reactions entailing a small destruction of mass. But when interconversion of mass and energy is not involved, we can say that energy is neither destroyed nor created. Or we can say that mass and energy together are conserved.
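As a quick back-of-the-envelope check (a minimal sketch using rounded textbook values, not figures from this article), here is how much energy the complete conversion of one gram of mass would yield according to E = mc²:

```python
# A minimal sketch: energy released if 1 gram of mass were fully converted
# to energy, using E = m * c**2 (approximate, rounded values).
c = 3.0e8   # speed of light in vacuum, m/s
m = 1.0e-3  # mass converted, kg (1 gram)

E = m * c**2  # energy in joules
print(f"Energy from converting 1 g of mass: {E:.2e} J")
# ~9e13 J, roughly twenty kilotons of TNT equivalent
```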

If we regard the conservation of energy as its defining feature, the same term (energy) can be used in other, non-physics, contexts as well; e.g. in neural-network dynamics. The usually conserved nature of energy constrains, and often completely determines, the dynamical and statistical behaviour of a system, in physics as well as elsewhere. An even more general definition of energy is as something that drives change. As we shall understand gradually, energy is the engine that drives the evolution of complexity everywhere in the cosmos.

Clausius (1865), building on the work of Carnot (1824), made the observation that, for an isolated system, the ratio of the heat content to the absolute temperature can never decrease. He called this ratio entropy: dS = dQ/T, where dS is the incremental change of entropy S when an amount of heat dQ is exchanged at an average temperature T. Clausius’s observation follows simply from the fact that heat can flow spontaneously from a hot object to a cold object, but not the other way around. Suppose one part of a gas is at temperature T1 and the other at a lower temperature T2. Then dS = dQ/T2 – dQ/T1 is a positive quantity, meaning that when heat flows, entropy increases. When heat stops flowing, entropy stops increasing, and we speak of a state of equilibrium.
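To make this concrete, here is a minimal numerical sketch of the formula just given; the heat and the two temperatures are arbitrary illustrative values, not data from any experiment:

```python
# A minimal sketch: entropy change when a small amount of heat dQ flows
# from a hotter part at T1 to a colder part at T2, using dS = dQ/T2 - dQ/T1.
dQ = 100.0  # heat transferred, joules (illustrative)
T1 = 400.0  # temperature of the hotter part, kelvin
T2 = 300.0  # temperature of the colder part, kelvin

dS = dQ / T2 - dQ / T1  # net entropy change of the isolated system
print(f"dS = {dS:.4f} J/K")  # positive (~0.083 J/K): entropy increases
```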

In fact, this observation by Clausius is nothing but the second law of thermodynamics. The law says that, for an isolated system, entropy never decreases; it goes on increasing till the system has reached equilibrium, and then it remains constant.

Imagine a gas in a well-insulated box, which cannot exchange energy or matter with the surroundings, so that we have an isolated system. Suppose the left part of the gas is at a higher temperature than the right part; i.e. there is a ‘thermal gradient’ to start with. The second law of thermodynamics says that heat will flow from the left to the right, till all parts of the gas are at the same temperature; that would be the state of equilibrium. This brings up the important idea of irreversibility of most processes in isolated systems in Nature. The temperature differential in the box got obliterated spontaneously as the entire gas acquired a single temperature throughout. This is something irreversible. The chances that, starting from a zero-thermal-gradient or single-temperature configuration, the gas would spontaneously go back to a state in which different parts have different temperatures are so remote that we can say that such a process would never occur. The reality of irreversibility of natural phenomena was beautifully captured in verse by Omar Khayyam (translated by Edward Fitzgerald):

The Moving Finger writes; and, having writ,

Moves on: nor all thy Piety nor Wit

Shall lure it back to cancel half a line,

Nor all thy Tears wash out a Word of it.

THERMODYNAMIC POTENTIALS

The term ‘potential energy’ is familiar. Imagine an object, say a stone of mass m, lying on the surface of the Earth. It is at rest, so its kinetic energy is zero. Left to itself, it will do nothing, and just stay there (in accordance with Newton’s first law of motion). Now suppose I lift it to a height h. By lifting it I am doing work against the gravitational pull of the Earth. The work I invest in the stone gets stored in it as gravitational potential energy, equal to mgh, where g is the acceleration due to gravity. This energy is available for doing work. For example, the stone, when released from my grip, will fall back on the Earth. At the moment it touches the Earth, it has kinetic energy exactly equal to the potential energy it had at the height h (if we can ignore frictional losses etc.). Thus, in this example, the potential energy is the free energy, or the energy free or available for doing work. Remember, in general, not all energy may be available for doing work; some part of it may be trapped as heat energy, or energy of chaotic motion.
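A minimal numerical sketch of this example (the mass and height are arbitrary illustrative values) shows the stored potential energy reappearing, undiminished, as kinetic energy when friction is ignored:

```python
# A minimal sketch: the work invested in lifting a stone is stored as
# potential energy mgh, all of which is available ("free") for conversion
# into kinetic energy if frictional losses are ignored.
import math

m = 2.0   # mass of the stone, kg (illustrative)
g = 9.81  # acceleration due to gravity, m/s^2
h = 5.0   # height it is lifted to, m (illustrative)

potential_energy = m * g * h               # energy stored by lifting, J
v_at_ground = math.sqrt(2 * g * h)         # speed on reaching the ground, m/s
kinetic_energy = 0.5 * m * v_at_ground**2  # equals the potential energy

print(f"Potential energy at height h: {potential_energy:.1f} J")
print(f"Kinetic energy on landing:    {kinetic_energy:.1f} J")
```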

Potential energy per unit volume is just called ‘potential’, and a number of distinct thermodynamic potentials can be defined, relevant in different conditions. For example, suppose dQ = 0 in the equation describing the first law of thermodynamics, namely dQ = dE + dW. Then dW = -dE. This means that the entire energy of the system (e.g. the potential energy of the stone raised to a height h) is free energy or available energy for doing mechanical work (provided there are no other dissipative or irreversible processes in operation). Processes for which dQ = 0, i.e. processes for which Q is a constant, are called adiabatic processes. Thus, for adiabatic processes, the internal energy E per unit volume plays the role of a thermodynamic potential.

A process which occurs at a constant temperature is called an isothermal process; for it dT = 0. For isothermal processes the thermodynamic potential relevant for calculating the amount of work one can extract from a system is the so-called Helmholtz potential, defined by F = E – TS. It should be noted that we are no longer dealing with an isolated system here; for keeping T constant, it should be possible for the system to exchange heat with the surroundings, if required. We skip details, but an important result is that, for such a system, the second law of thermodynamics must be stated in terms of the Helmholtz potential (or the Helmholtz free energy per unit volume), and not in terms of entropy. The law now states that the Helmholtz free energy (or just ‘free energy’, for short) can never increase. It can either decrease, or remain constant.

Another important thermodynamic potential is the Gibbs free energy per unit volume: G = E – TS + pV. Processes for which the pressure p is constant (i.e. dp = 0) are called isobaric processes. The quantity -dG is a measure of the maximum work that can be extracted from a system at constant pressure and constant temperature. The second law says that G can never increase.

Thus, natural processes occur so as to minimize the free energy, either by an increase of entropy S, or by a decrease of internal energy E, or by both these ways of decreasing the free energy. More interestingly (from the point of view of complexity), it can also happen that the entropy term TS actually decreases (instead of increasing) if there is a concomitant decrease of E which is more than the decrease in the term TS. An example of this is the growth of a crystal from a fluid phase. As we shall see shortly, a decrease in entropy means an increase in order, just as an increase in entropy means more disorder. In a crystal the atoms are arranged in an ordered manner on a lattice, whereas they move around in a chaotic way in the fluid. Thus the crystalline state has lower entropy compared to the fluid state. The crystal grows because the decrease in its internal energy (or binding energy) is by a greater magnitude than the magnitude of the decrease in the entropy term TS.
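The bookkeeping behind this can be sketched numerically. The numbers below are invented purely for illustration, not data for any real crystal; the only point is that the change in Helmholtz free energy, dF = dE – T dS, can be negative even when the entropy decreases:

```python
# A minimal sketch with made-up numbers: crystallisation lowers both the
# entropy term T*S and the internal energy E. The crystal grows because
# |dE| exceeds |T*dS|, so the free energy F = E - T*S still decreases.
T = 300.0   # temperature, kelvin
dE = -50.0  # decrease in internal (binding) energy, joules (illustrative)
dS = -0.1   # decrease in entropy on ordering, J/K (illustrative)

dF = dE - T * dS  # = -50 - 300*(-0.1) = -20 J
print(f"dF = {dF:.1f} J")  # negative: the ordered (crystalline) state is favoured
```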

STATISTICAL MECHANICS

The above thermodynamic formulation of the first and the second laws of thermodynamics arose from the desire to understand what limits the amount of work one can extract from heat in a heat engine. An alternative, though equivalent, formulation was given in terms of statistical mechanics. Statistical mechanics is formulated in terms of probabilities, and this is what Ludwig Boltzmann did in the 19th century. The idea was to explain the observed macrostates of a system in terms of its underlying microstates. Some equations, like the Einstein equation mentioned above, have become very famous, and are familiar even to the lay public. The equation Boltzmann derived for entropy is another such equation, and it is engraved on his grave: S = k log W. Here k is a constant, now called the Boltzmann constant. The symbol W is not to be confused with the same symbol used above for denoting the work energy. To understand what W means here, let us again consider a gas enclosed in a container of volume V. Different molecules in the gas have different energies, so that there is a range or spectrum of possible energies. We say that the gas can exist in a variety of microstates, each such state corresponding to a distinct distribution of molecules among the allowed energy states. W is the number of possible microstates of the system under consideration. If this number is large, the system has high entropy. The atoms arranged regularly on a lattice in a crystal have a much smaller number of possible microstates than the same number of atoms in the fluid from which the crystal grew. Thus we associate lower entropy with order, and higher entropy with disorder.

In Boltzmann’s formulation for entropy, all the microstates were assumed to be equally probable. This assumption about the equiprobability of the microstates was removed by Gibbs during the 1870s. This resulted in an equation (the so-called Boltzmann-Gibbs equation) for entropy in which log W in the above Boltzmann equation got replaced by the summation of the term -Pi log Pi from i = 1 to i = W. Here Pi is the probability that the system is in microstate i. For the equiprobability case, we have Pi = 1/W for all values of the microstate index i, and this equation for entropy becomes the same as the original Boltzmann equation.
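A minimal computational sketch may help here; it evaluates the Boltzmann-Gibbs sum for a small, made-up set of probabilities and checks that it reduces to log W in the equiprobable case (the Boltzmann constant k is set to 1 for simplicity):

```python
# A minimal sketch: the Boltzmann-Gibbs entropy S = -k * sum(P_i * log P_i)
# reduces to the Boltzmann form S = k * log W when all W microstates are
# equally probable (P_i = 1/W). Units with k = 1 are used for simplicity.
import math

def gibbs_entropy(probabilities, k=1.0):
    """Entropy of a discrete probability distribution over microstates."""
    return -k * sum(p * math.log(p) for p in probabilities if p > 0)

W = 8
uniform = [1.0 / W] * W               # equiprobable microstates
print(gibbs_entropy(uniform))         # = log(8) ~ 2.079, i.e. k*log(W)
print(math.log(W))                    # Boltzmann form, for comparison

skewed = [0.7, 0.1, 0.1, 0.05, 0.05]  # unequal (illustrative) probabilities
print(gibbs_entropy(skewed))          # smaller than log(5) ~ 1.609
```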

To get a feel for the above Gibbs formulation of entropy, it is instructive to consider the ‘free expansion’ of a gas. Imagine a gas comprising n molecules in a chamber, separated on its right from a vacuum chamber of the same volume by a partition. If the partition disappears, the molecules of the gas would be able to move into the right half of the enlarged chamber also. Soon the gas will occupy the entire (doubled) volume uniformly. This free expansion of the gas is governed by random chance processes. What is the probability that all the n molecules of the gas will ever occupy the left half of the chamber again? It is the same as that of flipping a coin n times and finding ‘heads’ in each case, namely 1/2ⁿ. Considering the fact that n is of the order of the Avogadro number, the answer is very close to zero indeed. In other words, the free expansion of the gas is an irreversible process. On removal of the partition, the gas has spontaneously gone into a state of greater disorder, in the sense that the probability of finding a specified molecule at a particular location in the left half of the chamber is now only half its earlier value.

The second law of thermodynamics simply states that, with the passage of time, the universe as a whole (if regarded as an isolated system) moves towards a more probable or more disordered state. The concept of entropy quantifies this fact. How much has the disorder of the gas increased on free expansion to twice the volume? Consider any molecule of the gas. After the expansion, there are twice as many positions at which the molecule may be found. For each such position, a second molecule also has twice as many positions available, so the total number of possibilities for the two molecules is 2 × 2, or 2². For n molecules, there are 2ⁿ times as many ways in which the gas can fill the chamber after the free expansion. We say that, in the double-sized chamber, the gas has 2ⁿ times as many accessible states, or microstates.

The entropy of a system is defined as the logarithm (to the base 2) of the number of accessible states (this differs from S = k log W only by the constant k and the change of base of the logarithm). For the example of the gas, the increase in entropy on free expansion is the logarithm (to the base 2) of 2ⁿ, which works out to be just n.
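The counting can be sketched in a few lines of code; n = 100 is a toy value chosen only so that the numbers are printable, a real gas having n of the order of the Avogadro number:

```python
# A minimal sketch of the free-expansion argument: after the partition is
# removed, each of the n molecules can be in either half, so there are 2**n
# times as many microstates; the probability that all n molecules are ever
# found back in the left half is 1/2**n, and the entropy increase (in bits,
# i.e. log base 2) is exactly n.
import math

n = 100                                  # toy value; real gases have ~10**23 molecules
prob_all_left = 0.5 ** n                 # ~8e-31: effectively never happens
entropy_increase_bits = math.log2(2 ** n)  # log2(2**n) = n

print(f"P(all {n} molecules back in the left half) = {prob_all_left:.1e}")
print(f"Entropy increase on free expansion = {entropy_increase_bits:.0f} bits")
```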

Introduction of the logarithm in the definition of entropy makes it, like energy or mass, a property proportional to the number of molecules in the system. Such properties are said to be extensive, or to have the additivity feature.

CHAISSON’S PARAMETER FOR QUANTIFYING THE DEGREE OF COMPLEXITY

An important way of defining the degree of complexity was introduced by Eric Chaisson (2001) in his book Cosmic Evolution: The Rise of Complexity in Nature. He emphasized the importance of a central physical quantity for understanding cosmic evolution, namely free-energy rate density, or specific free energy rate, denoted by Φ (capital phi). Chaisson emphasized the fact that ‘energy flow is the principal means whereby all of Nature’s diverse systems naturally generate complexity, some of them evolving to impressive degrees of order characteristic of life and society’. The flow refers to a rate of input and output of free energy. If the input rate is zero, a system would sooner or later come to a state of equilibrium, marking an end to the evolution of complexity. If the output rate is zero, there would be disastrous consequences.

Energy flowing per unit time through unit mass has the units of power per unit mass, and it is this quantity that Chaisson uses for quantifying complexity. Other similar quantities in science are: luminosity-to-mass ratio in astronomy; power density in physics; specific radiation flux in geology; specific metabolic rate in biology; and power-to-mass ratio in engineering. Chaisson estimated the values of this parameter for a variety of systems. The results are amazing, and important. Here are some typical estimated values (all in erg s⁻¹ g⁻¹):

Galaxies (Milky Way) : 0.5

Stars (Sun) : 2

Planets (Earth) : 75

Plants (biosphere) : 900

Animals (human body) : 20,000

Brains (human cranium) : 150,000

Society (modern culture) : 500,000

Thus the degree of complexity, as measured by Φ, is seen to increase steeply as we move from galaxies and stars to planets, living matter, brains, and human society.
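As a rough consistency check (using standard approximate values for the solar luminosity and mass, not numbers taken from Chaisson's book), the entry for the Sun in the list above can be recovered by dividing its power output by its mass:

```python
# A minimal sketch: Chaisson's free-energy rate density Phi for the Sun is
# roughly its luminosity divided by its mass, which comes out near the
# "2 erg/s/g" quoted in the list above.
luminosity = 3.8e33  # solar luminosity, erg per second (approximate)
mass = 2.0e33        # solar mass, grams (approximate)

phi_sun = luminosity / mass
print(f"Phi(Sun) ~ {phi_sun:.1f} erg s^-1 g^-1")  # ~1.9
```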

We can now provide a partial answer to the question: Why is the terrestrial complexity increasing all the time (as illustrated above by the estimates of Φ)? The answer is that the Sun has been bombarding our ecosphere with low-entropy or ‘high-grade’ energy. Why ‘low-entropy’? Recall that dS = dQ/T, and the average value of the temperature T for the Sun is huge compared to that of the Earth. On entering our ecosphere, the energy of the photons coming from the Sun gets degraded to a large extent through the process of ‘thermalization’, namely dissipation into a state of much lower average temperature (and therefore a correspondingly higher value for the entropy).
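A minimal sketch with approximate temperatures (the figures are standard round numbers, not values from this article) shows how much more entropy the Earth exports than it imports for the same amount of energy:

```python
# A minimal sketch: the same parcel of energy carries much less entropy when
# delivered at the Sun's surface temperature than when re-radiated at the
# Earth's temperature, since dS = dQ/T. This is what makes sunlight
# "low-entropy" or high-grade energy.
Q = 1.0          # an arbitrary parcel of energy, joules
T_sun = 5800.0   # effective surface temperature of the Sun, kelvin (approx.)
T_earth = 300.0  # rough temperature at which Earth re-radiates, kelvin

S_in = Q / T_sun     # entropy carried in by sunlight
S_out = Q / T_earth  # entropy carried out by re-radiated heat
print(f"S_out / S_in ~ {S_out / S_in:.0f}")  # ~19: roughly twenty times more entropy exported
```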

A small fraction of this energy, however, gets trapped as free energy. It is stored in our ecosphere in the form of simple or complex molecules. Some of the energy-rich simple molecules in which the free energy from the Sun gets stored are: H2S, FeS, H2, phosphate esters, HCN, pyrophosphates, and thioesters. In the history of chemical and biological evolution on Earth, such simple molecules contributed to the evolution of complex molecules characterising life. When food is consumed by a living organism, its processing by the organism builds up a high information content for the organism, even though there is always a net rise in the global entropy (as demanded by the second law of thermodynamics). In the next article we shall establish the link between thermodynamics and information theory.

CONCLUDING REMARKS

The degree of complexity of a system can be defined in terms of the free-energy rate density Φ. The value of Φ for our ecosphere has been increasing all the time because of the low-entropy energy we have been receiving from the Sun. So far, I have outlined only the equilibrium thermodynamics of large systems. If a system is pushed only slightly away from equilibrium, it would, in all probability, tend to return to equilibrium. But if pushed sufficiently far away from equilibrium, it can really go ‘over the hill’, and is then unable to return to the old equilibrium configuration. If the influx of energy pushing the system more and more away from equilibrium continues to be present, the system would seek new steady states. This can result in an emergent self-organized order and pattern formation, implying a local lowering of entropy (or increase of order), and a concomitant evolution of complexity. Ilya Prigogine made seminal contributions to this field of research. We shall discuss it later.

I had to introduce a few equations in this article. I hope it was not heavy stuff. After all, like Stephen Hawking, we should set ourselves a lofty goal:

‘Why does the universe go to all the bother of existing? My goal is simple. It is complete understanding of the universe, why it is as it is and why it exists at all.’

About the author

Vinod Wadhawan

Dr. Vinod Wadhawan is a scientist, rationalist, author, and blogger. He has written books on ferroic materials, smart structures, complexity science, and symmetry. More information about him is available at his website. Since October 2011 he has been writing at The Vinod Wadhawan Blog, which celebrates the spirit of science and the scientific method.

9 Comments

  • Dr. Wadhawan,

    Thanks for adding one more article on the subject, and for helping common readers like us understand scientific ideas, many of which we would not otherwise understand.

  • Plain Pointed Science Writing Doesn’t Evoke Respect?
    On Living Systems, Energy Flux And Cosmic Evolution

    A. From Eric J. Chaisson’s
    “Cosmic Evolution: The Rise of Complexity in Nature”
    http://www.2think.org/cosmicevolution.shtml

    “living systems evolved in the past within environments rich in energy flux, and thus have inherited the means to acquire the needed energy flow via metabolic processes. The pathways open to biological evolution are constrained, not because few solutions exist but because energy resources are limited; natural selection exploits energy flows, determining which flows are conducive to the system, thereby apparently optimizing them.” (p. 180)

    B. Why gibber, why not write plainly and to the point

    IMO the outstanding common feature of Chaisson’s, and probably of some other books about life and cosmic evolutions, as well as of many reviews and comments about them, is gibberish in various degrees of pseudoscience and pseudosophistication.

    Points that are clear in the mind should be stated plainly and concisely. If they’re clear. Or, are science readers conditioned to trust, respect and prefer gibbering and suspect plainly to the point writing?

    Dov Henis
    (Comments From The 22nd Century)

    Cosmic Evolution Simplified
    http://www.the-scientist.com/community/posts/list/240/122.page#4427

    • Firstly, like any other creationist you do not understand thermodynamics.

      Secondly, you are quote mining. That’s a disgusting practice. Shows how dishonest you are.

      To quote Dr. Wadhawan’s point in full:

      Thus, natural processes occur so as to minimize the free energy, either by an increase of entropy S, or by a decrease of internal energy E, or by both these ways of decreasing the free energy. More interestingly (from the point of view of complexity), it can also happen that the entropy term TS actually decreases (instead of increasing) if there is a concomitant decrease of E which is more than the decrease in the term TS.

      You conveniently ignored the important part – “if there is a concomitant decrease of E which is more than the decrease in the term TS”.

      Thirdly, the link you gave just obscured where the boundaries lie for a closed system with some equations and fancy talk. The decrease in order of the Sun is far, far greater than the increase in order on Earth. That means Earth would still receive plenty of useful energy to counter its Ewaste.

      • Supplying more energy does not mean the law of entropy is switched around. And the idea of attaching “isolated system” to the law begs the question as to the existence of a known truly isolated system.

        • Supplying more energy does not mean the law of entropy is switched around

          As I said, you do not understand thermodynamics. First know what the second law of thermodynamics means.

          And the idea of attaching “isolated system” to the law begs the question as to the existence of a known truly isolated system.

          There is something called the Universe, isn’t there?

        • Also, before you start switching goal posts, pray tell me what you found wrong in this statement (from which you quote-mined):

          More interestingly (from the point of view of complexity), it can also happen that the entropy term TS actually decreases (instead of increasing) if there is a concomitant decrease of E which is more than the decrease in the term TS.

          that warranted the utterance “Increase of Entropy is a irreversible law” (which really is a bowdlerized way of stating the second law of thermodynamics).

          • Wow. My point is quite clear. Law of entropy is irreversible. It moves in one direction. But the author of the article is suggesting it can be reversible, which is a violation of the law.

            And who exactly proved the universe is a true “isolated system”? No one. Neither was the law of thermodynamics derived from the unproved assertion of the universe being isolated.

          • Wow. My point is quite clear. Law of entropy is irreversible. It moves in one direction. But the author of the article is suggesting it can be reversible, which is a violation of the law.

            And that is why I said you know nothing about thermodynamics. You quote-mined the only sentence you were able to understand and drew your conclusion based on it.
