
COMPLEXITY EXPLAINED: 14. Biological Complexity at the Edge of Chaos

(Note: All previous parts in the Complexity Explained series by Dr. Vinod Wadhawan can be accessed through the ‘Related Posts’ listed below the article.)

Living entities have evolved to possess enormous amounts of order and complexity. Can Darwinian natural selection alone explain this order? Probably not. We must also take note of the inherent tendency of all complex adaptive systems to move towards self-organized states of optimum order. Biological and other kinds of complexity thrive best at the ‘edge of chaos,’ and this is where evolutionary forces usually operate. The dynamics of complexity around the edge of chaos is ideally suited for evolution that does not destroy self-organization.

14.1 Introduction

Self-organization is a characteristic feature of any open, far-from-equilibrium complex system. As emphasized by Stuart Kauffman, it is on this existing order that Darwinian natural selection operates and further adapts it to the environment. In other words, natural selection is not the sole source of order in biology. Being complex systems, biological entities tend to self-organize anyway. As Kauffman said in At Home in the Universe (1995):

I suspect that the fate of all complex adaptive systems in the biosphere — from single cells to economies — is to evolve to a natural state between order and chaos, a grand compromise between structure and surprise. Here, at this poised state, small and large avalanches of coevolutionary change propagate through the system as a consequence of the small, best choices of the actors themselves, competing and cooperating to survive.

He mentions ‘chaos.’ Let us begin by getting familiar with some elementary concepts in chaos theory.

14.2 Elements of Chaos Theory

A chaotic system is characterized by unpredictable evolution in space or time, even though the differential equations or difference equations describing it are deterministic (if we can neglect noise). The motions in a chaotic system are unstable, and this instability leads to a sensitive dependence on initial conditions. In the language of algorithmic information theory (cf. Part 4), chaos has the largest (but finite) degree of complexity.

Several basic features of chaos can be illustrated by recourse to the so-called logistic equation. It embodies a 1-dimensional feedback system, and was formulated as early as 1845 to model the population dynamics of a species. The question posed was: How will the population of a species, confined to a certain geographic area, vary from year to year? Obviously, the population in a year t+1 will depend on the population in the previous year t: x_{t+1} = k x_t; here k is some suitable constant of proportionality (or a control parameter). This equation simply models the fact that the larger the population, the more offspring it will produce. But this cannot be the full story. There are other factors to consider. For example, if the population in the year t becomes too large, there can be an extra decimation of the numbers, either by predators, or due to shortage of food, or due to altered competition in the reproduction dynamics. Therefore a more realistic logistic equation is as follows:

x_{t+1} = k x_t (1 − x_t)

Even this is only a simplistic model of population dynamics, and more sophisticated models have been proposed and investigated. But it is sufficient for providing a good insight into how chaos sets in under certain conditions.

It is convenient to describe the population x as a fraction of the largest value, N, that it can attain. Then x varies between 0 and 1. For any fixed value of k, a plot of x_{t+1} against x_t gives an inverted parabola, and x_t = 0.5 corresponds to the top point on the parabola. Putting x_t = 0.5 in the above equation gives x_{t+1} = k/4. Since x cannot be larger than 1, the largest value that the control parameter k can have is 4.

A fascinating variety of dynamics, including chaos, is observed as k takes various values in the range 0 to 4. Suppose, first, that k is less than 1, and that x_0 is the value of the population in a particular year t = 0. Then we can use the logistic equation to calculate the expected population x_1 in the next year. Then x_1 can be put back into the logistic equation to obtain the population value x_2 for the following year. We can carry out such iteration repeatedly.
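It may help to see this iteration in code before looking at the figures. The following minimal Python sketch (purely illustrative; the starting value x_0 = 0.3 and the number of steps are arbitrary choices) iterates x_{t+1} = k x_t (1 − x_t) for a few representative values of k and prints the tail end of each trajectory. The contrasting behaviours it produces are exactly the ones described below.

```python
# Minimal sketch: iterate the logistic map x_{t+1} = k * x_t * (1 - x_t)
# for a few representative values of the control parameter k.

def logistic_trajectory(k, x0=0.3, steps=60):
    """Return the sequence x_0, x_1, ..., x_steps produced by the logistic map."""
    xs = [x0]
    for _ in range(steps):
        xs.append(k * xs[-1] * (1.0 - xs[-1]))
    return xs

for k in (0.95, 2.8, 3.4, 3.9):
    traj = logistic_trajectory(k)
    tail = ", ".join(f"{x:.3f}" for x in traj[-6:])   # transients have largely died out by now
    print(f"k = {k}: ... {tail}")
```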

Figure (a) shows how the population will change in successive years if we take k = 0.95. We find that the population eventually becomes zero. This eventual or final value of x, denoted by x*, is an attractor; attractors were introduced in Section 12.4 (Part 12). There is a basin of attraction such that every starting value x_0 is eventually drawn towards the attractor x* = 0. Thus if k < 1, we have a fixed-point attractor at the zero value of the population; the conditions are too inimical for the population to survive.

[Figure: trajectories of the logistic map, panels (a)-(d), for the values of k discussed below.]

A different population dynamics is predicted by the logistic equation for 1 < k < 3. Now x* is not zero; rather it increases from a near-zero value to ~0.667 as the control parameter k is increased from 1 to 3. Figure (b) shows the results for k = 2.8.

A fundamentally different kind of dynamics emerges for 3 < k < 4: the population trajectory no longer converges to a single fixed point or attractor in phase space. Further, the trajectory becomes increasingly sensitive to the value of k. For k = 3.4 (results shown in Figure (c)), the trajectory has not one but two fixed points: one at x_1* ≈ 0.452 and the other at x_2* ≈ 0.842. This means that, from year to year, the population oscillates between ~45% and ~84% of the maximum possible value. We now have a two-point attractor (Figure (c)). We describe such an oscillating system as having period 2.

This periodic attractor is actually just the beginning of still more complex dynamics as the value of k is increased. There is a critical value k ≈ 3.4495 beyond which we get a four-point attractor. For example, for k = 3.5 we get x_1* ≈ 0.875, x_2* ≈ 0.383, x_3* ≈ 0.827, x_4* ≈ 0.501. Such successive bifurcations of each attractor point into two, giving 4, 8, 16, 32, … fixed points, occur for smaller and smaller increases in k.

We move into the chaotic regime of complexity for values of k above ~3.57. As this value is approached, the period doublings come for ever smaller increases in k, so that the number of points comprising the attractor becomes extremely large; beyond it, the trajectory looks quite erratic (Figure (d)), although there are some ranges of k values for which there is apparent stability. I have taken these numbers and the figures from the book Chaos Theory Tamed by G. P. Williams (1997), which should be consulted for many other fascinating details.
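The period-doubling route to chaos is easy to verify numerically. The following illustrative Python fragment (the transient length and the tolerance are arbitrary choices) discards a long transient and then counts how many distinct values the trajectory keeps visiting, which is essentially the number of points in the attractor.

```python
# Rough sketch: estimate the period of the logistic-map attractor for several k
# by discarding a long transient and counting distinct values (to a tolerance).

def attractor_points(k, x0=0.3, transient=2000, sample=256, tol=1e-4):
    x = x0
    for _ in range(transient):              # let transients die out
        x = k * x * (1.0 - x)
    points = []
    for _ in range(sample):                 # collect post-transient iterates
        x = k * x * (1.0 - x)
        if not any(abs(x - p) < tol for p in points):
            points.append(x)
    return sorted(points)

for k in (2.8, 3.2, 3.5, 3.55, 3.8):
    pts = attractor_points(k)
    label = f"{len(pts)}-point attractor" if len(pts) < 20 else "very many points (chaotic)"
    print(f"k = {k}: {label}")
```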

If the periods double even for infinitesimally small increases in the value of k, two limitations must be faced, one practical and the other computational. The practical limitation is that the logistic equation, or any mathematical model for explaining reality, has to finally interface with (or explain) experimental data, and such data can never be obtained with infinite accuracy. The computational limitation is that, even if we had access to infinitely accurate data, no computer can perform calculations based on the modelling equation(s) with infinite precision. It is irrelevant whether or not the computer program written for modelling the system is a simple one. This is why one says that a chaotic system, or rather a system in a chaotic regime, has the largest (though finite) degree of complexity.
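The practical consequence of finite precision is easily demonstrated: in the chaotic regime, two trajectories that start almost at the same point separate very quickly. In the toy illustration below, two copies of the logistic map are started a mere 10^-10 apart.

```python
# Two copies of the logistic map in the chaotic regime, started 1e-10 apart,
# to show sensitive dependence on initial conditions.
k = 3.9
x, y = 0.3, 0.3 + 1e-10

for t in range(1, 61):
    x = k * x * (1.0 - x)
    y = k * y * (1.0 - y)
    if t % 10 == 0:
        print(f"t = {t:2d}   |x - y| = {abs(x - y):.3e}")
# The separation grows roughly exponentially until it is of order 1, after which
# the two trajectories are, for all practical purposes, unrelated.
```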

Having achieved a nodding acquaintance with chaos theory, let us now hark back to the work of a stalwart in the field of complexity, namely Stephen Wolfram.

14.3 Wolfram’s Four Universal Classes of Cellular Automata

I introduced Wolfram’s work on cellular automata (CA) in Section 11.4 (Part 11). His extensive empirical analysis of 1-dimensional CA showed that the patterns they generate (even when we start from random or disordered initial conditions) can generally be divided into four distinct classes:

In Class 1, evolution from almost any initial state leads finally to a unique homogeneous state. This is like the occurrence of a ‘limit point’ or attractor in the phase space of a nonlinear dynamical system.

In Class 2, there is ultimately a sequence of simple stable or periodic structures. This corresponds to the occurrence of ‘limit cycles’ in phase space.

Class 1 patterns are repetitive, and Class 2 patterns are nested. Both are predictable once their repetitive or nested nature has been discerned, and are therefore computationally reducible. The black-and-white figure I showed in Section 11.4 is an example of a Class 2 CA, and is reproduced here.

Image source: wolframscience.com

Class 3 CA exhibit chaotic or aperiodic long-time behaviour. Such CA grow indefinitely at a fixed speed. Their patterns are often self-similar or scale-invariant. They are characterized by a fractal dimension, with log_2 3, or ~1.59, as the most common value for the dimension.

Class 4 CA are the most interesting from the point of view of complex behaviour. For them the pattern grows and contracts irregularly. There are complicated localized structures, some of which propagate with time. Therefore their long-time behaviour is undecidable.
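These classes are easiest to observe in the simplest (‘elementary’) 1-dimensional CA, in which each cell has two states and looks only at itself and its two nearest neighbours. The short Python sketch below is a generic evolver written for illustration: it starts from a random row and prints successive generations as rows of characters. Rule 30 is a standard example of Class 3 (chaotic) behaviour, and rule 110 of Class 4 (complex, localized structures).

```python
import random

def step(row, rule):
    """One synchronous update of an elementary (two-state, nearest-neighbour) CA."""
    n = len(row)
    new = []
    for i in range(n):
        left, centre, right = row[(i - 1) % n], row[i], row[(i + 1) % n]
        index = (left << 2) | (centre << 1) | right   # the neighbourhood as a 3-bit number
        new.append((rule >> index) & 1)               # look up the corresponding bit of the rule
    return new

def run(rule, width=79, steps=30):
    row = [random.randint(0, 1) for _ in range(width)]   # random ('disordered') initial row
    for _ in range(steps):
        print("".join("#" if cell else "." for cell in row))
        row = step(row, rule)

run(30)    # rule 30: chaotic behaviour, characteristic of Class 3
run(110)   # rule 110: localized propagating structures, characteristic of Class 4
```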

14.4 Langton’s ‘Edge of Chaos’ Idea

Evolution thrives in systems with a bottom-up organization, which gives rise to flexibility. But at the same time, evolution has to channel the bottom-up approach in a way that doesn’t destroy the organization. There has to be a hierarchy of control – with information flowing from the bottom up as well as from the top down. The dynamics of complexity at the edge of chaos seems to be ideal for this kind of behaviour.

Doyne Farmer

I introduced von Neumann’s self-reproducing CA in Section 11.6. Christopher Langton (1989) (and also Norman Packard) extended the CA approach to the field of artificial life (AL) (cf. Section 10.4 in Part 10) by introducing evolution into the von Neumann universe. In the self-reproducing CA created by Langton, a set of rules (the GTYPE) specified how each cell interacted with its neighbours, and the overall pattern that resulted was the PTYPE. The local rules could evolve with time, rather than remaining fixed. This pioneering work was a fine example of adaptive computation.

Langton also correlated his work on AL with Wolfram’s four universal classes of CA. We have seen above that small values of the control parameter k in the logistic equation give rise to nonchaotic behaviour. This is similar to the dynamics described by Wolfram’s Class 1 and Class 2 CA. And sufficiently large values of k result in totally chaotic dynamics, which corresponds to Class 3 CA. Langton investigated the introduction of a parameter similar to k into the rules controlling CA behaviour to check this analogy more clearly, and particularly to investigate the connection between Class 4 CA on one hand, and partially chaotic systems on the other.

After a number of trials, he came upon a parameter λ for the CA rules which corresponded to the control parameter k of the logistic equation. This λ was defined as the probability that any cell in the CA will be ‘alive’ after the next time step. In the nested CA figure above, we have chosen the colours black and white, which we can now relate to ‘alive’ and ‘dead.’ For example, if λ = 0 in the rule governing the evolution of a particular set of CA, all cells would be white or dead after one time step. If λ = 1, all cells would be black or alive after one time step; in either extreme the behaviour is trivially ordered.
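In code, λ is simply the fraction of rule-table entries whose output is a non-quiescent (‘alive’) state. The sketch below is only a two-state caricature of Langton's setup (his experiments used CA with more states and larger neighbourhoods, and his ‘random table’ construction differs in detail): it builds a random rule table biased towards a target λ and then reads the realized value back off the table.

```python
import random

def lambda_of_rule_table(table, quiescent=0):
    """Fraction of rule-table entries mapping to a non-quiescent ('alive') state."""
    return sum(1 for out in table.values() if out != quiescent) / len(table)

def random_rule_table(num_states=2, neighbourhood=3, target_lambda=0.273):
    """Assign each neighbourhood an output: quiescent with probability (1 - lambda)."""
    table = {}
    for idx in range(num_states ** neighbourhood):
        if random.random() < target_lambda:
            table[idx] = random.randint(1, num_states - 1)   # some 'live' state
        else:
            table[idx] = 0                                   # the quiescent ('dead') state
    return table

# With only 2**3 = 8 entries the realized lambda is a coarse approximation of the target.
table = random_rule_table()
print("lambda of this table:", lambda_of_rule_table(table))
```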

In his computer experiments, Langton found that, as expected, λ = 0 corresponds to Class 1 rules. The same was true for very small nonzero values of λ.

As this control parameter was increased gradually, Class 2 features started appearing at some stage, with characteristic oscillating behaviour. With increasing values of the control parameter, the oscillating pattern took longer and longer to settle down.

Taking λ = 0.5 resulted in totally chaotic behaviour, typical of the Wolfram Class 3.

Langton found that clustered around the critical value λ ≈ 0.273 were Class 4 CA.

Thus, as the control parameter increased from zero onwards, he saw a transition from ‘order’ to ‘complexity’ to ‘chaos.’

The next conceptual jump Langton made was to equate this qualitative change of behaviour of CA with a phase transition. Recall that a phase transition can occur in a material when some control parameter like temperature is varied (e.g. water changes to ice as it is cooled through its freezing point). Langton realized that his control parameter λ plays a role in determining the dynamics of CA that is similar to the role played by temperature in a phase transition. At low temperatures a material is solid, say in a crystalline state, which is an ordered state. At high temperatures we have a fluid state (liquid or vapour), which signifies chaos or disorder. Langton drew the analogy with such phase transitions for describing the Class 4 behaviour in CA which sets in for values of λ around 0.273.

Phase transitions in a material are represented in phase diagrams, in which there are lines or boundaries which separate one phase from another. Langton gave the corresponding phase boundary in the von Neumann universe (a kind of phase space) the name edge of chaos. We should remember, however, that this ‘edge’ or boundary is not a sharp one. It is more like a membrane of finite thickness in phase space, with chaotic behaviour on one side of the membrane and ordered behaviour on the other. There is a gradation from chaos to complexity to order across the membrane in phase space. And complex behaviour is at its most versatile within the membrane.

Many instances can be cited of this gradation from order to complexity to chaos. Even a simple computational algorithm like the Game of Life (mentioned in Section 11.3) is a universal computing device. The Game of Life is independent of the computer used for running it, and exists in the von Neumann universe, just as other Class 4 CA do. As explained by Wolfram, such CA are capable of information processing, data storage, and so on. They are a mixture of coherence and chaos. They have enough stability to store information, and enough fluidity to transmit signals over arbitrary distances in the von Neumann universe.

There are analogues of this behaviour not only in computation, but also in life, economies, and social systems. After all, they can all be viewed as series of computations. Life, for example, is nothing if it cannot process information. And life strikes the right balance between behaviour that is too static and behaviour that is excessively chaotic or noisy.

14.5 Biological Complexity at the Edge of Chaos

The occurrence of complex phase-transition-like behaviour in the edge-of-chaos domain is very common in practically all branches of human knowledge. Kauffman (1969), for example, recognized it in genetic regulatory networks (cf. Section 12.5 in Part 12). In his work on such networks in the 1960s, he discovered that if the connections were too sparse, the network would just settle down to a ‘dead’ configuration and stay there. If the connections were too dense, there was a chaotic churning around. Only for an optimum density of connections did stable state cycles arise.
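Kauffman's random Boolean networks are easy to sketch: N ‘genes’, each switched on or off by K randomly chosen other genes through a random Boolean function. The illustrative Python fragment below (with a small N and arbitrary parameters) runs such a network until its state repeats and reports the length of the state cycle it settles into; sparse wiring tends to give short, frozen cycles, while dense wiring gives long, chaotic-looking ones.

```python
import random

def random_boolean_network(n, k):
    """Each of the n nodes gets k random input nodes and a random Boolean function."""
    inputs = [random.sample(range(n), k) for _ in range(n)]
    tables = [[random.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def cycle_length(n, k, max_steps=5000):
    """Run the network from a random state until a state repeats; return the cycle length."""
    inputs, tables = random_boolean_network(n, k)
    state = tuple(random.randint(0, 1) for _ in range(n))
    seen = {state: 0}
    for t in range(1, max_steps):
        state = tuple(
            tables[i][sum(state[j] << b for b, j in enumerate(inputs[i]))]
            for i in range(n)
        )
        if state in seen:
            return t - seen[state]       # length of the state cycle finally reached
        seen[state] = t
    return None                          # no repeat found within max_steps

for k in (1, 2, 5):
    lengths = [cycle_length(12, k) for _ in range(5)]
    print(f"K = {k}: attractor cycle lengths {lengths}")
```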

Similarly, in the mid-1980s, Farmer, Packard and Kauffman found in their autocatalytic-set model (Section 9.4) that when parameters such as the supply of ‘food’ molecules and the catalytic strength of the chemical reactions were chosen arbitrarily, nothing much happened. Only for an optimum range of these parameters did a ‘phase transition’ to autocatalytic behaviour set in quickly.

Many more examples can be given: coevolutionary systems, economies, social systems, and so on. A right balance of defence and combat in the coevolution of two species ensures the survival and propagation of both. Similarly, the health of economies and social systems can be ensured only by a right mix of feedbacks and regulation on the one hand, and plenty of flexibility and scope for creativity, innovation, and response to new conditions on the other. The dynamics of complexity around the edge of chaos is ideally suited for evolution that does not destroy self-organization.

But why and how do complex systems move towards the edge-of-chaos regime, and then manage to stay there? Per Bak supplied a clear and profound answer in terms of his important notion of self-organized criticality.

14.6 Self-Organized Criticality

Per Bak and coworkers (1996) argued that a particularly important consequence of self-organization (in a complex adaptive system) is the occurrence of self-organized criticality (SOC).

Let us consider a tabletop on which grains of sand are drizzling down steadily. To start with, the pile just grows thicker with time, and the sand grains remain close to where they land. A stage comes when the sand starts cascading down the sides of the table. The pile gets steeper and steeper with time, and there are more and more sandslides. With time the sandslides (avalanches or catastrophes) become bigger and bigger, and eventually some of the sandslides may span all or most of the pile. The average slope now becomes constant with time, and we speak of a stationary state.

This is a system far removed from equilibrium. Its behaviour has become collective. The falling of just one more grain on the pile may cause a huge avalanche (or it may not). The sandpile is then said to have reached a self-organized critical state. The edges and surfaces of the grains are interlocked in a very intricate pattern, and are just on the verge of giving way. Even the smallest perturbation can lead to a chain reaction (avalanche) whose size bears no relationship to the smallness of the perturbation; the response is unpredictable, except in a statistical-average sense. The period between two avalanches is a period of tranquillity (stasis); the overall pattern of long stasis interrupted by sudden avalanches is what is referred to as ‘punctuated equilibrium.’
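The sandpile picture can be captured in a few lines of code. The sketch below is a minimal Python rendering of the standard Bak-Tang-Wiesenfeld toppling rule (a site holding four or more grains topples, sending one grain to each of its four neighbours; grains at the boundary fall off the ‘table’). It adds grains one at a time and records the size of the avalanche each grain provokes.

```python
import random

# Minimal Bak-Tang-Wiesenfeld sandpile on a square grid; grains pushed past the
# boundary are lost, i.e. they fall off the 'table'.
SIZE, THRESHOLD = 20, 4
grid = [[0] * SIZE for _ in range(SIZE)]

def add_grain_and_relax(grid):
    """Drop one grain at a random site, topple until stable, return the avalanche size."""
    i, j = random.randrange(SIZE), random.randrange(SIZE)
    grid[i][j] += 1
    avalanche, unstable = 0, [(i, j)]
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < THRESHOLD:
            continue                          # already relaxed by an earlier toppling
        grid[x][y] -= THRESHOLD               # the site topples, shedding one grain per neighbour
        avalanche += 1
        if grid[x][y] >= THRESHOLD:
            unstable.append((x, y))           # still unstable: it will topple again
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < SIZE and 0 <= ny < SIZE:
                grid[nx][ny] += 1
                if grid[nx][ny] >= THRESHOLD:
                    unstable.append((nx, ny))
    return avalanche

# After an initial transient the pile reaches the self-organized critical state,
# in which avalanches of all sizes occur.
sizes = [add_grain_and_relax(grid) for _ in range(20000)]
print("largest avalanche:", max(sizes), "  mean size:", round(sum(sizes) / len(sizes), 2))
```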

In a system in the SOC state, big avalanches are rare and small ones frequent, and all sizes are possible. There is power-law behaviour: the average frequency of occurrence, N(s), of avalanches of a particular size s is inversely proportional to some power τ of that size: N(s) ∝ s^(−τ). A log-log plot of this power-law equation gives a straight line, with a negative slope determined by the value of the exponent τ. The system is scale-invariant: usually the same straight line holds for all values of s. Large catastrophic events (corresponding to large values of s) are consequences of the same dynamics which causes small events.
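What ‘a straight line on a log-log plot’ means in practice is shown by the self-contained illustration below. It does not use data from a real sandpile; instead it draws synthetic avalanche sizes from an assumed power law and then recovers the exponent from a least-squares fit of log N(s) against log s.

```python
import math
import random

# Illustration only: draw synthetic avalanche sizes from an assumed power law
# p(s) ~ s^(-TAU) for s >= 1, then recover the exponent from a log-log fit.
TAU = 1.5
sizes = [int((1.0 - random.random()) ** (-1.0 / (TAU - 1.0))) for _ in range(50000)]

# N(s): how many avalanches of each integer size s occurred
counts = {}
for s in sizes:
    counts[s] = counts.get(s, 0) + 1

# Least-squares fit of log N(s) against log s; the slope estimates -TAU.
# (Discretization at small s and noise in the sparse tail shift the fit slightly.)
pts = [(math.log(s), math.log(n)) for s, n in counts.items() if n > 5]
mean_x = sum(x for x, _ in pts) / len(pts)
mean_y = sum(y for _, y in pts) / len(pts)
slope = (sum((x - mean_x) * (y - mean_y) for x, y in pts)
         / sum((x - mean_x) ** 2 for x, _ in pts))
print(f"fitted log-log slope: {slope:.2f}  (assumed exponent: -{TAU})")
```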

This is complex behaviour, and according to Bak, large avalanches, not gradual variation, can lead to qualitative changes of behaviour, and may form the basis for emergent phenomena and complexity. Bak (1996) gave several examples to make the point that Nature operates at the SOC state (or equivalently at the edge-of-chaos state). According to him, even biological evolution is an SOC phenomenon.

How do systems reach the SOC state, and then tend to stay there? The sandpile experiment provides an answer. Just like the constant input drizzle of sand in that system, a steady input of energy, or water, or electrons, can drive systems towards criticality, and then they self-organize into criticality by repeated spontaneous pullbacks from super-criticality, so that they are always poised at or near the edge between chaos and order.

We discussed beehives and ant colonies in Part 2. They are again nothing but examples of self-organization in open mutually-interacting systems. Many more examples of such ‘out of control’ complex adaptive systems exist.

14.7 Further Evolution of Complexity at the Edge of Chaos

The next question is: What do complex systems do when they have reached the edge of chaos? In the phase space of the dynamical system, the edge of chaos is a thin membrane, a region of complexity separating the ordered regime from the chaotic regime.

I introduced complex adaptive systems (CASs) formally in Part 5 of this series. John Holland (1998) pointed out the occurrence of ‘perpetual novelty’ in a CAS, and said that this essentially amounts to saying that the system moves around in the edge-of-chaos membrane. But that is not all. The moving around can actually take the system to states of higher and higher sophistication of structure and complexity. Learning and evolution not only take a CAS towards the edge-of-chaos membrane in phase space, they also make it move within this membrane towards states of higher and higher complexity. The ultimate reason for this, of course, is that the universe is ever expanding, and there is therefore a perpetual input of free energy or negative entropy into it.

Farmer (1986) gave the example of the autocatalytic-set model (which he proposed along with Packard and Kauffman) to further illustrate the point about perpetual novelty and the ever-increasing degree of complexity of a CAS. When certain chemicals can collectively catalyze the formation of one another, their concentrations increase spontaneously by a large factor, far above the equilibrium values. This implies that the set of chemicals as a whole emerges as a new ‘individual’ in a far-from-equilibrium configuration. Such sets of chemicals can maintain and propagate themselves, in spite of the fact that there is no genetic code involved. In a set of experiments, Farmer and colleagues tested the autocatalytic model further by occasionally allowing novel chemical reactions. Most such reactions caused the autocatalytic set to crash or fall apart, but each crash made way for a further evolutionary leap: new reaction pathways were triggered, and some variations got amplified and stabilized. Of course, the stability lasted only till the next crash. Thus a succession of autocatalytic metabolisms emerged. Apparently, each level of emergence through evolution and adaptation sets the stage for the next level of emergence and organization.

14.8 Concluding Remarks

It is often not realized by Darwinists and neo-Darwinists that natural selection alone cannot lead to such high levels of order and complexity as seen in living organisms. A high degree of order already exists in complex adaptive systems because of their self-organization and perpetual-novelty tendencies. Natural selection only hones this order to still higher levels of complexity.

The self-organization feature of complex adaptive systems may worry the Creationists some more. They have been busy attacking Darwin and his followers for the ‘blasphemies,’ and have been trying to argue that the fascinating degree of order observed in living creatures cannot possibly be the result of a series of ‘accidents’ in the form of mutations etc. The fact is that, as emphasized by Stuart Kauffman and others, Darwin or no Darwin, complex adaptive systems have the fundamental property that they self-organize into states of high (and ever-increasing) degree of order, so long as they are able to exchange matter and energy with the surroundings. Darwinian natural selection does lead to some increase of complexity and order but, by and large, it only hones the already available order and complexity to help a population adapt itself to the prevailing conditions.

Dr. Vinod Kumar Wadhawan is a Raja Ramanna Fellow at the Bhabha Atomic Research Centre, Mumbai, and an Associate Editor of the journal PHASE TRANSITIONS.

About the author

Vinod Wadhawan

Dr. Vinod Wadhawan is a scientist, rationalist, author, and blogger. He has written books on ferroic materials, smart structures, complexity science, and symmetry. More information about him is available at his website. Since October 2011 he has been writing at The Vinod Wadhawan Blog, which celebrates the spirit of science and the scientific method.
