Far-from-equilibrium self-organization is a hypothesis that describes the process of self-organization as taking place in a critical phase transition region between order and chaos, when the system is far from its equilibrium.1 The essence of the theory of far-from-equilibrium pattern formation is that new forms of organization emerge when a system is driven far from its stable basin of attraction.2 Far-from-equilibrium behavior is ubiquitous. The scope of phenomena investigated makes the study of far-from-equilibrium systems an intrinsically interdisciplinary activity, crossing between the physics community and researchers in biology, chemistry, the social sciences, applied mathematics, meteorology and engineering.3
Organization is an ordered structure in the arrangement of elements within a system that enables them to function. As such, we can loosely equate it to the concept of order. Both order and organization are highly abstract concepts, neither of which is well defined within the language of mathematics and science, but probably the most powerful method we have for formalizing them is the theory of symmetry. Symmetry is an ancient area of interest within mathematics, originally coming from classical geometry, but in modern mathematics and physics it has been abstracted to the concept of invariance.4 In this way, symmetry describes how two things are the same under some transformation. For example, if we take two coins, one showing heads and the other tails, simply flipping one of the coins over will bring it to the same state as the other. Thus we do not need two pieces of information to describe the states within this system; we can describe it in terms of just one state plus a flipping transformation that, when performed, gives us the other state. If instead of two coins we had an apple and an orange, there is no transformation we know of that maps an apple to an orange. They are different things. There is no trivial symmetry or order between them, and thus we need at least two distinct pieces of information to describe this system. Because this second system requires more bits of information to describe its state, we can say it has higher statistical entropy.5 In this way we can talk about and quantify order and randomness in terms of information theory. Ordered systems can be described in terms of these transformations, which we encode in equations; ordered systems are governed by equations, whereas random systems are not. But because there is no correlation between the elements' states in random systems, they are governed by probability theory, the branch of mathematics that analyses random phenomena.
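The link between symmetry, description length and entropy can be made concrete with Shannon's measure. The following is a minimal sketch (not from the text; the coin and fruit sequences are illustrative stand-ins) computing entropy in bits per symbol: the coin system, reducible to one state plus a flip, needs only one bit per symbol, while a system of many unrelated states needs closer to three.

```python
import math
import random
from collections import Counter

def shannon_entropy(sequence):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Two coins related by a flip transformation: the whole system is
# described by one state plus one rule, so only two symbols ever occur.
coins = "HTHTHTHT"

# Apple-and-orange style states: no transformation maps one onto
# another, so every distinct state is an extra piece of information.
random.seed(0)
fruit_bowl = "".join(random.choice("ABCDEFGH") for _ in range(4096))

print(shannon_entropy(coins))       # 1.0 bit per symbol
print(shannon_entropy(fruit_bowl))  # approaches 3 bits per symbol
```

Note that this simple per-symbol measure captures only the variety of states; fuller treatments also account for correlations between them.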
Order & Randomness
Complex systems are by any definition nonlinear. Complexity is always a product of an irreducible interaction or interplay between two or more things; if we can do away with this core dynamic and interplay, then we simply have a linear system. If the system is homogeneous and everything can be reduced to one level, it might be a complicated system, but it is certainly not a complex system. Thus, one of the main findings of complexity theory is that complexity is found at what is sometimes called the interesting in-between.6 If we take some parameter of a system, say its rate of change or its degree of diversity, and turn this parameter fully up, what we often get is randomness, a continuous change or total diversity of states without any pattern. If we turn it fully down, we get complete stasis and homogeneity with very stable and simple patterns. With too much order the system becomes governed by a simple set of symmetries; too much disorder results in randomness, and the system becomes subject to statistical regularities. It is only between the two that we get complexity. On either side of this region, a single dominant regime or attractor comes to govern the system's behavior.
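A standard toy illustration of this parameter knob (not taken from the text) is the logistic map x → r·x·(1 − x): with the growth parameter r turned down, the system settles into stasis; in between, simple periodic patterns appear; turned up, it wanders with no repeating pattern.

```python
def logistic_orbit(r, x0=0.2, transient=500, keep=8):
    """Iterate x -> r*x*(1-x), discard transients, return the settled states."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1 - x)
        orbit.append(round(x, 4))
    return orbit

print(logistic_orbit(2.8))   # parameter turned down: a single fixed point
print(logistic_orbit(3.2))   # in between: a stable 2-cycle appears
print(logistic_orbit(3.9))   # parameter turned up: no repeating pattern
```

Sweeping r continuously between these values traces out the period-doubling route from order to chaos.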
It is only when a system is far from its equilibrium, away from one of these stable attractor regimes, that we get a phase transition area representing the interplay between the two regimes. In this space, the system is much more sensitive to small fluctuations that can take it into either basin of attraction.7 This phase transition area is also called the edge-of-chaos. The phrase edge-of-chaos was first used to describe a transition phenomenon discovered by computer scientist Christopher Langton, who found a small region of cellular automaton rule space conducive to producing automata capable of universal computation. At around the same time, physicist James Crutchfield and others used the phrase “onset of chaos” to describe much the same concept. In the sciences in general, the phrase has come to serve as a metaphor for the idea that some physical, biological and social systems operate in a region between order and either complete randomness or chaos, where complexity is maximal.8 The edge-of-chaos concept remains mainly theoretical and somewhat controversial, but it is often posited that self-organization and evolution can only really happen in this phase transition space. There may be a number of different interpretations of why this is so, but one way of understanding it is that self-organization requires entropy and evolution requires variety. Through external intervention we can take a well-ordered system and simply reconfigure it by transferring energy to it from some external source; in this way we go from one ordered regime to another without the need for entropy to enable the process, needing only an input of energy. But as we know, self-organization does not happen in this fashion. It is internally generated on the local level, and this process requires the presence of entropy and randomness so that elements are available for reconfiguration into a new regime through feedback loops that originate as weak signals or fluctuations.9
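Langton quantified where a rule sits on this order-chaos spectrum with a parameter λ, the fraction of transitions in a rule table that lead to a non-quiescent state, with complex behavior tending to cluster at intermediate λ. His work used larger automata; the following sketch applies the same idea to the simplest two-state, three-neighbor cellular automata, where the specific rule numbers chosen (0, 110, 30) are illustrative examples of ordered, edge-of-chaos and chaotic behavior.

```python
def langton_lambda(rule):
    """Fraction of the 8 neighborhood patterns that map to a live cell."""
    return sum((rule >> i) & 1 for i in range(8)) / 8

def step(cells, rule):
    """One update of an elementary cellular automaton (wraparound edges)."""
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

for rule in (0, 110, 30):        # ordered, "edge of chaos", chaotic
    cells = [0] * 31
    cells[15] = 1                # a single live cell in the middle
    for _ in range(10):
        cells = step(cells, rule)
    print(rule, langton_lambda(rule), "".join("#" if c else "." for c in cells))
```

Printing every intermediate row instead of just the last one shows the familiar triangular space-time diagrams of each regime.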
A number of different researchers have posited theories around this process of self-organization far from equilibrium. The principle of “order from noise” was formulated by the cybernetician Heinz von Foerster in 1960. It states that self-organization is facilitated by random perturbations and noise that let the system explore a variety of states in its state space. A similar principle was presented by Ilya Prigogine as “order through fluctuations” or “order out of chaos.”10 Researcher Per Bak also looked at this phenomenon in terms of what he called self-organized criticality, the mechanism by which complex systems tend to maintain themselves on this critical edge. Many of these theories point to the need for both entropy and variety if the system is to keep adapting and evolving over a prolonged period of time.
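Bak's idea is usually illustrated with the Bak–Tang–Wiesenfeld sandpile: grains are dropped one at a time, any site holding four or more grains topples one grain to each neighbor, and the pile organizes itself toward a critical state where avalanches of many different sizes occur without any external tuning. A minimal sketch (the grid size and grain count here are arbitrary choices):

```python
import random

def topple(grid, size):
    """Relax the pile: any site with 4+ grains sheds one to each neighbor.
    Grains falling off the edge leave the system. Returns the avalanche size."""
    avalanche = 0
    unstable = [(i, j) for i in range(size) for j in range(size) if grid[i][j] >= 4]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:
            continue
        grid[i][j] -= 4
        avalanche += 1
        for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            if 0 <= ni < size and 0 <= nj < size:
                grid[ni][nj] += 1
                if grid[ni][nj] >= 4:
                    unstable.append((ni, nj))
    return avalanche

random.seed(1)
size = 10
grid = [[0] * size for _ in range(size)]
sizes = []
for _ in range(5000):                       # drive the system slowly...
    i, j = random.randrange(size), random.randrange(size)
    grid[i][j] += 1                         # ...one grain at a time
    sizes.append(topple(grid, size))

# After the transient, small and large avalanches coexist.
print(max(sizes), sum(s > 0 for s in sizes))
```

The point of the model is that the critical, avalanche-prone state is reached and maintained by the dynamics themselves, not by tuning a parameter to a special value.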