Piero Scaruffi (Copyright © 2006 Piero Scaruffi | Legal restrictions - Terms of use)
The Origin of Order
(These are excerpts from, or extensions to, the material published in my book "The Nature of Consciousness")
When Darwin discovered evolution, he also indirectly created the premises for a momentous paradigm shift. Over the centuries, Science had always held that order can only be built by rational agents (e.g., us) who apply a set of fundamental laws of engineering (ultimately, Physics). Scientists such as Galileo and Newton simply refined that model by using more and more sophisticated mathematics. Throughout the theoretical developments of Physics, the fundamental idea remained that, in nature, order needs to be somehow created by external forces. Darwin, instead, showed that order can build itself spontaneously, without any help from the outside. Evolution is such a process: it is capable of building higher and higher degrees of order starting from almost nothing. As far as Darwin was concerned, this paradigm only applied to Biology, but his idea of spontaneous "emergence" of order can be applied to any natural phenomenon.
More than a century later, Darwin caused a dramatic change in the idea of Physics itself: are splitting the atom and observing distant galaxies the right ways to explain the universe? Or should we focus instead on the evolutionary process that gradually built the universe the way it is now? Should we study how things are modified when a force is applied (the vast majority of what Physics does today), or should we deal with how things modify themselves spontaneously? Can Physics ever explain how a tree grows, or how a cloud moves, by bombarding particles with radiation?
The macroscopic phenomena that we observe are more likely to be explained by laws about systems than by laws about particles. The subject of Physics is still the origin of order, but the Darwinian perspective provided a new approach.
Furthermore, order is directly related to information, and Darwin’s theory has to do with the creation of information (a new species is a new pattern of information). From this new perspective, Physics may be as much a study of information as it is a study of gravitation or electricity. And the creation of order is inevitably related to the destruction of entropy (or the creation of negative entropy). In the Darwinian view of things, entropy is therefore elevated to a higher rank among physical quantities.
Darwin’s laws (unlike the laws of nature claimed by physical sciences) cannot be written down in the form of differential equations. Darwin’s laws can only be stated in a "narrative" manner, and any attempt to formalize them resorts to algorithms rather than to equations. Algorithms are fundamentally different from equations in that they are discrete, rather than continuous, that they occur in steps rather than instantaneously, and that they can refer to themselves. A Physics based on algorithms would be inherently different from a Physics based on equations.
Finally, Darwin’s paradigm is one that is rooted in the concept of organization and that ultimately aims at explaining organization. Indirectly, Darwin helped us understand the elementary fact that the concept of organization is deeply rooted in the physical universe.
Darwin’s treatise on the origin of species was indeed a treatise on the origin of order. There lies its monumental importance.
Design Without a Designer
Why do children grow up? Why aren't we born adults? Why do all living things (from organs to ecosystems) have to grow, rather than being born directly in their final configuration?
Darwin's principle was that given a population and fairly elementary rules of how the population can evolve (mainly, variation and natural selection), the population will evolve, becoming better and better adapted over time. Whether natural selection is really the correct rule is a secondary issue. Darwin's powerful idea was that the target object can be reached not by designing it and then building it, but by taking a primitive object and letting it evolve. The target object will not be built: it will emerge. Trees are not built, they grow. Societies are not built, they form over centuries. Most of the interesting things that we observe in the world are not built, they developed slowly over time. How they happen to be the way they are depends to some extent on the advantages of being the way they are and to some extent on mere chance.
When engineers build a bridge, they don't let chance play with the design and they don't assume that the bridge will grow by itself. They know exactly what the bridge is going to look like and they decide on which day construction will be completed. They know that the bridge is going to work because they can use mathematical formulas. Nature seems to use a different system, in which things use chance to vary, and then variation leads to evolution because of the need for adaptation. By using this system, Nature seems to be able to obtain far bigger and more complex structures than humans can ever dream of building.
It is ironic that, in the process, Nature uses much simpler mathematics. Engineers need to deal with derivatives and cosines. Nature's mathematics (i.e., the mathematics involved in genetic variation) is limited to Arithmetic. Humans have developed a system that is much more complex than anything Nature has ever dreamed of using!
It is stunning that such simple algorithms as used by Nature can produce the complexity of living organisms. Each algorithm can be reduced to even simpler steps. And still the repeated application of those steps eventually yields the complex order of life.
The same theme occurs inside the brain. Neurons exchange simple messages, but the network of those messages over time can produce the very complex behavior of the human mind. That is another simple algorithm that creates complexity.
In both cases the algorithm is simple, but there is a catch. The algorithm is such that every time it ends it somehow remembers the result of its computation and will use it as the starting point for the next run. Species are selected out of the most recently selected species. Neural connections are modified out of the connections already established.
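This two-step loop, vary, select, and restart from what survived, is easy to sketch in code. The following toy genetic algorithm (an illustration of the principle, not Darwin's or anyone's actual model; the target value is arbitrary) evolves random strings of digits, with each generation starting from the survivors of the previous one:

```python
import random

def fitness(genome):
    # Toy fitness: how close the genome's digit sum is to a target of 42.
    return -abs(sum(genome) - 42)

def evolve(pop_size=20, genome_len=8, generations=200):
    # Start from an almost "primitive" population of random genomes.
    pop = [[random.randint(0, 9) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the better-adapted half.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        # Variation: offspring are mutated copies of the survivors.
        # Crucially, each run starts from the previous run's result.
        offspring = []
        for parent in survivors:
            child = parent[:]
            child[random.randrange(genome_len)] = random.randint(0, 9)
            offspring.append(child)
        pop = survivors + offspring
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))  # near 0: sum(best) has drifted close to 42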
Chaos and Complexity
Both biological and physical sciences need a mathematical model of phenomena of emergence (spontaneous creation of order), and in particular adaptation, as well as a physical justification of their dynamics (which seems to violate physical laws such as the second law of Thermodynamics).
The French physicist Sadi Carnot, one of the founding fathers of Thermodynamics, realized that the statistical behavior of a complex system could be predicted if its parts were all identical and their interactions weak. It was not feasible for Physics to predict the behavior of each single part, but it was relatively easy to predict the statistical behavior of the parts.
At the beginning of the 20th century, another French physicist, Henri Poincaré, realizing that the behavior of a complex system can become unpredictable if it consists of a few parts that interact strongly, invented "chaos" theory. A system is said to exhibit the property of chaos if a slight change in the initial conditions results in large-scale differences in the result. This "chaos" opens up a wealth of possibilities for the future of the system.
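A one-line nonlinear equation is enough to exhibit this sensitivity. The logistic map (a standard textbook example, not Poincaré's own) sends two trajectories that start a billionth apart to completely different values within a few dozen steps:

```python
# Logistic map x -> r*x*(1-x): for r = 4 the iteration is chaotic.
def logistic(x, r=4.0, steps=40):
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = logistic(0.2)
b = logistic(0.2 + 1e-9)
print(abs(a - b))  # the billionth-sized initial difference has exploded
```

After only a few steps the two trajectories are still indistinguishable; by step 40 they bear no relation to each other, which is exactly the "large-scale differences in the result" that defines chaos.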
The most interesting feature of chaotic systems is that under some circumstances chaotic systems spontaneously "crystallize" into a higher degree of order, i.e. properties begin to emerge.
A system must be "complex" enough for any property to "emerge" out of it. Complexity can be formally defined as nonlinearity. As a matter of fact, the world is mostly nonlinear. The "linear" world studied by classical Physics (Newton’s equations are linear) is really just an exception to the rule.
The science of nonlinear dynamics is also known as "chaos theory" because unpredictable solutions emerge from nonlinear equations.
A useful abstraction to describe the evolution of a system in time is that of a "phase space". Our ordinary space has only three dimensions (width, height, depth) but in theory we can think of spaces with any number of dimensions. The phase space of a single particle has six dimensions, three of which are the usual spatial dimensions while the other three are the components of velocity along those spatial dimensions. In ordinary 3-dimensional space, a "point" can only represent the position of a system. In 6-dimensional phase space, a point represents both the position and the motion of the system. The evolution of a system is represented by some sort of shape in phase space.
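As a toy illustration (not from the text): for a particle confined to one dimension, phase space shrinks to just two dimensions, position and velocity, and the evolution of a damped oscillator traces a spiral through it:

```python
# Damped harmonic oscillator, x'' = -x - 0.5*x'.
# For one particle on a line, phase space is 2-dimensional: each state
# of the system is one point (position x, velocity v); its evolution
# over time is a curve through that space.
x, v, dt = 1.0, 0.0, 0.01
for _ in range(5000):
    a = -x - 0.5 * v               # restoring force plus friction
    x, v = x + v * dt, v + a * dt  # one Euler step through phase space
print((x, v))  # very close to (0, 0): the spiral collapses onto a point
```

The curve spirals into the single point (0, 0), an example of the "attractor" discussed next.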
An "attractor" is a region of phase space where the system is doomed to end up eventually. For a linear system the attractor basically describes its typical behavior. A point attractor is the state in which the system ends its motion. A periodic attractor is the state that a system returns to periodically. Chaotic systems have neither point nor periodic attractors, but they may exhibit attractors that are shapes with fractional dimension. If the region that these "strange attractors" occupy is finite, then, by definition, the behavior of the chaotic system is not truly random. The first "strange attractor" was "discovered" in 1963 by the USA meteorologist Edward Lorenz: it was due to a nonlinear system that evolves over time in a non-repeating pattern. In phase space its evolution looks like an infinite series of ever-changing patterns that seem to be attracted to a point but never repeat the same trajectory around it.
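Lorenz's system itself takes only a few lines to integrate. This sketch uses his classic 1963 parameters and a crude Euler integration (an illustration only; a serious computation would use a better integrator):

```python
# Lorenz's 1963 system with his classic parameters
# (sigma = 10, rho = 28, beta = 8/3), integrated by Euler steps.
def lorenz(steps=50000, dt=0.001):
    x, y, z = 1.0, 1.0, 1.0
    for _ in range(steps):
        dx = 10.0 * (y - x)
        dy = x * (28.0 - z) - y
        dz = x * y - (8.0 / 3.0) * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
    return x, y, z

print(lorenz())  # the trajectory never repeats, yet stays bounded
```

However long it runs, the trajectory keeps orbiting the attractor's two lobes without ever repeating itself and without ever escaping a finite region of phase space: non-random, yet unpredictable.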
Nonlinear processes are ubiquitous. They are processes of emergent order and complexity, of how structure arises from the interaction of many independent units. These processes recur at every level, from morphology to behavior. At every level of science (including the brain and life) the spontaneous emergence of order, or self-organization of complex systems, is a common theme.
Darwin's vision of natural selection as a creator of order is probably not sufficient to explain all the spontaneous order exhibited by both living and inanimate matter. There might be other physical principles at work.
Koestler and Salthe showed how complexity entails hierarchical organization. Von Bertalanffy's general systems theory, Haken's synergetics, and Prigogine's non-equilibrium Thermodynamics belong to the class of theories that extend Physics to dynamic systems.
These theories have in common the fact that they deal with self-organization (how collections of parts can produce structures) and try to provide a unified view of the universe at different levels of organization (from living organisms to physical systems to societies).
The Hungarian writer Arthur Koestler brought together a wealth of biological, physical, anthropological and philosophical notions to construct a unified theory of open hierarchical systems.
Language has to do with a hierarchical process of spelling out implicit ideas in explicit terms by means of rules and feedback. Organisms and societies also exhibit the same hierarchical structure. In these hierarchies, each intermediary entity ("holon") functions as a self-contained whole relative to its subordinates and as one of the dependent parts of its superordinates. Each holon tends to persist and assert its pattern of activity.
Koestler thought that, wherever there is life, it must be hierarchically organized. He argued that life exhibits an integrative property (that manifests itself as symbiosis) that enables the gradual construction of complex hierarchies out of simple holons. In nature there are no separate, indivisible, self-contained units. An "individual" is an oxymoron. An organism, instead, is a hierarchy of self-regulating holons (a "holarchy") that work in conjunction with their environment. Holons at the higher levels of the hierarchy enjoy progressively more degrees of freedom, while holons at the lower levels have progressively fewer. Moving up the hierarchy, we encounter more and more complex, flexible and creative patterns of activity. Moving down the hierarchy, behavior becomes more and more mechanized.
Hierarchical processes of the same nature can be found in the development of the embryo, in the evolution of species and in consciousness itself. The latter should be analyzed not in the context of the mind/body dichotomy but in the context of a multi-leveled hierarchy and of degrees of consciousness.
They all share common themes: a tendency towards integration (a force that is inherent in the concept of hierarchic order, even if it seems to challenge the second law of Thermodynamics as it increases order), an openness at the top of the hierarchy (towards higher and higher levels of complexity) and the possibility of infinite regression.
Hierarchies from Complexity
The USA zoologist Stanley Salthe developed a multi-dimensional theory: an ontology of the world, a hierarchical representation of the world and a model of the evolution of the world.
Salthe inherits a definition of complexity from the USA biologist Howard Pattee: complexity is the result of interactions between physical and symbolic systems. A physical system is dependent on the rates at which processes occur, whereas a symbolic system is not. Symbolic systems frequently serve as constraints applied to the operation of physical systems, and frequently appear as products of the activity of physical systems (e.g., the genome in a cell). A physical system can be said to be "complex" when a part of it functions as a symbolic system (as a representation, and therefore as an observer) for another part of it.
John Von Neumann was possibly the first scientist to realize that it takes a certain degree of complexity for physical molecules to enter the cycle of Darwinian open-ended evolution. This degree of complexity is determined by description-based reproduction. Once there is description-based reproduction, then there is Darwinian evolution. Von Neumann thus made a distinction between "quiescent" symbolic description and "active" physical dynamics, between descriptions and constructions, between, ultimately, semiotic processes (information and codes) and physical systems (energy-matter and forces). Pattee believed that this distinction occurs at all levels of organization in the universe, not only at the level of genes and proteins. At the highest level this distinction translates into the distinction between the physical systems that populate the universe and the semiotic process of measurement (which codes a dynamical state into quiescent symbols). The genotype-phenotype distinction is an instance of the origin of symbol systems from physical systems. What causes normal physical molecules to start functioning as descriptions? Pattee called this phenomenon, which separates description from construction and which occurs at all levels of organization, the "epistemic cut". Its origin lies not in the chemical properties of cells but in the complex relationships established within a complex hierarchical system.
Salthe was also influenced by the metaphysics of the USA philosopher Justus Buchler. His "natural complex" is pretty much anything that one can think of, whether organism, concept or conscious event. On the other hand, an "order" is a multiplicity that becomes a unity in virtue of its internal organization (in virtue of the pattern of relatedness among its components). Buchler’s "principle of ordinality" states that every natural complex is an order. Basically, the principle of ordinality asserts that every complex must be constituted by other complexes, and that every complex must be one of the constituents of some other complex. Every complex is relative to some other complex, is conditioned by and conditions other complexes.
The Argentine philosopher Mario Bunge saw the universe not as a heap of things but as a system composed of interconnected systems of various kinds (physical, biological, economic, political, cultural). Bunge's systemism offered an alternative to both individualism and holism, allowing for both individual identity and collective organization. Bunge argued that a system is defined by "composition" (what it is made of), "environment" (what surrounds it), "structure" (what holds it together) and "mechanism" (how it operates).
In Salthe’s theory, the world is viewed as a determinate machine of unlimited complexity. Within complexity, discontinuities arise. The basic structure of this world must allow for complexity that is spontaneously stable and that can be broken down in things divided by boundaries. The most natural way for the world to satisfy this requirement is to employ a hierarchical structure, which is also implied by Buchler's principle of ordinality: Nature is a hierarchy of entities existing at different levels of organization. Hierarchical structure turns out to be a consequence of complexity.
Entities of the hierarchy are defined by four attributes: boundaries, scale, integration, continuity. An entity has size, is limited by boundaries, and consists of an integrated system, which varies continuously in time.
Entities at different levels interact through mutual constraints, each constraint carrying information for the level it operates upon. A process can be described by a triad of contiguous levels: the one it occurs at, its context (what Bunge called "environment") and its causes (Bunge's "structure"). In general, a lower level provides initiating conditions for a process and an upper level provides boundary conditions. Representing a dynamic system hierarchically requires a triadic structure.
Aggregation occurs upon differentiation. Differentiation interpolates new levels between the original ones, and the new entities aggregate in a way that affects the structure of the upper levels: every time a new level emerges, the entire hierarchy must reorganize itself.
These abstract principles also apply to biological evolution. Over time, Nature generates entities of gradually more limited scope and more precise form and behavior. This process populates the hierarchy of intermediate levels of organization as the hierarchy spontaneously reorganizes itself. The same model applies to all open systems, whether organisms or ecosystems or planets.
Basically, Salthe aims at reformulating Biology on development rather than on evolution. His approach is non-Darwinian to the extent that development, and not evolution, is assumed to be the fundamental process in self-organization. Evolution, in his opinion, is merely the result of a margin of error.
Salthe’s grand theory of nature turns out to be essentially a theory of change, which turns out to be essentially a theory of emergence.
General Systems Theory
"General Systems Theory" was born before Cybernetics, and cybernetic systems are merely a special case of self-organizing systems; but General Systems Theory took longer to establish itself. It was conceived in the 1930s by the Austrian biologist Ludwig Von Bertalanffy. His ambition was to create a "universal science of organization". His legacy is to have started "system thinking", thinking about systems as systems and not as mere aggregates of parts.
The classical approach to the scientific description of a system's behavior (whether in Physics or in Economics) can be summarized as the search for "isolatable causal trains" and the reduction to atomic units. This approach is feasible under two conditions: 1. that the interaction among the parts of the system be negligible and 2. that the behavior of the parts be linear. Von Bertalanffy's "systems", on the other hand, are those entities ("organized complexities") that consist of strongly interacting parts, usually described by a set of nonlinear differential equations. Systems Theory studies principles that apply to all systems, i.e. properties that apply to any entity qua system.
Basic concepts of Systems Theory are, for example, the following: every whole is based upon the competition among its parts; individuality is the result of a never-ending process of progressive centralization whereby certain parts gain a dominant role over the others.
General Systems Theory mainly studies "wholes", which are characterized by such holistic properties as hierarchy, stability, teleology.
General Systems Theory looks for laws that can be applied to a variety of fields (i.e., for an isomorphism of laws in different fields), particularly in the biological, social and economic sciences (but even to history and politics).
"Open Systems Theory" is a subset of General Systems Theory. Because of the second law of Thermodynamics, the change in entropy in closed systems is never negative: order is continually destroyed. In open systems (such as living systems), on the other hand, entropy production due to irreversible processes is balanced by the import of negative entropy.
A living organism can be viewed as a hierarchical order of open systems, where each level maintains its structure thanks to continuous change of components at the next lower level. Living organisms maintain themselves in spite of continuous irreversible processes and even proceed towards higher and higher degrees of order.
The "theory of natural systems" of the Hungarian philosopher Ervin Laszlo is a theory of the invariants of organized complexity. It is centered on the concept of "ordered whole", whose structure is defined by a set of constraints. Laszlo adopts a variant of the principle of self-organization formulated by the British psychiatrist Ross Ashby, according to which any isolated natural system subject to constant forces is inevitably inhabited by "organisms" that tend towards stationary or quasi-stationary non-equilibrium states. Natural systems sharing an environment tend to organize in hierarchies. The set of such systems tends to become itself a system, its subsystems providing the constraints for the new system.
In Laszlo's view, the combination of internal constraints and external forces yields adaptive self-organization. Natural systems evolve towards increasingly adapted states, corresponding to increasing complexity (and negative entropy).
Order emerges at the atomic ("micro-cybernetics"), organismic ("bio-cybernetics") and social levels ("socio-cybernetics").
The system-oriented approach can also address a particular class of natural systems: cognitive systems. The mind, just like any other natural system, exhibits a holistic character, adaptive self-organization, and hierarchies, and can be studied with the same tools used for all other natural systems ("psycho-cybernetics").
Laszlo views the dynamics of the universe as driven by "third-state systems". First-state systems are systems in equilibrium. Second-state systems are systems near equilibrium. Third-state systems are non-linear systems that are farthest from equilibrium. Third-state systems must import energy in order to survive, and, in doing so, they end up creating new order, at higher and higher levels of complexity. These systems tend to form hyper-cycles, and Laszlo calls this tendency "convergence". It is convergence that led to the formation of galaxies, to the evolution of more complex forms of life, to the birth of consciousness. Laszlo's convergence seems to act like a universal force that endlessly destroys order and rebuilds it at a higher level.
"Synergetics", as developed by the German physicist Hermann Haken, is a theory of pattern formation in complex systems. It tries to explain structures that develop spontaneously in nature.
Synergetics studies cooperative processes of the parts of a system far from equilibrium that lead to an ordered structure and behavior for the system.
Haken's favorite example was the laser: how do the atoms of the laser agree to produce a single coherent wave flow? The answer is that the laser is a self-organizing system far from equilibrium (what Prigogine would call a dissipative structure).
A "synergetic" process in a physical system is one in which, when energy is pumped into the system, some macroscopic structure emerges from the disorderly behavior of the large number of microscopic particles that make up the physical system. As energy is pumped into the system, initially nothing seems to happen, other than additional excitation of the particles, but then the system reaches a threshold beyond which structure suddenly emerges. The laser is such a synergetic process: a beam of coherent light is created out of the chaotic movement of particles. What happens is that energy pushes the system of particles beyond a threshold, and suddenly the particles start behaving harmoniously.
Synergetics revolves around a number of technical concepts: compression of the degrees of freedom of a complex system into dynamic patterns that can be expressed as a collective variable; behavioral attractors of changing stabilities; and the appearance of new forms as non-equilibrium phase transitions.
Systems at instability points (at the "threshold") are driven by a "slaving principle": long-lasting quantities (the macroscopic pattern) can enslave short-lasting quantities (the chaotic particles), and they can force order on them (thereby becoming "order parameters").
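The simplest textbook sketch of an order parameter (a standard model in the synergetics literature, not taken from this essay) is the equation dq/dt = eps*q - q^3: below the threshold (eps < 0) the only stable state is q = 0, but past the threshold a nonzero macroscopic value of q suddenly appears:

```python
# Order-parameter equation dq/dt = eps*q - q**3 (a standard sketch).
# Below the threshold (eps < 0) the state decays to zero: no order.
# Past the threshold (eps > 0) a nonzero "order parameter" emerges,
# settling at q = sqrt(eps).
def settle(eps, q=0.1, dt=0.01, steps=20000):
    for _ in range(steps):
        q += (eps * q - q ** 3) * dt   # Euler step of the dynamics
    return q

below = settle(eps=-0.5)   # relaxes to ~0: disorder
above = settle(eps=+0.5)   # relaxes to ~0.707 = sqrt(0.5): order
print(below, above)
```

The sudden appearance of the nonzero solution as eps crosses zero is the kind of non-equilibrium phase transition, and the single variable q the kind of collective "order parameter", that the slaving principle describes.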
The system exhibits both a stable "mode", which is the chaotic motion of its particles, and an unstable "mode", which is the macroscopic structure and behavior of the whole system. Close to instability, stable modes are "enslaved" by unstable modes and can be ignored. Instead of having to deal with millions of chaotic particles, one can focus on the macroscopic quantities. De facto, the degrees of freedom of the system are reduced.
The dynamic equations for such a system reflect the interplay between stochastic forces ("chance") and deterministic forces ("necessity").
Synergetics applies to systems driven far from equilibrium, where the classic concepts of Thermodynamics are no longer adequate. It expresses the fact that order can arise from chaos and can be maintained by flows of energy/matter.
The German chemist Manfred Eigen was awarded the Nobel Prize in 1967 for discovering that very short pulses of energy could trigger extremely fast chemical reactions. In the following years, he started looking for how very fast reactions could be used to create and sustain life.
Indirectly, he ended up studying the behavior of biochemical systems far from equilibrium.
Eventually, Eigen came up with the concept of an "hypercycle". A hypercycle is a cyclic reaction network, i.e. a cycle of cycles of cycles (of chemical reactions). Then he argued that life can be viewed as the product of a hierarchy of such hypercycles.
A catalyst is a substance that favors a chemical reaction. When enough energy is provided, some catalytic reactions tend to combine to form networks, and such networks may contain closed loops, called catalytic cycles.
If even more energy is pumped in, the system moves even farther from equilibrium, and then catalytic cycles tend to combine to form closed loops of a higher level, or hypercycles, in which the enzymes produced by a cycle act as catalysts for the next cycle in the loop. Each link of the loop is now a catalytic cycle itself.
Eigen showed that hypercycles are capable of self-replication, which may therefore have been a property of nature even before the invention of living organisms.
Hypercycles are capable of evolution through more and more complex stages. Hypercycles compete for natural resources and are therefore subject to natural selection.
The hypercycle falls short of being a living system because it defines no "boundary": the boundary is the container where the chemical reaction is occurring. A living system, on the other hand, has a boundary that is part of the living system (e.g., the skin).
Catalysis is the phenomenon by which a chemical reaction is sped up: without catalysis, all the processes that give rise to life would take much longer, and probably would not be fast enough for life to happen. Eigen then showed that catalytic reactions can be organized into an autocatalytic cycle, i.e. a cycle that is capable of reproducing itself: this is the fundamental requirement of life. A set of autocatalytic cycles gets, in turn, organized into a catalytic hypercycle. This catalytic hypercycle represents the basic form of life.
Formally: "hypercycles" are a class of nonlinear reaction networks. They can originate spontaneously within the population of a species through natural selection and then evolve to higher complexity by allowing for the coherent evolution of a set of functionally coupled self-replicating entities. A hypercycle is based on nonlinear autocatalysis, which is a chain of reproduction cycles, which are linked by cyclic catalysis, i.e. by another autocatalysis. A hypercycle is a cycle of cycles of cycles.
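A hypercycle of this kind can be sketched numerically. In the minimal form below (an assumed, simplified form of Eigen's equations, for illustration only), each member of a three-species cycle replicates at a rate catalyzed by the previous member, while a dilution flux keeps the total concentration constant:

```python
# Minimal hypercycle sketch: species i is replicated with the catalytic
# help of species i-1, closing the loop (Python's x[-1] wraps around,
# so species 0 is catalyzed by species 2).
def hypercycle(dt=0.002, steps=100000):
    x = [0.32, 0.33, 0.35]     # slightly unequal concentrations, sum 1
    for _ in range(steps):
        growth = [x[i] * x[i - 1] for i in range(3)]  # cyclic catalysis
        phi = sum(growth)      # dilution flux keeps the total constant
        x = [x[i] + (growth[i] - x[i] * phi) * dt for i in range(3)]
    return x

print(hypercycle())  # the three members coexist at roughly equal levels
```

For this small symmetric cycle the coupled self-replicating entities settle into stable coexistence rather than excluding one another, which is the "coherent evolution" that the hypercycle makes possible.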
Eigen’s model explains the simultaneous unity (due to the use of a universal genetic code) and diversity (due to the "trial and error" approach of natural selection) in evolution. This dual process started even before life was created. Evolution of species was preceded by an analogous stepwise process of molecular evolution.
Evolution itself turns out to be inevitable: given a set of self-reproducing entities that feed on a common and limited source of energetic/material supply, evolution will spontaneously appear. Evolution is a direct consequence of the dynamics of self-reproducing systems.
That said, not all systems are suitable for becoming successful biological systems. Systems can be classified in four groups according to their stability with respect to fluctuations: stable systems (the fluctuations are self-regulating), indifferent systems (the fluctuations have no effect), unstable systems (self-amplification of the fluctuations) and variable systems (which can be in any of the previous states). Only the last type is suitable for generation of biological information because it can play all the best tactics: indifference towards a broad mutant spectrum, stability towards selective advantages and instability towards unfavorable configurations. In other words, it can take the most efficient stance in the face of both favorable and adverse situations.
The Chilean neurologist Francisco Varela adapted Humberto Maturana's biological ideas to his theory of autonomous systems. He merged the themes of autonomy of natural systems (i.e. internal regulation) and their informational abilities (i.e., cognition) into the theme of a system maintaining an identity and interacting with the rest of the world.
The organization of a system is the set of relations that define it as a unity. The structure of a system, on the other hand, is the set of relations among its components. Components and relations among them may change over time without necessarily changing the overall organization. For example, a machine can be implemented by different sets of components and relations among them.
"Homeostatic" systems are systems that keep the values of their variables within a small range of values.
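A minimal homeostatic system can be simulated in a few lines (a toy illustration, not Maturana's or Varela's formalism): a variable is buffeted by random external perturbations, and a feedback term keeps pulling it back toward a set point:

```python
import random

# Toy homeostat: despite constant random disturbances, negative
# feedback keeps the variable within a narrow range of the set point.
def homeostat(setpoint=37.0, gain=0.5, steps=1000, seed=1):
    rng = random.Random(seed)
    value = setpoint
    lo = hi = value
    for _ in range(steps):
        value += rng.uniform(-1.0, 1.0)     # external perturbation
        value += gain * (setpoint - value)  # compensating feedback
        lo, hi = min(lo, value), max(hi, value)
    return lo, hi

print(homeostat())  # both extremes stay close to the set point 37
```

The disturbances never stop, yet the variable's entire history stays within a band around the set point: the system compensates external perturbations with internal adjustments.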
An "autopoietic" system is a homeostatic system that continuously generates its own organization, by continuously producing components that are capable of reproducing the organization that created them.
Autopoietic systems turn out to be autonomous, to have an identity, to be unities, and to compensate external perturbations with internal structural changes.
Living systems are autopoietic systems in the physical space. Self-reproduction can only occur in autopoietic systems, and evolution is a direct consequence of self-reproduction.
By definition, an autonomous system is organizationally closed. The cognitive domain of an autonomous system is the domain of interaction that it can enter without losing that closure.
An autonomous system always exhibits two aspects: it specifies the distinction between self and non-self, and deals with its environment in a cognitive fashion. Therefore, every autonomous system (ecosystems, societies, brains, even conversations) is a "mind".
A Science of Prisms
In the 1970s the USA inventor Buckminster Fuller developed a theory, also called "Synergetics", that approached systems from a holistic perspective that is basically the opposite of the reductionist perspective of Physics. Fuller's philosophy was inspired by one of his own inventions, the "geodesic" dome (1954), a structure that exploits a very efficient way of enclosing space and that gets stronger as it gets larger.
"Synergy" is the behavior of a whole that cannot be explained by the parts taken separately. For example, a star attracts a planet: humans could not have predicted this by simply studying the two bodies separately. Synergetics, therefore, studies a system in a holistic (rather than reductionistic) fashion. The way it does this is by focusing on form rather than internal structure. Because of its emphasis on shape, Synergetics is, de facto, a branch of Geometrics, the discipline of configurations (or patterns).
Synergetics employs 60-degree coordination instead of the usual 90-degree coordination. The triangle (and the tetrahedron), instead of the square (and the cube), is the fundamental geometric unit. The tetrahedron is the minimal system, the one enclosed by the fewest possible points (four vertices).
Fuller argued that reality is not made of "things", but of angle and frequency events. All experience can be reduced to only angles and frequencies. Fuller found "prisms" to be ubiquitous in nature and in culture. All systems contained in the universe are polyhedral, "universe" being the collection of all experiences of all individuals.
Synergetics rediscovers, in an almost mystical way, most of traditional science, but mainly through topological considerations (with traditional topology extended to "omnitopology"). For example, Synergetics proves that the universe is finite and expanding, and that Planck's constant is a "cosmic relationship". Synergetics unifies Physics and Metaphysics.
The Belgian (but Russian-born) physicist Ilya Prigogine showed that all biological systems actually belong to the same class of systems: they are all dissipative systems.
Classical Physics describes the world as a static and reversible system that undergoes no evolution, whose information is constant in time. Classical Physics is the science of being. Thermodynamics, instead, describes an evolving world in which irreversible processes occur. Thermodynamics is the science of becoming.
The second law of Thermodynamics, in particular, describes the world as evolving from order to disorder, while biological evolution is about the complex emerging from the simple (i.e. order arising from disorder). While apparently contradictory, these two views show that irreversible processes are an essential part of the universe.
Furthermore, conditions far from equilibrium foster phenomena such as life that classical Physics does not cover at all.
Irreversible processes and non-equilibrium states turn out to be fundamental features of the real world.
Prigogine distinguishes between "conservative" systems (which are governed by the three conservation laws for energy, translational momentum and angular momentum, and which give rise to reversible processes) and "dissipative" systems (subject to flows of energy and/or matter). The latter give rise to irreversible processes.
The theme of science is order. Order can come either from equilibrium systems or from non-equilibrium systems that are sustained by a constant source (or, equivalently, by a persistent dissipation) of matter/energy. In the latter systems, order is generated by the flow of matter/energy. All living organisms (including the biosphere as a whole) are non-equilibrium systems.
Prigogine proved that, under special circumstances, the distance from equilibrium and the nonlinearity of a system drive the system to ordered configurations, i.e. create order. The science of being and the science of becoming describe dual aspects of Nature.
What is needed is a combination of factors that are exactly the ones found in living matter: a system made of a large collection of independent units which are interacting with each other; a flow of energy through the system that drives the system away from equilibrium; and nonlinearity. Nonlinearity expresses the fact that a perturbation of the system may reverberate and have disproportionate effects.
Non-equilibrium and nonlinearity favor the spontaneous development of self-organizing systems, which maintain their internal organization, regardless of the general increase in entropy, by expelling matter and energy in the environment.
When such a system is driven away from equilibrium, local fluctuations appear. This means that the system gets very unstable in some places. Localized tendencies to deviate from equilibrium are amplified. When a threshold of instability is reached, one of these runaway fluctuations is so amplified that it takes over as a macroscopic pattern. Order appears from disorder through what are initially small fluctuations within the system. Most fluctuations die along the way, but some survive the instability and carry the system beyond the threshold: those fluctuations "create" new form for the system. Fluctuations become sources of innovation and diversification.
The potentialities of nonlinearity are dormant at equilibrium but are revealed by non-equilibrium: multiple solutions appear and therefore diversification of behavior becomes possible.
Technically speaking, nonlinear systems driven away from equilibrium can generate instabilities that lead to "bifurcations" (and "symmetry breaking" beyond bifurcation). When the system reaches the bifurcation point, it is impossible to determine which path it will take next. Chance rules. Once the path is chosen, determinism resumes.
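The flavor of this bifurcation scenario can be illustrated with a toy system. The sketch below uses the logistic map, a standard nonlinear iteration (an illustrative stand-in, not Prigogine's chemical equations): as the control parameter r is pushed further from the tame regime, the single stable solution splits into two, then four.

```python
# Minimal illustration of bifurcation in a nonlinear system: the logistic map.
# (An illustrative stand-in, not Prigogine's own equations.)

def attractor(r, x0=0.4, transient=2000, sample=64):
    """Iterate x -> r*x*(1-x), discard the transient, return distinct values visited."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    seen = set()
    for _ in range(sample):
        x = r * x * (1 - x)
        seen.add(round(x, 6))
    return sorted(seen)

for r in (2.8, 3.2, 3.5):
    # the attractor doubles in size at each bifurcation point
    print(r, len(attractor(r)))   # 1, then 2, then 4 states
```

Before the first bifurcation every initial condition ends on the same fixed point (determinism); past it, which branch of the cycle the system visits first depends on the initial fluctuation.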
The multiplicity of solutions in nonlinear systems can even be interpreted as a process of gradual "emancipation" from the environment.
Most of Nature is made of such "dissipative" systems, of systems subject to fluxes of energy and/or matter. Dissipative systems conserve their identity thanks to the interaction with the external world. In dissipative structures, non-equilibrium becomes a source of order.
In general, self-organization is the spontaneous emergence of ordered structure and behavior in open systems that are far from equilibrium and that are described mathematically by nonlinear equations.
These considerations apply to living organisms, which are prime examples of dissipative structures in non-equilibrium. Prigogine's theory explains how life can exist and evolution can work towards higher and higher forms of life. A "minimum entropy production" principle characterizes living organisms: stable dissipative systems near equilibrium minimize their rate of entropy production.
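Prigogine's entropy bookkeeping can be written compactly. The entropy change of an open system splits into an exchange term and an internal production term; the second law constrains only the latter, and the minimum principle concerns its steady-state value in the linear, near-equilibrium regime:

```latex
\frac{dS}{dt} = \frac{d_e S}{dt} + \frac{d_i S}{dt},
\qquad
P \equiv \frac{d_i S}{dt} = \sum_k J_k X_k \ge 0,
\qquad
\frac{dP}{dt} \le 0 .
```

Here the $J_k$ are flows (of heat, matter, chemical reactions) and the $X_k$ the conjugate thermodynamic forces: near equilibrium $P$ decreases monotonically until the steady state, which is therefore the state of minimum entropy production. Far from equilibrium this theorem no longer holds, and that is precisely the regime where dissipative structures appear.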
Catastrophe and chaos theories are special cases of nonlinear non-equilibrium systems.
Catastrophe theory, originally formulated in 1967 by the French mathematician Rene' Thom and popularized ten years later by the work of the British mathematician Christopher Zeeman, became a widely used tool for classifying the solutions of nonlinear systems in the neighborhood of stability breakdown.
In the beginning, Thom was interested in structural stability in topology (stability of topological form) and was convinced of the possibility of finding general laws of form evolution regardless of the underlying substance of form, as already stated at the beginning of the century by the British biologist D'Arcy Thompson.
Thom's goal was to explain the "succession of form". Our universe presents us with forms (that we can perceive and name). A form is defined, first and foremost, by its stability: a form lasts in space and time. Forms change. The history of the universe, insofar as we are concerned, is a ceaseless creation, destruction and transformation of form. Life itself is, ultimately, creation, growth and decaying of form.
Every physical form is represented by a mathematical quantity called an "attractor" in a space of internal variables. If the attractor satisfies the mathematical property of being "structurally stable", then the physical form is the stable form of an object. Changes in form, or morphogenesis, are due to the capture of the attractors of the old form by the attractors of the new form. All morphogenesis is due to the conflict between attractors.
The universe of objects can be divided into domains of different attractors. Such domains are separated by shock waves. Shock wave surfaces are singularities called "catastrophes". A catastrophe is a state beyond which the system is destroyed in an irreversible manner. Technically speaking, the catastrophe sets are hypersurfaces that divide the parameter space into regions of completely different dynamics.
The bottom line is that dynamics and form become dual properties of nonlinear systems.
Thom proves that in a 4-dimensional space there exist seven types of elementary catastrophes. Elementary catastrophes include: "fold", destruction of an attractor, which is captured by a lesser potential; "cusp", bifurcation of an attractor into two attractors; etc. From these singularities, more and more complex catastrophes unfold, until the final catastrophe. Elementary catastrophes are "local accidents". The form of an object is due to the accumulation of many of these "accidents".
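The simplest two-parameter catastrophe, the cusp, can be sketched numerically. For the potential V(x) = x^4/4 + a*x^2/2 + b*x, the equilibria are the real roots of x^3 + a*x + b = 0, and the cubic discriminant counts them; crossing the boundary 4a^3 + 27b^2 = 0 abruptly changes their number, which is the "catastrophe". (The potential and the sample parameter values are a textbook illustration, not Thom's own notation.)

```python
def num_equilibria(a, b):
    """Number of critical points of V(x) = x**4/4 + a*x**2/2 + b*x,
    i.e. real roots of x**3 + a*x + b (counted via the cubic discriminant)."""
    disc = -4 * a**3 - 27 * b**2
    return 3 if disc > 0 else 1   # generic cases only; disc == 0 is the fold set

# Inside the cusp region (4a^3 + 27b^2 < 0) the potential has two wells
# and a hump: two stable equilibria compete (bistability).
print(num_equilibria(-3.0, 0.5))   # 3
# Outside it there is a single equilibrium; crossing the boundary
# destroys one well, and the state jumps discontinuously.
print(num_equilibria(1.0, 0.5))    # 1
```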
What catastrophe theory does is to "geometrize" the concept of "conflict". It is a purely geometric theory of morphogenesis: its laws are independent of the substance, structure and internal forces of the system.
The Origin of Regularity
Prigogine's bifurcation theory is a descendant of the theory of stability initiated by the Russian mathematician Aleksandr Lyapunov. Rene' Thom's catastrophe theory is a particular case of bifurcation theory, so they all belong to the same family. They all elaborate on the same theorem, namely Lyapunov's theorem: for isolated systems, thermodynamic equilibrium is an attractor of nonequilibrium states.
Then the story unfolds, leading to dissipative systems and eventually to the reversing of Thermodynamics' fundamental assumption, the destruction of structure. Order emerges from the very premises that seem to deny it.
Simplexity and complicity
The British biologist Jack Cohen and the British mathematician Ian Stewart studied how the regularities of nature (from Cosmology to Quantum Theory, from Biology to Cognitive Psychology) emerge from the underlying chaos and complexity of nature: "emergent simplicities collapse chaos". They argued that external constraints are fundamental in shaping biological systems (DNA does not uniquely determine an organism) and defined new concepts: "simplexity" (the tendency of simple rules to emerge from underlying disorder and complexity) and "complicity" (the tendency of interacting systems to co-evolve leading to a growth of complexity). Simplexity is a "weak" form of emergence, and is ubiquitous. Complicity is a stronger form of emergence, and is responsible for consciousness and evolution. "Simplexity merely explores a fixed space of the possible… complicity enlarges it."
Emergence is the rule, not the exception, and it is shaped by simplexity and complicity.
The Edge of Chaos
The USA computer scientist Chris Langton (who organized the first Artificial Life conference in 1987) showed that physical systems achieve the prerequisites for the emergence of computation (i.e., transmission, storage, modification) in the vicinity of a phase transition, "at the edge of chaos" ("Computation at the Edge of Chaos", 1990). Specifically, information becomes an important factor in the dynamics of cellular automata in the vicinity of the phase transition between periodic and chaotic behavior, i.e. between order and chaos.
The idea is that some systems undergo transformations, and while they transform they constantly move from order to chaos and back. This transition is similar to the "phase transitions" undergone by a substance when it turns solid, liquid or gaseous. When ice turns into water, the atoms have not changed, but the system as a whole has undergone a phase transition. Microscopically, this means that atoms are behaving in a different way. The transition of a system from chaos to order and back is similar in that the system is still made of the same parts, but they behave in a different way.
The state between order and chaos (the "edge of chaos") is sometimes a very "informative" state, because the parts are not as rigidly assembled as in the case of order and, at the same time, they are not as loose as in the case of chaos. The system is stable enough to store information and unstable enough to transmit it. The system at the edge of chaos is both a storage and a broadcaster of information.
At the edge of chaos, information can propagate over long distances without decaying appreciably, thereby allowing for long-range correlation in behavior: ordered configurations do not allow for information to propagate at all, and disordered configurations cause information to quickly decay into random noise.
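Langton's control parameter λ is, for a binary automaton, simply the fraction of rule-table entries that map to the non-quiescent state: λ = 0 freezes everything, λ = 1 saturates everything, and interesting computation clusters at intermediate values. A sketch for elementary (two-state, nearest-neighbor) cellular automata, assuming the standard Wolfram rule numbering as the encoding:

```python
def rule_table(rule):
    """Elementary CA: map each 3-cell neighborhood (0..7) to the rule's output bit."""
    return {n: (rule >> n) & 1 for n in range(8)}

def lam(rule):
    """Langton's lambda: fraction of neighborhoods mapped to the non-quiescent state (1)."""
    return sum(rule_table(rule).values()) / 8

def step(cells, rule):
    """One synchronous update of a circular row of cells."""
    t = rule_table(rule)
    n = len(cells)
    return [t[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)]

# lambda for a frozen rule, a famously complex rule, and a saturated rule
print(lam(0), lam(110), lam(255))   # 0.0, 0.625, 1.0
```

Rule 110, whose λ sits between the extremes, is a well-known example of an automaton capable of propagating structured information, in line with Langton's picture.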
A fundamental connection therefore exists between computation and phase transition.
The edge of chaos is where the system can perform computation, can metabolize, can adapt, can evolve. In a word: these systems can be alive.
Basically, Langton proved that Physics can support life only in a very narrow boundary between chaos and order. In that locus it is possible to build "organisms" that will settle into recurring patterns conducive to an orderly transmission of information.
Langton’s theory related phase transitions, computation and life: he built a bridge among Thermodynamics, Information Theory and Biology.
Likewise, the USA physicist Murray Gell-Mann argued that living organisms dwell at the edge of chaos, as they exhibit order and chaos at the same time, and they must exhibit both in order to survive. Living organisms are complex adaptive systems that retrieve information from the world, find regularities, compress them into a schema to represent the world, predict the evolution of the world and prescribe behavior for themselves. The schema may undergo variants that compete with one another. Their competition is regulated by feedback from the real world in the form of selection pressure. Disorder is useful for the development of new behavior patterns that enable the organism to cope with a changing environment.
The USA biologist Stuart Kauffman views the dynamics of "complex systems" as a manifestation of the fundamental force that counteracts the universal drift towards disorder required by the second law of Thermodynamics.
His idea is that Darwin was only half right: systems do evolve under the pressure of natural selection, but their quest for order is helped by a property of our universe, the property that "complex" systems just tend to organize themselves. Darwin's story is about the power of chance: by chance life developed and then evolved. Kauffman's story is about destiny: life is the almost inevitable result of a process inherent in nature.
Kauffman's starting point was that cells behave like mathematical networks.
In the early 1960s Jacques Monod and Francois Jacob discovered gene regulation: genes are assembled not in a long string of instructions but in "genetic circuits". Within the cell, there are regulatory genes whose job is to turn on or off other genes. Therefore genes are not simply instructions to be carried out one after the other. Genes realize a complex network of messages. A regulatory gene may trigger another regulatory gene that may trigger another gene… etc. Each gene is typically controlled by two to ten other genes. Turning on just one gene may trigger an avalanche of effects.
The genetic program is not a sequence of instructions but rather a regulatory network that behaves like a self-organizing system.
By using a computer simulation of a cell-like network, Kauffman predicted that, in any organism, the number of cell types should be approximately the square root of the number of genes.
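A minimal sketch of the kind of model behind this claim: a random Boolean network in which every "gene" reads two others through a random Boolean function. Starting the network from many random states and recording which state-cycle each trajectory falls into gives the attractors, Kauffman's stand-ins for cell types. (The network size and sampling parameters below are arbitrary illustrative choices.)

```python
import random

def random_network(n, k=2, seed=1):
    """Kauffman-style random Boolean network: each of the n "genes" reads
    k randomly chosen genes through a random Boolean function (truth table)."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update: every gene fires its Boolean function at once."""
    return tuple(tables[i][sum(state[j] << b for b, j in enumerate(inputs[i]))]
                 for i in range(len(state)))

def attractor(state, inputs, tables):
    """Follow the trajectory until a state repeats; return the cycle reached."""
    seen = set()
    while state not in seen:
        seen.add(state)
        state = step(state, inputs, tables)
    cycle, s = {state}, step(state, inputs, tables)
    while s != state:
        cycle.add(s)
        s = step(s, inputs, tables)
    return frozenset(cycle)

n = 12
net = random_network(n)
rng = random.Random(42)
starts = [tuple(rng.randint(0, 1) for _ in range(n)) for _ in range(200)]
attractors = {attractor(s, *net) for s in starts}
print(len(attractors))  # a handful of attractors ("cell types") out of 2^12 states
```

The striking point is the compression: 2^12 = 4096 possible states collapse onto a few recurrent cycles, which is Kauffman's "order for free".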
He basically started where Langton ended. His "candidate principle" states that organisms change their interactions in such a way to reach the boundary between order and chaos.
For example, the Danish physicist Per Bak studied piles of sand, whose collapse under the weight of one more grain is unpredictable: no external force shapes the pile; it organizes itself into a critical state (a phenomenon Bak called "self-organized criticality").
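A minimal sketch of the Bak-Tang-Wiesenfeld sandpile: grains are dropped on a grid, any cell that reaches four grains topples one grain to each neighbor, and topplings can cascade. The grid size and grain count below are arbitrary illustrative choices.

```python
import random

def drop(grid, n, i, j):
    """Add one grain at (i, j), topple every cell that reaches 4 grains,
    and return the avalanche size (total number of topplings)."""
    grid[i][j] += 1
    size = 0
    unstable = [(i, j)]
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < 4:
            continue
        grid[x][y] -= 4
        size += 1
        if grid[x][y] >= 4:               # may still be overloaded
            unstable.append((x, y))
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < n and 0 <= ny < n:   # grains at the edge fall off
                grid[nx][ny] += 1
                if grid[nx][ny] >= 4:
                    unstable.append((nx, ny))
    return size

n = 20
grid = [[0] * n for _ in range(n)]
rng = random.Random(0)
sizes = [drop(grid, n, rng.randrange(n), rng.randrange(n)) for _ in range(5000)]
print(max(sizes), sizes.count(0))   # a few huge avalanches amid many tiny ones
```

Identical grains, dropped by the same rule, trigger avalanches of wildly different sizes: the unpredictability lives in the self-organized critical state of the pile, not in the grains.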
Further examples include any ecosystem (in which organisms live at the border between extinction and overpopulation) and the price of a product (which supply and demand hold at the border between nobody buying it and everybody buying it). Evolution proceeds towards the edge of chaos. Systems on the boundary between order and chaos have the flexibility to adapt rapidly and successfully.
Natural selection and self-organization complement each other: they create complex systems poised at the edge between order and chaos, which are fit to evolve in a complex environment. At all levels of organization, whether of living organisms or ecosystems, the target of selection is a type of adaptive system at the edge between chaos and order.
In 1932 the USA biologist Sewall Wright had introduced the concept of "fitness landscapes". Fitness is the replication rate of a genotype. A fitness landscape is a distribution of fitness values over the space of genotypes. In other words, the fitness landscape describes all possible genotypes, their degree of similarity and their fitness values. Fitness is related to height in the landscape. Genotypes that are very similar are close to each other in the landscape.
Evolution is the traversing of a fitness landscape. Peaks represent optimal fitness. Populations wander through the landscape, driven by mutation, selection and drift, in their search for peaks. Kauffman showed that the best strategy for reaching the peaks occurs at the phase transition between order and disorder, or, again, at the edge of chaos. The same model applies to other biological phenomena and even nonbiological phenomena, and may therefore represent a universal law of nature.
Adaptive evolution can be represented as a local "hill-climbing search" that converges via fitter mutants toward some local or global optimum. Adaptive evolution occurs on rugged (multipeaked) fitness landscapes. The very structure of these landscapes implies that "radiation" and "stasis" are inherent features of adaptation. The Cambrian explosion and the Permian extinction (famous paradoxes of the fossil record) may be the natural consequences of inherent properties of rugged landscapes.
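The adaptive walk described above can be sketched with a toy version of Kauffman's NK model (all parameters below are arbitrary illustrative choices): each gene's fitness contribution depends on its own allele and those of k random other genes, and a walker accepts any fitter one-bit mutant until it is trapped on a local peak. Raising k increases epistatic coupling and makes the landscape more rugged.

```python
import random

def nk_landscape(n, k, seed=0):
    """Sketch of Kauffman's NK model: gene i's fitness contribution depends
    on its own allele and the alleles of k randomly chosen other genes."""
    rng = random.Random(seed)
    neighbors = [[i] + rng.sample([j for j in range(n) if j != i], k)
                 for i in range(n)]
    tables = [{} for _ in range(n)]

    def fitness(genotype):
        total = 0.0
        for i in range(n):
            key = tuple(genotype[j] for j in neighbors[i])
            if key not in tables[i]:
                tables[i][key] = rng.random()   # each contribution drawn once
            total += tables[i][key]
        return total / n

    return fitness

def hill_climb(fitness, n, seed=0):
    """Adaptive walk: accept any fitter one-bit mutant; stop at a local peak."""
    rng = random.Random(seed)
    g = [rng.randint(0, 1) for _ in range(n)]
    improved = True
    while improved:
        improved = False
        for i in rng.sample(range(n), n):
            mutant = g[:]
            mutant[i] ^= 1
            if fitness(mutant) > fitness(g):
                g, improved = mutant, True
    return g, fitness(g)

f = nk_landscape(n=16, k=4)
genotype, fit = hill_climb(f, n=16)
print(round(fit, 3))   # fitness of one local peak on a rugged (k=4) landscape
```

Different starting genotypes generally strand the walker on different peaks, which is the multi-peaked structure that makes radiation and stasis natural outcomes.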
Kauffman also noted how complex (nonlinear dynamic) systems which interact with the external world are bound to "classify" and "know" their world through their attractors.
Kauffman's view of life can be summarized as follows: autocatalytic networks (networks that feed themselves) arise spontaneously; natural selection brings them to the edge of chaos; a genetic regulatory mechanism accounts for metabolism and growth; attractors lay the foundations for cognition.
The main theme of Kauffman's research is the ubiquitous trend towards self-organization. This trend causes the appearance of "emergent properties" in complex systems. One such property is life. The requirements for order to emerge are far easier than traditionally assumed.
There is order for free.
Far from equilibrium, systems organize themselves. The way they organize themselves is such that it creates systems at higher levels, which in turn tend to organize themselves. Atoms organize in molecules that organize in autocatalytic sets that organize in living organisms that organize in ecosystems.
The whole universe may be driven by a principle similar to autocatalysis. The universe may be nothing but a hierarchy of autocatalytic sets.
Of course, one possible objection to the whole theory of "self-organizing" systems is that no system truly "self-organizes": they all depend on external energy. Thus one could claim that it is the external energy that organizes them. Self-organizing systems, strictly speaking, do not exist. Only the universe as a whole can be said to be truly self-organizing.
The Emergence of a Science of Emergence
Theories such as Prigogine's non-equilibrium Thermodynamics, Haken's Synergetics, Von Bertalanffy's General Systems Theory and Kauffman's complex adaptive systems all point to the same scenario: the origin of life from inorganic matter is due to nonlinear processes of self-organization. The same processes account for "emergent" phenomena at different levels in the organization of the universe, and, in particular, for cognition. Cognition appears to be a general property of systems, not an exclusive of the human mind.
Buchler, Justus: METAPHYSICS OF NATURAL COMPLEXES (Columbia University Press, 1966)
Bunge Mario: TREATISE ON BASIC PHILOSOPHY (Reidel, 1974-83)
Cohen, Jack & Stewart Ian: THE COLLAPSE OF CHAOS (Viking, 1994)
Coveney, Peter: FRONTIERS OF COMPLEXITY (Fawcett, 1995)
Dalenoort G.J.: THE PARADIGM OF SELF-ORGANIZATION (Gordon & Breach, 1989)
Dalenoort G.J.: THE PARADIGM OF SELF-ORGANIZATION II (Gordon & Breach, 1994)
Davies, Paul: GOD AND THE NEW PHYSICS (Penguin, 1982)
Eigen, Manfred & Schuster Peter: THE HYPERCYCLE (Springer Verlag, 1979)
Forrest, Stephanie: EMERGENT COMPUTATION (MIT Press, 1991)
Fuller, Buckminster: SYNERGETICS (Macmillan, 1975)
Fuller, Buckminster: COSMOGRAPHY (Macmillan, 1992)
Gell-Mann, Murray: THE QUARK AND THE JAGUAR (W.H.Freeman, 1994)
Gleick, James: CHAOS (Viking, 1987)
Goldberg, David: GENETIC ALGORITHMS (Addison Wesley, 1989)
Haken, Hermann: SYNERGETICS (Springer-Verlag, 1977)
Kauffman, Stuart: THE ORIGINS OF ORDER (Oxford University Press, 1993)
Kauffman, Stuart: AT HOME IN THE UNIVERSE (Oxford Univ Press, 1995)
Koestler, Arthur: THE GHOST IN THE MACHINE (Henry Regnery, 1967)
Langton, Christopher: ARTIFICIAL LIFE (Addison-Wesley, 1989)
Laszlo, Ervin: INTRODUCTION TO SYSTEMS PHILOSOPHY (Gordon & Breach, 1972)
Lewin, Roger: COMPLEXITY (Macmillan, 1992)
Mandelbrot, Benoit: THE FRACTAL GEOMETRY OF NATURE (W.H.Freeman, 1982)
Nicolis, Gregoire & Prigogine Ilya: SELF-ORGANIZATION IN NON-EQUILIBRIUM SYSTEMS (Wiley, 1977)
Nicolis, Gregoire & Prigogine Ilya: EXPLORING COMPLEXITY (W.H.Freeman, 1989)
Nicolis, Gregoire: INTRODUCTION TO NONLINEAR SCIENCE (Cambridge University Press, 1995)
Pattee, Howard: HIERARCHY THEORY (Braziller, 1973)
Prigogine, Ilya: INTRODUCTION TO THERMODYNAMICS OF IRREVERSIBLE PROCESSES (Interscience Publishers, 1961)
Prigogine, Ilya: NON-EQUILIBRIUM STATISTICAL MECHANICS (Interscience Publishers, 1962)
Prigogine, Ilya & Stengers Isabelle: ORDER OUT OF CHAOS (Bantam, 1984)
Salthe, Stanley: EVOLVING HIERARCHICAL SYSTEMS (Columbia University Press, 1985)
Salthe, Stanley: DEVELOPMENT AND EVOLUTION (MIT Press, 1993)
Thom, Rene': MATHEMATICAL MODELS OF MORPHOGENESIS (Horwood, 1983)
Thom, Rene': STRUCTURAL STABILITY AND MORPHOGENESIS (Benjamin, 1975)
Toffoli, Tommaso & Margolus, Norman: CELLULAR AUTOMATA MACHINES (MIT Press, 1987)
Varela, Francisco: PRINCIPLES OF BIOLOGICAL AUTONOMY (North Holland, 1979)
Von Bertalanffy, Ludwig: GENERAL SYSTEMS THEORY (Braziller, 1968)
Waldrop, Mitchell: COMPLEXITY (Simon & Schuster, 1992)
Zeeman, Christopher: CATASTROPHE THEORY (Addison-Wesley, 1977)