The Nature of Consciousness

Piero Scaruffi

(Copyright © 2013 Piero Scaruffi)

These are excerpts and elaborations from my book "The Nature of Consciousness"

Chaos and Complexity

Both the biological and the physical sciences need a mathematical model of the phenomena of emergence (the spontaneous creation of order), and in particular of adaptation, as well as a physical justification of their dynamics (which seem to violate physical laws such as the second law of Thermodynamics).

The French physicist Sadi Carnot, one of the founding fathers of Thermodynamics, realized that the statistical behavior of a complex system could be predicted if its parts were all identical and their interactions weak. It was not feasible for Physics to predict the behavior of each individual part, but it was relatively easy to predict the statistical behavior of the parts as a whole.

At the turn of the 20th century, another French physicist, Henri Poincaré, realized that the behavior of a complex system can become unpredictable if it consists of a few parts that interact strongly, and thereby laid the foundations of "chaos" theory. A system is said to exhibit the property of chaos if a slight change in the initial conditions results in large-scale differences in the outcome. This “chaos” opens up a wealth of possibilities for the future of the system. (Note that “chaos” does not mean “disorder”: a disordered state cannot be described, whereas a chaotic state can be described by deterministic laws).
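
As a concrete illustration (a minimal sketch of my own, not from the book), the logistic map x -> r*x*(1-x) with r = 4 is a standard toy model of chaos: two trajectories whose starting points differ by only a billionth soon disagree completely.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r*x*(1-x) starting from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two almost identical initial conditions (they differ by 1e-9).
a = logistic_trajectory(0.200000000)
b = logistic_trajectory(0.200000001)

for step in (0, 10, 20, 30, 40, 50):
    print("step %2d: %.6f vs %.6f" % (step, a[step], b[step]))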

The most interesting feature of chaotic systems is that, under some circumstances, they spontaneously "crystallize" into a higher degree of order, i.e. properties begin to emerge. A system must be “complex” enough for any property to “emerge” out of it.

Complexity can be formally defined as nonlinearity. Nonlinearity is best visualized as a game in which playing the game changes the rules. In a linear system an action has an effect that is proportionate to the action. In a nonlinear system an action has an effect that feeds back and changes the action itself. As a matter of fact, the world is mostly nonlinear. The “linear” world studied by classical Physics (Newton’s equations are linear) is really just an exception to the rule.
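
A minimal sketch of the distinction (my own illustration, with made-up functions): a linear map obeys superposition, so effects add up in proportion to their causes, while a nonlinear map does not, because the state feeds back into its own rule.

def linear(x):
    return 3.0 * x               # the effect is strictly proportional to the action

def nonlinear(x):
    return 3.0 * x * (1.0 - x)   # the state changes the rule that acts on it

for f in (linear, nonlinear):
    lhs = f(0.1 + 0.2)
    rhs = f(0.1) + f(0.2)
    print("%-9s f(x+y)=%.3f  f(x)+f(y)=%.3f  superposition holds: %s"
          % (f.__name__, lhs, rhs, abs(lhs - rhs) < 1e-12))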

The science of nonlinear dynamics is also known as "chaos theory", from the term introduced by the US mathematician James Yorke ("Period Three Implies Chaos", 1975, co-authored with Tien-Yien Li), because unpredictable solutions emerge from nonlinear equations; in other words, the solutions appear to behave in a random fashion.

The US meteorologist and mathematician Edward Lorenz (“Deterministic Nonperiodic Flow”, 1963) was the first to focus on deterministic equations that generate unpredictable behavior. These equations were unpredictable because a slight change in the initial conditions had dramatic consequences. In linear equations the effects are proportional to the causes. Linear equations are modular in the sense that one can reduce them to simpler equations and then reassemble the solution. Unfortunately, linearity is an idealization that usually holds only when a system is close to equilibrium. Real-world systems are rarely close to equilibrium. Anything that is alive, in particular, and anything that is evolving is in a state of non-equilibrium. Nonlinear equations cannot be reduced to simpler equations: the whole is literally more than its parts. This “synergistic” aspect is a sort of internal loop, and is responsible for the property that a small change in the initial conditions can grow exponentially fast, the so-called “butterfly effect”, named after a lecture by Lorenz ("Does the Flap of a Butterfly's Wings in Brazil Set Off a Tornado in Texas?", 1972).
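
A minimal sketch of the butterfly effect (my own illustration, using Lorenz's 1963 equations with his classic parameters and a crude fixed-step integration): two runs whose initial conditions differ by one part in a hundred million stay together for a while and then end up in completely different states.

def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One crude Euler step of Lorenz's 1963 system."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

def run(x0, steps=30000):
    state = (x0, 1.0, 1.0)
    history = [state]
    for _ in range(steps):
        state = lorenz_step(state)
        history.append(state)
    return history

a = run(1.0)
b = run(1.0 + 1e-8)   # a one-part-in-a-hundred-million "flap of a wing"

for t in (0, 5000, 10000, 20000, 30000):
    separation = sum((p - q) ** 2 for p, q in zip(a[t], b[t])) ** 0.5
    print("t = %5.1f   separation = %.2e" % (t * 0.001, separation))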

Nonlinearity tends to show up when one tries to explain how global behavior emerges from local behavior. For example, it's easy to master the rules of dynamics and electromagnetism when they are applied to isolated systems, but difficult to predict the weather of a region. It's easy to master the arithmetic and statistical formulas of economic principles, but not to predict what will happen to a nation's economy. The reason is that these complex systems have components that influence each other.

Chaos is not intractable, though. As Stephen Smale showed ("Differentiable Dynamical Systems", 1967), irregular, chaotic behavior can persist under perturbations: a chaotic system can be structurally "stable".

A useful abstraction to describe the evolution of a system in time is that of a "phase space". Our ordinary space has only three dimensions (width, height, depth) but in theory we can think of spaces with any number of dimensions. The phase space of a single particle has six dimensions, three of which are the usual spatial dimensions while the other three are the components of its velocity along those spatial dimensions. In ordinary 3-dimensional space, a "point" can only represent the position of the particle. In 6-dimensional phase space, a point represents both its position and its motion. The evolution of the system is represented by some sort of shape (a trajectory) in phase space.
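
A minimal sketch (a hypothetical example of my own): the state of a single particle packed into one 6-dimensional phase-space point, three coordinates for position and three for velocity; stepping the dynamics forward moves that one point along a trajectory.

def step(point, dt=0.1, g=-9.8):
    """Advance a free-falling particle by one time step (gravity along z)."""
    x, y, z, vx, vy, vz = point
    return (x + vx * dt, y + vy * dt, z + vz * dt,
            vx, vy, vz + g * dt)

# One point of phase space = position (x, y, z) + velocity (vx, vy, vz).
point = (0.0, 0.0, 0.0, 1.0, 0.0, 5.0)
for _ in range(5):
    print(tuple(round(c, 2) for c in point))
    point = step(point)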

An “attractor” is a region of phase space where the system is doomed to end up eventually. For a linear system the attractor basically describes its typical behavior. A point attractor is the state in which the system ends its motion. A periodic attractor is the state that the system returns to periodically. Chaotic systems do not have point or periodic attractors, but they may exhibit attractors that are shapes with fractional dimension. If the region that these “strange attractors” occupy is finite, then, by definition, the behavior of the chaotic system is not truly random. The first “strange attractor” was “discovered” by the US meteorologist Edward Lorenz (“Deterministic Nonperiodic Flow”, 1963): it arises from a nonlinear system that evolves over time in a non-repeating pattern. In phase space, its evolution looks like an infinite series of ever-changing patterns that seem to be attracted to a point but never repeat the same trajectory around it.
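
To contrast the simplest attractor with a strange one, here is a minimal sketch of my own: a damped oscillator has a point attractor, so whatever the starting state, its phase-space point (position, velocity) spirals into (0, 0), whereas a strange attractor like Lorenz's never settles into a point or an exactly repeating cycle.

def damped_step(x, v, dt=0.01, k=1.0, friction=0.5):
    """One Euler step of a damped harmonic oscillator."""
    a = -k * x - friction * v   # linear restoring force plus friction
    return x + v * dt, v + a * dt

x, v = 2.0, 0.0
for n in range(4001):
    if n % 1000 == 0:
        print("t = %5.1f   x = %+.4f   v = %+.4f" % (n * 0.01, x, v))
    x, v = damped_step(x, v)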

The US physicist Robert Shaw ("Strange Attractors, Chaotic Behavior, and Information Flow", 1981) realized that strange attractors increase entropy, i.e. "create" information. They create information about the behavior of a chaotic system where no information about it was available before.
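
A minimal sketch of the idea (my own illustration, not Shaw's actual analysis): coarse-grain a trajectory of the logistic map into bits (is x below or above 0.5?) and estimate the Shannon entropy of 4-bit blocks. A periodic orbit keeps producing the same few blocks, so it carries almost no new information; a chaotic orbit keeps producing fresh, unpredictable bits.

from collections import Counter
from math import log2

def bits(r, x=0.3, n=4000, skip=1000):
    """Symbolic (0/1) trajectory of the logistic map, after a transient."""
    symbols = []
    for i in range(n + skip):
        x = r * x * (1.0 - x)
        if i >= skip:
            symbols.append('1' if x >= 0.5 else '0')
    return ''.join(symbols)

def block_entropy(s, k=4):
    """Shannon entropy (in bits) of the distribution of k-bit blocks."""
    blocks = Counter(s[i:i + k] for i in range(len(s) - k))
    total = sum(blocks.values())
    return -sum(c / total * log2(c / total) for c in blocks.values())

print("periodic orbit (r=3.2):", round(block_entropy(bits(3.2)), 3), "bits per 4-bit block")
print("chaotic orbit  (r=4.0):", round(block_entropy(bits(4.0)), 3), "bits per 4-bit block")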

Nonlinear processes are ubiquitous. They are processes of emergent order and complexity, of how structure arises from the interaction of many independent units. These processes recur at every level, from morphology to behavior. At every level of science (including the brain and life) the spontaneous emergence of order, or self-organization of complex systems, is a common theme.

One feature of chaos, discovered by the Australian mathematician Robert May ("Simple Mathematical Models with Very Complicated Dynamics", 1976), is “self-similarity”, the fact that “chaotic” behavior sometimes embeds a replica of itself at a smaller scale. By studying self-similarity, the Polish-born mathematician Benoit Mandelbrot came up with the idea of fractal geometry (“Fractal Objects”, 1975). Euclidean geometry fails to capture the essence of ordinary natural shapes, which are far more complex than straight lines or perfect circles. Mandelbrot introduced fractional dimensions to express the degree of “irregularity” of a shape (for example, of a coastline).
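
A minimal sketch (the standard textbook formula for exactly self-similar shapes, not Mandelbrot's general definition): a shape made of N copies of itself, each scaled down by a factor s, has fractal (similarity) dimension log(N)/log(s), which need not be a whole number.

from math import log

def similarity_dimension(copies, scale_factor):
    """Dimension of a shape made of `copies` self-similar pieces, each 1/scale_factor the size."""
    return log(copies) / log(scale_factor)

print("line segment  :", round(similarity_dimension(3, 3), 3))   # 1.0: not a fractal
print("filled square :", round(similarity_dimension(9, 3), 3))   # 2.0: not a fractal
print("Koch curve    :", round(similarity_dimension(4, 3), 3))   # ~1.262: a crinkly, coastline-like curve
print("Cantor set    :", round(similarity_dimension(2, 3), 3))   # ~0.631: a dust of points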

The US physicist Mitchell Feigenbaum discovered a new universal constant that applies to chaotic systems by analyzing how a nonlinear system becomes chaotic ("Quantitative Universality for a Class of Nonlinear Transformations", 1978). At some point the behavior of the system splits into two (a “bifurcation”), then into four, and so forth, at an ever faster rate. It is this accelerating cascade of bifurcations that marks the onset of “chaos”. Feigenbaum discovered that the ratio between successive bifurcation intervals converges to about 4.669, whatever the system. There is order even in chaos.
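
A minimal sketch (the bifurcation points below are the standard approximate values for the logistic map, taken as given rather than computed here): the gaps between successive period-doublings shrink by a factor that converges to Feigenbaum's constant, about 4.669.

# r values at which the logistic map's orbit doubles its period to 2, 4, 8, 16, 32, 64, 128
# (standard approximate values from the literature).
r = [3.0, 3.4494897, 3.5440903, 3.5644073, 3.5687594, 3.5696916, 3.5698913]

gaps = [r[i + 1] - r[i] for i in range(len(r) - 1)]
for i in range(len(gaps) - 1):
    print("ratio %d: %.4f" % (i + 1, gaps[i] / gaps[i + 1]))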

Darwin's vision of natural selection as a creator of order is probably not sufficient to explain all the spontaneous order exhibited by both living and inanimate matter. There might be other physical principles at work.

Koestler and Salthe showed how complexity entails hierarchical organization. Von Bertalanffy's general systems theory, Haken's synergetics, and Prigogine's non-equilibrium Thermodynamics belong to the class of theories that extend Physics to dynamic systems.

These theories have in common the fact that they deal with self-organization (how collections of parts can produce structures) and try to provide a unified view of the universe at different levels of organization (from living organisms to physical systems to societies).

The drawback in any study of complex systems is that there is no commonly accepted definition of complexity, nor a commonly accepted method of measuring it. Kolmogorov’s complexity is not a biologist’s complexity. A biologist does not know how to compare the complexity of a tomato with the complexity of the stock market.
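
A minimal sketch of that mismatch (my own illustration): Kolmogorov complexity is often approximated by compressed size, and by that measure pure randomness is maximally "complex", which is precisely what a biologist does not mean by the complexity of an organism.

import random
import zlib

random.seed(0)
ordered = b"ab" * 5000                                       # highly ordered: trivial to describe
noise = bytes(random.randrange(256) for _ in range(10000))   # pure randomness: incompressible

for name, data in (("ordered", ordered), ("random", noise)):
    print("%-8s %d bytes -> %d bytes compressed" % (name, len(data), len(zlib.compress(data))))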

 

