The Nature of Consciousness

Piero Scaruffi

(Copyright © 2013 Piero Scaruffi)

These are excerpts and elaborations from my book "The Nature of Consciousness"

Emergent Computation

Emergent computation is to sequential computation what nonlinear systems are to linear systems: it deals with systems whose parts interact in nontrivial ways. Alan Turing and John Von Neumann, the two mathematicians who inspired the creation of the computer, were both precursors of emergent computation: Turing formulated a theory of self-catalytic systems and Von Neumann studied self-replicating automata.

In the 1950s Turing introduced the “Reaction-diffusion Theory” of pattern formation, based on the bifurcation properties of the solutions of differential equations. 

Turing devised a model to generate stable patterns (a numerical sketch follows the list):

· X catalyzes itself; X diffuses slowly

· X catalyzes Y; Y diffuses quickly

· Y inhibits X

· Y may or may not catalyze or inhibit itself
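This scheme can be simulated directly. Below is a minimal one-dimensional sketch in Python (with numpy), using Gierer-Meinhardt-style kinetics as one standard instance of Turing's scheme; the specific equations and parameter values are illustrative assumptions, not Turing's original system.

import numpy as np

# X is the self-catalyzing, slowly diffusing activator; Y is the
# fast-diffusing inhibitor that X catalyzes and that inhibits X in return.
N, steps, dt = 200, 20000, 0.01
D_X, D_Y = 1.0, 20.0   # X diffuses slowly, Y diffuses quickly
tau = 0.5              # the inhibitor equilibrates faster than the activator

rng = np.random.default_rng(0)
X = 1.0 + 0.01 * rng.standard_normal(N)   # activator, perturbed off its steady state
Y = np.ones(N)                            # inhibitor

def laplacian(f):
    # discrete Laplacian on a ring of cells (periodic boundaries)
    return np.roll(f, 1) - 2.0 * f + np.roll(f, -1)

for _ in range(steps):
    dX = X * X / Y - X + D_X * laplacian(X)      # X catalyzes itself; Y inhibits X
    dY = (X * X - Y) / tau + D_Y * laplacian(Y)  # X catalyzes Y; Y decays
    X += dt * dX
    Y += dt * dY
    np.clip(Y, 1e-6, None, out=Y)                # guard against division by zero

peaks = np.flatnonzero((X > np.roll(X, 1)) & (X > np.roll(X, -1)))
print(f"{peaks.size} activator peaks at cells {peaks.tolist()}")

Starting from near-uniform noise, the slowly diffusing activator and the quickly diffusing inhibitor settle into a stationary array of peaks: a stable spatial pattern emerging from an almost disordered initial state.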

Some reactions might be able to create ordered spatial patterns out of disordered initial states. The function of genes is purely catalytic: they catalyze the production of new morphogens, which in turn catalyze further morphogens, until eventually form emerges.

Von Neumann saw life as a particular class of automata, i.e. of programmable machines, whose main property is the ability to reproduce. In the 1940s Von Neumann had already proven that a machine could be programmed to make a copy of itself.

Von Neumann's automaton was conceived to absorb matter from the environment and process it to build another automaton, including a description of itself. Von Neumann realized (years before the genetic code was discovered) that the machine needed a description of itself in order to reproduce. The description itself would be copied into the new machine, so that the new machine too could copy itself.
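A toy software rendering of this loop (an illustrative Python sketch, not Von Neumann's actual construction): an automaton is modeled as a pair of behavior plus description, and reproduction both rebuilds the behavior from the description and copies the description into the offspring.

def build(description):
    # interpret the description as instructions for building the machine
    env = {}
    exec(description, env)
    return env["behave"]

def reproduce(automaton):
    behavior, description = automaton
    # build a new machine from the description AND copy the description into it
    return (build(description), description)

genome = "behave = lambda: 'absorbing matter'"
parent = (build(genome), genome)
child = reproduce(parent)
grandchild = reproduce(child)   # the copy can itself make copies
print(grandchild[0]())          # -> absorbing matter

Because the description travels with the machine unmodified, every generation has what it needs to produce the next one.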

In Von Neumann's simulated world, a large checkerboard served as a simplified version of the real world, in which both space and time were discrete. Time, in particular, advanced in discrete steps, which meant that change could occur only at those steps, and simultaneously for everything that had to change.

Von Neumann's studies of the 1940s led to an entirely new field of mathematics, called "Cellular Automata". Technically speaking, cellular automata are discrete dynamical systems whose behavior is completely specified in terms of a local relation. In practice, cellular automata are the computer scientist's equivalent of the physicist's concept of field. Space is represented by a uniform grid and time advances in discrete steps. Each cell of space contains bits of information. Laws of nature express what operation must be performed on each cell's bits of information, based on its neighbors' bits of information. These laws are local and uniform. The amazing thing is that such simple "organisms" can give rise to very complex structures, and some of those structures recur periodically, which means that they achieve a kind of stability.
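A minimal concrete example is Conway's "Game of Life" (a famous later cellular automaton, not Von Neumann's original 29-state one); the Python sketch below assumes numpy. The law is local and uniform, i.e. every cell obeys the same rule about its eight neighbors, and even a trivial initial configuration can settle into a periodically recurring structure.

import numpy as np

def step(grid):
    # count live neighbors by summing the eight shifted copies of the grid
    nbrs = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)
               if (dy, dx) != (0, 0))
    # the local, uniform law: birth on 3 live neighbors, survival on 2 or 3
    return ((nbrs == 3) | ((grid == 1) & (nbrs == 2))).astype(np.uint8)

grid = np.zeros((8, 8), dtype=np.uint8)
grid[3, 2:5] = 1   # a "blinker": a structure that recurs with period 2

for t in range(4):
    print(f"t = {t}\n{grid}\n")
    grid = step(grid)

The three-cell row flips between horizontal and vertical forever, a structure that recurs periodically and thus achieves the kind of stability described above.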

Von Neumann understood the dual genetics of self-reproducing automata: the genetic code must act both as instructions on how to build an organism and as data to be passed on to the offspring. This was basically the idea behind what would later be called DNA: DNA encodes the instructions for making all the enzymes and proteins that a cell needs to function, and DNA makes a copy of itself every time the cell divides in two. Von Neumann indirectly understood other properties of life as well: the ability to increase complexity (an organism can generate organisms that are more complex than itself) and the ability to self-organize.
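This dual use of a description, executed as instructions and copied verbatim as data, is precisely the trick behind a software quine, a program that prints its own source code. A minimal Python example (the comment line aside, the output reproduces the program exactly):

# the string s is both executed as code (below) and copied as data (printed)
s = 's = %r\nprint(s %% s)'
print(s % s)

The string s plays the role of the genome: it is interpreted to produce the program's behavior, and it is transcribed verbatim into the offspring.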

When a machine (e.g., an assembly line) builds another machine (e.g., an appliance), complexity degrades: the product is simpler than the machine that built it. The offspring of living organisms, by contrast, are at least as complex as their parents, and their complexity increases over evolutionary time. A self-reproducing machine would be a machine that produces another machine of equal or higher complexity.

By representing an organism as a group of contiguous cells in a two-dimensional matrix, each cell either empty or containing a component, Von Neumann proved that a Turing-type machine capable of reproducing itself could be simulated using cells with 29 possible states.

Turing proved that there exists a “universal computing machine”, one that can simulate any other computing machine given its description. Von Neumann proved the constructive analogue: there exists a universal constructing machine which, given the description of an automaton, will construct a copy of it; by extension, given the description of a universal constructor, it will construct a copy of that; and, by extension, given a description of itself, it will construct a copy of itself.
