The Nature of Consciousness

Piero Scaruffi


These are excerpts and elaborations from my book "The Nature of Consciousness"

Algorithmic Information Theory

A fundamental step in bridging the analog and the digital world was taken in 1933, when the Russian engineer Vladimir Kotelnikov proved what is now known as the "Nyquist-Shannon" sampling theorem: how to convert a continuous signal (e.g. a sound wave) into a discrete sequence of numbers (e.g. a sequence of zeroes and ones) in such a way that the original signal can be reconstructed without losing fidelity. That was the fundamental mathematical artifice that made the digital revolution possible. No information is lost if the signal is sampled at the proper frequency.
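In modern notation (not used in the book), the theorem can be stated as follows: if a signal x(t) contains no frequency components above B hertz, it is completely determined by samples taken at intervals T no longer than 1/(2B) seconds, and it can be reconstructed exactly from those samples:

    x(t) = \sum_{n=-\infty}^{\infty} x(nT)\,\mathrm{sinc}\!\left(\frac{t-nT}{T}\right), \qquad \mathrm{sinc}(u) = \frac{\sin(\pi u)}{\pi u}

Sampling below that rate folds high frequencies onto low ones ("aliasing"), and the original signal can no longer be recovered.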

Another Russian mathematician, Andrei Kolmogorov, formulated "Algorithmic Information Theory" ("On Tables of Random Numbers", 1963), the scientific study of the concept of complexity (Ray Solomonoff had anticipated the idea in 1960 in "A Preliminary Report on a General Theory of Inductive Inference"). Complexity is basically defined as a quantity of information, which means that Algorithmic Information Theory is the discipline that deals with the quantity of information in systems.

The complexity of a system is defined as the length of the shortest possible description of it; or, equivalently, the least number of bits of information necessary to describe the system. It turns out that this means: the size of the shortest program (algorithm) that can compute or simulate it. For example, the infinitely many digits of "pi" can be summarized by the short description "the ratio between a circumference and its diameter", so "pi" has low complexity. The emphasis is therefore placed on sequences of symbols that cannot be summarized in any shorter way. Algorithmic Information Theory looks for the shortest possible message that encodes everything there is to know about a system. Objects that contain regularities have a description that is shorter than themselves.

In the terminology of computer science, the Kolmogorov complexity of a string is the length in bits of the shortest program that prints that string (and then halts).
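Formally (in notation that the book does not use), fixing a universal computer U, the Kolmogorov complexity of a string x is

    K_U(x) = \min \{\, |p| \;:\; U(p) = x \,\}

i.e. the length |p| of the shortest program p that makes U output x. The invariance theorem guarantees that switching to a different universal machine changes this quantity only by an additive constant, so the measure is essentially machine-independent.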

Kolmogorov complexity is not computable: there is no program that can compute the Kolmogorov complexity of an arbitrary string. Roughly speaking, if such a program existed, one could use it to write a short program that finds and prints the first string whose complexity is far greater than that program's own length, which is a contradiction (an argument in the spirit of Berry's paradox).

Algorithmic Information Theory represents an alternative to Probability Theory when it comes to studying randomness. Probability Theory cannot define randomness: it says nothing about the meaning of a probability, since a probability is simply a measure of frequency. In Kolmogorov's framework, on the other hand, randomness is easy to define: a random sequence is one that cannot be compressed any further, i.e. one whose shortest description is about as long as the sequence itself.
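A crude way to see incompressibility in action (a sketch of mine, not from the book) is to use an ordinary general-purpose compressor: the compressed size of a string is a computable upper bound on its Kolmogorov complexity, even though the true complexity itself remains uncomputable. A highly regular string shrinks dramatically, while a (pseudo)random string of the same length barely shrinks at all:

    import os
    import zlib

    def compressed_size(data: bytes) -> int:
        # Length in bytes of the zlib-compressed data: a crude, computable
        # upper bound on the data's Kolmogorov complexity.
        return len(zlib.compress(data, 9))

    # A highly regular string: "repeat '01' fifty thousand times" is a
    # complete, very short description, and the compressor exploits it.
    regular = b"01" * 50000

    # A (pseudo)random string of the same length: no regularity to exploit,
    # so compression gains almost nothing.
    random_looking = os.urandom(100000)

    print(compressed_size(regular))         # a few hundred bytes
    print(compressed_size(random_looking))  # close to 100,000 bytes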

The Argentine-American mathematician Gregory Chaitin proved that randomness is pervasive. His "Diophantine" equation contains about 17,000 variables and a parameter that can take the value of any integer. By studying it, Chaitin achieved a result as shocking as Goedel's theorem: there is no way to tell whether, for a specific value of the parameter, the equation has a finite or an infinite number of solutions. That means that the answers to some mathematical questions are totally random. Chaitin defined randomness in terms of computability: a number is not random if it is computable, i.e. if there is a program that generates it. Conversely, one can measure the degree of randomness of something by the length of the shortest program (algorithm) that generates it.

Incidentally, every finite physical system has a finite complexity because of the Bekenstein bound. In Quantum Theory the Bekenstein bound (named after the Israeli physicist Jacob Bekenstein) is a direct consequence of Heisenberg’s uncertainty principle: there are upper limits on the number of distinct quantum states available to a system of given size and energy, and on the rate at which changes of state can occur. In other words, the principle of uncertainty indirectly sets an upper limit on the information density of a system, and that upper limit is expressed by the Bekenstein bound.
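The bound is usually written (in symbols that the text does not spell out) as

    I \le \frac{2\pi R E}{\hbar c \ln 2}

bits, where R is the radius of a sphere enclosing the system, E is its total energy (including rest-mass energy), ħ is the reduced Planck constant and c is the speed of light. A system of finite size and finite energy can therefore hold only a finite number of bits, and hence has a finite complexity.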

Physicists seem to be fascinated with the idea of quantifying the complexity of the brain and even the complexity of a whole human being. The US physicist Frank Tipler estimated the storage capacity of the human brain at about 10 to the 15th power bits, and the maximum amount of information that can be stored in a human being at 10 to the 45th power bits (a number with 45 zeros). Freeman Dyson estimated the entropy of a human being at about 10 to the 23rd power bits.
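The larger figure is of the same order as the Bekenstein bound applied to a human-sized system. As a rough sanity check, with illustrative values of my own choosing (a mass of about 70 kg, so E = mc^2 is roughly 6 x 10^18 joules, enclosed in a sphere of radius R of about 1 metre):

    I \le \frac{2\pi R E}{\hbar c \ln 2} \approx \frac{2\pi \times 1\,\mathrm{m} \times 6\times 10^{18}\,\mathrm{J}}{1.05\times 10^{-34}\,\mathrm{J\,s} \times 3\times 10^{8}\,\mathrm{m/s} \times 0.69} \approx 10^{45}\ \mathrm{bits}

which is indeed of the order of Tipler's estimate.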

 

