The Nature of Consciousness

Piero Scaruffi

(Copyright © 2013 Piero Scaruffi)

These are excerpts and elaborations from my book "The Nature of Consciousness"

Information Theory

The Hungarian physicist Leo Szilard, trying to solve the paradox of "Maxwell's demon" (a thought experiment in which measurements cause a decrease of entropy), calculated the amount of entropy generated as the demon stores information in its memory ("On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings", 1929), thereby establishing a connection between information and entropy: information was shown to increase when entropy decreases, and vice versa. The tendency of systems to drift from the low-probability state of organization and individuality to the high-probability state of chaos and sameness could be interpreted as a decline in information.
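
As a minimal sketch (in Python) of the figure at the heart of Szilard's calculation: recording a single binary measurement generates an entropy of at least k·ln 2, where k is Boltzmann's constant.

    import math

    k_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

    # Minimum entropy generated when the demon stores one bit of measurement
    # in its memory: delta_S >= k * ln 2
    delta_S_per_bit = k_B * math.log(2)
    print(delta_S_per_bit)  # ~9.57e-24 J/K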

Wiener conceived of information as the opposite of entropy. To him the amount of information in a system was a measure of its degree of organization; hence the entropy of a system was a measure of its degree of disorganization. The higher the entropy, the lower the information (technically, information is a negative logarithm whereas entropy is a positive logarithm). A process that loses information is a process that gains entropy. Information is a reduction in uncertainty, i.e., in entropy: the quantity of information produced by a process equals the amount by which entropy has been reduced.
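
As a minimal numerical sketch of this idea (in Python, with an invented scenario of eight equally likely answers narrowed down to two): if a question has N equally likely answers, its entropy is log2(N) bits, and an observation yields exactly as much information as the entropy it removes.

    import math

    def uncertainty_bits(n_equally_likely_answers):
        # Entropy, in bits, of a question with n equally likely answers.
        return math.log2(n_equally_likely_answers)

    before = uncertainty_bits(8)  # 3.0 bits of uncertainty
    after = uncertainty_bits(2)   # 1.0 bit left once an observation rules out six answers
    information_gained = before - after
    print(information_gained)     # 2.0 bits: the amount of entropy that was removed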

An unlikely, unusual message is a state of low entropy, because there are relatively few ways to compose that message. Its information, however, is very high, precisely because it is so unusual.
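
As a small illustration (in Python, with made-up probabilities): the information content, or surprisal, of a single message can be taken as -log2 of its probability, so an unlikely message scores high and a routine one scores low.

    import math

    def surprisal_bits(probability):
        # Information content, in bits, of a message with the given probability.
        return -math.log2(probability)

    print(surprisal_bits(0.5))    # 1.0 bit: a routine, expected message
    print(surprisal_bits(0.001))  # ~9.97 bits: an unusual, unlikely message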

The second law of Thermodynamics, one of the fundamental laws of the universe, responsible among other things for our dying, states that an isolated system always tends to maximize its entropy (i.e., things decay). Since entropy is a measure of the random distribution of atoms, maximizing it entails that the distribution become as homogeneous as possible. The more homogeneous a distribution of probabilities is, the less informative it is. Therefore entropy, a measure of disorder, is also a measure of the lack of information.
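
As a minimal check of this claim (in Python, with two arbitrary distributions over four outcomes): the perfectly homogeneous distribution reaches the maximum entropy of 2 bits, while a sharply peaked one stays close to zero.

    import math

    def entropy_bits(probabilities):
        # Entropy, in bits, of a discrete probability distribution.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    homogeneous = [0.25, 0.25, 0.25, 0.25]  # maximally spread out
    peaked = [0.97, 0.01, 0.01, 0.01]       # almost all the weight on one outcome

    print(entropy_bits(homogeneous))  # 2.0 bits: maximum entropy, least informative
    print(entropy_bits(peaked))       # ~0.24 bits: far from homogeneous, highly ordered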

The French physicist Leon Brillouin coined the term "negentropy" for Wiener's negative entropy and formulated the "negentropy principle of information" ("Negentropy Principle of Information", 1953): since the total change in entropy must be greater than or equal to zero, new information in a system can only be obtained at the expense of the negentropy of some other system.
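
In its standard quantitative reading (a sketch, not Brillouin's own notation), the principle puts a thermodynamic price on information: for a system to gain I bits of information, the entropy of some other system must increase by at least I·k·ln 2, the same per-bit figure that appears in Szilard's calculation above.

    import math

    k_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

    def min_entropy_cost(bits_gained):
        # Minimum entropy increase, in J/K, that some other system must undergo
        # for this much information (negentropy) to be acquired.
        return bits_gained * k_B * math.log(2)

    print(min_entropy_cost(1))    # ~9.6e-24 J/K for a single bit
    print(min_entropy_cost(8e9))  # ~7.7e-14 J/K for a gigabyte of information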

The US electrical engineer Claude Shannon and the US mathematician Warren Weaver, instead, defined entropy as a measure of the statistical state of knowledge about a question: the entropy of a question depends on the probabilities assigned to all of its possible answers.

Shannon's entropy measures the uncertainty in a statistical ensemble of messages, and that entropy "is" information: the amount of information is equal to the entropy. This is exactly the opposite of Wiener's definition of information.
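
As a minimal sketch of Shannon's measure (in Python, with an invented four-message ensemble): the entropy of an ensemble of possible messages is the average surprisal, H = -Σ p·log2(p), and for Shannon that number is the amount of information the ensemble carries.

    import math

    # A hypothetical ensemble of messages and how often each one is sent.
    ensemble = {"yes": 0.5, "no": 0.25, "maybe": 0.125, "later": 0.125}

    # Shannon entropy: the probability-weighted average of -log2(p).
    H = sum(p * -math.log2(p) for p in ensemble.values())
    print(H)  # 1.75 bits of uncertainty, i.e. 1.75 bits of information per message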

Shannon defines information as chaos (entropy); Wiener and Brillouin define information as order (negentropy).

In summary, a theory of information turns out to be related to a theory of entropy. If information is ultimately a measure of order, then entropy is ultimately a measure of disorder and, indirectly, a measure of the lack of information; if information is ultimately a measure of chaos, then it basically is entropy. Either way, there is a direct connection between the two.

In this view the role of (positive or negative) entropy was to contribute to the self-organization and increased complexity of a system, an interpretation that constituted a conceptual revolution.

The relationship between information and life was also grasped by the Russian mathematician Aleksei Lyapunov, a pioneer of Soviet cybernetics, who realized that life is about information, and that the preservation of life is about processing information. His definition of life reads: "a highly stable state of matter, utilizing information encoded by the states of the individual molecules for the purpose of developing reactions aimed at self-preservation".

Whether unifying machine processes and natural processes, or unifying the quantities of Information Theory and the quantities of Thermodynamics, the underlying theme was that of finding commonalities between artificial systems and natural systems. This theme picked up speed with the invention of the computer.

