Piero Scaruffi (Copyright © 2013 Piero Scaruffi)
These are excerpts and elaborations from my book "The Nature of Consciousness"
Entropy is a measure of disorder, and information resides in disorder: the more microstates a system can occupy, the more information is needed to specify which one it is actually in; hence the greater the disorder, the greater the information. Ultimately, entropy is also a measure of information.
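The link between the number of microstates and information can be made concrete with a short sketch. Assuming all microstates are equally likely, the information needed to single out one of them is log2 of their count (Boltzmann's S = k ln W, expressed in bits); the function name below is my own illustration, not from the text:

```python
import math

def bits_to_specify(num_microstates: int) -> float:
    """Information, in bits, needed to single out one microstate,
    assuming all microstates are equally probable (log2 of W)."""
    return math.log2(num_microstates)

# An ordered system with few accessible microstates needs little
# information to describe; a disordered one needs much more.
print(bits_to_specify(2))     # 1.0 bit
print(bits_to_specify(1024))  # 10.0 bits
```

Doubling the number of microstates always adds exactly one bit, which is why entropy and information grow together.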
Later, several scientists interpreted entropy as a measure of ignorance about the microscopic state of a system; for example, as the amount of information needed to specify that state. Murray Gell-Mann summarized these arguments when he explained the universe's drift toward disorder: nature "prefers" disorder over order because there are many more disordered states than ordered ones, so a system is far more likely to end up in a disordered state. The probability of disorder dwarfs the probability of spontaneous order, which is why disorder is what we usually observe.
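Gell-Mann's counting argument can be illustrated with coin tosses (my example, not from the text): a "macrostate" is the total number of heads, a "microstate" is a specific head/tail sequence, and the disordered macrostate corresponds to vastly more microstates than the ordered one:

```python
from math import comb

N = 100  # number of coin tosses

# Microstates per macrostate: "all heads" is maximally ordered,
# "50 heads out of 100" is maximally disordered.
ordered = comb(N, 0)      # exactly 1 microstate
disordered = comb(N, 50)  # on the order of 10**29 microstates

# With all 2**N sequences equally likely, the disordered macrostate
# is overwhelmingly more probable than the ordered one.
print(ordered, disordered)
print(disordered / 2**N)  # probability of the 50-heads macrostate
```

The system "ends up disordered" simply because almost all of its equally likely microstates belong to disordered macrostates.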
Equilibrium states are also states of minimum information: a few parameters suffice to identify the state (e.g., a single temperature value for a gas at uniform temperature). Information is negative entropy ("negentropy"), and this equivalence would play a key role in applying entropy beyond physics.