The Nature of Consciousness

Piero Scaruffi

(Copyright © 2013 Piero Scaruffi)

These are excerpts and elaborations from my book "The Nature of Consciousness"

Uncertainty 

Another aspect of common-sense reasoning that cannot be removed from our behavior without endangering our species is the ability (and even the preference) to deal with uncertainty. Pick any sentence that you utter at work, with friends or at home, and it is likely that you will find some kind of “uncertainty” in the quantities you are dealing with. Sometimes the uncertainty is explicit, as in “maybe I will go shopping”, “I almost won the game” or “I think that Italy will win the next World Cup”. Sometimes it is hidden in the nature of things, as in “it is raining” (does a light shower count as “rain”?), “this cherry is red” (how “red”?), or “I am a tall person” (how tall is a “tall” person?).

The classic tool for representing uncertainty is Probability Theory, to which Thomas Bayes contributed his famous theorem in the 18th century. Probabilities translate uncertainty into the lingo of statistics. One can translate “I think that Italy will win the next World Cup” into a probability by examining how often Italy has won the World Cup, or how many competitions its teams have won over the last four years. One can then express a personal feeling in probabilities, as all bookmakers do. Bayes’ theorem and other formulas allow one to draw conclusions from a number of probable events.
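
To make the mechanics concrete, here is a minimal Python sketch (with invented numbers, not taken from the book) of how Bayes’ theorem turns a prior belief and a piece of evidence into an updated belief:

    # Bayes' theorem:
    #   P(hypothesis | evidence) = P(evidence | hypothesis) * P(hypothesis) / P(evidence)
    def bayes(prior, likelihood, evidence):
        """Posterior probability of a hypothesis given observed evidence."""
        return likelihood * prior / evidence

    # Hypothetical bookmaker-style belief: "Italy will win the next World Cup".
    prior = 0.10        # initial belief, e.g. derived from past frequencies
    likelihood = 0.90   # P(Italy tops its qualifying group | Italy wins the Cup)
    evidence = 0.30     # P(Italy tops its qualifying group), over all scenarios

    posterior = bayes(prior, likelihood, evidence)
    print(round(posterior, 2))  # 0.3: the evidence has tripled the initial belief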

Technically, a probability simply measures "how often" an event occurs. The probability of getting tails is 50% because, if you toss a coin many times, you will get tails half of the time. But that is not the way we normally use probabilities: we use them to express a belief. A proponent of probabilities as a measure of somebody's preferences was the US mathematician Leonard Savage, who in the 1950s thought of the probability of an event not merely as the frequency with which that event occurs, but also as a measure of the degree to which someone believes it “will” happen.
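
A short simulation (an illustration of the frequency interpretation, not an argument from the book) shows what "how often" means: the fraction of tosses that come up tails converges to the probability of tails.

    import random

    # Frequency interpretation: probability = long-run fraction of occurrences.
    tosses = [random.choice(["heads", "tails"]) for _ in range(100_000)]
    frequency = tosses.count("tails") / len(tosses)
    print(round(frequency, 3))  # close to 0.5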

The problems with probabilities are computational. Bayes' theorem, the main tool to propagate probabilities from one event to a related event, does not yield intuitive conclusions. For example, the accumulation of evidence tends to lower the probability, not to increase it (combining many pieces of evidence means multiplying numbers smaller than one, so the joint probability keeps shrinking). Also, the sum of the probabilities of all possible events must be one, and that too is not very intuitive. Our beliefs are not consistent: try assigning probabilities to a complete set of beliefs (e.g., the probability of winning the World Cup for each country of the world) and see if they add up to 100%. In order to satisfy the postulates of probability theory, one has to change one's beliefs and make them consistent, i.e. tweak them so that the sum is 100%.
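
The normalization step can be sketched in a few lines of Python (the "winning chances" below are invented): subjective beliefs rarely add up to 100%, and probability theory forces us to rescale them until they do.

    # Invented subjective beliefs about the next World Cup: they sum to 1.25.
    beliefs = {"Italy": 0.30, "Brazil": 0.40, "Germany": 0.35, "others": 0.20}
    total = sum(beliefs.values())

    # To satisfy the axioms, each belief must be tweaked so that the sum is 1.
    consistent = {team: b / total for team, b in beliefs.items()}
    print({team: round(p, 2) for team, p in consistent.items()})
    # {'Italy': 0.24, 'Brazil': 0.32, 'Germany': 0.28, 'others': 0.16}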

Bayes rule ("the probability of a hypothesis being true is proportional to the initial belief in it, multiplied by the conditional probability of an observational data, given that prior probability") would be very useful to build generalizations (or induction), but, unfortunately, it requires one to know the initial belief, or the "prior" probability, which, in the case of induction, is precisely what we are trying to assess.

Between the late 1960s and the 1970s the mathematicians Arthur Dempster and Glenn Shafer devised a “Theory of Evidence” aimed at making Probability Theory more plausible.  They introduced a "belief function" which operates on all subsets of events (not just the single events).  In the throw of a die, the possible outcomes are only six, but the number of subsets is 64 (all the combinations of faces, from the empty set up to the set of all six faces). In their theory the beliefs assigned to all the subsets sum to one, while the beliefs assigned to the single events can sum to less than one. Dempster-Shafer's theory thus allows one to assign a belief to a group of events even when the probability of each single event is not known.  Indirectly, Dempster-Shafer's theory also allows one to represent "ignorance", as the state in which the belief in an event is not known (while the belief in a set it belongs to is known).  In other words, Dempster-Shafer's theory does not require a complete probabilistic model of the domain.
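
The following sketch (with invented masses, and only one of many ways to encode the idea) shows what a belief function over the subsets of a die looks like: the masses assigned to the subsets sum to one, the single faces receive less than one, and the mass left on the whole set represents ignorance.

    # Frame of discernment: the six faces of a die.
    faces = frozenset(range(1, 7))

    # Basic belief assignment over subsets (not over single faces); sums to 1.
    mass = {
        frozenset({6}): 0.3,        # evidence specifically for "six"
        frozenset({2, 4, 6}): 0.4,  # evidence only for "an even face"
        faces: 0.3,                 # the remainder is plain ignorance
    }

    def belief(hypothesis):
        """Total mass committed to subsets wholly contained in the hypothesis."""
        return sum(m for subset, m in mass.items() if subset <= hypothesis)

    print(belief(frozenset({6})))        # 0.3
    print(belief(frozenset({2, 4, 6})))  # 0.7 = "six" + "an even face"
    print(belief(faces))                 # 1.0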

An advantage of the theory of evidence over plain probabilities (and a more plausible model of our behavior) is its ability to narrow the set of hypotheses as evidence accumulates.
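
Dempster's rule of combination makes this narrowing visible. The sketch below (again with invented masses) merges two independent pieces of evidence about the same die by intersecting their subsets: the combined belief ends up concentrated on the narrower subset {4, 6}.

    faces = frozenset(range(1, 7))

    m1 = {frozenset({2, 4, 6}): 0.6, faces: 0.4}  # "probably an even face"
    m2 = {frozenset({4, 5, 6}): 0.7, faces: 0.3}  # "probably a high face"

    combined, conflict = {}, 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2

    # Discard the conflicting mass (zero in this example) and renormalize.
    combined = {s: v / (1 - conflict) for s, v in combined.items()}
    for subset, v in sorted(combined.items(), key=lambda kv: -kv[1]):
        print(sorted(subset), round(v, 2))
    # [4, 6] 0.42   [4, 5, 6] 0.28   [2, 4, 6] 0.18   [1, 2, 3, 4, 5, 6] 0.12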

 

