These are excerpts and elaborations from my book "The Nature of Consciousness"
The Computational Theory of the Mind

The US philosopher Hilary Putnam (“Minds and Machines”, 1960)
focused on the fact that the same mental state may be implemented by different
physical states. For example, each person has a different brain, but every
person has the same psychological states of "fear",
"happiness", etc. Even other animals exhibit some of the same states.
Putnam classified mental states based on their function, i.e. their causal
roles within the mental system, regardless of their physical structure.
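Putnam's point can be sketched with a toy program (a hypothetical illustration, not anything from Putnam's text): two utterly different "physical" realizations — a lookup table and branching logic — implement the same transition table, so they are in the same functional states. The state names and stimuli below are invented for the example.

```python
# Toy illustration of multiple realizability: a functional state such as
# "fear" is defined only by its causal role -- what it does, given a
# stimulus -- not by how it is physically realized.

def brain_a(state, stimulus):
    """One realization: a lookup table."""
    table = {("calm", "threat"): ("fear", "flee"),
             ("fear", "safety"): ("calm", "rest")}
    return table[(state, stimulus)]

def brain_b(state, stimulus):
    """A physically different realization: branching logic."""
    if state == "calm" and stimulus == "threat":
        return ("fear", "flee")
    if state == "fear" and stimulus == "safety":
        return ("calm", "rest")

# Functionally identical: "fear" is whatever plays this causal role.
assert brain_a("calm", "threat") == brain_b("calm", "threat") == ("fear", "flee")
```

On this view, two systems that share the transition table share their mental states, whatever they are made of.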
Physical states and mental states can thus be classified in different, cross-cutting ways. Putnam then suggested that
the psychological state of an individual
should be identified with the state of a so-called “Turing machine” (basically, with a
computer). A psychological state would cause other psychological states
according to the machine's operations. Belief and desire would correspond to
formulas stored in two registers of the machine. Appropriate algorithms would
process those contents to produce action. Putnam’s idea led to a
special case of identity theories, the “computational theory of the mind”. The "representational
theory of the mind", developed by the US philosopher Jerry Fodor, is an evolution of Putnam’s ideas. Fodor argues that the mind is a
symbolic processor. Knowledge of the world is embedded in mental representations,
and mental representations are symbols, which possess their causal role in
virtue of their syntactic properties (i.e., in virtue of how they can be used
in “computing” operations). The mind is endowed with a set of rules to operate
on such representations. Cognitive life is the transformation of those representations by those rules.
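Such rule-governed, purely syntactic transformation can be sketched as follows (a toy example of my own, not Fodor's formalism): a single inference rule fires on the *shape* of the stored formulas alone; the processor never interprets what "rain" or "wet-streets" mean.

```python
# A minimal sketch of purely syntactic symbol manipulation: the rule
# matches the structural pattern ("if", p, q) together with p, and adds q.
# The tokens themselves are uninterpreted.

def modus_ponens(beliefs):
    """Derive q whenever both p and ("if", p, q) are among the beliefs."""
    derived = set(beliefs)
    for b in beliefs:
        if isinstance(b, tuple) and b[0] == "if" and b[1] in beliefs:
            derived.add(b[2])
    return derived

beliefs = {"rain", ("if", "rain", "wet-streets")}
assert "wet-streets" in modus_ponens(beliefs)
```

The new symbol is produced by pattern matching alone, which is the sense in which such a processor "does not know" what its symbols mean.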
The mind processes symbols without knowing what those symbols mean, in a purely
syntactic fashion. Behavior is due only to the internal syntactic structures of
the mind. The symbols used to build
mental representations belong to a language of thought, or
"mentalese". This language cannot be one of the languages we speak, because
the very ability to speak already presupposes an internal language of
representation. It is an intrinsic part of the brain, produced somehow by
evolution. A belief, for example, is realized
as a sentence in the language of thought which resides in the belief area of
the brain ("I believe that my name is Piero" is implemented in the
belief area by the translation in the language of thought of the English
sentence "My name is Piero"). This inner language of
thought is shared by all creatures capable of “propositional attitudes” (the
simplest form of thought, such as beliefs, hopes, fears, desires). Such
creatures can then express their representations in whichever human or animal
language they happen to speak. Fodor basically offers a solution to
the problem faced by dualists: how to connect the mind and the body, mental
states and physical states, the desire to do something and the act of doing it.
Beliefs and desires are information, represented by symbols, and symbols are
physical states of a processor, and the processor is connected to the muscles
of the body. When the symbols change, they have an impact on the body, they
cause behavior. At the same time, perception results in a change of those
symbols. The processor, in turn, may change the symbols itself, for example by
combining several of them into a new one (reasoning). Mind and body communicate via
symbol processing. Fodor’s computational theory is consistent with the theories offered by the US
linguist Noam Chomsky in linguistics and later by the British psychologist David Marr in vision: the mind as a set of
modules that “compute” something based on an innate symbolic capability. Noam
Chomsky spoke of "mental
organs", to relate their role to the role of physical organs. Each organ
carries out a function and communicates the results to the other organs. Fodor generalizes their ideas: the
mind is made of genetically-specified modules, each one specialized in
performing one task. A module corresponds to a physical region of the brain,
and is isolated from other modules. A module receives input only from modules
of lower level, never from higher levels (for example, a belief cannot
influence the working of a module that analyzes sensory data). Each module
generates output in a common format, the "language of thought". Their
outputs are input to the central processor, which manages long-term memory and
manufactures beliefs. The central processor is the only module that is not
domain-specific. Every other module deals with a specific domain. Fodor does not seem to contemplate
cognitive growth: the modules are fixed at birth and remain the same throughout
the life of the individual. The approach of the US
philosopher Stephen Stich is even more radically syntactic:
he even rejects the notion that each object of a mental operation must
represent something (or stand for something). Stich assumes that cognitive
states map onto syntactic states in such a way that the causal relationships
among cognitive states (or between cognitive states, stimuli, and actions)
mirror the formal relationships among the corresponding syntactic objects. His
“mind” is a purely syntactic program. The US philosopher Ned Block believes that the psychological
state of a person can be identified with the physical process that is taking
place in the brain rather than the state in which the brain is. A psychological
state can then be represented as an operation performed on a machine, i.e.
identified with the computational state of the machine, rather than with its
physical state. This way the psychological state does not depend on the
physical state of the machine and can be the same for different machines that
are in different physical states, but in which the same process is occurring. Block has, in fact, provided the
broadest criticism of functionalism (“Troubles with Functionalism”, 1978).
“Qualia” (the sensations associated with being in a given
psychological state) are not easily explained in a functionalist view. Take an
organism whose functional states are identical to ours, but in which pain
causes the sensation that we associate with pleasure (“inverted qualia”), and an
organism whose functional states are identical to ours, but in which pain
causes no sensation (“absent qualia”): functionalism cannot, apparently,
account for either case. Furthermore, functionalism does not prescribe how we
can limit the universe of systems that have mental states. A functionalist
might think that even Bolivia's economy, as expertly manipulated by a
financier, has mental states.