Book Reviews

Additions to the Bibliography on Mind and Consciousness

compiled by Piero Scaruffi


(Copyright © 2000 Piero Scaruffi | Legal restrictions )

Baars Bernard: A COGNITIVE THEORY OF CONSCIOUSNESS (Cambridge Univ Press, 1993)

Click here for the full review

Baars Bernard: IN THE THEATER OF CONSCIOUSNESS (Oxford Univ Press, 1996)

An introduction to the study of consciousness and a review of Baars' own "Global Workspace" theory.
Baars employs the metaphor of the "theater": consciousness is the stage, the specialized modules of the mind are the actors, and the actors compete for the spotlight.

Bach Emmon: UNIVERSALS IN LINGUISTIC THEORY (Holt, Rinehart & Winston, 1968)

A collection of four essays on linguistics, the longest one being Charles Fillmore's seminal "The Case for Case".
Fillmore's grammar assumes that each sentence explicitly represents the relationships between concepts and actions. A universal underlying set of case-like relations plays a key role in determining syntactic and semantic relations in all languages. A sentence is represented by identifying its "cases", analogous to noun cases. Sentences that deliver the same meaning with different words are therefore represented in the same way.

Bach Emmon: CATEGORIAL GRAMMARS (Reidel, 1988)

A collection of essays on what Yehoshua Bar-Hillel defined in 1960 as categorial grammar, providing an excellent historical introduction to the field.
In contrast to linguistic analyses based on phrase structure grammars, in a categorial grammar every item of a language belongs to one or more categories; a category can be either basic or derived; derived categories are defined in terms of basic or derived categories in a compositional way. Expressions belonging to derived categories may be identified with functions that map expressions of one constituent category into expressions of another constituent category.
Categorial grammars adhere to three principles: language is seen in terms of functions and arguments rather than constituent structure (dependency grammar rather than phrase-structure grammar); a tight correspondence is imposed between syntax and semantics such that every rule of syntax is also a rule of semantics (the rule-to-rule hypothesis); monotonicity is always favored at the expense of destructive devices which characterize transformational grammars.
Categorial grammars are based on the algebraic notions of function and argument and can therefore be represented using Church's lambda operator. The Lambek calculus was the first major mathematical tool for the field.
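The function-and-argument style of combination can be sketched in a few lines of code (a toy illustration; the lexicon, the tuple encoding of slash categories and the category names are my own assumptions, not drawn from the book):

```python
# Toy categorial grammar: categories are either basic strings ("NP", "S")
# or functor tuples (result, slash, argument). Forward application
# combines A/B with B to yield A; backward application combines B with B\A.

def apply_forward(functor, arg):
    """A/B combined with a following B yields A."""
    if isinstance(functor, tuple) and functor[1] == "/" and functor[2] == arg:
        return functor[0]
    return None

def apply_backward(arg, functor):
    """B combined with a following B\\A yields A."""
    if isinstance(functor, tuple) and functor[1] == "\\" and functor[2] == arg:
        return functor[0]
    return None

lexicon = {
    "John":   "NP",
    "sleeps": ("S", "\\", "NP"),               # S\NP: takes an NP to its left
    "loves":  (("S", "\\", "NP"), "/", "NP"),  # (S\NP)/NP: transitive verb
}

# "John sleeps": NP + S\NP -> S
assert apply_backward(lexicon["John"], lexicon["sleeps"]) == "S"

# "loves Mary": (S\NP)/NP + NP -> S\NP, then "John" + S\NP -> S
vp = apply_forward(lexicon["loves"], "NP")
assert apply_backward("NP", vp) == "S"
```

Note how each syntactic combination is simultaneously a semantic one (the functor applies to its argument), which is exactly the rule-to-rule hypothesis mentioned above.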
Categorial grammars involve semantic categories, in agreement with Edmund Husserl's meaning categories and Stanislaw Lesniewski's and Kazimierz Ajdukiewicz's logics.
Bach has improved the original model by allowing categories to have internal structures that define the features that are relevant to determine lexical and syntactic properties. Categories can then be viewed as clusters of features.

Bach Emmon: SYNTACTIC THEORY (Holt, Rinehart & Winston, 1974)

An in-depth treatment of transformational grammars for linguists that summarizes the progress made in the early Seventies and updates Bach's earlier "Introduction to Transformational Grammars". It contains a long introduction to Chomsky's "Aspects of the Theory of Syntax".

Baddeley Alan: WORKING MEMORY (Clarendon Press, 1986)

Baddeley developed a theory of working memory based on three subsystems: a central executive (which Baddeley concedes remains largely a residual area of ignorance) and two passive storage systems, a speech system and a visual system.

Baddeley Alan: YOUR MEMORY (MacMillan, 1982)

An introduction to the functioning and structure of memory for the broad audience. Baddeley assumes the existence of three types of memory: long-term (both episodic and semantic), short-term and sensory memory.

Baddeley Alan: ATTENTION (Oxford University Press, 1994)

A tribute to Donald Broadbent in the form of a collection of essays on his contributions to various cognitive tasks.

Baddeley Alan: HUMAN MEMORY (Simon & Schuster, 1990)

An introduction to the theories of memory for the broad audience.

Bailenson, Jeremy & Jim Blascovich: "Infinite Reality" (William Morrow, 2011)

Click here for the full review


Ballard Dana: COMPUTER VISION (Prentice Hall, 1982)

This monumental book describes a detailed computational model of how representations of physical objects can be constructed from images.

Baltes Paul: LIFE-SPAN DEVELOPMENT AND BEHAVIOR (Academic Press, 1984)

Baltes' theory of dual processes assumes that intelligence as information processing is universal and biological, whereas intelligence as knowledge pragmatics is acquired through experience and therefore influenced by cultural factors.

Bar-Hillel Yehoshua: LANGUAGE AND INFORMATION (Addison Wesley, 1964)

By building on Lesniewski's and Ajdukiewicz's semantic categories, Bar-Hillel defined a variant of phrase structure grammar that he called categorial grammar, in which "every sentence is the result of the operation of one continuous part of it upon the remainder, these two parts being the immediate constituents of the sentence, such that these constituents are again the product of the operation of some continuous part upon the remainder, etc".

Julian Barbour: THE END OF TIME (Oxford Univ Press, 2000)

Click here for the full review

Barkow Jerome, Cosmides Leda, Tooby John: THE ADAPTED MIND (Oxford Univ Press, 1992)

A collection of articles on evolutionary psychology (i.e., the evolution of the mind).
The mind consists of specialized modules designed by natural selection to solve problems in the environment that have to do with survival and reproduction.
Evolutionary psychology is the descendant of Wilson's sociobiology.

Barnet Ann: THE YOUNGEST MINDS (Simon & Schuster, 1998)

Click here for the full review

Barr Avron & Feigenbaum Ed: HANDBOOK OF ARTIFICIAL INTELLIGENCE (William Kaufmann, 1982)

A monumental catalog of models and techniques for A.I. professionals and researchers.

Barsalou Lawrence: COGNITIVE PSYCHOLOGY (Lawrence Erlbaum, 1992)

An introduction to the field.

Bartlett Frederic Charles: REMEMBERING (Cambridge Univ Press, 1967)

In 1932 Bartlett developed one of the earliest cognitive models of memory. Bartlett noted how memory cannot remember all the details, but can "reconstruct" the essence of a scene. Events cannot be stored faithfully, but must have been summarized into a different form, a "schema". Individuals do not passively record stories verbatim, but rather actively code them in terms of schemas, and then can recount the stories by retranslating the schemas into words.
Each new memory is categorized in a schema which depends on the already existing schemas. In practice, only what is strictly necessary is added. When a memory must be retrieved, the corresponding schema provides instructions to reconstruct it.
That is why recognizing an object is much easier in its typical context than in an unusual context.

Barwise John & Perry John: SITUATIONS AND ATTITUDES (MIT Press, 1983)

Inspired by Gibson's ecological realism, Barwise proceeds to undo Frege's theory of meaning (that meaning is located in the world of sense). The world is full of meaning and information that living organisms can use.
Meaning is not exclusive to language; it is pervasive in nature ("smoke means fire"). Meaning involves the informational content of situations and arises from regularities in the world. Reality is made of situations. Sentences stand for situations. The semantic value of a sentence is a set of abstract situations. Meaning arises out of recurring relations between situations.
Barwise's formalism employs Kleene's partial functions (which deal with finite amounts of information).
Reality comes in situations. Situations are made of objects and spatio-temporal locations; objects have properties and stand in relations. Therefore, a situation is described by a set of relations between objects.
A situation-type is a partial relation from n-ary relations and n individuals to the values true and false. A course of events is a partial function from locations to situation-types. Therefore a course of events at a location on which it is defined yields a situation-type. A state of affairs is a course of events which is defined on just one location.
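One way to render these definitions formally (my own notation, not necessarily Barwise's):

```latex
% Situation-type: a partial function from relation/object tuples to truth values
\[ s : R_n \times O^n \rightharpoonup \{\mathit{true}, \mathit{false}\} \]
% Course of events: a partial function from locations to situation-types
\[ coe : Loc \rightharpoonup \mathit{SitType} \]
% State of affairs: a course of events defined on exactly one location
\[ soa : Loc \rightharpoonup \mathit{SitType}, \qquad |\mathrm{dom}(soa)| = 1 \]
```

The harpoon arrows mark partial functions, reflecting the partiality of information that situation semantics insists on.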
A living organism (a part of reality capable of perception and action) must be able to cope with the ever new situations of its course of events and to anticipate the future course of events. It must be able to pick up information about one situation from another situation. This can be realized by identifying similarities between situations and relations between such similarities. Each organism performs this process of breaking down reality in a different way, as each organism "sees" reality in a different way, based on its ecological needs.
The type of a situation is determined by the regularities that the situation exhibits. Regularities are invariants, different for each organism and acquired by adaptation to the environment, that define its behavior in the environment. These similarities between various situations make it possible for an organism to make sense of the world. At the same time they are understood by all members of the same species, by a whole "linguistic community".
Formally, one situation can contain information about another situation only if there is a relation that holds between situations sharing similarities with the former situation and situations sharing similarities with the latter situation. In that case the first situation "means" the second. A meaning is a relation between different types of situations. In situational semantics the meaning of a declarative sentence is a relation between utterances and described situations.
Therefore, constraints between types of situations are actual and yield meaning. Meaning is defined as relations that allow one situation to contain information about another situation.
Situational semantics solves the semantic problems of granularity, context and factorization by expressing properties and relations as primitive entities. By assuming that sentences stand for situations, it avoids all the pitfalls of the logical tradition, for which sentences stand for truth values.
Situations are more flexible than possible worlds because they don't need to be coherent and don't need to be maximal. Just like mental states.
Indexicals are held to represent not only a few isolated words such as "I" and "now" but the way the speaker exploits the discourse context. They play a key role in the way language conveys information.
Propositional attitudes report relations to situations.
The book also contains a theory of knowledge and beliefs that is similar to Dretske's. An agent knows that p if the agent has a belief that p and that belief carries the information that p.

Barwise Jon: THE SITUATION IN LOGIC (Cambridge Univ Press, 1988)

A collection of a few historical papers by Barwise on situation theory and situation semantics, including philosophical discussions, replies to criticism and an introduction to the mathematical rudiments. Barwise also extends and refines a few of his original concepts.
Logic should be studied from the perspective of information, information processing and information communication. Barwise emphasizes the relational nature of information (e.g., perception is a relation between perceiver and perceived) and the circumstantial nature of information (information is information about the world).
Situation semantics emphasizes two related phenomena: efficiency of language and partiality of information. Situation semantics offers a relation theory of meaning: the meaning of a sentence provides a constraint between the utterance and the described situation.

Bates, Elizabeth: THE EMERGENCE OF SYMBOLS (Academic Press, 1979)

Click here for the full review

Bateson Gregory: MIND AND NATURE (Dutton, 1979)

Click here for the full review

Bateson Gregory: STEPS TO AN ECOLOGY OF MIND (Chandler, 1972)

Click here for the full review

Bechtel William: PHILOSOPHY OF MIND (Lawrence Erlbaum, 1988)

A broad and accessible survey of various schools of philosophy of mind. The book is organized around three topics: language, intentionality and mind-body problem. As far as language goes it covers referential analysis of meaning (Frege, Russell), speech act theory (Austin, Searle, Grice), holistic analysis of meaning (Quine, Davidson), Kripke's possible world semantics and Putnam's causal theory of reference. The chapters on intentionality deal with the computational theory of mind, cybernetics, Dennett's intentional stance.
The mind-body problem is summarized from Descartes' dualism to behaviorism, identity theories, eliminative materialism and functionalism.
A survey of ancient and modern theories of the mind.

Bechtel William: PHILOSOPHY OF SCIENCE (Lawrence Erlbaum, 1988)

An introduction to logical positivism and most recent theories (Kuhn, Feyerabend, Lakatos).
A survey of modern theories of science.

Bechtel William & Adele Abrahamsen: CONNECTIONISM AND THE MIND (MIT Press, 1991)

Drawing from James McClelland, David Rumelhart and Geoffrey Hinton, the book provides a primer to connectionist networks, with examples on connectionist simulations of language and reasoning. The book includes a lengthy defense of connectionism against criticism and a survey of the impact of connectionism on other disciplines.

Becker, Ernest: "The Denial of Death" (1973)

Click here for the full review

Behe Michael: THE EDGE OF EVOLUTION (2007)

Click here for the full review

Behe Michael: DARWIN'S BLACK BOX (Free Press, 1996)

Click here for the full review

Berlin Brent & Kay Paul: BASIC COLOR TERMS (Univ of California Press, 1969)

One of the studies that established the existence of a privileged level of categorization, the basic level, from which generalizations and specializations are drawn.

Berwick Robert: PRINCIPLE-BASED PARSING (Kluwer Academic, 1991)

A collection of articles on principle-based parsing, in which a small set of fundamental principles is used to derive sentence types (such as the passive). The principles interact deductively to construct sentence types. Parsers are highly specialized inference procedures.

Bickerton, Derek: LANGUAGE AND HUMAN BEHAVIOR (University of Washington Press, 1995)

Click here for the full review

Bickerton Derek: LANGUAGE AND SPECIES (Chicago Univ Press, 1992)

Click here for the full review

Bischof Horst: PYRAMIDAL NEURAL NETWORKS (Lawrence Erlbaum, 1995)

Bischof thinks that the complex task of vision is performed effortlessly by the brain because of a massive use of hierarchical structures.

Black Max: MODELS AND METAPHORS (Cornell Univ Press, 1962)

Click here for the full review

Blackmore Susan: THE MEME MACHINE (Oxford University Press, 1998)

Click here for the full review

Blackmore, Susan: CONSCIOUSNESS - AN INTRODUCTION (Oxford Univ Press, 2004)

Click here for the full review

Block Ned: READINGS IN PHILOSOPHY OF PSYCHOLOGY (Harvard Univ Press, 1980)

A collection of articles on behaviorism (Putnam, Skinner, Chomsky), physicalism (Davidson, Fodor, Putnam, Kripke, Nagel), functionalism (Armstrong, Nagel, Lewis, Putnam, Kim), mental representations (Fodor, Dennett), imagery (Dennett, Fodor, Kosslyn, Pylyshyn), linguistics (Stich, Chomsky, Fodor, Katz).
Block offers his own critique of functionalism and his own theory of the mind.
The psychological state of a person can be identified with the physical process that is taking place in the brain rather than the state in which the brain is. The psychological state can be represented as the operation performed on a machine, i.e. with the computational state of the machine. The psychological state does not depend on the physical state of the machine and can be the same for different machines that are in different physical states.
Qualia (the sensations associated with being in a given psychological state) are not easily explained in a functionalist view. Imagine an organism whose functional states are identical to ours, but in which pain causes the sensation that we associate with pleasure (inverted qualia), or an organism whose functional states are identical to ours, but in which pain causes no sensation at all (absent qualia). Functionalism cannot account for either case.
Functionalism does not prescribe how to limit the universe of organisms that have mental states: a functionalist might have to concede that Bolivia's economy, as expertly manipulated by a financier, has mental states. Requiring class identity of internal processes as well would avoid this, but it excludes beings that we might be tempted to consider as having mental states, such as an extraterrestrial that behaves like us but is made of different material.

Block Ned: IMAGERY (MIT Press, 1981)

A collection of articles on mental imagery, including results of psychological experiments and philosophical theories. The starting point for the debate is that scientists have found no pictures or images in the brain, no internal eye to view pictures stored in memory and no means to manipulate them. Either (Fodor, Kosslyn) the brain has mental pictures that somehow represent the real-world images, or (Dennett, Pylyshyn) the brain represents images through a non-imagistic system, namely language, i.e. all mental representations are descriptional.


A collection of papers by distinguished philosophers of mind on the subject of consciousness.

Bloom, Howard: "Global Brain" (Wiley, 2000)

Click here for full review

Bloom, Howard: THE LUCIFER PRINCIPLE (Norton, 1995)

Click here for full review

Bloom, Paul: "Descartes' Baby" (2004)

Click here for full review


This is the first volume (a special issue of the Artificial Intelligence Journal) that brought together the main names in the then still young discipline of qualitative reasoning. They all share the aim of explaining a physical system's behavior through something closer to common sense than physics' dynamic equations. They conceive a physical system as made of parts that contribute to the overall behavior through local interactions. They all employ some variation of Hayes' measure space (a discrete representation of a continuous space that only deals with the significant values that determine boundaries of behavior).
The main difference in the way they model a system is in their ontologies: Kuipers adopts qualitative constraints among state variables; DeKleer focuses on the devices (pipes, valves and springs) connected in a network of constraints; Forbus deals with processes by extending Hayes' notion of history. The system behavior is almost always described by constraint propagation.
Johan DeKleer describes a phenomenon in a discrete measure space through "qualitative differential equations", or "confluences". An envisionment is the set of all possible future behaviors.
Ken Forbus defines a quantity space as a partially ordered set of numbers. Common sense is interested in knowing that quantities "increase" and "decrease" rather than on formulas yielding their values in time.
Benjamin Kuipers formalizes the problem as a sequence of formal descriptions: from the structural description he derives the behavioral description ("envisionment"), and from this the functional description. In his quantity space, besides the signs of the derivatives, what matters most are critical or "landmark" values, such as the temperature at which water undergoes a phase transition.
The other papers mainly cover practical applications.
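The quantity-space idea can be illustrated with a minimal sketch (the landmarks and the cooling scenario are invented for illustration; real qualitative simulation enumerates all possible successor states rather than picking a single one):

```python
# Toy Kuipers-style quantity space: a quantity is described not by a number
# but by the landmark interval it lies in and the sign of its derivative.

LANDMARKS = ["freezing", "ambient", "boiling"]  # partially ordered landmark values

def next_qualitative_state(interval, derivative):
    """One qualitative transition: a decreasing quantity moves toward the
    lower landmark of its interval, an increasing one toward the upper."""
    lo, hi = interval
    if derivative == "dec":
        return (lo, lo), "std"   # reaches the lower landmark and settles (steady)
    if derivative == "inc":
        return (hi, hi), "std"   # reaches the upper landmark and settles
    return interval, derivative  # steady: no change

# Hot water left in a room: between "ambient" and "boiling", and cooling.
state = (("ambient", "boiling"), "dec")
interval, deriv = next_qualitative_state(*state)
print(interval, deriv)  # -> ('ambient', 'ambient') std
```

This is the level of description common sense cares about: the water cools until it reaches the ambient landmark, with no differential equation in sight.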


An excellent selection of the articles (originally published in the Artificial Intelligence Journal) that made Artificial Intelligence, from McCarthy's circumscription to Moore's autoepistemic logic, from Newell's knowledge levels to Pearl's belief networks, from DeKleer's, Forbus' and Kuipers' qualitative reasoning to Hayes-Roth's blackboard systems. With a chapter-tribute to Newell.
McCarthy's circumscription starts from the "closed-world assumption": all relevant information is known (or, equivalently, all information that is not known can be considered false).
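The closed-world assumption can be illustrated in a few lines (the facts are a made-up example):

```python
# Closed-world assumption in miniature: whatever cannot be found in the
# knowledge base is simply treated as false, rather than unknown.

known_facts = {("bird", "tweety"), ("flies", "tweety"), ("bird", "opus")}

def holds(predicate, subject):
    """True iff the fact is explicitly in the knowledge base."""
    return (predicate, subject) in known_facts

assert holds("bird", "opus") is True
# "opus flies" is not recorded, so under the CWA it is considered false,
# even though it was never explicitly denied.
assert holds("flies", "opus") is False
```

Circumscription generalizes this idea by minimizing the extension of selected predicates instead of assuming a single closed database.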


A collection of historical papers, starting with Warren McCulloch's and Walter Pitts' "A logical calculus of the ideas immanent in the nervous system" (1943).
In "Computing machinery and intelligence" (1950) Alan Turing proposed his famous "Turing test" to decide whether a machine is intelligent or not (a computer can be said to be intelligent if its answers are indistinguishable from the answers of a human being).
John Searle's "Minds, Brains and Programs" (1980) summarizes his view that computers are purely syntactic and therefore cannot be said to be thinking. His famous thought experiment of the "Chinese room" (a man who does not know how to speak Chinese but is provided with formal rules for producing perfectly sensible Chinese answers would pass the Turing test, even though he would never know what those questions and answers were about) opened the floodgates to the arguments that computation per se will never lead to intelligence.
In the introduction Boden surveys the arguments pro and against Turing's test and the possibility of thinking machines.
Drew McDermott's "A critique of pure reason" (1987) is a critique specifically of Pat Hayes' attempt at formalizing common-sense knowledge. Most of reasoning is not deductive and therefore cannot be reduced to first-order predicate logic. McDermott argues that all logicist approaches, in particular non-monotonic logics such as the one advocated by McCarthy (circumscription), yield very weak solutions to the problem of representing knowledge in a tractable way: one cannot write axioms independent of a program for manipulating them if the inferences to be performed from them are not deductions.
In "Motives, mechanisms and emotions" Aaron Sloman analyzes emotions as states in which powerful motives respond to relevant beliefs by triggering mechanisms required by resource-limited systems. An autonomous system having many motives and finite resources is prone to internal conflicts whose resolution requires emotion-based mechanisms. Emotion is not a separate subsystem of the mind, but a pervasive feature of it. Sloman even proposes a generative grammar for emotions.

Boden Margaret: THE CREATIVE MIND (Basic, 1992)

An analysis of human creativity.

Bogdan Radu: GROUNDS FOR COGNITION (Lawrence Erlbaum, 1994)

Bogdan's teleo-evolutionary theory claims that cognitive systems are guided by the environment in their goal-driven behavior. Cognitive systems actually are the product of the evolutionary pressure of guiding behaviors towards goals. Organisms are systems that are genetically programmed to maintain and replicate themselves, therefore they must guide themselves to their goals, therefore they need to obtain relevant information about their environment, therefore they need to be cognitive. It makes evolutionary sense that cognition should appear. Central to his thinking is the concept of "goal-directedness", the result of prebiological evolution which is constantly reshaped by natural selection. Natural selection presupposes goal-directedness. Goal-directedness arises from the genes themselves, which operate goal-directedly.
Organisms manage to survive and multiply in a hostile world by organizing themselves to achieve specific, limited goals in an ecological niche. To pursue their goals, organisms evolve ways to identify and track those goals. Such ways determine which knowledge is necessary. To obtain such knowledge, organisms learn to exploit pervasive and recurrent patterns of information in the world. The information tasks necessary to manipulate such information "select" the appropriate type of cognitive faculties that the organism must be capable of.

Bohm David: "Causality and Chance in Modern Physics" (1957)

Click here for full review

Bohm David: THE UNDIVIDED UNIVERSE (Routledge, 1993)

Click here for full review

Bohm David: WHOLENESS AND THE IMPLICATE ORDER (Ark Paperbacks, 1988)

Click here for full review


A collection of articles, and a subject-indexed bibliography.
Distributed information processing systems, i.e. collections of "intelligent" agents, embody a variety of strategies of decomposition and coordination. Research in distributed A.I. focuses on such methods, and on the forms of interaction that make such methods effective.
Mike Georgeff discusses multi-agent planning. Barbara Hayes-Roth's "A blackboard architecture for control" is included. Frederick Hayes-Roth discusses ABE. Also articles by Victor Lesser, Carl Hewitt, etc.
Gasser thinks, with Mead, that intelligent behavior is essentially a social behavior and emphasizes the social aspects of the interaction among intelligent agents.

Brachman Ronald: READINGS IN KNOWLEDGE REPRESENTATION (Morgan Kaufman, 1985)

A collection of milestone essays on the topic of knowledge representation from a semantic perspective and of knowledge representation frameworks (mainly semantic networks and frames). It includes Pat Hayes' "The logic of frames" (1979) and William Woods' "What's in a link" (1975).
Hayes proves that the language of frames (with the exclusion of stereotypical reasoning) can be reduced to a notational variant of predicate logic. A frame is a micro-theory which allows very rapid inferences. On the other hand, stereotypical reasoning over default values goes against the monotonicity of classical logic.
Woods highlights that a semantic network confuses two types of representation: assertions and definitions (a taxonomic relation between concepts). A concept is equivalent to a first-order predicate. As first-order predicate logic cannot handle the intension of a concept, a semantic network must exhibit the same limitation. Woods proposes to define a concept as a set of sufficient and necessary conditions.
Ross Quillian's "Word concepts" (1967) originated the idea of a semantic network of nodes interconnected with associative links. Marvin Minsky's "A framework for representing knowledge" (1975) presented the fundamental idea of a frame, a knowledge representation formalism based on prototypes, defaults, multiple perspectives, analogies and partial matching.
James Allen's "Maintaining knowledge about temporal intervals" (1983) claims that common-sense time is subject to a number of principles, such as relativity (a date is usually specified relative to another date) and decomposability (any event can be described as a sequence of component events that take place in the same interval). These principles state the pre-eminence of the "interval" of time (time as a partial ordering of intervals) over the "instant" of time (time as a total ordering of instants).
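Allen's interval relations can be computed from interval endpoints, as in this sketch (only a few of the thirteen relations are shown; the numeric endpoints are used purely for illustration, since the algebra itself needs only the ordering of intervals):

```python
# A handful of Allen's thirteen interval relations, decided by comparing
# the start and end points of two intervals.

def allen_relation(a, b):
    """Return the Allen relation of interval a to interval b."""
    a_start, a_end = a
    b_start, b_end = b
    if a_end < b_start:
        return "before"      # a ends strictly before b starts
    if a_end == b_start:
        return "meets"       # a ends exactly where b starts
    if a == b:
        return "equal"
    if a_start > b_start and a_end < b_end:
        return "during"      # a falls strictly inside b
    if a_start < b_start and b_start < a_end < b_end:
        return "overlaps"    # a starts first and they partially overlap
    return "other"           # remaining relations omitted for brevity

assert allen_relation((1, 2), (3, 4)) == "before"
assert allen_relation((1, 3), (3, 4)) == "meets"
assert allen_relation((2, 3), (1, 4)) == "during"
assert allen_relation((1, 3), (2, 4)) == "overlaps"
```

In Allen's system these relations are the primitives, and reasoning proceeds by propagating constraints over them without ever assigning instants.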

Brady Michael & Berwick Robert: COMPUTATIONAL MODELS OF DISCOURSE (MIT Press, 1983)

Bonnie Webber is looking for a formal language to represent utterances. In addition, Candace Sidner also tries to track discourse entities (especially, the focus) over the entire duration of discourse; that involves an understanding of how (definite) anaphoras work.
James Allen thinks that minds are connected to objects via the causal connection between actions and objects, i.e. via beliefs and desires. Allen is trying to marry Austin's and Searle's theory of speech acts with Artificial Intelligence's theory of planning by assuming that speech acts are just particular cases of actions that, like all actions, must be planned. The speaker that asks a question must have a plan of speech acts in mind and, in order to answer appropriately, the other speaker must first unravel that plan. Understanding the purpose of a question helps understand indirect speech acts.

Brandon Robert: GENES ORGANISMS POPULATION (MIT Press, 1984)

A collection of seminal papers on the subject of the level at which natural selection operates.
Evolutionary theory is based upon the idea that species evolve and their evolution is driven by natural selection, but what exactly evolves and what natural selection acts on is still not clear. Nature is organized in a hierarchy: genes are located on chromosomes, chromosomes are located in cells, cells make up organs, which make up organisms, which make up populations, which make up species, which make up ecosystems: at what level does selection act?
Darwin's theory implies that what evolves is a population and what selection acts on are the competing organisms of a generation within the population.
Alfred Russel Wallace thinks that selection acts on populations as well as individuals. Wynne-Edwards (1963) thinks that selection acts on groups of organisms. Ernst Mayr (1975) thinks that genes cannot be treated as separate, individual units, that their interaction is not negligible. The units of evolution and natural selection are not individual genes but groups of genes tied into balanced adaptive systems. Natural selection favors phenotypes, not genes or genotypes.
Lewontin thinks that all entities that exhibit heritable variance in fitness (from prebiotic molecules to whole populations) are units of selection.
William Wimsatt thinks that the notion of selection must be grounded around the notion of "additive variance". This quantity determines the rate of evolution. Variance in fitness is totally additive when the fitness increase in a genotype is a linear function of the number of genes of a given type present in it. Additivity can be proven to be a special case of context-independence. If variance in fitness at a given level is totally additive, then this is the highest level at which selection operates (the entities at that level are composed of units of selection, and there are no higher-level units of selections).
Robert Brandon distinguishes levels of selection from units of selection.
David Hull distinguishes replicators (units that reproduce their structure directly, such as genes) from interactors (entities that interact directly with their environment, such as organisms). Differences in the interactions of interactors with their environment result in differential reproduction of replicators.
Hamilton's (1975) kin-selection theory and more general group-selection theories are also introduced.

Brandon Robert: ADAPTATION AND ENVIRONMENT (Princeton Univ Press, 1990)

Natural selection is defined as the process of differential reproduction due to differential fitness to a common selective environment. The "selective" environment (measured in terms of the relative fitnesses of different genotypes across time or space) is distinguished from the "external" environment and the "ecological" environment (measured using the organism itself as the measuring instrument so that only that part of the external environment that affects the organism's contribution to population growth is taken into account). The selective environment is the one that is responsible for natural selection.
Following David Hull, Brandon generalizes phenotype and genotype to "interactor" (Dawkins' "vehicle") and "replicator" and posits that selection occurs among interactors. The biosphere is hierarchically arranged and, in agreement with Lewontin, natural selection applies to any level of the hierarchy. Selection applies at different levels of the hierarchy of interactors. Interactors can be lengths of RNA or species, or even replicators (but even they behave as interactors when "naturally selected").
Brandon thinks that adaptation defines the function of a property of the organism. The only process one needs to study to understand the properties of a living organism are those that contribute to adaptation.




A monumental work on grammars as mental representations, which led to the definition of a lexical functional grammar. Half of the chapters are by Bresnan herself.
Bresnan's lexical functional grammar posits the existence of an intermediary functional level between syntactic structures and semantic structures. Two levels of syntactic structures are postulated: constituent (a standard context-free surface parse of a sentence) and functional (generated by equations associated with the context-free rules). Transformations are avoided in favor of a richer lexicon and links between nodes in the constituent and functional structures.
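To make the two levels concrete, here is a hand-worked toy example for the sentence "John sees Mary" (a sketch only: the lexical entries and feature names are illustrative, and LFG's up-arrow/down-arrow annotations are rendered as comments rather than solved by a real equation solver):

```python
# Constituent structure: an ordinary context-free parse tree.
c_structure = ("S", ("NP", "John"), ("VP", ("V", "sees"), ("NP", "Mary")))

# Lexical entries contribute functional information (illustrative features).
lexicon = {
    "John": {"PRED": "John"},
    "Mary": {"PRED": "Mary"},
    "sees": {"PRED": "see<SUBJ,OBJ>", "TENSE": "present"},
}

# Functional structure: assembled by the equations annotating the rules.
f_structure = dict(lexicon["sees"])    # up=down on VP and V: the verb is the head
f_structure["SUBJ"] = lexicon["John"]  # (up SUBJ)=down on the subject NP
f_structure["OBJ"] = lexicon["Mary"]   # (up OBJ)=down on the object NP

print(f_structure["PRED"])  # see<SUBJ,OBJ>
```

Note how no transformation is involved: the flat context-free parse and the lexical equations together determine the functional structure.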

Brillouin Leon: SCIENCE AND INFORMATION THEORY (Academic Press, 1962)

The French physicist Leon Brillouin coined the term "negentropy" for Wiener's negative entropy and formulated the "negentropy principle of information" ("Negentropy Principle of Information", 1953): since the total change in the entropy has to be greater than or equal to zero, new information in a system can only be obtained at the expense of the negentropy of some other system.
A basic point is that information does not reside within the system and is thus phenomenological.
Entropy (a measure of randomness in the state of the system) measures the lack of information.
Information is defined as the amount of uncertainty which existed before a choice was made. Information is thus the difference between the entropy of the observed state of the system and its maximum possible entropy.
Brillouin proved that the minimum entropy cost of obtaining one bit of information is k ln 2, about 10^-23 joules per kelvin.
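Brillouin's negentropy principle can be summarized in standard notation (a sketch: S denotes entropy, I information, and k Boltzmann's constant; the symbols are ours, not necessarily Brillouin's):

```latex
% Information is the gap between the maximum and the observed entropy
I = S_{\max} - S_{\mathrm{obs}}

% Negentropy principle of information: gaining information in one system
% costs at least as much negentropy elsewhere, so the total never decreases
\Delta S_{\mathrm{system}} + \Delta S_{\mathrm{environment}} \geq 0

% Minimum entropy cost of acquiring one bit of information
\Delta S \geq k \ln 2 \approx 0.96 \times 10^{-23}\ \mathrm{J/K}
```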

Broadbent Donald: PERCEPTION AND COMMUNICATION (Pergamon, 1958)

Broadbent is one of the psychologists who identified two types of memory: a "short-term memory", limited to a few pieces of information, capable of retrieving them very quickly but also decaying very quickly, and a "long-term memory", capable of large storage and much slower both in retrieving and in decaying. Broadbent thinks that short-term memory is a set of pointers to blocks of information located in long-term memory.
Broadbent enunciated the principle of "limited capacity" to explain how the brain can focus on one specific object out of the thousands perceived by the retina. The selective character of attention is due to the limited capacity of processing by the brain, which can only be conscious of so many events at the same time. Attention originates from a multitude of attentional functions in different subsystems of the brain.
Broadbent's 1958 model of memory reflected well-known features of memory: information about stimuli is temporarily retained but it will fade unless attention is turned quickly to it. The unattended information is "filtered out" without being analyzed. He draws a distinction between a sensory store of virtually unlimited capacity and a categorical short-term store of limited capacity. This is the way that a limited-capacity system such as human memory can cope with the overwhelming amount of information available in the world.
Broadbent proposed a block diagram similar to those used in computer science, thereby approaching the first computational model of memory.
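The 1958 flow diagram lends itself to a computational sketch (hypothetical class and attribute names; only the unlimited sensory store, the physical-property filter and the capacity limit are taken from the model):

```python
from collections import deque

class BroadbentFilterModel:
    """Toy sketch of Broadbent's 1958 filter model: a large sensory
    buffer feeds a limited-capacity short-term store through a
    selective filter that picks stimuli by a physical property."""

    def __init__(self, capacity=7):
        self.sensory_store = deque()   # virtually unlimited, fast-decaying
        self.short_term_store = []     # limited capacity
        self.capacity = capacity

    def perceive(self, stimulus):
        # every stimulus enters the sensory store unanalyzed
        self.sensory_store.append(stimulus)

    def attend(self, physical_property):
        # the filter passes only stimuli matching a physical cue
        # (e.g. ear of arrival); everything else fades unattended
        while self.sensory_store and len(self.short_term_store) < self.capacity:
            stimulus = self.sensory_store.popleft()
            if stimulus.get("channel") == physical_property:
                self.short_term_store.append(stimulus["content"])
        self.sensory_store.clear()  # unattended items decay

model = BroadbentFilterModel(capacity=3)
for item in [{"channel": "left ear", "content": "7"},
             {"channel": "right ear", "content": "B"},
             {"channel": "left ear", "content": "2"}]:
    model.perceive(item)
model.attend("left ear")
print(model.short_term_store)  # ['7', '2'] — only left-ear stimuli get through
```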

Broadbent Donald: DECISION AND STRESS (Academic Press, 1971)

In 1971 Broadbent modified his original information-flow model of 1958 by taking into account new physiological and psychological findings. Foremost among the changes is that stimuli may be selected by the attentional filter on the basis of semantic properties, besides their physical properties.
In 1984 Broadbent would also propose his "Maltese cross" model, consisting of four stores (sensory, short-term, long-term and motor output) with a central processing unit that controls the flow of information among them.

Brooks Daniel & Wiley E.O.: EVOLUTION AS ENTROPY (Univ of Chicago Press, 1986)

The goal of this unified theory of evolution is to integrate Dollo's law (the irreversibility of biological evolution) with natural selection. Natural selection per se only states an environmental constraint, but no directionality in time. Dollo's law is considered as a biological manifestation of the second law of thermodynamics.
Unlike Prigogine, Wiley and Brooks believe that biological systems are inherently different from dissipative structures. Biological systems owe their order and organization to their genetic information, which is inherent and inheritable. Both during growth and during evolution entropy of biological information constantly increases. Evolution is a particular case of the second law of thermodynamics and biological order is a direct consequence of it.
The creation of new species is made necessary by the second law and is a "sudden" phenomenon similar to phase changes in Physics. Phylogenetic branching is an inevitable increase in informational entropy. The interaction between species and the environment is not as important in molding evolution: natural selection mainly acts as a pruning factor. Species are systems in a state of non-equilibrium and new species are created according to the second law.
Biological systems differ from physical dissipative systems in that their order is based on properties that are inherent and heritable. Their relevant phase space is genetic. The total phylogeny is characterized by an ever increasing genetic phase space. Dissipation in biological systems is not limited to energy but also involves information. Information is transmitted to subsequent generations.
Unlike most theories of information, which use information to denote the degree to which external forces create structure within a system, Brooks and Wiley's information resides within the system and is material: it has a physical interpretation. Such information resides in molecular structure as a potential for specifying homeostatic and ontogenetic processes. As the organism absorbs energy from the environment, this potential is actualized and "converted" into structure. Over short time intervals biological systems behave like dissipative structures; over longer time intervals they behave like expanding phase-space systems.
In conclusion: by studying entropy in biological systems, Wiley and Brooks propose a nonequilibrium approach to evolution. Reproduction, ontogeny and phylogeny are examples of biological organization that exhibit irreversible behavior. Biological systems are nonequilibrium systems.

Brooks Rodney & Luc Steels: THE ARTIFICIAL LIFE ROUTE TO ARTIFICIAL INTELLIGENCE (Lawrence Erlbaum, 1995)

A collection of papers on the paradigm of situated cognition.
Brooks' 1991 papers, "Intelligence without representation" and "Intelligence without reason", were instrumental in creating a new, "situated" approach to cognition by emphasizing the interaction between an agent and its environment.
Situated agents have no knowledge. Their memory is not a locus of representation but simply the place where behavior is generated.
In Brooks' subsumption architecture behavior is determined by the structure of the environment. The cognitive system has no need to represent the world, but only how to operate in the world. There is no centralized function that coordinates the entire cognitive system, but a number of distributed decisional centers that operate in parallel, each of them performing a different task. The system does not have the explicit representation of what it is doing. It does have parallel processes that represent only their very limited goal.
The system decomposes in layers of goal-driven behavior, each layer being a network of finite-state automata, and incrementally composes its behavior through the interaction with the world.
Brooks can therefore account for the response times required in the real world. In the real world there is no clearcut difference between perception, reasoning and action.
Brooks' Ptolemaic revolution in cognitive science turns the mind into one of many agents that live in the environment. The environment, not the mind, is the center of the action.
The environment is action: continuous action, continuously changing. Only a system of separate, autonomous control systems could possibly react and adapt to such a context.
The world contains all the information that the organism needs. Therefore there is no need to represent it in the mind. The environment acts like a memory external to the organism, from which the organism can retrieve any kind of information through perception.
"Intelligent" behavior can be partitioned into a set of asynchronous tasks (eating, walking, etc), each endowed with a mechanism of perception and action. An organism can be built incrementally by gradually adding new tasks.
In other words, every intelligent being has a body!
Cognition is rational cinematics.
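The layered decomposition can be sketched in a few lines of Python (hypothetical layer names and priorities; in Brooks' actual robots each layer is a network of finite-state machines wired to sensors and actuators, and higher layers suppress the outputs of lower ones):

```python
class Layer:
    """One behavioral layer: maps a raw sensor reading directly to an
    action, with no shared world model."""
    def __init__(self, name, applies, action):
        self.name = name
        self.applies = applies   # predicate on the sensor reading
        self.action = action

def subsumption_step(layers, sensors):
    # layers are ordered from highest to lowest priority; the first
    # layer whose condition holds subsumes (suppresses) the rest
    for layer in layers:
        if layer.applies(sensors):
            return layer.action
    return "idle"

# avoiding obstacles subsumes wandering, which subsumes standing still
layers = [
    Layer("avoid", lambda s: s["obstacle_distance"] < 1.0, "turn away"),
    Layer("wander", lambda s: True, "move forward"),
]

print(subsumption_step(layers, {"obstacle_distance": 0.5}))  # turn away
print(subsumption_step(layers, {"obstacle_distance": 5.0}))  # move forward
```

Each layer represents only its own limited goal; coherent overall behavior emerges from the interaction of the layers with the world, not from a central planner.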

Brown Frank: THE FRAME PROBLEM (Morgan Kaufmann, 1987)

Proceedings of a workshop on the frame problem. Yoav Shoham identifies a qualification problem and an extended prediction problem that subsume the frame problem. Frank Brown presents a modal logic approach. Matthew Ginsberg's "Reasoning About Action" offers a solution based on the search for the nearest possible world to the current one.

Brown, Roger: A FIRST LANGUAGE (Harvard Univ Press, 1973)

The US psycholinguist Roger Brown refined Jean Piaget's "constructivist" view that language acquisition follows the acquisition of cognitive skills. According to Brown, language is acquired via a "law of cumulative complexity". Language follows the acquisition of "sensori-motor intelligence": first the child's mind develops a representation of the world in terms of objects and actions, then the child learns to speak; and that initial speech (of one-word sentences) is "semantic", i.e. the initial relation between that representation of the world and sounds is purely semantic. As mental life evolves into more and more complex structures, so does language. Language acquisition is a process of hierarchic construction, and the complexity of adult language is the result of that process. Chomsky's "universal grammar" is an illusion due to the fact that all children are programmed to develop through the same stages and achieve the same adult stage: language simply reflects the outcome of that step-by-step hierarchical process.

Brown Jason: THE LIFE OF THE MIND (Lawrence Erlbaum, 1988)

Brown's approach to the mind is neuropsychological.
"Microgenesis" assumes that the structure of perceptions, concepts and actions (and mental states in general) is not based on representations but on processing stages that last over a microtime, propagate "bottom-up", and are not conscious. A representation is but a section of a processing continuum. Mind is not simply the final representation, it is the very series of processing stages. Earlier processing stages remain part of the final stage just like a child's early stages of development persist as subconscious themes in the adult's cognitive life.
Microgenesis is the counterpart, over microtime, of ontogenesis and phylogenesis: they are expressions of the same general process over different time scales. Microgenesis is a sort of instantaneous evolution.
The theory implies that symptoms of brain damage represent normal stages of cognitive life at the microscopic level. Therefore they can be used to reconstruct cognitive life. Brown has used this technique to reconstruct the way language is produced and understood.

Bruner Jerome: SELF RECONSIDERED (1995)

Click here for full review

Bruner Jerome: A STUDY OF THINKING (Wiley, 1956)

Click here for full review

A book that helped launch the cognitive revolution in Psychology and Philosophy. Bruner concentrates on how human beings categorize objects. All cognitive activity depends upon the process of categorizing events. A category is a set of events that can be treated as if they were equivalent. Bruner employs techniques of game theory and communication theory to explain how the environment is partitioned into equivalence classes. Concept formation, or "attainment", is achieved via a number of selection (choice of instances) and reception (revision of hypothesis) strategies. In general, though, subjects categorize with probabilistic cues.

Bruner Jerome: ACTUAL MINDS POSSIBLE WORLDS (Cambridge Univ Press, 1986)

Click here for full review

Bruner Jerome: ACTS OF MEANING (Harvard University Press, 1994)

A manifesto of methodology from the man who set up the first Center for Cognitive Studies (in Cambridge, MA, in the Sixties). It proposes a "cultural psychology" that is centered on meaning, not information, and on the construction of meaning by the mind, not on the processing of information by the mind. To understand humans one must understand how their experiences are shaped by their intentional states. The form of these intentional states depends upon the symbolic systems of their culture. Biological inheritance merely imposes constraints on action; culture enables humans to transcend those biological limits. Folk psychology is but a device for people to organize their views of themselves, of others and of the world they share. Folk psychology is grounded not on a logical system but on narratives. Narrative skills arise somehow from a biological need to narrate. Even selves must be viewed in the context of culture and society: a self is distributed interpersonally.

Buchler Justus: METAPHYSICS OF NATURAL COMPLEXES (Columbia University Press, 1966)

A general discussion of complexity from a philosophical point of view. The world is unlimitedly complex, and complexity is the result of multiple relatedness among processes. Buchler adopts an ontology of processes instead of things.

Buck Ross: THE COMMUNICATION OF EMOTION (Guilford Press, 1984)

Human behavior is a function of several systems of organization: innate special-purpose processing systems (reflexes, instincts, etc), concerned with bodily adaptation and the maintenance of homeostasis, which employ a holistic, syncretic type of cognition (knowledge by acquaintance); and acquired general-purpose processing systems, concerned with making sense of the environment, which employ sequential, analytic cognition (knowledge by description). The former (associated with the right hemisphere) carry out spontaneous communication involving emotional expression; the latter (associated with the left hemisphere) carry out symbolic communication involving propositions. The former is primitive; the latter also requires the former, and may be based upon it both phylogenetically and ontogenetically.
Buck grounds his model of communication of emotions on Shannon-Weaver's theory of communication and assumes that such communication occurs via two parallel streams, one spontaneous (emotions) and one symbolic (propositions).
Communication occurs when the behavior of an individual influences the behavior of another individual. Communication of emotions, in particular, is a biologically shared signal system that has been created through an evolutionary process.
Emotion is defined as a readout of motivational systems. Buck identifies three functions of emotions: bodily adaptation to the environment, social communication with other aware beings and subjective experience. All originate from motives that must be satisfied. The emotion is a measure of how far they have been satisfied.
Buck provides both a general cognitive model of emotions and a detailed physical model of their neural processes.
Buck thinks that behavior is governed by biological (innate), epistemic (acquired) and rational (processed) factors.


Bundy Alan: THE COMPUTER MODELLING OF MATHEMATICAL REASONING (Academic Press, 1983)

Bundy introduces the notation of propositional logic, predicate logic, higher-order logics and the lambda calculus, then explains how a computer can perform automatic theorem proving using resolution, defining Horn clauses, Kowalski form and Skolem normal form along the way. The book also touches on Douglas Lenat's concept formation and Daniel Bobrow's theory formation.
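The resolution rule can be illustrated on propositional clauses (a minimal sketch, not Bundy's presentation: clauses are frozensets of literals, with negation written as a leading "~"):

```python
def negate(literal):
    # "~p" <-> "p"
    return literal[1:] if literal.startswith("~") else "~" + literal

def resolve(clause_a, clause_b):
    """Return all resolvents of two clauses: for each complementary
    pair of literals, merge the clauses and drop that pair."""
    resolvents = []
    for lit in clause_a:
        if negate(lit) in clause_b:
            resolvents.append((clause_a - {lit}) | (clause_b - {negate(lit)}))
    return resolvents

# Refute {p or q, not p, not q}: deriving the empty clause
# proves that the clause set is unsatisfiable.
c1, c2, c3 = frozenset({"p", "q"}), frozenset({"~p"}), frozenset({"~q"})
step1 = resolve(c1, c2)[0]      # {q}
step2 = resolve(step1, c3)[0]   # the empty clause
print(step2 == frozenset())     # True: contradiction derived
```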

Bunge Mario: TREATISE ON BASIC PHILOSOPHY (Reidel, 1974-83)

A monumental seven-volume synthesis of modern philosophical themes.
Volume one deals with sense and reference. Reference is not equated with extension. Intension is an irreducible semantic object. The sense of a construct is relative to the theory in which it occurs (sense depends on context).
Volume two deals with interpretation and truth. Meaning is sense together with reference. Meaning is not verifiability, truth conditions, information, etc. Bunge develops a calculus of meaning. A truth measure function (a continuous function) allows for the expression of partial truth, or degrees of truth.
Volumes three and four deal with ontology (substance, properties, change, spacetime). Reality is the aggregation of things holding spatiotemporal relations: spacetime can be understood only in terms of changing things. Spacetime must be anchored to things, not the other way around. A system is identified by three components: its composition, environment and structure. The universe is a system composed of subsystems. Everything is a system or a system component.
Organisms are particular systems with emergent properties. The unit of biological study is the organism-in-the-environment together with its subsystems (from cells to organs) and its supersystems (from population to biosphere). The mind is a collection of processes of neural systems. Society is a system made of people linked by social relations.
Volumes five and six deal with epistemology. Every cognitive activity is a neural process. Language is for transmitting knowledge and influencing behavior. Perception yields a subjective type of knowledge; conceptualizing yields objective knowledge. Perception is like copying reality to the brain. Conceptualizing goes beyond mere copying: it can form new propositions out of nonpropositional knowledge (percepts) or out of old propositions (inferring). Inference yields new propositions, not new concepts.

Bunt Harry: MASS-TERMS AND MODEL-THEORETIC SEMANTICS (Cambridge Univ Press, 1985)

The book deals with the semantic problems related to mass nouns (such as "water", "music", "luggage", etc), as opposed to count nouns. The semantics for mass terms is built on ensemble theory (an extension of mereology built around the concept "part of").

Burnham, Terry & Phelan, Jay: MEAN GENES (Basic, 2000)

Click here for full review

Buss David: THE EVOLUTION OF DESIRE (Basic, 1994)

Research on sexual behavior reveals a distinct gender gap. Natural selection has molded the brains of men and women in very different ways as a result of their different reproductive goals.

Butler, Samuel: EVOLUTION (?, 1879)

Click here for full review

Home | The whole bibliography | My book on Consciousness

(Copyright © 2000 Piero Scaruffi | Legal restrictions - Termini d'uso )