Ageno Mario: LE ORIGINI DELL'UNIVERSO (Boringhieri, 1992)
Mario Ageno shows that Boltzmann's proof contains two errors: 1. Boltzmann's model of a gas represents a discrete set of molecules as a continuum of points; 2. Boltzmann assumes that the walls containing the closed system are perfectly reflecting. If these arbitrary assumptions are dropped, no rigorous proof for the irreversibility of natural processes exists.
Aggleton John: THE AMYGDALA (Wiley-Liss, 1992)
The book explores various neurobiological aspects of emotion and memory.
Emotions are key to learning and behavior: fear conditioning imprints emotional memories that are quite permanent. The relationship between emotion and memory goes beyond fear, but fear is the emotion that has been studied most extensively. As a matter of fact, fear seems to be common ground for (at least) all vertebrates. Memories of fearful experiences are created by interactions among the amygdala, the thalamus and the cortex. Emotional memory (stored in the amygdala) differs from declarative memory (which is mediated by the hippocampus and the cortex). Emotional memory is primitive, in the sense that it only contains simple links between cues and responses. A noise in the middle of the night is enough to create a state of anxiety, without necessarily bringing back to mind full consciousness of what the origin of that noise might be. This actually increases the efficiency (at least the speed) of the emotional response.
Emotional and declarative memories are stored and retrieved in parallel. Adults cannot recall childhood traumas because in children the hippocampus has not yet matured to the point of forming conscious memories, but the emotional memory is there.
Aleksander Igor: IMPOSSIBLE MINDS (Imperial College Press, 1996)
Describes a computational model of consciousness that was implemented
(as a neural state machine)
at the Imperial College. Along the way, it briefly describes a little bit
of philosophy of mind and a lot of behavioral psychology, relating them to the model.
Allen James: NATURAL LANGUAGE UNDERSTANDING (Benjamin Cummings, 1995)
The new edition of one of the best textbooks on natural language processing, from basic parsing techniques to anaphora resolution, from discourse structure to speech acts.
Allen James: READINGS IN PLANNING (Morgan Kaufmann, 1990)
Allen's temporal logic is based on a many-sorted predicate calculus with variables ranging over "properties", "time intervals", "events", etc. Temporal relations such as "during", "before", "overlap", "meets" and "equal" are primitive, are represented by predicates and are constrained by the axioms of the logic. An instant is defined as a very small interval. Properties hold at intervals.
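The interval relations listed above can be illustrated as predicates over numeric (start, end) pairs. This is only a sketch: Allen's algebra is qualitative and treats the relations as primitives, so the numeric encoding and function names below are assumptions of the illustration.

```python
# Illustrative encodings of five of Allen's interval relations.
# An interval is a (start, end) pair with start < end.

def before(i, j):    return i[1] < j[0]              # i ends before j starts
def meets(i, j):     return i[1] == j[0]             # i ends exactly where j starts
def overlaps(i, j):  return i[0] < j[0] < i[1] < j[1]
def during(i, j):    return j[0] < i[0] and i[1] < j[1]
def equal(i, j):     return i == j
```

For example, `during((2, 3), (1, 4))` holds, in line with the idea of an instant as a very small interval contained in larger ones.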
Amari Shun-ichi & Freeman Walter: NEURAL NETWORKS AND CHAOS (Lawrence Erlbaum, 1994)
A collection of papers for a workshop on the subject.
Anderson James & Rosenfeld Edward: NEURO-COMPUTING (MIT Press, 1988)
Click here for the full review
Anderson James A.: NEURO-COMPUTING 2 (MIT Press, 1990)
Another set of historical articles, including seminal papers on Caianiello's neural equations, Wiener's cybernetics, Pribram's holographic model, Minsky's critique of perceptrons and Fodor and Pylyshyn's "Connectionism and Cognitive Architecture" on the feasibility of a compositional theory.
Anderson John Robert: THE ARCHITECTURE OF COGNITION (Harvard Univ Press, 1983)
ACT, as developed in 1976, was a cognitive architecture capable of dealing with both declarative knowledge (represented by propositional networks) and procedural knowledge (represented by production rules). The production system worked as the interpreter of the propositional network.
New production rules are learned as the system works. Complex cognitive skills can develop from a simple architecture.
ACT assumes that a cognitive system has two memories: a declarative memory (which remembers experience) and a production memory (which remembers rules learned from experience). An incremental process transforms declarative knowledge into procedural knowledge, consolidating it into ever more complex procedural chunks. Each rule is weighted according to how often it is used, and the weight determines its priority.
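The interplay of the two memories can be sketched as a toy production-system loop: a set of facts plays the role of declarative memory, rules play the role of procedural memory, and each firing strengthens the rule that fired. All names and the conflict-resolution scheme below are illustrative, not Anderson's actual formulation.

```python
# Toy weighted production system in the spirit of ACT (illustrative only).

class Production:
    def __init__(self, name, condition, action):
        self.name, self.condition, self.action = name, condition, action
        self.weight = 0            # strengthened each time the rule fires

def run(productions, facts, steps=10):
    """Fire one matching rule per cycle, highest weight first, until quiescence."""
    for _ in range(steps):
        fired = False
        for p in sorted(productions, key=lambda p: -p.weight):
            new = p.action(facts) if p.condition(facts) else None
            if new and not new <= facts:
                p.weight += 1      # usage determines future priority
                facts |= new
                fired = True
                break              # conflict resolution: one rule per cycle
        if not fired:
            break                  # no rule can add anything new
    return facts
```

For example, chaining an "a implies b" rule with a "b implies c" rule from the fact set `{"a"}` yields `{"a", "b", "c"}`.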
Anderson John Robert: THE ADAPTIVE CHARACTER OF THOUGHT (Lawrence Erlbaum, 1990)
The book explores the cognitive architecture known as ACT, which broadens the principles of production systems.
Anderson has developed a probabilistic method to explain how categories are built and how prototypes are chosen. Anderson's model maximizes the inferential potential of categories (i.e., their "usefulness"): the more a category helps predict the features of an object, the more the existence of that category makes sense. For each new object, Anderson's model computes the probability that the object belongs to one of the known categories and the probability that it belongs to a new category: if the latter is greater than the former, a new category is created.
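The decision rule just described can be sketched in a few lines. The priors (proportional to category size, with a coupling parameter for new categories) and the smoothed feature likelihoods below are illustrative choices, not Anderson's exact formulas.

```python
# Sketch of rational category assignment: join the most probable existing
# category, or start a new one when that is more probable. Illustrative only.

def feature_likelihood(category, obj):
    """P(obj's binary features | category), with add-one smoothing."""
    p = 1.0
    for f, v in obj.items():
        matches = sum(1 for member in category if member.get(f) == v)
        p *= (matches + 1) / (len(category) + 2)
    return p

def assign(categories, obj, alpha=1.0):
    n = sum(len(c) for c in categories)       # objects seen so far
    scores = [(len(c) / (n + alpha)) * feature_likelihood(c, obj)
              for c in categories]
    # prior alpha/(n+alpha) for a new category, uniform feature likelihood
    new_score = (alpha / (n + alpha)) * 0.5 ** len(obj)
    if not scores or new_score > max(scores):
        categories.append([obj])              # a new category makes more sense
    else:
        categories[scores.index(max(scores))].append(obj)
    return categories
```

Running it on two similar objects and then a dissimilar one produces two categories, mirroring the rule described above.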
Anderson John Robert: RULES OF THE MIND (Lawrence Erlbaum, 1993)
In this book Anderson looks for the psychological evidence of production systems (in particular in the area of acquisition of cognitive skills) and refines ACT into ACT-R, which includes a neural-network implementation of a production system. The book is structured as a set of articles by Anderson and others, and it includes simulation software.
Anderson James: AN INTRODUCTION TO NEURAL NETWORKS (MIT Press, 1995)
A very up-to-date 600-page survey of the mathematical foundations of neural networks that neatly organizes linear associators, perceptrons, gradient descent algorithms (ADALINE, back propagation), nearest neighbor models, Kanerva's sparse distributed memories, energy-based models (Hopfield model, Boltzmann machine), Kohonen's adaptive maps, the BSB model, etc. The sonar system of the bat is also reviewed.
Anderson Norman: A FUNCTIONAL THEORY OF COGNITION (Lawrence Erlbaum, 1996)
By applying the same principles over and over again, Anderson provides a unified
theory of cognition founded on "general cognitive algebra".
John Andreae: ASSOCIATIVE LEARNING FOR A ROBOT INTELLIGENCE (Imperial College Press, 1998)
John Andreae's goal is to build robots that
can learn like humans, have free will and eventually
consciousness. His model is based on associative learning (which he proves capable of implementing a Universal Turing Machine), and the "intelligence" of the robot is basically a growing collection of associations.
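A growing collection of associations can be sketched as nothing more than a table from (context, action) pairs to observed outcomes. The class and method names below are illustrative, not Andreae's actual system.

```python
# Toy associative learner: "intelligence" as a growing association table.

class AssociativeLearner:
    def __init__(self):
        self.assoc = {}                       # (context, action) -> outcome

    def record(self, context, action, outcome):
        self.assoc[(context, action)] = outcome   # grow the collection

    def act(self, context, actions):
        # prefer an action previously associated with a reward in this context
        for a in actions:
            if self.assoc.get((context, a)) == "reward":
                return a
        return actions[0]                     # otherwise fall back to a default
```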
Aoun Joseph: A GRAMMAR OF ANAPHORA (MIT Press, 1986)
Aoun deals with reciprocals and reflexives by proposing a generalized government-binding theory that leads to a structural unification of the notions of pronouns, empty categories and anaphors.
Arbib Michael: THE HANDBOOK OF BRAIN THEORY AND NEURAL NETWORKS (MIT Press, 1995)
This 1,000-page handbook (compiled by dozens of experts under the direction of Michael Arbib) covers topics in Psychology, Philosophy, Neurophysiology, Artificial Intelligence, self-organizing systems, neural networks, etc.
Arbib Michael: METAPHORICAL BRAIN (Wiley, 1972)
This introduction to cybernetics begins by distinguishing the simulation and emulation approaches to modeling intelligent behavior, i.e. artificial intelligence and neural networks. Then the book focuses on brain theory, treating the brain as a particular type of machine.
Arbib Michael: THE CONSTRUCTION OF REALITY (Cambridge University Press, 1986)
Click here for the full review
Arbib Michael: FROM SCHEMA THEORY TO LANGUAGE (Oxford Univ Press, 1987)
A theory of language based on Arbib's theory of schemas, with a practical implementation.
Arbib Michael: BRAINS MACHINES AND MATHEMATICS (Springer Verlag, 1987)
An introduction to some topics of cybernetics: neural networks, Turing machines, self-reproducing automata and Gödel's incompleteness theorem.
Arbib Michael: METAPHORICAL BRAIN 2 (Wiley, 1989)
The second volume greatly expands the contents of the first volume. Besides a little neuroanatomy, the focus is on mathematical analyses of neural phenomena from the perspective of action-oriented perception and in the light of Arbib's own theory of schemas. Schema theory is applied to the vision of the frog and high-level recognition, hand control and speech understanding. Along the way, mathematical models are offered to explain locomotion and eye movement; and all the main learning models (from perceptrons to the HEARSAY system, from Hopfield nets to Boltzmann machines, from backpropagation to the NETTALK system) are formally introduced.
Arbib advances a theory of consciousness: first language developed, as a tool to communicate with other members of the group in order to coordinate group action; then communication evolved beyond the individual-to-individual sphere into the self sphere.
Armstrong David Malet: BELIEF, TRUTH AND KNOWLEDGE (Cambridge University Press, 1973)
Beliefs are maps of the world (with the believer as central reference) by which the believer's actions are guided. Beliefs are states that have an internal structure: the content of the proposition believed. Beliefs may be reduced to the deep structures of Chomsky's linguistic theory. Beliefs often come in degrees: a partial belief is a degree of causal efficacy of the belief state in relation to action.
Armstrong David Malet: THE NATURE OF MIND (Cornell Univ Press, 1981)
A philosophical treatise on the dualism of the mind, which also presents Armstrong's causal theory of the mind. Mental states and physical states are identical (just as we perceive many natural phenomena without perceiving the corresponding microscopic physical processes) and a mental state is causally connected with a physical state. A state of the brain causes a mental state. Consciousness of a mental state is a perception of that mental state.
Consciousness is the perception of mental states. Its special status is purely illusory. The self is the single continuing entity that appears from the organization of introspection. The biological function of consciousness is to sophisticate the mental processes so that they yield more interesting action.
Armstrong, David Malet: THE MIND-BODY PROBLEM (Westview, 1999)
Click here for the full review
Ashby William: AN INTRODUCTION TO CYBERNETICS (Chapman & Hall, 1956)
In this book Ashby summarized a number of influential concepts. He placed emphasis on feedback, the process that allows for "homeostasis". Both machines and living beings tend to change to compensate variations in the environment, so that the combined system is stable. For living beings this translates into "adaptation" to the environment. The "functioning" of both living beings and machines depends on feedback processes. Ashby also emphasized the power of self-organizing systems, systems made of a very high number of simple units which can evolve autonomously and adapt to the environment by virtue of their structure.
In 1962 Ashby also formulated his principle of self-organization: "in any isolated system life and intelligence inevitably develop". In every isolated system subject to constant forces "organisms" arise that are capable of adapting to their environment.
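The stabilizing role of feedback can be sketched numerically: a disturbance pushes a regulated variable away from its set point, and a correction proportional to the error pulls it back, so the combined system stays near equilibrium. The gain and disturbance values below are illustrative.

```python
# Minimal homeostasis sketch: negative feedback compensating a disturbance.

def homeostat(set_point, disturbance, gain=0.5, steps=50):
    value = set_point
    for _ in range(steps):
        value += disturbance                     # environment perturbs the variable
        value += gain * (set_point - value)      # feedback compensates the deviation
    return value

with_feedback = homeostat(0.0, 1.0)              # settles near a bounded offset
without_feedback = homeostat(0.0, 1.0, gain=0.0) # disturbance accumulates to 50.0
```

With the feedback term the variable converges to a bounded offset from the set point; without it the disturbance accumulates without limit.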
Ashtekar Abhay: CONCEPTUAL PROBLEMS OF QUANTUM GRAVITY (Birkhauser, 1991)
Ashtekar is a proponent of the loop-space theory of quantum gravity. To quantize gravity physicists only need to show that gravitational waves consist of quantum force-carrying particles, or gravitons. The perturbation methods that have been developed to this purpose (and which gave rise to the theory of superstrings, infinitesimal loops of energy whose wrigglings should generate particles and forces) have largely failed because gravitons, unlike other force carriers, alter the very geometry of space and time, which in turn affects their behavior; in other words, because of gravity's inherently self-referential, non-linear nature.
By using Amitabha Sen's variable, time and space can be split into two distinct entities subject to quantum uncertainty, just like position and momentum. Ashtekar's equations generate exact solutions for quantum gravitational states that can be represented by loops (as in knot theory). The loops are tightly knitted together. Gravitons are embroidery knitted into the loops.
Austin John Langshaw: HOW TO DO THINGS WITH WORDS (Oxford Univ Press, 1962)
Austin treats language as a particular case of action, the "speech act".
Austin introduced a tripartite classification of acts performed when a person speaks. Each utterance entails three different categories of speech acts: a locutionary act (the words employed to deliver the utterance), an illocutionary act (the type of action that it performs, such as warning, commanding, promising, asking), and a perlocutionary act (the effect that the act has on the listener, such as believing or answering).
A locutionary act is the act of producing a meaningful linguistic sentence. An illocutionary act sheds light on why the speaker is uttering that meaningful linguistic sentence. A perlocutionary act is performed only if the speaker's strategy succeeds.
Austin believes that any locutionary act (phonetic act plus phatic act plus rhetic act) is part of a discourse which bestows an illocutionary force on it. All language is therefore an illocutionary act.
Austin John Langshaw: SENSE AND SENSIBILIA (Clarendon, 1962)
Austin criticizes the view that
we cannot directly perceive material objects, but only sense-data.
Austin John Langshaw: PHILOSOPHICAL PAPERS (Clarendon, 1961)
A collection of all the philosophical papers of the philosopher famous for his theory of truth as grounded in historical situations: "a statement is true when the historic state of affairs to which it is correlated by the 'demonstrative' conventions is of a type with which the sentence used in making it is correlated by the 'descriptive' conventions". Descriptive conventions correlate sentences with types of situation. Demonstrative conventions correlate statements with historic situations.