(Copyright © 2000 Piero Scaruffi | Legal restrictions - Termini d'uso )
With this book Chomsky struck a fatal blow at the behaviorist tradition of Skinner and others, which held that research should focus solely on external, measurable stimuli and responses, rather than on abstract mental entities. At the same time Chomsky reacted against structural linguistics, which was content with describing and classifying languages. Chomsky extended the idea of formal systems to linguistics by using logical formalism to express the grammar of a language.
Chomsky's idea was to concentrate on the study of grammar, and specifically syntax, i.e. on the rules that account for all valid sentences of a language. The idea was that language is based on a system of rules determining the interpretation of its infinitely many sentences.
Chomsky argued for the independence of syntax from semantics, as the notion of a well-formed sentence in the language is distinct from the notion of a meaningful sentence: his famous example "colorless green ideas sleep furiously" is grammatically well-formed but meaningless.
The phrase structure model, based on immediate constituent analysis, is a more powerful tool for the purposes of grammar than the other existing tools, but still not adequate on its own. A grammar needs a tripartite structure: a sequence of rules to generate phrase structure, a sequence of morphophonemic rules to convert strings of morphemes into strings of phonemes, and a sequence of transformational rules that transform strings with phrase structure into new strings to which the morphophonemic rules can apply.
Chomsky proposed a hierarchy that categorizes languages according to the complexity of the grammars that generate them. The simplest languages are regular languages, or type-3; type-2 languages are context free; type-1 are context-sensitive; and type-0 are recursively enumerable languages. The definitions are based on the type of rules needed to generate all the sentences of the language.
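The hierarchy can be read directly off the form of a grammar's production rules. The following is a minimal sketch (the dictionary encoding and the upper/lowercase convention for nonterminals and terminals are illustrative assumptions, not Chomsky's notation):

```python
# Hedged sketch: classify a grammar by the form of its production rules,
# following the Chomsky hierarchy. A grammar is a dict mapping a left-hand
# side string to a list of right-hand side strings. By convention here,
# uppercase letters are nonterminals and lowercase letters are terminals.

def chomsky_type(rules):
    """Return the most restrictive type (3, 2, 1 or 0) that fits all rules."""
    def is_nonterminal(s):
        return len(s) == 1 and s.isupper()

    types = []
    for lhs, rhss in rules.items():
        for rhs in rhss:
            if is_nonterminal(lhs) and (
                rhs.islower() or (rhs[:-1].islower() and rhs[-1].isupper())
            ):
                types.append(3)   # regular: A -> a ... or A -> a...B
            elif is_nonterminal(lhs):
                types.append(2)   # context-free: A -> any string
            elif len(lhs) <= len(rhs):
                types.append(1)   # context-sensitive: non-shrinking
            else:
                types.append(0)   # unrestricted
    return min(types)

# A -> aA | b is regular (type 3); S -> aSb | ab needs a context-free
# rule (type 2), since no regular grammar generates a^n b^n.
print(chomsky_type({"A": ["aA", "b"]}))    # 3
print(chomsky_type({"S": ["aSb", "ab"]}))  # 2
```

Each type properly contains the next: every regular grammar is also context-free, every context-free grammar context-sensitive, and so on, which is why the classifier returns the minimum over all rules.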
Chomsky posited the existence of two levels of language: an underlying deep structure, which accounts for the fundamental syntactic relationships among language components, and a surface structure, which accounts for the sentences that are actually uttered, and which is generated by transformations of elements in the deep structure.
A generative grammar is a system of rules that generates the grammatical sentences of the language it describes and assigns to each sentence a grammatical analysis. The simplest type of generative grammar is the finite-state grammar, but Chomsky argued that no natural language is a finite-state language. In a phrase structure grammar the elements of the sentences are identified by constituents (noun phrase, verb phrase, etc). In a transformational generative grammar the phrase structure component (which produces the "deep structure" of a sentence) is supplemented by a transformational component and a morphophonemic component (which transform the deep structure into the surface structure of the sentence, e.g. active or passive form).
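The two components can be sketched in miniature: a phrase structure grammar that rewrites constituents into words, plus one transformational rule turning an active string into a passive one. The rule names and tiny lexicon are illustrative assumptions, not Chomsky's own formulation:

```python
# Hedged sketch of a toy phrase-structure grammar with one transformational
# rule, loosely in the spirit of the book's model (lexicon and rules are
# illustrative, not Chomsky's).
import random

PHRASE_RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"]],
    "N":   [["dog"], ["cat"]],
    "V":   [["chased"]],
}

def generate(symbol="S"):
    """Expand a symbol top-down into a flat list of terminal words."""
    if symbol not in PHRASE_RULES:
        return [symbol]  # a terminal word
    expansion = random.choice(PHRASE_RULES[symbol])
    return [word for part in expansion for word in generate(part)]

def passivize(words):
    """Toy transformation: NP1 V NP2 -> NP2 'was' V 'by' NP1."""
    np1, verb, np2 = words[:2], words[2], words[3:]
    return np2 + ["was", verb, "by"] + np1

deep = ["the", "dog", "chased", "the", "cat"]   # a deep-structure string
print(" ".join(passivize(deep)))  # the cat was chased by the dog
```

The phrase structure rules generate the deep structure; the transformation maps it to an alternative surface structure carrying the same underlying syntactic relationships.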
Chomsky's computational approach had its flaws. An unrestricted (type-0) grammar is equivalent in power to a Turing machine, and by Turing's halting theorem there is no guarantee that a Turing machine's computation ever comes to an end. A parser for such a grammar might therefore never finish analyzing a valid sentence, although we have no evidence that our brain ever fails to find the meaning of a valid sentence of our own language. Later, Gold proved that, for any sufficiently rich class of languages, no amount of correct example sentences is enough to identify the language exactly.
The book was one of the milestones of cognitive science. Chomsky's formal method was influenced by mathematical logic (particularly formal systems) and the computer model (the information-processing paradigm).