Computational Complexity and Natural Language heralds an entirely new way of looking at grammatical systems. It applies the recently developed computer science tool of complexity theory to the study of natural language. The result is a unified and coherent account of how complexity theory can probe the information-processing structure of grammars, revealing why a grammar is easy or difficult to process and suggesting where to look for additional grammatical constraints.
Written primarily from the perspective of computational theory, Grammatical Basis of Linguistic Performance presents a synthesis of major recent developments in grammatical theory and their application to models of language performance. Its main thesis is that Chomsky's government-binding theory is a good foundation for models of both machine parsing and language learnability.

Both authors are at MIT. Robert C. Berwick is Assistant Professor in the Department of Electrical Engineering and Computer Science, and Amy Weinberg is in the Department of Linguistics and Philosophy.
The Acquisition of Syntactic Knowledge investigates a central question of human and machine cognition: How do people learn language, and how can we get a machine to learn it? The book presents an explicit computational model of language acquisition that can actually learn rules of English syntax from a sequence of grammatical, but otherwise unprepared, sentences.

This landmark work in computational linguistics is of great importance both theoretically and practically because it shows that much of English grammar can be learned by a simple program.
As the contributions to this book make clear, a fundamental change is taking place in the study of computational linguistics, analogous to the one that has occurred in computer vision over the past few years and indicative of trends likely to affect future work in artificial intelligence generally.