Highlighting the close relationship between linguistic explanation and learnability, Bruce Tesar and Paul Smolensky examine the implications of Optimality Theory (OT) for language learnability. They show how the core principles of OT lead to the learning principle of constraint demotion, the basis for a family of algorithms that infer constraint rankings from linguistic forms.
Of primary concern to the authors are the ambiguity of the data received by the learner and the resulting interdependence of the core grammar and the structural analysis of overt linguistic forms. The authors argue that iterative approaches inspired by work in statistical learning theory can be successfully adapted to address these interdependencies in language learning. Both OT and constraint demotion play critical roles in their adaptation. The authors support their findings both formally and through simulations. They also illustrate how their approach could be extended to other language learning issues, including subset relations and the learning of phonological underlying forms.
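To give a flavor of the learning principle the book is built around, here is a minimal sketch of recursive constraint demotion in Python. It assumes the learner's data has already been reduced to winner–loser pairs, each encoded as a dictionary mapping a constraint name to 'W' (prefers the winner), 'L' (prefers the loser), or 'e' (no preference); the encoding and names are illustrative, not the book's notation.

```python
def rcd(constraints, pairs):
    """Sketch of recursive constraint demotion.

    constraints: iterable of constraint names.
    pairs: list of winner-loser pairs, each a dict mapping a
           constraint name to 'W', 'L', or 'e'.
    Returns a list of strata (lists of constraint names),
    highest-ranked stratum first.
    """
    remaining_constraints = set(constraints)
    remaining_pairs = list(pairs)
    strata = []
    while remaining_constraints:
        # A constraint may be ranked now only if it never prefers
        # a loser in any still-unexplained pair.
        stratum = [c for c in remaining_constraints
                   if all(p.get(c, 'e') != 'L' for p in remaining_pairs)]
        if not stratum:
            raise ValueError("inconsistent data: no ranking exists")
        strata.append(sorted(stratum))
        remaining_constraints -= set(stratum)
        # Pairs with a winner-preferring constraint in this stratum
        # are now accounted for and can be discarded.
        remaining_pairs = [p for p in remaining_pairs
                           if not any(p.get(c, 'e') == 'W' for c in stratum)]
    return strata
```

For example, the pairs `{'A': 'W', 'B': 'L', 'C': 'e'}` and `{'A': 'e', 'B': 'W', 'C': 'L'}` force the ranking A ≫ B ≫ C, since B must sit below A and C below B.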
Bruce Tesar is Professor in the Department of Linguistics/Center for Cognitive Science at Rutgers University.
Paul Smolensky is Professor of Cognitive Science at Johns Hopkins University. He was a leading member of the PDP connectionist research group, and is the recipient of the 2005 David E. Rumelhart Prize in Cognitive Science, which is awarded annually to an individual or collaborative team making a significant contribution to the formal analysis of human cognition.
This work represents what is arguably the most clear-minded and far-reaching current research program on the applications of formal learning theory to the problem of language acquisition.
Stefano Bertolo, Massachusetts Institute of Technology
Tesar and Smolensky have done something remarkable here: they've managed to address an important topic in a way that's both formally rigorous and wonderfully accessible.
John J. McCarthy, Professor of Linguistics, University of Massachusetts
In this work, Tesar and Smolensky fruitfully extend their research program in Optimality-Theoretic learning theory to the crucial problem of learning 'hidden' structure: linguistic entities, such as foot structure and underlying representations, which cannot be directly detected in the learning data. This thoughtful and novel work is strongly recommended to scholars in learnability theory, as well as to anyone with an interest in Optimality Theory.
Bruce Hayes, Department of Linguistics, UCLA
In Tesar and Smolensky's groundbreaking work, grammar learning is tied intimately and inextricably to the core principles of the underlying linguistic theory. Their success in this ambitious project is a beacon to all those who see learning as a central link between generative grammar and its interpretation as cognitive science.
Alan Prince, Department of Linguistics and Center for Cognitive Science, Rutgers University