Grammar as Science offers a novel introduction to syntax as an exercise in scientific theory construction. Syntax provides an excellent instrument for introducing students from a wide variety of backgrounds to the principles of scientific theorizing and scientific thought; it engages general intellectual themes present in all scientific theorizing as well as those arising specifically within the modern cognitive sciences. The book is intended for students majoring in linguistics as well as nonmajors taking the course to fulfill undergraduate requirements. Grammar as Science covers such core topics in syntax as phrase structure, constituency, the lexicon, inaudible elements, movement rules, and transformational constraints, while emphasizing scientific reasoning skills. The individual units are organized thematically into sections that highlight important components of this enterprise, including choosing between theories, constructing explicit arguments for hypotheses, and the conflicting demands that push us toward expanding our technical toolkit on the one hand and constraining it on the other. Grammar as Science is constructed as a “laboratory science” course in which students actively experiment with linguistic data. Syntactica, a software tool that allows students to create and explore simple grammars in a graphical, interactive way, is available online in conjunction with the book. Students are encouraged to “try the rules out” and to build grammars rule by rule, checking the consequences at each stage.
Downloadable instructor resources available for this title: instructor's manual and file of figures in the book.
Experiencers—grammatical participants that undergo a certain psychological change or are in such a state—are grammatically special. As objects (John scared Mary; loud music annoys me), experiencers display two peculiar clusters of nonobject properties across different languages: their syntax is often typical of oblique arguments and their semantic scope is typical of subjects. In The Locative Syntax of Experiencers, Idan Landau investigates this puzzling correlation and argues that experiencers are syntactically coded as (mental) locations. Drawing on results from a range of languages and theoretical frameworks, Landau examines the far-reaching repercussions of this simple claim. Landau shows that all experiencer objects are grammaticalized as locative phrases, introduced by a dative/locative preposition. “Bare” experiencer objects are in fact oblique, too, the preposition being null. This preposition accounts for the oblique psych(ological) properties, attested in case alternations, cliticization, resumption, restrictions on passive formation, and so on. As locatives, object experiencers may undergo locative inversion, giving rise to the common phenomenon of quirky experiencers. When covert, this inversion endows object experiencers with wide scope, attested in control, binding, and wh-quantifier interactions. Landau’s synthesis thus provides a novel solution to some of the oldest puzzles in the generative study of psychological verbs. The Locative Syntax of Experiencers offers the most comprehensive description of the syntax of psychological verbs to date, documenting their special properties in more than twenty languages. Its basic theoretical claim is readily translatable into alternative frameworks. Existing accounts of psychological verbs either consider very few languages or fail to incorporate other theoretical frameworks; this study takes a broader perspective, informed by findings of four decades of research.
A convincing account of reduplicative phenomena has been a longstanding problem for rule-based theories of morphophonology. Many scholars believe that derivational phonology is incapable in principle of analyzing reduplication. In Distributed Reduplication, John Frampton demonstrates the adequacy of rule-based theories by providing a general account within that framework, illustrating his proposal with extensive examples of widely varying reduplication schemes from many languages. His analysis is based on new proposals about the structure of autosegmental representations. Although Frampton offers many new ideas about the computations that are put to use in reduplicative phonology, some fairly radical, his intent is conservative: to provide evidence that the model of the phonological computation developed by Chomsky and Halle in 1968 is fundamentally correct--that surface forms are produced by the successive modification of underlying forms. Frampton’s theory accounts for the surface properties of reduplicative morphemes by operations that are distributed at various points in the morphophonology rather than by a single operation applied at a single point. Lexical insertion, prosodic adjustment, and copying can each contribute to the output at different points in the computation of surface form. Frampton discusses particular reduplicative processes in many languages as he develops his general theory. The final chapter provides an extensive sequence of detailed case studies. Appendixes offer additional material on the No Crossing Constraint, the autosegmental structure of reduplicative representations, linearization, and concatenative versus nonconcatenative morphology. This volume will play a major role in the main debate of current phonological research: what is the nature of the phonological computation?
The essays in this volume address foundational questions in phonology that cut across different schools of thought within the discipline. The theme of modularity nevertheless runs through them all, and the essays demonstrate the benefits of a modular approach to phonology, either investigating interactions among distinct modules or developing specific aspects of representation within a particular module. Although the contributors take divergent views on a range of issues, they agree on the importance of representations and questions of modularity in phonology. Their essays address the status of phonological features, syllable theory, metrical structure, the architecture of the phonological component, and interaction among components of phonology. In the early 1990s the rise of Optimality Theory--which suggested that pure computation would solve the problems of representations and modularity--eclipsed the centrality of these issues for phonology. This book is unique in offering a coherent view of phonology that is not based on Optimality Theory. The essays in this book, all by distinguished phonologists, demonstrate that computation and representation are inherently linked; they do not deny Optimality Theory, but attempt to move the field of phonology beyond it.
In this highly original reanalysis of minimalist syntax, Thomas Stroik considers the optimal design properties for human language. Taking as his starting point Chomsky’s minimalist assumption that the syntactic component of a language generates representations for sentences that are interpreted at perceptual and conceptual interfaces, Stroik investigates how these representations can be generated most parsimoniously. Countering the prevailing analyses of minimalist syntax, he argues that the computational properties of human language consist only of strictly local Merge operations that lack both look-back and look-forward properties. All grammatical operations reduce to a single sort of locally defined feature-checking operation, and all grammatical properties are the cumulative effects of local grammatical operations. As Stroik demonstrates, reducing syntactic operations to local operations with a single property--merging lexical material into syntactic derivations--not only radically increases the computational efficiency of the syntactic component, but it also optimally simplifies the design of the computational system. Locality in Minimalist Syntax explains a range of syntactic phenomena that have long resisted previous generative theories, including that-trace effects, superiority effects, and the interpretations available for multiple-wh constructions. It also introduces the Survive Principle, an important new concept for syntactic analysis, and provides something considered impossible in minimalist syntax: a locality account of displacement phenomena.
This concise but wide-ranging monograph examines where the conditions of binding theory apply and in doing so considers the nature of phrase structure (in particular how case and theta roles apply) and the nature of the lexical/functional split. David Lebeaux begins with a revised formulation of binding theory. He reexamines Chomsky’s conjecture that all conditions apply at the interfaces, in particular LF (or Logical Form), and argues instead that all negative conditions, in particular Condition C, apply continuously throughout the derivation. Lebeaux draws a distinction between positive and negative conditions, which have different privileges of occurrence according to the architecture of the grammar. Negative conditions, he finds, apply homogeneously throughout the derivation; positive conditions apply solely at LF. A hole in Condition C then forces a reconsideration of the whole architecture of the grammar. He finds that case and theta representations are split apart and are only fused at later points in the derivation, after movement has applied. Lebeaux’s exploration of the relationship between case and theta theory reveals a relationship of greater subtlety and importance than is generally assumed. His arguments should interest syntacticians and those curious about the foundations of grammar.
Paul Kiparsky's work in linguistics has been wide-ranging and fundamental. His contributions as a scholar and teacher have transformed virtually every subfield of contemporary linguistics, from generative phonology to poetic theory. This collection of essays on the word—the fundamental entity of language—by Kiparsky's colleagues, students, and teachers reflects the distinctive focus of his own attention and his influence in the field.
As the editors of the volume observe, Kiparsky approaches words much as a botanist approaches plants, fascinated equally by their beauty, their structure, and their evolution. The essays in this volume reflect these multiple perspectives. The contributors discuss phonology, morphology, syntax and semantics bearing on the formal composition of the word; historical linguistic developments emphasizing the word's simultaneous idiosyncratic character and participation in a system; and metrical and poetic forms showing the significance of Kiparsky's ideas for literary theory. Collectively they develop the overarching idea that the nature of the word is not directly observable but nonetheless inferable.
Stephen R. Anderson, Arto Anttila, Juliette Blevins, Geert Booij, Young-mee Yu Cho, Cleo Condoravdi, B. Elan Dresher, Andrew Garrett, Carlos Gussenhoven, Morris Halle, Kristin Hanson, Bruce Hayes, Larry M. Hyman, Sharon Inkelas, S. D. Joshi, René Kager, Ellen Kaisse, Aditi Lahiri, K. P. Mohanan, Tara Mohanan, Cemil Orhan Orgun, Christopher Piñón, William J. Poser, Douglas Pulleyblank, J. A. F. Roodbergen, Háj Ross, Patricia Shaw, Galen Sibanda, Donca Steriade, John Stonham, Stephen Wechsler, Dieter Wunderlich, Draga Zec.
Recent research on the syntax of signed languages has revealed that, apart from some modality-specific differences, signed languages are organized according to the same underlying principles as spoken languages. This book addresses the organization and distribution of functional categories in American Sign Language (ASL), focusing on tense, agreement, and wh-constructions.
Signed languages provide illuminating evidence about functional projections of a kind unavailable in the study of spoken languages. Along with manual signing, crucial information is expressed by specific movements of the face and upper body. The authors argue that such nonmanual markings are often direct expressions of abstract syntactic features. The distribution and intensity of these markings provide information about the location of functional heads and the boundaries of functional projections. The authors show how evidence from ASL is useful for evaluating a number of recent theoretical proposals on, among other things, the status of syntactic agreement projections and constraints on phrase structure and the directionality of movement.
This concise work offers a compositional theory of verbal argument structure in natural languages that focuses on how arguments that are not “core” arguments of the verb (arguments that are not introduced by verbal roots themselves) are introduced into argument structures. Liina Pylkkänen shows that the type of argument structure variation that allows additional noncore arguments is a pervasive property of human language and that most languages have verbs that exhibit this behavior. It would be natural to hypothesize that the grammatical elements that allow for this variation are the same in different languages, but Pylkkänen, citing the differences between the inventories of verbs that allow additional arguments in English and Venda, shows the difficulties in this assumption. Either the noncore arguments are introduced by different elements with different distributions, she argues, or the introducing elements are the same and some other factor is responsible for the distributional difference. Distinguishing between these two types of explanations and articulating the properties of argument-introducing elements is the essence of Pylkkänen’s theory. Investigating the grammatical elements that allow the addition of noncore arguments, Pylkkänen argues that the introduction of additional arguments is largely carried out by seven functional heads. Following Chomsky, she claims that these belong to a universal inventory of functional elements from which a particular language must make its selection. Cross-linguistic variation, she argues, has two sources: selection, and the way a language packages the selected elements into syntactic heads.
Liina Pylkkänen is Assistant Professor of Linguistics and Psychology at NYU.
Jean-Roger Vergnaud’s work on foundational issues in linguistics has proved influential over the past three decades. At MIT in 1974, Vergnaud (now holder of the Andrew W. Mellon Professorship in Humanities at the University of Southern California) made a proposal in his Ph.D. thesis that has since become, in somewhat modified form, the standard analysis for the derivation of relative clauses. Vergnaud later integrated the proposal within a broader theory of movement and abstract case. These topics have remained central to theoretical linguistics. In this volume, essays by leading theoretical linguists attest to the importance of Jean-Roger Vergnaud’s contributions to linguistics. The essays first discuss issues in syntax, documenting important breakthroughs in the development of the principles and parameters framework and including a famous letter (unpublished until recently) from Vergnaud to Noam Chomsky and Howard Lasnik commenting on the first draft of their 1977 paper “Filters and Control.” Vergnaud’s writings on phonology (which, the editors write, “take a definite syntactic turn”) have also been influential, and the volume concludes with two contributions to that field. The essays, rewarding from both theoretical and empirical perspectives, not only offer insight into Vergnaud’s impact on the field but also describe current work on the issues he introduced into the scholarly debate.
Contributors: Joseph Aoun, Elabbas Benmamoun, Cedric Boeckx, Noam Chomsky, B. Elan Dresher, Robert Freidin, Morris Halle, Norbert Hornstein, Richard S. Kayne, Samuel Jay Keyser, Howard Lasnik, Yen-hui Audrey Li, M. Rita Manzini, Karine Megerdoomian, David Michaels, Henk van Riemsdijk, Alain Rouveret, Leonardo M. Savoia, Jean-Roger Vergnaud, Edwin Williams.
Robert Freidin is Professor of the Council of the Humanities in the Philosophy Department at Princeton University. Carlos P. Otero is Professor Emeritus of Spanish and Portuguese at the University of California, Los Angeles.
Maria Luisa Zubizarreta is Professor of Linguistics at the University of Southern California.