In this theoretical monograph, Edwin Williams demonstrates that when syntax is economical, it economizes on shape distortion rather than on distance. According to Williams, this new notion of economy calls for a new architecture for the grammatical system—in fact, for a new notion of derivation. The new architecture offers a style of clausal embedding—the Level Embedding Scheme—that predictively ties together the locality, reconstructive behavior, and "target" type of any syntactic process in a way that is unique to the model.
The linguistic turn in German philosophy was initiated in the eighteenth century in the work of Johann Georg Hamann, Johann Gottfried von Herder, and Wilhelm von Humboldt. It was developed further in the twentieth century by Martin Heidegger, and Hans-Georg Gadamer extended its influence to contemporary philosophers such as Karl-Otto Apel and Jürgen Habermas. This tradition focuses on the world-disclosing dimension of language, emphasizing its communicative function over its cognitive one.
In this book Mark Steedman argues that the surface syntax of natural languages maps spoken and written forms directly to a compositional semantic representation that includes predicate-argument structure, quantification, and information structure, without constructing any intervening structural representation.
This book offers a comprehensive survey of research on parasitic gaps, an intriguing syntactic phenomenon. The first section of the book contains a history of work on the topic and three foundational, previously published papers. The remaining three sections present new perspectives on the theory of parasitic gaps, based on data from a diverse range of languages.
The central idea of Dynamic Antisymmetry is that movement and phrase structure are not independent properties of grammar; more specifically, that movement is triggered by the geometry of phrase structure. Within a minimalist framework, movement is traced back to the need for natural language to organize words in linear order at the interface with the perceptual-articulatory module.
Until now, most discourse researchers have assumed that full semantic understanding is necessary to derive the discourse structure of texts. This book documents the first serious attempt to automatically construct and use nonsemantic computational structures for text summarization. Daniel Marcu develops a semantics-free theoretical framework that is both general enough to apply to naturally occurring texts and concise enough to support an algorithmic approach to discourse analysis.
This unusual book takes the form of a dialogue between a linguist and another scientist. The dialogue takes place over six days, with each day devoted to a particular topic—and the ensuing digressions. The role of the linguist is to present the fundamentals of the minimalist program of contemporary generative grammar. Although the linguist serves essentially as a voice for Noam Chomsky's ideas, he is not intended to be a portrait of Chomsky himself.
When we speak, we mean more than we say. In this book Stephen C. Levinson explains some general processes that underlie presumptions in communication. This is the first extended discussion of preferred interpretation in language understanding, integrating much of the best research in linguistic pragmatics from the last two decades. Levinson outlines a theory of presumptive meanings, or preferred interpretations, governing the use of language, building on the idea of implicature developed by the philosopher H. P. Grice.
This self-contained introduction to natural language semantics addresses the major theoretical questions in the field. The authors introduce the systematic study of linguistic meaning through a sequence of formal tools and their linguistic applications. Starting with propositional connectives and truth conditions, the book proceeds to quantification and binding, intensionality and tense, and related topics.