Linguists have mapped the topography of language behavior in many languages in intricate detail. To understand how the brain supports language function, however, we must take into account the principles and regularities of neural function. Mechanisms of neurolinguistic function cannot be inferred solely from observations of normal and impaired language. In The Neural Architecture of Grammar, Stephen Nadeau develops a neurologically plausible theory of grammatic function.
In Taking Scope, Mark Steedman considers the syntax and semantics of quantifier scope in interaction with negation, polarity, coordination, and pronominal binding, among other constructions. The semantics is “surface compositional,” in that there is a direct correspondence between syntactic types and operations of composition and types and compositions at the level of logical form.
When two or more languages are part of a child’s world, we are presented with a rich opportunity to learn something about language in general and about how the mind works. In this book, Norbert Francis examines the development of bilingual proficiency and the different kinds of competence that come together in making up its component parts. In particular, he explores problems of language ability when children use two languages for tasks related to schooling, especially in learning how to read and write.
In Meaningful Games, Robin Clark explains in an accessible manner the usefulness of game theory in thinking about a wide range of issues in linguistics. Clark argues that we use grammar strategically to signal our intended meanings: our choices as speaker are conditioned by what choices the hearer will make interpreting what we say. Game theory--according to which the outcome of a decision depends on the choices of others--provides a formal system that allows us to develop theories about the kind of decision making that is crucial to understanding linguistic behavior.
In The Connectives, Lloyd Humberstone examines the semantics and pragmatics of natural language sentence connectives (and, or, if, not), giving special attention to their formal behavior according to proposed logical systems and the degree to which such treatments capture their intuitive meanings. It will be an essential resource for philosophers, mathematicians, computer scientists, and linguists, as well as any scholar who finds connectives, and the conceptual issues surrounding them, to be a source of interest.
In 1995, Robert Barsky met with Noam Chomsky to discuss his work-in-progress, Noam Chomsky: A Life of Dissent (MIT Press, 1997). Chomsky told Barsky that he should focus his attention instead on midcentury linguist and activist Zellig Harris, who was, Chomsky modestly insisted, more interesting than Chomsky himself.
Chomsky showed that no description of natural language syntax would be adequate without some notion of movement operations in a syntactic derivation. It now seems likely that such movement transformations are formally simple operations, in which a single phrase is displaced from its original position within a phrase marker, but after more than fifty years of generative theorizing, the mechanics of syntactic movement are still murky and controversial.
This volume brings together contributions by prominent researchers in the fields of language processing and language acquisition on topics of common interest: how people refer to objects in the world, how people comprehend such referential expressions, and how children acquire the ability to refer and to understand reference.
In Edge-Based Clausal Syntax, Paul Postal rejects the notion that an English phrase of the form [V + DP] invariably involves a grammatical relation properly characterized as a direct object. He argues instead that at least three distinct relations occur in such a structure. The different syntactic properties of these three kinds of objects are shown by how they behave in passives, middles, -able forms, tough movement, wh-movement, Heavy NP Shift, Right Node Raising, re-prefixation, and many other tests.
Pronouns and anaphors (including reflexives such as himself and herself) may or must depend on antecedents for their interpretation. These dependencies are subject to conditions that prima facie show substantial crosslinguistic variation. In this monograph, Eric Reuland presents a theory of how these anaphoric dependencies are represented in natural language in a way that does justice to the variation one finds across languages. He explains the conditions on these dependencies in terms of elementary properties of the computational system of natural language.