What does our ability to use words—that is, our lexical competence—consist of? What is the difference between a system that can be said to understand language and one that cannot? Most approaches to word meaning fail to account for an essential aspect of our linguistic competence, namely, our ability to apply words to the world. This monograph proposes a dual picture of human lexical competence in which inferential and referential abilities are separate—a proposal confirmed by neuropsychological research on brain-damaged persons.
According to the received view of linguistic communication, the primary function of language is to enable speakers to reveal the propositional contents of their thoughts to hearers. Speakers are able to do this because they share with their hearers an understanding of the meanings of words. Christopher Gauker rejects this conception of language, arguing that it rests on an untenable conception of mental representation and yields a mistaken account of the norms of discourse.
The linguistic turn in German philosophy was initiated in the eighteenth century in the work of Johann Georg Hamann, Johann Gottfried von Herder, and Wilhelm von Humboldt. It was further developed in this century by Martin Heidegger and Hans-Georg Gadamer, and its influence extends to contemporary philosophers such as Karl-Otto Apel and Jürgen Habermas. This tradition focuses on the world-disclosing dimension of language, emphasizing its cognitive over its communicative function.
Since the early work of Montague, Boolean semantics and its subfield of generalized quantifier theory have become the model-theoretic foundation for the study of meaning in natural languages. The book uses this framework to develop a new semantic theory of central linguistic phenomena involving coordination, plurality, and scope. The proposed theory makes use of the standard Boolean interpretation of conjunction, a choice-function account of indefinites, and a novel semantics of plurals that is not based on the distributive/collective distinction.
In 1971 Jürgen Habermas delivered the Gauss Lectures at Princeton University. These pivotal lectures, entitled "Reflections on the Linguistic Foundation of Sociology," anticipate The Theory of Communicative Action and offer an excellent introduction to it. They show why Habermas considers the linguistic turn in social philosophy to be necessary and contain the first formulation of formal pragmatics, including an important discussion of truth.
In the early 1960s, the bold project of the emerging field of cognition was to put the human mind under the scrutiny of rational inquiry, through the conjoined efforts of philosophy, linguistics, computer science, psychology, and neuroscience. Forty years later, cognitive science is a flourishing academic field. The contributions to this collection, written in honor of Jacques Mehler, a founder of the field of psycholinguistics, assess the progress of cognitive science. The questions addressed include: What have we learned or not learned about language, brain, and cognition?
A machine for language? Certainly, say the neurophysiologists, busy studying the language specializations of the human brain and trying to identify their evolutionary antecedents. Linguists such as Noam Chomsky talk about machinelike "modules" in the brain for syntax, arguing that language is more an instinct (a complex behavior triggered by simple environmental stimuli) than an acquired skill like riding a bicycle.
Using sentence comprehension as a case study for all of cognitive science, David Townsend and Thomas Bever offer an integration of two major approaches, the symbolic-computational and the associative-connectionist. The symbolic-computational approach emphasizes the formal manipulation of symbols that underlies creative aspects of language behavior. The associative-connectionist approach captures the intuition that most behaviors consist of accumulated habits.
Since the late 1970s, the orthodox view of complex 'that' phrases (e.g., 'that woman eating a granola bar') has been that they are contextually sensitive devices of direct reference. In Complex Demonstratives, Jeffrey King challenges that orthodoxy, showing that quantificational accounts not only are as effective as direct reference accounts but also handle a wider range of data.
That children learn to speak so skillfully at a young age has long fascinated adults. Most children virtually master their native tongue even before learning to tie their shoelaces. The ability to acquire language has historically been regarded as a "gift"—a view given scientific foundation only in the present century by Noam Chomsky's theory of "universal grammar," which posits an innate knowledge of the principles that structure all languages.