
Language, Speech, and Communication

Functional Categories and Hierarchical Structure

Recent research on the syntax of signed languages has revealed that, apart from some modality-specific differences, signed languages are organized according to the same underlying principles as spoken languages. This book addresses the organization and distribution of functional categories in American Sign Language (ASL), focusing on tense, agreement, and wh-constructions.

In Ontological Semantics, Sergei Nirenburg and Victor Raskin introduce a comprehensive approach to the treatment of text meaning by computer. Arguing that being able to use meaning is crucial to the success of natural language processing (NLP) applications, they depart from the ad hoc approach to meaning taken by much of the NLP community and propose theory-based semantic methods.

Concept Structuring Systems

In this two-volume set Leonard Talmy defines the field of cognitive semantics. He approaches the question of how language organizes conceptual material both at a general level and by analyzing a crucial set of particular conceptual domains: space and time, motion and location, causation and force interaction, and attention and viewpoint. Talmy maintains that these are among the most fundamental parameters by which language structures conception.

What does our ability to use words--that is, our lexical competence--consist of? What is the difference between a system that can be said to understand language and one that cannot? Most approaches to word meaning fail to account for an essential aspect of our linguistic competence, namely, our ability to apply words to the world. This monograph proposes a dual picture of human lexical competence in which inferential and referential abilities are separate--a proposal confirmed by neuropsychological research on brain-damaged persons.

In this book Mark Steedman argues that the surface syntax of natural languages maps spoken and written forms directly to a compositional semantic representation that includes predicate-argument structure, quantification, and information structure, without constructing any intervening structural representation.

Recent work in theoretical syntax has revealed the strong explanatory power of the notions of economy, competition, and optimization. Building grammars entirely upon these elements, Optimality Theory syntax provides a theory of universal grammar with a formally precise and strongly restricted theory of universal typology: cross-linguistic variation arises exclusively from the conflict among universal principles.
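The core evaluation mechanism described above can be sketched in a few lines. This is an illustrative toy, not from the book: candidates compete against a ranked list of violable constraints, and the winner is the candidate whose violation profile is lexicographically best under the ranking. The constraints (NoCoda, Max) and candidate forms here are standard textbook placeholders, not examples from the text.

```python
def ot_winner(candidates, ranked_constraints):
    """Return the candidate with the lexicographically smallest
    violation profile under the given constraint ranking."""
    def profile(cand):
        return tuple(c(cand) for c in ranked_constraints)
    return min(candidates, key=profile)

# Toy constraints over candidate dicts:
# NoCoda penalizes closed syllables; Max penalizes deleted input segments.
no_coda = lambda c: sum(1 for s in c["sylls"]
                        if not s.endswith(("a", "e", "i", "o", "u")))
max_io = lambda c: c["deleted"]

candidates = [
    {"form": "pat", "sylls": ["pat"], "deleted": 0},  # faithful, has a coda
    {"form": "pa",  "sylls": ["pa"],  "deleted": 1},  # deletes /t/, open syllable
]

# With NoCoda ranked above Max, deletion wins; reversing the ranking
# makes the faithful candidate win -- the point being that typological
# variation falls out of re-ranking the same universal constraints.
print(ot_winner(candidates, [no_coda, max_io])["form"])  # pa
print(ot_winner(candidates, [max_io, no_coda])["form"])  # pat
```

Re-ranking the same two constraints flips the winner, which is exactly the sense in which cross-linguistic variation arises from conflict among universal principles.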

The Integration of Habits and Rules

Using sentence comprehension as a case study for all of cognitive science, David Townsend and Thomas Bever offer an integration of two major approaches, the symbolic-computational and the associative-connectionist. The symbolic-computational approach emphasizes the formal manipulation of symbols that underlies creative aspects of language behavior. The associative-connectionist approach captures the intuition that most behaviors consist of accumulated habits.

Typology and Process in Concept Structuring


Volume 1: Concept Structuring Systems and Volume 2: Typology and Process in Concept Structuring


The Theory of Generalized Conversational Implicature

When we speak, we mean more than we say. In this book Stephen C. Levinson explains some general processes that underlie presumptions in communication. This is the first extended discussion of preferred interpretation in language understanding, integrating much of the best research in linguistic pragmatics from the last two decades. Levinson outlines a theory of presumptive meanings, or preferred interpretations, governing the use of language, building on the idea of implicature developed by the philosopher H. P. Grice.

Approximately five percent of all children are born with the disorder known as specific language impairment (SLI). These children show a significant deficit in spoken language ability with no obvious accompanying condition such as mental retardation, neurological damage, or hearing impairment. Children with Specific Language Impairment covers all aspects of SLI, including its history, possible genetic and neurobiological origins, and clinical and educational practice.

In the information age, reading is one of the most important cognitive skills an individual acquires. A scientific understanding of this skill is important to help optimize its acquisition and performance. This book offers an interdisciplinary look at the acquisition, loss, and remediation of normal reading processes. Its two main goals are to illustrate, through state-of-the-art examples, various approaches used by scientists to understand the complex skill of reading and its breakdown, and to stimulate innovative research strategies that combine these methods.

The study of the relationship between natural language and spatial cognition has the potential to yield answers to vexing questions about the nature of the mind, language, and culture. The fifteen original contributions in Language and Space bring together the major lines of research and the most important theoretical viewpoints in the areas of psychology, linguistics, anthropology, and neuroscience, providing a much-needed synthesis across these diverse domains. Each chapter gives a clear, up-to-date account of a particular research program.

Evidence from Early Language Comprehension

Computational Models of Reading

This book highlights cutting-edge research relevant to the building of a computational model of reading comprehension, as in the processing and understanding of a natural language text or story. A distinguishing feature of the book is its emphasis on "real" understanding of "real" narrative texts rather than on syntactic parsing of single sentences taken out of context or on limited understanding of small, researcher-constructed stories.

The Resource Logic Approach
Edited by Mary Dalrymple

A new, deductive approach to the syntax-semantics interface integrates two mature and successful lines of research: logical deduction for semantic composition and the Lexical Functional Grammar (LFG) approach to the analysis of linguistic structure. It is often referred to as the "glue" approach because of the role of logic in "gluing" meanings together.

The two basic approaches to linguistics are the formalist and the functionalist approaches. In this engaging monograph, Frederick J. Newmeyer, a formalist, argues that both approaches are valid. However, because formal and functional linguists have avoided direct confrontation, they remain unaware of the compatibility of their results. One of the author's goals is to make each side accessible to the other.

The study of child language and, in particular, child syntax is a growing area of linguistic research, yet methodological issues often take a backseat to the findings and conclusions of specific studies in the field. This book is designed in part as a handbook to assist students and researchers in the choice and use of methods for investigating children's grammar. For example, a method (or combination of methods) can be chosen based on what is measured and who the target subject is.

Based on an introductory course on natural-language semantics, this book provides an introduction to type-logical grammar and the range of linguistic phenomena that can be handled in categorial grammar. It also contains a great deal of original work on categorial grammar and its application to natural-language semantics. The author chose the type-logical categorial grammar as his grammatical basis because of its broad syntactic coverage and its strong linkage of syntax and semantics.
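The "strong linkage of syntax and semantics" in categorial grammar rests on two combination rules, forward and backward application. A minimal sketch of those rules follows; it is illustrative only, not the author's system, and the category notation ("S\NP" seeks an NP to its left, "S/NP" to its right) is the standard convention rather than anything specific to this book. The slash-splitting here is naive and only handles the simple categories shown.

```python
def forward_apply(left, right):
    """X/Y applied to a Y on its right yields X."""
    if "/" in left:
        x, y = left.split("/", 1)
        if y == right:
            return x
    return None

def backward_apply(left, right):
    """A Y with an X\\Y on its right yields X."""
    if "\\" in right:
        x, y = right.split("\\", 1)
        if y == left:
            return x
    return None

# "John sleeps": NP combines with S\NP by backward application to give S.
print(backward_apply("NP", "S\\NP"))          # S
# A transitive verb (S\NP)/NP consumes its object NP by forward application,
# leaving an ordinary intransitive-verb category.
print(forward_apply("(S\\NP)/NP", "NP"))      # (S\NP)
```

Because each syntactic combination step is function application, the corresponding semantic step is also function application, which is the syntax-semantics linkage the blurb alludes to.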

A Guide to Experiments on the Acquisition of Syntax and Semantics

This introductory guide to language acquisition research is presented within the framework of Universal Grammar, a theory of the human faculty for language. The authors focus on two experimental techniques for assessing children's linguistic competence: the Elicited Production task, a production task, and the Truth Value Judgment task, a comprehension task. Their methodologies are designed to overcome the numerous obstacles to empirical investigation of children's language competence.

An Electronic Lexical Database

in cooperation with the Cognitive Science Laboratory at Princeton University


The Generative Lexicon presents a novel and exciting theory of lexical semantics that addresses the problem of the "multiplicity of word meaning"; that is, how we are able to give an infinite number of senses to words with finite means. The first formally elaborated theory of a generative approach to word meaning, it lays the foundation for an implemented computational treatment of word meaning that connects explicitly to a compositional semantics.