Computational Linguistics

Semantics and Syntax in Lexical Functional Grammar: The Resource Logic Approach
Edited by Mary Dalrymple

A new, deductive approach to the syntax-semantics interface integrates two mature and successful lines of research: logical deduction for semantic composition and the Lexical Functional Grammar (LFG) approach to the analysis of linguistic structure. It is often referred to as the "glue" approach because of the role of logic in "gluing" meanings together.

The "glue" approach has attracted significant attention from, among others, logicians working in the relatively new and active field of linear logic; linguists interested in a novel deductive approach to the interface between syntax and semantics within a nontransformational, constraint-based syntactic framework; and computational linguists and computer scientists interested in an approach to semantic composition that is grounded in a conceptually simple but powerful computational framework.

This introduction to and overview of the "glue" approach is the first book to bring together the research of the major contributors to the field.

Contributors:
Richard Crouch, Mary Dalrymple, John Fry, Vineet Gupta, Mark Johnson, Andrew Kehler, John Lamping, Dick Oehrle, Fernando Pereira, Vijay Saraswat, Josef van Genabith.

Based on an introductory course on natural-language semantics, this book provides an introduction to type-logical grammar and the range of linguistic phenomena that can be handled in categorial grammar. It also contains a great deal of original work on categorial grammar and its application to natural-language semantics. The author chose type-logical categorial grammar as his grammatical basis because of its broad syntactic coverage and its strong linkage of syntax and semantics. Although its basic orientation is linguistic, the book should also be of interest to logicians and computer scientists seeking connections between logical systems and natural language.

The book, which develops successively more powerful logical and grammatical systems step by step, covers an unusually broad range of material. Topics covered include higher-order logic, applicative categorial grammar, the Lambek calculus, coordination and unbounded dependencies, quantifiers and scope, plurals, pronouns and dependency, modal logic, intensionality, and tense and aspect. The book contains more mathematical development than is usually found in texts on natural language; an appendix includes the basic mathematical concepts used throughout the book.

WordNet: An Electronic Lexical Database

In cooperation with the Cognitive Science Laboratory at Princeton University

WordNet is an on-line lexical reference system whose design is inspired by current psycholinguistic theories of human lexical memory; version 1.6 is the most up-to-date release. The WordNet CD-ROM contains the complete version 1.6 system for PC and Macintosh computers, including the database and semantic concordance packages, along with browser software, source code, and documentation. Versions of these packages for Unix platforms are also included, as well as additional packages such as a Prolog version of the database.

If installed on a system's hard drive, the WordNet database package requires at least 22MB of disk space; installation of optional index files increases the requirement to 37MB. An additional 43MB of disk space is required to install the semantic concordance package.

The PC version of the browser runs under Windows 3.1 and Windows 95. A command-line interface runs under DOS. The Macintosh version runs on Power Macintosh systems only.
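
The database is organized around synsets, sets of synonymous word senses linked by semantic relations such as hypernymy. As a rough illustration of the kind of lookup it supports, here is a minimal sketch using the later, freely available NLTK Python interface; NLTK is not part of the CD-ROM distribution, so the sketch stands in for the packaged browser and library rather than reproducing them.

```python
# A minimal sketch of querying WordNet programmatically via NLTK
# (an assumption of this sketch; the CD-ROM ships its own browser
# and C library instead).
from nltk.corpus import wordnet as wn  # first run: nltk.download('wordnet')

# A synset groups synonymous senses and carries a gloss (definition).
for synset in wn.synsets("bank")[:3]:
    print(synset.name(), "-", synset.definition())

# Hypernym links walk up the semantic hierarchy, e.g. dog -> canine.
dog = wn.synset("dog.n.01")
print([h.name() for h in dog.hypernyms()])
```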


The Generative Lexicon presents a novel and exciting theory of lexical semantics that addresses the problem of the "multiplicity of word meaning"; that is, how we are able to give an infinite number of senses to words with finite means. The first formally elaborated theory of a generative approach to word meaning, it lays the foundation for an implemented computational treatment of word meaning that connects explicitly to a compositional semantics.



The past fifteen years have seen great changes in the field of language acquisition. New experimental methods have yielded insights into the linguistic knowledge of ever younger children, and interest has grown in the phonological, syntactic, and semantic aspects of the lexicon. Computational investigations of language acquisition have also changed, reflecting, among other things, the profound shift in the field of natural language processing from hand-crafted grammars to grammars that are learned automatically from samples of naturally occurring language.

Each of the four research papers in this book takes a novel formal approach to a particular problem in language acquisition. In the first paper, J. M. Siskind looks at developmentally inspired models of word learning. In the second, M. R. Brent and T. A. Cartwright look at how children could discover the sounds of words, given that word boundaries are not marked by any acoustic analog of the spaces between written words. In the third, P. Resnik measures the association between verbs and the semantic categories of their arguments that children likely use as clues to verb meanings. Finally, P. Niyogi and R. C. Berwick address the setting of syntactic parameters such as headedness—for example, whether the direct object comes before or after the verb.


Finite-state devices, which include finite-state automata, graphs, and finite-state transducers, are in wide use in many areas of computer science. Recently, there has been a resurgence of the use of finite-state devices in all aspects of computational linguistics, including dictionary encoding, text processing, and speech processing. This book describes the fundamental properties of finite-state devices and illustrates their uses. Many of the contributors pioneered the use of finite automata for different aspects of natural language processing. The topics, which range from the theoretical to the applied, include finite-state morphology, approximation of phrase-structure grammars, deterministic part-of-speech tagging, application of a finite-state intersection grammar, a finite-state transducer for extracting information from text, and speech recognition using weighted finite automata. The introduction presents the basic theoretical results in finite-state automata and transducers. These results and algorithms are described and illustrated with simple formal language examples as well as natural language examples.
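
As a rough illustration of the simplest such device, here is a minimal sketch of a deterministic finite-state automaton encoded as a transition table; the example language, strings over {a, b} containing an even number of a's, is a stand-in invented for this sketch rather than an example from the book.

```python
# A minimal sketch of a deterministic finite-state automaton, encoded
# as a transition table. The example language -- strings over {a, b}
# with an even number of a's -- is invented for this sketch.
TRANSITIONS = {
    ("even", "a"): "odd",
    ("even", "b"): "even",
    ("odd", "a"): "even",
    ("odd", "b"): "odd",
}
ACCEPTING = {"even"}

def accepts(string, start="even"):
    """Run the automaton over the input; reject on undefined transitions."""
    state = start
    for symbol in string:
        if (state, symbol) not in TRANSITIONS:
            return False
        state = TRANSITIONS[(state, symbol)]
    return state in ACCEPTING

print(accepts("abba"))  # True: two a's
print(accepts("ab"))    # False: one a
```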

Contributors: Douglas Appelt, John Bear, David Clemenceau, Maurice Gross, Jerry R. Hobbs, David Israel, Megumi Kameyama, Lauri Karttunen, Kimmo Koskenniemi, Mehryar Mohri, Eric Laporte, Fernando C. N. Pereira, Michael D. Riley, Emmanuel Roche, Yves Schabes, Max D. Silberztein, Mark Stickel, Pasi Tapanainen, Mabry Tyson, Atro Voutilainen, Rebecca N. Wright.

Language, Speech, and Communication series

In Elementary Operations and Optimal Derivations, Hisatsugu Kitahara advances Noam Chomsky's Minimalist Program (1995) with a number of innovative proposals. The analysis is primarily concerned with the elementary operations of the computational system for human language and with the principles of Universal Grammar that constrain derivations generated by that system. Many conditions previously assumed to be axiomatic are deduced from the interaction of more fundamental principles of Universal Grammar.

Kitahara first unifies disparate syntactic operations by appeal to more elementary operations. He then determines the set of optimal derivations involving only legitimate steps and demonstrates how, without stipulation, these derivations characterize a number of linguistic expressions that have long occupied the center of syntactic investigation.

This monograph also includes a clear explication of the distinct but closely related analyses presented in Chomsky's work of the early 1990s. This exposition makes the book attractive to the general linguistic reader as well as the professional syntactician.

Linguistic Inquiry Monograph No. 31

Combinatory Categorial Grammar (CCG) offers a new approach to the theory of natural language grammar. Coordination, relativization, and related prosodic phenomena have been analyzed in CCG in terms of a radically revised notion of surface structure. CCG surface structures do not exhibit traditional notions of syntactic dominance and command, and do not constitute an autonomous level of representation. Instead, they reflect the computations by which a sentence may be realized or analyzed, to synchronously define a predicate-argument structure, or logical form. Surface Structure and Interpretation shows that binding and control can be captured at this level, preserving the advantages of CCG as an account of coordination and unbounded dependency.

The core of the book is a detailed treatment of extraction, a focus of syntactic research since the early work of Chomsky and Ross. The topics addressed include the sources of subject-object asymmetries and phenomena attributed to the Empty Category Principle (ECP), asymmetric islands, parasitic gaps, and the relation of coordination and extraction, including their interactions with binding theory. In his conclusion, the author relates CCG to other categorial and type-driven approaches and to proposals for minimalism in linguistic theory.
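
As a rough illustration of the combinatory machinery behind such derivations, here is a minimal sketch of forward application and forward composition over a toy tuple encoding of categories; the encoding and the two-word lexicon are assumptions of this sketch, not the book's formalism.

```python
# A minimal sketch of two core CCG combinatory rules: forward
# application (X/Y  Y  =>  X) and forward composition (X/Y  Y/Z  =>  X/Z).
# Categories are encoded as nested tuples (result, slash, argument);
# this encoding and the tiny lexicon below are inventions of the sketch.

def forward_apply(left, right):
    """X/Y applied to Y yields X; return None if the rule does not fit."""
    if isinstance(left, tuple) and left[1] == "/" and left[2] == right:
        return left[0]
    return None

def forward_compose(left, right):
    """X/Y composed with Y/Z yields X/Z (the 'B' combinator)."""
    if (isinstance(left, tuple) and left[1] == "/"
            and isinstance(right, tuple) and right[1] == "/"
            and left[2] == right[0]):
        return (left[0], "/", right[2])
    return None

# Toy lexicon: "might" is (S\NP)/VP and "eat" is VP/NP.
might = (("S", "\\", "NP"), "/", "VP")
eat = ("VP", "/", "NP")

# Composition builds the transitive-verb category (S\NP)/NP for
# "might eat", the kind of nonstandard constituent CCG exploits in
# its accounts of coordination and extraction.
print(forward_compose(might, eat))  # (('S', '\\', 'NP'), '/', 'NP')
print(forward_apply(eat, "NP"))     # 'VP'
```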

Linguistic Inquiry Monograph No. 30

Combining Symbolic and Statistical Approaches to Language

Symbolic and statistical approaches to language have historically been at odds—the former viewed as difficult to test and therefore perhaps impossible to defend, and the latter as descriptive but possibly inadequate. At the heart of the debate are fundamental questions concerning the nature of language, the role of data in building a model or theory, and the impact of the competence-performance distinction on the field of computational linguistics. Currently, there is an increasing realization in both camps that the two approaches have something to offer in achieving common goals.

The eight contributions in this book explore the inevitable "balancing act" that must take place when symbolic and statistical approaches are brought together—including basic choices about what knowledge will be represented symbolically and how it will be obtained, what assumptions underlie the statistical model, what principles motivate the symbolic model, and what the researcher gains by combining approaches.

The topics covered include an examination of the relationship between traditional linguistics and statistical methods, qualitative and quantitative methods of speech translation, the study and implementation of combined techniques for the automatic extraction of terminology, a comparative analysis of the contributions of linguistic cues to a statistical word grouping system, the automatic construction of a symbolic parser via statistical techniques, the combination of linguistic and statistical methods in automatic speech understanding, an exploration of the nature of transformation-based learning, and a hybrid symbolic/statistical approach to recovering from parser failures.

Eugene Charniak breaks new ground in artificial intelligence research by presenting statistical language processing from an artificial intelligence point of view in a text for researchers and scientists with a traditional computer science background.

New, exacting empirical methods are needed to break the deadlock in such areas of artificial intelligence as robotics, knowledge representation, machine learning, machine translation, and natural language processing (NLP). It is time, Charniak observes, to switch paradigms. This text introduces statistical language processing techniques—word tagging, parsing with probabilistic context-free grammars, grammar induction, syntactic disambiguation, semantic word classes, word-sense disambiguation—along with the underlying mathematics and chapter exercises.

Charniak points out that as a method of attacking NLP problems, the statistical approach has several advantages. It is grounded in real text and therefore promises to produce usable results, and it offers an obvious way to approach learning: "one simply gathers statistics."
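
As a rough illustration of what "gathering statistics" can mean in practice, here is a minimal sketch that trains a most-frequent-tag baseline for part-of-speech tagging by counting (word, tag) pairs in a tagged corpus; the toy corpus and the backoff to NOUN are assumptions of this sketch, not Charniak's method or data.

```python
# A minimal sketch of "gathering statistics": a most-frequent-tag
# baseline for part-of-speech tagging, trained by counting (word, tag)
# pairs. The toy corpus and tag names are invented for illustration.
from collections import Counter, defaultdict

tagged_corpus = [
    [("the", "DET"), ("dog", "NOUN"), ("barks", "VERB")],
    [("the", "DET"), ("cat", "NOUN"), ("sleeps", "VERB")],
]

# Count how often each word occurs with each tag.
counts = defaultdict(Counter)
for sentence in tagged_corpus:
    for word, word_tag in sentence:
        counts[word][word_tag] += 1

def tag(word, default="NOUN"):
    """Assign a word its most frequent tag; back off to a default."""
    if word in counts:
        return counts[word].most_common(1)[0][0]
    return default

print([(w, tag(w)) for w in ["the", "dog", "sleeps", "fish"]])
# [('the', 'DET'), ('dog', 'NOUN'), ('sleeps', 'VERB'), ('fish', 'NOUN')]
```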

Language, Speech, and Communication
