Combinatory Categorial Grammar (CCG) offers a new approach to the theory of natural language grammar. Coordination, relativization, and related prosodic phenomena have been analyzed in CCG in terms of a radically revised notion of surface structure. CCG surface structures do not exhibit traditional notions of syntactic dominance and command, and do not constitute an autonomous level of representation. Instead, they reflect the computations by which a sentence may be realized or analyzed, synchronously defining a predicate-argument structure, or logical form.
The use of computers to understand words continues to be a burgeoning area of research. Electric Words is the first general survey of and introduction to the entire range of work in lexical linguistics and corpora—the study of such on-line resources as dictionaries and other texts—within the broader fields of natural-language processing and artificial intelligence. The authors integrate and synthesize the goals and methods of computational lexicons in relation to AI's sister disciplines of philosophy, linguistics, and psychology.
Language in Action demonstrates the viability of mathematical research into the foundations of categorial grammar, a topic at the border between logic and linguistics. Since its initial publication it has become the classic work in the foundations of categorial grammar. A new introduction to this paperback edition updates the open research problems and records relevant results through pointers to the literature.
Cognitive Models of Speech Processing presents extensive reviews of current thinking on psycholinguistic and computational topics in speech recognition and natural-language processing, along with a substantial body of new experimental data and computational simulations. Topics range from lexical access and the recognition of words in continuous speech to syntactic processing and the relationship between syntactic and intonational structure.
A Bradford Book. ACL-MIT Press Series in Natural Language Processing
Today, large corpora consisting of hundreds of millions or even billions of words, along with new empirical and statistical methods for organizing and analyzing these data, promise new insights into the use of language. Already, the data extracted from these large corpora reveal that language use is more flexible and complex than most rule-based systems have been able to account for, providing a basis for progress in the performance of Natural Language Processing systems.
Although the theory of object-oriented programming languages is far from complete, this book brings together the most important contributions to its development to date, focusing in particular on how advances in type systems and semantic models can contribute to new language designs.
Eugene Charniak breaks new ground in artificial intelligence research by presenting statistical language processing from an artificial intelligence point of view in a text for researchers and scientists with a traditional computer science background.