
Computational Intelligence

Using Complex Lexical Descriptions in Natural Language Processing

The last decade has seen computational implementations of large hand-crafted natural language grammars in formal frameworks such as Tree-Adjoining Grammar (TAG), Combinatory Categorial Grammar (CCG), Head-driven Phrase Structure Grammar (HPSG), and Lexical Functional Grammar (LFG). Grammars in these frameworks typically associate linguistically motivated rich descriptions (supertags) with words.
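As a rough illustration of the idea (not taken from the book), a CCG-style supertag assignment for a short sentence might look like the sketch below, where the verb's category encodes its full local syntactic context rather than a bare part-of-speech label.

```python
# Hypothetical CCG-style supertags for "John saw Mary".
# Each word receives a rich category describing how it combines with its neighbors.
supertags = {
    "John": "NP",           # noun phrase
    "saw":  "(S\\NP)/NP",   # transitive verb: combines with an NP to its right, then an NP to its left
    "Mary": "NP",           # noun phrase
}
```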

The Mechanization of the Mind

The conceptual history of cognitive science remains for the most part unwritten. In this groundbreaking book, Jean-Pierre Dupuy—one of the principal architects of cognitive science in France—provides an important chapter: the legacy of cybernetics. Contrary to popular belief, Dupuy argues, cybernetics represented not the anthropomorphization of the machine but the mechanization of the human.

A New Reading of 'Representation and Reality'

With mind-brain identity theories no longer dominant in philosophy of mind in the late 1950s, scientific materialists turned to functionalism, the view that the identity of any mental state depends on its function in the cognitive system of which it is a part. The philosopher Hilary Putnam was one of the primary architects of functionalism and was the first to propose computational functionalism, which views the human mind as a computer or an information processor.

In What Is Thought? Eric Baum proposes a computational explanation of thought. Just as Erwin Schrödinger in his classic 1944 work What Is Life? argued ten years before the discovery of DNA that life must be explainable at a fundamental level by physics and chemistry, Baum contends that the present-day inability of computer science to explain thought and meaning is no reason to doubt there can be such an explanation.

Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning.
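As a minimal illustration of the kind of machinery such a treatment covers, the sketch below implements exact GP regression with a squared-exponential kernel using the standard Cholesky-based formulation of the posterior. The toy data, kernel hyperparameters, and noise level are assumptions chosen for the example, not values from the book.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    sqdist = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and variance of exact GP regression with an RBF kernel."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)                                # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))  # alpha = K^{-1} y
    mean = K_s.T @ alpha                                     # posterior mean at test points
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v ** 2, axis=0)             # posterior variance at test points
    return mean, var

# Toy usage: noisy samples of sin(x), predictions on a grid of test inputs.
x_train = np.linspace(0, 5, 8)
y_train = np.sin(x_train) + 0.05 * np.random.randn(8)
x_test = np.linspace(0, 5, 50)
mean, var = gp_posterior(x_train, y_train, x_test)
```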

The Geometry of Thought

Within cognitive science, two approaches currently dominate the problem of modeling representations. The symbolic approach views cognition as computation involving symbolic manipulation. Connectionism, a special case of associationism, models associations using artificial neural networks. Peter Gärdenfors offers his theory of conceptual representations as a bridge between the symbolic and connectionist approaches.

Reasoning about knowledge--particularly the knowledge of agents who reason about the world and each other's knowledge--was once the exclusive province of philosophers and puzzle solvers. More recently, this type of reasoning has been shown to play a key role in a surprising number of contexts, from understanding conversations to the analysis of distributed computer algorithms. Reasoning About Knowledge is the first book to provide a general discussion of approaches to reasoning about knowledge and its applications to distributed systems, artificial intelligence, and game theory.

The Computer Generation of Explanatory Dialogues

Explanation and Interaction describes the problems and issues involved in generating interactive user-sensitive explanations. It presents a particular computational system that generates tutorial, interactive explanations of how simple electronic circuits work. However, the approaches and ideas in the book can be applied to a wide range of computer applications where complex explanations are provided, such as documentation, advisory, and expert systems.

Computational modeling plays a central role in cognitive science. This book provides a comprehensive introduction to computational models of human cognition. It covers major approaches and architectures, both neural network and symbolic; major theoretical issues; and specific computational models of a variety of cognitive processes, ranging from low-level (e.g., attention and memory) to higher-level (e.g., language and reasoning). The articles included in the book provide original descriptions of developments in the field.
