What makes people smarter than computers? These volumes by a pioneering neurocomputing group suggest that the answer lies in the massively parallel architecture of the human mind. They describe a new theory of cognition called connectionism that is challenging the idea of symbolic computation that has traditionally been at the center of debate in theoretical discussions about the mind.
The authors' theory assumes the mind is composed of a great number of elementary units connected in a neural network. Mental processes are interactions between these units, which excite and inhibit one another in parallel rather than through sequential operations. In this context, knowledge can no longer be thought of as stored in localized structures; instead, it consists of the connections between pairs of units distributed throughout the network.
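The flavor of this idea can be sketched in a few lines of code. The snippet below is not the authors' simulator; it is a minimal, hypothetical illustration in which all units update in parallel from the same weighted connections, positive weights excite, and negative weights inhibit.

```python
import numpy as np

# Hypothetical 3-unit network (illustration only, not from the book):
# unit 0 excites unit 1 (weight +0.8), unit 2 inhibits unit 1 (weight -0.5).
# The "knowledge" lives in the weight matrix W, not in any single unit.
W = np.array([[0.0, 0.0,  0.0],
              [0.8, 0.0, -0.5],
              [0.0, 0.0,  0.0]])

def parallel_step(W, a):
    """One synchronous update: every unit computes its new activation
    from the same previous activation vector, in parallel."""
    return np.tanh(W @ a)  # squashing keeps activations bounded

out_exc  = parallel_step(W, np.array([1.0, 0.0, 0.0]))  # excitation only
out_both = parallel_step(W, np.array([1.0, 0.0, 1.0]))  # excitation + inhibition

# Unit 1's activation is pulled down when the inhibitory unit is also active.
print(out_exc[1], out_both[1])
```

Turning the inhibitory unit on lowers unit 1's activation, so the network's behavior emerges from the pattern of connections rather than from any stored symbol.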
Volume 1 lays the foundations of this exciting theory of parallel distributed processing, while Volume 2 applies it to a number of specific issues in cognitive science and neuroscience, with chapters describing models of aspects of perception, memory, language, and thought.
About the Authors
James L. McClelland is Professor of Psychology and Director of the Center for Mind, Brain, and Computation at Stanford University. He is the coauthor of Parallel Distributed Processing (1986) and Semantic Cognition (2004), both published by the MIT Press.
David E. Rumelhart is Professor of Psychology at the University of California, San Diego. With James McClelland, he was awarded the 2002 University of Louisville Grawemeyer Award for Psychology for his work in the field of cognitive neuroscience on a cognitive framework called parallel distributed processing and the concept of connectionism.
"The ideas represented in Parallel Distributed Processing fundamentally challenge the main concepts and assumptions of modern cognitive science."
- James G. Greeno, The New York Times Book Review
"Rumelhart and McClelland propose that what is stored in memory is not specific facts or events, but rather the relationships between the various aspects of those facts or events as they are encoded in groupings of neuronal cells or patterns of cell activity."
- Daniel Goleman, The New York Times
"[This is] a comprehensive compilation of neural network research and development. There are algorithms you can use to explore various methods in the field. If you want information on neural network technology in book form, this is the set to own."
- Artificial Intelligence Special Interest Group Newsletter
"The most intense, most effective and most mind-stretching view of neurocomputing origins, theories and concerns to yet reach print."