James A. Anderson

James A. Anderson is Professor in the Department of Cognitive and Linguistic Sciences at Brown University.

Titles by This Author

An Introduction to Neural Networks falls into a new ecological niche for texts. Based on notes that have been class-tested for more than a decade, it is aimed at cognitive science and neuroscience students who need to understand brain function in terms of computational modeling, and at engineers who want to go beyond formal algorithms to applications and computing strategies. It is the only current text to approach networks from a broad neuroscience and cognitive science perspective, with an emphasis on the biology and psychology behind the assumptions of the models, as well as on what the models might be used for. It describes the mathematical and computational tools needed and provides an account of the author's own ideas.

Students learn how to teach arithmetic to a neural network and get a short course on linear associative memory and adaptive maps. They are introduced to the author's brain-state-in-a-box (BSB) model and are provided with some of the neurobiological background necessary for a firm grasp of the general subject.
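For readers curious what a linear associative memory looks like in practice, the outline below is a minimal sketch assuming the standard outer-product (Hebbian) formulation; the variable names and the orthonormal-input assumption are illustrative choices made here, not code from the book.

    import numpy as np

    # Minimal sketch of a linear associative memory with outer-product (Hebbian)
    # storage. Illustrative only; the names and the orthonormal-input assumption
    # are choices made here, not taken from the text.
    rng = np.random.default_rng(0)

    # Three orthonormal input patterns (columns) and arbitrary output patterns
    # to associate with them.
    F, _ = np.linalg.qr(rng.standard_normal((8, 3)))  # inputs f_i, shape (8, 3)
    G = rng.standard_normal((5, 3))                   # outputs g_i, shape (5, 3)

    # Hebbian storage: W is the sum over i of g_i f_i^T, written as a matrix product.
    W = G @ F.T

    # Recall: with orthonormal inputs, W f_i reproduces g_i exactly.
    print(np.allclose(W @ F[:, 0], G[:, 0]))  # True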

The field now known as neural networks has split in recent years into two major groups, mirrored in the texts that are currently available: the engineers, who are primarily interested in practical applications of the new adaptive, parallel computing technology, and the cognitive scientists and neuroscientists, who are interested in scientific applications. As the gap between these two groups widens, Anderson notes that the academics have tended to drift off into irrelevant, often excessively abstract research, while the engineers have lost contact with the source of ideas in the field. Neuroscience, he points out, provides a rich and valuable source of ideas about data representation, and setting up the data representation is the major part of neural network programming. Both cognitive science and neuroscience give insights into how this can be done effectively: cognitive science suggests what to compute, and neuroscience suggests how to compute it.

Titles by This Editor

An Oral History of Neural Networks


Since World War II, a group of scientists has been attempting to understand the human nervous system and to build computer systems that emulate the brain's abilities. In this collection of interviews, those who helped to shape the field share their childhood memories, their influences, how they became interested in neural networks, and how they envision the field's future.

Prominent in these recollections are Norbert Wiener, Warren McCulloch, Frank Rosenblatt, and other legendary figures responsible for laying the foundations of modern brain theory and cybernetics. The interviewees agree about some things and disagree about more. Together, they tell the story of how science is actually done, including the false starts and the struggle for jobs, resources, and reputation. Although some of the interviews contain technical material, there is no actual mathematics in the book.


Directions for Research

In bringing together seminal articles on the foundations of research, the first volume of Neurocomputing has become an established guide to the background of concepts employed in this burgeoning field. Neurocomputing 2 collects forty-one articles covering network architecture, neurobiological computation, statistics and pattern classification, and problems and applications that suggest important directions for the evolution of neurocomputing.

Foundations of Research

Researchers will find Neurocomputing an essential guide to the concepts employed in this field that have been taken from disciplines as varied as neuroscience, psychology, cognitive science, engineering, and physics. A number of these important historical papers contain ideas that have not yet been fully exploited, while the more recent articles define the current direction of neurocomputing and point to future research. Each article has an introduction that places it in historical and intellectual perspective.

Included among the forty-three articles are the pioneering contributions of McCulloch and Pitts, Hebb, and Lashley; innovative work by von Neumann, Minsky and Papert, Cooper, Grossberg, and Kohonen; and exciting new developments in parallel distributed processing.