The search for origins of communication in a wide variety of species including humans is rapidly becoming a thoroughly interdisciplinary enterprise. In this volume, scientists engaged in the fields of evolutionary biology, linguistics, animal behavior, developmental psychology, philosophy, the cognitive sciences, robotics, and neural network modeling come together to explore a comparative approach to the evolution of communication systems.
The annual Neural Information Processing Systems (NIPS) conference is the flagship meeting on neural computation. It draws a diverse group of attendees—physicists, neuroscientists, mathematicians, statisticians, and computer scientists. The presentations are interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, brain imaging, vision, speech and signal processing, reinforcement learning and control, emerging technologies, and applications.
Evolutionary robotics is a new technique for the automatic creation of autonomous robots. Inspired by the Darwinian principle of selective reproduction of the fittest, it views robots as autonomous artificial organisms that develop their own skills in close interaction with the environment and without human intervention. Drawing heavily on biology and ethology, it uses the tools of neural networks, genetic algorithms, dynamic systems, and biomorphic engineering.
McClelland and Rumelhart's Parallel Distributed Processing was the first book to present a definitive account of the newly revived connectionist/neural net paradigm for artificial intelligence and cognitive science. While Neural Computing Architectures addresses the same issues, there is little overlap in the research it reports. These 18 contributions provide a timely and informative overview and synopsis of both pioneering and recent European connectionist research.
Visual Reconstruction presents a unified and highly original approach to the treatment of continuity in vision. It introduces, analyzes, and illustrates two new concepts. The first—the weak continuity constraint—is a concise, computational formalization of piecewise continuity. It is a mechanism for expressing the expectation that visual quantities such as intensity, surface color, and surface depth vary continuously almost everywhere, but with occasional abrupt changes. The second concept—the graduated nonconvexity algorithm—arises naturally from the first.
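In its simplest one-dimensional form (the "weak string"), the weak continuity constraint can be expressed as an energy that trades off fidelity to the data, smoothness of the reconstruction, and a fixed penalty for each break in continuity. The sketch below is illustrative only; the function name, parameters, and 0/1 break flags are assumptions for this example, not the book's own notation.

```python
def weak_string_energy(u, d, line, lam=1.0, alpha=0.5):
    """Energy of a piecewise-continuous reconstruction u of noisy data d.

    u, d  : sequences of floats (reconstruction and observed data)
    line  : 0/1 flags; line[i] = 1 marks a discontinuity between u[i] and u[i+1]
    lam   : smoothness weight
    alpha : penalty charged for each declared discontinuity
    """
    # Fidelity: how far the reconstruction strays from the data.
    data_term = sum((ui - di) ** 2 for ui, di in zip(u, d))
    # Smoothness: penalize gradients, except across declared breaks.
    smooth_term = sum(
        lam ** 2 * (u[i + 1] - u[i]) ** 2 * (1 - line[i])
        for i in range(len(u) - 1)
    )
    # Each break costs a fixed amount, so breaks are used sparingly.
    break_term = alpha * sum(line)
    return data_term + smooth_term + break_term
```

For a step signal such as `[0, 0, 1, 1]`, declaring a break at the step (cost `alpha`) is cheaper than smoothing across it (cost `lam ** 2`), which is exactly the "continuous almost everywhere, with occasional abrupt changes" behavior the constraint formalizes. Minimizing such an energy is non-convex, which is where the book's graduated nonconvexity algorithm comes in.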
As book review editor of the IEEE Transactions on Neural Networks, Mohamad Hassoun has had the opportunity to assess the multitude of books on artificial neural networks that have appeared in recent years. Now, in Fundamentals of Artificial Neural Networks, he provides the first systematic account of artificial neural network paradigms by identifying clearly the fundamental concepts and major methodologies underlying most of the current theory and practice employed by neural network researchers.
Artificial neural networks (ANNs) offer an efficient method for finding optimal cleanup strategies for hazardous plumes contaminating groundwater: they allow hydrologists to rapidly search millions of candidate strategies for the least expensive and most effective combination of contaminant containment and aquifer restoration. ANNs also provide a faster way to develop systems that classify seismic events as earthquakes or underground explosions.
Motivated by the remarkable fluidity of memory—the way in which items are pulled spontaneously and effortlessly from our memory by vague similarities to what is currently occupying our attention—Sparse Distributed Memory presents a mathematically elegant theory of human long-term memory.