Neural Information Processing Systems

Proceedings of the 2000 Conference

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. The conference is interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, vision, speech and signal processing, reinforcement learning and control, implementations, and diverse applications. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. These proceedings contain all of the papers that were presented at the 2000 conference.

Most practical applications of artificial neural networks are based on a computational model involving the propagation of continuous variables from one processing unit to the next. In recent years, data from neurobiological experiments have made it increasingly clear that biological neural networks, which communicate through pulses, use the timing of the pulses to transmit information and perform computation.

The concept of large margins is a unifying principle for the analysis of many different approaches to the classification of data from examples, including boosting, mathematical programming, neural networks, and support vector machines. The fact that what matters is the margin, or confidence level, of a classification (that is, a scale parameter) rather than the raw training error has become a key tool for analyzing and designing classifiers.
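As a minimal sketch of the distinction drawn above (my own illustration, not from the proceedings): for a linear classifier f(x) = w·x + b with labels in {-1, +1}, the functional margin of an example is y·f(x), which is positive exactly when the example is classified correctly and whose magnitude measures confidence. The weights and toy data here are assumptions chosen for illustration.

```python
import numpy as np

# Hypothetical linear classifier f(x) = w.x + b (illustrative only)
w = np.array([2.0, -1.0])
b = 0.5

X = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])  # toy inputs
y = np.array([1, -1, 1])                            # labels in {-1, +1}

scores = X @ w + b        # raw classifier outputs f(x)
margins = y * scores      # functional margins: positive iff correct

train_error = np.mean(margins <= 0)  # raw 0/1 training error
min_margin = margins.min()           # smallest margin: the confidence level
# Here every margin is positive, so the training error is 0; the margin
# additionally says *how confidently* each point is classified.
```

Two classifiers can both have zero training error yet differ greatly in their minimum margin, which is why margin-based analysis gives sharper guarantees than counting errors alone.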

Proceedings of the 1999 Conference

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. It draws preeminent academic researchers from around the world and is widely considered to be a showcase conference for new developments in network algorithms and architectures. The broad range of interdisciplinary research areas represented includes computer science, neuroscience, statistics, physics, cognitive science, and many branches of engineering, including signal processing and control theory.

Computational finance, an exciting new cross-disciplinary research area, draws extensively on the tools and techniques of computer science, statistics, information systems, and financial economics. This book covers the techniques of data mining, knowledge discovery, genetic algorithms, neural networks, bootstrapping, machine learning, and Monte Carlo simulation. These methods are applied to a wide range of problems in finance, including risk management, asset allocation, style analysis, dynamic trading and hedging, forecasting, and option pricing.
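As a hedged illustration of one technique named above, a Monte Carlo estimate of a European call option price under the standard geometric-Brownian-motion (Black-Scholes) assumptions might look like the sketch below; all parameter values are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (illustrative only)
S0, K = 100.0, 105.0           # spot price and strike
r, sigma, T = 0.05, 0.2, 1.0   # risk-free rate, volatility, maturity (years)
n_paths = 100_000

# Simulate terminal prices under the risk-neutral GBM dynamics
Z = rng.standard_normal(n_paths)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)

# The option price is the discounted expected payoff, estimated by averaging
payoff = np.maximum(ST - K, 0.0)
price = np.exp(-r * T) * payoff.mean()
```

With enough simulated paths the estimate converges (at rate 1/sqrt(n)) to the closed-form Black-Scholes value; the appeal of Monte Carlo is that the same recipe extends to path-dependent payoffs where no closed form exists.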

Models of Learning, Thinking, and Acting

How does the brain work? How do billions of neurons bring about ideas, sensations, emotions, and actions? Why do children learn faster than elderly people? What can go wrong in perception, thinking, learning, and acting? Scientists now use computer models to help us to understand the most private and human experiences. In The Mind Within the Net, Manfred Spitzer shows how these models can fundamentally change how we think about learning, creativity, thinking, and acting, as well as such matters as schools, retirement homes, politics, and mental disorders.

An Oral History of Neural Networks

Since World War II, a group of scientists has been attempting to understand the human nervous system and to build computer systems that emulate the brain's abilities. Many of the early workers in this field of neural networks came from cybernetics; others came from neuroscience, physics, electrical engineering, mathematics, psychology, even economics. In this collection of interviews, those who helped to shape the field share their childhood memories, their influences, how they became interested in neural networks, and what they see as its future.

Edited by Ian Cloete and J. M. Zurada

Neurocomputing methods are loosely based on a model of the brain as a network of simple interconnected processing elements corresponding to neurons. These methods derive their power from the collective processing of artificial neurons, the chief advantage being that such systems can learn and adapt to a changing environment. In knowledge-based neurocomputing, the emphasis is on the use and representation of knowledge about an application. Explicit modeling of the knowledge represented by such a system remains a major research topic.
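A minimal sketch (my own illustration, not from the book) of a "simple processing element" that learns and adapts: a single threshold neuron trained with the classic perceptron update rule on a toy AND dataset. The dataset, learning rate, and epoch count are assumptions.

```python
import numpy as np

# Toy dataset: logical AND, a linearly separable problem (assumption)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])

w = np.zeros(2)  # adaptive weights of one processing element
b = 0.0
lr = 0.1         # learning rate (assumption)

for _ in range(20):  # a few passes over the data
    for xi, target in zip(X, y):
        pred = int(w @ xi + b > 0)  # threshold activation
        err = target - pred         # learning = correcting errors
        w += lr * err * xi          # adapt weights toward the target
        b += lr * err

preds = [int(w @ xi + b > 0) for xi in X]
```

In knowledge-based neurocomputing the point is to go beyond such opaque learned weights: prior knowledge is encoded into the network, and the knowledge the trained network represents is modeled explicitly.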

Recent advances in motor behavior research rely on detailed knowledge of the characteristics of the neurons and networks that generate motor behavior. At the cellular level, Neurons, Networks, and Motor Behavior describes the computational characteristics of individual neurons and how these characteristics are modified by neuromodulators. At the network and behavioral levels, the volume discusses how network structure is dynamically modulated to produce adaptive behavior. Comparisons of model systems throughout the animal kingdom provide insights into general principles of motor control.

Proceedings of the 1998 Conference

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. It draws preeminent academic researchers from around the world and is widely considered to be a showcase conference for new developments in network algorithms and architectures. The broad range of interdisciplinary research areas represented includes computer science, neuroscience, statistics, physics, cognitive science, and many branches of engineering, including signal processing and control theory.
