Neural Information Processing Systems

In recent years, data from neurobiological experiments have made it increasingly clear that biological neural networks, which communicate through pulses, use the timing of the pulses to transmit information and perform computation. This realization has stimulated significant research on pulsed neural networks, including theoretical analyses and model development, neurobiological modeling, and hardware implementation.
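As a loose illustration of the pulse-timing idea, the sketch below simulates a leaky integrate-and-fire neuron in Python; it is a generic textbook-style model with illustrative constants, not an example taken from the book. The neuron integrates its input until the membrane potential crosses a threshold, emits a pulse, and resets, so what it passes downstream is a sequence of spike times.

    # Minimal leaky integrate-and-fire neuron: the output is a list of spike
    # times, illustrating that the timing of pulses carries the signal.
    # All constants here are illustrative, not taken from the text.

    def lif_spike_times(input_current, dt=1e-3, tau=0.02, v_thresh=1.0, v_reset=0.0):
        """Return the spike times (in seconds) of a leaky integrate-and-fire neuron."""
        v = 0.0
        spikes = []
        for step, i_in in enumerate(input_current):
            # Leaky integration: the potential decays toward zero and is driven by the input.
            v += dt * (-v / tau + i_in)
            if v >= v_thresh:           # threshold crossing: emit a pulse
                spikes.append(step * dt)
                v = v_reset             # reset after the spike
        return spikes

    if __name__ == "__main__":
        # A constant drive produces regularly spaced pulses; a stronger drive fires earlier and more often.
        print(lif_spike_times([60.0] * 200))
        print(lif_spike_times([90.0] * 200))

Different input currents produce different spike-time sequences, which is the sense in which the timing of pulses, rather than just their average rate, can carry information.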

The concept of large margins is a unifying principle for the analysis of many different approaches to the classification of data from examples, including boosting, mathematical programming, neural networks, and support vector machines. The insight that what matters is the margin, or confidence level, of a classification (a scale parameter) rather than the raw training error has become a key tool for dealing with classifiers. This book shows how this idea applies to both the theoretical analysis and the design of algorithms.
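As a rough sketch of the underlying notion (illustrative notation, not a formula quoted from the book): for a real-valued classifier $f$ and a labeled example $(x_i, y_i)$ with $y_i \in \{-1, +1\}$, the margin is commonly written as

    $\gamma_i = y_i \, f(x_i),$

so that $\gamma_i > 0$ means the example is classified correctly and a larger value indicates higher confidence. The raw training error records only the sign of $\gamma_i$, while large-margin methods also take its magnitude into account.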

Proceedings of the 1999 Conference

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. It draws preeminent academic researchers from around the world and is widely considered to be a showcase conference for new developments in network algorithms and architectures. The broad range of interdisciplinary research areas represented includes computer science, neuroscience, statistics, physics, cognitive science, and many branches of engineering, including signal processing and control theory.

Computational finance, an exciting new cross-disciplinary research area, draws extensively on the tools and techniques of computer science, statistics, information systems, and financial economics. This book covers the techniques of data mining, knowledge discovery, genetic algorithms, neural networks, bootstrapping, machine learning, and Monte Carlo simulation. These methods are applied to a wide range of problems in finance, including risk management, asset allocation, style analysis, dynamic trading and hedging, forecasting, and option pricing.

Models of Learning, Thinking, and Acting

An Oral History of Neural Networks

Since World War II, a group of scientists has been attempting to understand the human nervous system and to build computer systems that emulate the brain's abilities. In this collection of interviews, those who helped to shape the field share their childhood memories, their influences, how they became interested in neural networks, and how they envision the field's future.

Edited by Ian Cloete and J. M. Zurada

Neurocomputing methods are loosely based on a model of the brain as a network of simple interconnected processing elements corresponding to neurons. These methods derive their power from the collective processing of artificial neurons, the chief advantage being that such systems can learn and adapt to a changing environment. In knowledge-based neurocomputing, the emphasis is on the use and representation of knowledge about an application. Explicit modeling of the knowledge represented by such a system remains a major research topic.

Parallel Distributed Perception and Performance
