The process of inductive inference, inferring general laws and principles from particular instances, is the basis of statistical modeling, pattern recognition, and machine learning. The Minimum Description Length (MDL) principle, a powerful method of inductive inference, holds that the best explanation, given a limited set of observed data, is the one that permits the greatest compression of the data: the more we are able to compress the data, the more we learn about the regularities underlying it.
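The core idea can be illustrated with a toy two-part code: a model's total description length is the bits needed to state the model plus the bits needed to encode the data under it. This is a minimal sketch, not the MDL formalism developed in the book; the parameter costs (`model_bits`) are illustrative assumptions.

```python
import math

def description_length(data, p, model_bits):
    """Two-part code length in bits: cost of stating the model
    plus the Shannon code length of the data under a Bernoulli(p) model."""
    data_bits = 0.0
    for x in data:
        prob = p if x == 1 else 1 - p
        data_bits += -math.log2(prob)
    return model_bits + data_bits

# A highly regular sequence: 90 ones followed by 10 zeros.
data = [1] * 90 + [0] * 10

# Model A: a fair coin, p = 0.5 (almost nothing to describe).
cost_a = description_length(data, 0.5, model_bits=1)   # 1 + 100 = 101 bits

# Model B: a fitted coin, p = 0.9 (a few extra bits to state the parameter).
cost_b = description_length(data, 0.9, model_bits=8)   # about 55 bits

# The model that captures the regularity compresses the data better overall,
# so MDL prefers it despite its higher model cost.
assert cost_b < cost_a
```

The trade-off is the point: a richer model costs more bits to describe, but if it captures a genuine regularity, the savings in encoding the data more than pay for it.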
Data mining, or knowledge discovery, has become an indispensable technology for businesses and researchers in many fields. Drawing on work in such areas as statistics, machine learning, pattern recognition, databases, and high performance computing, data mining extracts useful information from the large data sets now available to industry and science. This collection surveys the most recent advances in the field and charts directions for future research.
The biennial International Conference on the Simulation of Adaptive Behavior brings together researchers from ethology, psychology, ecology, artificial intelligence, artificial life, robotics, engineering, and related fields to advance the understanding of behaviors and underlying mechanisms that allow natural and synthetic agents (animats) to adapt and survive in uncertain environments.
Evolutionary robotics is a new technique for the automatic creation of autonomous robots. Inspired by the Darwinian principle of selective reproduction of the fittest, it views robots as autonomous artificial organisms that develop their own skills in close interaction with the environment and without human intervention. Drawing heavily on biology and ethology, it uses the tools of neural networks, genetic algorithms, dynamical systems, and biomorphic engineering.
This book grew out of the Blacks at MIT History Project, whose mission is to document the black presence at MIT. The main body of the text consists of transcripts of more than seventy-five oral history interviews, in which the interviewees assess their MIT experience and reflect on the role of blacks at MIT and beyond. Although most of the interviewees are present or former students, black faculty, administrators, and staff are also represented, as are nonblack faculty and administrators who have had an impact on blacks at MIT.
This book presents, within a conceptually unified theoretical framework, a body of methods that have been developed over the past fifteen years for building and simulating qualitative models of physical systems—bathtubs, tea kettles, automobiles, the physiology of the body, chemical processing plants, control systems, electrical systems—where knowledge of that system is incomplete. The primary tool for this work is the author's QSIM algorithm, which is discussed in detail.
Parallel Algorithms for Regular Architectures is the first book to concentrate exclusively on algorithms and paradigms for programming parallel computers such as the hypercube, mesh, pyramid, and mesh-of-trees. Algorithms are given to solve fundamental tasks such as sorting and matrix operations, as well as problems in the fields of image processing, graph theory, and computational geometry. The first chapter defines the computer models, problems to be solved, and notation that will be used throughout the book.
There is increasing interest in genetic programming by both researchers and professional software developers. These twenty-two invited contributions show how a wide variety of problems across disciplines can be solved using this new paradigm.