Paperback | $30.00 Short | £20.95 | ISBN: 9780262526036 | 496 pp. | 7 x 9 in | 77 b&w illus. | March 2014
Boosting is an approach to machine learning based on the idea of creating a highly accurate predictor by combining many weak and inaccurate “rules of thumb.” A remarkably rich theory has evolved around boosting, with connections to a range of topics, including statistics, game theory, convex optimization, and information geometry. Boosting algorithms have also enjoyed practical success in such fields as biology, vision, and speech processing. At various times in its history, boosting has been perceived as mysterious, controversial, even paradoxical.
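The core idea described above can be illustrated with a minimal AdaBoost-style sketch (an illustration, not code from the book): weak "rules of thumb" are simple threshold tests on a single feature, and each round re-weights the training examples so the next weak rule focuses on the mistakes of the previous ones. The function and variable names here are hypothetical choices for the example.

```python
# Minimal AdaBoost sketch on 1-D data with decision-stump weak learners.
# Illustrative only; names (train_adaboost, predict) are not from the book.
import math

def train_adaboost(xs, ys, rounds=10):
    """xs: list of floats; ys: labels in {-1, +1}.
    Returns a list of weighted stumps (threshold, polarity, alpha)."""
    n = len(xs)
    w = [1.0 / n] * n                       # start with uniform example weights
    ensemble = []
    for _ in range(rounds):
        # Exhaustively find the stump (threshold, polarity) of lowest weighted error.
        best = None
        for t in sorted(set(xs)):
            for pol in (+1, -1):
                err = sum(wi for xi, yi, wi in zip(xs, ys, w)
                          if (pol if xi >= t else -pol) != yi)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = max(err, 1e-10)               # guard against division by zero
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((t, pol, alpha))
        # Re-weight: misclassified examples gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * yi * (pol if xi >= t else -pol))
             for xi, yi, wi in zip(xs, ys, w)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted-majority vote of the weak stumps."""
    score = sum(alpha * (pol if x >= t else -pol) for t, pol, alpha in ensemble)
    return 1 if score >= 0 else -1
```

Each stump alone may be only slightly better than random guessing; the weighted vote over many rounds is what yields an accurate combined predictor.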
This book, written by the inventors of the method, brings together, organizes, simplifies, and substantially extends two decades of research on boosting, presenting both theory and applications in a way that is accessible to readers from diverse backgrounds while also providing an authoritative reference for advanced researchers. With its introductory treatment of all material and its inclusion of exercises in every chapter, the book is appropriate for course use as well.
The book begins with a general introduction to machine learning algorithms and their analysis; then explores the core theory of boosting, especially its ability to generalize; examines some of the myriad other theoretical viewpoints that help to explain and understand boosting; provides practical extensions of boosting for more complex learning problems; and finally presents a number of advanced theoretical topics. Numerous applications and practical illustrations are offered throughout.
About the Authors
Robert E. Schapire is Professor of Computer Science at Princeton University.
Yoav Freund is Professor of Computer Science at the University of California, San Diego.
“This excellent book is a mind-stretcher that should be read and reread, even by nonspecialists.”
“Boosting is, quite simply, one of the best-written books I've read on machine learning...”
—The Bactra Review
“For those who wish to work in the area, it is a clear and insightful view of the subject that deserves a place in the canon of machine learning and on the shelves of those who study it.”
—Giles Hooker, Journal of the American Statistical Association
“Robert Schapire and Yoav Freund made a huge impact in machine and statistical learning with their invention of boosting, which has survived the test of time. There have been lively discussions about alternative explanations of why it works so well, and the jury is still out. This well-balanced book from the 'masters' covers boosting from all points of view, and gives easy access to the wealth of research that this field has produced.”
—Trevor Hastie, Statistics Department, Stanford University
“Boosting has provided a platform for thinking about and designing machine learning algorithms for over 20 years. The simple and elegant idea behind boosting is a 'Mirror of Erised' that researchers view from many different perspectives. This book beautifully ties together these views, using the same limpid style found in Robert Schapire and Yoav Freund's original research papers. It's an important resource for machine learning research.”
—John Lafferty, University of Chicago and Carnegie Mellon University
“An outstanding text, which provides an authoritative, self-contained, broadly accessible and very readable treatment of boosting methods, a widely applied family of machine learning algorithms pioneered by the authors. It nicely covers the spectrum from theory through methodology to applications.”
—Peter Bartlett, University of California, Berkeley
“Boosting is an amazing machine learning algorithm of 'intelligence' with much success in practice. It allows a weak learner to adapt to the data at hand and become 'strong'; it seamlessly integrates statistical estimation and computation. In this book, Robert Schapire and Yoav Freund, two inventors of the field, present multiple, fascinating views of boosting to explain why and how it works.”
—Bin Yu, University of California, Berkeley
Selected as a Best of 2012 by Computing Reviews.