David A. McAllester

David A. McAllester is an Assistant Professor of Computer Science at MIT.


    ONTIC: A Knowledge Representation System for Mathematics

    David A. McAllester

    ONTIC, the interactive system for verifying "natural" mathematical arguments that David McAllester describes in this book, represents a significant change of direction in mechanical deduction, a key area of computer science and artificial intelligence. ONTIC is an interactive theorem prover based on novel forward-chaining inference techniques, and it marks an important advance over earlier systems for checking mathematical arguments such as Automath, Nuprl, and the Boyer-Moore system. The first half of the book provides a high-level description of the ONTIC system and compares it with these and other automated theorem-proving and verification systems. The second half presents a complete formal specification of the inference mechanisms used. McAllester's is the only semi-automated verification system based on classical Zermelo-Fraenkel set theory. It uses object-oriented inference, a unique automated inference mechanism for a syntactic variant of first-order predicate calculus. The book shows how the ONTIC system can be used to check serious proofs, such as a proof of the Stone representation theorem, without expanding them to excessive detail.

    ONTIC: A Knowledge Representation System for Mathematics is included in the Artificial Intelligence series, edited by Patrick Henry Winston and Michael J. Brady.

    • Hardcover $29.95


    Predicting Structured Data

    Gökhan Bakır, Thomas Hofmann, Bernhard Schölkopf, Alexander J. Smola, Ben Taskar, and S.V.N. Vishwanathan

    State-of-the-art algorithms and theory in a novel domain of machine learning, prediction when the output has structure.

    Machine learning develops intelligent computer systems that are able to generalize from previously seen examples. A new domain of machine learning, in which the prediction must satisfy the additional constraints found in structured data, poses one of machine learning's greatest challenges: learning functional dependencies between arbitrary input and output domains. This volume presents and analyzes the state of the art in machine learning algorithms and theory in this novel field. The contributors discuss applications as diverse as machine translation, document markup, computational biology, and information extraction, providing a timely overview of an exciting field.

    Contributors Yasemin Altun, Gökhan Bakır, Olivier Bousquet, Sumit Chopra, Corinna Cortes, Hal Daumé III, Ofer Dekel, Zoubin Ghahramani, Raia Hadsell, Thomas Hofmann, Fu Jie Huang, Yann LeCun, Tobias Mann, Daniel Marcu, David McAllester, Mehryar Mohri, William Stafford Noble, Fernando Pérez-Cruz, Massimiliano Pontil, Marc'Aurelio Ranzato, Juho Rousu, Craig Saunders, Bernhard Schölkopf, Matthias W. Seeger, Shai Shalev-Shwartz, John Shawe-Taylor, Yoram Singer, Alexander J. Smola, Sandor Szedmak, Ben Taskar, Ioannis Tsochantaridis, S.V.N. Vishwanathan, Jason Weston

    • Hardcover $47.00 £38.00
    • Paperback $45.00 £38.00