I. Bernard Cohen

  • Isaac Newton's Natural Philosophy

    Isaac Newton's Natural Philosophy

    Jed Z. Buchwald and I. Bernard Cohen

    Newton studies have undergone radical changes in the last half-century as more of his work has been uncovered and more details of his life and intellectual context have come to light. This volume singles out two strands in recent Newton studies: the intellectual background to Newton's scientific thought and both specific and general aspects of his technical science. The essays make new claims concerning Newton's mathematical methods, experimental investigations, and motivations, as well as the effect that his long presence had on science in England.

    The book is divided into two parts. The essays in part I shed new light on Newton's motivations and the sources of his method. The essays in part II explore Newton's mathematical philosophy and his development of rational mechanics and celestial dynamics. An appendix includes the last paper by Newton biographer Richard S. Westfall, examining some of the ways that mathematics came to be used in the age of Newton in pursuits and domains other than theoretical or rational mechanics.

    • Hardcover $11.75
    • Paperback $25.00
  • Howard Aiken

    Howard Aiken

    Portrait of a Computer Pioneer

    I. Bernard Cohen

    Biography of Howard Aiken, a major figure of the early digital era, by a major historian of science who was also a colleague of Aiken's at Harvard.

    Howard Hathaway Aiken (1900-1973) was a major figure of the early digital era. He is best known for his first machine, the IBM Automatic Sequence Controlled Calculator or Harvard Mark I, conceived in 1937 and put into operation in 1944. But he also made significant contributions to the development of applications for the new machines and to the creation of a university curriculum for computer science.

    This biography of Aiken, by a major historian of science who was also a colleague of Aiken's at Harvard, offers a clear and often entertaining introduction to Aiken and his times. Aiken's Mark I was the most intensely used of the early large-scale, general-purpose automatic digital computers, and it had a significant impact on the machines that followed. Aiken also proselytized for the computer among scientists, scholars, and businesspeople and explored novel applications in data processing, automatic billing, and production control. But his most lasting contribution may have been the students who received degrees under him and then took prominent positions in academia and industry. I. Bernard Cohen argues convincingly for Aiken's significance as a shaper of the computer world in which we now live.

    • Hardcover $60.00
    • Paperback $30.00
  • Makin' Numbers

    Makin' Numbers

    Howard Aiken and the Computer

    I. Bernard Cohen and Gregory W. Welch

    With the cooperation of Robert V. D. Campbell.

    This collection of technical essays and reminiscences is a companion volume to I. Bernard Cohen's biography, Howard Aiken: Portrait of a Computer Pioneer. After an overview by Cohen, Part I presents the first complete publication of Aiken's 1937 proposal for an automatic calculating machine, which was later realized as the Mark I, as well as recollections of Aiken's first two machines by the chief engineer in charge of construction of Mark II, Robert Campbell, and the principal programmer of Mark I, Richard Bloch. Henry Tropp describes Aiken's hostility to the exclusive use of binary numbers in computational systems and his alternative approach.

    Part II contains essays on Aiken's administrative and teaching styles by former students Frederick Brooks and Peter Calingaert and an essay by Gregory Welch on the difficulties Aiken faced in establishing a computer science program at Harvard.

    Part III contains recollections by people who worked or studied with Aiken, including Richard Bloch, Grace Hopper, Anthony Oettinger, and Maurice Wilkes. Henry Tropp provides excerpts from an interview conducted just before Aiken's death. Part IV gathers the most significant of Aiken's own writings. The appendixes give the specifications of Aiken's machines and list his doctoral students and the topics of their dissertations.

    • Hardcover $48.00
    • Paperback $25.00
  • Interactions

    Interactions

    Some Contacts between the Natural Sciences and the Social Sciences

    I. Bernard Cohen

    One of the fruits of the Scientific Revolution was the idea of a social science that would operate in ways comparable to the newly triumphant natural sciences. Thus was set in motion a long and often convoluted chain of two-way interactions that still have implications for both scholarship and public policy. This book, by the dean of American historians of science, offers an excellent historical perspective on these interactions.

    One of the fruits of the Scientific Revolution was the idea of a social science—a science of government, of individual behavior, and of society—that would operate in ways comparable to the newly triumphant natural sciences. Thus was set in motion a long and often convoluted chain of two-way interactions that still have implications for both scholarship and public policy. This book, by the dean of American historians of science, offers an excellent historical perspective on these interactions.

    The core of the book consists of two long essays. The first focuses on the role of analogies as linking factors between the two realms. Examples are drawn from the physics of rational mechanics and energy physics (in relation to marginalist or neoclassical economics) and from the biology of the cell theory (in relation to nineteenth-century sociology). The second essay looks closely at the relations between the natural and the social sciences in the period of the Scientific Revolution.

    The book also includes a record of a series of conversations between the author and Harvey Brooks (Professor of Technology and Public Policy Emeritus at Harvard) that addresses the present-day public policy implications of the historical interactions between the natural and the social sciences. A short but illuminating history of the terms "natural science" and "social science" concludes the book.

    • Hardcover $38.50
    • Paperback $25.00

Contributor

  • Proof, Language, and Interaction

    Proof, Language, and Interaction

    Essays in Honour of Robin Milner

    Gordon Plotkin, Colin P. Stirling, and Mads Tofte

    This collection of original essays reflects the breadth of current research in computer science.

    This collection of original essays reflects the breadth of current research in computer science. Robin Milner, a major figure in the field, has made many fundamental contributions, particularly in theoretical computer science, the theory of programming languages, and functional programming languages. Following a brief biography of Milner, the book contains five sections: Semantic Foundations, Programming Logic, Programming Languages, Concurrency, and Mobility. Together the pieces form a seamless whole, ranging from highly abstract concepts to systems of great utility.

    Contributors Samson Abramsky, J. C. M. Baeten, Sergey Berezin, J. A. Bergstra, Gérard Berry, Lars Birkedal, Gérard Boudol, Edmund Clarke, Pierre Collette, Robert L. Constable, Pierre-Louis Curien, Jaco de Bakker, Uffe H. Engberg, William Ferreira, Fabio Gadducci, Mike Gordon, Robert Harper, Matthew Hennessy, Yoram Hirshfeld, C. A. R. Hoare, Gérard Huet, Paul B. Jackson, Alan S. A. Jeffrey, Somesh Jha, He Jifeng, Cliff B. Jones, Cosimo Laneve, Xinxin Liu, Will Marrero, Faron Moller, Ugo Montanari, Pavel Naumov, Mogens Nielsen, Joachim Parrow, Lawrence C. Paulson, Benjamin C. Pierce, Gordon Plotkin, M. A. Reniers, Amokrane Saïbi, Augusto Sampaio, Davide Sangiorgi, Scott A. Smolka, Eugene W. Stark, Christopher Stone, Mads Tofte, David N. Turner, Juan Uribe, Franck van Breugel, David Walker, Glynn Winskel

    • Hardcover $17.75
  • Competition in Telecommunications

    Competition in Telecommunications

    Jean-Jacques Laffont and Jean Tirole

    Theoretical models based on the assumption that telecommunications is a natural monopoly no longer reflect reality. As a result, policymakers often lack the guidance of economic theorists. Competition in Telecommunications is written in a style accessible to managers, consultants, government officials, and others. Jean-Jacques Laffont and Jean Tirole analyze regulatory reform and the emergence of competition in network industries using the state-of-the-art theoretical tools of industrial organization, political economy, and the economics of incentives.

    The book opens with background information for the reader who is unfamiliar with current issues in the telecommunications industry. The following sections focus on four central aspects of the recent deregulatory movement: the introduction of incentive regulation; one-way access (access given by a local network to the providers of complementary segments, such as long-distance or information services); the special nature of competition in an industry requiring two-way access (whereby competing networks depend on the mutual termination of calls); and universal service, in particular the two leading contenders for the competitively neutral provision of universal service: the use of engineering models to compute subsidies and the design of universal service auctions. The book concludes with a discussion of the Internet and regulatory institutions.

    Copublished with the Center for Economic Studies and the Ifo Institute.

    • Hardcover $30.00
    • Paperback $50.00
  • The Definition of Standard ML, Revised Edition

    The Definition of Standard ML, Revised Edition

    Robin Milner, Robert Harper, David MacQueen, and Mads Tofte

    Standard ML is a general-purpose programming language designed for large projects. This book provides a formal definition of Standard ML for the benefit of all concerned with the language, including users and implementers. Because computer programs are increasingly required to withstand rigorous analysis, it is all the more important that the language in which they are written be defined with full rigor.

    One purpose of a language definition is to establish a theory of meanings upon which the understanding of particular programs may rest. To properly define a programming language, it is necessary to use some form of notation other than a programming language. Given a concern for rigor, mathematical notation is an obvious choice. The authors have defined their semantic objects in mathematical notation that is completely independent of Standard ML. In defining a language one must also define the rules of evaluation precisely—that is, define what meaning results from evaluating any phrase of the language. The definition thus constitutes a formal specification for an implementation. The authors have developed enough of their theory to give sense to their rules of evaluation.

    The Definition of Standard ML is the essential point of reference for Standard ML. Since its publication in 1990, the implementation technology of the language has advanced enormously and the number of users has grown. The revised edition includes a number of new features, omits little-used features, and corrects mistakes of definition.
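    To illustrate what a precise "rule of evaluation" looks like, here is a hedged sketch of a single big-step (natural semantics) rule for function application, in the general spirit of the Definition's dynamic semantics but not its exact notation; E is an environment mapping identifiers to values, and a function evaluates to a closure carrying its defining environment E':

    ```latex
    % A sketch, not the Definition's own formulation: big-step evaluation
    % of an application e1 e2 in environment E.
    \[
    \frac{E \vdash e_1 \Downarrow \langle \lambda x.\,e,\; E' \rangle
          \qquad
          E \vdash e_2 \Downarrow v_2
          \qquad
          E' + \{x \mapsto v_2\} \vdash e \Downarrow v}
         {E \vdash e_1\; e_2 \Downarrow v}
    \]
    ```

    Read bottom-up: to evaluate the application, evaluate e1 to a closure, evaluate the argument e2 to a value, then evaluate the closure body in its captured environment extended with the argument binding. The Definition specifies every phrase of Standard ML with rules of this general character.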

    • Paperback $30.00
  • Dual Labor Markets

    Dual Labor Markets

    A Macroeconomic Perspective

    Gilles Saint-Paul

    The labor market consists of two tiers. Workers in the upper tier enjoy high wages, good benefits, and employment security, and they are often unionized. Workers in the lower tier experience low wages, high turnover, job insecurity, and little chance of promotion. Until now, dual labor market theory has focused mainly on microeconomic factors such as discrimination, poverty, and public welfare. Dual Labor Markets considers the macroeconomic implications of the dual market. The book uses theoretical models derived from the author's research over the past six years to analyze such policy issues as the level and persistence of unemployment, the level of real wages, the accumulation of human capital, and the political viability of labor market reform in the United States and Europe.

    • Hardcover $45.00
  • The Prudential Regulation of Banks

    The Prudential Regulation of Banks

    Mathias Dewatripont and Jean Tirole

    The Prudential Regulation of Banks applies modern economic theory to prudential regulation of financial intermediaries. Dewatripont and Tirole tackle the key problem of providing the right incentives to management in banks by looking at how external intervention by claimholders (holders of equity or debt) affects managerial incentives and how that intervention might ideally be implemented. Their primary focus is the regulation of commercial banks and S&Ls, but many of the implications of their theory are also valid for other intermediaries such as insurance companies, pension funds, and securities funds.

    Observing that the main concern of the regulation of intermediaries is solvency (the relation between equity, debt, and asset riskiness), the authors provide institutional background and develop a case for regulation as performing the monitoring functions (screening, auditing, covenant writing, and intervention) that dispersed depositors are unable or unwilling to perform. They also illustrate the dangers of regulatory failure in a summary of the S&L crisis of the 1980s.

    Following a survey of banking theory, Dewatripont and Tirole develop their model of the capital structure of banks and show how optimal regulation can be achieved using capital adequacy requirements and external intervention when those requirements are violated. They explain how regulation can be designed to minimize the risk of accounting manipulations and to insulate bank managers from macroeconomic shocks that are beyond their control. Finally, they provide a detailed evaluation of existing regulation and of potential alternatives, such as rating agencies, private deposit insurance, and large private depositors. They show that these reforms are, at best, a complement to, rather than a substitute for, the existing regulation, which combines capital ratios with external intervention in case of insolvency.

    The Prudential Regulation of Banks is part of the Walras-Pareto Lectures, from the University of Lausanne.

    • Hardcover $50.00
    • Paperback $30.00
  • Game Theory

    Game Theory

    Drew Fudenberg and Jean Tirole

    This advanced text introduces the principles of noncooperative game theory in a direct and uncomplicated style that will acquaint students with the broad spectrum of the field while highlighting and explaining what they need to know at any given point.

    This advanced text introduces the principles of noncooperative game theory—including strategic form games, Nash equilibria, subgame perfection, repeated games, and games of incomplete information—in a direct and uncomplicated style that will acquaint students with the broad spectrum of the field while highlighting and explaining what they need to know at any given point. The analytic material is accompanied by many applications, examples, and exercises.

    The theory of noncooperative games studies the behavior of agents in any situation where each agent's optimal choice may depend on a forecast of the opponents' choices. "Noncooperative" refers to choices that are based on the participant's perceived self-interest. Although game theory has been applied to many fields, Fudenberg and Tirole focus on the kinds of game theory that have been most useful in the study of economic problems. They also include some applications to political science. The fourteen chapters are grouped in parts that cover static games of complete information, dynamic games of complete information, static games of incomplete information, dynamic games of incomplete information, and advanced topics.

    • Hardcover $105.00
  • Commentary on Standard ML

    Robin Milner and Mads Tofte

    The full mathematical description of the functional programming language ML was given in Milner, Tofte, and Harper's Definition of Standard ML. This companion volume explains in depth the meaning, or semantic theory, of ML. Together, the two volumes provide a complete understanding of the most prominent of a new group of functional programming languages that includes Haskell and Scheme.

    In making the Definition easier to understand, the authors not only explain what ML is, they explain why it is. They present some of the rigorous analysis that supports the Definition, including a selection of theorems that express important properties of the language. The Commentary is also a working document that shows the way in which the specialized theory of ML can contribute to broader research on language design and semantics.

    Contents Preface • Executing a Simple Program • Dynamic Semantics for the Core • Dynamic Semantics for the Modules • Static Semantics for the Core • Type Declarations and Principality • Static Semantics for the Modules • Signature Matching • Elaboration of Functors • Admissible Semantic Objects and Proofs • Elaboration of Signature Expressions • Principal Signatures • Appendixes: Proof of Principality • Identifier Status • Solutions to Exercises • Mistakes and Ambiguities

    • Hardcover $42.00
    • Paperback $21.00
  • The Definition of Standard ML

    Robert Harper, Robin Milner, and Mads Tofte

    This book presents the official, formal definition of the programming language ML, including the rules for grammar and static and dynamic semantics. ML is the best-developed and most prominent of a new group of functional programming languages. On the cutting edge of theoretical computer science, ML embodies the ideas of static typing and polymorphism and has also contributed a number of novel ideas to the design of programming languages.

    Contents Syntax of the Core • Syntax of Modules • Static Semantics for the Core • Static Semantics for Modules • Dynamic Semantics for the Core • Dynamic Semantics for Modules • Programs

    Appendixes: Derived Forms • Full Grammar • The Initial Static Basis • The Initial Dynamic Basis • The Development of ML

    • Hardcover $30.00
    • Paperback $15.00
  • A Manual of Operation for the Automatic Sequence Controlled Calculator

    A Manual of Operation for the Automatic Sequence Controlled Calculator

    Harvard Computation Laboratory

    If the Mark I itself was a milestone in digital computing, so was this Manual: it was one of the first publications to address the fundamental question of how to get a computer to solve problems.

    In the summer of 1944, at a dedication ceremony at Harvard's Cruft Laboratory, one of the world's first automatic digital calculating machines was unveiled to the public. The machine was the Automatic Sequence Controlled Calculator, more commonly known as the Harvard Mark I. The staff of the Harvard Computation Laboratory was unprepared for the interest that news of the machine's dedication touched off, and in response to many inquiries they arranged for the publication of this Manual of Operation.

    If the Mark I itself was a milestone in digital computing, so was this Manual: it was one of the first publications to address the fundamental question of how to get a computer to solve problems. Scattered throughout the book are listings of operation codes that represent sequences of operations the Mark I would carry out: these are among the first examples anywhere of what are now called computer programs. Both this Manual of Operation and the computer it describes reveal the profound transition from an age when computing was something human beings did, with varying degrees of mechanical aids, to one where machines themselves do most of the work.

    A Manual of Operation for the Automatic Sequence Controlled Calculator was originally published in 1946 by Harvard University Press. It is Volume VII in the Charles Babbage Institute reprint series.

    • Hardcover $95.00
    • Paperback $69.00