This is a book about the computer revolution of the mid-twentieth century and the people who made it possible. Unlike most histories of computing, it is not a book about machines, inventors, or entrepreneurs. Instead, it tells the story of the vast but largely anonymous legions of computer specialists--programmers, systems analysts, and other software developers--who transformed the electronic computer from a scientific curiosity into the defining technology of the modern era.
The history of computing could be told as the story of hardware and software, or the story of the Internet, or the story of “smart” hand-held devices, with subplots involving IBM, Microsoft, Apple, Facebook, and Twitter. In this concise and accessible account of the invention and development of digital technology, computer historian Paul Ceruzzi offers a broader and more useful perspective.
Long ago, in 1985, personal computers came in two general categories: the friendly, childish game machine used for fun (exemplified by Atari and Commodore products); and the boring, beige adult box used for business (exemplified by products from IBM). The game machines became fascinating technical and artistic platforms that were of limited real-world utility. The IBM products were all utility, with little emphasis on aesthetics and no emphasis on fun. Into this bifurcated computing environment came the Commodore Amiga 1000.
A Hollywood biopic about the life of computer pioneer Grace Murray Hopper (1906–1992) would go like this: a young professor abandons the ivy-covered walls of academia to serve her country in the Navy after Pearl Harbor and finds herself on the front lines of the computer revolution. She works hard to succeed in the all-male computer industry, is almost brought down by personal problems but survives them, and ends her career as a celebrated elder stateswoman of computing, a heroine to thousands, hailed as the inventor of computer programming.
For much of the first half of the twentieth century, meteorology was more art than science, dependent on an individual forecaster’s lifetime of local experience. In Weather by the Numbers, Kristine Harper tells the story of the transformation of meteorology from a “guessing science” into a sophisticated scientific discipline based on physics and mathematics.
Much of the world’s Internet management and governance takes place in a corridor extending west from Washington, DC, through northern Virginia toward Washington Dulles International Airport. Much of the United States’ military planning and analysis takes place here as well. At the center of that corridor is Tysons Corner—an unincorporated suburban crossroads once dominated by dairy farms and gravel pits.
In the first three and a half years of its existence, Fairchild Semiconductor developed, produced, and marketed the device that would become the fundamental building block of the digital world: the microchip. Founded in 1957 by eight former employees of the Shockley Semiconductor Laboratory, Fairchild created the model for a successful Silicon Valley start-up: intense activity with a common goal, close collaboration, and a quick path to market (Fairchild’s first device hit the market just ten months after the company’s founding).
When we think of the Internet, we generally think of Amazon, Google, Hotmail, Napster, MySpace, and other sites for buying products, searching for information, downloading entertainment, chatting with friends, or posting photographs. In the academic literature about the Internet, however, these uses are rarely covered. The Internet and American Business fills this gap, picking up where most scholarly histories of the Internet leave off--with the commercialization of the Internet established and its effect on traditional business a fact of life.
In The Allure of Machinic Life, John Johnston examines new forms of nascent life that emerge through technical interactions within human-constructed environments--“machinic life”--in the sciences of cybernetics, artificial life, and artificial intelligence. With the development of such research initiatives as the evolution of digital organisms, computer immune systems, artificial protocells, evolutionary robotics, and swarm systems, Johnston argues, machinic life has achieved a complexity and autonomy worthy of study in its own right.
During the Cold War, the field of computing advanced rapidly within a complex institutional context. In Calculating a Natural World, Atsushi Akera describes the complicated interplay of academic, commercial, governmental, and military interests that produced a burst of scientific discovery and technological innovation in 1940s and 1950s America. This was the era of big machines--the computers that made the reputations of IBM and of many academic laboratories--and Akera uses the computer as a historical window on the emerging infrastructure of American scientific and engineering research.