History of Computing

The Story of India's IT Revolution

The rise of the Indian information technology industry is a remarkable economic success story. Software and services exports from India amounted to less than $100 million in 1990, and today come close to $100 billion. But, as Dinesh Sharma explains in The Outsourcer, Indian IT’s success has a long prehistory; it did not begin with software support, or with American firms’ eager recruitment of cheap and plentiful programming labor, or with India’s economic liberalization of the 1990s. The foundations of India’s IT revolution were laid long ago, even before the country’s independence from British rule in 1947, as leading Indian scientists established research institutes that became centers for the development of computer science and technology. The “miracle” of Indian IT is actually a story about the long work of converting skills and knowledge into capital and wealth. With The Outsourcer, Sharma offers the first comprehensive history of the forces that drove India’s IT success.

Sharma describes India’s early development of computer technology, part of the country’s efforts to achieve national self-sufficiency, and shows that excessive state control stifled IT industry growth before economic policy changed in 1991. He traces the rise and fall (and return) of IBM in India and the emergence of pioneering indigenous hardware and software firms. He describes the satellite communication links and state-sponsored, tax-free technology parks that made software-related outsourcing by foreign firms viable, and the tsunami of outsourcing operations at the beginning of the new millennium. It is the convergence of many factors, from the tradition of technical education to the rise of entrepreneurship to advances in communication technology, that has made the spectacular growth of India’s IT industry possible.

Stanford and the Computer Music Revolution

In the 1960s, a team of Stanford musicians, engineers, computer scientists, and psychologists used computing in an entirely novel way: to produce and manipulate sound and create the sonic basis of new musical compositions. This group of interdisciplinary researchers at the nascent Center for Computer Research in Music and Acoustics (CCRMA, pronounced “karma”) helped to develop computer music as an academic field, invent the technologies that underlie it, and usher in the age of digital music. In The Sound of Innovation, Andrew Nelson chronicles the history of CCRMA, tracing its origins in Stanford’s Artificial Intelligence Laboratory through its present-day influence on Silicon Valley and digital music groups worldwide.

Nelson emphasizes CCRMA’s interdisciplinarity, which stimulates creativity at the intersections of fields; its commitment to open sharing and users; and its pioneering commercial engagement. He shows that Stanford’s outsized influence on the emergence of digital music came from the intertwining of these three modes, which brought together diverse supporters with different aims around a field of shared interest. Nelson thus challenges long-standing assumptions about the divisions between art and science, between the humanities and technology, and between academic research and commercial applications, showing how the story of a small group of musicians reveals substantial insights about innovation.

Nelson draws on extensive archival research and dozens of interviews with digital music pioneers; the book’s website provides access to original historic documents and other material.

Turing, Gödel, Church, and Beyond

In the 1930s a series of seminal works published by Alan Turing, Kurt Gödel, Alonzo Church, and others established the theoretical basis for computability. This work, advancing precise characterizations of effective, algorithmic computability, was the culmination of intensive investigations into the foundations of mathematics. In the decades since, the theory of computability has moved to the center of discussions in philosophy, computer science, and cognitive science. In this volume, distinguished computer scientists, mathematicians, logicians, and philosophers consider the conceptual foundations of computability in light of our modern understanding.

Some chapters focus on the pioneering work by Turing, Gödel, and Church, including the Church-Turing thesis and Gödel’s response to Church’s and Turing’s proposals. Other chapters cover more recent technical developments, including computability over the reals; Gödel’s influence on mathematical logic and on recursion theory; and the impact of work by Turing and Emil Post on our theoretical understanding of online and interactive computing. Still other chapters relate computability and complexity to issues in the philosophy of mind, the philosophy of science, and the philosophy of mathematics.

Contributors:
Scott Aaronson, Dorit Aharonov, B. Jack Copeland, Martin Davis, Solomon Feferman, Saul Kripke, Carl J. Posy, Hilary Putnam, Oron Shagrir, Stewart Shapiro, Wilfried Sieg, Robert I. Soare, Umesh V. Vazirani

A Shadow History of the Internet

The vast majority of all email sent every day is spam, a variety of idiosyncratically spelled requests to provide account information, invitations to spend money on dubious products, and pleas to send cash overseas. Most of it is caught by filters before ever reaching an in-box. Where does it come from? As Finn Brunton explains in Spam, it is produced and shaped by many different populations around the world: programmers, con artists, bots and their botmasters, pharmaceutical merchants, marketers, identity thieves, crooked bankers and their victims, cops, lawyers, network security professionals, vigilantes, and hackers. Every time we go online, we participate in the system of spam, with choices, refusals, and purchases the consequences of which we may not understand.

This is a book about what spam is, how it works, and what it means. Brunton provides a cultural history that stretches from pranks on early computer networks to the construction of a global criminal infrastructure. The history of spam, Brunton shows us, is a shadow history of the Internet itself, with spam emerging as the mirror image of the online communities it targets. Brunton traces spam through three epochs: the 1970s to 1995, and the early, noncommercial computer networks that became the Internet; 1995 to 2003, with the dot-com boom, the rise of spam’s entrepreneurs, and the first efforts at regulating spam; and 2003 to the present, with the war of algorithms—spam versus anti-spam. Spam shows us how technologies, from email to search engines, are transformed by unintended consequences and adaptations, and how online communities develop and invent governance for themselves.

Computer Models, Climate Data, and the Politics of Global Warming

Global warming skeptics often fall back on the argument that the scientific case for global warming is all model predictions, nothing but simulation; they warn us that we need to wait for real data, “sound science.” In A Vast Machine Paul Edwards has news for these skeptics: without models, there are no data. Today, no collection of signals or observations—even from satellites, which can “see” the whole planet with a single instrument—becomes global in time and space without passing through a series of data models. Everything we know about the world’s climate we know through models. Edwards offers an engaging and innovative history of how scientists learned to understand the atmosphere—to measure it, trace its past, and model its future.

Women’s Changing Participation in Computing

Today, women earn a relatively low percentage of computer science degrees and hold proportionately few technical computing jobs. Meanwhile, the stereotype of the male “computer geek” seems to be everywhere in popular culture. Few people know that women were a significant presence in the early decades of computing in both the United States and Britain. Indeed, programming in postwar years was considered woman’s work (perhaps in contrast to the more manly task of building the computers themselves). In Recoding Gender, Janet Abbate explores the untold history of women in computer science and programming from the Second World War to the late twentieth century. Demonstrating how gender has shaped the culture of computing, she offers a valuable historical perspective on today’s concerns over women’s underrepresentation in the field.

Abbate describes the experiences of women who worked with the earliest electronic digital computers: Colossus, the wartime codebreaking computer at Bletchley Park outside London, and the American ENIAC, developed to calculate ballistics. She examines postwar methods for recruiting programmers, and the 1960s redefinition of programming as the more masculine “software engineering.” She describes the social and business innovations of two early software entrepreneurs, Elsie Shutt and Stephanie Shirley; and she examines the career paths of women in academic computer science.

Abbate’s account of the bold and creative strategies of women who loved computing work, excelled at it, and forged successful careers will provide inspiration for those working to change gendered computing culture.

Computers, Programmers, and the Politics of Technical Expertise

This is a book about the computer revolution of the mid-twentieth century and the people who made it possible. Unlike most histories of computing, it is not a book about machines, inventors, or entrepreneurs. Instead, it tells the story of the vast but largely anonymous legions of computer specialists--programmers, systems analysts, and other software developers--who transformed the electronic computer from a scientific curiosity into the defining technology of the modern era. As the systems that they built became increasingly powerful and ubiquitous, these specialists became the focus of a series of critiques of the social and organizational impact of electronic computing. To many of their contemporaries, it seemed the “computer boys” were taking over, not just in the corporate setting, but also in government, politics, and society in general.

In The Computer Boys Take Over, Nathan Ensmenger traces the rise to power of the computer expert in modern American society. His rich and nuanced portrayal of the men and women (a surprising number of the “computer boys” were, in fact, female) who built their careers around the novel technology of electronic computing explores issues of power, identity, and expertise that have only become more significant in our increasingly computerized society.

In his recasting of the drama of the computer revolution through the eyes of its principal revolutionaries, Ensmenger reminds us that the computerization of modern society was not an inevitable process driven by impersonal technological or economic imperatives, but was rather a creative, contentious, and, above all, fundamentally human development.

A Concise History

The history of computing could be told as the story of hardware and software, or the story of the Internet, or the story of “smart” hand-held devices, with subplots involving IBM, Microsoft, Apple, Facebook, and Twitter. In this concise and accessible account of the invention and development of digital technology, computer historian Paul Ceruzzi offers a broader and more useful perspective. He identifies four major threads that run throughout all of computing’s technological development: digitization--the coding of information, computation, and control in binary form, ones and zeros; the convergence of multiple streams of techniques, devices, and machines, yielding more than the sum of their parts; the steady advance of electronic technology, as characterized famously by “Moore’s Law”; and the human-machine interface.

Ceruzzi guides us through computing history, telling how a Bell Labs mathematician coined the word “digital” in 1942 (to describe a high-speed method of calculating used in anti-aircraft devices), and recounting the development of the punch card (for use in the 1890 U.S. Census). He describes the ENIAC, built for scientific and military applications; the UNIVAC, the first general purpose computer; and ARPANET, the Internet’s precursor. Ceruzzi’s account traces the world-changing evolution of the computer from a room-size ensemble of machinery to a “minicomputer” to a desktop computer to a pocket-sized smart phone. He describes the development of the silicon chip, which could store ever-increasing amounts of data and enabled ever-decreasing device size. He visits that hotbed of innovation, Silicon Valley, and brings the story up to the present with the Internet, the World Wide Web, and social networking.

The Commodore Amiga

Long ago, in 1985, personal computers came in two general categories: the friendly, childish game machine used for fun (exemplified by Atari and Commodore products), and the boring, beige adult box used for business (exemplified by products from IBM). The game machines became fascinating technical and artistic platforms that were of limited real-world utility. The IBM products were all utility, with little emphasis on aesthetics and no emphasis on fun. Into this bifurcated computing environment came the Commodore Amiga 1000. This personal computer featured a palette of 4,096 colors, unprecedented animation capabilities, four-channel stereo sound, the capacity to run multiple applications simultaneously, a graphical user interface, and powerful processing potential. It was, Jimmy Maher writes in The Future Was Here, the world’s first true multimedia personal computer.

Maher argues that the Amiga’s capacity to store and display color photographs, manipulate video (giving amateurs access to professional tools), and use recordings of real-world sound were the seeds of the digital media future: digital cameras, Photoshop, MP3 players, and even YouTube, Flickr, and the blogosphere. He examines different facets of the platform--from Deluxe Paint to AmigaOS to Cinemaware--in each chapter, creating a portrait of the platform and the communities of practice that surrounded it. Of course, Maher acknowledges, the Amiga was not perfect: the DOS component of the operating system was clunky and ill-matched, for example, and crashes often accompanied multitasking attempts. And Commodore went bankrupt in 1994. But for a few years, the Amiga’s technical qualities were harnessed by engineers, programmers, artists, and others to push back boundaries and transform the culture of computing.

Grace Hopper and the Invention of the Information Age

A Hollywood biopic about the life of computer pioneer Grace Murray Hopper (1906–1992) would go like this: a young professor abandons the ivy-covered walls of academia to serve her country in the Navy after Pearl Harbor and finds herself on the front lines of the computer revolution. She works hard to succeed in the all-male computer industry, is almost brought down by personal problems but survives them, and ends her career as a celebrated elder stateswoman of computing, a heroine to thousands, hailed as the inventor of computer programming. Throughout Hopper’s later years, the popular media told this simplified version of her life story. In Grace Hopper and the Invention of the Information Age, Kurt Beyer reveals a more authentic Hopper, a vibrant and complex woman whose career paralleled the meteoric trajectory of the postwar computer industry.

Both rebellious and collaborative, Hopper was influential in male-dominated military and business organizations at a time when women were encouraged to devote themselves to housework and childbearing. Hopper’s greatest technical achievement was to create the tools that would allow humans to communicate with computers in terms other than ones and zeroes. This advance influenced all future programming and software design and laid the foundation for the development of user-friendly personal computers.
