
History of Technology

Stanford and the Computer Music Revolution

In the 1960s, a team of Stanford musicians, engineers, computer scientists, and psychologists used computing in an entirely novel way: to produce and manipulate sound and create the sonic basis of new musical compositions. This group of interdisciplinary researchers at the nascent Center for Computer Research in Music and Acoustics (CCRMA, pronounced “karma”) helped to develop computer music as an academic field, invent the technologies that underlie it, and usher in the age of digital music. In The Sound of Innovation, Andrew Nelson chronicles the history of CCRMA, tracing it from its origins in Stanford’s Artificial Intelligence Laboratory to its present-day influence on Silicon Valley and digital music groups worldwide.

Nelson emphasizes CCRMA’s interdisciplinarity, which stimulates creativity at the intersections of fields; its commitment to open sharing and users; and its pioneering commercial engagement. He shows that Stanford’s outsized influence on the emergence of digital music came from the intertwining of these three modes, which brought together diverse supporters with different aims around a field of shared interest. Nelson thus challenges long-standing assumptions about the divisions between art and science, between the humanities and technology, and between academic research and commercial applications, showing how the story of a small group of musicians reveals substantial insights about innovation.

Nelson draws on extensive archival research and dozens of interviews with digital music pioneers; the book’s website provides access to original historic documents and other material.

The Story of India's IT Revolution

The rise of the Indian information technology industry is a remarkable economic success story. Software and services exports from India amounted to less than $100 million in 1990, and today come close to $100 billion. But, as Dinesh Sharma explains in The Outsourcer, Indian IT’s success has a long prehistory; it did not begin with software support, or with American firms’ eager recruitment of cheap and plentiful programming labor, or with India’s economic liberalization of the 1990s. The foundations of India’s IT revolution were laid long ago, even before the country’s independence from British rule in 1947, as leading Indian scientists established research institutes that became centers for the development of computer science and technology. The “miracle” of Indian IT is actually a story about the long work of converting skills and knowledge into capital and wealth. With The Outsourcer, Sharma offers the first comprehensive history of the forces that drove India’s IT success.

Sharma describes India’s early development of computer technology, part of the country’s efforts to achieve national self-sufficiency, and shows that excessive state control stifled IT industry growth before economic policy changed in 1991. He traces the rise and fall (and return) of IBM in India and the emergence of pioneering indigenous hardware and software firms. He describes the satellite communication links and state-sponsored, tax-free technology parks that made software-related outsourcing by foreign firms viable, and the tsunami of outsourcing operations at the beginning of the new millennium. It is the convergence of many factors, from the tradition of technical education to the rise of entrepreneurship to advances in communication technology, that has made the spectacular growth of India’s IT industry possible.

How a Box Changes the Way We Think

We live in a world organized around the container. Standardized twenty- and forty-foot shipping containers carry material goods across oceans and over land; provide shelter, office space, and storage capacity; inspire films, novels, metaphors, and paradigms. Today, the TEU (twenty-foot equivalent unit, the standard measure for shipping containers) has become something like a global currency. A container ship, sailing under the flag of one country but owned by a corporation headquartered in another, carrying auto parts from Japan, frozen fish from Vietnam, and rubber ducks from China, offers a vivid representation of the increasing, world-is-flat globalization of the international economy. In The Container Principle, Alexander Klose investigates the principle of the container and its effect on the way we live and think.

Klose explores a series of “container situations” in their historical, political, and cultural contexts. He examines the container as a time capsule, sometimes breaking loose and washing up onshore to display an inventory of artifacts of our culture. He explains the “Matryoshka principle,” explores the history of land-water transport, and charts the three phases of container history. He examines the rise of logistics, the containerization of computing in the form of modularization and standardization, the architecture of container-like housing (citing both Le Corbusier and Malvina Reynolds’s “Little Boxes”), and a range of artistic projects inspired by containers. Containerization, spreading from physical storage to organizational metaphors, Klose argues, signals a change in the fundamental order of thinking and things. It has become a principle.

Patents, HIV/AIDS, and Race

In The Genealogy of a Gene, Myles Jackson uses the story of the CCR5 gene to investigate the interrelationships among science, technology, and society. Mapping the varied “genealogy” of CCR5—intellectual property, natural selection, Big and Small Pharma, human diversity studies, personalized medicine, ancestry studies, and race and genomics—Jackson links a myriad of diverse topics. The history of CCR5 from the 1990s to the present offers a vivid illustration of how intellectual property law has changed the conduct and content of scientific knowledge, and the social, political, and ethical implications of such a transformation.

The CCR5 gene began as a small sequence of DNA, became a patented product of a corporation, and then, when it was found to be an AIDS virus co-receptor with a key role in the immune system, it became part of the biomedical research world—and a potential moneymaker for the pharmaceutical industry. When it was further discovered that a mutation of the gene found in certain populations conferred near-immunity to the AIDS virus, questions about race and genetics arose. Jackson describes these developments in the context of larger issues, including the rise of “biocapitalism,” the patentability of products of nature, the difference between U.S. and European patenting approaches, and the relevance of race and ethnicity to medical research.

The Pre-Chernobyl History of the Soviet Nuclear Industry

The Chernobyl disaster has been variously ascribed to human error, reactor design flaws, and industry mismanagement. Six former Chernobyl employees were convicted of criminal negligence; they defended themselves by pointing to reactor design issues. Other observers blamed the Soviet style of ideologically driven economic and industrial management. In Producing Power, Sonja Schmid draws on interviews with veterans of the Soviet nuclear industry and extensive research in Russian archives as she examines these alternate accounts. Rather than pursue one “definitive” explanation, she investigates how each of these narratives makes sense in its own way and demonstrates that each implies adherence to a particular set of ideas—about high-risk technologies, human-machine interactions, organizational methods for ensuring safety and productivity, and even about the legitimacy of the Soviet state. She also shows how these attitudes shaped, and were shaped by, the Soviet nuclear industry from its very beginnings.

Schmid explains that Soviet experts established nuclear power as a driving force of social, not just technical, progress. She examines the Soviet nuclear industry’s dual origins in weapons and electrification programs, and she traces the emergence of nuclear power experts as a professional community. Schmid also fundamentally reassesses the design choices for nuclear power reactors in the shadow of the Cold War’s arms race.

Schmid’s account helps us understand how and why a complex sociotechnical system broke down. Chernobyl, while unique and specific to the Soviet experience, can also provide valuable lessons for contemporary nuclear projects.

The mechanized assembly line was invented in 1913 and has been in continuous operation ever since. It is the most familiar form of mass production. Both praised as a boon to workers and condemned for exploiting them, it has been celebrated and satirized. (We can still picture Chaplin’s little tramp trying to keep up with a factory conveyor belt.) In America’s Assembly Line, David Nye examines the industrial innovation that made the United States productive and wealthy in the twentieth century.

The assembly line, developed at the Ford Motor Company in 1913 for the mass production of Model Ts, first created and then served an expanding mass market. It also transformed industrial labor. By the 1980s, Japan had reinvented the assembly line as a system of “lean manufacturing”; American industry reluctantly adopted the new approach. Nye describes this evolution and the new global landscape of increasingly automated factories, with fewer industrial jobs in America and questionable working conditions in developing countries. A century after Ford’s pioneering innovation, the assembly line continues to evolve toward more sustainable manufacturing.

A Shadow History of the Internet

The vast majority of all email sent every day is spam, a variety of idiosyncratically spelled requests to provide account information, invitations to spend money on dubious products, and pleas to send cash overseas. Most of it is caught by filters before ever reaching an in-box. Where does it come from? As Finn Brunton explains in Spam, it is produced and shaped by many different populations around the world: programmers, con artists, bots and their botmasters, pharmaceutical merchants, marketers, identity thieves, crooked bankers and their victims, cops, lawyers, network security professionals, vigilantes, and hackers. Every time we go online, we participate in the system of spam, with choices, refusals, and purchases the consequences of which we may not understand.

This is a book about what spam is, how it works, and what it means. Brunton provides a cultural history that stretches from pranks on early computer networks to the construction of a global criminal infrastructure. The history of spam, Brunton shows us, is a shadow history of the Internet itself, with spam emerging as the mirror image of the online communities it targets. Brunton traces spam through three epochs: the 1970s to 1995, and the early, noncommercial computer networks that became the Internet; 1995 to 2003, with the dot-com boom, the rise of spam’s entrepreneurs, and the first efforts at regulating spam; and 2003 to the present, with the war of algorithms—spam versus anti-spam. Spam shows us how technologies, from email to search engines, are transformed by unintended consequences and adaptations, and how online communities develop and invent governance for themselves.

The Cold War period saw a dramatic expansion of state-funded science and technology research. Government and military patronage shaped Cold War technoscientific practices, imposing methods that were project oriented, team based, and subject to national-security restrictions. These changes affected not just the arms race and the space race but also research in agriculture, biomedicine, computer science, ecology, meteorology, and other fields. This volume examines science and technology in the context of the Cold War, considering whether the new institutions and institutional arrangements that emerged globally constrained technoscientific inquiry or offered greater opportunities for it.

The contributors find that whatever the particular science, and whatever the political system in which that science was operating, the knowledge that was produced bore some relation to the goals of the nation-state. These goals varied from nation to nation; weapons research was emphasized in the United States and the Soviet Union, for example, but in France and China scientific independence and self-reliance dominated. The contributors also consider to what extent the changes to science and technology practices in this era were produced by the specific politics, anxieties, and aspirations of the Cold War.

Contributors
Elena Aronova, Erik M. Conway, Angela N. H. Creager, David Kaiser, John Krige, Naomi Oreskes, George Reisch, Sigrid Schmalzer, Sonja D. Schmid, Matthew Shindell, Asif A. Siddiqi, Zuoyue Wang, Benjamin Wilson

Pirates, Protest, and Politics in FM Radio Activism

The United States ushered in a new era of small-scale broadcasting in 2000 when it began issuing low-power FM (LPFM) licenses for noncommercial radio stations around the country. Over the next decade, several hundred of these newly created low-wattage stations took to the airwaves. In Low Power to the People, Christina Dunbar-Hester describes the practices of an activist organization focused on LPFM during this era. Despite its origins as a pirate broadcasting collective, the group eventually shifted toward building and expanding regulatory access to new, licensed stations. These radio activists consciously cast radio as an alternative to digital utopianism, promoting an understanding of electronic media that emphasizes the local community rather than a global audience of Internet users.

Dunbar-Hester focuses on how these radio activists impute emancipatory politics to the “old” medium of radio technology by promoting the idea that “microradio” broadcasting holds the potential to empower ordinary people at the local community level. The group’s methods combine political advocacy with a rare commitment to hands-on technical work with radio hardware, although the activists’ hands-on, inclusive ethos was hampered by persistent issues of race, class, and gender.

Dunbar-Hester’s study of activism around an “old” medium offers broader lessons about how political beliefs are expressed through engagement with specific technologies. It also offers insight into contemporary issues in media policy, insight that is particularly timely as the FCC issues a new round of LPFM licenses.

Matter, Measure, and the Misadventures of Precision

When architects draw even brick walls to six decimal places with software designed to cut lenses, it is clear that the logic that once organized relations between precision and material error in construction has unraveled. Precision, already a promiscuous term, seems now to have been uncoupled from its contract with truthfulness. Meanwhile error, and the always-political space of its dissent, has reconfigured itself.

In The Architecture of Error, Francesca Hughes argues that behind the architect’s acute fetishization of redundant precision lies a special fear of physical error. What if we were to consider the pivotal cultural and technological transformations of modernism to have been driven not so much by the causes its narratives declare, she asks, as by an unspoken horror of loss of control over error, material life, and everything that matter stands for? Hughes traces the rising intolerance of material vagaries, from the removal of ornament to digitalized fabrication, that produced the blind rejection of organic materials, the proliferation of material testing, and the rhetorical obstacles that blighted cybernetics. Why is it, she asks, that the more we cornered physical error, the more we feared it?

Hughes’s analysis of redundant precision exposes an architecture of fear whose politics must be called into question. Proposing error as a new category for architectural thought, Hughes draws on other disciplines and practices that have interrogated precision and failure, citing the work of scientists Nancy Cartwright and Evelyn Fox Keller and visual artists Gordon Matta-Clark, Barbara Hepworth, Rachel Whiteread, and others. These non-architect practitioners, she argues, show that error need not be excluded and precision can be made accountable.
