The Early American Daguerreotype

The daguerreotype, invented in France, came to America in 1839. It was, as Sarah Kate Gillespie's book The Early American Daguerreotype shows, something wholly and remarkably new: a product of science and innovative technology that resulted in a visual object. We're celebrating World Photo Day with an excerpt from The Early American Daguerreotype.

Originally a French invention, daguerreotyping—a photographic process that produces extremely detailed images—reached American shores in the fall of 1839. A daguerreotype is a direct-positive image on a silvered copper plate. Historically, the plate was polished until it had a mirror-like surface, then treated with light-sensitive chemicals. The plate was then fitted into a camera and exposed to the subject. Once exposed, the plate was developed above a box of mercury fumes, and the image was fixed in a bath of hyposulfite of soda. The finished product was then washed and dried. Because the surface remained sensitive, it was placed under a plate of glass and usually put in a case.

National Robotics Week: The Technological Singularity

Robotic technologies that we once only saw in the realm of science fiction are quickly becoming reality. Advancements continue to push forward at a dizzying rate, demonstrating a broad array of possibilities and uses for the technology. Artificial limbs, healthcare, national security, communication, and even artificial intelligence programs are just some of the ways robotic technologies have been integrated into our everyday lives.

National Robotics Week continues today with an excerpt from Murray Shanahan’s The Technological Singularity. In this book from the Essential Knowledge Series, Shanahan discusses the hypothetical event in which artificial intelligence would be able to adapt itself without human programming—commonly known as the “technological singularity.” The excerpt describes how advanced AI could communicate with humans.

Five Minutes with Tung-Hui Hu

What exactly is the digital cloud? And where did it come from? In A Prehistory of the Cloud, Tung-Hui Hu—a former network engineer and current professor at the University of Michigan—traces its origins and examines the gap between the real and the virtual in our understanding of the cloud.

The Hidden Figures of the British Computer Industry

This Oscars weekend, Hidden Figures, the previously untold story of three brilliant African-American women working at NASA during the Space Race, is a favorite to win Best Picture at the 89th Academy Awards. Marie Hicks's new book Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing tackles similar issues of forgotten history, exploring how Britain lost its early dominance in computing by systematically discriminating against its most qualified workers: women.

Margot Lee Shetterly, the author of Hidden Figures: The American Dream and the Untold Story of the Black Women Mathematicians Who Helped Win the Space Race, on which the movie is based, has praised Programmed Inequality, saying, "Marie Hicks’s well-researched look into Britain’s computer industry, and its critical dependence on the work of female computer programmers, is a welcome addition to our body of knowledge of women’s historical employment in science and technology. Hicks confidently shows that the professional mobility of women in computing supports the success of the industry as a whole, an important lesson for scholars and policymakers seeking ways to improve inclusion in STEM fields.”

In this post, Marie Hicks explains why, even today, possessing technical skill is not enough to ensure that women will rise to the top in science and technology fields, how the disappearance of women from the field had grave macroeconomic consequences for Britain, and why the United States risks repeating those errors in the twenty-first century.

Net Neutrality and Control of the Internet

Two days ago, a federal appeals court upheld an earlier FCC decision to label broadband technology a utility, maintaining net neutrality. Regulating Code authors Ian Brown and Christopher T. Marsden offer their take on the decision.

On June 14, the District of Columbia Court of Appeals upheld the Federal Communications Commission’s (FCC) right to regulate Internet Access Providers (IAPs) as common carriers. This confirms the US agency’s power to regulate IAPs to ensure users can access the content, applications, and services they wish without interference from their access provider: what is known as net neutrality. It is part of a wider international regulatory trend to support user rights and prevent interference with Internet traffic. However, as we showed in Regulating Code (2013), this regulatory trend is counteracted by the controlling tendency of the technologies deployed by the national security state and its private surveillance partners. Our book was published only months prior to Edward Snowden’s revelations about cooperation between the Five Eyes nations (including the USA, Canada, and the UK) and their corporate partners to conduct mass surveillance, and our warnings that net neutrality cannot be disentangled from privacy, surveillance, copyright enforcement, state censorship, and the role of social media have come home to roost.

International Women's Day

It's International Women's Day. Today we celebrate the social, economic, cultural, and political achievements of women. The MIT Press BITS—free excerpts from influential MIT Press books—offers four selections dedicated to honoring the achievements of women in technology.

Big Data and the Future of Entertainment Part 1

Shake off some of that Labor Day rust by checking out the first part of a Q & A with Mike Smith and Rahul Telang, the authors of Streaming, Sharing, Stealing: Big Data and the Future of Entertainment. Their book is about how big data is transforming the creative industries, and how those industries can use lessons from Netflix, Amazon, and Apple to fight back.

National Robotics Week: Developmental Robotics

In our final post in celebration of National Robotics Week, Matthew Schlesinger and Angelo Cangelosi discuss science’s progress toward creating machines that appear to “think,” express themselves, and have authentic experiences like our own. Schlesinger and Cangelosi are authors of Developmental Robotics: From Babies to Robots, a comprehensive overview of robotics that takes direct inspiration from the developmental and learning phenomena observed in children’s cognitive development.

Changing the Face of Computing—One Stitch at a Time

In honor of Ada Lovelace Day, an international celebration of the achievements of women in science, technology, engineering, and math (STEM), Yasmin Kafai and Jane Margolis reflect on the legacy of the British mathematician, who is famously regarded as the first female computer programmer.

Is "Memogate" Really that Surprising?

On August 5th, VICE reported on a controversial document, written by a Google employee and circulated within the company, that espoused sexist beliefs. Jennifer Lieberman, author of Power Lines, weighs in on "memogate," arguing against the widely held belief that technology always leads to progress.

This has been an exciting few weeks for those of us who think critically and historically about technology. My newsfeed has been abuzz with editorials about an event some are now calling “Memogate.” This story concerns a Google employee who wrote a sexist manifesto claiming that women were largely absent from technological fields because of biological differences. This software engineer was subsequently fired and then quickly offered a job by Julian Assange.

“Memogate” has made very public a fact that surprised many, though it was apparent to science and technology studies scholars and to minorities and women working in technical industries: advancements in science and technology have not erased America’s prevailing social biases. Rather, these advancements continue to recapitulate or reinforce existing prejudices. A number of important editorials came out in response to this series of events that address these problems. I recommend recent pieces by The MIT Press’s own Marie Hicks, of Programmed Inequality fame, and by Chanda Prescod-Weinstein on these issues, and here I add my own voice to this conversation.

Five Minutes with Hugh Gusterson

Drones are changing the conduct of war. Advocates say that drones are more precise than conventional bombers, allowing warfare with minimal civilian deaths while keeping American pilots out of harm’s way. Critics say that drones are cowardly and that they often kill innocent civilians while terrorizing entire villages on the ground. In Drone: Remote Control Warfare, Hugh Gusterson looks at the paradoxical mix of closeness and distance involved in remote killing: is it easier than killing someone on the physical battlefield if you have to watch the results onscreen? Here, Gusterson discusses his new book.

How has the use of military drones altered the way that war is conducted?

Traditional definitions of war assume combatants on either side who can kill one another. In drone warfare, one side is now physically absent from the field of combat. This is why some people have said drone warfare is more like hunting than war.

Further, democratically elected leaders have always been aware of a certain risk in going to war: if too many of their own citizens come home in body bags, the country may turn against them (as happened to Presidents Johnson, Nixon, and George W. Bush). But drone warfare, by sparing us the sight of Americans coming home in body bags, offers the possibility of indefinite war without victory, but with very little political cost at home.

World Poetry Day—Code Is Poetry

In honor of World Poetry Day, enjoy an excerpt from 10 PRINT CHR$(205.5+RND(1)); : GOTO 10, which poses the question "where does the poetry of the poem lie?" and shows how "code is poetry" through a close reading of a one-line BASIC program.
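The one-line program builds an endless maze by printing one of two diagonal characters at random on each pass through the loop. Here is a minimal Python sketch of the same idea (ten_print is a hypothetical helper name, and the Unicode diagonals U+2571/U+2572 stand in for the PETSCII glyphs the Commodore 64 prints for codes 205 and 206):

```python
import random

def ten_print(n=80):
    """Return n random diagonal characters, emulating the maze pattern of
    10 PRINT CHR$(205.5+RND(1)); : GOTO 10 on the Commodore 64.
    CHR$(205.5+RND(1)) rounds to PETSCII code 205 or 206 at random;
    here each step picks one of two Unicode diagonals instead."""
    return "".join(random.choice("\u2571\u2572") for _ in range(n))

print(ten_print())
```

Wrapping the print in a loop reproduces the endlessly scrolling maze effect of the original GOTO 10.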

Big Data and the Future of Entertainment Part 2

Last week we posted the first part of a Q & A with Mike Smith and Rahul Telang, coauthors of Streaming, Sharing, Stealing: Big Data and the Future of Entertainment. Here's part 2:

Why has Big Data disrupted the entertainment industry more rapidly and with greater consequence than most other industries?

The two main reasons are access to data and culture. Think about the story Michael Lewis tells in Moneyball. Billy Beane’s decision to replace gut feel decision-making with data-driven decision-making required huge changes in the Oakland A’s organizational culture, and huge innovations in analytics. His leadership changed the game, but it only gave Oakland a year or two of competitive advantage. Everyone else in the league soon caught up.

Kentucky Derby: Off-Track and Online

On Derby Day, Holly Kruse discusses how horse racing has adapted to interactive and social media technologies. She is the author of Off-Track and Online: The Networked Spaces of Horse Racing.

The 142nd Kentucky Derby will be run today, May 7th, 2016, at Churchill Downs in Louisville, Kentucky. The first Derby was run in 1875 and was won by a horse named Aristides. Churchill Downs’ founder, Col. Meriwether Lewis Clark, Jr., was inspired to create the Kentucky Derby after attending the 1872 Epsom Derby in England. Every May since the inaugural event, three-year-old Thoroughbreds have met on the track in Louisville; and since 1886, when the race was shortened from 1½ miles, the race has been run at the “classic” distance of 1¼ miles.

The Migration Crisis in Europe—Making Smart Use of Smartphones

How can technology be used to help address the migration crisis in Europe? Joseph Bock, author of The Technology of Nonviolence, shares thoughts from Greece.

I know lots of people wish they could do something to help migrants who are fleeing war, injustice, or poverty and flooding into Europe. I had the same feeling. I was surprised about three weeks ago to find an email message asking if I wanted to do just that: Would I go to Greece, supported by the Fulbright Foundation in Greece, to help the Municipality of Athens with the largest flow of displaced human beings since World War II?

I’m now in Athens working with a team assembled by Mayor Giorgos Kaminis to develop a plan on how to respond to this crisis. I’ve been giving some thought to how smartphones, social media, and internet-based platforms can be used in this situation. So, in case this might be helpful to people working on these kinds of challenges, here are some approaches to consider:

Our Reliance on Electricity

Jennifer Lieberman, author of Power Lines shares with us some thoughts on our dependency on electricity and what it means in the wake of Hurricane Irma.

A July Day on the Moon

On July 20, 1969, workers called in sick and children stayed home from school. Crowds gathered around televisions in department store windows to watch the Apollo 11 moon landing. Digital Apollo: Human and Machine in Spaceflight by David Mindell examines the design and execution of each of the six Apollo moon landings, drawing on transcripts and data telemetry from the flights, astronaut interviews, and NASA’s extensive archives. In honor of the anniversary of the first moon landing, the following is an excerpt from Digital Apollo that describes the high tension of that fateful day.

On a July day in 1969, after a silent trip around the far side of the moon, the two Apollo spacecraft reappeared out of the shadows and reestablished contact with earth. The command and service module (CSM) (sometimes simply “command module”) was now the mother ship, the capsule and its supplies that would carry the astronauts home. The CSM continued to orbit the moon, with astronaut Michael Collins alone in the capsule. “Listen, babe,” Collins reported to ground controllers at NASA in Houston, “everything’s going just swimmingly. Beautiful.” His two colleagues Neil Armstrong and Edwin “Buzz” Aldrin had just separated the other spacecraft, the fragile, spidery lunar module (LM, pronounced “lem”), nicknamed Eagle, from the command module. This odd, aluminum balloon, packed with instruments and a few engines, would carry the two men down to the lunar surface.

Five Minutes with Phillip Penix-Tadsen

In Cultural Code, Phillip Penix-Tadsen examines Latin America’s gaming practices and the representation of the region’s cultures in games. He discusses his new book and how games have enormous potential for creating immersive and interactive cultural experiences.

Celebrating Ada Lovelace

In honor of Ada Lovelace Day, we look back at last year's essay ("Changing the Face of Computing—One Stitch at a Time") by Yasmin Kafai and Jane Margolis about the legacy of the pioneering British mathematician who became the first computer programmer.

As we celebrate Ada Lovelace Day, we should be reminded that one of the first computers in the nineteenth century, the “Analytical Engine,” was based on the design of the Jacquard loom, for weaving fashionable complex textiles of the times. It was fashion that inspired British mathematician Ada Lovelace to write the code for the loom that wove the complex patterns that were in vogue. She also wrote a most beautiful sentence linking computing and fashion: “We may say most aptly that the Analytical Engine weaves algebraic patterns just as the Jacquard loom weaves flowers and leaves.” And yet, the historical and intimate relationship between fashion and computer science has largely been forgotten and ignored, even as Lovelace’s pioneering spirit lives on today’s runways.

Five Minutes with Benjamin Peters

How Not to Network a Nation: The Uneasy History of the Soviet Internet recounts the Soviet Union’s failed attempts to construct its own Internet during the Cold War. Benjamin Peters discusses his book and considers the implications of the Soviet experience for today’s networked world. 

What was the OGAS project? What role did it play in the development of computer networks?

The OGAS project was the most ambitious attempt to network the Soviet Union—to construct a national computer network. Viktor M. Glushkov, whose New York Times obituary dubbed him the “king of Soviet cybernetics,” considered the OGAS his lifework between his appointment as director of the Institute of Cybernetics in Kiev in 1962 and his death from an apparent brain hemorrhage in 1982. “OGAS” is short for the obshchegosudarstvennaya avtomatizirovannaya sistema—or the all-state automated system, which itself was a shortening of its full train-length name: the All-State Automated System for the Gathering and Processing of Information for the Accounting, Planning, and Governance of the National Economy, USSR. This heroic or gargantuan project, in Glushkov’s 1962 proposal, sought to build incrementally on preexisting and new telephony networks until it would go fully online 30 years later, offering up in the process a real-time decentralized hierarchical computer network for managing all the information flows in the command economy. He envisioned it reaching from one central computer center in Moscow, to several hundred regional computer centers in prominent cities, and then to as many as 20,000 local computing centers in factories and enterprises stretching over all of Soviet Eurasia. Its higher purpose was to realize “electronic socialism” technocratically, guiding the socialist experiment another step toward communism itself. However, the project encountered significant obstacles on the path to its realization in the 1960s and 1970s. By the 1980s, the OGAS project had splintered into a patchwork of unconnected and non-interoperable local factory control systems spread throughout the country.
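The three-tier hierarchy Glushkov envisioned (one central center in Moscow, several hundred regional centers, and as many as 20,000 local centers) can be sketched as a simple tree. The counts below, 200 regional centers with 100 local centers each, are illustrative assumptions consistent with the figures given above, and Center and build_ogas are hypothetical names:

```python
from dataclasses import dataclass, field

@dataclass
class Center:
    """A node in the planned hierarchy: Moscow, a region, or a factory."""
    name: str
    children: list = field(default_factory=list)

def build_ogas(regional=200, local_per_region=100):
    """Build the planned three-tier tree: Moscow -> regions -> factories."""
    moscow = Center("Moscow central computer center")
    for r in range(regional):
        region = Center(f"regional-{r}")
        region.children = [Center(f"local-{r}-{i}") for i in range(local_per_region)]
        moscow.children.append(region)
    return moscow

net = build_ogas()
total_local = sum(len(region.children) for region in net.children)
print(total_local)  # prints 20000, the planned number of local centers
```

The tree makes the scale of the proposal concrete: a single root fanning out to tens of thousands of leaves, each a factory or enterprise feeding data upward through its regional center.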

ENIAC's Birthday

On February 15, 1946, ENIAC (the Electronic Numerical Integrator and Computer), the world's first programmable electronic computer, made its debut. Thomas Haigh, author of the recently released ENIAC in Action, discusses the milestone and its significance.