Computer Science and Intelligent Systems


We turn on the lights in our house from a desk in an office miles away. Our refrigerator alerts us to buy milk on the way home. A package of cookies on the supermarket shelf suggests that we buy it, based on past purchases. The cookies themselves are on the shelf because of a “smart” supply chain. When we get home, the thermostat has already adjusted the temperature so that it’s toasty or bracing, whichever we prefer. This is the Internet of Things—a networked world of connected devices, objects, and people. In this book, Samuel Greengard offers a guided tour through this emerging world and how it will change the way we live and work.

Greengard explains that the Internet of Things (IoT) is still in its early stages. Smartphones, cloud computing, RFID (radio-frequency identification) technology, sensors, and miniaturization are converging to make possible a new generation of embedded and immersive technology. Greengard traces the origins of the IoT from the early days of personal computers and the Internet and examines how it creates the conceptual and practical framework for a connected world. He explores the industrial Internet and machine-to-machine communication, the basis for smart manufacturing and end-to-end supply chain visibility; the growing array of smart consumer devices and services—from Fitbit fitness wristbands to mobile apps for banking; the practical and technical challenges of building the IoT; and the risks of a connected world, including a widening digital divide and threats to privacy and security. Finally, he considers the long-term impact of the IoT on society, narrating an eye-opening “Day in the Life” of IoT connections circa 2025.

Stanford and the Computer Music Revolution

In the 1960s, a team of Stanford musicians, engineers, computer scientists, and psychologists used computing in an entirely novel way: to produce and manipulate sound and create the sonic basis of new musical compositions. This group of interdisciplinary researchers at the nascent Center for Computer Research in Music and Acoustics (CCRMA, pronounced “karma”) helped to develop computer music as an academic field, invent the technologies that underlie it, and usher in the age of digital music. In The Sound of Innovation, Andrew Nelson chronicles the history of CCRMA, tracing its origins in Stanford’s Artificial Intelligence Laboratory through its present-day influence on Silicon Valley and digital music groups worldwide.

Nelson emphasizes CCRMA’s interdisciplinarity, which stimulates creativity at the intersections of fields; its commitment to open sharing and users; and its pioneering commercial engagement. He shows that Stanford’s outsized influence on the emergence of digital music came from the intertwining of these three modes, which brought together diverse supporters with different aims around a field of shared interest. Nelson thus challenges long-standing assumptions about the divisions between art and science, between the humanities and technology, and between academic research and commercial applications, showing how the story of a small group of musicians reveals substantial insights about innovation.

Nelson draws on extensive archival research and dozens of interviews with digital music pioneers; the book’s website provides access to original historic documents and other material.

The Story of India's IT Revolution

The rise of the Indian information technology industry is a remarkable economic success story. Software and services exports from India amounted to less than $100 million in 1990, and today come close to $100 billion. But, as Dinesh Sharma explains in The Outsourcer, Indian IT’s success has a long prehistory; it did not begin with software support, or with American firms’ eager recruitment of cheap and plentiful programming labor, or with India’s economic liberalization of the 1990s. The foundations of India’s IT revolution were laid long ago, even before the country’s independence from British rule in 1947, as leading Indian scientists established research institutes that became centers for the development of computer science and technology. The “miracle” of Indian IT is actually a story about the long work of converting skills and knowledge into capital and wealth. With The Outsourcer, Sharma offers the first comprehensive history of the forces that drove India’s IT success.

Sharma describes India’s early development of computer technology, part of the country’s efforts to achieve national self-sufficiency, and shows that excessive state control stifled IT industry growth before economic policy changed in 1991. He traces the rise and fall (and return) of IBM in India and the emergence of pioneering indigenous hardware and software firms. He describes the satellite communication links and state-sponsored, tax-free technology parks that made software-related outsourcing by foreign firms viable, and the tsunami of outsourcing operations at the beginning of the new millennium. It is the convergence of many factors, from the tradition of technical education to the rise of entrepreneurship to advances in communication technology, that has made the spectacular growth of India’s IT industry possible.

The vast differences between the brain’s neural circuitry and a computer’s silicon circuitry might suggest that they have nothing in common. In fact, as Dana Ballard argues in this book, computational tools are essential for understanding brain function. Ballard shows that the hierarchical organization of the brain has many parallels with the hierarchical organization of computing; as in silicon computing, the complexities of brain computation can be dramatically simplified when its computation is factored into different levels of abstraction.

Drawing on several decades of progress in computational neuroscience, together with recent results in Bayesian and reinforcement learning methodologies, Ballard factors the brain’s principal computational issues in terms of their natural place in an overall hierarchy. Each of these factors leads to a fresh perspective. A neural level focuses on the basic forebrain functions and shows how processing demands dictate the extensive use of timing-based circuitry and an overall organization of tabular memories. An embodiment level organization works in reverse, making extensive use of multiplexing and on-demand processing to achieve fast parallel computation. An awareness level focuses on the brain’s representations of emotion, attention, and consciousness, showing that they can operate with great economy in the context of the neural and embodiment substrates.

We now know that there is much more to classical mechanics than previously suspected. Derivations of the equations of motion, the focus of traditional presentations of mechanics, are just the beginning. This innovative textbook, now in its second edition, concentrates on developing general methods for studying the behavior of classical systems, whether or not they have a symbolic solution. It focuses on the phenomenon of motion and makes extensive use of computer simulation in its explorations of the topic. It weaves recent discoveries in nonlinear dynamics throughout the text, rather than presenting them as an afterthought. Explorations of phenomena such as the transition to chaos, nonlinear resonances, and resonance overlap help the student develop appropriate analytic tools for understanding. The book uses computation to constrain notation, to capture and formalize methods, and for simulation and symbolic analysis. The requirement that the computer be able to interpret any expression provides the student with strict and immediate feedback about whether an expression is correctly formulated.

This second edition has been updated throughout, with revisions that reflect insights gained by the authors from using the text every year at MIT. In addition, because of substantial software improvements, this edition provides algebraic proofs of more generality than those in the previous edition; this improvement permeates the new edition.

Turing, Gödel, Church, and Beyond

In the 1930s a series of seminal works published by Alan Turing, Kurt Gödel, Alonzo Church, and others established the theoretical basis for computability. This work, advancing precise characterizations of effective, algorithmic computability, was the culmination of intensive investigations into the foundations of mathematics. In the decades since, the theory of computability has moved to the center of discussions in philosophy, computer science, and cognitive science. In this volume, distinguished computer scientists, mathematicians, logicians, and philosophers consider the conceptual foundations of computability in light of our modern understanding.

Some chapters focus on the pioneering work by Turing, Gödel, and Church, including the Church-Turing thesis and Gödel’s response to Church’s and Turing’s proposals. Other chapters cover more recent technical developments, including computability over the reals, Gödel’s influence on mathematical logic and on recursion theory, and the impact of work by Turing and Emil Post on our theoretical understanding of online and interactive computing; and others relate computability and complexity to issues in the philosophy of mind, the philosophy of science, and the philosophy of mathematics.

Contributors: Scott Aaronson, Dorit Aharonov, B. Jack Copeland, Martin Davis, Solomon Feferman, Saul Kripke, Carl J. Posy, Hilary Putnam, Oron Shagrir, Stewart Shapiro, Wilfried Sieg, Robert I. Soare, Umesh V. Vazirani

A Shadow History of the Internet

The vast majority of all email sent every day is spam, a variety of idiosyncratically spelled requests to provide account information, invitations to spend money on dubious products, and pleas to send cash overseas. Most of it is caught by filters before ever reaching an inbox. Where does it come from? As Finn Brunton explains in Spam, it is produced and shaped by many different populations around the world: programmers, con artists, bots and their botmasters, pharmaceutical merchants, marketers, identity thieves, crooked bankers and their victims, cops, lawyers, network security professionals, vigilantes, and hackers. Every time we go online, we participate in the system of spam, with choices, refusals, and purchases whose consequences we may not understand.

This is a book about what spam is, how it works, and what it means. Brunton provides a cultural history that stretches from pranks on early computer networks to the construction of a global criminal infrastructure. The history of spam, Brunton shows us, is a shadow history of the Internet itself, with spam emerging as the mirror image of the online communities it targets. Brunton traces spam through three epochs: the 1970s to 1995, and the early, noncommercial computer networks that became the Internet; 1995 to 2003, with the dot-com boom, the rise of spam’s entrepreneurs, and the first efforts at regulating spam; and 2003 to the present, with the war of algorithms—spam versus anti-spam. Spam shows us how technologies, from email to search engines, are transformed by unintended consequences and adaptations, and how online communities develop and invent governance for themselves.

Computing is usually viewed as a technology field that advances at the breakneck speed of Moore’s Law. If we turn away even for a moment, we might miss a game-changing technological breakthrough or an earthshaking theoretical development. This book takes a different perspective, presenting computing as a science governed by fundamental principles that span all technologies. Computer science is a science of information processes. We need a new language to describe the science, and in this book Peter Denning and Craig Martell offer the great principles framework as just such a language. This is a book about the whole of computing—its algorithms, architectures, and designs.

Denning and Martell divide the great principles of computing into six categories: communication, computation, coordination, recollection, evaluation, and design. They begin with an introduction to computing, its history, its many interactions with other fields, its domains of practice, and the structure of the great principles framework. They go on to examine the great principles in different areas: information, machines, programming, computation, memory, parallelism, queueing, and design. Finally, they apply the great principles to networking, the Internet in particular.

Great Principles of Computing will be essential reading for professionals in science and engineering fields with a “computational” branch, for practitioners in computing who want overviews of less familiar areas of computer science, and for non-computer science majors who want an accessible entryway to the field.

The Politics and Aesthetics of Participation in Experience-Centered Design

In Taking [A]part, John McCarthy and Peter Wright consider a series of boundary-pushing research projects in human-computer interaction (HCI) in which the design of digital technology is used to inquire into participative experience. McCarthy and Wright view all of these projects—which range from the public and performative to the private and interpersonal—through the critical lens of participation. Taking participation, in all its variety, as the generative and critical concept allows them to examine the projects as a part of a coherent, responsive movement, allied with other emerging movements in DIY culture and participatory art. Their investigation leads them to rethink such traditional HCI categories as designer and user, maker and developer, researcher and participant, characterizing these relationships instead as mutually responsive and dialogical.

McCarthy and Wright explore four genres of participation—understanding the other, building relationships, belonging in community, and participating in publics—and they examine participatory projects that exemplify each genre. These include the Humanaquarium, a participatory musical performance; the Personhood project, in which a researcher and a couple explored the experience of living with dementia; the Prayer Companion project, which developed a technology to inform the prayer life of cloistered nuns; and the development of social media to support participatory publics in settings that range from reality game show fans to on-line deliberative democracies.

From Babies to Robots

Developmental robotics is a collaborative and interdisciplinary approach to robotics that is directly inspired by the developmental principles and mechanisms observed in children’s cognitive development. It builds on the idea that the robot, using a set of intrinsic developmental principles regulating the real-time interaction of its body, brain, and environment, can autonomously acquire an increasingly complex set of sensorimotor and mental capabilities. This volume, drawing on insights from psychology, computer science, linguistics, neuroscience, and robotics, offers the first comprehensive overview of a rapidly growing field.

After providing some essential background information on robotics and developmental psychology, the book looks in detail at how developmental robotics models and experiments have attempted to realize a range of behavioral and cognitive capabilities. The examples in these chapters were chosen because of their direct correspondence with specific issues in child psychology research; each chapter begins with a concise and accessible overview of relevant empirical and theoretical findings in developmental psychology. The chapters cover intrinsic motivation and curiosity; motor development, examining both manipulation and locomotion; perceptual development, including face recognition and perception of space; social learning, emphasizing such phenomena as joint attention and cooperation; language, from phonetic babbling to syntactic processing; and abstract knowledge, including models of number learning and reasoning strategies. Boxed text offers technical and methodological details for both psychology and robotics experiments.