The New York Times declared 2012 to be “The Year of the MOOC” as millions of students enrolled in massive open online courses (known as MOOCs), millions of investment dollars flowed to the companies making them, and the media declared MOOCs to be earth-shaking game-changers in higher education. During the inevitable backlash that followed, critics highlighted MOOCs’ high dropout rate, the low chance of earning back initial investments, and the potential for any earth-shaking game change to make things worse instead of better.
How did the human mind emerge from the collection of neurons that makes up the brain? How did the brain acquire self-awareness, functional autonomy, language, and the ability to think, to understand itself and the world? In this volume in the Essential Knowledge series, Zoltan Torey offers an accessible and concise description of the evolutionary breakthrough that created the human mind.
Our beliefs constitute a large part of our knowledge of the world. We have beliefs about objects, about culture, about the past, and about the future. We have beliefs about other people, and we believe that they have beliefs as well. We use beliefs to predict, to explain, to create, to console, to entertain. Some of our beliefs we call theories, and we are extraordinarily creative at constructing them. Theories of quantum mechanics, evolution, and relativity are examples. But so are theories about astrology, alien abduction, guardian angels, and reincarnation.
Thinkers have been fascinated by paradox since long before Aristotle grappled with Zeno’s. In this volume in The MIT Press Essential Knowledge series, Margaret Cuonzo explores paradoxes and the strategies used to solve them. She finds that paradoxes are more than mere puzzles: they can prompt new ways of thinking.
In our daily life, it really seems as though we have free will, that what we do from moment to moment is determined by conscious decisions that we freely make. You get up from the couch, you go for a walk, you eat chocolate ice cream. It seems that we’re in control of actions like these; if we are, then we have free will. But in recent years, some have argued that free will is an illusion. The neuroscientist (and best-selling author) Sam Harris and the late Harvard psychologist Daniel Wegner, for example, claim that certain scientific findings disprove free will.
In December 2012, the exuberant video “Gangnam Style” became the first YouTube clip to be viewed more than one billion times. Thousands of its viewers responded by creating and posting their own variations of the video: “Mitt Romney Style,” “NASA Johnson Style,” “Egyptian Style,” and many others.
Ever since the term “crowdsourcing” was coined in 2006 by Wired writer Jeff Howe, group activities ranging from the creation of the Oxford English Dictionary to the choosing of new colors for M&Ms have been labeled with this most buzz-generating of media buzzwords. In this accessible but authoritative account, grounded in the empirical literature, Daren Brabham explains what crowdsourcing is, what it is not, and how it works.
Sitting on the beach on a sunny summer day, we enjoy the steady advance and retreat of the waves. In the water, enthusiastic waders jump and shriek with pleasure when a wave hits them. But where do these waves come from? How are they formed and why do they break on the shore? In Waves, Fredric Raichlen traces the evolution of waves, from their generation in the deep ocean to their effects on the coast.
The Internet lets us share perfect copies of our work with a worldwide audience at virtually no cost. We take advantage of this revolutionary opportunity when we make our work “open access”: digital, online, free of charge, and free of most copyright and licensing restrictions. Open access is made possible by the Internet and copyright-holder consent, but many authors, musicians, filmmakers, and other creators who depend on royalties are understandably unwilling to give their consent.
The history of computing could be told as the story of hardware and software, or the story of the Internet, or the story of “smart” hand-held devices, with subplots involving IBM, Microsoft, Apple, Facebook, and Twitter. In this concise and accessible account of the invention and development of digital technology, computer historian Paul Ceruzzi offers a broader and more useful perspective.
While we have been preoccupied with the latest i-gadget from Apple and with Google's ongoing expansion, we may have missed something: the fundamental transformation of whole firms and industries into giant information-processing machines. Today, more than eighty percent of workers collect and analyze information (often in digital form) in the course of doing their jobs.
Most managers leave intellectual property issues to the legal department, unaware that an organization’s intellectual property can help accomplish a range of management goals, from accessing new markets to improving existing products to generating new revenue streams. In this book, intellectual property expert and Harvard Law School professor John Palfrey offers a short briefing on intellectual property strategy for corporate managers and nonprofit administrators.