In the era of “big data,” science is increasingly information driven, and the potential for computers to store, manage, and integrate massive amounts of data has given rise to such new disciplinary fields as biomedical informatics. Applied ontology offers a strategy for the organization of scientific information in computer-tractable form, drawing on concepts not only from computer and information science but also from linguistics, logic, and philosophy.
Between Humanities and the Digital offers an expansive vision of how the humanities engage with digital and information technology, providing a range of perspectives on a quickly evolving, contested, and exciting field. It documents the multiplicity of ways that humanities scholars have turned increasingly to digital and information technology as both a scholarly tool and a cultural object in need of analysis.
“Big Data” is on the covers of Science, Nature, the Economist, and Wired magazines, on the front pages of the Wall Street Journal and the New York Times. But despite the media hyperbole, as Christine Borgman points out in this examination of data and scholarly research, having the right data is usually better than having more data; little data can be just as valuable as big data. In many cases, there are no data—because relevant data don’t exist, cannot be found, or are not available.
In Knowledge Machines, Eric Meyer and Ralph Schroeder argue that digital technologies have fundamentally changed research practices in the sciences, social sciences, and humanities. Meyer and Schroeder show that digital tools and data, used collectively and in distributed mode—which they term e-research—have transformed not just the consumption of knowledge but also the production of knowledge. Digital technologies for research are reshaping how knowledge advances in disciplines that range from physics to literary analysis.
Maps of physical spaces locate us in the world and help us navigate unfamiliar routes. Maps of topical spaces help us visualize the extent and structure of our collective knowledge; they reveal bursts of activity, pathways of ideas, and borders that beg to be crossed. This book, from the author of Atlas of Science, describes the power of topical maps, providing readers with principles for visualizing knowledge and offering as examples forty large-scale and more than one hundred small-scale full-color maps.
The rise of the Indian information technology industry is a remarkable economic success story. Software and services exports from India amounted to less than $100 million in 1990, and today come close to $100 billion. But, as Dinesh Sharma explains in The Outsourcer, Indian IT’s success has a long prehistory; it did not begin with software support, or with American firms’ eager recruitment of cheap and plentiful programming labor, or with India’s economic liberalization of the 1990s.
In this book, Ronald Day offers a critical history of the modern tradition of documentation. Focusing on the documentary index (understood as a mode of social positioning), and drawing on the work of the French documentalist Suzanne Briet, Day explores the understanding and uses of indexicality. He examines the transition as indexes went from being explicit professional structures that mediated users and documents to being implicit infrastructural devices used in everyday information and communication acts.
The New York Times declared 2012 to be “The Year of the MOOC” as millions of students enrolled in massive open online courses (known as MOOCs), millions of investment dollars flowed to the companies making them, and the media declared MOOCs to be earth-shaking game-changers in higher education. During the inevitable backlash that followed, critics highlighted MOOCs’ high dropout rate, the low chance of earning back initial investments, and the potential for any earth-shaking game change to make things worse instead of better.
The computer systems of government agencies are notoriously complex. New technologies are piled on older technologies, creating layers that call to mind an archaeological dig. Obsolete programming languages and closed mainframe designs offer barriers to integration with other agency systems. Worldwide, these unwieldy systems waste billions of dollars, keep citizens from receiving services, and even—as seen in interoperability failures on 9/11 and during Hurricane Katrina—cost lives.
Through five editions since 1981, this book has offered the most comprehensive, accessible guide available to all aspects of copyright law. Now, with the sixth edition, The Copyright Book has been thoroughly updated to cover copyright for the Internet age, discussing a range of developments in the law since 2000.