
Library Science


University libraries have a long tradition of sharing the information they house among themselves and of making it freely available to scholars generally. This volume extends this tradition to the modern realm of automated library systems by demonstrating how such libraries can collaborate in developing automated systems and by sharing this information with librarians at large.

The Collaborative Library Systems Development (CLSD) project was a joint venture among the Chicago, Columbia, and Stanford University libraries, established in 1968 under a grant from the National Science Foundation. It was formed to provide for an exchange of working data, technical reports, and ideas concerning library automation and information transfer systems among the participating institutions and to coordinate their aims and schedules.

A casual review of the automated systems described here, which are now under development at the Chicago, Columbia, and Stanford libraries, would seem to indicate that each has developed independently, without cognizance of the others. In fact, their differences are complementary and were carefully defined in collaboration; in effect, these differences extend the range of the study in that they allow several quite diverse methods to be subjected to common review. Since 1968, senior technical personnel responsible for systems development in each institution have worked closely together with the objective of testing the feasibility of designing and implementing a common or compatible system. Early in the effort it was established that this specific objective was unrealistic for a variety of technical and logistic reasons, and it was decided that a more achievable objective would be found at a more general design level. Even at this level it was apparent that significant differences existed in philosophy, approach, and scope which could not, and probably should not, be resolved at this stage of library automation development.

The consensus was that the most valuable contributions that these three institutions could make would be to develop individual systems, whose special features could afterward be compared, and which would reflect different yet technically valid approaches to the solution of a common problem. Grossly stated, Stanford's approach is to make the fullest and most innovative use of the on-line, interactive potential of computer technology. At the opposite extreme, Columbia's approach emphasizes using this technology conservatively, stressing off-line, batch-oriented operations. Chicago's approach falls between these two extremes, stressing the use of batched, on-line operations against fully integrated files.

The contributions presented here describe and compare these systems. They are derived from the two CLSD conferences that have been held. All the major papers presented at the New York conference (1970) are included, as are selected papers from the Stanford conference (1968). In addition, there is a paper summarizing the CLSD experience from its inception.

The emergence of the Internet and the wide availability of affordable computing equipment have created tremendous interest in digital libraries and electronic publishing. This book is the first to provide an integrated overview of the field, including a historical perspective, the state of the art, and current research.

The term "digital libraries" covers the creation and distribution of all types of information over networks, ranging from converted historical materials to kinds of information that have no analogues in the physical world. In some ways digital libraries and traditional libraries are very different, yet in other ways they are remarkably similar. People still create information that has to be organized, stored, and distributed, and they still need to find and use information that others have created. An underlying theme of this book is that no aspect of digital libraries can be understood in isolation or without attention to the needs of the people who create and use information. Although the book covers a wide range of technical, economic, social, and organizational topics, the focus is on the actual working components of a digital library.

The American Public Library in the Information Age

Quintessentially American institutions, symbols of community spirit and the American faith in education, public libraries are ubiquitous in the United States. Close to a billion library visits are made each year, and more children join summer reading programs than Little League baseball. Public libraries are local institutions, as different as the communities they serve. Yet their basic services, techniques, and professional credo are essentially similar; and they offer, through technology and cooperative agreements, myriad materials and information far beyond their own walls.

In Civic Space/Cyberspace, Redmond Kathleen Molz and Phyllis Dain assess the current condition and direction of the American public library. They consider the challenges and opportunities presented by new electronic technologies, changing public policy, fiscal realities, and cultural trends. They draw on site visits and interviews conducted across the country; extensive reading of reports, surveys, and other documents; and their long-standing interest in the library's place in the social and civic structure. The book uniquely combines a scholarly, humanistic, and historical approach to public libraries with a clear-eyed look at their problems and prospects, including their role in the emerging national information infrastructure.

Until recently, information systems were designed around different business functions, such as accounts payable and inventory control. Object-oriented modeling, in contrast, structures systems around the data—the objects—that make up the various business functions. Because information about a particular function is limited to one place—to the object—the system is shielded from the effects of change. Object-oriented modeling also promotes better understanding of requirements, clearer designs, and more easily maintainable systems.
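To make the encapsulation idea concrete, here is a minimal sketch in Python; the class, its method names, and the reorder rule are illustrative assumptions, not code from the book. The point is that a change to how stock levels are stored would be confined to this one class, leaving the rest of the system unaffected.

```python
class InventoryItem:
    """Bundles inventory data with the operations that use it."""

    def __init__(self, sku: str, on_hand: int, reorder_point: int):
        self.sku = sku
        self._on_hand = on_hand            # internal detail; callers go through methods
        self._reorder_point = reorder_point

    def receive(self, quantity: int) -> None:
        """Record a shipment arriving into stock."""
        self._on_hand += quantity

    def issue(self, quantity: int) -> None:
        """Record stock leaving; refuse to go negative."""
        if quantity > self._on_hand:
            raise ValueError(f"only {self._on_hand} units of {self.sku} on hand")
        self._on_hand -= quantity

    def needs_reorder(self) -> bool:
        """A business rule kept next to the data it depends on."""
        return self._on_hand <= self._reorder_point


item = InventoryItem(sku="A-1017", on_hand=12, reorder_point=5)
item.issue(8)
print(item.needs_reorder())  # True: 4 units remain, at or below the reorder point
```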

This book focuses on recent developments in representational and processing aspects of complex data-intensive applications. The chapters cover "hot" topics such as application behavior and consistency, reverse engineering, interoperability and collaboration between objects, and work-flow modeling. Each chapter contains a review of its subject, followed by object-oriented modeling techniques and methodologies that can be applied to real-life applications.

Contributors:
F. Casati, S. Ceri, R. Cicchetti, L. M. L. Delcambre, E. F. Ecklund, D. W. Embley, G. Engels, J. M. Gagnon, R. Godin, M. Gogolla, L. Groenewegen, G. S. Jensen, G. Kappel, B. J. Krämer, S. W. Liddle, R. Missaoui, M. Norrie, M. P. Papazoglou, C. Parent, B. Pernici, P. Poncelet, G. Pozzi, M. Schrefl, R. T. Snodgrass, S. Spaccapietra, M. Stumptner, M. Teisseire, W. J. van den Heuvel, S. N. Woodfield.

Classification and Its Consequences

What do a seventeenth-century mortality table (whose causes of death include "fainted in a bath," "frighted," and "itch"); the identification of South Africans during apartheid as European, Asian, colored, or black; and the separation of machine- from hand-washables have in common? All are examples of classification—the scaffolding of information infrastructures.

In Sorting Things Out, Geoffrey C. Bowker and Susan Leigh Star explore the role of categories and standards in shaping the modern world. In a clear and lively style, they investigate a variety of classification systems, including the International Classification of Diseases, the Nursing Interventions Classification, race classification under apartheid in South Africa, and the classification of viruses and of tuberculosis.

The authors emphasize the role of invisibility in the process by which classification orders human interaction. They examine how categories are made and kept invisible, and how people can change this invisibility when necessary. They also explore systems of classification as part of the built information environment. Much as an urban historian would review highway permits and zoning decisions to tell a city's story, the authors review archives of classification design to understand how decisions have been made. Sorting Things Out has a moral agenda, for each standard and category valorizes some point of view and silences another. Standards and classifications produce advantage or suffering. Jobs are made and lost; some regions benefit at the expense of others. How these choices are made and how we think about that process are at the moral and political core of this work. The book is an important empirical source for understanding the building of information infrastructures.


With the rapid growth of the World Wide Web and electronic information services, information is becoming available on-line at an incredible rate. One result is the oft-decried information overload. No one has time to read everything, yet we often have to make critical decisions based on what we are able to assimilate. The technology of automatic text summarization is becoming indispensable for dealing with this problem. Text summarization is the process of distilling the most important information from a source to produce an abridged version for a particular user or task.
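To make the idea concrete, here is a minimal Python sketch in the frequency-based spirit of the field's classical extractive work (associated with H. P. Luhn, among the contributors below). The function name, stopword list, and scoring heuristic are illustrative assumptions, not material from the book.

```python
import re
from collections import Counter

STOPWORDS = {
    "the", "a", "an", "of", "to", "and", "in", "is", "are", "that",
    "for", "on", "with", "as", "it", "this", "we", "be", "by",
}

def summarize(text: str, num_sentences: int = 2) -> str:
    """Return the highest-scoring sentences, in document order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    content = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
    freq = Counter(content)  # document-wide content-word frequencies

    def score(sentence: str) -> float:
        # Average frequency of a sentence's content words.
        tokens = [w for w in re.findall(r"[a-z']+", sentence.lower()) if w not in STOPWORDS]
        return sum(freq[t] for t in tokens) / len(tokens) if tokens else 0.0

    ranked = sorted(range(len(sentences)), key=lambda i: score(sentences[i]), reverse=True)
    return " ".join(sentences[i] for i in sorted(ranked[:num_sentences]))
```

A sentence whose content words occur frequently across the source scores high and is kept; everything else is dropped. This is the essence of the extractive approach, in contrast to abstractive methods that generate new text.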

Until now there has been no state-of-the-art collection of the most important writings in automatic text summarization. This book presents the key developments in the field in an integrated framework and suggests future research areas. The book is organized into six sections: Classical Approaches, Corpus-Based Approaches, Exploiting Discourse Structure, Knowledge-Rich Approaches, Evaluation Methods, and New Summarization Problem Areas.

Contributors: D. A. Adams, C. Aone, R. Barzilay, E. Bloedorn, B. Boguraev, R. Brandow, C. Buckley, F. Chen, M. J. Chrzanowski, H. P. Edmundson, M. Elhadad, T. Firmin, R. P. Futrelle, J. Gorlinsky, U. Hahn, E. Hovy, D. Jang, K. Sparck Jones, G. M. Kasper, C. Kennedy, K. Kukich, J. Kupiec, B. Larsen, W. G. Lehnert, C. Lin, H. P. Luhn, I. Mani, D. Marcu, M. Maybury, K. McKeown, A. Merlino, M. Mitra, K. Mitze, M. Moens, A. H. Morris, S. H. Myaeng, M. E. Okurowski, J. Pedersen, J. J. Pollock, D. R. Radev, G. J. Rath, L. F. Rau, U. Reimer, A. Resnick, J. Robin, G. Salton, T. R. Savage, A. Singhal, G. Stein, T. Strzalkowski, S. Teufel, J. Wang, B. Wise, A. Zamora.


The Role of Academic Libraries in Teaching, Learning, and Research
Edited by Lawrence Dowler

Gateways to Knowledge is about change, about suspending old ideas without rejecting them and rethinking the purpose of the university and the library. Proponents of the gateway concept—which ties together these fifteen essays by scholars, librarians, and academic administrators—envision the library as a point of access to other library and research resources and, electronically, to resources beyond; as a place for teaching; and as a site for services and support where students and faculty can locate and use the information they need in the form in which they need it.

Struggling to define the library of the future, librarians have too often bolted new technology, programs, and services onto existing library functions. These essays focus instead on how information may be packaged and disseminated in a networked environment, as well as on how to think about the nature and qualities of electronic information.

There are discussions of specific gateway projects, such as the Mann Library at Cornell, the new gateway library at the University of Southern California, and the Information Arcade at the University of Iowa, as well as of "Who Built America?"—one of the most interesting new educational software packages currently available.

Contributors: Anthony Appiah (Harvard University); Steve Brier (City University of New York); Richard DeGennaro (Harvard College); Lawrence Dowler (Harvard College); Billy E. Frye (Emory University); Paul Ginsparg (Los Alamos National Laboratory); Richard Lanham (University of California, Los Angeles); Anita Lowry (University of Iowa); Peter Lyman (University of California, Berkeley); Patrick Manning (Northeastern University); Jan Olsen (Cornell University); Karen Price (Harvard University); Richard Rockwell (University of Michigan); Roy Rosenzweig (George Mason University); John Unsworth (University of Virginia); James Wilkinson (Harvard University).

From Method to Metaphor

Designing Information Technology in the Postmodern Age puts the theoretical discussion of computer systems and information technology on a new footing. Shifting the discourse from its usual rationalistic framework, Richard Coyne shows how the conception, development, and application of computer systems is challenged and enhanced by postmodern philosophical thought. He places particular emphasis on the theory of metaphor, showing how it has more to offer than notions of method and models appropriated from science.

Coyne examines the entire range of contemporary philosophical thinking—including logical positivism, analytic philosophy, pragmatism, phenomenology, critical theory, hermeneutics, and deconstruction—comparing them and showing how they differ in their consequences for design and development issues in electronic communications, computer representation, virtual reality, artificial intelligence, and multimedia. He also probes the claims made about information technology, including its presumptions of control, its so-called radicality, even its ability to make virtual worlds, and shows that many of these claims are poorly founded.

Among the writings Coyne visits are works by Heidegger, Adorno, Benjamin, Gadamer, Derrida, Habermas, Rorty, and Foucault. He relates their views to information technology designers and critics such as Herbert Simon, Alan Kay, Terry Winograd, Hubert Dreyfus, and Joseph Weizenbaum. In particular, Coyne draws extensively from the writing of Martin Heidegger, who has presented one of the most radical critiques of technology to date.

Proceedings of the First National Conference on Artificial Intelligence

AAAI proceedings describe innovative concepts, techniques, perspectives, and observations that present promising research directions in artificial intelligence.

Human Factors in Information Systems

Industry veteran Raymond Nickerson provides an extensive introduction to the information technology revolution that is transforming industrial society. He focuses particularly on the study of person-computer interaction, noting how computers are affecting their users and society as a whole, and describes a variety of ways in which information technology is expected to develop in the foreseeable future.

Nickerson summarizes the development of information technology and discusses many of its applications—in farming, research, education and training, manufacturing, general management, retailing, defense, and elsewhere—that have already had a substantial impact on society. He reviews the human-factors research that has been done and is underway, with special attention to the physical and cognitive interface, including languages, conversational interactions, and the concepts of friendliness and usability.

A Bradford Book
