
Scientific & Engineering Computation

Car Crashes without Cars: Lessons about Simulation Technology and Organizational Change from Automotive Design

Every workday we wrestle with cumbersome and unintuitive technologies. Our response is usually “That’s just the way it is.” Even technology designers and workplace managers believe that certain technological changes are inevitable and that they will bring specific, unavoidable organizational changes. In this book, Paul Leonardi offers a new conceptual framework for understanding why technologies and organizations change as they do and why people think those changes had to occur as they did.

The introduction of high-throughput methods has transformed biology into a data-rich science. Knowledge about biological entities and processes has traditionally been acquired by thousands of scientists through decades of experimentation and analysis. The current abundance of biomedical data is accompanied by the creation and rapid dissemination of new information. Much of this information and knowledge, however, is represented only in text form: in the biomedical literature, lab notebooks, Web pages, and other sources.

Computing: A Concise History

The history of computing could be told as the story of hardware and software, or the story of the Internet, or the story of “smart” hand-held devices, with subplots involving IBM, Microsoft, Apple, Facebook, and Twitter. In this concise and accessible account of the invention and development of digital technology, computer historian Paul Ceruzzi offers a broader and more useful perspective.

This text offers a comprehensive treatment of VHDL and its applications to the design and simulation of real, industry-standard circuits. It focuses on the use of VHDL rather than solely on the language, showing why and how certain types of circuits are inferred from the language constructs and how any of the four simulation categories can be implemented. It makes a rigorous distinction between VHDL for synthesis and VHDL for simulation.

Quantum Computing without Magic: Devices

This text offers an introduction to quantum computing, with a special emphasis on basic quantum physics, experiment, and quantum devices. Unlike many other texts, which tend to emphasize algorithms, Quantum Computing without Magic explains the requisite quantum physics in some depth, and then explains the devices themselves.

The minimum description length (MDL) principle is a powerful method of inductive inference, the basis of statistical modeling, pattern recognition, and machine learning. It holds that the best explanation, given a limited set of observed data, is the one that permits the greatest compression of the data. MDL methods are particularly well-suited for dealing with model selection, prediction, and estimation problems in situations where the models under consideration can be arbitrarily complex, and overfitting the data is a serious concern.
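
To make the compression idea concrete, the classic "crude" two-part form of the principle can be written as follows (a textbook sketch, not necessarily the book's own notation, where L(M) is the code length of a model and L(D | M) is the code length of the data encoded with the model's help):

    % Two-part MDL criterion: minimize the total code length of
    % the model plus the data encoded given that model.
    \hat{M} \;=\; \operatorname*{arg\,min}_{M \in \mathcal{M}} \bigl[\, L(M) + L(D \mid M) \,\bigr]

A richer model shortens L(D | M) but lengthens L(M), so minimizing the sum trades goodness of fit against model complexity and penalizes overfitting automatically.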

Use of Beowulf clusters (collections of off-the-shelf commodity computers programmed to act in concert, resulting in supercomputer performance at a fraction of the cost) has spread far and wide in the computational science community. Many application groups are assembling and operating their own "private supercomputers" rather than relying on centralized computing centers. Such clusters are used in climate modeling, computational biology, astrophysics, and materials science, as well as non-traditional areas such as financial modeling and entertainment.
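
For a flavor of how such clusters are programmed to "act in concert," here is a minimal sketch in the message-passing style (the mpi4py binding and the toy workload are illustrative assumptions on our part; production Beowulf codes are often written in C or Fortran against MPI directly):

    # Each process computes a partial sum; rank 0 combines the results.
    # Run across the cluster with, e.g.: mpiexec -n 4 python sum.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()   # this process's id within the job
    size = comm.Get_size()   # total number of cooperating processes

    # Toy workload: sum the integers 0..999999, split evenly across ranks.
    n = 1_000_000
    chunk = range(rank * n // size, (rank + 1) * n // size)
    partial = sum(chunk)

    total = comm.reduce(partial, op=MPI.SUM, root=0)  # combine on rank 0
    if rank == 0:
        print("total =", total)

Each commodity machine runs the same program on a different slice of the data, which is how a pile of off-the-shelf computers delivers supercomputer-class throughput.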

Scalable Input/Output: Achieving System Balance
Edited by Daniel A. Reed

As we enter the "decade of data," the disparity between the vast amount of data storage capacity (measurable in terabytes and petabytes) and the bandwidth available for accessing it has created an input/output bottleneck that is proving to be a major constraint on the effective use of scientific data for research.
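
A back-of-the-envelope calculation shows the scale of the problem (the figures are illustrative assumptions, not taken from the book):

    % Scanning one petabyte through a single 100 MB/s channel:
    \frac{10^{15}\ \text{bytes}}{10^{8}\ \text{bytes/s}} = 10^{7}\ \text{s} \approx 116\ \text{days}

Keeping pace therefore requires striping data across many disks and nodes so that aggregate bandwidth grows along with capacity, the kind of system balance the title alludes to.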

Randomization is an important tool in the design of algorithms, and the ability of randomization to provide enhanced power is a major research topic in complexity theory. Noam Nisan continues the investigation into the power of randomization and the relationships between randomized and deterministic complexity classes by pursuing the idea of emulating randomness, or pseudorandom generation.
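
Operationally, a pseudorandom generator stretches a short truly random seed into a long string that looks random to bounded observers. The sketch below is a deliberately toy illustration of that seed-stretching interface only; the linear congruential recurrence it uses has none of the hardness properties that constructions like Nisan's rely on:

    # Toy pseudorandom generator: deterministically stretch a short
    # seed into nbits output bits. Illustrates the interface, not the
    # theory behind provably pseudorandom constructions.
    def prg(seed: int, nbits: int) -> str:
        state = seed
        out = []
        for _ in range(nbits):
            # linear congruential step (Numerical Recipes constants)
            state = (1664525 * state + 1013904223) % 2**32
            out.append(str(state >> 31))  # emit the top bit
        return "".join(out)

    print(prg(seed=42, nbits=16))  # 16 "random-looking" bits from one seed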


There is increasing interest in genetic programming by both researchers and professional software developers. These twenty-two invited contributions show how a wide variety of problems across disciplines can be solved using this new paradigm.
