Scientific & Engineering Computation

The Robot in the Garden: Telerobotics and Telepistemology in the Age of the Internet
Edited by Ken Goldberg

The Robot in the Garden initiates a critical theory of telerobotics and introduces telepistemology, the study of knowledge acquired at a distance. Many of our most influential technologies, such as the telescope, telephone, and television, were developed to provide knowledge at a distance. Telerobots, remotely controlled robots, facilitate action at a distance. Specialists use telerobots to actively explore environments such as Mars, the Titanic, and Chernobyl. Military personnel increasingly employ reconnaissance drones and telerobotic missiles.

Computational Modeling and Organizational Theories

An organization is more than the sum of its parts; the individual components that together function as a complex social system can be understood only by analyzing their collective behavior. This book shows how state-of-the-art simulation methods, including genetic algorithms, neural networks, and cellular automata, can be brought to bear on central problems of organizational theory related to the emergence, permanence, and dissolution of hierarchical macrostructures.
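
To give a concrete flavor of such simulations (a minimal sketch of my own, not code from the book), the Python fragment below runs a one-dimensional majority-rule cellular automaton: each agent repeatedly adopts the majority state of its local neighborhood, and stable global blocks emerge from purely local interactions.

    # Toy cellular automaton: agents hold a binary state and adopt the
    # majority state of their three-cell neighborhood each step. Purely
    # local rules produce stable global structure -- a minimal example of
    # collective behavior that cannot be read off any single component.
    import random

    def step(states):
        n = len(states)
        return [1 if states[(i - 1) % n] + states[i] + states[(i + 1) % n] >= 2 else 0
                for i in range(n)]

    random.seed(0)
    states = [random.randint(0, 1) for _ in range(60)]
    for _ in range(10):
        print("".join(str(s) for s in states))
        states = step(states)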

Handbook of Automated Reasoning

Automated reasoning has matured into one of the most advanced areas of computer science. It is used in many areas of the field, including software and hardware verification, logic and functional programming, formal methods, knowledge representation, deductive databases, and artificial intelligence. This handbook presents an overview of the fundamental ideas, techniques, and methods in automated reasoning and its applications. The material covers both theory and implementation.

Advanced Mean Field Methods: Theory and Practice

A major problem in modern probabilistic modeling is the huge computational complexity involved in typical calculations with multivariate probability distributions when the number of random variables is large. Because exact computations are infeasible in such cases and Monte Carlo sampling techniques may reach their limits, there is a need for methods that allow for efficient approximate computations. One of the simplest approximations is based on the mean field method, which has a long history in statistical physics.
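
To make the idea concrete (a standard naive mean-field sketch, not an excerpt from the book), consider an Ising-style model with pairwise couplings J and external fields h. Each variable's neighbors are replaced by their mean values, and the magnetizations m_i are obtained by iterating the self-consistency equations m_i = tanh(h_i + sum_j J_ij m_j):

    # Naive mean-field iteration for a small Ising-style model.
    # The couplings J and fields h are random here purely for illustration.
    import math
    import random

    random.seed(1)
    n = 8
    J = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            J[i][j] = J[j][i] = random.gauss(0, 0.3)   # symmetric couplings
    h = [random.gauss(0, 0.5) for _ in range(n)]

    m = [0.0] * n                     # mean magnetizations
    for sweep in range(200):          # iterate to a fixed point
        for i in range(n):
            m[i] = math.tanh(h[i] + sum(J[i][j] * m[j] for j in range(n)))

    print([round(x, 3) for x in m])

Each sweep costs on the order of n^2 operations, whereas exact marginals would require summing over all 2^n joint configurations, which is the computational gap the approximation is designed to bridge.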

This textbook takes an innovative approach to the teaching of classical mechanics, emphasizing the development of general but practical intellectual tools to support the analysis of nonlinear Hamiltonian systems. The development is organized around a progressively more sophisticated analysis of particular natural systems and weaves examples throughout the presentation. Explorations of phenomena such as the transition to chaos, nonlinear resonances, and resonance overlap help the student develop appropriate analytic tools for understanding.
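
As a taste of the phenomena involved (a rough illustrative sketch of my own, not drawn from the textbook), the fragment below integrates a damped, periodically driven pendulum in a commonly cited chaotic parameter regime and shows two nearby trajectories separating:

    # Driven, damped pendulum: theta'' = -sin(theta) - 0.5*theta' + 1.2*cos(2t/3).
    # These parameter values are a commonly cited chaotic regime; semi-implicit
    # Euler integration is used here purely for simplicity.
    import math

    def integrate(theta, omega, steps=20000, dt=0.005):
        t = 0.0
        for _ in range(steps):
            accel = -math.sin(theta) - 0.5 * omega + 1.2 * math.cos(2.0 * t / 3.0)
            omega += accel * dt        # update velocity first (semi-implicit)
            theta += omega * dt
            t += dt
        return theta

    a = integrate(1.0, 0.0)
    b = integrate(1.0 + 1e-8, 0.0)     # perturb the initial angle slightly
    print(abs(a - b))  # in a chaotic regime the gap grows by orders of magnitude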

What assumptions and methods allow us to turn observations into causal knowledge, and how can even incomplete causal knowledge be used in planning and prediction to influence and control our environment? In this book Peter Spirtes, Clark Glymour, and Richard Scheines address these questions using the formalism of Bayes networks, with results that have been applied in diverse areas of research in the social, behavioral, and physical sciences.
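
A tiny simulation (my illustration, not the authors' example) shows the kind of statistical signature these methods exploit: in a causal chain X -> Y -> Z, X and Z are dependent, but the dependence vanishes once Y is held fixed, and it is precisely such conditional-independence patterns that allow causal structure to be inferred from observational data.

    # Simulate a causal chain X -> Y -> Z with noisy binary variables.
    import random

    random.seed(2)
    samples = []
    for _ in range(100_000):
        x = random.random() < 0.5
        y = random.random() < (0.8 if x else 0.2)   # Y depends on X
        z = random.random() < (0.8 if y else 0.2)   # Z depends only on Y
        samples.append((x, y, z))

    def p_z(cond):
        matching = [s for s in samples if cond(s)]
        return sum(s[2] for s in matching) / len(matching)

    # P(Z | X) differs across the two values of X: marginal dependence.
    print(p_z(lambda s: s[0]), p_z(lambda s: not s[0]))
    # P(Z | Y, X) is roughly equal for both values of X: conditional independence.
    print(p_z(lambda s: s[1] and s[0]), p_z(lambda s: s[1] and not s[0]))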

Performance evaluation and benchmarking are of concern to all computer-related disciplines. A benchmark is a standard program or set of programs that can be run on different computers to give an accurate measure of their performance. This book covers a variety of aspects of computer performance evaluation, with a focus on Standard Performance Evaluation Corporation (SPEC) benchmarks. SPEC is a nonprofit organization whose members represent industry, academia, and other organizations.
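
In miniature, the idea looks like this (a toy sketch, not a SPEC benchmark): fix a deterministic workload, time it with a high-resolution clock, and compare the elapsed time across machines. SPEC itself goes further, reporting scores as the ratio of a reference machine's time to the measured time, so that larger numbers mean faster machines.

    # A toy benchmark: a fixed, deterministic workload timed with a
    # high-resolution clock so results are comparable across machines.
    import time

    def workload():
        total = 0
        for i in range(5_000_000):
            total += i * i
        return total

    start = time.perf_counter()
    workload()
    elapsed = time.perf_counter() - start
    print(f"workload completed in {elapsed:.3f} s")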


Genetic programming is a form of evolutionary computation that evolves programs and program-like executable structures for developing reliable time- and cost-effective applications. It does this by breeding programs over many generations, using the principles of natural selection, sexual recombination, and mutation. This third volume of Advances in Genetic Programming highlights many of the recent technical advances in this increasingly popular field.
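
The core loop is easy to sketch (a toy of my own, not one of the volume's techniques): maintain a population of random expression trees, score each against a target function, keep the fittest, and refill the population with recombined and mutated offspring. Here the "programs" are small arithmetic expressions evolved to fit f(x) = x*x + x + 1:

    # Minimal genetic-programming loop over arithmetic expression trees.
    import random

    random.seed(3)
    OPS = ["+", "-", "*"]

    def random_expr(depth=3):
        if depth == 0 or random.random() < 0.3:
            return random.choice(["x", 1.0])        # leaf: variable or constant
        return (random.choice(OPS), random_expr(depth - 1), random_expr(depth - 1))

    def evaluate(e, x):
        if e == "x":
            return x
        if isinstance(e, float):
            return e
        op, a, b = e
        a, b = evaluate(a, x), evaluate(b, x)
        return a + b if op == "+" else a - b if op == "-" else a * b

    def fitness(e):                                 # lower is better
        return sum((evaluate(e, x) - (x * x + x + 1)) ** 2 for x in range(-5, 6))

    def crossover(a, b):                            # crude subtree swap
        if isinstance(a, tuple) and isinstance(b, tuple) and random.random() < 0.7:
            return (a[0], a[1], b[2])
        return b

    def mutate(e):                                  # occasionally replace a tree
        return random_expr(2) if random.random() < 0.2 else e

    pop = [random_expr() for _ in range(200)]
    for gen in range(30):
        pop.sort(key=fitness)
        survivors = pop[:50]                        # selection
        pop = survivors + [
            mutate(crossover(random.choice(survivors), random.choice(survivors)))
            for _ in range(150)                     # recombination + mutation
        ]

    best = min(pop, key=fitness)
    print(best, fitness(best))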


How to Build a Beowulf: A Guide to the Implementation and Application of PC Clusters

Supercomputing research--the goal of which is to make computers that are ever faster and more powerful--has been at the cutting edge of computer technology since the early 1960s. Until recently, research costs ran into the millions of dollars, and many of the companies that originally made supercomputers are now out of business. The early supercomputers used distributed computing and parallel processing to link processors together in a single machine, often called a mainframe.
