
Theory of Computation


Genetic programming is a form of evolutionary computation that evolves programs and program-like executable structures for developing reliable time- and cost-effective applications. It does this by breeding programs over many generations, using the principles of natural selection, sexual recombination, and mutation. This third volume of Advances in Genetic Programming highlights many of the recent technical advances in this increasingly popular field.
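
To make the breeding cycle concrete, here is a minimal, hypothetical Python sketch of the idea; it is not code from the book, and the target function, population sizes, and function names are all illustrative assumptions. A population of random expression trees is pushed toward the target x**2 + x by fitness-based selection, subtree crossover, and mutation.

import random

OPS = ('+', '-', '*')

def random_tree(depth=3):
    # Grow a random expression tree over the terminals {x, small integers}.
    if depth == 0 or random.random() < 0.3:
        return 'x' if random.random() < 0.5 else random.randint(-2, 2)
    return (random.choice(OPS), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, int):
        return tree
    op, left, right = tree
    a, b = evaluate(left, x), evaluate(right, x)
    return a + b if op == '+' else (a - b if op == '-' else a * b)

def depth(tree):
    return 1 + max(depth(tree[1]), depth(tree[2])) if isinstance(tree, tuple) else 0

def fitness(tree):
    # Lower is better: squared error against the target function x**2 + x.
    # Overly deep trees are rejected outright to keep the sketch fast (bloat control).
    if depth(tree) > 10:
        return float('inf')
    return sum((evaluate(tree, x) - (x * x + x)) ** 2 for x in range(-5, 6))

def crossover(a, b):
    # Sexual recombination: splice a piece of parent b into parent a.
    if not isinstance(a, tuple) or random.random() < 0.3:
        return random.choice(b[1:]) if isinstance(b, tuple) else b
    op, left, right = a
    if random.random() < 0.5:
        return (op, crossover(left, b), right)
    return (op, left, crossover(right, b))

def mutate(tree):
    # Mutation: occasionally replace an individual with a fresh random tree.
    return random_tree(2) if random.random() < 0.2 else tree

population = [random_tree() for _ in range(200)]
for generation in range(30):
    population.sort(key=fitness)          # natural selection: rank by fitness
    parents = population[:50]             # the fittest individuals survive and breed
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(150)]
    population = parents + children

best = min(population, key=fitness)
print('best program:', best, 'error:', fitness(best))

Real genetic programming systems differ in many details (typed trees, more careful bloat control, richer primitive sets), but the select-recombine-mutate loop above is the core mechanism the blurb describes.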

Exploratory Essays in Philosophical Computer Modeling

Philosophical modeling is as old as philosophy itself; examples range from Plato's Cave and the Divided Line to Rawls's original position. What is new are the astounding computational resources now available for philosophical modeling. Although the computer cannot offer a substitute for philosophical thought, it can offer an important new environment for philosophical research.

Computational Models of Institutions and Groups

The globalization of the economy, increasing number of transnational organizations, and rapid changes in robotics, information, and telecommunication technologies are just a few of the factors significantly altering organizational time scales, forms, complexity, and environments. Time scales have shrunk, new organizational forms are emerging, and organizational environments are expanding and mutating at unprecedented rates. Computational modeling affords opportunities to both understand and respond to these complex changes.

Anatomy of a Parallel Computing System

Foreword by Gordon Bell and afterword by H.T. Kung

Although researchers have proposed many mechanisms and theories for parallel systems, only a few have actually resulted in working computing platforms. The iWarp is an experimental parallel system that was designed and built jointly by Carnegie Mellon University and Intel Corporation. The system is based on the idea of integrating a VLIW processor and a sophisticated fine-grained communication system on a single chip. This book describes the complete iWarp system, from instruction-level parallelism to final parallel applications.

Building a computer ten times more powerful than all the networked computing capability in the United States is the subject of this book by leading figures in the high-performance computing community. It summarizes the near-term initiatives, including the technical and policy agendas for what could be a twenty-year effort to build a petaFLOP-scale computer. (A FLOP, or floating-point operation, is a standard measure of computer performance; a petaFLOP computer would perform a million billion, or 10^15, of these operations per second.)

The thirteen chapters written expressly for this book by logicians, theoretical computer scientists, philosophers, and semanticists address, from the perspective of mathematical logic, the problems of understanding and studying the flow of information through any information-processing system.

Formal Models and Semantics
Edited by J. van Leeuwen

Theoretical computer science provides the foundations for understanding and exploiting the concepts and mechanisms in computing and information processing. This handbook offers professionals and students a comprehensive overview of the main results and developments in this rapidly evolving field. It consists of thirty-seven chapters in two volumes, all addressing core areas of theoretical computer science as it is practiced today. The material is written by leading American and European researchers, and each volume may be used independently.

Algorithms and Complexity
Edited by J. van Leeuwen

The Handbook of Theoretical Computer Science provides professionals and students with a comprehensive overview of the main results and developments in this rapidly evolving field. Volume A covers models of computation, complexity theory, data structures, and efficient computation in many recognized subdisciplines of theoretical computer science.

At the 1900 International Congress of Mathematicians, held in Paris, the German mathematician David Hilbert put forth a list of 23 unsolved problems that he saw as the greatest challenges for twentieth-century mathematics. Hilbert's 10th problem, to find a method (what we now call an algorithm) for deciding whether a Diophantine equation has an integral solution, was resolved in the negative by Yuri Matiyasevich in 1970: no such algorithm exists. Proving the undecidability of Hilbert's 10th problem is clearly one of the great mathematical results of the century.
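
To make the decision problem concrete, here is an illustrative LaTeX aside (an example of the general question, not material from the book):

% Two Diophantine equations and the yes/no question Hilbert's 10th problem
% asks about each of them.
x^2 + y^2 = z^2 \quad \text{has an integral solution, e.g. } (x, y, z) = (3, 4, 5);
\qquad
x^2 + y^2 = 3 \quad \text{has no integral solutions.}

Hilbert asked for a single algorithm that, given any polynomial equation with integer coefficients, decides which of these two cases holds; Matiyasevich's theorem shows that no such algorithm can exist.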
