Theory of Computation

Building a computer ten times more powerful than all the networked computing capability in the United States is the subject of this book by leading figures in the high performance computing community. It summarizes the near-term initiatives, including the technical and policy agendas, for what could be a twenty-year effort to build a petaFLOPS-scale computer. (A FLOP, a floating point operation, is the standard unit for measuring computer performance, and a petaFLOPS computer would perform a million billion of these operations per second.)
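
To give a feel for that scale, here is a back-of-the-envelope sketch with an assumed workload (not a figure from the book):

```python
# Back-of-the-envelope sketch of the petaFLOPS scale (illustrative only).
PETAFLOPS = 1e15  # floating point operations per second

# Assumed workload: multiplying two dense n x n matrices costs about
# 2 * n**3 floating point operations.
n = 100_000
ops = 2 * n**3

print(f"{ops:.1e} operations / 1 petaFLOPS = {ops / PETAFLOPS:.0f} seconds")
# -> 2.0e+15 operations / 1 petaFLOPS = 2 seconds
```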

The thirteen chapters written expressly for this book by logicians, theoretical computer scientists, philosophers, and semanticists address, from the perspective of mathematical logic, the problems of understanding and studying the flow of information through any information-processing system.

Algorithms and Complexity
Edited by J. van Leeuwen

The Handbook of Theoretical Computer Science provides professionals and students with a comprehensive overview of the main results and developments in this rapidly evolving field. Volume A covers models of computation, complexity theory, data structures, and efficient computation in many recognized subdisciplines of theoretical computer science.

At the 1900 International Congress of Mathematicians in Paris, the German mathematician David Hilbert put forth a list of 23 unsolved problems that he saw as the greatest challenges for twentieth-century mathematics. Hilbert's 10th problem, to find a method (what we now call an algorithm) for deciding whether a Diophantine equation has an integral solution, was solved by Yuri Matiyasevich in 1970, in the negative: no such algorithm exists. Proving the undecidability of Hilbert's 10th problem is clearly one of the great mathematical results of the century.
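
To make the decision problem concrete, here is an illustrative instance (not drawn from the book's own text): a Diophantine equation is a polynomial equation with integer coefficients for which integer solutions are sought.

```latex
% Illustrative instance of the decision problem in Hilbert's 10th problem:
% does the following equation have a solution in integers?
\[ x^2 + y^2 = z^2 \]
% Yes: for example, (x, y, z) = (3, 4, 5). Matiyasevich's theorem shows that
% no single algorithm can answer this question for every such equation.
```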

Philosophy, Cognitive Science, and Parallel Distributed Processing

Parallel Distributed Processing is transforming the field of cognitive science. Microcognition provides a clear, readable guide to this emerging paradigm from a cognitive philosopher's point of view. It explains and explores the biological basis of PDP, its psychological importance, and its philosophical relevance.

Formal Models and Semantics
Edited by J. van Leeuwen

Theoretical computer science provides the foundations for understanding and exploiting the concepts and mechanisms in computing and information processing. This handbook provides professionals and students with a comprehensive overview of the main results and developments in this rapidly evolving field. It consists of thirty-seven chapters in two volumes, all addressing core areas of theoretical computer science as it is practiced today. The material is written by leading American and European researchers, and each volume may be used independently.

Animation provides a rich environment for actively exploring algorithms. Multiple, dynamic, graphical displays of an algorithm reveal properties that might otherwise be difficult to comprehend or even remain unnoticed. This exciting new approach to the study of algorithms is taken up by Marc Brown in Algorithm Animation.

Teaching the theory of error correcting codes at an introductory level is a difficult task. The theory, which has immediate hardware applications, also concerns highly abstract mathematical concepts. This text explains the basic circuits in a refreshingly practical way that will appeal to undergraduate electrical engineering students as well as to engineers and technicians working in industry.
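
As a minimal sketch of the kind of code such a course starts from, here is a Hamming(7,4) encoder and single-error corrector (chosen for illustration; the book's own circuit examples may differ):

```python
# Minimal sketch of a Hamming(7,4) code: 4 data bits are protected by
# 3 parity bits, allowing any single-bit error to be located and corrected.

def encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4        # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4        # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4        # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]  # positions 1..7

def correct(c):
    """Recompute the parities; the syndrome, read in binary, is the
    position of a flipped bit (0 means no error detected)."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    if pos:
        c[pos - 1] ^= 1
    return c

word = encode([1, 0, 1, 1])
word[2] ^= 1                 # flip one bit "in transit"
assert correct(word) == encode([1, 0, 1, 1])
```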

An Introduction to Computational Geometry

Perceptrons, the first systematic study of parallelism in computation, has remained a classical work on threshold automata networks for nearly two decades. It marked a historical turn in artificial intelligence, and it is required reading for anyone who wants to understand the connectionist counterrevolution that is going on today.
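
The basic object of study can be sketched in a few lines: a linear threshold unit fires when the weighted sum of its inputs reaches a threshold. Below is a minimal illustration with assumed weights, not code from the book:

```python
# Minimal sketch of a linear threshold unit, the building block analyzed
# in Perceptrons: output 1 iff the weighted input sum reaches a threshold.

def threshold_unit(inputs, weights, theta):
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= theta else 0

# Weights chosen here (for illustration) so the unit computes logical AND.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, threshold_unit([x1, x2], [1, 1], theta=2))

# Minsky and Papert's famous result: no single unit of this form can
# compute XOR, since XOR is not linearly separable.
```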
