Computer Science and Intelligent Systems

Machine Learning: A Probabilistic Perspective

Today’s Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these: methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach.
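The probabilistic approach described here treats prediction as inference: estimate probabilities from observed data, then pick the most probable label for a new input via Bayes' rule. Below is a minimal naive-Bayes-style sketch in Python, not code from the book; the weather/activity data and the two-valued feature are invented for illustration.

    from collections import Counter

    def fit(examples):
        """examples: list of (feature, label) pairs with discrete values."""
        label_counts = Counter(label for _, label in examples)
        pair_counts = Counter(examples)
        n = len(examples)

        def predict(feature):
            # argmax over labels of P(label) * P(feature | label),
            # with add-one smoothing over the two feature values
            def score(label):
                prior = label_counts[label] / n
                likelihood = (pair_counts[(feature, label)] + 1) / (label_counts[label] + 2)
                return prior * likelihood
            return max(label_counts, key=score)

        return predict

    predict = fit([("sunny", "play"), ("sunny", "play"),
                   ("rainy", "stay"), ("rainy", "play")])
    print(predict("sunny"))  # -> "play"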

Foundations of Machine Learning

This graduate-level textbook introduces fundamental concepts and methods in machine learning. It describes several important modern algorithms, provides the theoretical underpinnings of these algorithms, and illustrates key aspects of their application. The authors aim to present novel theoretical tools and concepts while giving concise proofs even for relatively advanced topics.

A Semantic Web Primer

The development of the Semantic Web, with machine-readable content, has the potential to revolutionize the World Wide Web and its uses. A Semantic Web Primer provides an introduction and guide to this continuously evolving field, describing its key ideas, languages, and technologies.
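One of the Semantic Web's key ideas is that content is published as machine-readable statements; in RDF these take the form of subject-predicate-object triples. The toy Python triple store below sketches only the data model; the triples and the match helper are invented for illustration, not part of any Semantic Web standard library.

    triples = {
        ("TimBernersLee", "invented", "WorldWideWeb"),
        ("TimBernersLee", "type", "Person"),
        ("WorldWideWeb", "type", "InformationSystem"),
    }

    def match(subject=None, predicate=None, obj=None):
        """Return triples matching a pattern; None acts as a wildcard,
        loosely in the spirit of a single SPARQL triple pattern."""
        return [(s, p, o) for (s, p, o) in triples
                if subject in (None, s)
                and predicate in (None, p)
                and obj in (None, o)]

    print(match(subject="TimBernersLee"))  # every statement about that subject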

Foundations of 3D Computer Graphics

Computer graphics technology is an amazing success story. Today, all of our PCs are capable of producing high-quality computer-generated images, mostly in the form of video games and virtual-life environments; every summer blockbuster movie includes jaw-dropping computer-generated special effects. This book explains the fundamental concepts of 3D computer graphics.
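Among those fundamental concepts is how a 3D scene becomes a 2D image: perspective projection divides camera-space coordinates by depth, so distant objects shrink. A minimal sketch, with a made-up focal length and points, not code from the book:

    def project(point, focal=1.0):
        """Project a camera-space point (x, y, z), z > 0, onto the image plane."""
        x, y, z = point
        return (focal * x / z, focal * y / z)

    # Two points at the same (x, y) but different depths: the farther one
    # lands nearer the image center, which is what makes images read as 3D.
    print(project((1.0, 1.0, 2.0)))  # -> (0.5, 0.5)
    print(project((1.0, 1.0, 4.0)))  # -> (0.25, 0.25)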

Boosting: Foundations and Algorithms

Boosting is an approach to machine learning based on the idea of creating a highly accurate predictor by combining many weak and inaccurate “rules of thumb.” A remarkably rich theory has evolved around boosting, with connections to a range of topics, including statistics, game theory, convex optimization, and information geometry. Boosting algorithms have also enjoyed practical success in such fields as biology, vision, and speech processing. At various times in its history, boosting has been perceived as mysterious, controversial, even paradoxical.
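The "weak rules of thumb" idea can be made concrete in a few lines. The sketch below is the standard AdaBoost scheme with one-feature threshold stumps, written from the general literature rather than taken from the book; the toy data at the end is invented.

    import math

    def adaboost(xs, ys, rounds=10):
        """xs: feature values; ys: labels in {+1, -1}."""
        n = len(xs)
        weights = [1.0 / n] * n
        ensemble = []  # (vote strength, threshold, sign) per weak rule

        for _ in range(rounds):
            # weak learner: the stump "sign if x > t else -sign" with the
            # least weighted error under the current example weights
            best = None
            for t in sorted(set(xs)):
                for sign in (+1, -1):
                    preds = [sign if x > t else -sign for x in xs]
                    err = sum(w for w, p, y in zip(weights, preds, ys) if p != y)
                    if best is None or err < best[0]:
                        best = (err, t, sign, preds)
            err, t, sign, preds = best
            alpha = 0.5 * math.log((1 - err) / max(err, 1e-12))
            ensemble.append((alpha, t, sign))
            # reweight: examples this rule got wrong count more next round
            weights = [w * math.exp(-alpha * p * y)
                       for w, p, y in zip(weights, preds, ys)]
            total = sum(weights)
            weights = [w / total for w in weights]

        def predict(x):
            vote = sum(a * (s if x > t else -s) for a, t, s in ensemble)
            return 1 if vote >= 0 else -1

        return predict

    # No single stump fits the pattern + + - - + +, but the weighted vote does.
    predict = adaboost([1, 2, 3, 4, 5, 6], [1, 1, -1, -1, 1, 1])
    print([predict(x) for x in range(1, 7)])  # -> [1, 1, -1, -1, 1, 1]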

An Interdisciplinary Introduction to Image Processing: Pixels, Numbers, and Programs

This book explores image processing from several perspectives: the creative, the theoretical (mainly mathematical), and the programmatic. It explains the basic principles of image processing, drawing on key concepts and techniques from mathematics, psychology of perception, computer science, and art, and introduces computer programming as a way to get more control over image processing operations. It does so without requiring college-level mathematics or prior programming experience.
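In that spirit, a first image-processing program needs nothing beyond the idea that an image is a grid of numbers. A small sketch in plain Python, with no imaging library and a made-up 3x3 grayscale image: thresholding maps each pixel to black or white around a cutoff.

    image = [
        [ 10,  40,  80],
        [120, 160, 200],
        [240, 250, 255],
    ]  # grayscale intensities, 0 = black .. 255 = white

    def threshold(img, cutoff):
        """Map each pixel to 0 (black) or 255 (white) around a cutoff."""
        return [[255 if pixel >= cutoff else 0 for pixel in row] for row in img]

    for row in threshold(image, 128):
        print(row)  # -> [0, 0, 0] / [0, 255, 255] / [255, 255, 255]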

Software Abstractions: Logic, Language, and Analysis

In Software Abstractions, Daniel Jackson introduces an approach to software design that draws on traditional formal methods but exploits automated tools to find flaws as early as possible. This approach—which Jackson calls “lightweight formal methods” or “agile modeling”—takes from formal specification the idea of a precise and expressive notation based on a tiny core of simple and robust concepts but replaces conventional analysis based on theorem proving with a fully automated analysis that gives designers immediate feedback.
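The trade Jackson describes, exhaustive search within a small bound instead of a proof, is easy to imitate. Below is a toy Python analog (not Alloy, and not from the book): it checks a plausible-sounding but false claim about binary relations against every relation over a tiny universe and reports the counterexample the search finds.

    from itertools import product

    def relations(universe):
        """Yield every binary relation over the universe, as a set of pairs."""
        pairs = [(a, b) for a in universe for b in universe]
        for bits in product([0, 1], repeat=len(pairs)):
            yield {p for p, bit in zip(pairs, bits) if bit}

    def symmetric(r):
        return all((b, a) in r for (a, b) in r)

    def transitive(r):
        return all((a, d) in r for (a, b) in r for (c, d) in r if b == c)

    # Claim: every symmetric, transitive relation is reflexive. Check it
    # within a small scope: all 16 relations over a 2-element universe.
    universe = [0, 1]
    for r in relations(universe):
        reflexive = all((a, a) in r for a in universe)
        if symmetric(r) and transitive(r) and not reflexive:
            print("counterexample:", r)  # the empty relation refutes the claim
            break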

C# Precisely

C# is an object-oriented programming language that is similar to Java in many respects but more comprehensive and different in most details. This book offers a quick and accessible reference for anyone who wants to know C# in more detail than that provided by a standard textbook. It will be particularly useful for C# learners who are familiar with Java. This second edition has been updated and expanded, reflecting the evolution and extension of the C# programming language. It covers C# versions 3.0 and 4.0 and takes a look ahead at some of the innovations of version 5.0.

Thinking as Computation: A First Course

This book guides students through an exploration of the idea that thinking might be understood as a form of computation. Students make the connection between thinking and computing by learning to write computer programs for a variety of tasks that require thought, including solving puzzles, understanding natural language, recognizing objects in visual scenes, planning courses of action, and playing strategic games.
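Puzzle solving, for instance, becomes a computation over candidate answers. The book itself works in Prolog; the sketch below is a Python stand-in, not from the book, that solves the classic 4-queens puzzle by searching permutations.

    from itertools import permutations

    def queens(n):
        """Yield solutions as tuples giving each row's queen column."""
        for cols in permutations(range(n)):  # one queen per row and column
            # two queens share a diagonal when their row and column
            # distances are equal
            if all(abs(cols[i] - cols[j]) != j - i
                   for i in range(n) for j in range(i + 1, n)):
                yield cols

    print(next(queens(4)))  # -> (1, 3, 0, 2)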

Quantum Computing: A Gentle Introduction

The combination of two of the twentieth century’s most influential and revolutionary scientific theories, information theory and quantum mechanics, gave rise to a radically new view of computing and information. Quantum information processing explores the implications of using quantum mechanics instead of classical mechanics to model information and its processing. Quantum computing is not about changing the physical substrate on which computation is done from classical to quantum but about changing the notion of computation itself, at the most basic level.
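That changed notion of computation shows up in a few lines: a qubit's state is a vector of complex amplitudes, gates are unitary matrices, and measurement probabilities are squared amplitude magnitudes. A minimal sketch of the standard formalism in plain Python, not code from the book:

    import math

    def apply(gate, state):
        """Multiply a 2x2 gate matrix into a 2-amplitude state vector."""
        return [gate[0][0] * state[0] + gate[0][1] * state[1],
                gate[1][0] * state[0] + gate[1][1] * state[1]]

    s = 1 / math.sqrt(2)
    H = [[s, s], [s, -s]]  # the Hadamard gate

    state = [1.0, 0.0]       # the classical bit 0, as the quantum state |0>
    state = apply(H, state)  # equal superposition of |0> and |1>
    print([abs(a) ** 2 for a in state])  # measurement probabilities ~ [0.5, 0.5]

    # A second Hadamard makes the amplitudes interfere back to |0>,
    # something with no classical counterpart.
    state = apply(H, state)
    print([round(abs(a) ** 2, 6) for a in state])  # -> [1.0, 0.0]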
