Econometrics & Statistical Methods

A Math Tool Kit

This text offers an accessible yet rigorous development of many of the fields of mathematics necessary for success in investment and quantitative finance, covering topics applicable to portfolio theory, investment banking, option pricing, investment, and insurance risk management. The approach emphasizes the mathematical framework provided by each mathematical discipline, and the application of each framework to the solution of finance problems. It emphasizes the thought process and mathematical approach taken to develop each result instead of the memorization of formulas to be applied (or misapplied) automatically. The objective is to provide a deep level of understanding of the relevant mathematical theory and tools that can then be effectively used in practice, to teach students how to “think in mathematics” rather than simply to do mathematics by rote. Each chapter covers an area of mathematics such as mathematical logic, Euclidean and other spaces, set theory and topology, sequences and series, probability theory, and calculus, in each case presenting only material that is most important and relevant for quantitative finance. Each chapter includes finance applications that demonstrate the relevance of the material presented. Problem sets are offered on both the mathematical theory and the finance applications sections of each chapter. The logical organization of the book and the judicious selection of topics make the text customizable for a number of courses. The development is self-contained and carefully explained to support disciplined independent study as well. A solutions manual for students provides solutions to the book’s Practice Exercises; an instructor’s manual offers solutions to the Assignment Exercises as well as other materials.


Postmodern Developments in the Theory of General Economic Equilibrium

In The Equilibrium Manifold, noted economic scholar and major contributor to the theory of general equilibrium Yves Balasko argues that, contrary to what many textbooks want readers to believe, the study of the general equilibrium model did not end with the existence and welfare theorems of the 1950s. These developments, which characterize the modern phase of the theory of general equilibrium, led to what Balasko calls the postmodern phase, marked by the reintroduction of differentiability assumptions and the application of the methods of differential topology to the study of the equilibrium equation. Balasko’s rigorous study demonstrates the central role played by the equilibrium manifold in understanding the properties of the Arrow-Debreu model and its extensions. Balasko argues that the tools of differential topology articulated around the concept of the equilibrium manifold offer powerful methods for studying economically important issues, from existence and uniqueness to business cycles and economic fluctuations. After an examination of the theory of general equilibrium’s evolution in the hundred years between Walras and Arrow-Debreu, Balasko discusses the properties of the equilibrium manifold and the natural projection. He highlights the important role of the set of no-trade equilibria, the structure of which is applied to the global structure of the equilibrium manifold. He also develops a geometric approach to the study of the equilibrium manifold. Applications include stability issues of adjustment dynamics for out-of-equilibrium prices, the introduction of price-dependent preferences, and aspects of time and uncertainty in extensions of the general equilibrium model that account for various forms of market frictions and imperfections. Special effort has been made to reduce the mathematical technicalities without compromising rigor.
The Equilibrium Manifold makes clear the ways in which the “postmodern” developments of the Arrow-Debreu model improve our understanding of modern market economies.

The field of forest economics has expanded rapidly in the last two decades, and yet there exists no up-to-date textbook for advanced undergraduate-graduate level use or rigorous reference work for professionals. Economics of Forest Resources fills these gaps, offering a comprehensive technical survey of the field with special attention to recent developments regarding policy instrument choice and uncertainty. It covers all areas in which mathematical models have been used to explain forest owner and user incentives and government behavior, introducing the reader to the rigor needed to think through the consequences of policy instruments. Technically difficult concepts are presented with a unified and progressive approach; an appendix outlines the basic concepts from calculus needed to understand the models and results developed. The book first presents the historical and classic models that every student or researcher in forest economics must know, including Faustmann and Hartman approaches, public goods, spatial interdependence, two-period life-cycle models, and overlapping generations problems. It then discusses topics including policy instrument choice, deforestation, biodiversity conservation, and age-class-based forest modeling. Finally, it surveys such advanced topics as uncertainty in two-period models, catastrophic risk, stochastic control problems, deterministic optimal control, and stochastic and deterministic dynamic programming approaches. Boxes with empirical content illustrating applications of the theoretical material appear throughout. Each chapter is self-contained, allowing the reader, student, or instructor to use the text according to individual needs.
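The Faustmann approach mentioned above is the classic benchmark of the field; as a reminder of the kind of model involved (this statement is the standard textbook formulation, not a passage from the book itself), the problem is to choose the rotation length T that maximizes the value of bare land over an infinite sequence of identical rotations:

$$
V(T) \;=\; \frac{p\,f(T)\,e^{-rT} - c}{1 - e^{-rT}},
$$

where $f(T)$ is the timber volume at stand age $T$, $p$ the timber price, $c$ the replanting cost, and $r$ the discount rate. Setting $V'(T)=0$ yields the Faustmann first-order condition

$$
p\,f'(T^{*}) \;=\; r\,p\,f(T^{*}) + r\,V(T^{*}),
$$

that is, the marginal gain from delaying harvest must equal the interest forgone on both the standing timber and the land value.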

Theory and Computation

This text provides an introduction to the modern theory of economic dynamics, with emphasis on mathematical and computational techniques for modeling dynamic systems. Written to be both rigorous and engaging, the book shows how sound understanding of the underlying theory leads to effective algorithms for solving real-world problems. The material makes extensive use of programming examples to illustrate ideas. These programs help bring to life the abstract concepts in the text. Background in computing and analysis is offered for readers without programming experience or upper-level mathematics. Topics covered in detail include nonlinear dynamic systems, finite-state Markov chains, stochastic dynamic programming, stochastic stability, and computation of equilibria. The models are predominantly nonlinear, and the emphasis is on studying nonlinear systems in their original form, rather than by means of rudimentary approximation methods such as linearization. Much of the material is new to economics and improves on existing techniques. For graduate students and those already working in the field, Economic Dynamics will serve as an essential resource.
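To give a flavor of the programming examples the blurb describes, here is a minimal sketch of one of the topics listed: computing the stationary distribution of a finite-state Markov chain by successive approximation. The three-state transition matrix below is a made-up illustration, not an example taken from the book.

```python
import numpy as np

# Hypothetical 3-state transition matrix; each row gives the
# probabilities of moving to states 0, 1, 2 and sums to one.
P = np.array([[0.9, 0.1, 0.0],
              [0.4, 0.4, 0.2],
              [0.1, 0.1, 0.8]])

def stationary(P, tol=1e-12, max_iter=10_000):
    """Approximate psi with psi @ P = psi by iterating the chain
    forward from a uniform initial distribution."""
    psi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(max_iter):
        nxt = psi @ P
        if np.max(np.abs(nxt - psi)) < tol:
            return nxt
        psi = nxt
    return psi

psi = stationary(P)
print(psi)  # limiting distribution of the chain
```

Because this chain is irreducible and aperiodic, the iteration converges to the unique fixed point of the map psi ↦ psi @ P, regardless of the starting distribution.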

Selected Papers of Lionel W. McKenzie

Influential neoclassical economist Lionel McKenzie has made major contributions to postwar economic thought in the fields of equilibrium, trade, and capital accumulation. This selection of his papers traces the development of his thinking in these three crucial areas.

McKenzie's early academic life took him to Duke, Princeton, Oxford, the University of Chicago, and the Cowles Commission. In 1957, he went to the University of Rochester to head the economics department there, and he remains at Rochester, now Wilson Professor Emeritus of Economics. McKenzie's most significant research was undertaken during a period that saw both the development of the major themes of neoclassical economics and the adoption of fundamental mathematical methods to pursue them. McKenzie contributed to both aspects of this research program. He helped shape the direction of the field and, at Rochester, influenced generations of future scholars. In 2002, The MIT Press published McKenzie's Classical General Equilibrium Theory, a detailed summary of the model and methodology. This book, collecting his most important papers in the form in which they were originally published, can be seen as a companion to that one. The many state-of-the-art results achieved in McKenzie's original papers present sophisticated theoretical work that will continue to be important to future developments in the discipline.

Debates over post-Kyoto Protocol climate change policy often take note of two issues: the feasibility and desirability of international cooperation on climate change policies, given the failure of the United States to ratify Kyoto and the very limited involvement of developing countries, and the optimal timing of climate policies. In this book essays by leading international economists offer insights on both these concerns.

The book first considers the appropriate institutions for effective international cooperation on climate change, proposing an alternative to the Kyoto arrangement and a theoretical framework for such a scheme. The discussions then turn to the stability of international environmental agreements, emphasizing the logic of coalition forming and demonstrating the applicability of game-theoretical analysis. Finally, contributors address both practical and quantitative aspects of policy design, offering theoretical analyses of such specific policy issues as intertemporal carbon trade and implementation of a sequestration policy, and then, by means of formal mathematical models, examining policies related to the rate of climate change, international trade and carbon leakage, and the shortcomings of the standard Global Warming Potential index.

Contributors: Philippe Ambrosi, David F. Bradford, Barbara Buchner, Carlo Carraro, Parkash Chander, Stéphane De Cara, Damien Demailly, A. Denny Ellerman, Johan Eyckmans, Michael Finus, Elodie Galko, Roger Guesnerie, Jean-Charles Hourcade, Pierre-Alain Jayet, Gilles Lafforgue, Bernard Magné, Sandrine Mathy, Michel Moreaux, Sushama Murty, William A. Pizer, Philippe Quirion, Katrin Rehdanz, P. R. Shukla, Jaemin Song, Ian Sue Wing, Sylvie Thoron, Richard S. J. Tol, Henry Tulkens.

CESifo Seminar series

A Review

No names are more closely associated with modern trade theory than Eli Heckscher and Bertil Ohlin. The basic Heckscher-Ohlin proposition, according to which a country exports factors (embodied in goods) in relatively abundant supply and imports factors in relatively scarce supply, is a key component of modern trade theory. In this book, Robert Baldwin traces the development of the HO model, describing the historical twists and turns that have led to the basic modern theoretical model in use today. Baldwin not only presents a clear and cohesive view of the model's evolution but also reviews the results of empirical tests of its various versions.

Baldwin, who published his first theoretical article on the HO model in 1948, first surveys the development of the HO model and then assesses empirical tests of its basic proposition. Most discussions of empirical work on HO models confine themselves to the basic theorem, but Baldwin devotes a chapter to empirical tests of its three related propositions: the Stolper-Samuelson theorem, the Rybczynski theorem, and the factor price equalization theorem. He concludes that economists' understanding of the forces shaping international trade has been greatly improved through the interactive process of empirical testing and theoretical modification, but that many empirical economists (himself included) became so enamored of the elegant but highly unrealistic factor price equalization model developed from the insights of Heckscher and Ohlin that they neglected to investigate other versions of the basic HO model that dispense with this relationship.

Ohlin Lectures series

Policy makers need quantitative as well as qualitative answers to pressing policy questions. Because of advances in computational methods, quantitative estimates are now derived from coherent nonlinear dynamic macroeconomic models embodying measures of risk and calibrated to capture specific characteristics of real-world situations. This text shows how such models can be made accessible and operational for confronting policy issues.

The book starts with a simple setting based on market-clearing price flexibility. It gradually incorporates departures from the simple competitive framework in the form of price and wage stickiness, taxes, rigidities in investment, financial frictions, and habit persistence in consumption.

Most chapters end with computational exercises; the MATLAB code for the base model can be found in the appendix. As the models evolve, readers are encouraged to modify the codes from the first simple model to more complex extensions.

Computational Macroeconomics for the Open Economy can be used by graduate students in economics and finance as well as policy-oriented researchers.

Dynamic General Equilibrium in a Non-Ricardian World

An important recent advance in macroeconomics is the development of dynamic stochastic general equilibrium (DSGE) macromodels. The use of DSGE models to study monetary policy, however, has led to paradoxical and puzzling results on a number of central monetary issues including price determinacy and liquidity effects. In Money, Interest, and Policy, Jean-Pascal Bénassy argues that moving from the standard DSGE models--which he calls "Ricardian" because they have the famous "Ricardian equivalence" property--to another, "non-Ricardian" model would resolve many of these issues. A Ricardian model represents a household as a homogeneous family of infinitely lived individuals, and Bénassy demonstrates that a single modification--the assumption that new agents are born over time (which makes the model non-Ricardian)--can bridge the current gap between monetary intuitions and facts, on one hand, and rigorous modeling, on the other.

After comparing Ricardian and non-Ricardian models, Bénassy introduces a model that synthesizes the two approaches, incorporating both infinite lives and the birth of new agents. Using this model, he considers a number of issues in monetary policy, including liquidity effects, interest rate rules and price determinacy, global determinacy, the Taylor principle, and the fiscal theory of the price level. Finally, using a simple overlapping generations model, he analyzes optimal monetary and fiscal policies, with a special emphasis on optimal interest rate rules.

Reconciling Theory and Evidence

Though competition occupies a prominent place in the history of economic thought, among economists today there is still a limited, and sometimes contradictory, understanding of its impact. In Competition and Growth, Philippe Aghion and Rachel Griffith offer the first serious attempt to provide a unified and coherent account of the effects that competition policy and deregulated entry have on economic growth.

The book takes the form of a dialogue between an applied theorist calling on "Schumpeterian growth" models and a microeconometrician employing new techniques to gauge competition and entry. In each chapter, theoretical models are systematically confronted with empirical data, which either invalidates the models or suggests changes in the modeling strategy. Aghion and Griffith note a fundamental divorce between theorists and empiricists who previously worked on these questions. On one hand, existing models in industrial organization or new growth economics all predict a negative effect of competition on innovation and growth: namely, that competition is bad for growth because it reduces the monopoly rents that reward successful innovators. On the other hand, common wisdom and recent empirical studies point to a positive effect of competition on productivity growth. To reconcile theory and evidence, the authors distinguish between pre- and post-innovation rents, and propose that innovation may be a way to escape competition, an idea that they confront with microeconomic data. The book's detailed analysis should aid scholars and policy makers in understanding how the benefits of tougher competition can be achieved while at the same time mitigating the negative effects competition and imitation may have on some sectors or industries.
