Saturday, January 30, 2010

How Canada Avoided the Financial Bubble

Canada avoided the Financial Crisis of 2007-2010 using simple, no-nonsense financial regulation involving (1) capital requirements (targets for tier-one capital holdings), (2) quality-of-capital rules (75% of tier-one capital in common rather than preferred stock) and (3) a leverage ratio capped at 20 to 1.

Similar lessons about no-nonsense control and regulation of the health care system are also available from Canada: (1) control of physicians' salaries, (2) control of capital investment and technology and (3) control of drug prices.

Even though Canada has flirted with Neoliberalism, it seems to have avoided the nonsensical parts.

Tuesday, January 26, 2010

History Repeating: 1937

In 1937, the Federal Reserve and the Roosevelt administration decided the Great Depression was over and that it was time to stop spending. The result of the fiscal and monetary tightening was a second downturn in Gross Domestic Product (GDP). Today, the Obama administration has decided to seek spending freezes and to trim deficits. Is history repeating?

Hopefully not, but a time plot of real GDP during the Great Depression shows that the bottom was hit around 1932. The 1937 episode was a brief downturn compared to the Crash itself.

Science as System

In the current global warming debate, it would be helpful, when trying to evaluate the widely publicized outputs, to have an idea of how the science system works. Here is a meta-model discussed today in the Principles of Environmental Science.

Science is fundamentally about developing models. Usually the models are expressed mathematically or as computer code or both. Model development, however, is just the first step. The model must be tested against reality and revised if it doesn't fit the data.

After enough independent tests, revisions and validations, a model can be generalized into a theory (a model along with all its tests and revisions). Theories, so defined, accumulate our base of knowledge (scientia).

Is this the way it always happens? Not exactly! Many models in economics and system dynamics skip directly from model to theory and received wisdom. For example, consider William Nordhaus's DICE model and the Limits to Growth models. I have been unable to find anything other than casual attempts to relate the model outputs to historical data.

Another approach in the global warming debate is to present tests without either a model or a theory. The work of Bjorn Lomborg provides a notorious example. Simple historical time plots are presented to "disprove" the global warming assertion and argue that things are actually getting better.

Compared to the two approaches above, the Intergovernmental Panel on Climate Change (IPCC) is doing science. The assessment and technical reports are essentially presentations of models along with results from testing. For some technical areas, there are no models. For some other technical areas, there are few tests. For the Integrated Assessment Models used to develop the family of Emissions Scenarios, it's sometimes hard to tell the status of the models. The testing probably needs improvement. Possibly that's why the IPCC publishes scenarios rather than forecasts with probability assessments. It's an important area that needs more scientific attention.

Sunday, January 24, 2010

The Three-legged Health Care Stool

In a New York Times op-ed, Paul Krugman argues that health care reform is a three-legged stool that cannot be passed incrementally. The legs of the stool are (1) banning insurance discrimination (underwriting), (2) mandating universal coverage and (3) providing financial aid to low-income families. In the long run, that's probably correct. Even with all three legs in place, however, the stool doesn't address the fundamental health care cost drivers (capital investment/technology, pharmaceutical prices and prices for physician services). In any event, the current legislation won't be phased in for a few years, which in politics becomes the long run.

The way government in the US seems to work is to address one crisis at a time. If passing a law banning underwriting leads to a "death spiral" where "healthier Americans would choose not to buy insurance, leading to high premiums for those who remain, driving out more people, and so on..." then Congress would have to deal with that problem when it happened. Since Congress can only seem to deal with one problem at a time, maybe this is the best that can be expected from our political institutions.
Returning to the analogy, maybe the seat of the health care reform stool is sustainability and only one leg is social equity (the image above is from Sustainability Now!). Maybe Mr. Krugman doesn't have all the legs, or even the seat, for that matter.

Wednesday, January 20, 2010

Mr. Brown Goes to Washington

Scott Brown (R-MA) won the special election to fill Ted Kennedy's seat. As the 41st Republican Senator, he finds himself in a powerful position. In his press conference this morning, he said that (1) he favors state-level health programs like the Massachusetts Health Care Program, (2) he favors health care reform at the national level, (3) he is independent, even though he will caucus with Republicans, (4) he wants to solve problems rather than take ideological positions and (5) he seems to get along with President Obama.

I'm not sure where Mr. Brown stands on all health care issues (and he might not be entirely sure either), but there are plenty of problems to solve. As a problem solver, he should support a divide-and-conquer strategy. Let's take what can be agreed on in the current legislation (e.g., health insurance reform), pass it and move on to other problems on the list.

Let's also call him (and Orrin Hatch, R-Utah) on state-level programs by taking a page from the successful Race To The Top education grant reform program: put money on the table and states will go far out of their way to compete for it, to the amazing extent of quickly changing laws to qualify for the funding. How about a Race To The Top for health reform?

Good luck in Washington, Mr. Brown. Use your power for good while you've got it.

Tuesday, January 19, 2010

A Cold North Wind for Neoliberalism

Iceland is in the midst of a financial crisis after its economic bubble burst. The bubble was driven by neoliberal, free-market philosophy, and now the party is over. Jóhannes Þór Skúlason is heading a group opposed to the terms of repayment (austerity) being imposed on Iceland. Here is the basis of his opposition:

"I can be very frank about it: What we want to do is abolish this neo-liberal greed philosophy that was driving things in the bubble years," he says. "What we want to re-establish in Iceland is a strong Nordic welfare society with equal justice and equality."

Good luck to you, Mr. Skúlason!

A Framework for Policy

Today in the Principles of Environmental Science, Cal DeWitt described a framework for the course that has wider application.

The framework involves three related considerations: science, ethics and praxis. From the policy perspective, the framework suggests that before putting something into practice, the science and the ethics of the policy also need to be considered. For example, if one believes that forests should be economically productive, forest biomass can be used as an energy source. However, the science tells us that forests need the biomass (dead trees, rotting vegetation, etc.) for their own survival, which returns us to an ethical dilemma.

As a result of the subprime mortgage crisis, debate over this framework has started within economics. N. Gregory Mankiw, a Harvard economist, recently wrote a paper, The Macroeconomist as Scientist and Engineer. He argues that the current disagreements in macroeconomics are between the economic scientists (neoclassical, neoliberal, free market) and the economic engineers (Keynesian). If the goal of science is defined narrowly as figuring out how the world works, the subprime mortgage crisis pointed out the weaknesses in economic science. The framework above also points out the weaknesses of its ethical foundations, welfare economics notwithstanding.

Thursday, January 14, 2010

What is the Individual Mandate?

Here's an excellent policy brief from Health Affairs describing what's in the current legislation and the pros and cons of an individual mandate for health insurance coverage. What caught my attention was the discussion of whether an individual mandate makes sense without a public option.

If you're interested in legal issues, read The Constitutionality of the Individual Mandate for Health Insurance by Jack Balkin of the Yale Law School. Bottom line: the mandate is just a tax that can be avoided by purchasing health insurance, and Congress has the constitutional power to tax.

Wednesday, January 13, 2010

Wall Street High Rollers on Capitol Hill

Today, before the Financial Crisis Inquiry Commission, J. P. Morgan CEO Jamie Dimon and other Wall Street barons testified about their role in the financial crisis. A few days ago, Mr. Dimon commented that J. P. Morgan's operations are run for clients and "it is not a casino." Although Mr. Dimon tempered his comments a bit before the Commission, methinks he doth protest too much.

Monday, January 11, 2010

A Random Walk Among the Undead

The efficient market model was pronounced dead over a year ago at the World Economic Forum in Davos. But Jeff Anderson-Lee, a commenter on Paul Krugman's blog, observed that "the 'efficient-market hypothesis' ... seems harder to kill than the undead."

What's going on here? Why won't the theory die? In this case, I have to agree with Robert Shiller: the theory won't die because it's partly true! If you've been following my attempt to forecast the S&P 500, you know that I've produced a number of plausible forecasts for the market's future, from optimistic to pessimistic. One explanation is that I don't know what I'm doing, another is that the future is unknowable and a third is the efficient market hypothesis (EMH).

What is the efficient market hypothesis? That's a little difficult to present clearly because the concept has become overloaded with multiple meanings. The simplest form of the hypothesis is the random walk hypothesis, that is, that stock prices move according to a random walk. The random walk hypothesis requires that the largest characteristic root of a state-space model with S&P 500 prices as the only output variable be close to unity. For a monthly model estimated from January 1950 to January 2010, the largest characteristic root is 0.9986935 or, with rounding, unity.
A forecast with this model (displayed above) shows that our best prediction for the future of S&P 500 prices is the current price, as called for by the random walk hypothesis. Actually, the forecast above was not made with a pure random walk model but rather a random walk with drift model. The pure random walk equation is X(t) = X(t-1) + e(t), where e(t) is random, uncorrelated error. The random walk with drift is X(t) = a + X(t-1) + e(t), where a is the drift term. Supposedly, the drift term invalidates the random walk hypothesis, but even with drift the market is not very predictable.
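
To make the model concrete, here is a minimal sketch in Python of how a random walk with drift can be estimated and used to forecast. The price values shown are placeholders rather than the actual series, and this simple estimator stands in for, rather than reproduces, the state-space estimation described above.

```python
import numpy as np

def fit_random_walk_with_drift(prices):
    """Estimate X(t) = a + X(t-1) + e(t) from a price series.

    Differencing gives X(t) - X(t-1) = a + e(t), so the drift a is just
    the mean monthly change and the residuals are the estimated shocks.
    """
    diffs = np.diff(prices)
    a = diffs.mean()          # drift term
    resid = diffs - a         # estimated shocks e(t)
    return a, resid

def rw_forecast(last_price, a, horizon=36):
    """Best point forecast under the model: last price plus accumulated drift."""
    return last_price + a * np.arange(1, horizon + 1)

# Hypothetical usage with monthly S&P 500 closes (e.g., from Yahoo! Finance):
prices = np.array([1073.87, 1104.49, 1115.10])   # placeholder values only
a, resid = fit_random_walk_with_drift(prices)
print(rw_forecast(prices[-1], a, horizon=3))
```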

Interestingly, plotting just the deterministic path X(t) = a + X(t-1), with e(t) set to zero, provides a very basic bubble predictor. From 1950 to 1995, a buy-and-hold investment strategy (advocated by EMH proponents) made some sense: no matter when you purchased a stock that tracked the S&P 500, you made money on any sale. After 1995, things became a lot riskier. Stocks purchased in 2000 and sold in 2010 generated huge losses. Buy-and-hold after 1995, in retrospect, wouldn't have made much sense as an investment strategy.
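
Continuing the sketch above, the drift-line check can be made concrete: flag months where the actual price sits well above the deterministic path X(t) = X(0) + a*t. The 25% threshold below is an arbitrary illustration of mine, not a rule from this post.

```python
import numpy as np

def flag_bubbles(prices, a, threshold=0.25):
    """Compare prices to the deterministic drift line X(t) = X(0) + a*t.

    Returns True for months where the price exceeds the drift line by
    more than `threshold` (an arbitrary, purely illustrative cutoff).
    """
    t = np.arange(len(prices))
    drift_line = prices[0] + a * t
    return (prices - drift_line) / drift_line > threshold
```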

If the EMH were restricted to random walk or random walk with drift models, it would be a good basic starting point for understanding investment strategies and market bubbles. However, "efficient" should not be equated with "optimal." A casino is essentially a random walk with drift for the house. Where the EMH embraces visions of perfection, it obviously (after the dot-com bubble and the subprime mortgage bubble) goes too far. In future posts I'll struggle with what optimization would mean in the context of the stock market and struggle even harder with the question of whether, whatever the stock market is, it benefits the US economy.


Funding Health Care Reform by Taxing "Cadillac" Health Plans

Tonight on the PBS NewsHour, Josh Bivens of the Economic Policy Institute commented that (1) in the short run, an excise tax on generous health benefit plans will generate revenue but (2) in the long run, employers will stop offering these plans to employees to avoid the tax. If this happens, where does the federal government go next to fund health care reform?

Thursday, January 7, 2010

Failing to Connect the Dots

The White House released the "Undie Bomber" security review today. The president labeled the incident "...a failure to connect the dots..." but (1) there wasn't much more detail in the publicly released review and (2) it didn't seem to call for a change in strategy, just more centralization and information passing.

Here are some readings that suggest a different strategy: (1) Congratulations, Osama, on how Ben-Gurion Airport does security, (2) Profile Me If You Must, on focusing on people rather than screening, (3) Terror Database Has Quadrupled in Four Years, on the ugly details of how the Terrorist Identities Datamart Environment (TIDE) functions and fails, (4) Meeting the Threat of Terrorism, on why "discoverability" of information matters more for connecting the dots than sharing and (5) WIJIS Architecture Overview: Version 2.0, a formal information architecture to enable "discoverability", with a criticism of the current approach.

Wednesday, January 6, 2010

Double Bubble, Toil and Trouble

The NY Times today posed the question "Fed Missed This Bubble. Will It See a New One?" Fed chairmen Alan Greenspan and Ben Bernanke both famously missed the development of the dot-com bubble and the bubble that led to the Great Recession (even the conservative Newsmax accepts this analysis). So, how hard is it really to identify bubbles?
In an earlier post I showed how feather forecasting from a state-space model estimated on data up to 1990 indicated that the market was over-priced during both the dot-com and the subprime mortgage bubbles. Implicitly, the feather forecast points to a sustainable level for the market but doesn't display that level explicitly. To do that, we need to remove the cyclical components and the month-to-month shocks from the model and run a counterfactual simulation for the stock market. The simulation is displayed above. Rather than reaching almost 1600 at the peak of the dot-com bubble, the simulation suggests that the market should have been at about 600. And 1000 would have been a better level when the subprime mortgage crisis broke.
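
The model behind the figure isn't reproduced here, but the mechanics of such a counterfactual simulation are easy to sketch: propagate the state forward with no shocks and with the cyclical state components zeroed out each step. Everything below (the matrices, the state layout, the numbers) is a hypothetical placeholder, not my actual model.

```python
import numpy as np

def simulate(A, C, x0, steps, keep=None):
    """Simulate x(t+1) = A x(t), y(t) = C x(t) with no shocks.

    keep: boolean mask of state components to retain; setting the
    cyclical states to zero each step gives the counterfactual path.
    """
    x = x0.copy()
    ys = []
    for _ in range(steps):
        if keep is not None:
            x = np.where(keep, x, 0.0)   # drop cyclical components
        ys.append(C @ x)
        x = A @ x
    return np.array(ys)

# Hypothetical 3-state model: one slow trend state, two cyclical states.
A = np.array([[1.002,  0.00, 0.00],
              [0.000,  0.95, 0.30],
              [0.000, -0.30, 0.95]])
C = np.array([[1.0, 1.0, 0.0]])
x0 = np.array([600.0, 100.0, 0.0])

full = simulate(A, C, x0, steps=120)
counterfactual = simulate(A, C, x0, steps=120,
                          keep=np.array([True, False, False]))
```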

Why is it so difficult for the US Federal Reserve to identify bubbles? Part of the problem is that one of the major cyclical state variables in the US economy is driven by Fed policy. Another cyclical state variable involves more fundamental economic factors such as housing, corporate profits, oil prices and unemployment. Whether the Fed is counter-cyclical or pro-cyclical will have to be a topic for another posting, but clearly the Fed is in the middle of market cycles and has trouble looking outside the box.

And, compared to my pessimistic forecast for the S&P 500, the forecast eliminating the cyclical state variables for the US economy is optimistic. In the figure above, the S&P 500 is back to pre-crisis levels by 2015. The usual disclaimers apply even more strongly to this counterfactual forecast.

At best, financial reform might reduce but not totally eliminate bubbles in the US economy. Since late-20th-century bubbles seem to last about half a decade, the pattern suggests a bubble investing strategy that would differ from the standard diversified portfolio approach--a topic for a future posting.

Sunday, January 3, 2010

Need To Know vs. Need To Share

The Cold War culture that I was exposed to in the US military, and that still permeates the federal government, is that information should be made available on a "need to know" basis. This approach might have made some sense during the Cold War, but after 9/11 it makes much less sense. The cultural change being pushed right now is the "need to share".

What hopefully will not get lost in the pendulum swing from one extreme to the other is that these approaches are not mutually exclusive. Some things need to remain in the federal government's stovepipes and other things need to be shared. The Markle Task Force on National Security has developed a balanced set of initiatives for information sharing emphasizing: (1) privacy and civil liberties protection, (2) discoverability, "... offering users the ability to 'discover' data that exists elsewhere without gaining access to the underlying information until the user requesting access is authorized and authenticated", (3) an "authorized use" standard for information sharing and (4) culture change from need-to-know to need-to-share.

Particularly regarding the second point on discoverability, I would (1) add the important motivational element of a publish/subscribe system--if agencies publish, they also get to subscribe ("share-to-play")--and (2) emphasize the importance of starting with basic, standardized information about events of importance to national security. Here is the relevant paragraph from the Markle Foundation report on discoverability (a toy code sketch follows it):

Discoverability is the first step in an effective system for information sharing, offering users the ability to “discover” data that exists elsewhere. Data is tagged at the point of collection with standardized information (e.g., who, what, where, when) and submitted to a central index. Just as a card catalogue in a library serves as a central index, directing users to relevant books—but doesn’t provide the book itself—these “data indices” point users to data holders and documents, depending on the search criteria used.
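
As an illustration of how such a card-catalogue index might work in code, here is a toy sketch. All the names and fields are my own illustrations, loosely following the who/what/where/when tagging described in the quote plus the "share-to-play" publish/subscribe idea above; nothing here comes from a Markle specification.

```python
from dataclasses import dataclass, field

@dataclass
class IndexEntry:
    who: str
    what: str
    where: str
    when: str
    holder: str       # agency holding the underlying record
    record_id: str    # pointer only; the document itself is never indexed

@dataclass
class DiscoveryIndex:
    entries: list = field(default_factory=list)
    subscribers: set = field(default_factory=set)

    def publish(self, agency, entry):
        """'Share-to-play': publishing is what earns the right to subscribe."""
        self.entries.append(entry)
        self.subscribers.add(agency)

    def search(self, agency, **criteria):
        """Return pointers only; access to content needs separate authorization."""
        if agency not in self.subscribers:
            raise PermissionError("agency must publish before it can subscribe")
        return [e for e in self.entries
                if all(getattr(e, k) == v for k, v in criteria.items())]
```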

Saturday, January 2, 2010

The Dot-com and Subprime Mortgage Bubbles: An Example of State-Space Forecasting Techniques

In a prior post, I described state-space models and why I think they are useful for time series forecasting. I'm using state-space models to generate forecasts for 2010 and plan to look back in 2011 at how well the models did. In the meantime, I can pretend I was trying this same exercise in 1990: estimate the model only on historical data up to 1990, use the 1990 model to forecast forward to the present, and see how well it would have anticipated two notorious crises that standard economic models seem unable to have predicted, the dot-com bubble of 1998-2000 and the subprime mortgage crisis of 2007-2010. For this exercise, I'll just look at the stock market, particularly the S&P 500 index, using data from Yahoo! Finance.
First, as a point of comparison, I estimated a state-space model for the entire period. The model uses the state of the US economy as an input variable and the volume and price data for the S&P 500 as the output variables. The fit of the model for S&P 500 volumes and prices in the figure above (dotted red line) is excellent.
Using a model estimated only up to 1990, the price fit is still good, but the volume forecast (upper panel above) starts to deteriorate after 2005. Notice for both series that the model tends to miss the turning points--a common feature of all forecasting models (the models don't know that the bubble has burst until after it has happened).

Although models appear unable to capture turning points, they can identify developing bubbles using feather forecasting. A feather forecast uses the model to predict a number of periods into the future (thirty-six in the figure above for monthly S&P 500 data), starting from each period in the sample. Up to 1995, the model forecasts basic linear growth for S&P 500 prices. The feather forecast, however, identifies the dot-com bubble as having started in 1995 rather than the conventional dating of 1998. From 1995 well into 2001, the feather forecast predicts a collapse in prices; indeed, it keeps warning of a collapse from 1995 all the way to 2007 (the last point where a 36-month feather forecast can be made). A conservative investor would not have ignored the developing bubble but would have stayed vigilant and liquid with any new investments made after 1995.
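
Mechanically, a feather forecast is just a loop: re-forecast a fixed horizon ahead from every in-sample month and overlay the trajectories. A minimal sketch, in which the random walk with drift from an earlier post stands in for my state-space model:

```python
import numpy as np

def feather_forecast(prices, a, horizon=36):
    """From each in-sample month t, project `horizon` months ahead.

    Each row is one 'feather' starting at month t; plotted together, the
    feathers fan out from the actual series and show whether the model
    sees current prices as sustainable. A random walk with drift stands
    in here for the state-space model used in the post.
    """
    steps = np.arange(1, horizon + 1)
    return np.array([p + a * steps for p in prices])  # shape (n_months, horizon)
```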

This then brings us back to a future forecast made with the model estimated up to 1990. The model is essentially predicting a sell-off for 2010 (increasing volumes and decreasing prices). It will be interesting to see what actually happens over the next year. The model cannot predict monthly rallies and sell-offs or the turning points for recovery, but studying the feather forecast suggests that the S&P 500 cannot realistically reach 1000 for a few more years. In any event, the usual disclaimers apply to this prediction.

Forecasting Disclaimer

At the beginning of 2010, I'm publishing a collection of economic and environmental forecasts based on state-space time series models. Given that economists were unable to forecast the financial crisis of 2007-2010, I should probably explain why I think my approach is better.

First, I'm not sure my approach is better. That's part of the purpose for doing the forecasts in early 2010. At the start of 2011 (or any point in between), I can look back at the forecasts, see what went wrong (if anything) and try to figure out the reasons for any failures.

The state-space approaches are attractive because they address a number of known problems with existing forecasting models. State-space models forecast based on the "unobserved" state of the system being studied. The state variables are the minimum collection of independent, orthogonal variables that connect the model's input variables to the output variables. Typically, economic forecasting models have a rigid set of structural equations that are used to predict each output variable. Real human systems, however, are too flexible and too emergent to be modeled with a rigid set of equations.
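
For readers new to the form: a general state-space model with input variables u(t), output variables y(t) and an unobserved state vector x(t) can be written, in the same notation used elsewhere on this blog for the random walk, as

x(t+1) = A x(t) + B u(t) + w(t)
y(t) = C x(t) + D u(t) + v(t)

where A, B, C and D are estimated matrices, w(t) is the state (process) error and v(t) is the output (measurement) error. This is the textbook formulation; the models used for the forecasts here follow this general pattern, though the exact specification may differ.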

A related attractive feature of state-space models is that there are two sources of error variation, one for the state vector and another for the output variables. Typical economic forecasting models have one source of error for each output equation, and typically the error distributions are modeled using a bell-shaped normal distribution. Nouriel Roubini, for one, has strongly criticized economic forecasting models for their unrealistic structure, estimation techniques and assumptions about error terms. Rather than assuming that the functional form of the error term is known, state-space models can easily use a non-parametric bootstrapping technique to estimate confidence intervals around forecasts and the Kalman filter to compute the system state while forecasting (after an approximate system state is used to estimate the model).
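
A minimal sketch of the bootstrap idea, again using the random-walk-with-drift stand-in from the earlier posts rather than the full state-space machinery (the Kalman recursion is omitted): resample the estimated shocks with replacement, simulate many future paths, and read the confidence band off the percentiles.

```python
import numpy as np

def bootstrap_band(last_price, a, resid, horizon=12, n_paths=1000, level=90):
    """Non-parametric forecast band: resample residuals instead of
    assuming a bell-shaped normal error distribution."""
    rng = np.random.default_rng(0)
    shocks = rng.choice(resid, size=(n_paths, horizon), replace=True)
    paths = last_price + np.cumsum(a + shocks, axis=1)
    tail = (100 - level) / 2
    lo, hi = np.percentile(paths, [tail, 100 - tail], axis=0)
    return lo, hi   # lower and upper band, one value per future month
```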

Another interesting aspect of state-space models is the use of computer simulation to understand the forecasts. For example, one can estimate a state-space model for a base period (say 1950-1990 for the US economy) and then use simulation techniques to calculate the unobserved system state (using the Kalman filter) and the time path of the output variables to any point in the future. I'll present an example of the analysis in a later post.

For all these reasons, I find state-space models attractive for forecasting. For some of the answers to how well the models actually perform, we'll have to wait until 2011. However, we can also analyze two notorious historical episodes (the dot-com bubble and the financial crisis of 2007-2010) using a model estimated on US data up to 1990. Could these two crises have been predicted with state-space models? That will be the subject of my next post.

Friday, January 1, 2010

A Pessimistic S&P 500 Forecast for 2010-2012

It's probably appropriate that my first forecast for 2010 should involve the stock market. The US has gone through two stock market bubbles (the dot-com bubble of 1998-2000 and the subprime mortgage crisis of 2007-2010, hopefully) in addition to a direct terrorist attack on New York City (September 11, 2001) that paralyzed the financial system. Before presenting a forecast for 2010, let's look at how well my time series models dealt with this period of stock market instability.
The figure above displays a feather forecast of volume (the upper panel) and price for the S&P 500 (data from Yahoo! Finance). Although the models track the stock market data very well (displayed here), the feather forecasts (in this case, starting from every month in the sample and predicting forward three years) show that the S&P 500 was over-valued during the dot-com bubble, under-valued around 9/11 and over-valued again during the subprime mortgage crisis. In other words, using the models would have suggested a conservative investing strategy and warned of developing stock market bubbles.

The S&P 500 model is driven by the state of the US economy. In financial terminology, the market fundamentals are determined by the state of the economy. In systems terminology, the stock market is a hierarchical subsystem within the US economy. The three forecasts (dotted red, blue and green lines) in the graphic above suggest that the future will depend on the state of the US economy: volumes will be flat while prices will either rally or remain flat. Not really that surprising a range of forecasts.

When I use an actual forecast for the state of the US economy through 2012 as input for the model, the forecast is pessimistic: declining prices and increasing volumes, in other words, a sell-off. You can compare my forecast to the one provided by the Financial Forecast Center here. Their forecast for prices also points downward (out to July 2010).

Of course, it's also possible to create an optimistic forecast for the S&P 500 based on a more optimistic forecast for the US economy, and the usual disclaimers apply. We're at a point where uncertainty rules.

Forecasts for the New Year

In prior posts, I have made forecasts for the US economy and the World system. For the US economy, I have developed forecasts for nuclear power (optimistic) and for the airline industry (pessimistic). For the World system, I have forecasted relatively poor performance for the European Cap-and-Trade (emission trading) system. The forecasts were based on outputs from my time series models of the US economy and the World system.

My New Year's resolution is to make a series of forecasts, at least for the next few years and possibly for the entire decade. My intention will be to review these forecasts in January of 2011 to see how well the models performed.