Thursday, December 31, 2009

Chertoff on Continuing Information Sharing Problems

This morning, on CNN, Michael Chertoff (the second secretary of the US Department of Homeland Security) gave two reasons why the US was unable to identify the Undie Bomber: (1) the European Union (EU) blocked US access to its visa database, and (2) the airlines have been reluctant to upload all their passenger information to the federal government.

If the US, the EU and the Canadians built a publish/subscribe index system, there would be no need to either access or share databases. Here's how it would work: the Europeans would publish only the identifying information for individuals who, for example, were denied a visa, and they would publish the event when the denial occurred.
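
To make the publishing half concrete, here is a minimal sketch under my own assumptions; the record layout, field names, and in-memory "shared index" below are hypothetical illustrations, not an actual EU or DHS system. The EU posts only an identity key, the event, and a pointer back to its own holdings:

```python
# Minimal sketch of the "publish" side of a publish/subscribe index.
# Everything here (record layout, field names, the in-memory index) is
# hypothetical and illustrative only -- no source database is copied.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class IndexRecord:
    subject_id: str   # identifying information only (e.g., a passport number)
    event: str        # what happened (e.g., "VISA_DENIED")
    source: str       # which agency holds the underlying record
    pointer: str      # how to request the full record from that agency
    published: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

shared_index: list[IndexRecord] = []   # pointers only, never the visa file itself

def publish(record: IndexRecord) -> None:
    """Post the event to the shared index; the EU's database stays in the EU."""
    shared_index.append(record)

publish(IndexRecord(subject_id="passport:XY1234567",
                    event="VISA_DENIED",
                    source="EU-VIS",
                    pointer="eu-vis://case/0001"))
```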

This is very different from what Mr. Chertoff wants when he says "We have to have access to these databases." To me this means someone in the National Counterterrorism Center in the US would be logging in to the European visa database. This is a bad idea, which the EU rightly rejected, for a number of reasons: (1) it's too slow and relies on human intervention, which introduces two points of potential failure, and (2) it presents a security risk to the EU, since someone has to manage the accounts and identification methods of those in the US who are allowed access. Legitimate but dormant accounts are one method hackers use to get into a system.

The idea of asking the airlines to upload their entire passenger database to the federal government is also a bad idea: (1) it's too slow. If the upload happens overnight, or even two hours before flight departure, the data are always a few hours out of date; someone purchasing a ticket with cash today won't be part of the last data load. (2) It's unnecessary. What is the federal government going to do with all the mundane administrative information contained in a reservation system?

The NY Times today published a description of how slow, manual and prone to human failure the current system is, and how it failed in the case of the Undie Bomber. This is the mentality in the US federal government: give us all the data and we'll have someone sit at a computer and look through it. Information sharing becomes humans sending information back and forth to the authorities. Any approach that is not electronically based, that does not focus on indexing objects and events of importance to US security in real time, that does not allow all the participants to publish and subscribe, and that does not decentralize decision making will create more security problems than it solves.

Tuesday, December 29, 2009

Information Sharing and Counterterrorism

The recent attempt by a Nigerian man, evidently working for the Yemeni branch of Al Qaeda, to set off an underwear bomb (he is being called the Undie Bomber) on a flight from Amsterdam to Detroit has created new concerns about US security. The typical fire drills after a terrorist threat (increased TSA shakedowns on domestic flights, new restrictions on luggage, removal of shoes, etc.) are similar to the ones that resulted from Richard Reid's attempt to set off a shoe bomb on an American Airlines flight on December 22, 2001. Hopefully, future travelers will not have to take off their underwear to get through airport security.

The interesting issue for me involves the continuing failure of information sharing. In response to 9/11, the US federal government has "squandered tens of millions of dollars on faulty technology, like high-tech 'puffer' machines that repeatedly broke down and flunked the most basic tests ... [but also] ... the government has yet to fully deploy a sophisticated method for matching passenger names with terrorist watch lists." The alleged Nigerian terrorist was flying from Lagos, Nigeria through Amsterdam to Detroit without luggage, possibly on a one-way ticket paid for in cash! This information alone should have raised many, many RED FLAGS, but it didn't. The dots still are not being connected.
To address the continuing failure of information sharing, I have a straightforward solution. It's the same solution I presented to the US Department of Homeland Security (DHS) in 2004 [here, here and here] and is displayed in the graphic above: create an XML-based indexing system that maintains pointers to sources that have information about objects of interest (e.g., people on terrorism watch lists). It is essentially a publish/subscribe system: agencies can electronically query the system, match the objects indexed against objects about which they have information, and return new index records pointing to their own holdings. Events would trigger new publish/subscribe transactions electronically. TSA screeners, for example, would scan passenger tickets and the system would be queried electronically. Criteria for secondary screening of passengers would be flexible, could be linked to the national threat level, and could be changed instantly. Security for the underlying information would remain with the agency.
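
To make the screening step concrete, here is a minimal sketch under my own assumptions; the record layout, threat levels, and flagging rules below are hypothetical illustrations, not TSA's actual criteria:

```python
# Sketch of the "query/subscribe" side: a checkpoint queries the index
# electronically and applies a rule tied to the current threat level.
# The record layout, threat levels, and flag sets are hypothetical.
FLAG_EVENTS = {
    "ELEVATED": {"WATCHLIST_MATCH"},
    "HIGH": {"WATCHLIST_MATCH", "VISA_DENIED", "CASH_ONE_WAY_NO_LUGGAGE"},
}

def query_index(index: list[dict], subject_id: str) -> list[dict]:
    """Return the pointer records for this traveler; no source data is copied."""
    return [rec for rec in index if rec["subject_id"] == subject_id]

def needs_secondary_screening(index: list[dict], subject_id: str, threat_level: str) -> bool:
    flagged = FLAG_EVENTS.get(threat_level, set())
    return any(rec["event"] in flagged for rec in query_index(index, subject_id))

index = [{"subject_id": "passport:XY1234567", "event": "VISA_DENIED",
          "source": "EU-VIS", "pointer": "eu-vis://case/0001"}]

print(needs_secondary_screening(index, "passport:XY1234567", "HIGH"))      # True
print(needs_secondary_screening(index, "passport:XY1234567", "ELEVATED"))  # False
```

Raising or lowering the threat level then changes the screening criteria instantly, without anyone logging in to a foreign database.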

In 2004, DHS didn't seem very interested in my idea. They wanted all potential terrorist information stored in a very flexible, XML-based fusion center that would support any kind of direct querying. If that's what DHS is still insisting on, it won't happen within the bureaucracy of the US federal government. Agencies guard their data too carefully, and the central repository could not really guarantee the security of the mega-database (section 13.3 of the 9/11 Commission Report, UNITY OF EFFORT IN INFORMATION SHARING, is interesting on this point). In any event, it's not clear to me how the existing systems (TSA's Secure Flight and the National Counterterrorism Center's TIDE, which were based on the Northwest Airlines CAPS program) work together or are linked to, for example, the State Department's visa database. From the descriptions, they seem too centralized rather than distributed (decisions should be pushed as close as possible to the front lines) and too reliant on human querying.

My system isn't perfect. Agencies have to be willing, or at least compelled, to query and publish to the index. The provision that they retain their own information should help with cooperation. Privacy advocates have questioned whether the index itself amounts to a fusion center and whether adequate safeguards are in place to accurately identify people. Since the system would contain only pointers, not original data, it is not a fusion center. Accurate personal identification remains a problem. A REAL ID system with stronger privacy protections than currently proposed could help reduce the identification problem (the current government ID requirement for air travel is weak). Certainly, the passport ID system could also be strengthened.

This is not to say that there is a simple technical solution to the information sharing problem. There are plenty of other factors related to the growth of the US economy, the growth of the US airline industry and the growth of the US federal government. I'll talk about these issues in other posts.

Monday, December 28, 2009

Cap-and-Trade Market Failure

The NY Times recently editorialized on the collapse in the price of carbon on the European Climate Exchange (the ECX system of emissions trading) after the Copenhagen Conference. Although the price of carbon emissions has collapsed to $18.20 per ton, the Times is optimistic: "Fortunately, there is good reason to believe the price of European emission permits will rise over time. Their price tends to fall when the price of oil or the economy slows---dynamics that reduce energy usage and naturally cut emissions of carbon. As the world fell even deeper into recession last year, the price of permits tumbled from a peak of around $45 per ton in July 2008."
The graphic above shows the three relevant time series and a forecast for each. The upper panel shows total volume on the ECX and the second panel shows the price of futures contracts for carbon emissions in December 2010 (data from the ECX). My business-as-usual (BAU) forecast is for a continued price decline and a rebound next year. The bottom panel shows world oil prices (data from the US EIA). In the model, as in the Times analysis, futures prices of carbon emissions and world oil prices are intimately related. And my BAU forecast is for increasing oil prices (this shouldn't be a surprise).
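
For readers who want to see the mechanics, here is a minimal sketch of a business-as-usual projection that links carbon futures prices to oil prices with ordinary least squares. The numbers are illustrative placeholders, not the ECX or EIA series behind the graphic above, and the actual model I use is more elaborate.

```python
# Minimal BAU sketch: regress carbon futures prices on world oil prices,
# then project carbon prices under an assumed oil-price path.
# All values are illustrative placeholders, not the actual ECX/EIA data.
import numpy as np
import statsmodels.api as sm

oil    = np.array([60.0, 75.0, 95.0, 130.0, 45.0, 70.0])   # $/barrel (placeholder)
carbon = np.array([20.0, 25.0, 32.0,  45.0, 14.0, 18.0])   # $/ton (placeholder)

fit = sm.OLS(carbon, sm.add_constant(oil)).fit()

# BAU assumption: oil prices keep rising, so carbon futures recover with them.
oil_path = np.array([80.0, 90.0, 100.0])
print(fit.params)
print(fit.predict(sm.add_constant(oil_path)))
```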

The other part of the Times analysis, however, is equally important. Everything depends on what happens in the world economy.

If Europe manages to disconnect from the world economy (no carbon leakage) and fix CO2 emissions, the forecast above would suggest the ECX would be on the path to developing an effective cap-and-trade system.

However, if Europe continues to be linked to the world system (more likely?), there will be continued instability in emissions with no evident cap. If that happens, the cap-and-trade system would probably collapse.

In 2008, the GAO did an analysis of the ECX and concluded it wasn't working as anticipated; the report made a number of other interesting points I'll discuss in a future posting. Through all this hand-wringing, however, it's important to remember that the best, proven, and least likely way to limit carbon emissions is to control the growth of the world system (best in that controversial cap-and-trade markets are not needed; proven in that we've just seen slowing growth of the world system reduce emissions; and least likely given the political reality of growth mania).

Thursday, December 24, 2009

A Future for Nuclear Power?

The New York Times is reporting that a US DOE loan program combined with cap-and-trade legislation may give new life to the moribund US nuclear power industry. However, the future of nuclear power is not very bright given problems that include "high relative costs; perceived adverse safety; environmental and health effects; potential security risks stemming from proliferation; and unresolved challenges in long-term management of nuclear wastes." A study in the Bulletin of the Atomic Scientists is equally pessimistic given problems with the existing fleet of nuclear power plants.
My own forecasts (using a three-factor index model of the US economy) are more optimistic. Net generating capacity (the top panel above) peaks after 2040 at about a 30% share (lower panel) of total electricity generation.
My forecast for net generation in billion kilowatt hours (the dotted line above) should be contrasted with the very pessimistic forecast from the US EIA (the solid line), which is based on an analysis of the plans and goals of the nuclear power industry. Time will tell; no one knows the future. The EIA forecast seems the more reasonable of the two.
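
For reference, the "share" in the lower panel is just nuclear net generation divided by total net generation. A trivial sketch with round, approximate inputs (not the model's or the EIA's actual numbers):

```python
# The nuclear share of electricity generation, as plotted in the lower panel:
# nuclear net generation / total net generation. The inputs below are round,
# approximate values for illustration, not the forecast's actual numbers.
def nuclear_share(nuclear_bkwh: float, total_bkwh: float) -> float:
    return nuclear_bkwh / total_bkwh

# Roughly 800 billion kWh of nuclear against roughly 4,000 billion kWh total
# is about a 20% share; a 30% share requires nuclear to outgrow the total.
print(f"{nuclear_share(800.0, 4000.0):.0%}")
```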

Tuesday, December 22, 2009

Airline Re-regulation

The process of airline re-regulation is about to start with the DOT's decision to fine airlines that keep passengers waiting in planes for more than three hours before takeoff. Airline deregulation started in 1978, and there are two views about how successful it has been. The first view, advanced by neo-liberal and neo-conservative economists, is that airline deregulation has had "... overwhelmingly positive results." Any small problems (poor service, bankruptcy, safety violations, monopoly practices, overloaded traffic control systems, NIMBY constraints on more airport construction, lack of profitability, luggage fees, TSA shake-downs, intransigent unions, terrorist attacks, etc.) would be solved by more free-market fundamentalism.

The other view is that airline deregulation has been an unmitigated disaster--for all the same reasons. Environmentalists would also add that air travel has a very large carbon footprint (even higher than automobile travel). In fact, airlines will be the first industry (even before coal-burning power utilities) to face cap-and-trade requirements in the European Union.

Which side of the argument you favor depends to some extent on your view of the future. If the future holds unrestrained exponential growth in air travel with continually decreasing prices, your projections about the future are probably based on assumptions about how increasingly free markets generate unlimited economic growth. If the future doesn't look that positive, you probably favor some kind of airline re-regulation. You might also favor a broader perspective that looks at a range of transportation alternatives (you probably recall that the Penn Central railroad failed a few years before airlines were deregulated and marked the end of long-haul private-sector passenger service in the US).
So, how well does the airline free market work, and what are the predictions for the future? The figure above (data from the US Bureau of Transportation Statistics) shows a forecast for the airline market, with airfare prices on the bottom graph and airline operations on the upper graph. Prior to 2009, there was a huge expansion of airline operations, peaking in 2008, with very modest price movements (actually, prices have been highly variable: increasing to 2001, decreasing to 2005, increasing again to 2008, and then collapsing during the subprime mortgage crisis; the time plot just looks compressed because of the large forecasted price increase).

The forecast suggests that we haven't seen anything yet: prices are going to skyrocket and operations are going to stagnate. If this happens, travelers are going to divert to automobiles (my wife and I are already doing this for long trips in the US). We would use long-haul passenger train service (as we do in Europe) but that was dismantled in the late 1970's. If only the system had not been dismantled in the brave new world of economic deregulation. Do you think we'll all ever have personal jet packs?

P.S. My forecasting model finds that there is very little interaction between quantities and prices in the airline market. That shouldn't be surprising since operations are flow-constrained. Market fundamentalism won't remove the current network and environmental constraints so the neo-liberal dreams of unconstrained growth in air travel are just dreams.
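
One way to check for that kind of interaction is a Granger-causality test on the two series. The sketch below uses random placeholder data rather than the BTS series, and it only illustrates the test, not the model behind the forecast above.

```python
# Illustrative check of price/quantity interaction: do lagged fares help
# predict operations? Random placeholder series stand in for the BTS data.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
operations = rng.normal(size=120).cumsum()   # stand-in for monthly operations
fares      = rng.normal(size=120).cumsum()   # stand-in for average fares

# Column order matters: the test asks whether the second column (fares)
# helps predict the first (operations). Differencing keeps the series stationary.
data = np.column_stack([np.diff(operations), np.diff(fares)])
results = grangercausalitytests(data, maxlag=4)
```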

Friday, December 18, 2009

Controlling Drug Promotion

The US and New Zealand are the only countries that allow direct-to-consumer (DTC) advertising of pharmaceuticals. The policy debate in the US involves whether or not DTC advertising should be controlled. On Wednesday, I presented a paper on the topic in the Health Care Track at the Winter Simulation Conference. My response to the policy debate: controlling DTC advertising is unlikely to have much impact and misses a better approach to controlling overall drug promotion.
The time plot above (from the Donohue et al. 2007 data set) provides the simplest summary of my findings. DTC advertising has increased modestly since 1997, when the FDA modified its side-effect disclosure rules to allow television advertising. The real action, as can be seen from the time plots, involves free samples and promotional detailing (pharmaceutical sales agents' direct contact with physicians). In fact, starting in 2004, there has been a substitution between free samples and promotional detailing. Detailing is expensive, and physicians tend to discount claims of sales agents. The marginal cost of handing out free samples is very small, and free samples have a powerful effect on future prescribing. Controlling free samples and detailing, rather than the much more visible DTC advertising, provides the most direct path to controlling the effects of drug promotion on the sales of patent medicines.
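
A quick way to see that kind of substitution in any such data set is to correlate the year-over-year changes in the two series. The spending figures below are placeholders, not the Donohue et al. numbers.

```python
# If free samples and detailing are substitutes, their year-over-year changes
# should be negatively correlated. The values below are placeholders, not the
# Donohue et al. (2007) spending figures.
import numpy as np

samples   = np.array([6.0, 6.6, 7.2, 7.9, 9.0, 10.5, 11.5, 13.5])  # $ billions (placeholder)
detailing = np.array([4.8, 5.3, 5.6, 5.9, 6.2,  6.0,  5.8,  5.6])  # $ billions (placeholder)

print(np.corrcoef(np.diff(samples), np.diff(detailing))[0, 1])  # negative => substitution
```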

Monday, December 14, 2009

Improving Health Care Through Computer Simulation

In the morning session at the WSC Conference, Sally Brailsford from the University of Southampton, UK, presented an interesting statistic: under 10% of all healthcare simulation studies reported in the massive academic literature were actually implemented! The politics and conflicting stakeholder demands in healthcare present the major hurdles to implementation. This should be contrasted with manufacturing, where discrete event simulation has had a major impact on increasing the efficiency of industrial processes. In his presentation, Tillal Eldabi noted that we are dealing with the "wicked nature of healthcare problems." A wicked problem indeed!

Cracks in Climate Policy

In today's keynote address at the Winter Simulation Conference, DOE Under Secretary for Science Ray Orbach highlighted three major challenges for computer simulation: modeling the cracks that form in the containment vessels of nuclear reactors, taking a systems approach to CO2 generation and absorption (a fixable flaw in the Kyoto Protocol and the U.S. cap-and-trade plans), and including human behavior in General Circulation Models (GCMs). Each of these problems will require immense amounts of computing power, and the U.S. Department of Energy has that computing power available through its Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program.

I'll pick up these topics in later posts. Back to the conference...

Wednesday, December 9, 2009

Squeeze the Public Option Trigger

Ben Smith reports that, in response to the Senate shift away from a public option, an insurance industry insider says "We WIN. Administered by private insurance companies. No government competitor."

There are lots of reasons to think a public option trigger won't work and will face opposition in the Senate. A well-crafted trigger that went into effect when measurable conditions weren't met (a decreasing percentage without health insurance, reasonable rates, elimination of regional monopolies, etc.), and that was supported by a strong planning effort over the next few years, could be effective. Does anyone think our political process can produce such a result?

In the end, it's time for our political leaders to prove their stuff. They were elected to govern and now they need to deliver. If the only way forward is a public option or if it's expansion of Medicare or if it's expansion of the Federal Employees Health Benefits Plan (FEHB), now is the time to see whether our publicly elected officials can make policy that benefits the American people.

Tuesday, December 8, 2009

Stabilizing Emissions with Policy Wedges

Robert Socolow and Stephen Pacala have a plan to keep carbon emissions in check (as discussed in today's Global Warming Debate). The plan is based on a divide-and-conquer strategy: divide the total emission reductions needed into manageable pieces ("wedges") and propose existing technologies to tackle each wedge.

Total carbon emissions are forecast to be 14 billion tons a year by 2056. To get back to the 2006 level of 7 billion tons per year, you need seven billion-ton-a-year wedges over the next fifty years. Pacala and Socolow actually propose 15 wedges covering end-user efficiency and conservation, power generation, carbon capture and storage (CCS), alternative energy sources, and agriculture and forestry--an ample menu of existing technologies to choose from. And, after 2056, we can implement another three wedges to get us back to 4 billion tons a year, which is around the amount of carbon that the Earth's existing systems can effectively absorb.
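
Spelled out as arithmetic (my reading of the numbers above, in GtC per year):

```latex
\underbrace{14}_{\text{projected BAU in 2056}} \;-\; \underbrace{7 \times 1}_{\text{seven wedges}} \;=\; 7,
\qquad\text{and after 2056,}\qquad
7 \;-\; \underbrace{3 \times 1}_{\text{three more wedges}} \;=\; 4 .
```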

Essentially, Pacala and Socolow take the Emissions Equation and focus on carbon intensity, energy intensity and population growth (which could be one wedge if reduced) while leaving output per capita (economic growth) alone. This is a very attractive formulation (it's even been applied to controlling the US health care system). Electric cars, wind turbines, solar panels, new CCS coal-fired power plants, super-insulated homes, etc. all create economic growth and improve our standard of living.

There is even a Stabilization Wedge Game that can be used as a teaching tool. Actually, the game has drawn more criticism than the scientific articles: the costs are underestimated, the implementation time is underestimated and the demand side (population growth and economic growth) is ignored in favor of technological "fixes".

My problem with Stabilization Wedges is that they ignore the systemic aspects of the environment. Carbon is not the only problem facing the world system. Demand has increased our ecological footprint beyond sustainable levels and there is no quick technical fix for creating more ecological capacity than our current Earth system can provide. We will need both supply and demand solutions.

Saturday, December 5, 2009

Mammography and the PSA Test

One criticism of both mammography and the PSA test (for prostate cancer) has been that both tests produce false positives. What should be remembered is that a positive result from either mammography or the PSA test does not lead immediately to radiation, chemotherapy or surgery.

For mammography, the American College of Radiology has a uniform way for radiologists to describe mammogram findings and suggest a follow-up plan.
Notice that levels 4-6 require a biopsy after a positive test. The same is true for the PSA test. Each is part of a process and the presence of cancer is still established with a biopsy even if the initial screening test is positive.

Tuesday, December 1, 2009

Carbon Accounting and Policy in Copenhagen

The challenge for policy makers at the upcoming (Dec 7 - Dec 18) UN Climate Change Conference in Copenhagen can be seen from some simple carbon accounting and one equation (as discussed today in the Global Warming Debate).


The figure above (from IPCC 2007 WG1 Ch. 7, Fig. 7.3) displays the estimated carbon cycle for the 1990's. What's important to notice are the equilibrium flows (the up- and down-arrows). The black arrows indicate pre-industrial "natural" fluxes and the red arrows indicate "anthropogenic" (man-made) fluxes. The question here is how much carbon the biosphere and the oceans will absorb relative to how much is emitted. If you add together all the net fluxes for Weathering, Respiration, Land, and Oceans, you get 4.4 GtC/yr of carbon absorbed by the biosphere and the oceans. Set against that, notice the 6.4 GtC/yr unbalanced emission from fossil fuels.
To put it another way, the biosphere and the oceans are capable of absorbing about 4.4 GtC/yr (+/- 20%). The figure above shows actual world carbon emissions from 1950 to 2010. In 2008, we emitted 8.59 GtC, about twice the absorptive capacity of the biosphere and the oceans. In other words, sometime in the 1970's or early 1980's the world's carbon cycle went out of equilibrium. Ultimately, that equilibrium has to be restored.
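
Putting the two numbers together (a back-of-the-envelope step using only the figures quoted above):

```latex
\underbrace{8.59}_{\text{2008 emissions}} \;-\; \underbrace{4.4}_{\text{land + ocean uptake}}
\;\approx\; 4.2 \ \text{GtC/yr accumulating in the atmosphere}
```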

The "Emissions Equation" shows our policy choices:

CO2 = (N) x (Q/N) x (E/Q) x (CO2/E)

(CO2 Emissions) = (Population) x (Output per capita) x (Energy Intensity) x (Carbon Intensity)

where Energy Intensity is energy used per unit of output (E/Q) and Carbon Intensity is CO2 emitted per unit of energy (CO2/E).

If we want to control CO2 emissions, we can (1) reduce population growth, (2) reduce output per capita, (3) reduce energy intensity or (4) reduce carbon intensity. Since reducing population growth is off the table (only China has tried population control) and since reducing output per capita (economic growth) is also off the table, we are left with the technical challenges of reducing energy intensity (heavily insulated buildings, electric cars, mass transit, etc.) and reducing carbon intensity with renewable energy sources (solar, wind, geothermal, etc.).
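
As a worked illustration of the levers (a minimal sketch with made-up, unit-free inputs, not official statistics), cutting both energy intensity and carbon intensity by 20% while leaving population and output per capita alone cuts emissions by about a third:

```python
# The Emissions Equation as code: CO2 = N * (Q/N) * (E/Q) * (CO2/E).
# All inputs are made-up illustrative values, not official statistics.
def co2_emissions(population, output_per_capita, energy_intensity, carbon_intensity):
    """Population * output per person * energy per unit output * CO2 per unit energy."""
    return population * output_per_capita * energy_intensity * carbon_intensity

baseline = co2_emissions(6.7e9, 9_000, 7.5, 6.0e-8)

# Hold population and output per capita fixed; cut the two technical factors by 20% each.
policy = co2_emissions(6.7e9, 9_000, 7.5 * 0.8, 6.0e-8 * 0.8)
print(policy / baseline)   # 0.64 -- a 36% reduction in emissions
```
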
If only it were as "simple" as totally changing out our existing energy systems. The figure above shows a plot of the world's Ecological Footprint (EF), a measure of human demand on all the Earth's ecosystems. It is calculated as the ratio of the number of Earths needed to support human demand to the number of Earths actually available (one). Interestingly enough, we exceeded the Earth's ability to meet human demands at about the same time in the 1970's that we exceeded the ability of the biosphere and the oceans to absorb our carbon emissions.

There is no immediate, quick-fix technical solution (heavily insulated buildings, electric cars, etc.) for the EF problem. However, both the EF and the carbon cycle have to be brought back into equilibrium. To be successful in Copenhagen, policy makers will have to find a way to take us back to the 1970's in terms of our carbon emissions and our demands on the Earth's ecosystems. With population growth and economic growth taken off the table, they don't have a chance.