A reader recently emailed me a link to this post on The Malcontent asking me to comment on the appearance of Marc Siegel on The Daily Show. Siegel is the author of a book that claims we're systematically over-estimating the threat of terrorism. Well, that depends, but it certainly is possible to over-estimate a threat. And it's somewhat ironic that the MoveOn folks are currently suggesting that we underestimated the threat to New Orleans, now that the cat is out of the bag.
A while ago an opinion research team conducted a survey in small-town Iowa, asking the question: "What security issue is the most urgent?" (This was long before 9-11.) Of the options, which included terrorism, bad weather, a crop blight, and a disease attacking livestock, the one most frequently mentioned was "crime in the streets." This was rural Iowa, remember. So the interviewer asked a follow-up: "What crime, in what streets?" It turns out they were afraid of crime in the streets of Chicago, because most of them read Chicago papers like The Trib or watched Chicago TV news. They were definitely over-estimating that threat, and probably under-estimating the threats to their livelihood from crop blight and livestock disease.
But the point here, upon which Dr. Siegel may have only a weak grasp, has to do with living under conditions of uncertainty. The issue isn't whether we're over-estimating the first-order threat to ourselves as individuals. That may or may not be true, but about the most one can say is that we might be over-estimating it. We don't know, and we may never be able to know until the odds become unity, so our challenge is to make a rational tradeoff between the consequences of over- and under-estimation. The methodological issue is that with terrorism it's better to over-estimate than to under-estimate the threat, because it's a Beta situation involving rational concern about Type II error (the false negative: failing to detect a real threat). The point, in a Beta situation, is to minimize the odds of a false sense of security. It's precisely analogous to a concern about false acquittal. Under conditions of high uncertainty that, almost by definition, means there will be more false alarms, or in the criminal justice analogy, more false convictions. It's just the nature of the beast, and we have to learn to live with it.
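To make that tradeoff concrete, here's a minimal numeric sketch. All the probabilities and costs below are invented for illustration, and the signal model is deliberately crude; the only point is to show why, when a miss costs vastly more than a false alarm, the cost-minimizing policy tolerates many false alarms:

```python
# Minimal sketch of the asymmetric-error tradeoff described above.
# All numbers are illustrative assumptions, not real estimates.

def expected_cost(threshold, p_threat, cost_false_alarm, cost_missed_threat):
    """Expected cost of sounding the alarm whenever a uniform [0, 1]
    warning signal exceeds `threshold` (a deliberately crude model)."""
    # False alarm (Type I error): no threat, but signal > threshold.
    p_false_alarm = (1 - p_threat) * (1 - threshold)
    # Miss (Type II error): real threat, but signal <= threshold.
    p_miss = p_threat * threshold
    return p_false_alarm * cost_false_alarm + p_miss * cost_missed_threat

thresholds = [i / 100 for i in range(101)]

# Beta situation: a miss costs 1000x a false alarm, so the cheapest
# policy drives the alarm threshold to the floor -- maximum false alarms.
best_asymmetric = min(thresholds,
                      key=lambda t: expected_cost(t, 0.01, 1.0, 1000.0))
print(best_asymmetric)  # prints 0.0

# With symmetric costs and a rare threat, the cheapest policy is the
# opposite: never sound the alarm at all.
best_symmetric = min(thresholds,
                     key=lambda t: expected_cost(t, 0.01, 1.0, 1.0))
print(best_symmetric)  # prints 1.0
```

The contrast between the two optima is the whole argument: the false-alarm rate isn't a bug of the policy, it falls directly out of the cost asymmetry.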
There are also two components to a threat assessment: the odds that it will happen and the consequences if it does. Ideally we want low odds of occurrence and minimal consequences. Some things have low odds of occurrence but catastrophic consequences, like the Earth being hit by a planet-killing asteroid or the eruption of a caldera volcano. You multiply the odds of occurrence by some standardized estimate of the first-order consequences to obtain the threat. There are also primary and secondary threats, and the secondary threats have to be considered and summed. If you don't know the odds of occurrence, you produce a range of threat assessments and decide what you can live with. The first-order threat from a caldera volcano eruption would be to the people within, say, a 500-mile radius. The second-order threat might be several orders of magnitude greater, impacting the entire globe, but it may also be harder to predict because the number of variables, as well as their uncertainty, is greater. Right now just about everyone thinks we ought to have planned for the possibility of a Cat 5 hurricane hitting New Orleans, but the ex ante odds of that occurrence were actually pretty low: according to the Corps of Engineers, less than 1 in 200 in any given year. Still, it's a Beta situation if the consequences of a false sense of security are greater than the consequences of a false sense of alarm. See?
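The arithmetic above can be sketched in a few lines. The only figure taken from the text is the Corps of Engineers' under-1-in-200-per-year estimate; the lower bound of the range and the consequence values are arbitrary placeholders in made-up units:

```python
# Threat assessment as described above: odds of occurrence times the
# summed first- and second-order consequences, reported as a range
# when the odds themselves are uncertain.

def threat(odds, consequences):
    """Expected loss: probability of occurrence times the sum of
    first- and second-order consequence estimates."""
    return odds * sum(consequences)

# Hypothetical Cat-5-hits-New-Orleans assessment, per year.
# Upper bound from the text (Corps of Engineers: under 1 in 200);
# lower bound and consequence values are invented for illustration.
odds_low, odds_high = 1 / 1000, 1 / 200
consequences = [100.0, 40.0]   # first-order, second-order (arbitrary units)

# With uncertain odds, produce a range and decide what you can live with.
low = threat(odds_low, consequences)
high = threat(odds_high, consequences)
print(round(low, 4), round(high, 4))
```

Note that the decision doesn't hinge on the range alone: a low-odds, high-consequence threat can still dominate a high-odds, low-consequence one once the multiplication is done, which is exactly the asteroid/caldera point.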
Finally, there are the game-theoretical considerations, which go beyond the damage assessments of the first- and second-order consequences. These apply to so-called "natural disasters" to some extent, because human reactions are involved (looting, for example), but they're far more important to man-made disasters. For instance, one consequence of a failure to retaliate in the event of a nuclear attack would be that the enemy might assume a low cost to themselves of follow-up attacks. Or, in Wretchard's Third Conjecture, the consequence of a tit-for-tat slow escalation after a terrorist WMD attack is that both sides would suffer catastrophic losses in a drawn-out doomsday scenario. In that case it becomes less costly to initiate an all-out nuclear strike on all possible enemy targets, rather than adopting a tit-for-tat approach. Ironically, you save more lives by being merciless. The consequences are far greater than a few piddling hurricanes, probably exceeding the consequences of a caldera eruption.
And that's the real nature of the threat we face, almost universally misunderstood and discounted. That's the ultimate "sum of all fears," and there's just no rational way to minimize it except to do everything in our power to ensure that things don't go that far. We don't have the option of reducing the catastrophic consequences, so the only thing we can do is reduce the odds of occurrence. Because if we don't, our options just run out. It is possible for a situation to arise in which the only defense against genocide is genocide.
What we fear is a real possibility, so it's not irrational to bring as much of it under our direct and indirect control as possible. And just about the most rational and far-sighted strategy we can adopt now to avoid that awful choice later is to sow the seeds of liberal democracy in place of authoritarianism and tyranny. There is naturally some uncertainty about whether the strategy is necessary, and there's even more uncertainty about whether it will work. But make no mistake, we have the best of all possible strategies, and the only question left is whether we have the best of all possible implementations.
Seriously, that's the whole of it.
Posted by Demosophist at September 8, 2005 12:46 AM | TrackBack