Quickly Noted: Peter Sandman Spits Knowledge — Again
To further reduce the backlog, here’s a post from three weeks ago that I wanted to make sure you all saw.
Dr. Sandman is one of my favorites in the communications field. Since 9/11, his crisis communication theories have shaped much of what folks in this field have done. When he speaks, you’ll be better off for sitting up and listening — even if you don’t necessarily agree.
In late May, while Deepwater Horizon was still on the front page, Dr. Sandman was interviewed by the blogger DemfromCT on the subject of the spill response. The whole thing is good, so go read it here, but if you can’t get there because of work filters (the interview is hosted on an explicitly political website, DailyKos), here are some choice cuts:
In a crisis it is extremely damaging to come back later and say “it’s worse than we thought.” Far better to come back later and say “it’s not as bad as we feared.” So estimates and predictions about uncertain situations should be as alarmist as they need to be in order to virtually guarantee that “worse than we thought” won’t be tomorrow’s story. From the outset, BP should have been saying something like this: “We’re hearing estimates as high as 25,000 barrels a day, and some even higher. We’re also hearing lower estimates, some as low as 5,000 barrels a day. We hope the lower estimates turn out right, but we’re not counting on that. For now, we’re thinking of this as a 25,000-barrel-a-day disaster.”
In short, Cassandra is a better guide to crisis communication than Pollyanna. Yes, you’re hoping for the best, and working to avert the worst – and you should say so. But you’re also preparing for the worst, and urging the rest of us to prepare for the worst: emotionally as well as practically. Say that too.
And my favorite part:
There is one significant downside to this recommended strategy. If the situation turns out a lot less serious than your worst case prediction, perhaps even less serious than your likeliest prediction, you’re due for some criticism for having “hyped the oil spill.” (Or the pandemic http://www.psandman.com/… . Or the hurricane http://www.nola.com/… .) People will blame you for all the worry you put them through, all the tourists you scared away unnecessarily, all the expense you wasted on preparedness that wasn’t needed. The risk perception literature calls this “hindsight bias.” http://www.forecasters.org/… After the results are known, people tend to imagine you should have been able to guess right. So there’s a fair amount of anger in the world today at public health officials for having taken the H1N1 pandemic more seriously than it ended up deserving. In the U.K. a few months ago, some of the same newspapers were simultaneously criticizing http://news.bbc.co.uk/… the U.K. government for buying too much pandemic vaccine and too little road salt and grit. Editorialists felt that officials should have guessed correctly that the pandemic would turn out mild and the winter would turn out severe.
But it’s not “damned if you do and damned if you don’t.” The criticism for under-responding to what turns out a full-bore emergency is long-lived and devastating. The criticism for over-responding to what turns out a minor problem is briefer and milder. Not to mention that under-responding costs lives, while over-responding costs mostly money and anxiety. If you are trying to figure out how much to sound the alarm about an evolving uncertain situation, remember this rule: It’s “DARNED if you do, damned if you don’t.”