The book that received the most positive reaction in a recent homeland security course was Thinking, Fast and Slow, by Daniel Kahneman.
Kahneman won a Nobel Prize in economics. He never took an economics course. He is one of the people who created behavioral economics.
Creating a new discipline is a very good way to avoid taking an economics course.
I think the book is overwritten. It’s more than 400 pages; 250 pages would probably have been enough.
But if you’ve received a Nobel Prize, your books can be as long as you want.
Here is a video (about an hour long) where Kahneman outlines the core ideas in his book.
The video gets going for real at about the 8-minute mark. (Modified Wadsworth Constant at work.)
————————————–
Here’s a question:
What is 2+2?
You probably have an immediate answer.
Here’s another question:
What is 17 times 36?
You probably do not have an immediate answer.
Kahneman posits two thinking styles. System 1 is quick, intuitive and emotional.
System 2 isn’t.
————————————–
What do you think about this image?
Or what about this headline and paragraph:
Climate-Change Deniers Are On The Ropes — But So Is The Planet
It’s been a tough few weeks for the forces of climate-change denial.
First came the giant billboard with Unabomber Ted Kaczynski’s face plastered across it: “I Still Believe in Global Warming. Do You?” Sponsored by the Heartland Institute, the nerve-center of climate-change denial, it was supposed to draw attention to the fact that “the most prominent advocates of global warming aren’t scientists. They are murderers, tyrants, and madmen.” Instead it drew attention to the fact that these guys had over-reached, and with predictable consequences.
According to Kahneman’s findings, if you like and trust the Heartland Institute, you are likely to accept the anecdotal story the billboard tells more readily than any climate-alarmist propaganda or scientific evidence about climate change.
If you like and trust ThinkProgress (the source of the On the Ropes tale), you are likely to accept its anecdotes, maybe more than science.
If you don’t know anything about Heartland or ThinkProgress, you used some other System 1 shortcut to decide which story worked better for you.
————————————–
Stories are concrete, specific and immediate. They cut through the need for all that heavy thinking stuff.
Stories appeal to System 1.
System 2 is a slower, more deliberative and more logical way of thinking. It’s also a more difficult style to use.
It takes work.
Have you calculated 17 times 36 yet?
Nope; probably not worth the effort.
That’s System 2 at work; or rather, at avoiding work.
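(For the record, the answer takes only a little System 2 effort: 17 × 36 = (17 × 30) + (17 × 6) = 510 + 102 = 612.)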
————————————–
Here’s another question:
A bat and a ball together cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?
System 1 says the ball costs ten cents.
Next question?
System 2, when it gets around to it (and in some tests, 80% of the time it doesn’t get around to it), will let you know ten cents is the wrong answer.
(If your System 2 gets off the couch, you’ll see why ten cents is not the correct answer.)
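(If you want to check its work: call the ball’s price x. The bat then costs x + $1.00, so x + (x + $1.00) = $1.10, which means 2x = $0.10 and x = $0.05. The ball costs five cents.)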
People are more afraid of dying in a terrorist attack than they are afraid of dying.
But the chances of dying are greater than the chances of dying in a terrorist attack; every death in a terrorist attack is, after all, a death. Why does the less likely path to death have the greater emotional impact?
System 1 again.
People are more likely to believe the following statement is true:
“Woes unite foes”
than they are to believe this statement is true:
“Woes unite enemies”
Why do people tend “to see the rhyming [aphorisms] as more accurate than the non-rhyming ones”?
Even if both sayings mean the same thing.
System 1 likes rhymes.
————————————–
What does this have to do with homeland security?
The April 2012 issue of Risk Analysis (“An Official Publication of the Society for Risk Analysis”) is filled with examples illustrating the significance of risk perception and communication.
The issue is titled “Risk Perception Behavior: Anticipating and Responding to Crisis.”
Take a look at the table of contents.
Or look at this report about one of the articles: [my emphasis]
A dirty bomb attack centered on downtown Los Angeles’ financial district could severely impact the region’s economy to the tune of nearly $16 billion, fueled primarily by psychological effects that could persist for a decade….
“We decided to study a terrorist attack on Los Angeles not to scare people, but to alert policymakers just how large the impact of the public’s reaction might be,” said study co-author William Burns, a research scientist at Decision Research in Eugene, Ore. “This underscores the importance of risk communication before and after a major disaster to reduce economic losses.”….
“The economic effects of the public’s change in behavior are 15 times more costly than the immediate damage in the wake of a disaster.”
“These findings illustrate that because the costs of modern disasters are so large, even small changes in public perception and behaviors may significantly affect the economic impact….”
Or look at these slides that report on a recent experiment about “Inoculation as a Strategy for Achieving Assertive Risk Communication.” [Please keep in mind the important caution that slides cannot substitute for the full study or being present at what I was told was a “fascinating” presentation by world-class scholars.]
Assertive risk communication means “actively and continuously anticipating and preempting counter-arguments” that might be generated by someone else’s System 1 response to, say, a catastrophic incident in the United States.
“Inoculating messages foster resistance to counterarguments,” says one of the slides.
“Inoculation messages move individuals in the desired direction—initially enhancing confidence.
Inoculation messages enhance resistance to counter-arguments in high-risk circumstances.
Using inoculation messages fortif[ies] what is known about best practices for risk and crisis communication,” reports another slide.
So, what does that mean in practice?
Assume “a commercial airliner carrying 253 passengers from Los Angeles to New York exploded 70 minutes into flight leaving no survivors. Air traffic control lost radar contact with the plane and within minutes local officials in Nevada began receiving reports from witnesses who saw debris falling from the sky.”
Some people speculate it was terrorism. Others wait for evidence. No one is quite sure.
What should the assertive risk communication message be to inoculate an uncertain nation against jumping to “inappropriate” System 1 conclusions? Or if people are going to jump to System 1 anyway, what kind of counter-perception could be seeded?
One message is:
The Department of Homeland Security said it had “no specific, credible information regarding an active terrorist plot against the U.S. at this time, although we continue to monitor efforts by al-Qa’ida and its affiliates to carry out terrorist attacks, both in the Homeland and abroad.”
Or how about this one:
“In addition to this event, DHS has detected and prevented numerous terrorist plots. All of these plots have been thwarted by a combination of intelligence work, policing, and citizen participation.”
————————————–
Right now, I’m wondering what your System 1 response is to either message and to the idea of inoculating people through assertive risk communication.
My System 2 reaction is that people interested in homeland security will benefit from reading Thinking, Fast and Slow.
Or at least listening to Kahneman talk about his ideas.