Interesting interview from The Believer magazine with psychologist Jonathan Haidt [full text], who argues that ethical judgments are almost always snap judgments, and that our justifications come later:
JONATHAN HAIDT: People almost always start out by saying it’s wrong. Then they start to give reasons. The most common reasons involve genetic abnormalities or that it will somehow damage their relationship. But we say in the story that they use two forms of birth control, and we say in the story that they keep that night as a special secret and that it makes them even closer. So people seem to want to disregard certain facts about the story. When the experimenter points out these facts and says “Oh, well, sure, if they were going to have kids, that would cause problems, but they are using birth control, so would you say that it’s OK?”—people never say “Ooooh, right, I forgot about the birth control. So then it is OK.” Instead, they say, “Oh, yeah. Huh. Well, OK, let me think.”
So what’s really clear, you can see it in the videotapes of the experiment, is: people give a reason. When that reason is stripped from them, they give another reason. When the new reason is stripped from them, they reach for another reason. And it’s only when they reach deep into their pocket for another reason, and come up empty-handed, that they enter the state we call “moral dumbfounding.” Because they fully expect to find reasons. They’re surprised when they don’t find reasons. And so in some of the videotapes you can see, they start laughing. But it’s not an “it’s so funny” laugh. It’s more of a nervous-embarrassment puzzled laugh. So it’s a cognitive state where you “know” that something is morally wrong, but you can’t find reasons to justify your belief. Instead of changing your mind about what’s wrong, you just say: “I don’t know, I can’t explain it. I just know it’s wrong.” So the fact that this state exists indicates that people hold beliefs separate from, or with no need of support from, the justifications that they give. Or another way of saying it is that the knowing that something is wrong and the explaining why are completely separate processes.
BLVR: Are the subjects satisfied when they reach this state of moral dumbfounding? Or do they find something deeply problematic about it?
JH: For some people it’s problematic. They’re clearly puzzled, they’re clearly reaching, and they seem a little bit flustered. But other people are in a state that Scott Murphy, the honors student who conducted the experiment, calls “comfortably dumbfounded.” They say with full poise: “I don’t know; I can’t explain it; it’s just wrong.” Period. So we do know that there are big differences in people on a variable called “need for cognition.” Some people need to think about things, need to understand things, need to reason about things. Many of these people go to graduate school in philosophy. But most people, if they don’t have a reason for their moral judgments, they’re not particularly bothered.
BLVR: So your conclusion is that while we might think that Reason or reasons are playing a big causal role in how we arrive at moral judgments, it’s actually our intuitions—fueled by our emotions—that are doing most of the work. You say in your paper that reason is the press secretary of the emotions, the ex post facto spin doctor.
JH: Yes, that’s right.
If some of Haidt's scenarios seem familiar to you, I suspect you've played Taboo at The Philosophers' Magazine (referenced here several other times).