In the spirit of the other day’s entry on the differences between conservatives and liberals, which is really about the differences in how people think, and why some people appear to insist on thinking with some part of their bodies other than their brains, comes this nice discussion, with abundant links, on taking a neuroscience slant to those same questions.
Articles within articles within articles! A sample:
The piece riffs on a recent study published in Science that reported that conservatives show greater skin conductance and higher blink rates to threatening images than liberals, indicating higher levels of arousal.
This was widely interpreted as suggesting conservatives are more fearful than liberals. Although the study didn’t ask about fear directly, both blinking and sweating have been linked to elevated fear responses before.
These interpretations are interesting, because they immediately make a value judgement about whether the fear response is appropriate or not. As the Slate piece notes, another interpretation is that liberal participants were less emotionally responsive.
If you’ve read, listened to, or thought about the Haidt results, this isn’t any surprise, really: if you have different moral axioms that guide your interpretation of the world, then you’re going to have different physiological responses to stimuli that push those moral buttons. Even so, this is a nice overview for the increasingly-mythical Person Who Gives A Shit.
From a completely surprising direction comes this piece on sort of the same idea. Seth Godin’s blog is usually about business and marketing and having good ideas so you won’t have to have some suckass job to pay the rent but rather some tremendous job that will allow you to blossom like the beautiful flower that you are. (I’m trying to work in a burn on DDB, who I like to think about as “blossoming into a beautiful flower” but I can’t think of anything.)
Every person makes decisions based on their worldview and the data at hand. If two people have the same worldview and the same data, they’ll make the same decision, every time (unless they’re stupid).
In my experience, a closed-minded worldview (“I can’t read that book, I disagree with it”) is the most difficult hurdle to overcome. But a closed-minded worldview doesn’t mean you’re stupid, it means that you are selling yourself and your colleagues and your community short.
There must be something in the air; everybody’s beating around the edge of this idea. Godin is talking about business, so you can forgive him for using the language of change. He’s asking: what do we have to do to change this guy’s worldview? That’s fine for business, which is quantifiably successful or not, and whose DNA can be discussed as if there were some Right Answer. (Sort of. Not really, but let’s not complicate things.) But for people, is it “right” to avenge your father’s murder by killing the guy who knifed him, even if that guy has a wife and kids who will have no means of support if you do?
On a side note, the game Ultima VI, which was released … in 1990, I think, created your character in just this way. It asked you a bunch of difficult moral questions, and forced you to come up with a moral hierarchy for your character based on your answers. I stole that “avenge your father” question directly from the game. Wes will doubtless be glad to hear of an example of games being considerably MORE morally sophisticated than culture.