Helpless, helpless, helpless
http://motherjones.com/politics/2011/03/denial-science-chris-mooney
The Science of Why We Don't Believe Science
How our brains fool us on climate, creationism, and the vaccine-autism link.
— By Chris Mooney
http://www.nytimes.com/2011/04/21/opinion/21bazerman.html
Stumbling Into Bad Behavior
Op-Ed Contributor
By MAX H. BAZERMAN and ANN E. TENBRUNSEL
I see a lot of articles lately about how we can't control our thoughts and beliefs as much as we would like to think we can.
I tend to believe this, and I think it's a good countermeasure against the kind of thing that Barbara Ehrenreich describes in Bright-sided: the notion that positive thinking is guaranteed to bring you everything you want, if you only work at it hard enough.
But I also note that it can be used to excuse behavior that causes harm.
Bazerman and Tenbrunsel:
we have found that much unethical conduct that goes on, whether in social life or work life, happens because people are unconsciously fooling themselves. They overlook transgressions — bending a rule to help a colleague, overlooking information that might damage the reputation of a client — because it is in their interest to do so.
...
When we fail to notice that a decision has an ethical component, we are able to behave unethically while maintaining a positive self-image. No wonder, then, that our research shows that people consistently believe themselves to be more ethical than they are.
...
In the run-up to the financial crisis, corporate boards, auditing firms, credit-rating agencies and other parties had easy access to damning data that they should have noticed and reported. Yet they didn’t do so, at least in part because of “motivated blindness” — the tendency to overlook information that works against one’s best interest. Ample research shows that people who have a vested self-interest, even the most honest among us, have difficulty being objective. Worse yet, they fail to recognize their lack of objectivity.
...
A solution often advocated for this lack of objectivity is to increase transparency through disclosure of conflicts of interest. But a 2005 study by Daylian M. Cain, George Loewenstein and Don A. Moore found that disclosure can exacerbate such conflicts by causing people to feel absolved of their duty to be objective. Moreover, such disclosure causes its “victims” to be even more trusting, to their detriment.
The Chris Mooney article addresses a lot of current research on why different political groups believe different things. I haven't quoted much of that per se because I'm not interested in the political angle so much as the general notion of how bias unconsciously affects people.
an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called "motivated reasoning" helps explain why we find groups so polarized over matters where the evidence is so unequivocal. ... It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.
...
In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers.
...
people's deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place
...
A key question—and one that's difficult to answer—is how "irrational" all this is. On the one hand, it doesn't make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information. ... Indeed, there's a sense in which science denial could be considered keenly "rational." In certain conservative communities, explains Yale's Kahan, "People who say, 'I think there's something to climate change,' that's going to mark them out as a certain kind of person, and their life is going to go less well."
...
political sophisticates are prone to be more biased than those who know less about the issues. "People who have a dislike of some policy—for example, abortion—if they're unsophisticated they can just reject it out of hand," says Lodge. "But if they're sophisticated, they can go one step further and start coming up with counterarguments." These individuals are just as emotionally driven and biased as the rest of us, but they're able to generate more and better reasons to explain why they're right—and so their minds become harder to change.
I wonder if there are ways people can learn to see their biases better and compensate for them? Or is this all leading to "We're all hopelessly biased and rationalizing animals, so we might as well not even try to get closer to an agreement on what's happening, what's causing what, and what causes harm?"
Yes. But the article goes into the extent to which biases come back into play as soon as any scientific research is reported outside the immediate group studying that field. (The article says that people decide whose qualifications to trust based more on whether they agree with the results than on whether the qualifications are sound.)