Helpless, helpless, helpless
21 Apr 2011 01:10 pm
http://motherjones.com/politics/2011/03/denial-science-chris-mooney
The Science of Why We Don't Believe Science
How our brains fool us on climate, creationism, and the vaccine-autism link.
— By Chris Mooney
http://www.nytimes.com/2011/04/21/opinion/21bazerman.html
Stumbling Into Bad Behavior
Op-Ed Contributor
By MAX H. BAZERMAN and ANN E. TENBRUNSEL
I see a lot of articles lately about how we can't control our thoughts and beliefs as much as we would like to think we can.
I tend to believe this, and I think it's a good countermeasure against the kind of thing that Barbara Ehrenreich describes in Bright-sided: the notion that positive thinking is guaranteed to bring you everything you want, if you only work at it hard enough.
But I also note that it can be used to excuse behavior that causes harm.
Bazerman and Tenbrunsel:
we have found that much unethical conduct that goes on, whether in social life or work life, happens because people are unconsciously fooling themselves. They overlook transgressions — bending a rule to help a colleague, overlooking information that might damage the reputation of a client — because it is in their interest to do so.
...
When we fail to notice that a decision has an ethical component, we are able to behave unethically while maintaining a positive self-image. No wonder, then, that our research shows that people consistently believe themselves to be more ethical than they are.
...
In the run-up to the financial crisis, corporate boards, auditing firms, credit-rating agencies and other parties had easy access to damning data that they should have noticed and reported. Yet they didn’t do so, at least in part because of “motivated blindness” — the tendency to overlook information that works against one’s best interest. Ample research shows that people who have a vested self-interest, even the most honest among us, have difficulty being objective. Worse yet, they fail to recognize their lack of objectivity.
...
A solution often advocated for this lack of objectivity is to increase transparency through disclosure of conflicts of interest. But a 2005 study by Daylian M. Cain, George Loewenstein and Don A. Moore found that disclosure can exacerbate such conflicts by causing people to feel absolved of their duty to be objective. Moreover, such disclosure causes its “victims” to be even more trusting, to their detriment.
The Chris Mooney article addresses a lot of current research on why different political groups believe different things. I haven't quoted much of that per se because I'm not interested in the political angle so much as the general notion of how bias unconsciously affects people.
an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called "motivated reasoning" helps explain why we find groups so polarized over matters where the evidence is so unequivocal.... It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.
...
In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers.
...
people's deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place
...
A key question—and one that's difficult to answer—is how "irrational" all this is. On the one hand, it doesn't make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information....Indeed, there's a sense in which science denial could be considered keenly "rational." In certain conservative communities, explains Yale's Kahan, "People who say, 'I think there's something to climate change,' that's going to mark them out as a certain kind of person, and their life is going to go less well."
...
political sophisticates are prone to be more biased than those who know less about the issues. "People who have a dislike of some policy—for example, abortion—if they're unsophisticated they can just reject it out of hand," says Lodge. "But if they're sophisticated, they can go one step further and start coming up with counterarguments." These individuals are just as emotionally driven and biased as the rest of us, but they're able to generate more and better reasons to explain why they're right—and so their minds become harder to change.
I wonder: are there ways people can learn to see their biases better and compensate for them? Or is this all leading to "We're all hopelessly biased and rationalizing animals, so we might as well not even try to get closer to an agreement on what's happening, what's causing what, and what causes harm"?
no subject
Date: 21 Apr 2011 11:29 pm (UTC)
I think so? I mean, it's something I'm actively working on at my morning job, and it's not like I'm a special snowflake or anything. I'm low-energy, crabby, and frankly, kind of a butthole. (Not proud, just, you know, being honest.) So...maybe people in general have more capabilities around that than we want to admit (or give ourselves credit for)?
no subject
Date: 22 Apr 2011 12:43 am (UTC)
Recognizing my willingness to forgive my own ethical lapses helped me back away from citizen involvement in my municipality. (Now at least I can whine full-throatedly.)
no subject
Date: 22 Apr 2011 04:37 pm (UTC)
This wouldn't particularly matter except that some of these people make policy...
no subject
Date: 22 Apr 2011 04:52 pm (UTC)
I think I've probably said it to you before, but my contention is that the most fundamental human drive is the urge to make sense of experience, also known as "why?". Science does not answer "why". I think people would be more accepting of science if it were presented as "electrons orbit the nucleus because Jehovah made them that way." But of course that wouldn't be science anymore.
no subject
Date: 22 Apr 2011 10:13 pm (UTC)
goes to dig
http://hbswk.hbs.edu/item/6563.html
Recognizing why we do this and how we can get out of the trap is the subject of the new book, Blind Spots: Why We Fail to Do What's Right and What to Do about It, by Max H. Bazerman, a professor at Harvard Business School, and Ann E. Tenbrunsel, a professor of business ethics at the University of Notre Dame.
no subject
Date: 21 Apr 2011 11:49 pm (UTC)
I do think that honesty, and a willingness to take a cold, hard look at yourself and acknowledge your own biases as much as you can, is a good first step. If you're aware of your biases and emotional thinking, that at least gives you some tools to deal with them, even if you can't get rid of them completely.
no subject
Date: 22 Apr 2011 07:20 am (UTC)
Yes. But the article goes into the extent to which biases come back into play as soon as any scientific research is reported outside the immediate group studying that field. (The article says that people decide whose qualifications to trust based more on whether they agree with the results than on whether the qualifications are sound.)