firecat: damiel from wings of desire tasting blood on his fingers. text "i has a flavor!" (Default)
[personal profile] firecat
http://motherjones.com/politics/2011/03/denial-science-chris-mooney
The Science of Why We Don't Believe Science
How our brains fool us on climate, creationism, and the vaccine-autism link.
— By Chris Mooney

http://www.nytimes.com/2011/04/21/opinion/21bazerman.html
Stumbling Into Bad Behavior
Op-Ed Contributor
By MAX H. BAZERMAN and ANN E. TENBRUNSEL

I see a lot of articles lately about how we can't control our thoughts and beliefs as much as we would like to think we can.

I tend to believe this, and I think it's a good countermeasure against the kind of thing that Barbara Ehrenreich describes in Bright-sided: the notion that positive thinking is guaranteed to bring you everything you want, if you only work at it hard enough.

But I also note that it can be used to excuse behavior that causes harm.

Bazerman and Tenbrunsel:
we have found that much unethical conduct that goes on, whether in social life or work life, happens because people are unconsciously fooling themselves. They overlook transgressions — bending a rule to help a colleague, overlooking information that might damage the reputation of a client — because it is in their interest to do so.
...
When we fail to notice that a decision has an ethical component, we are able to behave unethically while maintaining a positive self-image. No wonder, then, that our research shows that people consistently believe themselves to be more ethical than they are.
...
In the run-up to the financial crisis, corporate boards, auditing firms, credit-rating agencies and other parties had easy access to damning data that they should have noticed and reported. Yet they didn’t do so, at least in part because of “motivated blindness” — the tendency to overlook information that works against one’s best interest. Ample research shows that people who have a vested self-interest, even the most honest among us, have difficulty being objective. Worse yet, they fail to recognize their lack of objectivity.
...
A solution often advocated for this lack of objectivity is to increase transparency through disclosure of conflicts of interest. But a 2005 study by Daylian M. Cain, George Loewenstein and Don A. Moore found that disclosure can exacerbate such conflicts by causing people to feel absolved of their duty to be objective. Moreover, such disclosure causes its “victims” to be even more trusting, to their detriment.

The Chris Mooney article addresses a lot of current research on why different political groups believe different things. I haven't quoted much of that per se because I'm not interested in the political angle so much as the general notion of how bias unconsciously affects people.
an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called "motivated reasoning" helps explain why we find groups so polarized over matters where the evidence is so unequivocal....It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.
...
In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers
...
people's deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place
...
A key question—and one that's difficult to answer—is how "irrational" all this is. On the one hand, it doesn't make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information....Indeed, there's a sense in which science denial could be considered keenly "rational." In certain conservative communities, explains Yale's Kahan, "People who say, 'I think there's something to climate change,' that's going to mark them out as a certain kind of person, and their life is going to go less well."
...
political sophisticates are prone to be more biased than those who know less about the issues. "People who have a dislike of some policy—for example, abortion—if they're unsophisticated they can just reject it out of hand," says Lodge. "But if they're sophisticated, they can go one step further and start coming up with counterarguments." These individuals are just as emotionally driven and biased as the rest of us, but they're able to generate more and better reasons to explain why they're right—and so their minds become harder to change.

I wonder if there are ways people can learn to see their biases better and compensate for them? Or is this all leading to "We're all hopelessly biased and rationalizing animals, so we might as well not even try to get closer to an agreement on what's happening, what's causing what, and what causes harm?"

Date: 21 Apr 2011 11:01 pm (UTC)
snippy: Lego me holding book (Default)
From: [personal profile] snippy
We are not the rational animal, but the rationalizing animal.

Date: 21 Apr 2011 11:29 pm (UTC)
laughingrat: A detail of leaping rats from an original movie poster for the first film of Nosferatu (Default)
From: [personal profile] laughingrat
I wonder if there are ways people can learn to see their biases better and compensate for them?

I think so? I mean, it's something I'm actively working on at my morning job, and it's not like I'm a special snowflake or anything. I'm low-energy, crabby, and frankly, kind of a butthole. (Not proud, just, you know, being honest.) So...maybe people in general have more capabilities around that than we want to admit (or give ourselves credit for)?

Date: 22 Apr 2011 12:43 am (UTC)
jesse_the_k: Bambi fawn cartoon with two heads (Conjoined Bambi)
From: [personal profile] jesse_the_k
Thanks for making me think.

Recognizing my willingness to forgive my own ethical lapses helped me back away from citizen involvement in my municipality. (Now at least I can whine full-throatedly.)

Date: 22 Apr 2011 11:36 am (UTC)
nancylebov: (green leaves)
From: [personal profile] nancylebov
I don't know if the site would suit you, but the folks at Less Wrong work on thinking more clearly and taking that into action.

Date: 22 Apr 2011 04:04 pm (UTC)
bitterlawngnome: (Default)
From: [personal profile] bitterlawngnome
well, to belabour the obvious, the scientific method is pretty good for determining the objective reality™ of any assertion

Date: 22 Apr 2011 04:52 pm (UTC)
bitterlawngnome: (Default)
From: [personal profile] bitterlawngnome
I was just narrowly responding to your question - "I wonder if there are ways people can learn to see their biases better and compensate for them?" - and of course the answer is yes, that's what the scientific method is supposed to do. So the problem is actually that people are actively rejecting the only tested method we have for rigorously challenging our assumptions.

I think I've probably said it to you before, but my contention is that the most fundamental human drive is the urge to make sense of experience - also known as "why?". Science does not answer "why". I think people would be more accepting of science if it were presented as "electrons orbit the nucleus because Jehovah made them that way." But of course that wouldn't be science anymore.

Date: 22 Apr 2011 10:13 pm (UTC)
elainegrey: Inspired by Grypping/gripping beast styles from Nordic cultures (Default)
From: [personal profile] elainegrey
Harvard Business School's news site had an article about how the cog sci research is changing how ethics is taught so people *CAN* see their biases better!

goes to dig

http://hbswk.hbs.edu/item/6563.html

Recognizing why we do this and how we can get out of the trap is the subject of the new book, Blind Spots: Why We Fail to Do What's Right and What to Do about It, by Max H. Bazerman, a professor at Harvard Business School, and Ann E. Tenbrunsel, a professor of business ethics at the University of Notre Dame.

Date: 21 Apr 2011 11:49 pm (UTC)
ext_3172: (Default)
From: [identity profile] chaos-by-design.livejournal.com
Or is this all leading to "We're all hopelessly biased and rationalizing animals, so we might as well not even try to get closer to an agreement on what's happening, what's causing what, and what causes harm?"

I do think that honesty and a willingness to take a cold, hard look at yourself and acknowledge as much as you can your own biases is a good first step. If you're aware of your biases and emotional thinking, that at least gives you some tools to deal with them, even if you can't get rid of them completely.

Date: 22 Apr 2011 02:13 am (UTC)
ext_116349: (Default)
From: [identity profile] opalmirror.livejournal.com
Democracy: the tyranny of truthiness.

Date: 22 Apr 2011 07:18 am (UTC)
From: [identity profile] beaq.livejournal.com
Hrum. Institutionally, isn't science a way of compensating for biases? It's not going to decide what's important and what constitutes "harm", but it's at least a way to record what's happening. Over the long haul. Individually, I think maybe the only thing to do is uh. Yeah. Teach people early on about how interpretive bias and rationalization work?

Date: 22 Apr 2011 07:27 am (UTC)
From: [identity profile] beaq.livejournal.com
Right. I guess I wasn't sure what the conditions were. Do you mean, how do we, individual people, decide what science is good science?

Date: 22 Apr 2011 07:28 am (UTC)
From: [identity profile] beaq.livejournal.com
If so, then, uh. Good science education?

Date: 22 Apr 2011 07:43 am (UTC)
From: [identity profile] beaq.livejournal.com
Which I couldn't really define except insofar as it should make a person skeptical.

Linketies and life

Date: 24 Apr 2011 03:12 am (UTC)
From: [identity profile] pingback-bot.livejournal.com
User [livejournal.com profile] moominmuppet referenced to your post from Linketies and life (http://moominmuppet.livejournal.com/1560144.html) saying: [...] recent articles; "The Science of Why We Don't Believe Science" and "Stumbling Into Bad Behavior" [...]
